Sample records for building dependable distributed

  1. Frequency distributions and correlations of solar X-ray flare parameters

    NASA Technical Reports Server (NTRS)

    Crosby, Norma B.; Aschwanden, Markus J.; Dennis, Brian R.

    1993-01-01

    Frequency distributions of flare parameters are determined from over 12,000 solar flares. The flare duration, the peak counting rate, the peak hard X-ray flux, the total energy in electrons, and the peak energy flux in electrons are among the parameters studied. Linear regression fits, as well as the slopes of the frequency distributions, are used to determine the correlations between these parameters. The relationship between the variations of the frequency distributions and the solar activity cycle is also investigated. Theoretical models for the frequency distribution of flare parameters are dependent on the probability of flaring and the temporal evolution of the flare energy build-up. The results of this study are consistent with stochastic flaring and exponential energy build-up. The average build-up time constant is found to be 0.5 times the mean time between flares.
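
    The slope estimate described above can be illustrated with a short numerical sketch: draw a synthetic power-law sample standing in for one flare parameter (the exponent and threshold below are made-up values, not the paper's results), bin it logarithmically, and recover the slope of the differential frequency distribution by linear regression in log-log space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a flare parameter (e.g. peak flux) drawn from a
# power law dN/dS ~ S^(-alpha) above a threshold S_min; alpha and S_min
# are illustrative values only.
alpha_true, s_min, n = 1.8, 1.0, 12000
peak_flux = s_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

# Differential frequency distribution with logarithmic binning.
bins = np.logspace(np.log10(peak_flux.min()), np.log10(peak_flux.max()), 30)
counts, edges = np.histogram(peak_flux, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])
density = counts / np.diff(edges)          # dN/dS per flux bin

# Linear regression in log-log space gives the power-law slope.
mask = counts > 0
slope, _ = np.polyfit(np.log10(centers[mask]), np.log10(density[mask]), 1)
print(f"fitted slope: {slope:.2f} (input exponent was {-alpha_true:.2f})")
```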

  2. Pion distribution amplitude and quasidistributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radyushkin, Anatoly V.

    2017-03-27

    We extend our analysis of quasidistributions onto the pion distribution amplitude. Using the formalism of parton virtuality distribution amplitudes, we establish a connection between the pion transverse momentum dependent distribution amplitude Ψ(x, k⊥²) and the pion quasidistribution amplitude (QDA) Qπ(y, p₃). We build models for the QDAs from the virtuality-distribution-amplitude-based models for soft transverse momentum dependent distribution amplitudes, and analyze the p₃ dependence of the resulting QDAs. As there are many models claimed to describe the primordial shape of the pion distribution amplitude, we present the p₃-evolution patterns for models producing some popular proposals: the Chernyak-Zhitnitsky, flat, and asymptotic distribution amplitudes. Finally, our results may be used as a guide for future studies of the pion distribution amplitude on the lattice using the quasidistribution approach.

  3. Nonperturbative evolution of parton quasi-distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radyushkin, A. V.

    2017-02-14

    Using the formalism of parton virtuality distribution functions (VDFs), we establish a connection between the transverse momentum dependent distributions (TMDs) F(x, k⊥²) and quasi-distributions (PQDs) Q(y, p₃) introduced recently by X. Ji for lattice QCD extraction of parton distributions f(x). We build models for PQDs from the VDF-based models for soft TMDs, and analyze the p₃ dependence of the resulting PQDs. We observe a strong nonperturbative evolution of PQDs for small and moderately large values of p₃, reflecting the transverse momentum dependence of TMDs. Furthermore, the study of PQDs on the lattice in the domain of strong nonperturbative effects opens a new perspective for investigation of the 3-dimensional hadron structure.
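
    A schematic numerical sketch of the TMD-to-quasi-distribution folding summarized above, under the simplifying assumptions that the single-component TMD factorizes into a toy valence-like PDF times a Gaussian transverse profile and that the folding takes the form Q(y, p₃) = p₃ ∫ dx F(x, (y − x)p₃); this illustrates the structure only and is not the paper's VDF parametrization.

```python
import numpy as np

LAMBDA = 0.3   # GeV, illustrative transverse width (not a fitted value)

def f_pdf(x):
    # Toy valence-like PDF shape, purely illustrative.
    return np.where((x > 0.0) & (x < 1.0), 6.0 * x * (1.0 - x), 0.0)

def tmd(x, k1):
    # F(x, k1): TMD integrated over the second transverse component,
    # modeled here as a normalized Gaussian in the remaining component k1.
    return f_pdf(x) * np.exp(-k1**2 / LAMBDA**2) / (np.sqrt(np.pi) * LAMBDA)

def quasi(y, p3, nx=2000):
    # Assumed folding: Q(y, p3) = p3 * integral_0^1 dx F(x, (y - x) * p3)
    x = np.linspace(0.0, 1.0, nx)
    return p3 * np.sum(tmd(x, (y - x) * p3)) * (x[1] - x[0])

y_grid = np.linspace(-0.5, 1.5, 201)
dy = y_grid[1] - y_grid[0]
for p3 in (0.5, 1.0, 2.0, 4.0):   # GeV; shape approaches f(x) as p3 grows
    q = np.array([quasi(y, p3) for y in y_grid])
    print(f"p3 = {p3:3.1f} GeV: support leaks outside [0,1], norm ~ {np.sum(q) * dy:.3f}")
```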

  4. WASTE HANDLING BUILDING ELECTRICAL SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.C. Khamamkar

    2000-06-23

    The Waste Handling Building Electrical System performs the function of receiving, distributing, transforming, monitoring, and controlling AC and DC power to all waste handling building electrical loads. The system distributes normal electrical power to support all loads that are within the Waste Handling Building (WHB). The system also generates and distributes emergency power to support designated emergency loads within the WHB within specified time limits. The system provides the capability to transfer between normal and emergency power. The system provides emergency power via independent and physically separated distribution feeds from the normal supply. The designated emergency electrical equipment will be designed to operate during and after design basis events (DBEs). The system also provides lighting, grounding, and lightning protection for the Waste Handling Building. The system is located in the Waste Handling Building System. The system consists of a diesel generator, power distribution cables, transformers, switch gear, motor controllers, power panel boards, lighting panel boards, lighting equipment, lightning protection equipment, control cabling, and grounding system. Emergency power is generated with a diesel generator located in a QL-2 structure and connected to the QL-2 bus. The Waste Handling Building Electrical System distributes and controls primary power to acceptable industry standards, and with a dependability compatible with waste handling building reliability objectives for non-safety electrical loads. It also generates and distributes emergency power to the designated emergency loads. The Waste Handling Building Electrical System receives power from the Site Electrical Power System. The primary material handling power interfaces include the Carrier/Cask Handling System, Canister Transfer System, Assembly Transfer System, Waste Package Remediation System, and Disposal Container Handling Systems. The system interfaces with the MGR Operations Monitoring and Control System for supervisory monitoring and control signals. The system interfaces with all facility support loads such as heating, ventilation, and air conditioning, office, fire protection, monitoring and control, safeguards and security, and communications subsystems.

  5. Target mass effects in parton quasi-distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radyushkin, A. V.

    We study the impact of non-zero (and apparently large) value of the nucleon mass M on the shape of parton quasi-distributions Q(y, p₃), in particular on its change with the change of the nucleon momentum p₃. We observe that the usual target-mass corrections induced by the M-dependence of the twist-2 operators are rather small. Moreover, we show that within the framework based on parametrizations by transverse momentum dependent distribution functions (TMDs) these corrections are canceled by higher-twist contributions. Lastly, we identify a novel source of kinematic target-mass dependence of TMDs and build models corrected for such dependence. We find that resulting changes may be safely neglected for p₃ ≳ 2M.

  6. Target mass effects in parton quasi-distributions

    DOE PAGES

    Radyushkin, A. V.

    2017-05-11

    We study the impact of non-zero (and apparently large) value of the nucleon mass M on the shape of parton quasi-distributions Q(y, p₃), in particular on its change with the change of the nucleon momentum p₃. We observe that the usual target-mass corrections induced by the M-dependence of the twist-2 operators are rather small. Moreover, we show that within the framework based on parametrizations by transverse momentum dependent distribution functions (TMDs) these corrections are canceled by higher-twist contributions. Lastly, we identify a novel source of kinematic target-mass dependence of TMDs and build models corrected for such dependence. We find that resulting changes may be safely neglected for p₃ ≳ 2M.

  7. Experimental Analysis of Steel Beams Subjected to Fire Enhanced by Brillouin Scattering-Based Fiber Optic Sensor Data.

    PubMed

    Bao, Yi; Chen, Yizheng; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda

    2017-01-01

    This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building-code-recommended material parameters in an enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements, with less than 4.7% average difference at a 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at a 95% confidence level and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C.
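
    The use of code-specified temperature-dependent material properties mentioned above can be illustrated with a short sketch. The reduction factors below are the values commonly tabulated for the effective yield strength of carbon steel in Eurocode 3 (EN 1993-1-2); they are reproduced here for illustration only and should be verified against the code before any design use.

```python
import numpy as np

# Reduction factors k_y(theta) for the effective yield strength of carbon
# steel, as commonly tabulated in EN 1993-1-2 (verify against the code).
TEMP_C = np.array([20, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200])
K_Y    = np.array([1.00, 1.00, 1.00, 1.00, 1.00, 0.78, 0.47, 0.23, 0.11, 0.06, 0.04, 0.02, 0.00])

def yield_strength(theta_c, f_y20=355e6):
    """Temperature-dependent yield strength (Pa), linearly interpolated
    between the tabulated points."""
    return f_y20 * np.interp(theta_c, TEMP_C, K_Y)

# Example: strength retained by an S355 section at a fiber temperature of 650 C.
print(f"{yield_strength(650.0) / 1e6:.0f} MPa")   # about 35% of the ambient value
```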

  8. Wind interference effect on an octagonal plan shaped tall building due to square plan shaped tall buildings

    NASA Astrophysics Data System (ADS)

    Kar, Rony; Dalui, Sujit Kumar

    2016-03-01

    The variation of pressure on the faces of an octagonal plan shaped tall building due to interference from three square plan shaped tall buildings of the same height is analysed with the computational fluid dynamics module ANSYS CFX, for a 0° wind incidence angle only. All the buildings are closely spaced (the distance between two buildings varies from 0.4 h to 2 h, where h is the height of the building). Different cases depending upon the various positions of the square plan shaped buildings are analysed and compared with the octagonal plan shaped building in isolated condition. The comparison is presented in the form of interference factors (IF) and IF contours. Abnormal pressure distribution is observed in some cases. Shielding and channelling effects on the octagonal plan shaped building due to the presence of the interfering buildings are also noted. In the interfering condition, the pressure distribution on the faces of the octagonal plan shaped building is not predictable. As the distance between the principal octagonal plan shaped building and the third square plan shaped interfering building increases, the behaviour of the faces becomes more systematic. The coefficient of pressure (Cp) for each face of the octagonal plan shaped building in each interfering case can be found simply by multiplying the IF by the Cp for the isolated case.
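
    The closing relation above (interfering-case Cp = IF × isolated-case Cp) is simple enough to show in a two-line sketch; all numbers below are hypothetical placeholders, not values from the study.

```python
# Hypothetical isolated-case pressure coefficients and interference factors.
cp_isolated = {"face_A": 0.82, "face_B": -0.55, "face_C": -0.48}
interference_factor = {"face_A": 0.64, "face_B": 1.21, "face_C": 0.90}

# Interfering-case coefficient for each face: Cp_interfering = IF * Cp_isolated.
cp_interfering = {face: interference_factor[face] * cp_isolated[face]
                  for face in cp_isolated}
print(cp_interfering)
```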

  9. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base comprises over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and the project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for builds; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching, and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
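
    The package-level build parallelism described above can be sketched generically: schedule every package whose dependencies are already built, in waves, on a worker pool. This is an illustrative Python sketch with hypothetical package names, not CMT code or the ATLAS dependency graph.

```python
import concurrent.futures as cf

# Hypothetical package dependency graph: package -> packages it depends on.
deps = {
    "Core": [],
    "Event": ["Core"],
    "Tracking": ["Core", "Event"],
    "Calo": ["Core", "Event"],
    "Reco": ["Tracking", "Calo"],
}

def build(pkg):
    print(f"building {pkg}")      # stand-in for the real build command
    return pkg

remaining = dict(deps)
done = set()
with cf.ThreadPoolExecutor(max_workers=4) as pool:
    while remaining:
        # Packages whose dependencies are all built can be compiled in parallel.
        ready = [p for p, d in remaining.items() if set(d) <= done]
        for pkg in pool.map(build, ready):
            done.add(pkg)
            del remaining[pkg]
```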

  10. Control of dispatch dynamics for lowering the cost of distributed generation in the built environment

    NASA Astrophysics Data System (ADS)

    Flores, Robert Joseph

    Distributed generation can provide many benefits over traditional central generation, such as increased reliability and efficiency, while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and the corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and the parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction. As load factor increases, lower operating cost generators are desired because a larger portion of the building load is met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from DG reduces cost only if the building's thermal demand coincides with the electrical demand. Capacity limits exist where annual savings from operation of distributed generation decrease if further generation is installed. For low operating cost generators, the approximate limit is the average building load. This limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if positive economic performance is desired.
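
    The kind of economic dispatch rule discussed above can be reduced to a minimal sketch: run the generator only when its marginal cost undercuts the utility tariff, capped at the building load so nothing is exported. All prices, efficiencies, and loads below are hypothetical, not values from the thesis.

```python
# Hypothetical microturbine and tariff figures for illustration only.
GAS_PRICE = 8.0 / 293.1   # $/kWh of fuel, from an assumed ~$8/MMBtu gas price
ELEC_EFF = 0.28           # assumed electrical efficiency
OM_COST = 0.012           # assumed O&M cost, $/kWh
CAPACITY = 60.0           # kW

def dispatch(load_kw, tariff_per_kwh):
    marginal_cost = GAS_PRICE / ELEC_EFF + OM_COST
    if marginal_cost < tariff_per_kwh:
        return min(CAPACITY, load_kw)   # no export to the grid
    return 0.0

for load, tariff in [(45.0, 0.09), (80.0, 0.18), (120.0, 0.32), (30.0, 0.07)]:
    print(f"load {load:5.1f} kW, tariff ${tariff:.2f}/kWh -> DG output {dispatch(load, tariff):4.1f} kW")
```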

  11. Experimental Analysis of Steel Beams Subjected to Fire Enhanced by Brillouin Scattering-Based Fiber Optic Sensor Data

    PubMed Central

    Bao, Yi; Chen, Yizheng; Hoehler, Matthew S.; Smith, Christopher M.; Bundy, Matthew; Chen, Genda

    2016-01-01

    This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures and building-code-recommended material parameters in an enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that are compared locally with thermocouple measurements, with less than 4.7% average difference at a 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at a 95% confidence level and that the European building code provided the best predictions. However, the implicit consideration of creep in the European code is insufficient when the beam temperature exceeds 800°C. PMID:28239230

  12. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  13. Dynamic Rate Dependent Elastoplastic Damage Modeling of Concrete Subject to Blast Loading: Formulation and Computational Aspects

    DTIC Science & Technology

    1990-10-31

    [Garbled report documentation page; recoverable details:] Department of Civil Engineering & Operations Research, Princeton University, Princeton, NJ 08544. Final Technical Report, 31 October 1990. Prepared for the Air Force Office of Scientific Research, Building 410, Bolling AFB, D.C. 20332-6448. Approved for public release; distribution is unlimited.

  14. Building Complex Kondo Impurities by Manipulating Entangled Spin Chains.

    PubMed

    Choi, Deung-Jang; Robles, Roberto; Yan, Shichao; Burgess, Jacob A J; Rolf-Pissarczyk, Steffen; Gauyacq, Jean-Pierre; Lorente, Nicolás; Ternes, Markus; Loth, Sebastian

    2017-10-11

    The creation of molecule-like structures in which magnetic atoms interact controllably is full of potential for the study of complex or strongly correlated systems. Here, we create spin chains in which a strongly correlated Kondo state emerges from magnetic coupling of transition-metal atoms. We build chains up to ten atoms in length by placing Fe and Mn atoms on a Cu₂N surface with a scanning tunneling microscope. The atoms couple antiferromagnetically via superexchange interaction through the nitrogen atom network of the surface. The emergent Kondo resonance is spatially distributed along the chain. Its strength can be controlled by mixing atoms of different transition metal elements and manipulating their spatial distribution. We show that the Kondo screening of the full chain by the electrons of the nonmagnetic substrate depends on the interatomic entanglement of the spins in the chain, demonstrating the prerequisites to build and probe spatially extended strongly correlated nanostructures.

  15. Toward an operational tool to simulate green roof hydrological impact at the basin scale: a new version of the distributed rainfall-runoff model Multi-Hydro.

    PubMed

    Versini, Pierre-Antoine; Gires, Auguste; Tchinguirinskaia, Ioulia; Schertzer, Daniel

    2016-10-01

    Currently widespread in new urban projects, green roofs have shown a positive impact on urban runoff at the building scale: a decrease and slow-down of the peak discharge, and a decrease of runoff volume. The present work aims to study their possible impact at the catchment scale, which is more compatible with stormwater management issues. For this purpose, a specific module dedicated to simulating the hydrological behaviour of a green roof has been developed in the distributed rainfall-runoff model (Multi-Hydro). It has been applied to a French urban catchment where most of the building roofs are flat and assumed to accept the implementation of a green roof. Catchment responses to several rainfall events covering a wide range of meteorological situations have been simulated. The simulation results show that green roofs can significantly reduce runoff volume and the magnitude of peak discharge (up to 80%), depending on the rainfall event and the initial saturation of the substrate. Additional tests have been made to assess the sensitivity of this response to the spatial distributions of both green roofs and precipitation. It appears that the total area of greened roofs is more important than their locations. On the other hand, peak discharge reduction seems to be clearly dependent on the spatial distribution of precipitation.

  16. Global distribution of urban parameters derived from high-resolution global datasets for weather modelling

    NASA Astrophysics Data System (ADS)

    Kawano, N.; Varquez, A. C. G.; Dong, Y.; Kanda, M.

    2016-12-01

    Numerical models such as the Weather Research and Forecasting model coupled with a single-layer Urban Canopy Model (WRF-UCM) are powerful tools to investigate the urban heat island. Urban parameters such as average building height (Have), plan area index (λp) and frontal area index (λf) are necessary inputs for the model. In general, these parameters are assumed to be uniform in WRF-UCM, but this leads to an unrealistic urban representation. Distributed urban parameters can also be incorporated into WRF-UCM to consider urban effects in detail. The problem is that distributed building information is not readily available for most megacities, especially in developing countries. Furthermore, acquiring real building parameters often requires a huge amount of time and money. In this study, we investigated the potential of using globally available satellite-captured datasets for the estimation of the parameters Have, λp, and λf. The global datasets comprised a high-spatial-resolution population dataset (LandScan by Oak Ridge National Laboratory), nighttime lights (NOAA), and vegetation fraction (NASA). True samples of Have, λp, and λf were acquired from actual building footprints from satellite images and the 3D building databases of Tokyo, New York, Paris, Melbourne, Istanbul, Jakarta and so on. Regression equations were then derived from the block-averaging of spatial pairs of real parameters and global datasets. Results show that two regression curves to estimate Have and λf from the combination of population and nightlight are necessary, depending on the city's level of development. An index which can be used to decide which equation to use for a city is the Gross Domestic Product (GDP). On the other hand, λp has less dependence on GDP but indicated a negative relationship to vegetation fraction. Finally, a simplified but precise approximation of urban parameters through readily available, high-resolution global datasets and our derived regressions can be utilized to estimate a global distribution of urban parameters for later incorporation into a weather model, thus allowing us to acquire a global understanding of urban climate (Global Urban Climatology). Acknowledgment: This research was supported by the Environment Research and Technology Development Fund (S-14) of the Ministry of the Environment, Japan.
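
    The regression step described above can be sketched as an ordinary least-squares fit of a block-averaged building parameter against population and nightlight predictors. The arrays below are synthetic stand-ins for the block-averaged pairs, not the study's data, and the coefficients have no physical meaning.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic block-averaged predictors and a synthetic "true" building height.
pop = rng.uniform(1e3, 3e4, 200)      # residents per km^2 (synthetic)
light = rng.uniform(5, 63, 200)       # nightlight digital number (synthetic)
h_ave = 2.0 + 0.0004 * pop + 0.15 * light + rng.normal(0.0, 2.0, 200)

# Ordinary least squares: H_ave ~ b0 + b1 * pop + b2 * light
X = np.column_stack([np.ones_like(pop), pop, light])
coef, *_ = np.linalg.lstsq(X, h_ave, rcond=None)
print("intercept, population coefficient, nightlight coefficient:", np.round(coef, 4))
```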

  17. Solar potential scaling and the urban road network topology

    NASA Astrophysics Data System (ADS)

    Najem, Sara

    2017-01-01

    We explore the scaling of cities' solar potentials with their number of buildings and reveal a latent dependence between the solar potential and the length of the corresponding city's road network. This scaling is shown to be valid at the grid and block levels and is attributed to a common street length distribution. Additionally, we compute the buildings' solar potential correlation function and length in order to determine the set of critical exponents typifying the urban solar potential universality class.

  18. Exploring similarities among many species distributions

    USGS Publications Warehouse

    Simmerman, Scott; Wang, Jingyuan; Osborne, James; Shook, Kimberly; Huang, Jian; Godsoe, William; Simons, Theodore R.

    2012-01-01

    Collecting species presence data and then building models to predict species distribution has been long practiced in the field of ecology for the purpose of improving our understanding of species relationships with each other and with the environment. Due to limitations of computing power as well as limited means of using modeling software on HPC facilities, past species distribution studies have been unable to fully explore diverse data sets. We build a system that can, for the first time to our knowledge, leverage HPC to support effective exploration of species similarities in distribution as well as their dependencies on common environmental conditions. Our system can also compute and reveal uncertainties in the modeling results enabling domain experts to make informed judgments about the data. Our work was motivated by and centered around data collection efforts within the Great Smoky Mountains National Park that date back to the 1940s. Our findings present new research opportunities in ecology and produce actionable field-work items for biodiversity management personnel to include in their planning of daily management activities.

  19. Hot wet spots of Swiss buildings - detecting clusters of flood exposure

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth

    2016-04-01

    Where are the hotspots of flood exposure in Switzerland? There is no single answer but rather a wide range of findings depending on the databases and methods used. In principle, the analysis of flood exposure is the overlay of two spatial datasets, one on flood hazard and one on assets, e.g. buildings. The presented study aims to test a newly developed approach which is based on publicly available Swiss data. On the hazard side, these are two different types of flood hazard maps, each representing a similar return period beyond the dimensioning of structural protection systems. When it comes to assets, we use nationwide harmonized data on buildings, namely a complete dataset of building polygons to which we assign features such as volume, residents and monetary value. For the latter we apply findings of multivariate analyses of insurance data. By overlaying building polygons with the flood hazard map we identify the exposed buildings. We analyse the resulting spatial distribution of flood exposure at different levels of scale (local to regional) using administrative units (e.g. municipalities) but also artificial grids with a corresponding size (e.g. 5 000 m). The presentation focuses on the identification of hotspots, highlighting the influence of the applied data and methods, e.g. local scan statistics testing intensities within and without potential clusters, or log relative exposure surfaces based on kernel intensity estimates. We find a major difference between hotspots identified from absolute values and from normalized values of exposure. Whereas the hotspots of flood exposure in absolute figures mirror the underlying distribution of buildings, the hotspots of flood exposure ratios show very different pictures. We conclude that findings on flood exposure vary depending on the data and, moreover, the methods used, and therefore need to be communicated carefully and appropriately to the different stakeholders who may use the information for decision making on flood risk management.
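
    The overlay step described above (building polygons against a flood hazard layer, then aggregation per administrative unit) might look roughly like the following geopandas sketch; the file names and the 'name' column are hypothetical, and the study's scan statistics and kernel intensity estimates are not reproduced here.

```python
import geopandas as gpd

# Hypothetical input layers: building polygons, a flood hazard layer for one
# return period, and municipality polygons with a 'name' attribute.
buildings = gpd.read_file("buildings.gpkg")
hazard = gpd.read_file("flood_hazard.gpkg")
municipalities = gpd.read_file("municipalities.gpkg")

# Buildings intersecting the hazard footprint are flagged as exposed.
exposed = buildings[buildings.geometry.intersects(hazard.unary_union)]

# Aggregate exposed-building counts per municipality (absolute exposure);
# a normalized ratio would divide by the total building count per unit.
per_unit = gpd.sjoin(exposed, municipalities[["name", "geometry"]], predicate="within")
print(per_unit.groupby("name").size().sort_values(ascending=False).head(10))
```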

  20. Statistical distribution of building lot frontage: application for Tokyo downtown districts

    NASA Astrophysics Data System (ADS)

    Usui, Hiroyuki

    2018-03-01

    The frontage of a building lot is a determinant factor of the residential environment. The statistical distribution of building lot frontages shows how the perimeters of urban blocks are shared by building lots for a given density of buildings and roads. For practitioners in urban planning, this is indispensable for identifying potential districts which comprise a high percentage of building lots with narrow frontage after subdivision, and for reconsidering the appropriate criteria for the density of buildings and roads as residential environment indices. In the literature, however, the statistical distribution of building lot frontages and the density of buildings and roads have not been fully researched. In this paper, based on an empirical study of the downtown districts of Tokyo, it is found that (1) a log-normal distribution fits the observed distribution of building lot frontages better than a gamma distribution, which is the model of the size distribution of Poisson Voronoi cells on closed curves; (2) the distribution of building lot frontages statistically follows a log-normal distribution whose parameters are the gross building density, road density, average road width, the coefficient of variation of building lot frontage, and the ratio of the number of building lot frontages to the number of buildings; and (3) the values of the coefficient of variation of building lot frontages and of the ratio of the number of building lot frontages to the number of buildings are approximately equal to 0.60 and 1.19, respectively.
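
    The model comparison in finding (1) can be illustrated by fitting both candidate distributions to a frontage sample and comparing a goodness-of-fit statistic; the sample below is synthetic (log-normally generated), not the Tokyo data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic building lot frontages in metres (stand-in for observed data).
frontage = rng.lognormal(mean=np.log(5.0), sigma=0.6, size=5000)

lognorm_params = stats.lognorm.fit(frontage, floc=0)
gamma_params = stats.gamma.fit(frontage, floc=0)

# Kolmogorov-Smirnov distance as a simple goodness-of-fit yardstick.
ks_lognorm = stats.kstest(frontage, "lognorm", args=lognorm_params).statistic
ks_gamma = stats.kstest(frontage, "gamma", args=gamma_params).statistic
print(f"KS distance: log-normal {ks_lognorm:.4f}, gamma {ks_gamma:.4f}")
print(f"coefficient of variation: {frontage.std() / frontage.mean():.2f}")
```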

  1. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated according to the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can then be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
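
    The probability-times-consequence structure described above can be reduced to a minimal Monte Carlo sketch: each event-tree scenario gets an occurrence probability, and its consequence is approximated by the probability that the sampled evacuation time exceeds the sampled onset of untenable conditions. All probabilities and distributions below are hypothetical placeholders, not values from the case study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical scenarios: (occurrence probability, onset-time mean / std in s).
scenarios = {
    "suppression works": (0.90, 900.0, 120.0),
    "suppression fails": (0.10, 420.0, 90.0),
}

# Hypothetical evacuation time = pre-movement (log-normal) + movement (normal).
pre_movement = rng.lognormal(np.log(120.0), 0.5, n)
movement = rng.normal(180.0, 30.0, n)
evacuation = pre_movement + movement

risk = 0.0
for name, (p_occ, onset_mu, onset_sd) in scenarios.items():
    onset = rng.normal(onset_mu, onset_sd, n)
    p_exceed = np.mean(evacuation > onset)        # consequence measure
    print(f"{name}: P(evacuation time > onset of untenable conditions) = {p_exceed:.3f}")
    risk += p_occ * p_exceed
print(f"aggregate life-safety risk index: {risk:.4f}")
```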

  2. New frontier, new power: the retail environment in Australia's dark market.

    PubMed

    Carter, S M

    2003-12-01

    To investigate the role of the retail environment in cigarette marketing in Australia, one of the "darkest" markets in the world. Analysis of 172 tobacco industry documents, and of articles and advertisements found by hand searching Australia's three leading retail trade journals. As Australian cigarette marketing was increasingly restricted, the retail environment became the primary communication vehicle for building cigarette brands. When retail marketing was restricted, the industry conceded only incrementally and under duress, and at times continues to break the law. The tobacco industry targets retailers via trade promotional expenditure, financial and practical assistance with point of sale marketing, alliance building, brand advertising, and distribution. Cigarette brand advertising in retail magazines is designed to build brand identities. Philip Morris and British American Tobacco are now competing to control distribution of all products to retailers, placing themselves at the heart of retail business. Cigarette companies prize retail marketing in Australia's dark market. Stringent point of sale marketing restrictions should be included in any comprehensive tobacco control measures. Relationships between retailers and the industry will be more difficult to regulate. Retail press advertising and trade promotional expenditure could be banned. In-store marketing assistance, retail-tobacco industry alliance building, and new electronic retail distribution systems may be less amenable to regulation. Alliances between the health and retail sectors and financial support for a move away from retail dependence on tobacco may be necessary to effect cultural change.

  3. Downscaling NASA Climatological Data to Produce Detailed Climate Zone Maps

    NASA Technical Reports Server (NTRS)

    Chandler, William S.; Hoell, James M.; Westberg, David J.; Whitlock, Charles H.; Zhang, Taiping; Stackhouse, P. W.

    2011-01-01

    The design of energy efficient sustainable buildings is heavily dependent on accurate long-term and near real-time local weather data. To varying degrees the current meteorological networks over the globe have been used to provide these data albeit often from sites far removed from the desired location. The national need is for access to weather and solar resource data accurate enough to use to develop preliminary building designs within a short proposal time limit, usually within 60 days. The NASA Prediction Of Worldwide Energy Resource (POWER) project was established by NASA to provide industry friendly access to globally distributed solar and meteorological data. As a result, the POWER web site (power.larc.nasa.gov) now provides global information on many renewable energy parameters and several buildings-related items but at a relatively coarse resolution. This paper describes a method of downscaling NASA atmospheric assimilation model results to higher resolution and maps those parameters to produce building climate zone maps using estimates of temperature and precipitation. The distribution of climate zones for North America with an emphasis on the Pacific Northwest for just one year shows very good correspondence to the currently defined distribution. The method has the potential to provide a consistent procedure for deriving climate zone information on a global basis that can be assessed for variability and updated more regularly.
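
    One concrete step in mapping downscaled temperatures to building climate zones is the computation of heating and cooling degree days, on which the zone definitions are based. The sketch below uses a synthetic year of daily mean temperatures and a base of 18 °C (a common convention); the actual zone thresholds are defined in the ASHRAE/IECC tables and are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
daily_mean_c = rng.normal(11.0, 8.0, 365)   # synthetic year of daily means, deg C
BASE_C = 18.0                               # common degree-day base temperature

hdd = np.sum(np.clip(BASE_C - daily_mean_c, 0.0, None))   # heating degree days
cdd = np.sum(np.clip(daily_mean_c - BASE_C, 0.0, None))   # cooling degree days
print(f"HDD18 = {hdd:.0f} degree-days, CDD18 = {cdd:.0f} degree-days")
```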

  4. Role of City Texture in Urban Heat Islands at Nighttime

    NASA Astrophysics Data System (ADS)

    Sobstyl, J. M.; Emig, T.; Qomi, M. J. Abdolhosseini; Ulm, F.-J.; Pellenq, R. J.-M.

    2018-03-01

    An urban heat island (UHI) is a climate phenomenon that results in an increased air temperature in cities when compared to their rural surroundings. In this Letter, the dependence of a UHI on urban geometry is studied. Multiyear urban-rural temperature differences and building footprint data, combined with a heat radiation scaling model, are used to demonstrate for more than 50 cities worldwide that city texture, measured by a building distribution function and the sky view factor, explains city-to-city variations in nocturnal UHIs. Our results show a strong correlation between nocturnal UHIs and the city texture.

  5. Role of City Texture in Urban Heat Islands at Nighttime.

    PubMed

    Sobstyl, J M; Emig, T; Qomi, M J Abdolhosseini; Ulm, F-J; Pellenq, R J-M

    2018-03-09

    An urban heat island (UHI) is a climate phenomenon that results in an increased air temperature in cities when compared to their rural surroundings. In this Letter, the dependence of a UHI on urban geometry is studied. Multiyear urban-rural temperature differences and building footprint data, combined with a heat radiation scaling model, are used to demonstrate for more than 50 cities worldwide that city texture, measured by a building distribution function and the sky view factor, explains city-to-city variations in nocturnal UHIs. Our results show a strong correlation between nocturnal UHIs and the city texture.

  6. Morphological plasticity of the depth generalist coral, Montastraea cavernosa, on mesophotic reefs in Bermuda.

    PubMed

    Goodbody-Gringley, Gretchen; Waletich, Justin

    2018-04-02

    Scleractinian corals have global ecological, structural, social, and economic importance that is disproportionately large relative to their areal extent. These reef building corals form the architectural framework for shallow water tropical reef systems, supporting the most productive and biologically diverse marine ecosystems on Earth (Veron, 1995). Reef-building scleractinian species are dependent on photosynthetic products supplied by symbiotic zooxanthellae of the genus Symbiodinium, restricting their distribution to the photic zone (Stambler, 2011).

  7. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE PAGES

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei; ...

    2017-06-12

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.
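
    The intuition behind the comparison above can be shown with a back-of-envelope sketch: serve a DC load from on-site PV either through an AC distribution path (inverter, AC wiring, rectifier at the load) or a DC path (two DC/DC stages assumed), and compare end-to-end efficiency. The converter efficiencies are hypothetical round numbers, not values from the Modelica models.

```python
# Hypothetical converter and wiring efficiencies (illustrative only).
PV_INVERTER = 0.96        # PV DC -> AC
LOAD_RECTIFIER = 0.94     # AC -> DC at the load
DC_DC_CONVERTER = 0.97    # one DC/DC conversion stage
AC_WIRING = 0.99
DC_WIRING = 0.99

ac_path = PV_INVERTER * AC_WIRING * LOAD_RECTIFIER
dc_path = DC_DC_CONVERTER * DC_WIRING * DC_DC_CONVERTER   # two DC/DC stages assumed
print(f"AC path efficiency: {ac_path:.3f}")
print(f"DC path efficiency: {dc_path:.3f}")
print(f"relative saving:    {(dc_path - ac_path) / ac_path:.1%}")
```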

  8. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.

  9. New frontier, new power: the retail environment in Australia's dark market

    PubMed Central

    Carter, S

    2003-01-01

    Objective: To investigate the role of the retail environment in cigarette marketing in Australia, one of the "darkest" markets in the world. Design: Analysis of 172 tobacco industry documents, and of articles and advertisements found by hand searching Australia's three leading retail trade journals. Results: As Australian cigarette marketing was increasingly restricted, the retail environment became the primary communication vehicle for building cigarette brands. When retail marketing was restricted, the industry conceded only incrementally and under duress, and at times continues to break the law. The tobacco industry targets retailers via trade promotional expenditure, financial and practical assistance with point of sale marketing, alliance building, brand advertising, and distribution. Cigarette brand advertising in retail magazines is designed to build brand identities. Philip Morris and British American Tobacco are now competing to control distribution of all products to retailers, placing themselves at the heart of retail business. Conclusions: Cigarette companies prize retail marketing in Australia's dark market. Stringent point of sale marketing restrictions should be included in any comprehensive tobacco control measures. Relationships between retailers and the industry will be more difficult to regulate. Retail press advertising and trade promotional expenditure could be banned. In-store marketing assistance, retail–tobacco industry alliance building, and new electronic retail distribution systems may be less amenable to regulation. Alliances between the health and retail sectors and financial support for a move away from retail dependence on tobacco may be necessary to effect cultural change. PMID:14645954

  10. Loading estimates of lead, copper, cadmium, and zinc in urban runoff from specific sources.

    PubMed

    Davis, A P; Shokouhian, M; Ni, S

    2001-08-01

    Urban stormwater runoff is being recognized as a substantial source of pollutants to receiving waters. A number of investigators have found significant levels of metals in runoff from urban areas, especially in highway runoff. As an initiatory study, this work estimates lead, copper, cadmium, and zinc loadings from various sources in a developed area utilizing information available in the literature, in conjunction with controlled experimental and sampling investigations. Specific sources examined include building siding and roofs; automobile brakes, tires, and oil leakage; and wet and dry atmospheric deposition. Important sources identified are building siding for all four metals, vehicle brake emissions for copper and tire wear for zinc. Atmospheric deposition is an important source for cadmium, copper, and lead. Loadings and source distributions depend on building and automobile density assumptions and the type of materials present in the area examined. Identified important sources are targeted for future comprehensive mechanistic studies. Improved information on the metal release and distributions from the specific sources, along with detailed characterization of watershed areas will allow refinements in the predictions.

  11. Indirect estimation of signal-dependent noise with nonadaptive heterogeneous samples.

    PubMed

    Azzari, Lucio; Foi, Alessandro

    2014-08-01

    We consider the estimation of signal-dependent noise from a single image. Unlike conventional algorithms that build a scatterplot of local mean-variance pairs from either small or adaptively selected homogeneous data samples, our proposed approach relies on arbitrarily large patches of heterogeneous data extracted at random from the image. We demonstrate the feasibility of our approach through an extensive theoretical analysis based on mixture of Gaussian distributions. A prototype algorithm is also developed in order to validate the approach on simulated data as well as on real camera raw images.
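
    For contrast with the method above, the conventional baseline it improves upon (a scatterplot of local mean-variance pairs fitted with a linear signal-dependent noise model, var = a·mean + b) can be sketched as follows on a synthetic smooth image; this is not the paper's heterogeneous-patch, mixture-of-Gaussians algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic smooth image with signal-dependent (Poissonian-Gaussian-like) noise.
a_true, b_true = 0.01, 0.0004
clean = np.tile(np.linspace(0.05, 0.95, 512), (512, 1))
noisy = clean + rng.normal(0.0, np.sqrt(a_true * clean + b_true))

# Conventional approach: local mean-variance pairs from small blocks.
patch = 8
means, variances = [], []
for i in range(0, 512, patch):
    for j in range(0, 512, patch):
        block = noisy[i:i + patch, j:j + patch]
        means.append(block.mean())
        variances.append(block.var(ddof=1))

# Fit var = a * mean + b by least squares.
a_hat, b_hat = np.polyfit(means, variances, 1)
print(f"estimated a = {a_hat:.4f} (true {a_true}), b = {b_hat:.5f} (true {b_true})")
```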

  12. An authentication infrastructure for today and tomorrow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.

    1996-06-01

    The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter and intra organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.

  13. Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loper, Susan A.; Sandusky, William F.

    2010-12-31

    Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to distribution of that total by agency and maybe distribution of the total by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock has changed for the Department of Energy from 2000 through 2008.

  14. Research on Spatial and Temporal Distribution of Color Steel Building Based on Multi-Source High-Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Yang, S. W.; Ma, J. J.; Wang, J. M.

    2018-04-01

    As representative vulnerable regions of the city, dense distribution areas of temporary color steel buildings are a major target for the control of fire risks, illegal buildings, environmental supervision, urbanization quality, and enhancement of the city's image. In the domestic and foreign literature, the related research mainly focuses on fire risks and violation monitoring. However, owing to the special characteristics of temporary color steel buildings, research on their temporal and spatial distribution and their influence on urban spatial form has not been reported. Therefore, firstly, this paper extracts information on large-scale color steel buildings from high-resolution images. Secondly, the color steel plate buildings were classified, and the spatial and temporal distribution and aggregation characteristics of small (temporary buildings) and large (factory buildings, warehouses, etc.) buildings were studied respectively. Thirdly, the coupling relationship between the spatial distribution of color steel plate buildings and the spatial pattern of urban space was analysed. The results show that there is a good coupling relationship between color steel plate buildings and urban spatial form. Different types of color steel plate buildings reflect the regional differentiation of urban space and the phased pattern of urban development.

  15. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.

  16. Effects of resource-dependent cannibalism on population size distribution and individual life history in a case-bearing caddisfly

    PubMed Central

    Okuda, Noboru

    2018-01-01

    Resource availability often determines the intensity of cannibalism, which has a considerable effect on population size distribution and individual life history. Larvae of the caddisfly Psilotreta kisoensis build portable cases from sedimentary sands and often display cannibalism. For this species, the availability of preferable case material is a critical factor that affects larval fitness, and material is locally variable depending on the underlying geology. In this study, we investigated how sand quality as a case material determines cannibalism frequency among larvae and, in turn, how the differential cannibalism frequency affects the body-size distribution and voltinism. Rearing experiments within a cohort revealed that a bimodal size distribution developed regardless of material quality. However, as the preferable material became abundant, the proportion of larger to smaller individuals increased. Consecutive experiments suggested that smaller larvae were more frequently cannibalized by larger ones and excluded from the population when preferable smooth material was abundant. This frequent cannibalism resulted in a bimodal size distribution with a significantly higher proportion of larger compared to smaller individuals. The size-dependent cannibalism was significantly suppressed when the larvae were raised in an environment with a scarcity of the preferable case material. This is probably because larvae cannot enjoy the benefit of rapid growth by cannibalism due to the difficulties in enlarging their case. At low cannibalism the growth of smaller individuals was stunted, and this was probably due to risk of cannibalism by larger individuals. This growth reduction in small individuals led to a bimodal size-distribution but with a lower proportion of larger to smaller individuals compared to at high cannibalism. A field study in two streams showed a similar size distribution of larvae as was found in the rearing experiment. The bimodal ratio has consequences for life history, since a size-bimodal population causes a cohort splitting: only larvae that were fully grown at 1 year had a univoltine life cycle, whereas larvae with a stunted growth continued their larval life for another year (semivoltine). This study suggests that availability of preferable case building material is an important factor that affects cannibalism, which in turn affects larval population size structure and cohort splitting. PMID:29466375

  17. Effects of resource-dependent cannibalism on population size distribution and individual life history in a case-bearing caddisfly.

    PubMed

    Okano, Jun-Ichi; Okuda, Noboru

    2018-01-01

    Resource availability often determines the intensity of cannibalism, which has a considerable effect on population size distribution and individual life history. Larvae of the caddisfly Psilotreta kisoensis build portable cases from sedimentary sands and often display cannibalism. For this species, the availability of preferable case material is a critical factor that affects larval fitness, and material is locally variable depending on the underlying geology. In this study, we investigated how sand quality as a case material determines cannibalism frequency among larvae and, in turn, how the differential cannibalism frequency affects the body-size distribution and voltinism. Rearing experiments within a cohort revealed that a bimodal size distribution developed regardless of material quality. However, as the preferable material became abundant, the proportion of larger to smaller individuals increased. Consecutive experiments suggested that smaller larvae were more frequently cannibalized by larger ones and excluded from the population when preferable smooth material was abundant. This frequent cannibalism resulted in a bimodal size distribution with a significantly higher proportion of larger compared to smaller individuals. The size-dependent cannibalism was significantly suppressed when the larvae were raised in an environment with a scarcity of the preferable case material. This is probably because larvae cannot enjoy the benefit of rapid growth by cannibalism due to the difficulties in enlarging their case. At low cannibalism the growth of smaller individuals was stunted, and this was probably due to risk of cannibalism by larger individuals. This growth reduction in small individuals led to a bimodal size-distribution but with a lower proportion of larger to smaller individuals compared to at high cannibalism. A field study in two streams showed a similar size distribution of larvae as was found in the rearing experiment. The bimodal ratio has consequences for life history, since a size-bimodal population causes a cohort splitting: only larvae that were fully grown at 1 year had a univoltine life cycle, whereas larvae with a stunted growth continued their larval life for another year (semivoltine). This study suggests that availability of preferable case building material is an important factor that affects cannibalism, which in turn affects larval population size structure and cohort splitting.

  18. Computational models of O-LM cells are recruited by low or high theta frequency inputs depending on h-channel distributions

    PubMed Central

    Sekulić, Vladislav; Skinner, Frances K

    2017-01-01

    Although biophysical details of inhibitory neurons are becoming known, it is challenging to map these details onto function. Oriens-lacunosum/moleculare (O-LM) cells are inhibitory cells in the hippocampus that gate information flow, firing while phase-locked to theta rhythms. We build on our existing computational model database of O-LM cells to link model with function. We place our models in high-conductance states and modulate inhibitory inputs at a wide range of frequencies. We find preferred spiking recruitment of models at high (4–9 Hz) or low (2–5 Hz) theta depending on, respectively, the presence or absence of h-channels on their dendrites. This also depends on slow delayed-rectifier potassium channels, and preferred theta ranges shift when h-channels are potentiated by cyclic AMP. Our results suggest that O-LM cells can be differentially recruited by frequency-modulated inputs depending on specific channel types and distributions. This work exposes a strategy for understanding how biophysical characteristics contribute to function. DOI: http://dx.doi.org/10.7554/eLife.22962.001 PMID:28318488

  19. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    NASA Astrophysics Data System (ADS)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence power grid investment capacity (depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry), an investment capacity analysis model is built. Kolmogorov-Smirnov tests are then carried out to obtain the probability distribution of each influence factor. Finally, the uncertainty of the grid investment capacity is quantified by Monte Carlo simulation.
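
    A minimal sketch of the workflow described above (fit distributions to the influence factors, check them with a Kolmogorov-Smirnov test, and propagate samples through a capacity model by Monte Carlo), written in Python. The factor values, candidate distributions, and linear capacity model are illustrative assumptions, not the paper's data or regression.

```python
# Illustrative sketch (not the paper's model or data): fit candidate
# distributions to each influence factor, check the fit with a K-S test,
# and propagate samples through a capacity model by Monte Carlo.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical historical samples of three influence factors.
history = {
    "depreciation": rng.normal(12.0, 1.5, 30),
    "net_profit":   rng.normal(8.0, 2.0, 30),
    "financing":    rng.normal(20.0, 4.0, 30),
}

# For each factor, keep the candidate distribution with the smallest K-S statistic.
candidates = [stats.norm, stats.lognorm]
fitted = {}
for name, sample in history.items():
    best = min(
        ((stats.kstest(sample, dist.name, args=dist.fit(sample)).statistic,
          dist, dist.fit(sample)) for dist in candidates),
        key=lambda t: t[0])
    fitted[name] = (best[1], best[2])

# Hypothetical linear capacity model (coefficients are placeholders).
def capacity(profit, financing, depreciation):
    return 0.6 * profit + 0.3 * financing - 0.2 * depreciation

n = 100_000
draws = {k: dist.rvs(*params, size=n, random_state=rng)
         for k, (dist, params) in fitted.items()}
cap = capacity(draws["net_profit"], draws["financing"], draws["depreciation"])
print("mean %.2f, 5-95%% interval [%.2f, %.2f]"
      % (cap.mean(), *np.percentile(cap, [5, 95])))
```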

  20. Colloidal polymers with controlled sequence and branching constructed from magnetic field assembled nanoparticles.

    PubMed

    Bannwarth, Markus B; Utech, Stefanie; Ebert, Sandro; Weitz, David A; Crespy, Daniel; Landfester, Katharina

    2015-03-24

    The assembly of nanoparticles into polymer-like architectures is challenging and usually requires highly defined colloidal building blocks. Here, we show that the broad size-distribution of a simple dispersion of magnetic nanocolloids can be exploited to obtain various polymer-like architectures. The particles are assembled under an external magnetic field and permanently linked by thermal sintering. The remarkable variety of polymer-analogue architectures that arises from this simple process ranges from statistical and block copolymer-like sequencing to branched chains and networks. This library of architectures can be realized by controlling the sequencing of the particles and the junction points via a size-dependent self-assembly of the single building blocks.

  1. Local geology controlled the feasibility of vitrifying Iron Age buildings.

    PubMed

    Wadsworth, Fabian B; Heap, Michael J; Damby, David E; Hess, Kai-Uwe; Najorka, Jens; Vasseur, Jérémie; Fahrner, Dominik; Dingwell, Donald B

    2017-01-12

    During European prehistory, hilltop enclosures made from polydisperse particle-and-block stone walling were exposed to temperatures sufficient to partially melt the constituent stonework, leading to the preservation of glassy walls called 'vitrified forts'. During vitrification, the granular wall rocks partially melt, sinter viscously and densify, reducing inter-particle porosity. This process is strongly dependent on the solidus temperature, the particle sizes, the temperature-dependence of the viscosity of the evolving liquid phase, as well as the distribution and longevity of heat. Examination of the sintering behaviour of 45 European examples reveals that it is the raw building material that governs the vitrification efficiency. As Iron Age forts were commonly constructed from local stone, we conclude that local geology directly influenced the degree to which buildings were vitrified in the Iron Age. Additionally, we find that vitrification is accompanied by a bulk material strengthening of the aggregates of small sizes, and a partial weakening of larger blocks. We discuss these findings in the context of the debate surrounding the motive of the wall-builders. We conclude that if wall stability by bulk strengthening was the desired effect, then vitrification represents an Iron Age technology that failed to be effective in regions of refractory local geology.

  2. Implementation and characterization of a stable optical frequency distribution system.

    PubMed

    Bernhardt, Birgitta; Hänsch, Theodor W; Holzwarth, Ronald

    2009-09-14

    An optical frequency distribution system has been developed that continuously delivers a stable optical frequency of 268 THz (corresponding to a wavelength of 1118 nm) to different experiments in our institute. For that purpose, a continuous wave (cw) fiber laser has been stabilized onto a frequency comb and distributed across the building by the use of a fiber network. While the light propagates through the fiber, acoustic and thermal effects degrade the stability and accuracy of the system. However, by employing proper stabilization methods a stability of 2 x 10(-13) tau(-1/2) is achieved, limited by the available radio frequency (RF) reference. Furthermore, the issue of counter-dependent results of the Allan deviation was examined during the data evaluation.
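
    The stability figure quoted above is an Allan deviation scaling as tau(-1/2). The Python sketch below shows how such a characterization is typically computed from fractional-frequency data; the gate time and white-noise level are illustrative assumptions, not the authors' measurement records.

```python
# Minimal sketch: non-overlapping Allan deviation of fractional-frequency data;
# the sampling time and white-noise level below are illustrative only.
import numpy as np

def allan_deviation(y, m):
    """Allan deviation at an averaging time of m samples."""
    n_blocks = len(y) // m
    block_means = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(block_means) ** 2))

rng = np.random.default_rng(1)
tau0 = 1.0                                   # assumed 1 s counter gate time
y = 2e-13 * rng.standard_normal(100_000)     # illustrative white frequency noise
for m in (1, 10, 100, 1000):
    print(f"tau = {m * tau0:6.0f} s   sigma_y = {allan_deviation(y, m):.2e}")
```

    For white frequency noise the printed values fall off as tau(-1/2), the scaling quoted in the abstract.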

  3. Estimation of the Relationship Between Remotely Sensed Anthropogenic Heat Discharge and Building Energy Use

    NASA Technical Reports Server (NTRS)

    Zhou, Yuyu; Weng, Qihao; Gurney, Kevin R.; Shuai, Yanmin; Hu, Xuefei

    2012-01-01

    This paper examined the relationship between remotely sensed anthropogenic heat discharge and energy use from residential and commercial buildings across multiple scales in the city of Indianapolis, Indiana, USA. The anthropogenic heat discharge was estimated with a remote sensing-based surface energy balance model, which was parameterized using land cover, land surface temperature, albedo, and meteorological data. The building energy use was estimated using a GIS-based building energy simulation model in conjunction with Department of Energy/Energy Information Administration survey data, the Assessor's parcel data, GIS floor areas data, and remote sensing-derived building height data. The spatial patterns of anthropogenic heat discharge and energy use from residential and commercial buildings were analyzed and compared. Quantitative relationships were evaluated across multiple scales from pixel aggregation to census block. The results indicate that anthropogenic heat discharge is consistent with building energy use in terms of the spatial pattern, and that building energy use accounts for a significant fraction of anthropogenic heat discharge. The research also implies that the relationship between anthropogenic heat discharge and building energy use is scale-dependent. The simultaneous estimation of anthropogenic heat discharge and building energy use via two independent methods improves the understanding of the surface energy balance in an urban landscape. The anthropogenic heat discharge derived from remote sensing and meteorological data may be able to serve as a spatial distribution proxy for spatially-resolved building energy use, and even for fossil-fuel CO2 emissions if additional factors are considered.

  4. Driven translocation of Polymer through a nanopore: effect of heterogeneous flexibility

    NASA Astrophysics Data System (ADS)

    Adhikari, Ramesh; Bhattacharya, Aniket

    2014-03-01

    We have studied translocation of a model bead-spring polymer through a nanopore whose building blocks consist of alternate stiff and flexible segments and variable elastic bond potentials. For the case of uniform spring potential translocation of a symmetric periodic stiff-flexible chain of contour length N and segment length m (mod(N,2m)=0), we find that the end-to-end distance and the mean first passage time (MFPT) have weak dependence on the length m. The characteristic periodic pattern of the waiting time distribution captures the stiff and flexible segments of the chain with stiff segments taking longer time to translocate. But when we vary both the elastic bond energy, and the bending energy, as well as the length of stiff/flexible segments, we discover novel patterns in the waiting time distribution which brings out structural information of the building blocks of the translocating chain. Partially supported by UCF Office of Research and Commercialization & College of Science SEED grant.

  5. Design and Implementation of Geothermal Energy Systems at West Chester University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, James

    West Chester University has launched a comprehensive transformation of its campus heating and cooling systems from traditional fossil fuels to geothermal. This change will significantly decrease the institution's carbon footprint and serve as a national model for green campus efforts. The institution has designed a phased series of projects to build a district geo-exchange system with shared well fields, central pumping station and distribution piping to provide the geo-exchange water to campus buildings as their internal building HVAC systems are changed to be able to use the geo-exchange water. This project addresses the US Department of Energy Office of Energy Efficiency and Renewable Energy (EERE) goal to invest in clean energy technologies that strengthen the economy, protect the environment, and reduce dependence on foreign oil. In addition, this project advances EERE's efforts to establish geothermal energy as an economically competitive contributor to the US energy supply.

  6. A method for calculating the dose to a multi-storey building due to radiation scattered from the roof of an adjacent radiotherapy facility.

    PubMed

    Zavgorodni, S F

    2001-09-01

    With modern urbanization trends, situations occur where a general-purpose multi-storey building would have to be constructed adjacent to a radiotherapy facility. In cases where the building would not be in the primary x-ray beam, "skyshine" radiation is normally accounted for. The radiation scattered from the roof side-wise towards the building can also be a major contributing factor. However, neither the NCRP reports nor recently published literature considered this. The current paper presents a simple formula to calculate the dose contribution from scattered radiation in such circumstances. This equation includes workload, roof thickness, field size, distance to the reference point and a normalized angular photon distribution function f(theta), where theta is the angle between central axis of the primary beam and photon direction. The latter was calculated by the Monte Carlo method (EGS4 code) for each treatment machine in our department. For angles theta exceeding approximately 20 degrees (i.e., outside the primary beam and its penumbra) the angular distribution function f(theta) was found to have little dependence on the shielding barrier thickness and the beam energy. An analytical approximation of this function has been obtained. Measurements have been performed to verify this calculation technique. An agreement within 40% was found between calculated and measured dose rates. The latter combined the scattered radiation and the dose from "skyshine" radiation. Some overestimation of the dose resulted from uncertainties in the radiotherapy building drawings and in evaluation of the "skyshine" contribution.
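
    The Python sketch below only illustrates how the quantities named above (workload, roof transmission, field size, distance to the reference point, and the angular function f(theta)) could be combined into a scattered-dose estimate. Every functional form and number in it is an assumption for illustration; it is not the formula derived in the paper.

```python
# Purely illustrative: an assumed way of combining the listed quantities.
# Neither the functional forms nor the numbers come from the paper.
import math

def f_theta(theta_deg, mu=0.12):
    """Assumed exponential fall-off of scattered photons with angle (theta > ~20 deg)."""
    return math.exp(-mu * theta_deg)

def scattered_dose(workload, roof_transmission, field_area_cm2, distance_m, theta_deg,
                   scatter_fraction_per_cm2=1e-6):
    # Scatter grows with workload, field area and roof transmission, and falls off
    # with the inverse square of distance and with angle via f(theta).
    return (workload * roof_transmission * field_area_cm2 *
            scatter_fraction_per_cm2 * f_theta(theta_deg) / distance_m ** 2)

# Example point in an adjacent building, 30 degrees off the primary beam axis.
print(scattered_dose(1000.0, 0.05, 40 * 40, 15.0, 30.0))
```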

  7. Scientific approaches to science policy.

    PubMed

    Berg, Jeremy M

    2013-11-01

    The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.

  8. Landslide vulnerability criteria: a case study from Umbria, central Italy.

    PubMed

    Galli, Mirco; Guzzetti, Fausto

    2007-10-01

    Little is known about the vulnerability to landslides, despite landslides causing frequent and widespread damage to the population and the built-up environment in many areas of the world. Lack of information about vulnerability to landslides limits our ability to determine landslide risk. This paper provides information on the vulnerability of buildings and roads to landslides in Umbria, central Italy. Information on 103 landslides of the slide and slide-earth flow types that have resulted in damage to buildings and roads at 90 sites in Umbria is used to establish dependencies between the area of the landslide and the vulnerability to landslides. The dependencies obtained are applied in the hills surrounding the town of Collazzone, in central Umbria, an area for which a detailed landslide inventory map is available. By exploiting the landslide inventory and the established vulnerability curves, the geographical distribution of the vulnerability to landslides is mapped and statistics of the expected damage are calculated. Reliability and limits of the vulnerability thresholds and of the obtained vulnerability assessment are discussed.

  9. Transfer Learning for Class Imbalance Problems with Inadequate Data.

    PubMed

    Al-Stouhi, Samir; Reddy, Chandan K

    2016-07-01

    A fundamental problem in data mining is to effectively build robust classifiers in the presence of skewed data distributions. Class imbalance classifiers are trained specifically for skewed distribution datasets. Existing methods assume an ample supply of training examples as a fundamental prerequisite for constructing an effective classifier. However, when sufficient data is not readily available, the development of a representative classification algorithm becomes even more difficult due to the unequal distribution between classes. We provide a unified framework that will potentially take advantage of auxiliary data using a transfer learning mechanism and simultaneously build a robust classifier to tackle this imbalance issue in the presence of few training samples in a particular target domain of interest. Transfer learning methods use auxiliary data to augment learning when training examples are not sufficient and in this paper we will develop a method that is optimized to simultaneously augment the training data and induce balance into skewed datasets. We propose a novel boosting based instance-transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain to improve classification. We provide theoretical and empirical validation of our method and apply to healthcare and text classification applications.
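
    A much-simplified Python sketch of instance-transfer boosting in the spirit of the approach described above: misclassified auxiliary (source-domain) instances are down-weighted while misclassified target instances are up-weighted. It is not the authors' algorithm and omits their label-dependent imbalance correction; the weak learner and update constants are conventional TrAdaBoost-style choices.

```python
# Simplified instance-transfer boosting sketch (TrAdaBoost-style); labels are
# assumed to be 0/1, and the label-dependent imbalance correction of the paper
# is omitted. This is an illustration, not the authors' algorithm.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def instance_transfer_boost(X_src, y_src, X_tgt, y_tgt, n_rounds=10):
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    n_src = len(y_src)
    w = np.ones(len(y))                                        # instance weights
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n_src) / n_rounds))
    learners, alphas = [], []
    for _ in range(n_rounds):
        w /= w.sum()
        clf = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        miss = (clf.predict(X) != y).astype(float)
        # Weighted error measured on the target domain only.
        err = np.clip(np.dot(w[n_src:], miss[n_src:]) / w[n_src:].sum(), 1e-10, 0.499)
        beta_tgt = err / (1.0 - err)
        w[:n_src] *= beta_src ** miss[:n_src]       # down-weight misclassified source points
        w[n_src:] *= beta_tgt ** (-miss[n_src:])    # up-weight misclassified target points
        learners.append(clf)
        alphas.append(np.log(1.0 / beta_tgt))
    def predict(X_new):
        votes = sum(a * (2 * l.predict(X_new) - 1) for a, l in zip(alphas, learners))
        return (votes > 0).astype(int)
    return predict
```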

  10. Deterioration of building materials in Roman catacombs: the influence of visitors.

    PubMed

    Sanchez-Moral, S; Luque, L; Cuezva, S; Soler, V; Benavente, D; Laiz, L; Gonzalez, J M; Saiz-Jimenez, C

    2005-10-15

    In the last decades, damage to building materials and mural paintings has been observed in Roman catacombs. The damage was due to extensive formation of biofilms induced by artificial illumination and humidity. Microenvironmental data (temperature, CO(2) concentration, humidity, and atmospheric pressure) clearly showed the negative influence of visitors. Increasing heat, light and water vapour condensation in corridors and cubicles favoured biofilm development. The composition of the biofilms differed and depended mainly on distance to illumination sources and humidity, denoting the influence of light on the growth of phototrophic microorganisms in the catacombs. In addition, biofilm distribution was governed by the type of material colonised. This study shows that countermeasures are needed to prevent deterioration of hypogean environments.

  11. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.

  12. Scaling and allometry in the building geometries of Greater London

    NASA Astrophysics Data System (ADS)

    Batty, M.; Carvalho, R.; Hudson-Smith, A.; Milton, R.; Smith, D.; Steadman, P.

    2008-06-01

    Many aggregate distributions of urban activities such as city sizes reveal scaling but hardly any work exists on the properties of spatial distributions within individual cities, notwithstanding considerable knowledge about their fractal structure. We redress this here by examining scaling relationships in a world city using data on the geometric properties of individual buildings. We first summarise how power laws can be used to approximate the size distributions of buildings, in analogy to city-size distributions which have been widely studied as rank-size and lognormal distributions following Zipf [Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, 1949)] and Gibrat [Les Inégalités Économiques (Librairie du Recueil Sirey, Paris, 1931)]. We then extend this analysis to allometric relationships between buildings in terms of their different geometric size properties. We present some preliminary analysis of building heights from the Emporis database which suggests very strong scaling in world cities. The database for Greater London is then introduced from which we extract 3.6 million buildings whose scaling properties we explore. We examine key allometric relationships between these different properties illustrating how building shape changes according to size, and we extend this analysis to the classification of buildings according to land use types. We conclude with an analysis of two-point correlation functions of building geometries which supports our non-spatial analysis of scaling.
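
    As a minimal illustration of the rank-size analysis mentioned above, the Python sketch below fits a power-law exponent to building heights by ordinary least squares in log-log space; the heights are synthetic placeholders, not the Emporis or Greater London data.

```python
# Minimal sketch: rank-size (Zipf-style) fit to building heights by OLS in
# log-log space; the heights are synthetic placeholders, not the London data.
import numpy as np

rng = np.random.default_rng(2)
heights = rng.pareto(a=2.5, size=50_000) + 10.0    # synthetic building heights (m)

ranked = np.sort(heights)[::-1]                    # largest first
ranks = np.arange(1, len(ranked) + 1)

# Fit log(size) = c - alpha * log(rank); alpha near 1 is the classic Zipf case.
slope, intercept = np.polyfit(np.log(ranks), np.log(ranked), 1)
print(f"estimated rank-size exponent: {-slope:.2f}")
```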

  13. STUDIES OF COSMIC-RAY MUONS AND NEUTRONS IN A FIVE-STORY CONCRETE BUILDING.

    PubMed

    Chen, Wei-Lin; Sheu, Rong-Jiun

    2018-05-01

    This study thoroughly determined the flux and dose rate distributions of cosmic-ray muons and neutrons in a five-story concrete building by comparing measurements with Monte Carlo simulations of cosmic-ray showers. An angular-energy-dependent surface source comprising secondary muons and neutrons at a height of 200 m above ground level was established and verified, which was used to concatenate the shower development in the upper atmosphere with subsequent simulations of radiation transport down to ground level, including the effect of the terrain and studied building. A Berkeley Lab cosmic-ray detector and a highly sensitive Bonner cylinder were used to perform muon and neutron measurements on each building floor. After careful calibration and correction, the measured responses of the two detectors were found to be reasonably consistent with the theoretical predictions, thus confirming the validity of the two-step calculation model employed in this study. The annual effective doses from cosmic-ray muons and neutrons on the open roof of the building were estimated to be 115.2 and 35.2 μSv, respectively. Muons and neutrons were attenuated floor-by-floor with different attenuation factors of 0.97 and 0.78, and their resultant annual effective doses on the first floor of the building were 97.8 and 9.9 μSv, respectively.
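
    The per-floor attenuation quoted above can be checked with simple arithmetic: starting from the open-roof annual doses and applying the attenuation factors once per floor roughly reproduces the reported first-floor values. The number of floors traversed (five) is an assumption in the sketch below.

```python
# Back-of-envelope check of the attenuation figures quoted above, assuming the
# first-floor values are reached after five floor slabs below the open roof.
muon_roof, neutron_roof = 115.2, 35.2    # annual effective dose on the roof, microSv
f_muon, f_neutron = 0.97, 0.78           # per-floor attenuation factors
n_floors = 5                             # assumed number of floors traversed

print(f"first floor: muons ~{muon_roof * f_muon ** n_floors:.1f} microSv/y, "
      f"neutrons ~{neutron_roof * f_neutron ** n_floors:.1f} microSv/y")
```

    The result (roughly 99 and 10 microSv per year) is close to the first-floor values reported in the abstract.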

  14. Local geology controlled the feasibility of vitrifying Iron Age buildings

    USGS Publications Warehouse

    Fabian B Wadsworth,; Michael J Heap,; Damby, David; Kai-Uwe Hess,; Jens Najorka,; Jérémie Vasseur,; Dominik Fahrner,; Donald B Dingwell,

    2017-01-01

    During European prehistory, hilltop enclosures made from polydisperse particle-and-block stone walling were exposed to temperatures sufficient to partially melt the constituent stonework, leading to the preservation of glassy walls called ‘vitrified forts’. During vitrification, the granular wall rocks partially melt, sinter viscously and densify, reducing inter-particle porosity. This process is strongly dependent on the solidus temperature, the particle sizes, the temperature-dependence of the viscosity of the evolving liquid phase, as well as the distribution and longevity of heat. Examination of the sintering behaviour of 45 European examples reveals that it is the raw building material that governs the vitrification efficiency. As Iron Age forts were commonly constructed from local stone, we conclude that local geology directly influenced the degree to which buildings were vitrified in the Iron Age. Additionally, we find that vitrification is accompanied by a bulk material strengthening of the aggregates of small sizes, and a partial weakening of larger blocks. We discuss these findings in the context of the debate surrounding the motive of the wall-builders. We conclude that if wall stability by bulk strengthening was the desired effect, then vitrification represents an Iron Age technology that failed to be effective in regions of refractory local geology.

  15. Side-information-dependent correlation channel estimation in hash-based distributed video coding.

    PubMed

    Deligiannis, Nikos; Barbarien, Joeri; Jacobs, Marc; Munteanu, Adrian; Skodras, Athanassios; Schelkens, Peter

    2012-04-01

    In the context of low-cost video encoding, distributed video coding (DVC) has recently emerged as a potential candidate for uplink-oriented applications. This paper builds on a concept of correlation channel (CC) modeling, which expresses the correlation noise as being statistically dependent on the side information (SI). Compared with classical side-information-independent (SII) noise modeling adopted in current DVC solutions, it is theoretically proven that side-information-dependent (SID) modeling improves the Wyner-Ziv coding performance. Anchored in this finding, this paper proposes a novel algorithm for online estimation of the SID CC parameters based on already decoded information. The proposed algorithm enables bit-plane-by-bit-plane successive refinement of the channel estimation leading to progressively improved accuracy. Additionally, the proposed algorithm is included in a novel DVC architecture that employs a competitive hash-based motion estimation technique to generate high-quality SI at the decoder. Experimental results corroborate our theoretical gains and validate the accuracy of the channel estimation algorithm. The performance assessment of the proposed architecture shows remarkable and consistent coding gains over a germane group of state-of-the-art distributed and standard video codecs, even under strenuous conditions, i.e., large groups of pictures and highly irregular motion content.

  16. Market dynamics and stock price volatility

    NASA Astrophysics Data System (ADS)

    Li, H.; Rosser, J. B., Jr.

    2004-06-01

    This paper presents a possible explanation for some of the empirical properties of asset returns within a heterogeneous-agents framework. It turns out in this model that, even if the input fundamental value is assumed to follow a simple Gaussian distribution lacking both fat tails and volatility dependence, these features can show up in the time series of asset returns. In this model, profit comparison and switching between heterogeneous agents play key roles, building a connection between endogenous market dynamics and the emergence of stylized facts.

  17. Critical systems for public health management of floods, North Dakota.

    PubMed

    Wiedrich, Tim W; Sickler, Juli L; Vossler, Brenda L; Pickard, Stephen P

    2013-01-01

    Availability of emergency preparedness funding between 2002 and 2009 allowed the North Dakota Department of Health to build public health response capabilities. Five of the 15 public health preparedness capability areas identified by the Centers for Disease Control and Prevention in 2011 have been thoroughly tested by responses to flooding in North Dakota in 2009, 2010, and 2011; those capability areas are information sharing, emergency operations coordination, medical surge, material management and distribution, and volunteer management. Increasing response effectiveness has depended on planning, implementation of new information technology, changes to command and control procedures, containerized response materials, and rapid contract procedures. Continued improvement in response and maintenance of response capabilities is dependent on ongoing funding.

  18. Synthetic polycations with controlled charge density and molecular weight as building blocks for biomaterials.

    PubMed

    Kleinberger, Rachelle M; Burke, Nicholas A D; Zhou, Christal; Stöver, Harald D H

    2016-01-01

    A series of polycations prepared by RAFT copolymerization of N-(3-aminopropyl)methacrylamide hydrochloride (APM) and N-(2-hydroxypropyl)methacrylamide, with molecular weights of 15 and 40 kDa, and APM content of 10-75 mol%, were tested as building blocks for electrostatically assembled hydrogels such as those used for cell encapsulation. Complexation and distribution of these copolymers within anionic calcium alginate gels, as well as cytotoxicity, cell attachment, and cell proliferation on surfaces grafted with the copolymers were found to depend on composition and molecular weight. Copolymers with lower cationic charge density and lower molecular weight showed less cytotoxicity and cell adhesion, and were more mobile within alginate gels. These findings aid in designing improved polyelectrolyte complexes for use as biomaterials.

  19. Design and Implementation of Geothermal Energy Systems at West Chester University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuprak, Greg

    West Chester University has launched a comprehensive transformation of its campus heating and cooling systems from traditional fossil fuels (coal, oil and natural gas) to geothermal. This change will significantly decrease the institution’s carbon footprint and serve as a national model for green campus efforts. The institution has designed a phased series of projects to build a district geo-exchange system with shared well fields, central pumping station and distribution piping to provide the geo-exchange water to campus buildings as their internal building HVAC systems are changed to be able to use the geo-exchange water. This project addresses the US Department of Energy Office of Energy Efficiency and Renewable Energy (EERE) goal to invest in clean energy technologies that strengthen the economy, protect the environment, and reduce dependence on foreign oil. In addition, this project advances EERE’s efforts to establish geothermal energy as an economically competitive contributor to the US energy supply.

  20. Validity of total and segmental impedance measurements for prediction of body composition across ethnic population groups.

    PubMed

    Deurenberg, P; Deurenberg-Yap, M; Schouten, F J M

    2002-03-01

    To test the impact of body build factors on the validity of impedance-based body composition predictions across (ethnic) population groups and to study the suitability of segmental impedance measurements. Cross-sectional observational study. Ministry of Health and School of Physical Education, Nanyang Technological University, Singapore. A total of 291 female and male Chinese, Malay and Indian Singaporeans, aged 18-69, body mass index (BMI) 16.0-40.2 kg/m2. Anthropometric parameters were measured in addition to impedance (100 kHz) of the total body, arms and legs. Impedance indexes were calculated as height(2)/impedance. Arm length (span) and leg length (sitting height), wrist and knee width were measured, from which body build indices were calculated. Total body water (TBW) was measured using deuterium oxide dilution. Extracellular water (ECW) was measured using bromide dilution. Body fat percentage was determined using a chemical four-compartment model. The bias of TBW predicted from the total body impedance index (bias: measured minus predicted TBW) was different among the three ethnic groups, TBW being significantly underestimated in Indians compared to Chinese and Malays. This bias was found to be dependent on body water distribution (ECW/TBW) and parameters of body build, mainly relative (to height) arm length. After correcting for differences in body water distribution and body build parameters the differences in bias across the ethnic groups disappeared. The impedance index using total body impedance was better correlated with TBW than the impedance index of arm or leg impedance, even after corrections for body build parameters. The study shows that ethnic-specific bias of impedance-based prediction formulas for body composition is due mainly to differences in body build among the ethnic groups. This means that the use of 'general' prediction equations across different (ethnic) population groups without prior testing of their validity should be avoided. Total body impedance has higher predictive value than segmental impedance.
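
    A minimal Python sketch of the impedance-index approach described above: regress measured TBW on height squared divided by impedance and inspect the bias (measured minus predicted). All values are synthetic placeholders, not the Singapore study data.

```python
# Minimal sketch of the impedance-index approach: regress measured TBW on
# height(2)/impedance and inspect the bias. All numbers are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
height_cm = rng.normal(165, 8, 200)
impedance_ohm = rng.normal(550, 60, 200)
index = height_cm ** 2 / impedance_ohm                       # impedance index, cm2/ohm

tbw_measured = 0.6 * index + 2.0 + rng.normal(0, 1.5, 200)   # stand-in for deuterium TBW (L)

a, b = np.polyfit(index, tbw_measured, 1)                    # TBW = a * index + b
bias = tbw_measured - (a * index + b)                        # measured minus predicted
print(f"TBW = {a:.2f} * (height^2/Z) + {b:.2f}; mean bias {bias.mean():.3f} L")
```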

  1. Transforming RNA-Seq data to improve the performance of prognostic gene signatures.

    PubMed

    Zwiener, Isabella; Frisch, Barbara; Binder, Harald

    2014-01-01

    Gene expression measurements have successfully been used for building prognostic signatures, i.e. for identifying a short list of important genes that can predict patient outcome. Mostly microarray measurements have been considered, and there is little advice available for building multivariable risk prediction models from RNA-Seq data. We specifically consider penalized regression techniques, such as the lasso and componentwise boosting, which can simultaneously consider all measurements and provide both multivariable regression models for prediction and automated variable selection. However, they might be affected by the typical skewness, mean-variance-dependency or extreme values of RNA-Seq covariates and therefore could benefit from transformations of the latter. In an analytical part, we highlight preferential selection of covariates with large variances, which is problematic due to the mean-variance dependency of RNA-Seq data. In a simulation study, we compare different transformations of RNA-Seq data for potentially improving detection of important genes. Specifically, we consider standardization, the log transformation, a variance-stabilizing transformation, the Box-Cox transformation, and rank-based transformations. In addition, the prediction performance for real data from patients with kidney cancer and acute myeloid leukemia is considered. We show that signature size, identification performance, and prediction performance critically depend on the choice of a suitable transformation. Rank-based transformations perform well in all scenarios and can even outperform complex variance-stabilizing approaches. Generally, the results illustrate that the distribution and potential transformations of RNA-Seq data need to be considered as a critical step when building risk prediction models by penalized regression techniques.
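
    The comparison of covariate transformations before penalized regression can be sketched as below in Python; the simulated counts, the outcome model, and the use of scikit-learn's LassoCV are illustrative assumptions, not the authors' analysis code.

```python
# Sketch of comparing covariate transformations before a lasso fit; the counts,
# outcome, and scikit-learn usage are illustrative, not the authors' code.
import numpy as np
from scipy import stats
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n, p = 120, 500
counts = rng.negative_binomial(n=2, p=0.02, size=(n, p)).astype(float)  # skewed "RNA-Seq" counts
y = 0.01 * counts[:, :5].mean(axis=1) + rng.normal(0, 1, n)             # outcome tied to 5 "true" genes

transforms = {
    "log":  lambda x: np.log(x + 1.0),
    "rank": lambda x: np.apply_along_axis(stats.rankdata, 0, x),        # rank each gene across samples
}
for name, f in transforms.items():
    X = StandardScaler().fit_transform(f(counts))
    selected = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)
    print(f"{name:>4}: {selected.size} genes selected, "
          f"{np.sum(selected < 5)} of the 5 true genes recovered")
```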

  2. Transforming RNA-Seq Data to Improve the Performance of Prognostic Gene Signatures

    PubMed Central

    Zwiener, Isabella; Frisch, Barbara; Binder, Harald

    2014-01-01

    Gene expression measurements have successfully been used for building prognostic signatures, i.e. for identifying a short list of important genes that can predict patient outcome. Mostly microarray measurements have been considered, and there is little advice available for building multivariable risk prediction models from RNA-Seq data. We specifically consider penalized regression techniques, such as the lasso and componentwise boosting, which can simultaneously consider all measurements and provide both multivariable regression models for prediction and automated variable selection. However, they might be affected by the typical skewness, mean-variance-dependency or extreme values of RNA-Seq covariates and therefore could benefit from transformations of the latter. In an analytical part, we highlight preferential selection of covariates with large variances, which is problematic due to the mean-variance dependency of RNA-Seq data. In a simulation study, we compare different transformations of RNA-Seq data for potentially improving detection of important genes. Specifically, we consider standardization, the log transformation, a variance-stabilizing transformation, the Box-Cox transformation, and rank-based transformations. In addition, the prediction performance for real data from patients with kidney cancer and acute myeloid leukemia is considered. We show that signature size, identification performance, and prediction performance critically depend on the choice of a suitable transformation. Rank-based transformations perform well in all scenarios and can even outperform complex variance-stabilizing approaches. Generally, the results illustrate that the distribution and potential transformations of RNA-Seq data need to be considered as a critical step when building risk prediction models by penalized regression techniques. PMID:24416353

  3. Steve Frank | NREL

    Science.gov Websites

    Commercial Buildings Research Group. Steve's areas of expertise are electric power distribution systems, DC techniques for maximizing the energy efficiency of electrical distribution systems in commercial buildings

  4. Distributed Energy Resources On-Site Optimization for Commercial Buildings with Electric and Thermal Storage Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacommare, Kristina S H; Stadler, Michael; Aki, Hirohisa

    The addition of storage technologies such as flow batteries, conventional batteries, and heat storage can improve the economic as well as environmental attractiveness of on-site generation (e.g., PV, fuel cells, reciprocating engines or microturbines operating with or without CHP) and contribute to enhanced demand response. In order to examine the impact of storage technologies on demand response and carbon emissions, a microgrid's distributed energy resources (DER) adoption problem is formulated as a mixed-integer linear program that has the minimization of annual energy costs as its objective function. By implementing this approach in the General Algebraic Modeling System (GAMS), the problem is solved for a given test year at representative customer sites, such as schools and nursing homes, to obtain not only the level of technology investment, but also the optimal hourly operating schedules. This paper focuses on analysis of storage technologies in DER optimization on a building level, with example applications for commercial buildings. Preliminary analysis indicates that storage technologies respond effectively to time-varying electricity prices, i.e., by charging batteries during periods of low electricity prices and discharging them during peak hours. The results also indicate that storage technologies significantly alter the residual load profile, which can contribute to lower carbon emissions depending on the test site, its load profile, and its adopted DER technologies.
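
    A toy linear program in the spirit of the dispatch behaviour described above (charge the battery when prices are low, discharge at peak). It is far simpler than the mixed-integer DER adoption model implemented in GAMS, and the use of the PuLP library, the prices, and the battery parameters are illustrative assumptions.

```python
# Toy battery-dispatch linear program written with the PuLP library; a tiny
# stand-in for the full mixed-integer GAMS model, with invented prices and sizes.
import pulp

prices = [0.08, 0.07, 0.07, 0.12, 0.20, 0.25, 0.18, 0.10]   # $/kWh per period (illustrative)
load = [30.0] * len(prices)                                  # building load, kWh per period
cap, p_max, eff = 80.0, 20.0, 0.9                            # battery size, power limit, efficiency
T = range(len(prices))

m = pulp.LpProblem("battery_dispatch", pulp.LpMinimize)
buy = [pulp.LpVariable(f"buy_{t}", lowBound=0) for t in T]
chg = [pulp.LpVariable(f"chg_{t}", lowBound=0, upBound=p_max) for t in T]
dis = [pulp.LpVariable(f"dis_{t}", lowBound=0, upBound=p_max) for t in T]
soc = [pulp.LpVariable(f"soc_{t}", lowBound=0, upBound=cap) for t in T]

m += pulp.lpSum(prices[t] * buy[t] for t in T)               # minimize total energy cost
for t in T:
    m += buy[t] + dis[t] == load[t] + chg[t]                 # electricity balance
    prev = soc[t - 1] if t > 0 else 0.0
    m += soc[t] == prev + eff * chg[t] - (1.0 / eff) * dis[t]  # state of charge
m.solve(pulp.PULP_CBC_CMD(msg=False))
print([round(v.value(), 1) for v in soc])                    # fills in cheap periods, empties at peak
```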

  5. Real-time identification of indoor pollutant source positions based on neural network locator of contaminant sources and optimized sensor networks.

    PubMed

    Vukovic, Vladimir; Tabares-Velasco, Paulo Cesar; Srebric, Jelena

    2010-09-01

    A growing interest in security and occupant exposure to contaminants revealed a need for fast and reliable identification of contaminant sources during incidental situations. To determine potential contaminant source positions in outdoor environments, current state-of-the-art modeling methods use computational fluid dynamic simulations on parallel processors. In indoor environments, current tools match accidental contaminant distributions with cases from precomputed databases of possible concentration distributions. These methods require intensive computations in pre- and postprocessing. On the other hand, neural networks emerged as a tool for rapid concentration forecasting of outdoor environmental contaminants such as nitrogen oxides or sulfur dioxide. All of these modeling methods depend on the type of sensors used for real-time measurements of contaminant concentrations. A review of the existing sensor technologies revealed that no perfect sensor exists, but intensity of work in this area provides promising results in the near future. The main goal of the presented research study was to extend neural network modeling from the outdoor to the indoor identification of source positions, making this technology applicable to building indoor environments. The developed neural network Locator of Contaminant Sources was also used to optimize number and allocation of contaminant concentration sensors for real-time prediction of indoor contaminant source positions. Such prediction should take place within seconds after receiving real-time contaminant concentration sensor data. For the purpose of neural network training, a multizone program provided distributions of contaminant concentrations for known source positions throughout a test building. Trained networks had an output indicating contaminant source positions based on measured concentrations in different building zones. A validation case based on a real building layout and experimental data demonstrated the ability of this method to identify contaminant source positions. Future research intentions are focused on integration with real sensor networks and model improvements for much more complicated contamination scenarios.
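
    A minimal Python sketch of the locator idea described above: a neural-network classifier maps a vector of sensor concentrations to a source-zone label. The placeholder training data stand in for the multizone-model simulations; the zone count, sensor count, and network size are assumptions.

```python
# Minimal sketch: a neural-network classifier maps sensor concentrations to a
# source-zone label; the random training set stands in for multizone simulations.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
n_zones, n_sensors, n_cases = 12, 6, 600                       # assumed building layout

source_zone = rng.integers(0, n_zones, n_cases)                # known source position per case
signature = 0.5 * (source_zone[:, None] % n_sensors == np.arange(n_sensors))
concentrations = rng.random((n_cases, n_sensors)) + signature  # placeholder sensor readings

locator = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
locator.fit(concentrations, source_zone)

# Real-time use: feed the latest sensor readings and read off the predicted zone.
print("predicted source zone:", locator.predict(concentrations[:1])[0])
```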

  6. Review of Methods for Buildings Energy Performance Modelling

    NASA Astrophysics Data System (ADS)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    The research presented in this paper gives a brief review of methods used for building energy performance modelling. The paper also gives a comprehensive review of the advantages and disadvantages of the available methods, as well as the input parameters used for modelling building energy performance. The European Directive EPBD obliges the implementation of an energy certification procedure, which gives insight into building energy performance via existing energy certificate databases. Some of the methods for building energy performance modelling mentioned in this paper were developed by employing data sets of buildings which have already undergone an energy certification procedure. Such a database is used in this paper, where the majority of buildings have already undergone some form of partial retrofitting (replacement of windows or installation of thermal insulation) but still have poor energy performance. The case study presented in this paper utilizes an energy certificate database obtained from residential units in Croatia (over 400 buildings) in order to determine the dependence between building energy performance and the variables in the database by using statistical dependence tests. Building energy performance in the database is expressed as a building energy efficiency rating (from A+ to G), which is based on the specific annual energy need for heating under referential climatic data [kWh/(m2a)]. Independent variables in the database are the surfaces and volume of the conditioned part of the building, the building shape factor, the energy used for heating, CO2 emission, building age and year of reconstruction. The results give insight into the possibilities of the methods used for building energy performance modelling. Furthermore, an analysis of the dependencies between building energy performance as a dependent variable and the independent variables from the database is given. The presented results could be used for the development of new building energy performance predictive models.

  7. Distributed dynamic simulations of networked control and building performance applications.

    PubMed

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  8. Distributed dynamic simulations of networked control and building performance applications

    PubMed Central

    Yahiaoui, Azzedine

    2017-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135

  9. Guest Editor's Introduction: Special section on dependable distributed systems

    NASA Astrophysics Data System (ADS)

    Fetzer, Christof

    1999-09-01

    We rely more and more on computers. For example, the Internet reshapes the way we do business. A `computer outage' can cost a company a substantial amount of money. Not only with respect to the business lost during an outage, but also with respect to the negative publicity the company receives. This is especially true for Internet companies. After recent computer outages of Internet companies, we have seen a drastic fall of the shares of the affected companies. There are multiple causes for computer outages. Although computer hardware becomes more reliable, hardware related outages remain an important issue. For example, some of the recent computer outages of companies were caused by failed memory and system boards, and even by crashed disks - a failure type which can easily be masked using disk mirroring. Transient hardware failures might also look like software failures and, hence, might be incorrectly classified as such. However, many outages are software related. Faulty system software, middleware, and application software can crash a system. Dependable computing systems are systems we can rely on. Dependable systems are, by definition, reliable, available, safe and secure [3]. This special section focuses on issues related to dependable distributed systems. Distributed systems have the potential to be more dependable than a single computer because the probability that all computers in a distributed system fail is smaller than the probability that a single computer fails. However, if a distributed system is not built well, it is potentially less dependable than a single computer since the probability that at least one computer in a distributed system fails is higher than the probability that one computer fails. For example, if the crash of any computer in a distributed system can bring the complete system to a halt, the system is less dependable than a single-computer system. Building dependable distributed systems is an extremely difficult task. There is no silver bullet solution. Instead one has to apply a variety of engineering techniques [2]: fault-avoidance (minimize the occurrence of faults, e.g. by using a proper design process), fault-removal (remove faults before they occur, e.g. by testing), fault-evasion (predict faults by monitoring and reconfigure the system before failures occur), and fault-tolerance (mask and/or contain failures). Building a system from scratch is an expensive and time consuming effort. To reduce the cost of building dependable distributed systems, one would choose to use commercial off-the-shelf (COTS) components whenever possible. The usage of COTS components has several potential advantages beyond minimizing costs. For example, through the widespread usage of a COTS component, design failures might be detected and fixed before the component is used in a dependable system. Custom-designed components have to mature without the widespread in-field testing of COTS components. COTS components have various potential disadvantages when used in dependable systems. For example, minimizing the time to market might lead to the release of components with inherent design faults (e.g. use of `shortcuts' that only work most of the time). In addition, the components might be more complex than needed and, hence, potentially have more design faults than simpler components. 
However, given economic constraints and the ability to cope with some of the problems using fault-evasion and fault-tolerance, only for a small percentage of systems can one justify not using COTS components. Distributed systems built from current COTS components are asynchronous systems in the sense that there exists no a priori known bound on the transmission delay of messages or the execution time of processes. When designing a distributed algorithm, one would like to make sure (e.g. by testing or verification) that it is correct, i.e. satisfies its specification. Many distributed algorithms make use of consensus (eventually all non-crashed processes have to agree on a value), leader election (a crashed leader is eventually replaced by a new leader, but at any time there is at most one leader) or a group membership detection service (a crashed process is eventually suspected to have crashed but only crashed processes are suspected). From a theoretical point of view, the service specifications given for such services are not implementable in asynchronous systems. In particular, for each implementation one can derive a counter example in which the service violates its specification. From a practical point of view, the consensus, the leader election, and the membership detection problem are solvable in asynchronous distributed systems. In this special section, Raynal and Tronel show how to bridge this difference by showing how to implement the group membership detection problem with a negligible probability [1] to fail in an asynchronous system. The group membership detection problem is specified by a liveness condition (L) and a safety property (S): (L) if a process p crashes, then eventually every non-crashed process q has to suspect that p has crashed; and (S) if a process q suspects p, then p has indeed crashed. One can show that either (L) or (S) is implementable, but one cannot implement both (L) and (S) at the same time in an asynchronous system. In practice, one only needs to implement (L) and (S) such that the probability that (L) or (S) is violated becomes negligible. Raynal and Tronel propose and analyse a protocol that implements (L) with certainty and that can be tuned such that the probability that (S) is violated becomes negligible. Designing and implementing distributed fault-tolerant protocols for asynchronous systems is a difficult but not an impossible task. A fault-tolerant protocol has to detect and mask certain failure classes, e.g. crash failures and message omission failures. There is a trade-off between the performance of a fault-tolerant protocol and the failure classes the protocol can tolerate. One wants to tolerate as many failure classes as needed to satisfy the stochastic requirements of the protocol [1] while still maintaining a sufficient performance. Since clients of a protocol have different requirements with respect to the performance/fault-tolerance trade-off, one would like to be able to customize protocols such that one can select an appropriate performance/fault-tolerance trade-off. In this special section Hiltunen et al describe how one can compose protocols from micro-protocols in their Cactus system. They show how a group RPC system can be tailored to the needs of a client. In particular, they show how considering additional failure classes affects the performance of a group RPC system. 
References [1] Cristian F 1991 Understanding fault-tolerant distributed systems Communications of ACM 34 (2) 56-78 [2] Heimerdinger W L and Weinstock C B 1992 A conceptual framework for system fault tolerance Technical Report 92-TR-33, CMU/SEI [3] Laprie J C (ed) 1992 Dependability: Basic Concepts and Terminology (Vienna: Springer)
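
    The liveness/safety trade-off discussed above can be illustrated with a heartbeat-based crash detector: a silent process is eventually suspected (liveness holds), and a conservative timeout makes a false suspicion (a safety violation) improbable. The Python sketch below is only an illustration of this idea, not the Raynal-Tronel protocol.

```python
# Sketch of a heartbeat-based crash detector: a silent process is eventually
# suspected (liveness), while a conservative timeout makes false suspicions
# (safety violations) improbable. Illustration only, not the Raynal-Tronel protocol.
import time

class HeartbeatDetector:
    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s      # conservative bound on heartbeat gaps
        self.last_seen = {}             # process id -> time of last heartbeat

    def heartbeat(self, pid):
        """Record a heartbeat received from process pid."""
        self.last_seen[pid] = time.monotonic()

    def suspected(self, pid):
        """True once pid has been silent for longer than the timeout."""
        last = self.last_seen.get(pid)
        return last is not None and time.monotonic() - last > self.timeout_s

# Example: process "p1" heartbeats once and then falls silent.
d = HeartbeatDetector(timeout_s=0.2)
d.heartbeat("p1")
print(d.suspected("p1"))    # False: the heartbeat is recent
time.sleep(0.3)
print(d.suspected("p1"))    # True: silence has exceeded the timeout
```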

  10. Time-sliced perturbation theory for large scale structure I: general formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blas, Diego; Garny, Mathias; Sibiryakov, Sergey

    2016-07-01

    We present a new analytic approach to describe large scale structure formation in the mildly non-linear regime. The central object of the method is the time-dependent probability distribution function generating correlators of the cosmological observables at a given moment of time. Expanding the distribution function around the Gaussian weight we formulate a perturbative technique to calculate non-linear corrections to cosmological correlators, similar to the diagrammatic expansion in a three-dimensional Euclidean quantum field theory, with time playing the role of an external parameter. For the physically relevant case of cold dark matter in an Einstein-de Sitter universe, the time evolution of the distribution function can be found exactly and is encapsulated by a time-dependent coupling constant controlling the perturbative expansion. We show that all building blocks of the expansion are free from spurious infrared enhanced contributions that plague the standard cosmological perturbation theory. This paves the way towards the systematic resummation of infrared effects in large scale structure formation. We also argue that the approach proposed here provides a natural framework to account for the influence of short-scale dynamics on larger scales along the lines of effective field theory.

  11. Spectroscopic investigation of some building blocks of organic conductors: A comparative study

    NASA Astrophysics Data System (ADS)

    Mukherjee, V.; Yadav, T.

    2017-04-01

    Theoretical molecular structures and IR and Raman spectra of di- and tetramethyl-substituted tetrathiafulvalene and tetraselenafulvalene molecules have been studied. These molecules belong to the organic conductor family and are widely used as building blocks of several organic conducting devices. The Hartree-Fock and density functional theory methods with the exchange functional B3LYP have been employed for the computations. We have also performed normal coordinate analysis to scale the theoretical frequencies and to calculate potential energy distributions for unambiguous assignments. The exciting-frequency and temperature dependent Raman spectra have also been presented. Optimization results reveal that the sulphur derivatives possess a boat shape while the selenium derivatives possess planar structures. Natural bond orbital analysis has also been performed to study second-order interactions between donors and acceptors and to compute molecular orbital occupancies and energies.

  12. Mass Spectrometric and Synchrotron Radiation based techniques for the identification and distribution of painting materials in samples from paints of Josep Maria Sert

    PubMed Central

    2012-01-01

    Background Establishing the distribution of materials in paintings and that of their degradation products by imaging techniques is fundamental to understand the painting technique and can improve our knowledge on the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, and the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans will be presented as a suitable approach to have a detailed characterisation of the materials in a paint sample, assuring their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist achieving international recognition whose canvases adorned international buildings. Results The pigments used by the painter as well as the organic materials used as binders and varnishes could be identified by means of conventional techniques. The distribution of these materials by means of Synchrotron Radiation based techniques allowed to establish the mixtures used by the painter depending on the purpose. Conclusions Results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable to solve the challenge of micro heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples layer by layer as well as to establish the painting techniques used by Sert in the works-of-art under study. PMID:22616949

  13. Evaluation of design flood estimates with respect to sample size

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

    Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data are available, an index flood approach is recommended where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a two-parameter distribution is recommended, and for more than 50 years of data, a three-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary and linear moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Do the answers to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not excessively depend on the data sample. The reliability indices describe to which degree design flood predictions can be trusted.
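
    A minimal Python sketch of a local flood-frequency fit of the kind evaluated above: fit a generalized extreme value distribution to an annual-maximum series by maximum likelihood and read off a design quantile. The synthetic series and the 200-year return period are illustrative assumptions.

```python
# Minimal sketch: fit a GEV distribution to an annual-maximum series by maximum
# likelihood and read off a design quantile; the series itself is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
annual_max = stats.genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=rng)

shape, loc, scale = stats.genextreme.fit(annual_max)
q200 = stats.genextreme.ppf(1 - 1 / 200, shape, loc, scale)   # 200-year design flood
print(f"estimated 200-year flood: {q200:.0f} m3/s")
```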

  14. SU-F-T-474: Evaluation of Dose Perturbation, Temperature and Sensitivity Variation With Accumulated Dose of MOSFET Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganesan, B; Prakasarao, A; Singaravelu, G

    Purpose: The use of megavoltage gamma and x-ray sources with their skin-sparing qualities in radiation therapy has been a boon in relieving patient discomfort and allowing high tumor doses to be given with fewer restrictions due to radiation effects in the skin. However, high doses given to deep tumors may require careful consideration of the dose distribution in the buildup region in order to avoid irreparable damage to the skin. Methods: To measure the perturbation of the MOSFET detector in Co60, 6 MV and 15 MV beams, the detector was placed on the surface of the phantom covered with a brass build-up cap. To measure the effect of temperature, the MOSFET detector was kept on the surface of a hot-water polythene container and the radiation was delivered. To measure the sensitivity variation with accumulated dose, measurements were taken by delivering 200 cGy to the MOSFET until its accumulated absorbed dose reached 20,000 cGy. Results: The measurement was performed by positioning the bare MOSFET and the MOSFET with the brass build-up cap on the top surface of the solid water phantom for various field sizes in order to determine whether any attenuation is caused in the dose distribution. The response of the MOSFET was monitored for temperatures ranging from 42 degrees C to 22 degrees C. The integrated-dose dependence of MOSFET dosimeter sensitivity over different energies is not well characterized. This work investigates the dual-bias MOSFET dosimeter sensitivity response to 6 MV and 15 MV beams. Conclusion: From this study it is observed that, unlike a diode, the bare MOSFET does not perturb the radiation field. It is observed that the build-up influences the temperature dependency of the MOSFET and causes some uncertainty in the readings. Regarding sensitivity variation with accumulated dose, the MOSFET showed higher sensitivity with dose accumulation for both energies.

  15. On the viability of exploiting L-shell fluorescence for X-ray polarimetry

    NASA Technical Reports Server (NTRS)

    Weisskopf, M. C.; Sutherland, P. G.; Elsner, R. F.; Ramsey, B. D.

    1985-01-01

    It has been suggested that one may build an X-ray polarimeter by exploiting the polarization dependence of the angular distribution of L-shell fluorescence photons. In this paper the sensitivity of this approach to polarimetry is examined theoretically. The calculations are applied to several detection schemes using imaging proportional counters that would have direct application in X-ray astronomy. It is found, however, that the sensitivity of this method for measuring X-ray polarization is too low to be of use for other than laboratory applications.

  16. Combined risk assessment of nonstationary monthly water quality based on Markov chain and time-varying copula.

    PubMed

    Shi, Wei; Xia, Jun

    2017-02-01

    Water quality risk management is a global research hotspot linked with sustainable water resource development. Ammonium nitrogen (NH3-N) and the permanganate index (CODMn), the focus indicators in the Huai River Basin, are selected to reveal their joint transition laws based on Markov theory. A time-varying moments model with either time or a land cover index as explanatory variable is applied to build the time-varying marginal distributions of the water quality time series. A time-varying copula model, which takes into consideration the non-stationarity in the marginal distributions and/or the time variation in the dependence structure between the water quality series, is constructed to describe a bivariate frequency analysis for the NH3-N and CODMn series at the same monitoring gauge. The larger first-order Markov joint transition probabilities indicate that the water quality states Class Vw, Class IV and Class III will occur easily in the water body of Bengbu Sluice. Both the marginal distribution and copula models are nonstationary, and the explanatory variable time yields better performance than the land cover index in describing the non-stationarities in the marginal distributions. In modelling the changes in dependence structure, the time-varying copula has a better fitting performance than copulas with a constant or time-trend dependence parameter. The largest synchronous encounter risk probability of NH3-N and CODMn simultaneously reaching Class V is 50.61%, while the asynchronous encounter risk probability is largest when NH3-N and CODMn are inferior to the Class V and Class IV water quality standards, respectively.
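    The first-order Markov transition probabilities referred to above can be estimated by simple transition counting; the sketch below (Python with numpy; the class sequence is invented for illustration, not the Bengbu Sluice record) shows the basic idea for a single indicator.

```python
# Minimal sketch: maximum-likelihood estimate of a first-order Markov
# transition matrix from a sequence of monthly water-quality classes.
# The class sequence below is illustrative, not the Bengbu Sluice record.
import numpy as np

classes = ["III", "IV", "V", "V_w"]                      # ordered state labels
idx = {c: i for i, c in enumerate(classes)}
sequence = ["III", "IV", "IV", "V", "V_w", "V", "IV", "III", "IV", "V"]

counts = np.zeros((len(classes), len(classes)))
for a, b in zip(sequence[:-1], sequence[1:]):
    counts[idx[a], idx[b]] += 1

# Row-normalise to obtain transition probabilities P(next state | current state).
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(np.round(P, 2))
```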

  17. Wind tunnel tests for wind pressure distribution on gable roof buildings.

    PubMed

    Jing, Xiao-kun; Li, Yuan-qi

    2013-01-01

    Gable roof buildings are widely used in industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Some characteristics of the measured wind pressure field on the surfaces of the models were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and characteristics of the proper orthogonal decomposition results of the measured wind pressure field. The results show that extremely high local suctions often occur at the leading edges of the longitudinal wall and windward roof, the roof corner, and the roof ridge, which are the severely damaged locations under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and the effect relates to the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The contributions of the first several eigenvectors to the overall wind pressure distributions become much larger. The investigation can offer some basic understanding for estimating wind load distribution on gable roof buildings and facilitate wind-resistant design of cladding components and their connections considering the wind load path.
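    The proper orthogonal decomposition of a measured pressure field is commonly computed from the singular value decomposition of the fluctuating pressure matrix; a minimal sketch of that step (Python with numpy; random stand-in data rather than the wind tunnel measurements) is given below.

```python
# Minimal sketch: proper orthogonal decomposition (POD) of a measured
# pressure-coefficient field via SVD, reporting how much of the fluctuating
# energy the first few eigenvectors capture. The data are a random stand-in.
import numpy as np

rng = np.random.default_rng(1)
n_taps, n_samples = 120, 4000                     # pressure taps x time samples
cp = rng.standard_normal((n_samples, n_taps))     # stand-in for measured Cp series

fluct = cp - cp.mean(axis=0)                      # remove the mean pressure field
_, s, modes = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)                      # energy fraction per POD mode
print("cumulative energy of first 5 POD modes:", np.round(energy[:5].cumsum(), 3))
```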

  18. To acquire more detailed radiation drive by use of ``quasi-steady'' approximation in atomic kinetics

    NASA Astrophysics Data System (ADS)

    Ren, Guoli; Pei, Wenbing; Lan, Ke; Gu, Peijun; Li, Xin

    2012-10-01

    In current routine 2D simulations of hohlraum physics, we adopt the principal-quantum-number (n-level) average atom model (AAM) in the NLTE plasma description. However, the detailed experimental frequency-dependent radiative drive differs from our n-level simulated drive, which reminds us of the need for a more detailed atomic kinetics description. The orbital-quantum-number (nl-level) average atom model is a natural consideration; however, the nl-level in-line calculation needs much more computational resource. By distinguishing the rapid bound-bound atomic processes from the relatively slow bound-free atomic processes, we found a method to build up a more detailed bound electron distribution (nl-level or even nlm-level) using the in-line n-level calculated plasma conditions (temperature, density, and average ionization degree). We name this method the ``quasi-steady approximation'' in atomic kinetics. Using this method, we rebuild the nl-level bound electron distribution (Pnl) and acquire a new hohlraum radiative drive by post-processing. Comparison with the n-level post-processed hohlraum drive shows that we get an almost identical radiation flux but with a finer frequency-dependent spectral structure, which appears only in nl-level transitions with the same n number (Δn = 0).

  19. The Building History of XUV disks of M83 & NGC2403 with TRGB Archaeology

    NASA Astrophysics Data System (ADS)

    Koda, Jin

    2015-06-01

    We propose deep HSC g- and i-band imaging of two extended ultraviolet (XUV) disks, those of M83 and NGC2403. These galaxies host the prototype XUV disks with the largest angular sizes (about 1 deg and 30 arcmin). The Subaru HSC permits unprecedentedly deep imaging over these gigantic XUV disks, including sufficient surrounding areas, which are used for sky subtraction and statistical estimation of background contamination. This project probes the building history of the XUV disks using archaeological stellar populations, especially the tip of the red giant branch (TRGB) stars (age 2-14 Gyr). Their presence and distribution over the XUV disks will reveal any star formation (SF) occurring over the past 2 Gyr, 4-6 Gyr, and beyond - i.e., the epochs preceding the recent (UV-traced) state of SF. Their color depends strongly on metallicity, thus providing an additional measure of star-gas recycling during the evolution of the XUV disks. In addition, we will detect young and massive main sequence stars (<100 Myr) and He-burning stars (100-500 Myr). Comparing the various generations of stars, in terms of number densities and spatial distributions, will reveal the much-unexplored SF history of the XUV disks.

  20. Balancing Hydronic Systems in Multifamily Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruch, Russell; Ludwig, Peter; Maurer, Tessa

    2014-07-01

    In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution, and controls. The imbalance leads to tenant discomfort, higher energy use intensity, and inefficient building operation. This research, conducted by Building America team Partnership for Advanced Residential Retrofit, explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperature in a multifamily hydronic building can vary as much as 61°F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. Average temperature spread at the building as a result of this retrofit decreased from 22.1°F to 15.5°F.

  1. FAST TRACK COMMUNICATION: Suppressing anomalous diffusion by cooperation

    NASA Astrophysics Data System (ADS)

    Dybiec, Bartłomiej

    2010-08-01

    Within a continuous time random walk scenario we consider the motion of a complex of particles which moves coherently. The motion of every particle is characterized by waiting time and jump length distributions which are of the power-law type. Due to the interactions between particles it is assumed that the waiting time is adjusted to the shortest or to the longest waiting time. Analogously, the jump length is adjusted to the shortest or to the longest jump length. We show that adjustment to the shortest waiting time can suppress the subdiffusive behavior even in situations when the exponent characterizing the waiting time distribution assures subdiffusive motion of a single particle. Finally, we demonstrate that the character of the motion depends on the number of particles building the complex.
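    A minimal numerical sketch of the mechanism described above (Python with numpy; the Pareto waiting-time exponent, unit-jump rule, time horizon and sample sizes are illustrative choices, not the paper's exact model) compares the displacement variance of a single walker with that of a complex that always adopts the shortest of the drawn waiting times.

```python
# Minimal sketch: continuous-time random walk where a "complex" of N walkers
# always adopts the SHORTEST of the N drawn power-law (Pareto-type) waiting
# times before making a unit jump. Exponent, horizon and sample sizes are
# illustrative, not the paper's parameters.
import numpy as np

rng = np.random.default_rng(2)

def displacement(t_max, n_particles, alpha=0.6):
    """Displacement of the complex at time t_max (waiting times ~ Lomax(alpha))."""
    t, x = 0.0, 0.0
    while True:
        wait = rng.pareto(alpha, size=n_particles).min() + 1e-12
        if t + wait > t_max:
            return x
        t += wait
        x += rng.choice([-1.0, 1.0])   # unit jumps keep the sketch simple

var_single = np.var([displacement(200.0, 1) for _ in range(500)])
var_complex = np.var([displacement(200.0, 5) for _ in range(500)])
print(f"displacement variance, single walker: {var_single:.1f}; complex of 5: {var_complex:.1f}")
```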

  2. Flood impacts on a water distribution network

    NASA Astrophysics Data System (ADS)

    Arrighi, Chiara; Tarani, Fabio; Vicario, Enrico; Castelli, Fabio

    2017-12-01

    Floods cause damage to people, buildings and infrastructures. Water distribution systems are particularly exposed, since water treatment plants are often located next to the rivers. Failure of the system leads to both direct losses, for instance damage to equipment and pipework contamination, and indirect impact, since it may lead to service disruption and thus affect populations far from the event through the functional dependencies of the network. In this work, we present an analysis of direct and indirect damages on a drinking water supply system, considering the hazard of riverine flooding as well as the exposure and vulnerability of active system components. The method is based on interweaving, through a semi-automated GIS procedure, a flood model and an EPANET-based pipe network model with a pressure-driven demand approach, which is needed when modelling water distribution networks in highly off-design conditions. Impact measures are defined and estimated so as to quantify service outage and potential pipe contamination. The method is applied to the water supply system of the city of Florence, Italy, serving approximately 380 000 inhabitants. The evaluation of flood impact on the water distribution network is carried out for different events with assigned recurrence intervals. Vulnerable elements exposed to the flood are identified and analysed in order to estimate their residual functionality and to simulate failure scenarios. Results show that in the worst failure scenario (no residual functionality of the lifting station and a 500-year flood), 420 km of pipework would require disinfection with an estimated cost of EUR 21 million, which is about 0.5 % of the direct flood losses evaluated for buildings and contents. Moreover, if flood impacts on the water distribution network are considered, the population affected by the flood is up to 3 times the population directly flooded.
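    The pressure-driven demand approach mentioned above is often expressed through a Wagner-type relation in which the delivered demand degrades smoothly between a required and a minimum pressure; the sketch below (Python; the thresholds and exponent are generic illustrative values, not those calibrated for the Florence network) shows one common form.

```python
# Minimal sketch of a Wagner-type pressure-driven demand relation: delivered
# demand falls from the full requested value at the required pressure down to
# zero at the minimum pressure. Parameter values are illustrative, not those
# used for the Florence water supply model.
def delivered_demand(requested, pressure, p_min=0.0, p_req=20.0, exponent=0.5):
    if pressure <= p_min:
        return 0.0
    if pressure >= p_req:
        return requested
    return requested * ((pressure - p_min) / (p_req - p_min)) ** exponent

for p in (0.0, 5.0, 12.0, 25.0):          # pressures in metres of head
    print(p, round(delivered_demand(10.0, p), 2))
```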

  3. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    PubMed

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.
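    For orientation, a pair-based spike-timing-dependent plasticity rule with an exponential window is sketched below (Python; the amplitudes and time constant are illustrative, and this is not the specific plasticity model analyzed in the paper): the weight is potentiated when the presynaptic spike precedes the postsynaptic spike and depressed otherwise.

```python
# Minimal sketch of a pair-based STDP weight update with an exponential
# window: potentiation when the presynaptic spike precedes the postsynaptic
# one, depression otherwise. Amplitudes and time constant are illustrative.
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    dt = t_post - t_pre                     # ms; positive = pre before post
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

print(stdp_dw(10.0, 15.0))   # pre leads post -> potentiation
print(stdp_dw(15.0, 10.0))   # post leads pre -> depression
```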

  4. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity123

    PubMed Central

    Pecevski, Dejan

    2016-01-01

    Abstract Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  5. Selection of organisms for the co-evolution-based study of protein interactions.

    PubMed

    Herman, Dorota; Ochoa, David; Juan, David; Lopez, Daniel; Valencia, Alfonso; Pazos, Florencio

    2011-09-12

    The prediction and study of protein interactions and functional relationships based on the similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is being widely used. Although a dependence between the performance of these methods and the set of organisms used to build the trees was suspected, so far nobody has assessed it in an exhaustive way, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different nature. We show that the performance of three mirrortree-related methodologies depends on the set of organisms used for building the trees, and it is not always directly related to the number of organisms in a simple way. Certain subsets of organisms seem to be more suitable for the prediction of certain types of interactions. This relationship between type of interaction and the optimal set of organisms for detecting them makes sense in the light of the phylogenetic distribution of the organisms and the nature of the interactions. In order to obtain an optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest.

  6. Selection of organisms for the co-evolution-based study of protein interactions

    PubMed Central

    2011-01-01

    Background The prediction and study of protein interactions and functional relationships based on the similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is being widely used. Although a dependence between the performance of these methods and the set of organisms used to build the trees was suspected, so far nobody has assessed it in an exhaustive way, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different nature. Results We show that the performance of three mirrortree-related methodologies depends on the set of organisms used for building the trees, and it is not always directly related to the number of organisms in a simple way. Certain subsets of organisms seem to be more suitable for the prediction of certain types of interactions. This relationship between type of interaction and the optimal set of organisms for detecting them makes sense in the light of the phylogenetic distribution of the organisms and the nature of the interactions. Conclusions In order to obtain an optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest. PMID:21910884

  7. Automated global structure extraction for effective local building block processing in XCS.

    PubMed

    Butz, Martin V; Pelikan, Martin; Llorà, Xavier; Goldberg, David E

    2006-01-01

    Learning Classifier Systems (LCSs), such as the accuracy-based XCS, evolve distributed problem solutions represented by a population of rules. During evolution, features are specialized, propagated, and recombined to provide increasingly accurate subsolutions. Recently, it was shown that, as in conventional genetic algorithms (GAs), some problems require efficient processing of subsets of features to find problem solutions efficiently. In such problems, standard variation operators of genetic and evolutionary algorithms used in LCSs suffer from potential disruption of groups of interacting features, resulting in poor performance. This paper introduces efficient crossover operators to XCS by incorporating techniques derived from competent GAs: the extended compact GA (ECGA) and the Bayesian optimization algorithm (BOA). Instead of simple crossover operators such as uniform crossover or one-point crossover, ECGA or BOA-derived mechanisms are used to build a probabilistic model of the global population and to generate offspring classifiers locally using the model. Several offspring generation variations are introduced and evaluated. The results show that it is possible to achieve performance similar to runs with an informed crossover operator that is specifically designed to yield ideal problem-dependent exploration, exploiting provided problem structure information. Thus, we create the first competent LCSs, XCS/ECGA and XCS/BOA, that detect dependency structures online and propagate corresponding lower-level dependency structures effectively without any information about these structures given in advance.

  8. Wind Tunnel Tests for Wind Pressure Distribution on Gable Roof Buildings

    PubMed Central

    2013-01-01

    Gable roof buildings are widely used in industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Some characteristics of the measured wind pressure field on the surfaces of the models were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and characteristics of the proper orthogonal decomposition results of the measured wind pressure field. The results show that extremely high local suctions often occur at the leading edges of the longitudinal wall and windward roof, the roof corner, and the roof ridge, which are the severely damaged locations under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and the effect relates to the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The contributions of the first several eigenvectors to the overall wind pressure distributions become much larger. The investigation can offer some basic understanding for estimating wind load distribution on gable roof buildings and facilitate wind-resistant design of cladding components and their connections considering the wind load path. PMID:24082851

  9. Incorporation of a spatial source distribution and a spatial sensor sensitivity in a laser ultrasound propagation model using a streamlined Huygens' principle.

    PubMed

    Laloš, Jernej; Babnik, Aleš; Možina, Janez; Požar, Tomaž

    2016-03-01

    The near-field, surface-displacement waveforms in plates are modeled using interwoven concepts of Green's function formalism and streamlined Huygens' principle. Green's functions resemble the building blocks of the sought displacement waveform, superimposed and weighted according to the simplified distribution. The approach incorporates an arbitrary circular spatial source distribution and an arbitrary circular spatial sensitivity in the area probed by the sensor. The displacement histories for uniform, Gaussian and annular normal-force source distributions and the uniform spatial sensor sensitivity are calculated, and the corresponding weight distributions are compared. To demonstrate the applicability of the developed scheme, measurements of laser ultrasound induced solely by the radiation pressure are compared with the calculated waveforms. The ultrasound is induced by laser pulse reflection from the mirror-surface of a glass plate. The measurements show excellent agreement not only with respect to various wave-arrivals but also in the shape of each arrival. Their shape depends on the beam profile of the excitation laser pulse and its corresponding spatial normal-force distribution. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    NASA Astrophysics Data System (ADS)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its extreme versatility and almost all-weather, day-and-night working capability, etc. In view of the fact that the inherent statistical distribution of speckle in SAR images is not used to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract the collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target and thereby extract the collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but can also be applied to extract collapsed building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake acquired on April 21, 2010 are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for the data selection of collapsed building information extraction.

  11. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Foudriat, E. C.

    1991-01-01

    A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. While the probability of deadlock is considerably small, its effects on performance could be significant.

  12. Building America Case Study: Balancing Hydronic Systems in Multifamily Buildings, Chicago, Illinois (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2014-09-01

    In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution and controls. The effects of imbalance include tenant discomfort, higher energy use intensity and inefficient building operation. This paper explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The research was conducted by The Partnership for Advanced Residential Retrofit (PARR) in conjunction with Elevate Energy. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperature in a multifamily hydronic building can vary as much as 61 degrees F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. Average temperature spread at the building as a result of this retrofit decreased from 22.1 degrees F to 15.5 degrees F.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendes, Goncalo; Feng, Wei; Stadler, Michael

    The following paper conducts a regional analysis of the potential of U.S. and Chinese buildings for adopting Distributed Energy Resources (DER). The expected economics of DER in 2020-2025 is modeled for a commercial and a multi-family residential building in different climate zones. The optimal building energy economic performance is calculated using the Distributed Energy Resources Customer Adoption Model (DER-CAM), which minimizes building energy costs for a typical reference year of operation. Several DER, such as combined heat and power (CHP) units, photovoltaics, and battery storage, are considered. The results indicate that DER have economic and environmental competitiveness potential, especially for commercial buildings in hot and cold climates of both countries. In the U.S., the average expected energy cost savings in commercial buildings from DER-CAM's suggested investments is 17%, while in Chinese buildings it is 12%. The electricity tariff structure and prices, along with the cost of natural gas, represent important factors in determining adoption of DER, more so than climate. High energy-pricing spark spreads lead to increased economic attractiveness of DER. The average emissions reduction in commercial buildings is 19% in the U.S. as a result of significant investments in PV, whereas in China it is 20% and driven by investments in CHP. Keywords: Building Modeling and Simulation, Distributed Energy Resources (DER), Energy Efficiency, Combined Heat and Power (CHP), CO2 emissions. 1. Introduction: The transition from a centralized and fossil-based energy paradigm towards the decentralization of energy supply and distribution has been a major subject of research over the past two decades. Various concerns have brought the traditional model into question, namely its environmental footprint, its structural inflexibility and inefficiency, and, more recently, its inability to maintain acceptable reliability of supply. Under such a troubled setting, distributed energy resources (DER), comprising small, modular, renewable or fossil-based electricity generation units placed at or near the point of energy consumption, have gained much attention as a viable alternative or addition to the current energy system. In 2010, China consumed about 30% of its primary energy in the buildings sector, leading the country to pay great attention to DER development and its applications in buildings. During the 11th Five Year Plan (FYP), China implemented 371 renewable energy building demonstration projects and 210 photovoltaics (PV) building integration projects. At the end of the 12th FYP, China is targeting renewable energy to provide 10% of total building energy, and to save 30 metric tons of CO2 equivalents (mtce) of energy with building-integrated renewables. China is also planning to implement one thousand natural gas-based distributed cogeneration demonstration projects with energy utilization rates over 70% in the 12th FYP. All these policy targets require significant DER systems development for building applications. China's fast urbanization makes building energy efficiency a crucial economic issue; however, only limited studies have been done that examine how to design and select suitable building energy technologies in its different regions.
In the U.S., buildings consumed 40% of the total primary energy in 2010 [1], and it is estimated that about 14 billion m2 of floor space of the existing building stock will be remodeled over the next 30 years. Most building renovation work has been on building envelopes, lighting and HVAC systems. Although interest has emerged, less attention is being paid to DER for buildings. This context has created opportunities for research, development and progressive deployment of DER, due to its potential to combine the production of power and heat (CHP) near the point of consumption and deliver multiple benefits to customers, such as cost
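    DER-CAM itself is a detailed mixed-integer optimization model; the toy linear program below (Python with scipy; prices, load, PV capacity factors, and the annualized PV cost are invented for illustration and this is not the DER-CAM formulation) conveys the flavour of choosing a PV capacity and hourly grid purchases that meet a load at minimum annualized cost.

```python
# Toy linear program in the spirit of DER adoption models: choose a PV
# capacity and hourly grid purchases that meet an hourly load at minimum
# annualized cost. Prices, load and PV capacity factors are illustrative;
# this is NOT the DER-CAM formulation.
import numpy as np
from scipy.optimize import linprog

load = np.array([40, 35, 30, 30, 45, 60, 70, 65])                    # kW, 8 sample hours
cf = np.array([0.0, 0.0, 0.1, 0.4, 0.6, 0.5, 0.2, 0.0])              # PV capacity factor
price = np.array([0.08, 0.08, 0.10, 0.12, 0.15, 0.20, 0.18, 0.10])   # $/kWh
pv_annualized = 120.0                                                 # $/kW-yr, illustrative
hours_per_year = 8760 / len(load)                                     # scale sample hours to a year

# Decision vector: [pv_capacity, grid_1, ..., grid_8]; minimize total annual cost.
c = np.concatenate(([pv_annualized], price * hours_per_year))
# Hourly balance: pv_capacity*cf_t + grid_t >= load_t  ->  -cf_t*pv - grid_t <= -load_t
A_ub = np.hstack((-cf.reshape(-1, 1), -np.eye(len(load))))
b_ub = -load
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + len(load)))
print(f"optimal PV capacity: {res.x[0]:.1f} kW, annual cost: ${res.fun:,.0f}")
```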

  14. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.

  15. Technology Solutions Case Study: Balancing Hydronic Systems in Multifamily Buildings, Chicago, Illinois

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-09-01

    In multifamily building hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution and controls. The effects of imbalance include tenant discomfort, higher energy use intensity and inefficient building operation. In this case study, Partnership for Advanced Residential Retrofit and Elevate Energy explore cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs.

  16. The methane distribution on Titan: high resolution spectroscopy in the near-IR with Keck NIRSPEC/AO

    NASA Astrophysics Data System (ADS)

    Adamkovics, Mate; Mitchell, Jonathan L.

    2014-11-01

    The distribution of methane on Titan is a diagnostic of regional-scale meteorology and large-scale atmospheric circulation. The observed formation of clouds and the transport of heat through the atmosphere both depend on spatial and temporal variations in methane humidity. We have performed observations to measure the distribution of methane on Titan using high spectral resolution near-IR (H-band) observations made with NIRSPEC, with adaptive optics, at Keck Observatory in July 2014. This work builds on previous attempts at this measurement with improvements in the observing protocol and data reduction, together with increased integration times. Radiative transfer models using line-by-line calculation of methane opacities from the HITRAN2012 database are used to retrieve methane abundances. We will describe analysis of the reduced observations, which show latitudinal spatial variation in the region of the spectrum that is thought to be sensitive to methane abundance. Quantifying the methane abundance variation requires models that include the spatial variation in surface albedo and the meridional haze gradient; we will describe (currently preliminary) analysis of the methane distribution and uncertainties in the retrieval.

  17. Technology Solutions Case Study: Long-Term Monitoring of Mini-Split Ductless Heat Pumps in the Northeast, Devens and Easthampton, Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Transformations, Inc., has extensive experience building high-performance homes - production and custom - in a variety of Massachusetts locations and uses mini-split heat pumps (MSHPs) for space conditioning in most of its homes. The use of MSHPs for simplified space-conditioning distribution provides significant first-cost savings, which offsets the increased investment in the building enclosure. In this project, the U.S. Department of Energy Building America team Building Science Corporation evaluated the long-term performance of MSHPs in 8 homes during a period of 3 years. The work examined electrical use of MSHPs, distributions of interior temperatures and humidity when using simplified (two-point) heating systems in high-performance housing, and the impact of open-door/closed-door status on temperature distributions.

  18. Epitrochoid Power-Law Nozzle Rapid Prototype Build/Test Project (Briefing Charts)

    DTIC Science & Technology

    2015-02-01

    Approved for public release; distribution is unlimited; PA clearance # 15122. Briefing charts for the Epitrochoid Power-Law Nozzle rapid prototype build/test effort, which builds on the SpaceX multi-engine approach (Merlin 1D engines on the Falcon 9 v1.1), aiming to utilize advances in high-performance engines together with the economies of scale of the SpaceX Falcon 9 multi-engine approach.

  19. Policy Building Blocks: Helping Policymakers Determine Policy Staging for the Development of Distributed PV Markets: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doris, E.

    2012-04-01

    There is a growing body of qualitative and a limited body of quantitative literature supporting the common assertion that policy drives development of clean energy resources. Recent work in this area indicates that the impact of policy depends on policy type, length of time in place, and the economic and social contexts of implementation. This work aims to inform policymakers about the impact of different policy types and to assist in the staging of those policies to maximize individual policy effectiveness and development of the market. To do so, this paper provides a framework for policy development to support the market for distributed photovoltaic systems. Next steps include mathematical validation of the framework and development of specific policy pathways given state economic and resource contexts.

  20. A uniform approach for programming distributed heterogeneous computing systems

    PubMed Central

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-01-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
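    The dependency-tracking idea described above can be illustrated generically: commands and their event dependencies form a DAG that is executed in a dependency-respecting order. The sketch below (Python; generic command names, not the libWater API) shows a plain topological ordering of such a DAG.

```python
# Minimal sketch of building a DAG of commands from event dependencies and
# executing them in a dependency-respecting (topological) order. A generic
# illustration, not the libWater runtime or its API.
from collections import defaultdict, deque

def topological_order(edges, nodes):
    """edges: iterable of (prerequisite, dependent) pairs."""
    indegree = {n: 0 for n in nodes}
    succ = defaultdict(list)
    for pre, dep in edges:
        succ[pre].append(dep)
        indegree[dep] += 1
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in succ[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                queue.append(m)
    if len(order) != len(nodes):
        raise ValueError("dependency cycle detected")
    return order

commands = ["copy_in", "kernel_A", "kernel_B", "copy_out"]
deps = [("copy_in", "kernel_A"), ("copy_in", "kernel_B"),
        ("kernel_A", "copy_out"), ("kernel_B", "copy_out")]
print(topological_order(deps, commands))
```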

  1. Modelling of DNA-Mediated Two- and Three-Dimensional Protein-Protein and Protein-Nanoparticle Self-Assembly

    NASA Astrophysics Data System (ADS)

    Millan, Jaime; McMillan, Janet; Brodin, Jeff; Lee, Byeongdu; Mirkin, Chad; Olvera de La Cruz, Monica

    Programmable DNA interactions represent a robust scheme to self-assemble a rich variety of tunable superlattices, where intrinsic and in some cases undesirable nanoscale building-block interactions are replaced by DNA hybridization events. Recent advances in synthesis have allowed the extension of this successful scheme to proteins, where the DNA distribution can be tuned independently of protein shape by selectively addressing surface residues, giving rise to assembly properties in three-dimensional protein-nanoparticle superlattices that depend on the DNA distribution. In parallel to these advances, we introduced a scalable coarse-grained model that faithfully reproduces the previously observed co-assemblies of nanoparticle and protein conjugates. Herein, we implement this numerical model to explain the stability of complex protein-nanoparticle binary superlattices and to elucidate experimentally inaccessible features such as protein orientation. We will also discuss systematic studies that highlight the role of DNA distribution and sequence in two-dimensional protein-protein and protein-nanoparticle superlattices.

  2. A uniform approach for programming distributed heterogeneous computing systems.

    PubMed

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  3. Browndye: A software package for Brownian dynamics

    NASA Astrophysics Data System (ADS)

    Huber, Gary A.; McCammon, J. Andrew

    2010-11-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. Program summary - Program title: Browndye. Catalogue identifier: AEGT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: MIT license, included in distribution. No. of lines in distributed program, including test data, etc.: 143 618. No. of bytes in distributed program, including test data, etc.: 1 067 861. Distribution format: tar.gz. Programming language: C++, OCaml (http://caml.inria.fr/). Computer: PC, Workstation, Cluster. Operating system: Linux. Has the code been vectorised or parallelized?: Yes, runs on multiple processors with shared memory using pthreads. RAM: depends linearly on the size of the physical system. Classification: 3. External routines: uses the output of APBS [1] (http://www.poissonboltzmann.org/apbs/) as input; APBS must be obtained and installed separately. Expat 2.0.1, CLAPACK, ocaml-expat, and Mersenne Twister are included in the Browndye distribution. Nature of problem: exploration and determination of rate constants of bimolecular interactions involving large biological molecules. Solution method: Brownian dynamics with electrostatic, excluded volume, van der Waals, and desolvation forces. Running time: depends linearly on the size of the physical system and quadratically on the precision of the results; the included example executes in a few minutes.
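    For orientation, the sketch below shows an Ermak-McCammon-style Brownian dynamics step for a single spherical particle (Python with numpy; parameter values are illustrative, and this is only free diffusion plus a force term, not the Browndye algorithm with electrostatics, excluded volume, and reaction criteria).

```python
# Minimal sketch of an Ermak-McCammon-style Brownian dynamics step for a
# single spherical particle: drift from a deterministic force plus Gaussian
# diffusion. Parameter values are illustrative; this is not the Browndye
# algorithm, which adds electrostatics, excluded volume and reaction criteria.
import numpy as np

rng = np.random.default_rng(3)
kT = 4.11e-21          # J, roughly 300 K
D = 1.0e-10            # m^2/s, illustrative diffusion coefficient
dt = 1.0e-9            # s, time step

def bd_step(x, force):
    """Advance position x (m, 3-vector) by one time step under force (N)."""
    drift = (D / kT) * force * dt
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(3)
    return x + drift + noise

x = np.zeros(3)
for _ in range(1000):
    x = bd_step(x, force=np.zeros(3))      # free diffusion for this sketch
print("net displacement after 1 microsecond (m):", np.linalg.norm(x))
```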

  4. The role of order in distributed programs

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Marzullo, Keith

    1989-01-01

    The role of order in building distributed systems is discussed. The belief is that a principle of event ordering underlies the wide range of operating system mechanisms that have been put forward for building robust distributed software. Stated concisely, this principle achieves correct distributed behavior by ordering classes of distributed events that conflict with one another. By focusing on order, simplified descriptions can be obtained, and convincingly correct solutions can be found to problems that might otherwise have looked extremely complex. Moreover, it is observed that there are a limited number of ways to obtain order, and that the choice made impacts greatly on performance.
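    Event ordering in distributed systems is often illustrated with Lamport logical clocks; the sketch below (Python) is a generic illustration of that ordering idea, not one of the specific operating system mechanisms referred to in this record.

```python
# Minimal sketch of Lamport logical clocks, one common way of imposing a
# causal order on distributed events. A generic illustration of the ordering
# principle, not the specific protocols referred to in this record.
class Process:
    def __init__(self, name):
        self.name, self.clock = name, 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock                      # timestamp carried by the message

    def receive(self, msg_timestamp):
        self.clock = max(self.clock, msg_timestamp) + 1
        return self.clock

p, q = Process("P"), Process("Q")
p.local_event()                 # P: 1
ts = p.send()                   # P: 2, message timestamped 2
q.local_event()                 # Q: 1
print("Q receives at logical time", q.receive(ts))   # Q: max(1, 2) + 1 = 3
```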

  5. Balancing Hydronic Systems in Multifamily Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruch, R.; Ludwig, P.; Maurer, T.

    2014-07-01

    In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution and controls. The effects of imbalance include tenant discomfort, higher energy use intensity and inefficient building operation. This paper explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The research was conducted by The Partnership for Advanced Residential Retrofit (PARR) in conjunction with Elevate Energy. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperature in a multifamily hydronic building can vary as much as 61 degrees F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. Average temperature spread at the building as a result of this retrofit decreased from 22.1 degrees F to 15.5 degrees F.

  6. Angular distributions of absorbed dose of Bremsstrahlung and secondary electrons induced by 18-, 28- and 38-MeV electron beams in thick targets.

    PubMed

    Takada, Masashi; Kosako, Kazuaki; Oishi, Koji; Nakamura, Takashi; Sato, Kouichi; Kamiyama, Takashi; Kiyanagi, Yoshiaki

    2013-03-01

    Angular distributions of the absorbed dose of Bremsstrahlung photons and secondary electrons over a wide range of emission angles, from 0 to 135°, were experimentally obtained using an ion chamber with a 0.6 cm3 air volume covered with or without a build-up cap. The Bremsstrahlung photons and electrons were produced by 18-, 28- and 38-MeV electron beams bombarding tungsten, copper, aluminium and carbon targets. The absorbed doses were also calculated from simulated photon and electron energy spectra by multiplying by the simulated response functions of the ion chambers, obtained with the MCNPX code. The calculated-to-experimental (C/E) dose ratios obtained are from 0.70 to 1.57 for the high-Z targets of W and Cu from 15 to 135°, and the C/E values range from 0.6 to 1.4 at 0°; however, the values of C/E for the low-Z targets of Al and C are from 0.5 to 1.8 from 0 to 135°. The angular distributions at forward angles decrease with increasing angle; on the other hand, the angular distributions at backward angles depend on the target species. The dependences of the absorbed doses on electron energy and target thickness were compared between the measured and simulated results. The attenuation profiles of the absorbed doses of the Bremsstrahlung beams at 0, 30 and 135° were also measured.

  7. Spatiotemporal patterns of population distribution as crucial element for risk management

    NASA Astrophysics Data System (ADS)

    Gokesch, Karin; Promper, Catrin; van Westen, Cees J.; Glade, Thomas

    2014-05-01

    The spatiotemporal distribution and presence of the population in a certain area is a crucial element within natural hazard risk management, especially in the case of rapid onset hazard events and emergency management. When fast onset hazards such as earthquakes, flash floods or industrial accidents occur, people may not have adequate time for evacuation, and emergency management requires a fast response and reaction. Therefore, detailed information on the distribution of people affected by a certain hazard is important for a fast assessment of the situation, including the number and the type of people affected (distinguishing between elderly or handicapped people, children, working population, etc.). This study thus aims at analyzing population distribution on an hourly basis for different days, e.g. a workday or a holiday. The applied method combines the basic assessment of population distribution in a given area with specific location-related patterns of distribution changes over time. The calculations are based on detailed information regarding the expected presence of certain groups of people, e.g. school children, working or elderly people, which all show different patterns of movement over certain time periods. The study area is the city of Waidhofen/Ybbs, located in the Alpine foreland in the southwest of Lower Austria. This city serves as a regional center providing basic infrastructure, shops and schools for the surrounding countryside. Therefore, many small and medium businesses are located in this area, resulting in a rather high variation in the population present at different times of the day. The available building footprint information was classified with respect to building type and occupancy type, which was used to estimate the expected residents within the buildings, based on the floorspace of the buildings and the average floorspace per person. Additional information on the distribution and the average duration of stay of the people in these buildings was assessed using general population statistics and specific information about selected buildings, such as schools, hospitals or homes for the elderly, to calculate the distribution patterns for each group of people over time.

  8. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    This report defines a model view for exchanging electrical distribution system information through building information models (BIM) at the coordinated design stage of building construction. It references a standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, and the Construction Operations Building information exchange (COBie) specifications.

  9. Stability and Degradation Mechanisms of Metal-Organic Frameworks Containing the Zr6O4(OH)4 Secondary Building Unit

    DTIC Science & Technology

    2013-03-18

    Approved for public release; distribution is unlimited. Report title: Stability and degradation mechanisms of metal-organic frameworks containing the Zr6O4(OH)4 secondary building unit. For the report content, see the publication.

  10. Method of determining dispersion dependence of refractive index of nanospheres building opals

    NASA Astrophysics Data System (ADS)

    Kępińska, Mirosława; Starczewska, Anna; Duka, Piotr

    2017-11-01

    A method of determining the dispersion dependence of the refractive index of the nanospheres building opals is presented. In this method, based on the angular dependences of the spectral positions of the Bragg diffraction minima in transmission spectra for a series of opals of known sphere diameter, the spectrum of the effective refractive index of the opals, and then the refractive index of the material building the opal spheres, is determined. The described procedure is used to determine neff(λ) for the opals and nsph(λ) for the material of which the spheres building the investigated opals are made. The obtained results are compared with literature data for nSiO2(λ) considered in the analysis and interpretation of extrema related to light diffraction at the (hkl) SiO2 opal planes.
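    The angular dependence of the Bragg minima is usually interpreted through the Bragg-Snell relation lambda = 2*d*sqrt(n_eff^2 - sin^2(theta)); the sketch below (Python with numpy; wavelengths, angles, sphere diameter, and the close-packed filling fraction f = 0.74 used in the volume-averaged effective-medium step are illustrative assumptions, not the paper's data or exact procedure) inverts this relation for n_eff and then estimates the sphere index.

```python
# Minimal sketch: extract the effective refractive index of an opal from the
# angular shift of the (111) Bragg minimum via the Bragg-Snell relation
#   lambda_min = 2 * d111 * sqrt(n_eff**2 - sin(theta)**2),
# then estimate the sphere index with a volume-averaged effective-medium
# relation assuming close-packed filling fraction f = 0.74. Wavelengths,
# angles and the sphere diameter are illustrative, not measured data.
import numpy as np

sphere_diameter = 300e-9                       # m
d111 = np.sqrt(2.0 / 3.0) * sphere_diameter    # (111) spacing of an fcc opal

theta = np.radians([0.0, 20.0, 40.0])          # external angles of incidence
lam_min = np.array([660e-9, 639e-9, 580e-9])   # illustrative Bragg-minimum positions

# Invert the Bragg-Snell relation at each angle and average the estimates.
n_eff = np.sqrt((lam_min / (2.0 * d111)) ** 2 + np.sin(theta) ** 2).mean()

f, n_air = 0.74, 1.0                           # assumed filling fraction of the spheres
n_sphere = np.sqrt((n_eff**2 - (1.0 - f) * n_air**2) / f)
print(f"n_eff ~ {n_eff:.3f}, n_sphere ~ {n_sphere:.3f}")
```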

  11. A Study towards Building An Optimal Graph Theory Based Model For The Design of Tourism Website

    NASA Astrophysics Data System (ADS)

    Panigrahi, Goutam; Das, Anirban; Basu, Kajla

    2010-10-01

    An effective tourism website is a key to attracting tourists from different parts of the world. Here we identify the factors that improve the effectiveness of a website by considering it as a graph, where the web pages, including the homepage, are the nodes and the hyperlinks are the edges between the nodes. In this model, the design constraints for building a tourism website are taken into consideration. Our objectives are to build a framework for an effective tourism website providing an adequate level of information and service, and also to enable users to reach the desired page by spending minimal loading time. In this paper an information hierarchy specifying the upper limit on the outgoing links of a page has also been proposed. Following the hierarchy, a web developer can prepare an effective tourism website. Here the loading time depends on page size and network traffic. We have assumed the network traffic to be uniform and the loading time to be directly proportional to page size. The approach works by quantifying the link structure of a tourism website. We also propose a page size distribution pattern for a tourism website.

  12. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or on inhabitant density, applied over statistical or administrative terrain units such as NUTS or parishes. However, this kind of approach may skew the obtained results, underestimating the importance of population, mainly in territorial units with a predominance of rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not the ideal setting when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by FCT - the Portuguese Foundation for Science and Technology.
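    The building areal weighting step of dasymetric mapping amounts to redistributing each census unit's population to its buildings in proportion to residential floorspace; a minimal sketch (Python; the figures are invented for illustration) is given below.

```python
# Minimal sketch of dasymetric population mapping by building areal
# weighting: the census-unit population is redistributed to residential
# buildings in proportion to their floorspace. Figures are illustrative.
unit_population = 1200
buildings = {                      # building id -> residential floorspace (m2)
    "B1": 2400.0,
    "B2": 800.0,
    "B3": 0.0,                     # non-residential, receives no population
    "B4": 1600.0,
}

total_floorspace = sum(buildings.values())
population_by_building = {
    b: unit_population * area / total_floorspace for b, area in buildings.items()
}
print(population_by_building)
```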

  13. Simulation of concentration distribution of urban particles under wind

    NASA Astrophysics Data System (ADS)

    Chen, Yanghou; Yang, Hangsheng

    2018-02-01

    High concentrations of particulate matter in the air seriously affect people's health, and particle concentrations in densely populated towns are also high. Understanding the distribution of particles in the air helps to remove them passively. The concentration distribution of particles in urban streets is simulated using the FLUENT software; the simulation analysis is based on the Discrete Phase Model (DPM) of FLUENT. The simulation results show that the distribution of the particles is shaped by the different layouts of the buildings, and it is pointed out that the windward area of a building and the leeward sides of a high-rise building are the areas with high particle concentrations. Understanding the particle concentration in different areas also helps people avoid high-concentration areas and reduce their exposure there.

  14. 76 FR 63923 - Notice of Availability To Distribute a Final Environmental Impact Statement (FEIS) for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-14

    ... and Environmental Design Green Building Rating System of the U.S. Green Building Council. Alternatives... International Falls, MN AGENCY: Public Buildings Service, General Services Administration (GSA). ACTION: Notice...: October 14, 2011. FOR FURTHER INFORMATION CONTACT: Donald R. Melcher, Jr., GSA Public Buildings Service...

  15. ASSESSMENT OF VENTILATION RATES IN 100 BASE OFFICE BUILDINGS

    EPA Science Inventory

    The U.S. EPA's Office of Radiation and Indoor Air studied 100 public and private office buildings across the U.S. from 1994-1998. The purpose of the study, entitled The Building Assessment Survey and Evaluation Study (BASE), was to: a) provide a distribution of IAQ, building, and...

  16. Building Security and Personal Safety. SPEC Kit 150.

    ERIC Educational Resources Information Center

    Bingham, Karen Havill

    This report on a survey of Association of Research Libraries (ARL) member libraries on building security and personal safety policies examines three areas in detail: (1) general building security (access to the building, key distribution, patrols or monitors, intrusion prevention, lighting, work environment after dark); (2) problem behavior…

  17. Lateral-torsional response of base-isolated buildings with curved surface sliding system subjected to near-fault earthquakes

    NASA Astrophysics Data System (ADS)

    Mazza, Fabio

    2017-08-01

    The curved surface sliding (CSS) system is one of the most in-demand techniques for the seismic isolation of buildings; yet there are still important aspects of its behaviour that need further attention. The friction coefficient of the CSS system varies with the sliding velocity of the CSS bearings, while the friction force and lateral stiffness during the sliding phase are proportional to the axial load. The lateral-torsional response needs to be better understood for base-isolated structures located in near-fault areas, where fling-step and forward-directivity effects can produce long-period (horizontal) velocity pulses. To analyse these aspects, a six-storey reinforced concrete (r.c.) office framed building, with an L-shaped plan and setbacks in elevation, is designed assuming three values of the radius of curvature for the CSS system. Seven in-plan distributions of the dynamic-fast friction coefficient for the CSS bearings, ranging from a constant value for all isolators to a different value for each, are considered in the case of low- and medium-type friction properties. The seismic analysis of the test structures is carried out considering elastic-linear behaviour of the superstructure, while a nonlinear force-displacement law of the CSS bearings is considered in the horizontal direction, depending on sliding velocity and axial load. Given the lack of knowledge of the horizontal direction at which near-fault ground motions occur, the maximum torsional effects and residual displacements are evaluated with reference to different incidence angles, while the orientation of the strongest observed pulses is considered to obtain average values.

  18. Center for the Built Environment: Research on Building HVAC Systems

    Science.gov Websites

    Research topics include Underfloor Air Distribution (UFAD): a UFAD cooling airflow design tool (simplified design tools for optimization of underfloor systems), UFAD cost studies, near-ZNE buildings, a setpoint energy savings calculator, and UFAD case studies.

  19. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1988-01-01

    The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept for structuring software at all levels of the system. The basic operating system was developed, and work is in progress to build a usable system.

  20. Coupling indoor airflow, HVAC, control and building envelope heat transfer in the Modelica Buildings library

    DOE PAGES

    Zuo, Wangda; Wetter, Michael; Tian, Wei; ...

    2015-07-13

    Here, this paper describes a coupled dynamic simulation of an indoor environment with heating, ventilation, and air conditioning (HVAC) systems, controls and building envelope heat transfer. The coupled simulation can be used for the design and control of ventilation systems with stratified air distributions. Those systems are commonly used to reduce building energy consumption while improving the indoor environment quality. The indoor environment was simulated using the fast fluid dynamics (FFD) simulation programme. The building fabric heat transfer, HVAC and control system were modelled using the Modelica Buildings library. After presenting the concept, the mathematical algorithm and the implementation of the coupled simulation were introduced. The coupled FFD–Modelica simulation was then evaluated using three examples of room ventilation with complex flow distributions with and without feedback control. Lastly, further research and development needs were also discussed.

  2. Halogen dependent symmetry change in two series of wheel cluster organic frameworks built from La18 tertiary building units.

    PubMed

    Fang, Wei-Hui; Zhang, Lei; Zhang, Jian; Yang, Guo-Yu

    2016-01-25

    Two series of wheel cluster organic frameworks (WCOFs) built from La18 tertiary building units were prepared hydrothermally; they show halogen-dependent structural symmetry and demonstrate different chiral performances.

  3. URBANopt Advanced Analytics Platform | Buildings | NREL

    Science.gov Websites

    In mixed-use districts, different buildings may peak in energy consumption at different times. The platform targets districts, neighborhoods, and campuses with different building types and mixed-use buildings, addressing questions such as how energy consumption varies under different building efficiency scenarios.

  4. [Carbon footprint of buildings in the urban agglomeration of central Liaoning, China].

    PubMed

    Shi, Yu; Yun, Ying Xia; Liu, Chong; Chu, Ya Qi

    2017-06-18

    With the development of urbanization in China, buildings consume large amounts of material and energy, and estimating the carbon emissions of buildings is an important scientific problem. The carbon footprint of the central Liaoning agglomeration was studied using a carbon footprint approach together with geographic information system (GIS) and high-resolution remote sensing (HRRS) technology. The results showed that the construction carbon footprint coefficient of the central Liaoning urban agglomeration was 269.16 kg·m^-2. The approach of interpreting total building area and spatial distribution with HRRS was effective, with an accuracy of 89%, and the extraction approach was critical for estimating the total carbon footprint and its spatial distribution. The building area and total carbon footprint of the central Liaoning urban agglomeration, in descending order, were Shenyang, Anshan, Fushun, Liaoyang, Yingkou, Tieling and Benxi. The annual average increment of the footprint from 2011 to 2013, in descending order, was Shenyang, Benxi, Fushun, Anshan, Tieling, Yingkou and Liaoyang. Accurate estimation of the construction carbon footprint and its spatial distribution is significant for the planning and optimization of carbon emission reduction.
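
    A small sketch of the footprint arithmetic implied above: total construction carbon footprint as the reported coefficient (269.16 kg per m2) multiplied by the building area interpreted from imagery. The per-city areas below are placeholders, not the study's values.

        # Construction carbon footprint = coefficient (kg per m2) x interpreted building area.
        COEFFICIENT_KG_PER_M2 = 269.16   # value reported for the central Liaoning agglomeration

        def carbon_footprint_tonnes(building_area_m2, coeff=COEFFICIENT_KG_PER_M2):
            """Total construction carbon footprint in tonnes for a given building area."""
            return building_area_m2 * coeff / 1000.0

        # Illustrative (made-up) interpreted building areas per city, in m2.
        areas = {"Shenyang": 2.1e8, "Anshan": 9.5e7, "Fushun": 6.0e7}
        for city, area in sorted(areas.items(), key=lambda kv: -kv[1]):
            print(city, round(carbon_footprint_tonnes(area)), "t")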

  5. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
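
    A minimal Monte Carlo sketch of the idea behind steps (3)-(6): sample uncertain inputs from assigned distributions and propagate them through a deterministic model to obtain a dose distribution. The toy dose function and the parameter distributions below are placeholders and do not represent RESRAD's actual models or default distributions.

        # Toy probabilistic wrapper around a deterministic dose model: sample input
        # parameters from their distributions and collect the resulting dose distribution.
        import random, statistics

        def deterministic_dose(soil_concentration, occupancy_factor, shielding_factor):
            """Placeholder dose model (not RESRAD): dose scales with all three inputs."""
            return soil_concentration * occupancy_factor * (1.0 - shielding_factor)

        def sample_parameters(rng):
            """Draw one parameter set from assumed (illustrative) distributions."""
            return {
                "soil_concentration": rng.lognormvariate(0.0, 0.5),   # unitless toy activity
                "occupancy_factor": rng.uniform(0.4, 0.9),
                "shielding_factor": rng.triangular(0.1, 0.6, 0.3),
            }

        rng = random.Random(42)
        doses = [deterministic_dose(**sample_parameters(rng)) for _ in range(10_000)]
        doses.sort()
        print("mean dose:", round(statistics.mean(doses), 3))
        print("95th percentile:", round(doses[int(0.95 * len(doses))], 3))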

  6. An Improved Simulation of the Diurnally Varying Street Canyon Flow

    NASA Astrophysics Data System (ADS)

    Yaghoobian, Neda; Kleissl, Jan; Paw U, Kyaw Tha

    2012-11-01

    The impact of diurnal variation of temperature distribution over building and ground surfaces on the wind flow and scalar transport in street canyons is numerically investigated using the PArallelized LES Model (PALM). The Temperature of Urban Facets Indoor-Outdoor Building Energy Simulator (TUF-IOBES) is used for predicting urban surface heat fluxes as boundary conditions for a modified version of PALM. TUF-IOBES dynamically simulates indoor and outdoor building surface temperatures and heat fluxes in an urban area taking into account weather conditions, indoor heat sources, building and urban material properties, composition of the building envelope (e.g. windows, insulation), and HVAC equipment. Temperature (and heat flux) distribution over urban surfaces of the 3-D raster-type geometry of TUF-IOBES makes it possible to provide realistic, high resolution boundary conditions for the numerical simulation of flow and scalar transport in an urban canopy. Compared to some previous analyses using uniformly distributed thermal forcing associated with urban surfaces, the present analysis shows that resolving non-uniform thermal forcings can provide more detailed and realistic patterns of the local air flow and pollutant dispersion in urban canyons.

  7. Structure and Mechanical Properties of the AlSi10Mg Alloy Samples Manufactured by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Li, Xiaodan; Ni, Jiaqiang; Zhu, Qingfeng; Su, Hang; Cui, Jianzhong; Zhang, Yifei; Li, Jianzhong

    2017-11-01

    AlSi10Mg alloy samples with dimensions of 14 × 14 × 91 mm were produced by the selective laser melting (SLM) method in different building directions. The structures, and the properties at -70 °C, of samples built in different directions were investigated. The results show that the structure exhibits different morphologies in different building directions: fish-scale structures appear on the side parallel to the building direction, and oval structures on the side perpendicular to the building direction. Pores with a maximum size of about 100 μm exist in the structure. The build orientation has no major influence on the tensile properties. The tensile strength and elongation of the sample along the building direction are 340 MPa and 11.2%, respectively, and the tensile strength and elongation of the sample perpendicular to the building direction are 350 MPa and 13.4%, respectively.

  8. Chilled water study EEAP program for Walter Reed Army Medical Center: Book 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-02-01

    The Energy Engineering Analysis Program (EEAP) Study for Walter Reed Army Medical Center (WRAMC) was to provide a thorough examination of the central chilled water plants on site. WRAMC is comprised of seventy-one (71) buildings located on a 113-acre site in Washington, D.C. There are two (2) central chilled water plants (Buildings 48 and 49) each with a primary chilled water distribution system. In addition to the two (2) central plants, three (3) buildings utilize their own independent chillers. Two (2) of the independent chillers (Buildings 7 and T-2), one of which is inoperative (T-2), are smaller air-cooled units, while the third (Building 54) has a 1,900-ton chilled water plant comprised of three (3) centrifugal chillers. Of the two (2) central chilled water plants, Building 48 houses six (6) chillers totalling 7,080 tons of cooling and Building 49 houses one (1) chiller with 660 tons of cooling. The total chiller cooling capacity available on site is 9,840 tons. The chilled water systems were reviewed for alternative ways of conserving energy on site and reducing the peak-cooling load. Distribution systems were reviewed to determine which buildings were served by each of the chilled water plants and to determine chilled water usage on site. Evaluations were made of building exterior and interior composition in order to estimate cooling loads. Interviews with site personnel helped Entech better understand the chilled water plants, the distribution systems, and how each system was utilized.

  9. Impact of atmospheric particulate matter pollutants to IAQ of airport terminal buildings: A first field study at Tianjin Airport, China

    NASA Astrophysics Data System (ADS)

    Ren, Jianlin; Cao, Xiaodong; Liu, Junjie

    2018-04-01

    Passengers usually spend hours in airport terminal buildings waiting for their departure. During the long waiting period, ambient fine particles (PM2.5) and ultrafine particles (UFP) generated by airliners may penetrate into terminal buildings through open doors and the HVAC system. However, limited data are available on passenger exposure to particulate pollutants in terminal buildings. We conducted on-site measurements of PM2.5 and UFP concentrations and the particle size distribution in the terminal building of Tianjin Airport, China during three different seasons. The results showed that the PM2.5 concentrations in the terminal building were considerably higher than the Chinese standard and WHO guideline values in all of the tested seasons, and that conditions were significantly affected by the outdoor air (Spearman test, p < 0.01). The indoor/outdoor PM2.5 ratios (I/O) ranged from 0.67 to 0.84 in the arrival hall and 0.79 to 0.96 in the departure hall. The particle number concentration in the terminal building presented a bi-modal size distribution, with one mode at 30 nm and another at 100 nm. These results differ markedly from the size distributions measured in a normal urban environment. The total UFP exposure during the whole waiting period (including in the terminal building and airliner cabin) of a passenger is approximately equivalent to 11 h of exposure to a normal urban environment. This study is expected to contribute to the improvement of indoor air quality and the health of passengers in airport terminal buildings.

  10. Particle size distribution and composition in a mechanically ventilated school building during air pollution episodes.

    PubMed

    Parker, J L; Larson, R R; Eskelson, E; Wood, E M; Veranth, J M

    2008-10-01

    Particle count-based size distribution and PM(2.5) mass were monitored inside and outside an elementary school in Salt Lake City (UT, USA) during the winter atmospheric inversion season. The site is influenced by urban traffic and the airshed is subject to periods of high PM(2.5) concentration that is mainly submicron ammonium and nitrate. The school building has mechanical ventilation with filtration and variable-volume makeup air. Comparison of the indoor and outdoor particle size distribution on the five cleanest and five most polluted school days during the study showed that the ambient submicron particulate matter (PM) penetrated the building, but indoor concentrations were about one-eighth of outdoor levels. The indoor:outdoor PM(2.5) mass ratio averaged 0.12 and particle number ratio for sizes smaller than 1 microm averaged 0.13. The indoor submicron particle count and indoor PM(2.5) mass increased slightly during pollution episodes but remained well below outdoor levels. When the building was occupied the indoor coarse particle count was much higher than ambient levels. These results contribute to understanding the relationship between ambient monitoring station data and the actual human exposure inside institutional buildings. The study confirms that staying inside a mechanically ventilated building reduces exposure to outdoor submicron particles. This study supports the premise that remaining inside buildings during particulate matter (PM) pollution episodes reduces exposure to submicron PM. New data on a mechanically ventilated institutional building supplements similar studies made in residences.

  11. Impact of the spectral and spatial properties of natural light on indoor gas-phase chemistry: Experimental and modeling study.

    PubMed

    Blocquet, M; Guo, F; Mendez, M; Ward, M; Coudert, S; Batut, S; Hecquet, C; Blond, N; Fittschen, C; Schoemaecker, C

    2018-05-01

    The characteristics of indoor light (intensity, spectral and spatial distribution) originating from outdoors have been studied using experimental and modeling tools. They are influenced by many parameters such as building location, meteorological conditions, and the type of window. They have a direct impact on indoor air quality through a change in chemical processes, by varying the photolysis rates of indoor pollutants. Transmittances of different windows have been measured and exhibit different wavelength cutoffs, thus influencing the potential of different species to be photolysed. The spectral distribution of light entering indoors through the windows was measured under different conditions and was found to be weakly dependent on the time of day for indirect cloudy, direct sunshine, and partly cloudy conditions, in contrast to the light intensity, in agreement with calculations of the transmittance as a function of the zenith angle and the calculated outdoor spectral distribution. The same conclusion can be drawn concerning the position within the room. The impact of these light characteristics on indoor chemistry has been studied using the INCA-Indoor model by considering the variation in the photolysis rates of key indoor species. Depending on the conditions, photolysis processes can lead to a significant production of radicals and secondary species. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. The distribution of odd nitrogen in the lower stratosphere and possible perturbations caused by stratospheric air transport

    NASA Technical Reports Server (NTRS)

    Isaksen, I. S. A.; Hesstvedt, E.

    1973-01-01

    In the lower stratosphere a significant production of odd nitrogen results from the reaction N2O + O(1D) yields 2NO. Since the transport is relatively slow, odd nitrogen builds up with a maximum mixing ratio of 2 x 10(-8) at 30 km. Profiles of odd nitrogen, for different latitudes, winter and summer, are computed from one-dimensional transport models. Variations with latitude are small. Horizontal transport is therefore not believed to alter our results significantly. In order to evaluate the effect of odd nitrogen upon the ozone layer, NO(x) profiles are calculated. OH is here a key component, since it converts NO2 to HNO3. In the region where ozone is determined by chemistry rather than by transport (above 25 km), NO2 is found to be relatively abundant. The effect of stratospheric transport on the NO(x) distribution is shown to depend critically upon the height of emission. The effect increases by a factor of 5 or more for a change of flight level from 18 km to 23 km. This strong dependence should be duly considered when future stratospheric transport is discussed.

  13. Prospects for development of heat supply systems in high-rise districts

    NASA Astrophysics Data System (ADS)

    Zhila, Viktor; Solovyeva, Elena

    2018-03-01

    The article analyzes the main advantages and disadvantages of centralized and decentralized heat supply systems in high-rise districts. The main schemes of centralized heat supply systems are considered; they include centralized heat supply from boiler houses, centralized heat supply from autonomous heat sources, heat supply from roof boiler houses and door-to-door heating supply. For each of these variants, the gas distribution systems are considered and analyzed; these systems vary depending on the heat source location. For each of these systems, technical and economic indicators are taken into account, the analysis of which allows choosing the best option for districts where high-rise buildings predominate.

  14. Simulation of Mean Flow and Turbulence over a 2D Building Array Using High-Resolution CFD and a Distributed Drag Force Approach

    DTIC Science & Technology

    2016-06-16

    The predictive capabilities of the high-resolution computational fluid dynamics (CFD) simulations of urban flow are validated against a very...turbulence over a 2D building array using high-resolution CFD and a distributed drag force approach.

  15. Marketing and Distributive Education Curriculum Guide: Hardware-Building Materials, Farm and Garden.

    ERIC Educational Resources Information Center

    Cluck, Janice Bora

    Designed to be used with the General Marketing Curriculum Planning Guide (ED 156 860), this guide is intended to provide the curriculum coordinator with a basis for planning a comprehensive program in the field of marketing for farm and garden hardware building materials; it is designed also to allow marketing and distributive education…

  16. A sensor-less LED dimming system based on daylight harvesting with BIPV systems.

    PubMed

    Yoo, Seunghwan; Kim, Jonghun; Jang, Cheol-Yong; Jeong, Hakgeun

    2014-01-13

    Artificial lighting in office buildings typically accounts for 30% of the building's total energy consumption, providing a substantial opportunity for energy savings. To reduce the energy consumed by indoor lighting, we propose a sensor-less light-emitting diode (LED) dimming system using daylight harvesting. In this study, we used light simulation software to quantify and visualize daylight, and analyzed the correlation between photovoltaic (PV) power generation and indoor illumination in an office with an integrated PV system. In addition, we calculated the distribution of daylight illumination in the office and the dimming ratios for the individual control of LED lights. We were also able to use the electric power generated by the PV system directly. As a result, power consumption for electric lighting was reduced by 40-70%, depending on the season and the weather conditions. Thus, the dimming system proposed in this study can be used to control electric lighting to reduce energy use cost-effectively and simply.
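
    A small sketch (not the authors' implementation) of a sensor-less dimming rule of the kind described above: concurrent PV power is used as a daylight proxy through an assumed linear correlation, and the LEDs are dimmed to supply only the illuminance still missing at the workplane. The correlation slope and the setpoint are assumed values.

        # Sensor-less dimming: use PV power as a daylight proxy (via an assumed linear
        # correlation) and dim the LEDs so daylight + electric light meets the setpoint.
        SETPOINT_LUX = 500.0          # target workplane illuminance
        LUX_PER_WATT_PV = 2.5         # assumed correlation: indoor daylight lux per W of PV output

        def dimming_ratio(pv_power_w, setpoint=SETPOINT_LUX, lux_per_w=LUX_PER_WATT_PV):
            """Fraction of full LED output required after accounting for daylight."""
            daylight_lux = pv_power_w * lux_per_w
            return max(0.0, min(1.0, (setpoint - daylight_lux) / setpoint))

        for pv in (0.0, 80.0, 160.0, 250.0):
            print(f"PV {pv:6.1f} W -> LED dimming ratio {dimming_ratio(pv):.2f}")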

  17. [THE USE OF OPEN REAL ESTATE DATABASES FOR THE ANALYSIS OF INFLUENCE OF CONCOMITANT FACTORS ON THE STATE OF THE URBAN POPULATION'S HEALTH].

    PubMed

    Zheleznyak, E V; Khripach, L V

    2015-01-01

    A new method is suggested for assessing certain social and lifestyle factors in hygienic health examinations of the urban population, based on open real estate databases covering residential areas of a given city. Using the Moscow FlatInfo portal and a sample of 140 residents of the city of Moscow, the distribution of factors available for analysis was studied: the typical design of the building in which a citizen resides, the year of its construction, and the market price of 1 m2 of housing space in that building. The latter value is a quantitative integrated assessment of the social and lifestyle quality of housing, depending on the type and technical condition of the building, the neighborhood environment, the infrastructure of the region and many other factors, and may be a useful supplemental index in hygienic research.

  18. Building resilience of the Global Positioning System to space weather

    NASA Astrophysics Data System (ADS)

    Fisher, Genene; Kunches, Joseph

    2011-12-01

    Almost every aspect of the global economy now depends on GPS. Worldwide, nations are working to create a robust Global Navigation Satellite System (GNSS), which will provide global positioning, navigation, and timing (PNT) services for applications such as aviation, electric power distribution, financial exchange, maritime navigation, and emergency management. The U.S. government is examining the vulnerabilities of GPS, and it is well known that space weather events, such as geomagnetic storms, contribute to errors in single-frequency GPS and are a significant factor for differential GPS. The GPS industry has lately begun to recognize that total electron content (TEC) signal delays, ionospheric scintillation, and solar radio bursts can also interfere with daily operations and that these threats grow with the approach of the next solar maximum, expected to occur in 2013. The key challenges raised by these circumstances are, first, to better understand the vulnerability of GPS technologies and services to space weather and, second, to develop policies that will build resilience and mitigate risk.

  19. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
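
    A toy sketch of the packager idea (not the actual tool or its package specification language): a modular description of components, each with an implementation language and host, is turned into makefile rules, and a stub-generation rule is added when two connected components reside on different machines. All component names, commands, and the use of rpcgen are illustrative assumptions.

        # Toy "packager": turn a component description into makefile rules, inserting a
        # (hypothetical) RPC stub-generation step when connected components live on
        # different machines. This mimics the idea, not the real tool's language.
        components = {
            "ui":     {"lang": "c",       "src": "ui.c",     "host": "workstation"},
            "solver": {"lang": "fortran", "src": "solver.f", "host": "compute-server"},
        }
        connections = [("ui", "solver")]

        rules = []
        for name, c in components.items():
            compiler = {"c": "cc", "fortran": "f77"}[c["lang"]]
            rules.append(f"{name}.o: {c['src']}\n\t{compiler} -c {c['src']} -o {name}.o")

        for client, server in connections:
            if components[client]["host"] != components[server]["host"]:
                # Remote connection: imply stub generation (rpcgen used only as an example).
                rules.append(f"{client}_{server}_stubs.c: {server}.idl\n\trpcgen {server}.idl")

        print("all: " + " ".join(f"{n}.o" for n in components) + "\n\n" + "\n\n".join(rules))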

  20. Determination of the spectral dependence of reduced scattering and quantitative second-harmonic generation imaging for detection of fibrillary changes in ovarian cancer

    NASA Astrophysics Data System (ADS)

    Campbell, Kirby R.; Tilbury, Karissa B.; Campagnola, Paul J.

    2015-03-01

    Here, we examine ovarian cancer extracellular matrix (ECM) modification by measuring the wavelength dependence of optical scattering and quantitative second-harmonic generation (SHG) imaging metrics in the range of 800-1100 nm, in order to determine fibrillary changes in ex vivo normal ovary, type I, and type II ovarian cancer. Mass fractals of the collagen fiber structure are analyzed based on a power law correlation function using spectral dependence measurements of the reduced scattering coefficient μs', where the mass fractal dimension is related to the power. Values of μs' are measured using independent determinations of μs and g: on-axis attenuation measurements using the Beer-Lambert law, and fitting the angular distribution of scattering to the Henyey-Greenstein phase function, respectively. Quantitative spectral SHG imaging on the same tissues determines FSHG/BSHG creation ratios related to size and harmonophore distributions. Both techniques probe fibril packing order, but optical scattering probes structures of sizes from about 50-2000 nm, whereas SHG imaging - although only able to resolve individual fibers - builds contrast from the assembly of fibrils. Our findings suggest that type I ovarian tumors have the most ordered collagen fibers, followed by normal ovary, with type II tumors showing the least order.
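
    A sketch of the two independent estimates described above, with synthetic numbers standing in for measurements: μs from on-axis attenuation via the Beer-Lambert law (absorption neglected), g from a least-squares fit of the Henyey-Greenstein phase function to an angular scattering distribution, and the reduced scattering coefficient μs' = μs(1 - g). All values are illustrative.

        # mu_s from on-axis attenuation (Beer-Lambert), g from a Henyey-Greenstein fit,
        # and the reduced scattering coefficient mu_s' = mu_s * (1 - g).
        import numpy as np
        from scipy.optimize import curve_fit

        def henyey_greenstein(theta, g, scale):
            """HG phase function (up to a detector scale factor)."""
            return scale * (1 - g**2) / (1 + g**2 - 2 * g * np.cos(theta)) ** 1.5

        # mu_s from I = I0 * exp(-mu_s * d), absorption neglected (illustrative numbers).
        I0, I, d_cm = 1.0, 0.135, 0.02                  # on-axis intensities, 200 um path
        mu_s = np.log(I0 / I) / d_cm                    # 1/cm

        # g from a synthetic angular scattering "measurement".
        theta = np.linspace(0.05, np.pi, 200)
        rng = np.random.default_rng(0)
        measured = henyey_greenstein(theta, 0.9, 1.0) * (1 + 0.02 * rng.standard_normal(theta.size))
        popt, _ = curve_fit(henyey_greenstein, theta, measured, p0=(0.8, 1.0))
        g_fit = popt[0]

        mu_s_prime = mu_s * (1 - g_fit)                 # reduced scattering coefficient, 1/cm
        print(f"mu_s = {mu_s:.1f} 1/cm, g = {g_fit:.3f}, mu_s' = {mu_s_prime:.2f} 1/cm")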

  1. On the impact of topography and building mask on time varying gravity due to local hydrology

    NASA Astrophysics Data System (ADS)

    Deville, S.; Jacob, T.; Chéry, J.; Champollion, C.

    2013-01-01

    We use 3 yr of surface absolute gravity measurements at three sites on the Larzac plateau (France) to quantify the changes induced by topography and buildings on gravity time-series, with respect to an idealized infinite slab approximation. Indeed, local topography and the buildings housing ground-based gravity measurements have an effect on the distribution of water storage changes, therefore affecting the associated gravity signal. We first calculate the effects of surrounding topography and building dimensions on the gravity attraction for a uniform layer of water. We show that a gravimetric interpretation of water storage change using an infinite slab, the so-called Bouguer approximation, is generally not suitable. We propose to split the time-varying gravity signal into two parts: (1) a surface component including topographic and building effects, and (2) a deep component associated with underground water transfer. A reservoir modelling scheme is presented to remove the local site effects and to invert for the effective hydrological properties of the unsaturated zone. We show that the effective time constants associated with water transfer vary greatly from site to site. We propose that our modelling scheme can be used to correct for local site effects on gravity at any site presenting a departure from a flat topography. Depending on the site, the corrected signal can exceed measured values by 5-15 μGal, corresponding to 120-380 mm of water using the Bouguer slab formula. Our approach only requires knowledge of daily precipitation corrected for evapotranspiration. Therefore, it can be a useful tool to correct any kind of gravimetric time-series data.
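
    The Bouguer slab conversion quoted above, as a short sketch: for an infinite slab of water of thickness h, Δg = 2πGρh, roughly 0.042 μGal per millimetre of water, so a 5-15 μGal correction corresponds to water thicknesses of the order of the 120-380 mm quoted in the abstract.

        # Bouguer infinite-slab approximation: delta_g = 2 * pi * G * rho * h.
        import math

        G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
        RHO_WATER = 1000.0     # kg m^-3

        def water_mm_from_microgal(delta_g_microgal):
            """Equivalent water thickness (mm) for a gravity change given in microGal."""
            delta_g_si = delta_g_microgal * 1e-8       # 1 microGal = 1e-8 m s^-2
            h_m = delta_g_si / (2 * math.pi * G * RHO_WATER)
            return h_m * 1000.0

        for dg in (5.0, 15.0):
            print(f"{dg:4.1f} microGal ~ {water_mm_from_microgal(dg):.0f} mm of water")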

  2. 7 CFR 51.56 - Buildings and structures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... its distribution throughout the plant and washing machinery; (c) There shall also be an efficient... STANDARDS) Regulations 1 Requirements for Plants Operating Under Continuous Inspection on A Contract Basis § 51.56 Buildings and structures. The packing plant buildings shall be properly constructed and...

  3. New rain shed (Building No. 241), overhead pipeline and raw ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    New rain shed (Building No. 241), overhead pipeline and raw water tank T4. Distribution pump house can be seen at the center of building. - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI

  4. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Application of Model View Definition Attributes

    DTIC Science & Technology

    2013-06-01

    Building information exchange (COBie), Building Information Modeling (BIM)...to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information de...develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the

  5. Population Density Modeling for Diverse Land Use Classes: Creating a National Dasymetric Worker Population Model

    NASA Astrophysics Data System (ADS)

    Trombley, N.; Weber, E.; Moehl, J.

    2017-12-01

    Many studies invoke dasymetric mapping to make more accurate depictions of population distribution by spatially restricting populations to inhabited/inhabitable portions of observational units (e.g., census blocks) and/or by varying population density among different land classes. LandScan USA uses this approach by restricting particular population components (such as residents or workers) to building area detected from remotely sensed imagery, but also goes a step further by classifying each cell of building area in accordance with ancillary land use information from national parcel data (CoreLogic, Inc.'s ParcelPoint database). Modeling population density according to land use is critical. For instance, office buildings would have a higher density of workers than warehouses even though the latter would likely have more cells of detection. This paper presents a modeling approach by which different land uses are assigned different densities to more accurately distribute populations within them. For parts of the country where the parcel data is insufficient, an alternate methodology is developed that uses National Land Cover Database (NLCD) data to define the land use type of building detection. Furthermore, LiDAR data is incorporated for many of the largest cities across the US, allowing the independent variables to be updated from two-dimensional building detection area to total building floor space. In the end, four different regression models are created to explain the effect of different land uses on worker distribution: (1) a two-dimensional model using land use types from the parcel data; (2) a three-dimensional model using land use types from the parcel data; (3) a two-dimensional model using land use types from the NLCD data; and (4) a three-dimensional model using land use types from the NLCD data. By and large, the resultant coefficients followed intuition, but importantly allow the relationships between different land uses to be quantified. For instance, in the model using two-dimensional building area, commercial building area had a density 2.5 times greater than public building area and 4 times greater than industrial building area. These coefficients can be applied to define the ratios at which population is distributed to building cells. Finally, possible avenues for refining the methodology are presented.
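
    A small sketch (my construction, not the authors' regression) of how such land-use density coefficients can be applied: each building cell receives a weight equal to its land-use coefficient times its detected area, and a block's worker count is distributed to cells in proportion to those weights. The coefficients follow the quoted 2.5:1 (commercial:public) and 4:1 (commercial:industrial) ratios, but the absolute values, areas and worker count are illustrative.

        # Distribute a block's workers over building cells using land-use density weights
        # (commercial : public : industrial = 4 : 1.6 : 1, matching the quoted ratios).
        DENSITY_WEIGHT = {"commercial": 4.0, "public": 1.6, "industrial": 1.0}

        def distribute_workers(total_workers, cells):
            """cells: list of (land_use, building_area_m2); returns workers per cell."""
            weights = [DENSITY_WEIGHT[use] * area for use, area in cells]
            total = sum(weights)
            return [total_workers * w / total for w in weights]

        block_cells = [("commercial", 400.0), ("public", 500.0), ("industrial", 800.0)]
        print([round(w, 1) for w in distribute_workers(300, block_cells)])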

  6. Modeling mobile source emissions during traffic jams in a micro urban environment.

    PubMed

    Kondrashov, Valery V; Reshetin, Vladimir P; Regens, James L; Gunter, James T

    2002-01-01

    Urbanization typically involves a continuous increase in motor vehicle use, resulting in congestion known as traffic jams. Idling emissions due to traffic jams combine with the complex terrain created by buildings to concentrate atmospheric pollutants in localized areas. This research simulates emissions concentrations and distributions for a congested street in Minsk, Belarus. Ground-level (up to 50-meters above the street's surface) pollutant concentrations were calculated using STAR (version 3.10) with emission factors obtained from the U.S. Environmental Protection Agency, wind speed and direction, and building location and size. Relative emissions concentrations and distributions were simulated at 1-meter and 10-meters above street level. The findings demonstrate the importance of wind speed and direction, and building size and location on emissions concentrations and distributions, with the leeward sides of buildings retaining up to 99 percent of the emitted pollutants within 1-meter of street level, and up to 77 percent 10-meters above the street.

  7. Architecture Knowledge for Evaluating Scalable Databases

    DTIC Science & Technology

    2015-01-16

    problems, arising from the proliferation of new data models and distributed technologies for building scalable, available data stores. Architects must...longer are relational databases the de facto standard for building data repositories. Highly distributed, scalable "NoSQL" databases [11] have emerged...This is especially challenging at the data storage layer. The multitude of competing NoSQL database technologies creates a complex and rapidly

  8. Modeling the direct sun component in buildings using matrix algebraic approaches: Methods and validation

    DOE PAGES

    Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory

    2017-12-23

    Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics such as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.
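
    A schematic sketch of the kind of three-phase matrix operation mentioned above, with the result obtained as view matrix x fenestration BSDF x daylight matrix x sky vector, using random placeholder matrices purely to show the shape of the computation; in practice the matrices come from daylight simulation and measured BSDF data, and the five-phase method adds a separate, higher-resolution treatment of the direct sun component.

        # Three-phase method as a chain of matrix products: E = V @ T @ D @ s, where
        # V maps window exiting directions to sensor points, T is the fenestration BSDF,
        # D maps sky patches to window incident directions, and s is the sky vector.
        import numpy as np

        rng = np.random.default_rng(1)
        n_sensors, n_out, n_in, n_sky = 4, 145, 145, 146   # typical low-resolution basis sizes

        V = rng.random((n_sensors, n_out))   # view matrix (placeholder values)
        T = rng.random((n_out, n_in))        # BSDF transmission matrix (placeholder)
        D = rng.random((n_in, n_sky))        # daylight matrix (placeholder)
        s = rng.random(n_sky)                # sky vector for one timestep (placeholder)

        E = V @ T @ D @ s                    # result at each sensor for this timestep
        print(E.shape, np.round(E[:2], 1))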

  10. Analysis of the Dependence between Energy Demand Indicators in Buildings Based on Variants for Improving Energy Efficiency in a School Building

    NASA Astrophysics Data System (ADS)

    Skiba, Marta; Rzeszowska, Natalia

    2017-09-01

    One of the five far-reaching goals of the European Union is climate change and sustainable energy use. The first step in the implementation of this task is to reduce energy demand in buildings to a minimum by 2021, and in the case of public buildings by 2019. This article analyses the possibility of improving energy efficiency in public buildings, the relationship between particular indicators of the demand for usable energy (UE), final energy (FE) and primary energy (PE) in buildings and the impact of these indicators on the assessment of energy efficiency in public buildings, based on 5 variants of extensive thermal renovation of a school building. The analysis of the abovementioned variants confirms that the thermal renovation of merely the outer envelope of the building is insufficient and requires the use of additional energy sources, for example RES. Moreover, each indicator of energy demand in the building plays a key role in assessing the energy efficiency of the building. For this reason it is important to analyze each of them individually, as well as the dependencies between them.
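
    A short sketch of the dependence between the three indicators as they are commonly defined in EU energy-performance methodology (not necessarily the article's exact formulation): final energy follows from usable energy and system efficiency, and primary energy weights each carrier's final energy by a primary energy factor. The efficiencies and factors below are illustrative assumptions.

        # UE -> FE -> PE chain: FE_i = UE_i / eta_i, PE = sum_i w_i * FE_i,
        # where eta_i is the system efficiency and w_i the primary energy factor.
        def final_energy(usable_energy_kwh, system_efficiency):
            return usable_energy_kwh / system_efficiency

        def primary_energy(final_by_carrier, pe_factors):
            return sum(pe_factors[c] * fe for c, fe in final_by_carrier.items())

        # Illustrative school-building numbers (kWh/year) and factors.
        ue = {"heating": 90_000, "lighting": 12_000}
        fe = {"gas": final_energy(ue["heating"], 0.85),          # gas boiler efficiency (assumed)
              "electricity": final_energy(ue["lighting"], 1.0)}
        pe = primary_energy(fe, {"gas": 1.1, "electricity": 3.0})
        print({k: round(v) for k, v in fe.items()}, "PE =", round(pe), "kWh/a")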

  11. Building-to-Grid Integration through Commercial Building Portfolios Participating in Energy and Frequency Regulation Markets

    NASA Astrophysics Data System (ADS)

    Pavlak, Gregory S.

    Building energy use is a significant contributing factor to growing worldwide energy demands. In pursuit of a sustainable energy future, commercial building operations must be intelligently integrated with the electric system to increase efficiency and enable renewable generation. Toward this end, a model-based methodology was developed to estimate the capability of commercial buildings to participate in frequency regulation ancillary service markets. This methodology was integrated into a supervisory model predictive controller to optimize building operation in consideration of energy prices, demand charges, and ancillary service revenue. The supervisory control problem was extended to building portfolios to evaluate opportunities for synergistic effect among multiple, centrally-optimized buildings. Simulation studies performed showed that the multi-market optimization was able to determine appropriate opportunities for buildings to provide frequency regulation. Total savings were increased by up to thirteen percentage points, depending on the simulation case. Furthermore, optimizing buildings as a portfolio achieved up to seven additional percentage points of savings, depending on the case. Enhanced energy and cost savings opportunities were observed by taking the novel perspective of optimizing building portfolios in multiple grid markets, motivating future pursuits of advanced control paradigms that enable a more intelligent electric grid.

  12. Low cost self-made pressure distribution sensors for ergonomic chair: Are they suitable for posture monitoring?

    PubMed

    Martinaitis, Arnas; Daunoraviciene, Kristina

    2018-05-18

    Prolonged sitting causes many health problems. Healthy-sitting monitoring systems, such as real-time pressure distribution measurement, are in high demand, and many methods of posture recognition have been developed. Such systems are usually expensive and hardly available to the regular user. The aim of this study is to develop low-cost but sufficiently sensitive pressure sensors and a posture monitoring system. New self-made pressure sensors have been developed and tested, and a prototype pressure distribution measuring system was designed. The sensors showed an average noise amplitude of a = 56 mV (1.12%) and an average variation in sequential measurements of the same sensor of s = 17 mV (0.34%). Signal variability between sensors averaged 100 mV (2.0%). The weight-to-signal dependency was measured and the hysteresis calculated. The results suggested using a total of sixteen sensors for a posture monitoring system with an accuracy of < 1.5% after relaxation and a repeatability of around 2%. The results demonstrate that the hand-made sensor sensitivity and repeatability are acceptable for posture monitoring, and that it is possible to build a low-cost pressure distribution measurement system with graphical visualization without expensive equipment or complicated software.

  13. Joint Real-Time Energy and Demand-Response Management using a Hybrid Coalitional-Noncooperative Game

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Fulin; Gu, Yi; Hao, Jun

    In order to model the interactions among utility companies, building demands and renewable energy generators (REGs), a hybrid coalitional-noncooperative game framework has been proposed. We formulate a dynamic non-cooperative game to study the energy dispatch within multiple utility companies, while we take a coalitional perspective on REGs and buildings demands through a hedonic coalition formation game approach. In this case, building demands request different power supply from REGs, then the building demands can be organized into an ultimate coalition structure through a distributed hedonic shift algorithm. At the same time, utility companies can also obtain a stable power generation profile. In addition, the interactive progress among the utility companies and building demands which cannot be supplied by REGs is implemented by distributed game theoretic algorithms. Numerical results illustrate that the proposed hybrid coalitional-noncooperative game scheme reduces the cost of both building demands and utility companies compared with the initial scene.

  14. Fractal analysis of urban catchments and their representation in semi-distributed models: imperviousness and sewer system

    NASA Astrophysics Data System (ADS)

    Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel; Ochoa-Rodriguez, Susana; Willems, Patrick; Ichiba, Abdellah; Wang, Lipen; Pina, Rui; Van Assel, Johan; Bruni, Guendalina; Murla Tuyls, Damian; ten Veldhuis, Marie-Claire

    2017-04-01

    Land use distribution and sewer system geometry exhibit complex scale-dependent patterns in urban environments. This scale dependency is even more visible in a rasterized representation where only a single class is assigned to each pixel. Such features are well grasped with fractal tools, which are based on scale invariance and are intrinsically designed to characterise and quantify the space filled by a geometrical set exhibiting complex and tortuous patterns. Fractal tools have been widely used in hydrology but seldom in the specific context of urban hydrology. In this paper, they are used to analyse surface and sewer data from 10 urban or peri-urban catchments located in 5 European countries in the framework of the NWE Interreg RainGain project (www.raingain.eu). The aim was to characterise urban catchment properties accounting for the complexity and inhomogeneity typical of urban water systems. Sewer system density and imperviousness (roads or buildings), represented in rasterized maps of 2 m x 2 m pixels, were analysed to quantify their fractal dimension, characteristic of scale invariance. It appears that both sewer density and imperviousness exhibit scale-invariant features that can be characterized with fractal dimensions ranging from 1.6 to 2, depending on the catchment. In a given area, consistent results were found for the two geometrical features, yielding a robust and innovative way of quantifying the level of urbanization. The representation of imperviousness in operational semi-distributed hydrological models for these catchments was also investigated by computing fractal dimensions of the geometrical sets made up of the sub-catchments with coefficients of imperviousness greater than a range of thresholds. This enables quantifying how well the spatial structure of imperviousness is represented in urban hydrological models.
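
    A minimal box-counting sketch of the kind of fractal-dimension estimate described above, applied to a binary raster (for example, impervious pixels); the synthetic raster and box sizes are illustrative, and the fitted slope of log N against log(1/box size) gives the fractal dimension.

        # Box-counting fractal dimension of a binary raster: count occupied boxes at
        # several box sizes and fit the slope of log(count) against log(1/size).
        import numpy as np

        def box_count(mask, size):
            """Number of size x size boxes containing at least one occupied pixel."""
            n = mask.shape[0] // size * size
            blocks = mask[:n, :n].reshape(n // size, size, n // size, size)
            return int(blocks.any(axis=(1, 3)).sum())

        def fractal_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
            counts = [box_count(mask, s) for s in sizes]
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        # Synthetic 256 x 256 "imperviousness" raster: a filled band plus random pixels.
        rng = np.random.default_rng(0)
        mask = rng.random((256, 256)) < 0.05
        mask[100:140, :] = True
        print(round(fractal_dimension(mask), 2))   # between 1 (line-like) and 2 (space-filling)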

  15. Characterization of an Aluminum Alloy Hemispherical Shell Fabricated via Direct Metal Laser Melting

    NASA Astrophysics Data System (ADS)

    Holesinger, T. G.; Carpenter, J. S.; Lienert, T. J.; Patterson, B. M.; Papin, P. A.; Swenson, H.; Cordes, N. L.

    2016-03-01

    The ability of additive manufacturing to directly fabricate complex shapes provides characterization challenges for part qualification. The orientation of the microstructures produced by these processes will change relative to the surface normal of a complex part. In this work, the microscopy and x-ray tomography of an AlSi10Mg alloy hemispherical shell fabricated using powder bed metal additive manufacturing are used to illustrate some of these challenges. The shell was manufactured using an EOS M280 system in combination with EOS-specified powder and process parameters. The layer-by-layer process of building the shell with the powder bed additive manufacturing approach results in a position-dependent microstructure that continuously changes its orientation relative to the shell surface normal. X-ray tomography was utilized to examine the position-dependent size and distribution of porosity and surface roughness in the 98.6% dense part. Optical and electron microscopy were used to identify global and local position-dependent structures, grain morphologies, chemistry, and precipitate sizes and distributions. The rapid solidification processes within the fusion zone (FZ) after the laser transit result in a small dendrite size. Cell spacings taken from the structure in the middle of the FZ were used with published relationships to estimate a cooling rate of ~9 × 10^5 K/s. Uniformly distributed, nanoscale Si precipitates were found within the primary α-Al grains. A thin, distinct boundary layer containing larger α-Al grains and extended regions of the nanocrystalline divorced eutectic material surrounds the FZ. Subtle differences in the composition between the latter layer and the interior of the FZ were noted with scanning transmission electron microscopy (STEM) spectral imaging.

  16. Assessing hail risk for a building portfolio by generating stochastic events

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie

    2015-04-01

    Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events have been reported in recent years, among them the July 2011 event, which cost around 125 million EUR to the Aargauer public insurance company (north-western Switzerland). This study presents new developments in a stochastic model which aims at evaluating the risk for a building portfolio. Using insurance and meteorological radar data from the 2011 Aargauer event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process: the first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with six random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, …) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps: the map of the aforementioned event, and that of a second event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data. The simulation parameters are then adjusted by trial and error in order to best reproduce the expected distributions. The value of the mean annual risk obtained using the model is also compared to the mean annual risk calculated directly from the hazard maps. According to the first results, the return period of an event inducing a total damage cost equal to or greater than 125 million EUR for the Aargauer insurance company would be around 10 to 40 years.
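
    A small sketch (my own, not the published model) of generating one stochastic hail footprint as a sum of anisotropic two-dimensional Gaussian functions with random centre, peak hailstone size, spread, eccentricity and orientation; the resulting field would then be intersected with the building portfolio and the vulnerability curves. The parameter ranges and the number of Gaussians per event are made up.

        # One synthetic hail event: hailstone-size field built by summing anisotropic
        # 2-D Gaussians, each with a random centre, amplitude, spread, eccentricity, angle.
        import numpy as np

        rng = np.random.default_rng(7)
        x, y = np.meshgrid(np.linspace(0, 50, 251), np.linspace(0, 50, 251))  # km grid

        def gaussian_cell(x, y, cx, cy, amp_cm, sigma_km, ecc, angle):
            """Anisotropic Gaussian bump with peak hailstone size amp_cm (in cm)."""
            dx, dy = x - cx, y - cy
            u = dx * np.cos(angle) + dy * np.sin(angle)
            v = -dx * np.sin(angle) + dy * np.cos(angle)
            return amp_cm * np.exp(-0.5 * ((u / sigma_km) ** 2 + (v / (sigma_km * ecc)) ** 2))

        field = np.zeros_like(x)
        for _ in range(12):                                   # number of cells per event (assumed)
            field += gaussian_cell(x, y,
                                   cx=rng.uniform(0, 50), cy=rng.uniform(0, 50),
                                   amp_cm=rng.uniform(1, 5), sigma_km=rng.uniform(1, 4),
                                   ecc=rng.uniform(0.3, 1.0), angle=rng.uniform(0, np.pi))

        print("max hailstone size in event: %.1f cm" % field.max())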

  17. GCM Simulations of Titan's Paleoclimate

    NASA Astrophysics Data System (ADS)

    Lora, Juan M.; Lunine, Jonathan; Russell, Joellen; Hayes, Alexander

    2014-11-01

    The hemispheric asymmetry observed in the distribution of Titan's lakes and seas has been suggested to be the result of asymmetric seasonal forcing, where a relative moistening of the north occurs in the current epoch due to its longer and less intense summers. General circulation models (GCMs) of present-day Titan have also shown that the atmosphere transports methane away from the equator. In this work, we use a Titan GCM to investigate the effects that changes in Titan's effective orbital parameters have had on its climate in recent geologic history. The simulations show that the climate is relatively insensitive to changes in orbital parameters, with persistently dry low latitudes and wet polar regions. The amount of surface methane that builds up over either pole depends on the insolation distribution, confirming the influence of orbital forcing on the distribution of surface liquids. The evolution of the orbital forcing implies that the surface reservoir must be transported on timescales of ~30 kyr, in which case the asymmetry reverses with a period of ~125 kyr. Otherwise, the orbital forcing is insufficient for generating the observed dichotomy.

  18. Effect of Unprofessional Supervision on Durability of Buildings.

    PubMed

    Yahaghi, Javad

    2018-02-01

    The durability of buildings, which depends on the nature of the supervisory system used in their construction, is an important concern for the construction industry. This article draws the reader's attention to the effect that untrained and unprofessional building supervisors, and their unethical conduct, have on the durability of buildings.

  19. Cold Climate and Retrofit Applications for Air-to-Air Heat Pumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Van D

    2015-01-01

    Air source heat pumps (ASHPs), including air-to-air ASHPs, are easily applied to buildings almost anywhere, for new construction as well as for retrofits or renovations. They are widespread in milder climate regions, but their use in cold regions is hampered by low heating efficiency and capacity at cold outdoor temperatures. Retrofitting air-to-air ASHPs to existing buildings is relatively easy if the building already has an air distribution system. For buildings without such systems, alternative approaches are necessary; examples are ductless mini-split heat pumps or central heat pumps coupled to small-diameter, high-velocity (SDHV) air distribution systems. This article presents two subjects: 1) a summary of R&D investigations aimed at improving the cold weather performance of ASHPs, and 2) a brief discussion of building retrofit options using air-to-air ASHP systems.

  20. Distributed communication: Implications of cultural-historical activity theory (CHAT) for communication disorders.

    PubMed

    Hengst, Julie A

    2015-01-01

    This article proposes distributed communication as a promising theoretical framework for building supportive environments for child language development. Distributed communication is grounded in an emerging intersection of cultural-historical activity theory (CHAT) and theories of communicative practices that argue for integrating accounts of language, cognition and culture. The article first defines, and illustrates through selected research articles, three key principles of distributed communication: (a) language and all communicative resources are inextricably embedded in activity; (b) successful communication depends on common ground built up through short- and long-term histories of participation in activities; and (c) language cannot act alone, but is always orchestrated with other communicative resources. It then illustrates how these principles are fully integrated in everyday interactions by drawing on my research on Cindy Magic, a verbal make-believe game played by a father and his two daughters. Overall, the research presented here points to the remarkably complex communicative environments and sophisticated forms of distributed communication children routinely engage in as they interact with peer and adult communication partners in everyday settings. The article concludes by considering implications of these theories for, and examples of, distributed communication relevant to clinical intervention. Readers will learn about (1) distributed communication as a conceptual tool grounded in an emerging intersection of cultural-historical activity theory and theories of communicative practices and (2) how to apply distributed communication to the study of child language development and to interventions for children with communication disorders. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Statistical study of spatio-temporal distribution of precursor solar flares associated with major flares

    NASA Astrophysics Data System (ADS)

    Gyenge, N.; Ballai, I.; Baranyi, T.

    2016-07-01

    The aim of the present investigation is to study the spatio-temporal distribution of precursor flares during the 24 h interval preceding M- and X-class major flares, together with the evolution of follower flares. Information on associated (precursor and follower) flares is provided by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) flare list, while the major flares are observed by the Geostationary Operational Environmental Satellite (GOES) system satellites between 2002 and 2014. There are distinct evolutionary differences between the spatio-temporal distributions of associated flares in the roughly one-day period, depending on the type of the main flare. The spatial distribution was characterized by the normalized frequency distribution of the quantity δ (the distance between the major flare and its precursor flare, normalized by the sunspot group diameter) in four 6 h time intervals before the major event. The precursors of X-class flares have a double-peaked spatial distribution for more than half a day prior to the major flare, but it changes to a lognormal-like distribution roughly 6 h prior to the event. The precursors of M-class flares show a lognormal-like distribution in each 6 h subinterval. The most frequent sites of the precursors in the active region are within a distance of about 0.1 sunspot-group diameters from the site of the major flare in each case. Our investigation shows that, on the evidence of the precursors, the build-up of energy is more effective than its release.

  2. Novel single photon sources for new generation of quantum communications

    DTIC Science & Technology

    2017-06-13

    Novel single photon sources will be used as building blocks for quantum cryptography and quantum key distribution. There were numerous important achievements for the project in the... and enable absolutely secure information transfer between distant nodes - a key prerequisite for quantum cryptography. Experiment: the experimental

  3. Understanding re-distribution of road deposited particle-bound pollutants using a Bayesian Network (BN) approach.

    PubMed

    Liu, An; Wijesiri, Buddhi; Hong, Nian; Zhu, Panfeng; Egodawatta, Prasanna; Goonetilleke, Ashantha

    2018-05-08

    Road deposited pollutants (build-up) are continuously re-distributed by external factors such as traffic and wind turbulence, influencing stormwater runoff quality. However, current stormwater quality modelling approaches do not account for the re-distribution of pollutants. This undermines the accuracy of stormwater quality predictions, constraining the design of effective stormwater treatment measures. This study, using over 1000 data points, developed a Bayesian Network modelling approach to investigate the re-distribution of pollutant build-up on urban road surfaces. BTEX, a group of highly toxic pollutants, was selected as the case study pollutant group. Build-up sampling was undertaken in Shenzhen, China, using a dry and wet vacuuming method. The research outcomes confirmed that vehicle type and particle size significantly influence the re-distribution of particle-bound BTEX. In commercial areas, light-duty rather than heavy-duty traffic dominates the re-distribution of particles of all size ranges. In industrial areas, heavy-duty traffic re-distributes particles >75 μm, and light-duty traffic re-distributes particles <75 μm. In residential areas, light-duty traffic re-distributes particles >300 μm and <75 μm, and heavy-duty traffic re-distributes particles in the 300-150 μm range. The study results provide important insights to improve stormwater quality modelling and the interpretation of modelling outcomes, contributing to safeguarding the urban water environment. Copyright © 2018 Elsevier B.V. All rights reserved.
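    The study's actual network structure and conditional probabilities are not reproduced in the abstract, so the sketch below only illustrates, with made-up numbers, how a small discrete Bayesian Network relating traffic type, particle size and re-distribution could be queried by enumeration.

# Hypothetical discrete Bayesian Network:  TrafficType -> Redistributed <- ParticleSize
# All probabilities below are illustrative placeholders, not values from the study.
P_traffic = {"light": 0.7, "heavy": 0.3}
P_size = {"<75um": 0.5, "75-300um": 0.3, ">300um": 0.2}
P_redistributed = {  # P(redistributed = True | traffic, size)
    ("light", "<75um"): 0.6, ("light", "75-300um"): 0.4, ("light", ">300um"): 0.5,
    ("heavy", "<75um"): 0.3, ("heavy", "75-300um"): 0.5, ("heavy", ">300um"): 0.2,
}

def posterior_traffic(given_redistributed=True):
    """P(traffic | redistributed) by full enumeration over the joint distribution."""
    unnorm = {}
    for t in P_traffic:
        total = 0.0
        for s in P_size:
            p_r = P_redistributed[(t, s)]
            total += P_traffic[t] * P_size[s] * (p_r if given_redistributed else 1 - p_r)
        unnorm[t] = total
    z = sum(unnorm.values())
    return {t: v / z for t, v in unnorm.items()}

print(posterior_traffic(True))   # which traffic type most likely drove re-distribution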

  4. Impact of buildings on surface solar radiation over urban Beijing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Bin; Liou, Kuo-Nan; Gu, Yu

    The rugged surface of an urban area due to varying buildings can interact with solar beams and affect both the magnitude and spatiotemporal distribution of surface solar fluxes. Here we systematically examine the impact of buildings on downward surface solar fluxes over urban Beijing by using a 3-D radiation parameterization that accounts for 3-D building structures vs. the conventional plane-parallel scheme. We find that the resulting downward surface solar flux deviations between the 3-D and the plane-parallel schemes are generally ±1–10 W m-2 at 800 m grid resolution and within ±1 W m-2 at 4 km resolution. Pairs of positive–negative flux deviations on different sides of buildings are resolved at 800 m resolution, while they offset each other at 4 km resolution. Flux deviations from the unobstructed horizontal surface at 4 km resolution are positive around noon but negative in the early morning and late afternoon. The corresponding deviations at 800 m resolution, in contrast, show diurnal variations that are strongly dependent on the location of the grids relative to the buildings. Both the magnitude and spatiotemporal variations of flux deviations are largely dominated by the direct flux. Furthermore, we find that flux deviations can potentially be an order of magnitude larger by using a finer grid resolution. Atmospheric aerosols can reduce the magnitude of downward surface solar flux deviations by 10–65 %, while the surface albedo generally has a rather moderate impact on flux deviations. The results imply that the effect of buildings on downward surface solar fluxes may not be critically significant in mesoscale atmospheric models with a grid resolution of 4 km or coarser. However, the effect can play a crucial role in meso-urban atmospheric models as well as microscale urban dispersion models with resolutions of 1 m to 1 km.

  5. Soldier-Robot Team Communication: An Investigation of Exogenous Orienting Visual Display Cues and Robot Reporting Preferences

    DTIC Science & Technology

    2018-02-12

    usability preference. Results under the second focus showed that the frequency with which participants expected status updates differed depending upon the...assistance requests for both navigational route and building selection depending on the type of exogenous visual cues displayed? 3) Is there a difference...in response time to visual reports for both navigational route and building selection depending on the type of exogenous visual cues displayed? 4

  6. Growth, chamber building rate and reproduction time of Palaeonummulites venosus under natural conditions.

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Wöger, Julia; Hohenegger, Johann; Briguglio, Antonino

    2017-04-01

    Investigations on Palaeonummulites venosus using the natural laboratory approach for determining the chamber building rate, test diameter increase rate, reproduction time and longevity are based on the decomposition of monthly obtained frequency distributions of chamber number and test diameter into normally distributed components. The shift of the component parameters 'mean' and 'standard deviation' during the investigation period of 15 months was used to calculate Michaelis-Menten functions applied to estimate the averaged chamber building rate and diameter increase rate under natural conditions. The individual dates of birth were estimated using the inverse of the averaged chamber building rate and the inverse of the diameter increase rate, fitted by the individual chamber number or the individual test diameter at the sampling date. Distributions of frequencies and densities (i.e. frequency divided by sediment weight) based on the chamber building rate and the diameter increase rate both indicate continuous reproduction throughout the year with two peaks, the stronger in May/June, taken as the beginning of the summer generation (generation 1), and the weaker in November, taken as the beginning of the winter generation (generation 2). This reproduction scheme explains the existence of small and large specimens in the same sample. Longevity, calculated as the maximum difference in days between an individual's birth date and the sampling date, seems to be around one year, as obtained by both estimations based on the chamber building rate and the diameter increase rate.
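    A minimal sketch of the Michaelis-Menten fitting step, using hypothetical (age, chamber number) pairs in place of the component means extracted from the monthly frequency distributions; inverting the fitted curve then back-calculates an individual's age, and hence its birth date, from its chamber number at the sampling date.

import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(t, a, b):
    """Mean chamber number as a function of age t (days): a*t/(b+t)."""
    return a * t / (b + t)

# Hypothetical (age, mean chamber number) pairs standing in for the component means
ages = np.array([30, 60, 90, 150, 210, 300, 360], dtype=float)
chambers = np.array([18, 31, 41, 52, 60, 67, 70], dtype=float)

(a, b), _ = curve_fit(michaelis_menten, ages, chambers, p0=(80, 100))
print(f"asymptotic chamber number a = {a:.1f}, half-saturation age b = {b:.0f} days")

# Inverting the fitted curve gives an age estimate from an observed chamber number,
# which is how individual birth dates can be back-calculated from a sampling date.
def age_from_chambers(n):
    return b * n / (a - n)

print(f"estimated age of a 50-chamber individual: {age_from_chambers(50):.0f} days")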

  7. Robust quantum network architectures and topologies for entanglement distribution

    NASA Astrophysics Data System (ADS)

    Das, Siddhartha; Khatri, Sumeet; Dowling, Jonathan P.

    2018-01-01

    Entanglement distribution is a prerequisite for several important quantum information processing and computing tasks, such as quantum teleportation, quantum key distribution, and distributed quantum computing. In this work, we focus on two-dimensional quantum networks based on optical quantum technologies using dual-rail photonic qubits for the building of a fail-safe quantum internet. We lay out a quantum network architecture for entanglement distribution between distant parties using a Bravais lattice topology, with the technological constraint that quantum repeaters equipped with quantum memories are not easily accessible. We provide a robust protocol for simultaneous entanglement distribution between two distant groups of parties on this network. We also discuss a memory-based quantum network architecture that can be implemented on networks with an arbitrary topology. We examine networks with bow-tie lattice and Archimedean lattice topologies and use percolation theory to quantify the robustness of the networks. In particular, we provide figures of merit on the loss parameter of the optical medium that depend only on the topology of the network and quantify the robustness of the network against intermittent photon loss and intermittent failure of nodes. These figures of merit can be used to compare the robustness of different network topologies in order to determine the best topology in a given real-world scenario, which is critical in the realization of the quantum internet.
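    The percolation-based figures of merit described above are more elaborate than this, but a simple Monte Carlo estimate of end-to-end connectivity on a square (Bravais) lattice, with each entanglement link succeeding independently with probability p_link, conveys the idea; the lattice size, trial count and probabilities below are illustrative.

import numpy as np

rng = np.random.default_rng(3)

def find(parent, a):
    """Union-find root lookup with path halving."""
    while parent[a] != a:
        parent[a] = parent[parent[a]]
        a = parent[a]
    return a

def connected(n, p_link):
    """Is the lower-left node connected to the upper-right node of an n x n square
    lattice when each edge (entanglement link) succeeds independently with p_link?"""
    parent = list(range(n * n))
    def union(a, b):
        parent[find(parent, a)] = find(parent, b)
    for x in range(n):
        for y in range(n):
            i = x * n + y
            if x + 1 < n and rng.random() < p_link:
                union(i, (x + 1) * n + y)
            if y + 1 < n and rng.random() < p_link:
                union(i, x * n + y + 1)
    return find(parent, 0) == find(parent, n * n - 1)

for p in (0.3, 0.5, 0.7):   # the bond percolation threshold on the square lattice is 0.5
    rate = np.mean([connected(20, p) for _ in range(200)])
    print(f"p_link = {p}: end-to-end connection rate ~ {rate:.2f}")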

  8. A Bivariate Mixed Distribution with a Heavy-tailed Component and its Application to Single-site Daily Rainfall Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chao; Singh, Vijay P.; Mishra, Ashok K.

    2013-02-06

    This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, particularly presented here is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three other alternative models: the conventional two-state Markov chain generator, the transition probability matrix model and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of available observed data. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in particular hydrologic planning situations where rare rainfall events are of great importance.
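    The generator described above couples copulas with a hybrid marginal; the sketch below shows only the much simpler structure it generalizes, a single-site generator with first-order Markov occurrences and gamma-distributed wet-day amounts, with illustrative parameters.

import numpy as np

rng = np.random.default_rng(0)

# Simplified daily rainfall generator: first-order Markov occurrence process plus a
# gamma-distributed amount on wet days. The paper replaces these two pieces with a
# single bivariate mixed distribution (copula + hybrid marginal); the parameters
# below are illustrative only.
p_wet_after_dry = 0.25   # P(wet today | dry yesterday)
p_wet_after_wet = 0.60   # P(wet today | wet yesterday)
shape, scale = 0.8, 8.0  # gamma parameters for wet-day amounts (mm)

def simulate(n_days=365 * 30):
    rain = np.zeros(n_days)
    wet = False
    for d in range(n_days):
        p = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p
        if wet:
            rain[d] = rng.gamma(shape, scale)
    return rain

r = simulate()
print(f"wet-day fraction: {np.mean(r > 0):.2f}, 99.9th percentile: {np.percentile(r, 99.9):.1f} mm")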

  9. The Use of Contingency Management and Motivational/Skills-Building Therapy to Treat Young Adults with Marijuana Dependence

    ERIC Educational Resources Information Center

    Carroll, Kathleen M.; Easton, Caroline J.; Nich, Charla; Hunkele, Karen A.; Neavins, Tara M.; Sinha, Rajita; Ford, Haley L.; Vitolo, Sally A.; Doebrick, Cheryl A.; Rounsaville, Bruce J.

    2006-01-01

    Marijuana-dependent young adults (N = 136), all referred by the criminal justice system, were randomized to 1 of 4 treatment conditions: a motivational/skills-building intervention (motivational enhancement therapy/cognitive-behavioral therapy; MET/CBT) plus incentives contingent on session attendance or submission of marijuana-free urine…

  10. The morphology of cometary dust: Subunit size distributions down to tens of nanometres

    NASA Astrophysics Data System (ADS)

    Mannel, Thurid; Bentley, Mark; Boakes, Peter; Jeszenszky, Harald; Levasseur-Regourd, Anny-Chantal; Schmied, Roland; Torkar, Klaus

    2017-04-01

    The Rosetta orbiter carried a dedicated analysis suite for cometary dust. One of the key instruments was MIDAS (Micro-Imaging Dust Analysis System), an atomic force microscope that scanned the surfaces of hundreds of (sub-)micrometre particles in 3D with resolutions down to nanometres. This provided the opportunity to study the morphology of the smallest cometary dust; initial investigation revealed that the particles are agglomerates of smaller subunits [1] with different structural properties [2]. To understand the (surface-) structure of the dust particles and the origin of their smallest building blocks, a number of particles were investigated in detail and the size distribution of their subunits determined [3]. Here we discuss the subunit size distributions ranging from tens of nanometres to a few micrometres. The differences between the subunit size distributions for particles collected pre-perihelion, close to perihelion, and during a huge outburst are examined, as well as the dependence of subunit size on particle size. A case where a particle was fragmented in consecutive scans allows a direct comparison of fragment and subunit size distributions. Finally, the small end of the subunit size distribution is investigated: the smallest determined sizes will be reviewed in the context of other cometary missions, interplanetary dust particles believed to originate from comets, and remote observations. It will be discussed if the smallest subunits can be interpreted as fundamental building blocks of our early Solar System and if their origin was in our protoplanetary disc or the interstellar material. References: [1] M.S. Bentley, R. Schmied, T. Mannel et al., Aggregate dust particles at comet 67P/Churyumov-Gerasimenko, Nature, 537, 2016. doi:10.1038/nature19091 [2] T. Mannel, M.S. Bentley, R. Schmied et al., Fractal cometary dust - a window into the early Solar system, MNRAS, 462, 2016. doi:10.1093/mnras/stw2898 [3] R. Schmied, T. Mannel, H. Jeszenszky, M.S. Bentley, Properties of cometary dust down to the nanometre scale, poster at the conference 'Comets: A new vision after Rosetta/Philae' in Toulouse, 14-18 November 2016.

  11. Lattice Boltzmann Study on Seawall-Break Flows under the Influence of Breach and Buildings

    NASA Astrophysics Data System (ADS)

    Mei, Qiu-Ying; Zhang, Wen-Huan; Wang, Yi-Hang; Chen, Wen-Wen

    2017-10-01

    In the process of a storm surge, seawater often overflows and may even destroy the seawall. The buildings near the shore are then inundated by seawater through the breach. However, at present there are few studies focusing on the effects of buildings and breach on seawall-break flows. In this paper, the lattice Boltzmann (LB) model with nine velocities in two dimensions (D2Q9) for the shallow water equations is adopted to simulate seawall-break flows. The flow patterns and water depth distributions for seawall-break flows under various densities, layouts and shapes of buildings and different breach discharges, sizes and locations are investigated. It is found that when buildings with a high enough density are perpendicular to the main flow direction, an obvious backwater phenomenon appears near the buildings, while this phenomenon does not occur when buildings with the same density are parallel to the main flow direction. Moreover, it is observed that the occurrence of the backwater phenomenon is independent of the building shape. As to the effects of the breach on seawall-break flows, it is found that only when the breach discharge is large enough or the breach size is small enough do the effects of an asymmetric distribution of buildings on the seawall-break flows become important. The breach location only changes the flow pattern in the upstream area of the first building that the seawater meets, but has little impact on the global water depth distribution. Supported by the National Natural Science Foundation of China under Grant No. 11502124, the Natural Science Foundation of Zhejiang Province under Grant No. LQ16A020001, the Scientific Research Fund of Zhejiang Provincial Education Department under Grant No. Y201533808, the Natural Science Foundation of Ningbo under Grant No. 2016A610075, and sponsored by the K.C. Wong Magna Fund of Ningbo University.
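    For orientation, the following sketch implements a generic D2Q9 lattice Boltzmann BGK collide-and-stream step with the standard fluid equilibrium; the shallow-water variant used in the paper replaces the equilibrium with a depth-based form, so this is a structural illustration rather than the authors' model.

import numpy as np

# D2Q9 lattice: discrete velocities and weights (standard BGK fluid equilibrium shown;
# the shallow-water LB model replaces f_eq with a water-depth-based expression).
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
tau = 0.8  # relaxation time

def equilibrium(rho, u):
    """f_eq[i, x, y] for density rho[x, y] and velocity u[2, x, y]."""
    cu = np.einsum("id,dxy->ixy", c, u)
    usq = np.einsum("dxy,dxy->xy", u, u)
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f):
    rho = f.sum(axis=0)
    u = np.einsum("id,ixy->dxy", c, f) / rho
    f += -(f - equilibrium(rho, u)) / tau          # BGK collision
    for i, (cx, cy) in enumerate(c):               # streaming (periodic boundaries)
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

nx, ny = 64, 64
f = equilibrium(np.ones((nx, ny)), np.zeros((2, nx, ny)))
for _ in range(100):
    f = step(f)
print("mass conserved:", np.isclose(f.sum(), nx * ny))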

  12. 78 FR 69113 - Statutorily Mandated Designation of Difficult Development Areas for 2014

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... that an individual can claim depends on the individual's marginal tax rate). For buildings placed in... agencies allocate the state's credit ceiling among low-income housing buildings whose owners have applied... provide IRC Section 42 credits to owners of buildings based on the percentage of certain building costs...

  13. Using Basic Quality Management Concepts to Produce Total Quality School Buildings.

    ERIC Educational Resources Information Center

    Herman, Jerry J.

    1994-01-01

    Quality control in designing and building school buildings depends on customer feedback. Outlines and graphically demonstrates the interrelationships among the input sources; the information acquired; and the three phases of predesign, construction, and completion. (MLF)

  14. Macroscopic Quantum Tunneling in Superconducting Junctions of β-Ag2Se Topological Insulator Nanowire.

    PubMed

    Kim, Jihwan; Kim, Bum-Kyu; Kim, Hong-Seok; Hwang, Ahreum; Kim, Bongsoo; Doh, Yong-Joo

    2017-11-08

    We report on the fabrication and electrical transport properties of superconducting junctions made of β-Ag2Se topological insulator (TI) nanowires in contact with Al superconducting electrodes. The temperature dependence of the critical current indicates that the superconducting junction belongs to a short and diffusive junction regime. As a characteristic feature of the narrow junction, the critical current decreases monotonically with increasing magnetic field. The stochastic distribution of the switching current exhibits macroscopic quantum tunneling behavior, which is robust up to T = 0.8 K. Our observations indicate that TI nanowire-based Josephson junctions can be a promising building block for the development of nanohybrid superconducting quantum bits.

  15. AMPTE/CCE observations of the plasma composition below 17 keV during the September 4, 1984 magnetic storm

    NASA Technical Reports Server (NTRS)

    Shelley, E. G.; Klumpar, D. M.; Peterson, W. K.; Ghielmetti, A.; Balsiger, H.; Geiss, J.; Rosenbauer, H.

    1985-01-01

    Observations from the Hot Plasma Composition Experiment on the AMPTE/CCE spacecraft during the magnetic storm of 4-5 September 1984 reveal that significant injection of ions of terrestrial origin accompanied the storm development. The compression of the magnetosphere at storm sudden commencement carried the magnetopause inside the CCE orbit, clearly revealing the shocked solar wind plasma. A build-up of suprathermal ions is observed near the plasmapause during the storm main phase and recovery phase. Pitch angle distributions in the ring current during the main phase show differences between H(+) and O(+) that suggest mass-dependent injection, transport and/or loss processes.

  16. A new technology perspective and engineering tools approach for large, complex and distributed mission and safety critical systems components

    NASA Technical Reports Server (NTRS)

    Carrio, Miguel A., Jr.

    1988-01-01

    Rapidly emerging technology and methodologies have out-paced the systems development processes' ability to use them effectively, if at all. At the same time, the tools used to build systems are becoming obsolescent themselves as a consequence of the same technology lag that plagues systems development. The net result is that systems development activities have not been able to take advantage of available technology and have become equally dependent on aging and ineffective computer-aided engineering tools. New methods and tools approaches are essential if the demands of non-stop and Mission and Safety Critical (MASC) components are to be met.

  17. A multi-level model of emerging technology: An empirical study of the evolution of biotechnology from 1976 to 2003

    PubMed Central

    van Witteloostuijn, Arjen

    2018-01-01

    In this paper, we develop an ecological, multi-level model that can be used to study the evolution of emerging technology. More specifically, by defining technology as a system composed of a set of interacting components, we can build upon the argument of multi-level density dependence from organizational ecology to develop a distribution-independent model of technological evolution. This allows us to distinguish between different stages of component development, which provides more insight into the emergence of stable component configurations, or dominant designs. We validate our hypotheses in the biotechnology industry by using patent data from the USPTO from 1976 to 2003. PMID:29795575

  18. XSUMMER - Transcendental functions and symbolic summation in FORM

    NASA Astrophysics Data System (ADS)

    Moch, S.; Uwer, P.

    2006-05-01

    Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating, for example, from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM. Program summary: Title of program: XSUMMER. Catalogue identifier: ADXQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. License: GNU Public License and FORM License. Computers: all. Operating system: all. Program language: FORM. Memory required to execute: depending on the complexity of the problem, at least 64 MB RAM recommended. No. of lines in distributed program, including test data, etc.: 9854. No. of bytes in distributed program, including test data, etc.: 126 551. Distribution format: tar.gz. Other programs called: none. External files needed: none. Nature of the physical problem: systematic expansion of higher transcendental functions in a small parameter; the expansions arise in the calculation of loop integrals in perturbative quantum field theory. Method of solution: algebraic manipulations of nested sums. Restrictions on complexity of the problem: usually limited only by the available disk space. Typical running time: dependent on the complexity of the problem.
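    XSUMMER itself is written in FORM; purely as an illustration of what a nested harmonic sum is, the following Python snippet evaluates the simple harmonic sum S_m(n) and the nested sum S_{m1,m2}(n) with exact rational arithmetic.

from fractions import Fraction

def harmonic_sum(m, n):
    """Simple harmonic sum S_m(n) = sum_{i=1..n} 1/i^m (exact rational arithmetic)."""
    return sum(Fraction(1, i**m) for i in range(1, n + 1))

def nested_sum(m1, m2, n):
    """Nested sum S_{m1,m2}(n) = sum_{i=1..n} (1/i^m1) * S_{m2}(i)."""
    total, inner = Fraction(0), Fraction(0)
    for i in range(1, n + 1):
        inner += Fraction(1, i**m2)          # running value of S_{m2}(i)
        total += Fraction(1, i**m1) * inner
    return total

print(harmonic_sum(1, 4))      # 25/12
print(nested_sum(1, 1, 3))     # 85/36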

  19. Rapid Deployment of Optimal Control for Building HVAC Systems Using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    DTIC Science & Technology

    2017-03-21

    ESTCP project EW-201409 aimed at demonstrating the benefits of innovative software technology for building HVAC systems. These benefits included reduced system energy use and cost as well as improved... This document has been cleared for public release (Distribution Statement A).

  20. Corelli: a peer-to-peer dynamic replication service for supporting latency-dependent content in community networks

    NASA Astrophysics Data System (ADS)

    Tyson, Gareth; Mauthe, Andreas U.; Kaune, Sebastian; Mu, Mu; Plagemann, Thomas

    2009-01-01

    The quality of service for latency dependent content, such as video streaming, largely depends on the distance and available bandwidth between the consumer and the content. Poor provision of these qualities results in reduced user experience and increased overhead. To alleviate this, many systems operate caching and replication, utilising dedicated resources to move the content closer to the consumer. Latency-dependent content creates particular issues for community networks, which often display the property of strong internal connectivity yet poor external connectivity. However, unlike traditional networks, communities often cannot deploy dedicated infrastructure for both monetary and practical reasons. To address these issues, this paper proposes Corelli, a peer-to-peer replication infrastructure designed for use in community networks. In Corelli, high capacity peers in communities autonomously build a distributed cache to dynamically pre-fetch content early on in its popularity lifecycle. By exploiting the natural proximity of peers in the community, users can gain extremely low latency access to content whilst reducing egress utilisation. Through simulation, it is shown that Corelli considerably increases accessibility and improves performance for latency dependent content. Further, Corelli is shown to offer adaptive and resilient mechanisms that ensure that it can respond to variations in churn, demand and popularity.
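    A toy illustration of the popularity-driven pre-fetching idea, in Python: a high-capacity community peer replicates an item once it sees enough requests within a time window, then serves later requests locally. The class name, threshold and eviction policy are hypothetical and are not Corelli's actual mechanism.

import time
from collections import OrderedDict, defaultdict

class CommunityCache:
    """Toy popularity-driven pre-fetch cache on a high-capacity community peer."""

    def __init__(self, capacity=100, prefetch_threshold=5, window_s=60.0):
        self.capacity = capacity
        self.prefetch_threshold = prefetch_threshold   # requests within the window
        self.window_s = window_s
        self.requests = defaultdict(list)              # content id -> recent request times
        self.cache = OrderedDict()                     # kept in LRU order

    def on_request(self, content_id, now=None):
        now = time.time() if now is None else now
        hits = [t for t in self.requests[content_id] if now - t < self.window_s]
        hits.append(now)
        self.requests[content_id] = hits
        if content_id in self.cache:
            self.cache.move_to_end(content_id)
            return "served locally"
        if len(hits) >= self.prefetch_threshold:       # early in its popularity lifecycle
            self.cache[content_id] = True
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)         # evict least recently used item
            return "fetched from origin and replicated"
        return "fetched from origin (not replicated)"

cache = CommunityCache(prefetch_threshold=3)
for i in range(5):
    print(cache.on_request("episode-42", now=float(i)))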

  1. The 'Natural Laboratory', a tool for deciphering growth, lifetime and population dynamics in larger benthic foraminifera

    NASA Astrophysics Data System (ADS)

    Hohenegger, Johann

    2015-04-01

    The shells of symbiont-bearing larger benthic Foraminifera (LBF) represent the response to physiological requirements in dependence on environmental conditions. All compartments of the shell, such as chambers and chamberlets, accommodate the growth of the cell protoplasm and are adaptations for housing photosymbiotic algae. Investigations on the biology of LBF have predominantly been based on laboratory studies. The lifetime of LBF under natural conditions is still unclear. LBF, which can build >100 chambers during their lifetime, are thought to live at least one year under natural conditions. This is supported by studies on population dynamics of eulittoral foraminifera. In species characterized by a single, time-restricted reproduction period, the mean size of specimens increases from small to large during the lifetime while the number of individuals is simultaneously reduced. This becomes more complex when two or more reproduction times are present within a one-year cycle, leading to a mixture of abundant small individuals with a few large specimens during the year, while the mean size remains more or less constant. This mixture is typical for most sublittoral megalospheric (gamonts or schizonts) LBF. Nothing is known about the lifetime of agamonts, the diploid asexually reproducing generation. In all hyaline LBF it is thought to be significantly longer than one year, based on the large size and considering the mean chamber building rate of the gamonts/schizonts. Observations on LBF under natural conditions have not yet been performed in the deeper sublittoral. This reflects the difficulties caused by intense hydrodynamics, which hinder deploying technical equipment for studies in the natural environment. Therefore, studying growth, lifetime and reproduction of sublittoral LBF under natural conditions can be performed using the so-called 'natural laboratory' in comparison with laboratory investigations. The best sampling method in the upper sublittoral, from 5 to 70 m depth, is SCUBA diving. Irregular sampling intervals caused by differing weather conditions may range from weeks to one month, whereby the latter represents the upper limit: larger intervals could render the data set worthless. The number of sampling points at the location must be more than 4, randomly distributed and approximately 5 m apart, to smooth the effects of the patchy distributions that are typical for most LBF. Only three simple measurements are necessary to determine the chamber building rate and population dynamics under natural conditions: the number of individuals, the number of chambers and the largest diameter of the individual. The determination of a standardized sample surface area, which is necessary for population dynamics investigations, depends on the sampling method. Reproduction time and longevity can be estimated based on shell size, using the date at which the mean abundance of specimens with minimum size (expected after one month's growth) characterizes the reproduction period. The difference between this date and the date with the mean abundance of specimens of large size, indicating readiness for reproduction, then marks the lifetime. Calculation of the chamber-building rate based on chamber number is more complex and depends on the reproduction period and longevity. This can be fitted with theoretical growth functions (e.g. the Michaelis-Menten function). According to the above-mentioned methods, chamber building rates, longevity and population dynamics can be obtained for shallow sublittoral symbiont-bearing LBF using the 'natural laboratory'.

  2. Assessment of Muscle Fatigue from TF Distributions of SEMG Signals

    DTIC Science & Technology

    2008-06-01

    Wigner-Ville distribution (WVD) holds the... techniques used to build a TF distribution of SEMG signals, namely the spectrogram, Wigner-Ville, Choi-Williams and smoothed pseudo Wigner-Ville distributions. SEMG signals... spectrogram but also other Cohen's class TF distributions, such as the Choi-Williams distribution (CWD) and the smoothed pseudo Wigner-Ville distribution
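    As a minimal illustration of building one such TF distribution, the snippet below computes a spectrogram of a synthetic chirp standing in for an SEMG record and derives the per-slice median frequency, a common fatigue index; it does not reproduce the Wigner-Ville or Choi-Williams computations from the report.

import numpy as np
from scipy.signal import spectrogram, chirp

fs = 1024                               # sampling rate (Hz), typical for surface EMG
t = np.arange(0, 10, 1 / fs)
# Synthetic stand-in for an SEMG record: a linear chirp whose frequency falls from
# 120 Hz to 60 Hz plus noise, mimicking spectral compression under muscle fatigue.
signal = chirp(t, f0=120, t1=10, f1=60) + 0.3 * np.random.randn(t.size)

f, tt, Sxx = spectrogram(signal, fs=fs, nperseg=512, noverlap=256)

# Median frequency per time slice: a common fatigue index derived from TF distributions.
cum = np.cumsum(Sxx, axis=0)
median_freq = f[np.argmax(cum >= 0.5 * cum[-1], axis=0)]
print(median_freq[:5], "...", median_freq[-5:])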

  3. Humidity Distributions in Multilayered Walls of High-rise Buildings

    NASA Astrophysics Data System (ADS)

    Gamayunova, Olga; Musorina, Tatiana; Ishkov, Alexander

    2018-03-01

    The limited availability of free land in large cities is the main reason for the active development of high-rise construction. Given the large-scale high-rise building projects of recent years in Russia and abroad and their huge energy consumption, one of the fundamental principles in design and reconstruction is the use of energy-efficient technologies. The main heat loss in buildings occurs through the enclosing structures. However, a heat-resistant wall is not always energy-efficient and dry at the same time (waterlogging may occur). Temperature and humidity distributions in multilayer walls were studied in this paper, and their interrelation with other thermophysical characteristics was analyzed.

  4. Distribution Management System Volt/VAR Evaluation | Grid Modernization |

    Science.gov Websites

    NREL Distribution Management System Volt/VAR Evaluation. This project involves building a prototype distribution management system testbed that links a GE Grid Solutions distribution management system to power hardware-in-the-loop testing. This setup is

  5. Characteristics of mobile MOSFET dosimetry system for megavoltage photon beams

    PubMed Central

    Kumar, A. Sathish; Sharma, S. D.; Ravindran, B. Paul

    2014-01-01

    The characteristics of a mobile metal oxide semiconductor field effect transistor (mobile MOSFET) detector for standard bias were investigated for megavoltage photon beams. This study was performed with a brass alloy build-up cap for three energies, namely Co-60, 6 MV and 15 MV photon beams. The MOSFETs were calibrated and the performance characteristics were analyzed with respect to dose rate dependence, energy dependence, field size dependence, linearity, build-up factor, and angular dependence for all three energies. A linear dose-response curve was noted for Co-60, 6 MV, and 15 MV photons. The calibration factors were found to be 1.03, 1, and 0.79 cGy/mV for Co-60, 6 MV, and 15 MV photon energies, respectively. The calibration graph was obtained for doses up to 600 cGy, and the dose-response curve was found to be linear. The MOSFETs were found to be energy independent both for measurements performed at depth as well as on the surface with build-up. Field size dependence was also analyzed for variable field sizes, and the response was found to be field size independent. Angular dependence was analyzed by keeping the MOSFET dosimeter in parallel and perpendicular orientation to the angle of incidence of the radiation, with and without build-up, on the surface of the phantom. The maximum variation for the three energies was found to be within ± 2% for the gantry angles 90° and 270°, while the deviations without the build-up for the same gantry angles were found to be 6%, 25%, and 60%, respectively. The MOSFET response was found to be independent of dose rate for all three energies. The dosimetric characteristics of the MOSFET detector make it a suitable in vivo dosimeter for megavoltage photon beams. PMID:25190992

  6. Characteristics of mobile MOSFET dosimetry system for megavoltage photon beams.

    PubMed

    Kumar, A Sathish; Sharma, S D; Ravindran, B Paul

    2014-07-01

    The characteristics of a mobile metal oxide semiconductor field effect transistor (mobile MOSFET) detector for standard bias were investigated for megavoltage photon beams. This study was performed with a brass alloy build-up cap for three energies, namely Co-60, 6 MV and 15 MV photon beams. The MOSFETs were calibrated and the performance characteristics were analyzed with respect to dose rate dependence, energy dependence, field size dependence, linearity, build-up factor, and angular dependence for all three energies. A linear dose-response curve was noted for Co-60, 6 MV, and 15 MV photons. The calibration factors were found to be 1.03, 1, and 0.79 cGy/mV for Co-60, 6 MV, and 15 MV photon energies, respectively. The calibration graph was obtained for doses up to 600 cGy, and the dose-response curve was found to be linear. The MOSFETs were found to be energy independent both for measurements performed at depth as well as on the surface with build-up. Field size dependence was also analyzed for variable field sizes, and the response was found to be field size independent. Angular dependence was analyzed by keeping the MOSFET dosimeter in parallel and perpendicular orientation to the angle of incidence of the radiation, with and without build-up, on the surface of the phantom. The maximum variation for the three energies was found to be within ± 2% for the gantry angles 90° and 270°, while the deviations without the build-up for the same gantry angles were found to be 6%, 25%, and 60%, respectively. The MOSFET response was found to be independent of dose rate for all three energies. The dosimetric characteristics of the MOSFET detector make it a suitable in vivo dosimeter for megavoltage photon beams.

  7. Numerical Studies of Thermal Conditions in Cities - Systematic Model Simulations of Idealized Urban Domains

    NASA Astrophysics Data System (ADS)

    Heene, V.; Buchholz, S.; Kossmann, M.

    2016-12-01

    Numerical studies of thermal conditions in cities based on model simulations of idealized urban domains are carried out to investigate how changes in the characteristics of urban areas influence street level air temperatures. The simulated modifications of the urban characteristics represent possible adaptation measures for heat reduction in cities, which are commonly used in urban planning. Model simulations are performed with the thermodynamic version of the 3-dimensional micro-scale urban climate model MUKLIMO_3. The simulated idealized urban areas are designed in a simplistic way, i.e. defining homogeneous squared cities of one settlement type, without orography and centered in the model domain. To assess the impact of different adaptation measures the characteristics of the urban areas have been systematically modified regarding building height, albedo of building roof and impervious surfaces, fraction of impervious surfaces between buildings, and percentage of green roofs. To assess the impact of green and blue infrastructure in cities, different configurations for parks and lakes have been investigated - e.g. varying size and distribution within the city. The experiments are performed for different combinations of typical German settlement types and surrounding rural types under conditions of a typical summer day in July. The adaptation measures implemented in the experiments show different impacts for different settlement types mainly due to the differences in building density, building height or impervious surface fraction. Parks and lakes implemented as adaptation measures show strong potential to reduce daytime air temperature, with cooling effects on their built-up surroundings. At night, lakes generate negative and positive effects on air temperature, depending on water temperature. In general, all adaptation measures implemented in the experiments reveal different impacts on day and night air temperature.

  8. Incorporating residual temperature and specific humidity in predicting weather-dependent warm-season electricity consumption

    NASA Astrophysics Data System (ADS)

    Guan, Huade; Beecham, Simon; Xu, Hanqiu; Ingleton, Greg

    2017-02-01

    Climate warming and increasing variability challenges the electricity supply in warm seasons. A good quantitative representation of the relationship between warm-season electricity consumption and weather condition provides necessary information for long-term electricity planning and short-term electricity management. In this study, an extended version of cooling degree days (ECDD) is proposed for better characterisation of this relationship. The ECDD includes temperature, residual temperature and specific humidity effects. The residual temperature is introduced for the first time to reflect the building thermal inertia effect on electricity consumption. The study is based on the electricity consumption data of four multiple-street city blocks and three office buildings. It is found that the residual temperature effect is about 20% of the current-day temperature effect at the block scale, and increases with a large variation at the building scale. Investigation of this residual temperature effect provides insight to the influence of building designs and structures on electricity consumption. The specific humidity effect appears to be more important at the building scale than at the block scale. A building with high energy performance does not necessarily have low specific humidity dependence. The new ECDD better reflects the weather dependence of electricity consumption than the conventional CDD method.
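    A rough sketch of the kind of regression implied above, fitting electricity use against current-day cooling degrees, a lagged (residual-temperature) term and a specific-humidity term by least squares; the data and coefficients are synthetic and the exact ECDD formulation of the paper is not reproduced.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily data standing in for observations: temperature (deg C), specific
# humidity (g/kg) and electricity use. All coefficients are for illustration only.
n = 200
temp = 24 + 8 * rng.random(n)
humid = 8 + 6 * rng.random(n)
base_temp = 24.0                                   # cooling-degree base temperature
cdd = np.maximum(temp - base_temp, 0)              # current-day cooling degrees
residual = np.concatenate(([0], cdd[:-1])) * 0.2   # lagged term for building thermal inertia
elec = 50 + 3.0 * cdd + 0.6 * residual + 1.2 * np.maximum(humid - 10, 0) + rng.normal(0, 2, n)

# Least-squares fit of the extended cooling-degree-day style model
X = np.column_stack([np.ones(n), cdd, residual, np.maximum(humid - 10, 0)])
coef, *_ = np.linalg.lstsq(X, elec, rcond=None)
print("intercept, CDD, residual-temperature, humidity terms:", np.round(coef, 2))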

  9. An Applied Physicist Does Econometrics

    NASA Astrophysics Data System (ADS)

    Taff, L. G.

    2010-02-01

    The biggest problem facing those attempting to understand econometric data via modeling is that economics has no F = ma. Without a theoretical underpinning, econometricians have no way to build a good model to fit observations to. Physicists do, and when F = ma failed, we knew it. Still desiring to comprehend econometric data, applied economists turn to mis-applying probability theory, especially with regard to the assumptions concerning random errors, and to choosing extremely simplistic analytical formulations of inter-relationships. This introduces model bias to an unknown degree. An applied physicist, used to having to match observations to a numerical or analytical model with a firm theoretical basis, modify the model, re-perform the analysis, and then know why, and when, to delete "outliers", is at a considerable advantage when quantitatively analyzing econometric data. I treat two cases. One is to determine the household density distribution of total assets, annual income, age, level of education, race, and marital status. Each of these "independent" variables is highly correlated with every other, but only current annual income and level of education follow a linear relationship. The other is to discover the functional dependence of total assets on the distribution of assets: total assets show an amazingly tight power law dependence on a quadratic function of portfolio composition. Who knew?

  10. Distributional and regularized radiation fields of non-uniformly moving straight dislocations, and elastodynamic Tamm problem

    NASA Astrophysics Data System (ADS)

    Lazar, Markus; Pellegrini, Yves-Patrick

    2016-11-01

    This work introduces original explicit solutions for the elastic fields radiated by non-uniformly moving, straight, screw or edge dislocations in an isotropic medium, in the form of time-integral representations in which acceleration-dependent contributions are explicitly separated out. These solutions are obtained by applying an isotropic regularization procedure to distributional expressions of the elastodynamic fields built on the Green tensor of the Navier equation. The obtained regularized field expressions are singularity-free, and depend on the dislocation density rather than on the plastic eigenstrain. They cover non-uniform motion at arbitrary speeds, including faster-than-wave ones. A numerical method of computation is discussed, that rests on discretizing motion along an arbitrary path in the plane transverse to the dislocation, into a succession of time intervals of constant velocity vector over which time-integrated contributions can be obtained in closed form. As a simple illustration, it is applied to the elastodynamic equivalent of the Tamm problem, where fields induced by a dislocation accelerated from rest beyond the longitudinal wave speed, and thereafter put to rest again, are computed. As expected, the proposed expressions produce Mach cones, the dynamic build-up and decay of which is illustrated by means of full-field calculations.

  11. Integrated Multi-Scale Data Analytics and Machine Learning for the Distribution Grid and Building-to-Grid Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma M.; Hendrix, Val; Chertkov, Michael

    This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is both useful, and not useful, for the particular field of the distribution grid and buildings interface. While analytics, in general, is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure, or lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data, make predictions and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid, which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors – such as grid and building operators – at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals, such as total carbon reduction or other economic benefit to customers. While some basic analysis of these data streams can provide a wealth of information, computational and human boundaries on performing the analysis are becoming significant, with more data and multi-objective concerns. Efficient applications of analysis and of the machine learning field are therefore being considered in this context.

  12. Distribution trend of high-rise buildings worldwide and factor exploration

    NASA Astrophysics Data System (ADS)

    Yu, Shao-Qiao

    2017-08-01

    This paper examines the current development of high-rise buildings. The worldwide development trend of super high-rise buildings is analyzed based on data from the Council on Tall Buildings and Urban Habitat, taking as its objects the top 100 high-rise buildings on different continents, their development over time, and their building types. The analysis shows a flourishing of super high-rise buildings in the UAE and a stable development of high-rise buildings in Europe and America. The reasons for the regions' differing degrees of development are discussed in terms of social development, economy, culture and public awareness. The paper also presents unavoidable issues of super high-rise buildings and calls for their rational treatment.

  13. Analysis of Numerical Models for Dispersion of Chemical/Biological Agents in Complex Building Environments

    DTIC Science & Technology

    2004-11-01

    variation in ventilation rates over time and the distribution of ventilation air within a building, and to estimate the impact of envelope air ... tightening efforts on infiltration rates. • It may be used to determine the indoor air quality performance of a building before construction, and to

  14. An Investigation of Energy Consumption and Cost in Large Air-Conditioned Buildings. An Interim Report.

    ERIC Educational Resources Information Center

    Milbank, N. O.

    Two similarly large buildings and air conditioning systems are comparatively analyzed as to energy consumption, costs, and inefficiency during certain measured periods of time. Building design and velocity systems are compared to heating, cooling, lighting and distribution capabilities. Energy requirements for pumps, fans and lighting are found to…

  15. Common Object Library Description

    DTIC Science & Technology

    2012-08-01

    For Building Information Modeling (BIM) technology to be successful, it must be consistently applied across many projects, by many teams. The National Building Information... BIM standards and for future research projects. Subject terms: building information modeling (BIM), object

  16. Analysis of building deformation in landslide area using multisensor PSInSAR™ technique.

    PubMed

    Ciampalini, Andrea; Bardi, Federica; Bianchini, Silvia; Frodella, William; Del Ventisette, Chiara; Moretti, Sandro; Casagli, Nicola

    2014-12-01

    Buildings are sensitive to movements caused by ground deformation. Mapping both the spatial and temporal distribution and the degree of building damage represents a useful tool for understanding landslide evolution, magnitude and stress distribution. The high spatial resolution of space-borne SAR interferometry can be used to monitor displacements related to building deformations. In particular, the PSInSAR technique is used to map and monitor ground deformation with millimeter accuracy. The usefulness of the above-mentioned methods was evaluated in the San Fratello municipality (Sicily, Italy), which has historically been affected by landslides, the most recent of which occurred on 14 February 2010. PSInSAR data collected by ERS 1/2, ENVISAT and RADARSAT-1 were used to study the building deformation velocities before the 2010 landslide. The X-band sensors COSMO-SkyMed and TerraSAR-X were used to monitor the building deformation after this event. During 2013, after accurate field inspection of buildings and structures, a damage assessment map of San Fratello was created and then compared to the building deformation velocity maps. The most interesting results were obtained from the comparison between the building deformation velocity map obtained through COSMO-SkyMed and the damage assessment map. This approach can be profitably used by local and Civil Protection Authorities to manage the post-event phase and evaluate the residual risks.

  17. Spatial Information in local society's cultural conservation and research

    NASA Astrophysics Data System (ADS)

    Jang, J.-J.; Liao, H.-M.; Fan, I.-C.

    2015-09-01

    The Center for Geographic Information Science at the Research Center for Humanities and Social Sciences, Academia Sinica (GIS center) coordinates short-, medium-, and long-term operations of multidisciplinary research on related topics in the sciences and humanities. Based on the requirements of multidisciplinary research applications, it sustains the collection and construction of unified spatial base data and knowledge and the building of a spatial data infrastructure. Since the 1990s the GIS center has built two geographic information platforms: "Chinese Civilization in Time and Space" (CCTS) and "Taiwan History and Culture in Time and Space" (THCTS). The goal of both systems is to construct an integrated GIS-based application infrastructure covering the spatial extent of China and Taiwan, the timeframe of Chinese and Taiwanese history, and the contents of Chinese and Taiwanese civilization. Building on THCTS, the center began in 2006 to develop the Cultural Resources GIS (CRGIS, http://crgis.rchss.sinica.edu.tw) to collect temples, historic monuments, historic buildings, old trees, wind lion gods and other cultural resources in Taiwan, and to provide a platform where volunteers can add, edit, organize and query all types of tangible and intangible cultural resources via a Content Management System (CMS). CRGIS has aggregated about 13,000 temples and 4,900 churches. On this basis, various maps of religious belief have been drawn: temple distributions over multiple periods, distributions of different main gods, and church distributions, for example Mazu maps and temple distribution maps for several periods (before 1823, 1823-1895, 1895-1949, 1949-2015) in the Taijiang inner sea area of Tainan. In Taiwan there are religious folk activities, lasting from one day to several days, that pass through a specific geographic range and through certain temples or houses. Such important folk activities, somewhat similar to Western parades, are called "raojing"; their main spirit is that the people within the range passed through receive blessings, and much scholarly research on folk religion depends on this kind of spatial information. In 2012 the GIS center applied WebGIS and GPS to gather spatial information on raojing activities in cooperation with multiple units, covering seven processions over 22 days that passed through 442 temples. The resulting atlas, "Atlas of the 2012 Religious Processions in the Tainan Region", was published in 2014. We also used national cultural resources data from the relevant government authorities and, through metadata design and data processing (geocoding), established cultural geospatial and thematic information, including 800 monuments, 1,100 historic buildings and 4,300 old trees. In recent years, based on CRGIS technology and operational concepts, experts from different domains and local culture and history researchers have cooperated with us to establish local or thematic cultural resource data.
    For example, in collaboration with local culture and history researchers in Kinmen County in 2012, we built data on a Kinmen intangible cultural asset, the Wind Lion God, designing metadata and compiling attribute data and maps for 122 wind lion gods through field survey; it is worth mentioning that this fieldwork data set is more complete than the official registration data from Kinmen National Park, containing more than 40 additional wind lion gods. In 2013 we cooperated with academic experts to establish attribute data and a map of theatres in Taiwan during the Japanese colonial era, a total of 170 theatres, and with Japanese scholars, using their detailed field survey data on 44 sugar refineries of the Japanese colonial era in Taiwan, to produce a distribution map of sugar refineries and extend it into a thematic web site (http://map.net.tw/), "The Cultural Heritage Maps of Taiwan Suger Factories in a Century", following the CRGIS concept of independent cultural themes. The deployment and operation of the CRGIS is meaningful not only for building thematic GIS systems but also for the concepts it embodies: open data, a wiki-style approach, and public participation; it provides an interactive platform combining cultural resource data with geospatial technology. Beyond providing reference material for local cultural education and local cultural identity, the center further cooperates with scholars, academic experts, and local culture and history researchers, accumulating past records and research results and, through spatial database planning, data processing (e.g. geocoding), field survey, and the overlay of geospatial materials, compiling cultural information for value-added applications.

  18. Managed traffic evacuation using distributed sensor processing

    NASA Astrophysics Data System (ADS)

    Ramuhalli, Pradeep; Biswas, Subir

    2005-05-01

    This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.
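
    The evacuation decision the paper describes, routing occupants toward exits while steering them away from zones where sensors report hazards, can be illustrated with a simple graph computation. The sketch below is not the authors' protocol (which is distributed across sensor nodes); it is a minimal centralized stand-in, and the zone names, travel times and hazard penalty are illustrative assumptions.

      # Hedged sketch: dynamic evacuation routing over a building zone graph.
      # This is not the paper's distributed protocol; it only illustrates the kind of
      # decision described (route occupants toward exits while penalizing zones where
      # sensors report hazards). Zone names, weights and the penalty are assumptions.
      import heapq

      def evacuation_routes(zones, edges, hazard, exits, hazard_penalty=50.0):
          """zones: iterable of zone ids; edges: {(a, b): travel_time_seconds};
          hazard: {zone: level in [0, 1]} from the sensor network; exits: set of zone ids.
          Returns {zone: next_zone_toward_safest_exit}."""
          # Undirected adjacency list whose costs grow with reported hazard.
          adj = {z: [] for z in zones}
          for (a, b), t in edges.items():
              cost = t * (1.0 + hazard_penalty * max(hazard.get(a, 0), hazard.get(b, 0)))
              adj[a].append((b, cost))
              adj[b].append((a, cost))
          # Multi-source Dijkstra from all exits; the predecessor gives the next hop.
          dist = {z: float("inf") for z in zones}
          nxt = {}
          heap = [(0.0, e, e) for e in exits]
          for e in exits:
              dist[e] = 0.0
          while heap:
              d, z, toward = heapq.heappop(heap)
              if d > dist[z]:
                  continue
              nxt[z] = toward
              for nb, c in adj[z]:
                  if d + c < dist[nb]:
                      dist[nb] = d + c
                      heapq.heappush(heap, (d + c, nb, z))
          return nxt

      # Example: smoke reported in zone "B" reroutes "A" toward the longer, hazard-free path.
      routes = evacuation_routes(
          zones=["A", "B", "C", "exit1", "exit2"],
          edges={("A", "B"): 10, ("B", "exit1"): 5, ("A", "C"): 20, ("C", "exit2"): 5},
          hazard={"B": 0.8},
          exits={"exit1", "exit2"},
      )
      print(routes["A"])  # -> "C"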

  19. Preliminary conceptual design for geothermal space heating conversion of school district 50 joint facilities at Pagosa Springs, Colorado. GTA report no. 6

    NASA Astrophysics Data System (ADS)

    Engen, I. A.

    1981-11-01

    This feasibility study and preliminary conceptual design effort assesses the conversion of a high school and gymnasium and a middle school building to geothermal space heating. A preliminary cost-benefit assessment, made on the basis of estimated costs for conversion, system maintenance, debt service, resource development, and electricity to power pumps, and of savings from reduced natural gas consumption, concluded that an economic conversion depended on development of an adequate geothermal resource (approximately 150°F, 400 gpm). Material selection assumed that the geothermal water was isolated from the main supply system to minimize the effects of corrosion and deposition, and that system-compatible components were used for the building modifications. Asbestos cement distribution pipe, a stainless steel heat exchanger, and stainless steel lined valves were recommended for the supply, heat transfer, and disposal mechanisms, respectively. A comparison of the calculated average gas consumption cost, escalated at 10% per year, with the conversion project cost, both in 1977 dollars, showed that the project could be amortized over less than 20 years at current interest rates.
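
    The amortization comparison described above amounts to finding the year in which cumulative gas-cost savings, escalated at a fixed annual rate, overtake the one-time conversion cost. A minimal sketch of that arithmetic follows; the dollar figures are hypothetical placeholders, not values from the report.

      # Hedged sketch of the payback comparison: cumulative escalated gas-cost savings
      # versus a one-time conversion cost. The inputs below are hypothetical.
      def payback_year(project_cost, first_year_savings, escalation=0.10, horizon=30):
          """Return the first year in which cumulative escalated savings cover the cost,
          or None if the horizon is reached first."""
          cumulative = 0.0
          for year in range(1, horizon + 1):
              cumulative += first_year_savings * (1 + escalation) ** (year - 1)
              if cumulative >= project_cost:
                  return year
          return None

      print(payback_year(project_cost=500_000, first_year_savings=20_000))  # -> 14 for these inputs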

  20. Self-organizing layers from complex molecular anions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warneke, Jonas; McBriarty, Martin E.; Riechers, Shawn L.

    Ions are promising building blocks for tunable self-organizing materials with advanced technological applications. However, because of strong Coulomb attraction with counterions, the intrinsic properties of ions are difficult to exploit for the preparation of bulk materials. Here, we report the precisely controlled preparation of macroscopic surface layers by soft landing of mass-selected complex anions, which determine the self-organization of the layers through their molecular properties. The family of halogenated dodecaborates [B12X12]2- (X = F, Cl, Br, I), in which the internal charge distribution between core and shell regions of the molecular ions varies systematically, was deposited at high coverage on different self-assembled monolayer surfaces (SAMs) on gold. Layers of anions were found to be stabilized by accumulation of neutral molecules. Different phases, self-organization mechanisms and optical properties were observed to depend upon the internal charge distribution of the deposited anions, the underlying surface and the coadsorbed molecules. This demonstrates rational control of the properties of anion-based layers.

  1. Best of Both Worlds Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".

    PubMed

    Needham, Catherine

    2017-08-16

    This article builds on Mannion and Exworthy's account of the tensions between standardization and customization within health services to explore why these tensions exist. It highlights the limitations of explanations which root them in an expression of managerialism versus professionalism and suggests that each logic is embedded in a set of ontological, epistemological and moral commitments which are held in tension. At the front line of care delivery, people cannot resolve these tensions but must navigate and negotiate them. The legitimacy of a health system depends on its ability to deliver the 'best of both worlds' to citizens, offering the reassurance of sameness and the dignity of difference. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  2. Distributed Learning Metadata Standards

    ERIC Educational Resources Information Center

    McClelland, Marilyn

    2004-01-01

    Significant economies can be achieved in distributed learning systems architected with a focus on interoperability and reuse. The key building blocks of an efficient distributed learning architecture are the use of standards and XML technologies. The goal of plug and play capability among various components of a distributed learning system…

  3. Role development of nurses for technology-dependent children attending mainstream schools in Japan.

    PubMed

    Shimizu, Fumie; Suzuki, Machiko

    2015-04-01

    To describe the role development of nurses caring for medical technology-dependent children attending Japanese mainstream schools. Semi-structured interviews with 21 nurses caring for technology-dependent children were conducted and analyzed using the modified grounded theory approach. Nurses developed roles centered on maintaining technology-dependent children's physical health so that the children could learn alongside their peers, through building relationships, learning how to interact with the children, understanding the children and the school community, and realizing the meaning of supporting technology-dependent children. These findings support nurses in building relationships of mutual trust with teachers and children and in learning on the job in mainstream schools. © 2015, Wiley Periodicals, Inc.

  4. Explaining observed red and blue-shifts using multi-stranded coronal loops

    NASA Astrophysics Data System (ADS)

    Regnier, S.; Walsh, R. W.; Pearson, J.

    2012-03-01

    Magnetic plasma loops have been termed the building blocks of the solar atmosphere. However, it must be recognised that if the range of loop structures we can observe consists of many "sub-resolution" elements, then current one-dimensional hydrodynamic models are really only applicable to an individual plasma element or strand. Thus a loop should be viewed as an amalgamation of these strands. They could operate in thermal isolation from one another, with a wide range of temperatures occurring across the structural elements. This scenario could occur when the energy release mechanism consists of localised, discrete bursts of energy that are due to small-scale reconnection sites within the coronal magnetic field: the nanoflare coronal heating mechanism. These energy bursts occur in a time-dependent manner, distributed along the loop/strand length, giving a heating function that depends on space and time. An important observational discovery with the Hinode/EIS spectrometer is the existence of red and blue-shifts in coronal loops depending on the location of the footpoints (inner or outer parts of the active region), and the temperature of the emission line in which the Doppler shifts are measured. Based on the multi-stranded model developed by Sarkar and Walsh (2008, ApJ, 683, 516), we show that red and blue-shifts exist in different simulated Hinode/EIS passbands: cooler lines (OV-SiVII) are dominated by red-shifts, whilst hotter lines (FeXV-CaXVII) show a combination of both. The distribution of blue-shifts depends on the energy input and not so much on the heating location. The characteristic Doppler shifts generated fit well with observed values. We also simulate Hinode/EIS rasters to closely compare our simulation with the observations. Even if not statistically significant, loops can have footpoints with opposite Doppler shifts.

  5. Linking interseismic deformation with coseismic slip using dynamic rupture simulations

    NASA Astrophysics Data System (ADS)

    Yang, H.; He, B.; Weng, H.

    2017-12-01

    The largest earthquakes on Earth occur at subduction zones, sometimes accompanied by devastating tsunamis. Reducing losses from megathrust earthquakes and tsunamis demands accurate estimates of rupture scenarios for future earthquakes. The interseismic locking distribution derived from geodetic observations is often used to qualitatively evaluate future earthquake potential. However, how to quantitatively estimate the coseismic slip from the locking distribution remains challenging. Here we derive the coseismic rupture process of the 2012 Mw 7.6 Nicoya, Costa Rica, earthquake from the interseismic locking distribution using spontaneous rupture simulation. We construct a three-dimensional elastic medium with a curved fault, which is governed by the linear slip-weakening law. The initial stress on the fault is set based on the build-up stress inferred from locking and the dynamic friction coefficient from fast-speed sliding experiments. Our numerical results for coseismic slip distribution, moment rate function and final earthquake moment are consistent with those derived from seismic and geodetic observations. Furthermore, we find that the epicentral locations affect rupture scenarios and may lead to various sizes of earthquakes given the heterogeneous stress distribution. In the Nicoya region, less than half of the rupture initiation regions where the locking degree is greater than 0.6 can develop into large earthquakes (Mw > 7.2). The results of location-dependent earthquake magnitudes underscore the necessity of conducting a large number of simulations to quantitatively evaluate seismic hazard from interseismic locking models.

  6. 7. WEYMOUTH FILTRATION PLANT, BUILDING 1 INTERIOR: LA VERNE CONTROL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. WEYMOUTH FILTRATION PLANT, BUILDING 1 INTERIOR: LA VERNE CONTROL ROOM, REGULATES DISTRIBUTION OF WATER, CONTROLS POWER HOUSES. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA

  7. Modeling and Optimization of Commercial Buildings and Stationary Fuel Cell Systems (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ainscough, C.; McLarty, D.; Sullivan, R.

    2013-10-01

    This presentation describes the Distributed Generation Building Energy Assessment Tool (DG-BEAT) developed by the National Renewable Energy Laboratory and the University of California Irvine. DG-BEAT is designed to allow stakeholders to assess the economics of installing stationary fuel cell systems in a variety of building types in the United States.

  8. Geometric Model of Induction Heating Process of Iron-Based Sintered Materials

    NASA Astrophysics Data System (ADS)

    Semagina, Yu V.; Egorova, M. A.

    2018-03-01

    The article addresses the problem of building multivariable dependencies from experimental data. A constructive method for solving the problem is presented in the form of equations of (n-1)-surface compartments of the extended Euclidean space E+n. The dimension of the space is taken to be equal to the sum of the number of parameters and factors of the model of the system being studied. The basis for building multivariable dependencies is a generalization to n-space of the approach used for surface compartments of 3D space. The surface is designed using the kinematic method, moving one geometric object along a certain trajectory. The proposed approach simplifies the process of building the multifactorial empirical dependencies that describe the process being investigated.

  9. The evolution of biomass-burning aerosol size distributions due to coagulation: dependence on fire and meteorological details and parameterization

    NASA Astrophysics Data System (ADS)

    Sakamoto, Kimiko M.; Laing, James R.; Stevens, Robin G.; Jaffe, Daniel A.; Pierce, Jeffrey R.

    2016-06-01

    Biomass-burning aerosols have a significant effect on global and regional aerosol climate forcings. To model the magnitude of these effects accurately requires knowledge of the size distribution of the emitted and evolving aerosol particles. Current biomass-burning inventories do not include size distributions, and global and regional models generally assume a fixed size distribution from all biomass-burning emissions. However, biomass-burning size distributions evolve in the plume due to coagulation and net organic aerosol (OA) evaporation or formation, and the plume processes occur on spatial scales smaller than global/regional-model grid boxes. The extent of this size-distribution evolution is dependent on a variety of factors relating to the emission source and atmospheric conditions. Therefore, accurately accounting for biomass-burning aerosol size in global models requires an effective aerosol size distribution that accounts for this sub-grid evolution and can be derived from available emission-inventory and meteorological parameters. In this paper, we perform a detailed investigation of the effects of coagulation on the aerosol size distribution in biomass-burning plumes. We compare the effect of coagulation to that of OA evaporation and formation. We develop coagulation-only parameterizations for effective biomass-burning size distributions using the SAM-TOMAS large-eddy simulation plume model. For the most-sophisticated parameterization, we use the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) to build a parameterization of the aged size distribution based on the SAM-TOMAS output and seven inputs: emission median dry diameter, emission distribution modal width, mass emissions flux, fire area, mean boundary-layer wind speed, plume mixing depth, and time/distance since emission. This parameterization was tested against an independent set of SAM-TOMAS simulations and yields R2 values of 0.83 and 0.89 for Dpm and modal width, respectively. The size distribution is particularly sensitive to the mass emissions flux, fire area, wind speed, and time, and we provide simplified fits of the aged size distribution to just these input variables. The simplified fits were tested against 11 aged biomass-burning size distributions observed at the Mt. Bachelor Observatory in August 2015. The simple fits captured over half of the variability in observed Dpm and modal width even though the freshly emitted Dpm and modal widths were unknown. These fits may be used in global and regional aerosol models. Finally, we show that coagulation generally leads to greater changes in the particle size distribution than OA evaporation/formation does, using estimates of OA production/loss from the literature.
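
    The simplified fits described above map a handful of emission and meteorological inputs to the aged median diameter. The sketch below illustrates the general idea with a power-law (log-linear) least-squares fit on synthetic data; the input ranges, exponents and noise level are illustrative assumptions, not the SAM-TOMAS-derived parameterization.

      # Hedged sketch of a simplified power-law fit of the aged median diameter Dpm on
      # mass emissions flux, fire area, wind speed and plume age. The synthetic training
      # values and the resulting exponents are illustrative only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      flux = rng.uniform(1e-6, 1e-4, n)     # kg m^-2 s^-1 (assumed range)
      area = rng.uniform(1e4, 1e7, n)       # m^2
      wind = rng.uniform(1.0, 15.0, n)      # m s^-1
      age = rng.uniform(600.0, 86400.0, n)  # s since emission
      # Synthetic target: coagulation grows the median diameter with flux, area and age.
      dpm = (100.0 * (flux / 1e-5) ** 0.1 * (area / 1e5) ** 0.05 * (age / 3600.0) ** 0.15
             * (wind / 5.0) ** -0.05 * rng.lognormal(0.0, 0.05, n))

      # Least-squares fit in log space: log Dpm = b0 + b1 log flux + b2 log area + ...
      X = np.column_stack([np.ones(n), np.log(flux), np.log(area), np.log(wind), np.log(age)])
      coef, *_ = np.linalg.lstsq(X, np.log(dpm), rcond=None)
      pred = X @ coef
      r2 = 1.0 - np.sum((np.log(dpm) - pred) ** 2) / np.sum((np.log(dpm) - np.log(dpm).mean()) ** 2)
      print(np.round(coef, 3), round(r2, 2))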

  10. Research of Ancient Architectures in Jin-Fen Area Based on GIS&BIM Technology

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Zheng, Qiuhong; Gao, Huiying; Sun, Hai

    2017-05-01

    The number of well-preserved ancient buildings located in Shanxi Province, which has the largest share of ancient architecture in China, is about 18,418, among which 9,053 buildings have a wooden frame structure. The value of applying BIM (Building Information Modeling) and GIS (Geographic Information System) is gradually being explored and demonstrated in the corresponding fields of spatial distribution information management, routine maintenance, special conservation and restoration of ancient architecture, and the evaluation and simulation of related disasters such as earthquakes. The research objects are the ancient architectures in the Jin-Fen area, which were first investigated by Sicheng LIANG and recorded in his "Chinese ancient architectures survey report". The research objects include those in Sicheng LIANG's investigation, with further adjustments made through the authors' on-site investigation and literature search and collection. During this research, the spatial distribution geodatabase of the research objects was established using GIS. A BIM component library for ancient buildings was built by combining on-site investigation data with classic works such as "Yingzao Fashi", a treatise on architectural methods of the Song Dynasty, the "Yongle Encyclopedia", and "Gongcheng Zuofa Zeli", case collections of engineering practice by the Ministry of Construction of the Qing Dynasty. A building of the Guangsheng temple in Hongtong county is selected as an example to elaborate the BIM model construction process based on the BIM component library for ancient buildings. Based on the foregoing results of spatial distribution data, feature attribute data, 3D graphic information and a parametric building information model, an information management system for ancient architectures in the Jin-Fen area, utilizing GIS & BIM technology, can be constructed to support further research on seismic disaster analysis and seismic performance simulation.

  11. Gaseous Elemental Mercury and Total and Leached Mercury in Building Materials from the Former Hg-Mining Area of Abbadia San Salvatore (Central Italy).

    PubMed

    Vaselli, Orlando; Nisi, Barbara; Rappuoli, Daniele; Cabassi, Jacopo; Tassi, Franco

    2017-04-15

    Mercury has a strong environmental impact since both its organic and inorganic forms are toxic, and it represents a pollutant of global concern. Liquid Hg is highly volatile and can be released during natural and anthropogenic processes into the hydrosphere, biosphere and atmosphere. In this study, the distribution of Gaseous Elemental Mercury (GEM) and the total and leached mercury concentrations on paint, plaster, roof tiles, concrete, metals, dust and wood structures were determined in the main buildings and structures of the former Hg-mining area of Abbadia San Salvatore (Siena, Central Italy). The mining complex (divided into seven units) covers a surface of about 65 ha and contains mining structures and managers' and workers' buildings. Nine surveys of GEM measurements were carried out from July 2011 to August 2015 for the buildings and structures located in Units 2, 3 and 6, the latter being the area where liquid mercury was produced. Measurements were also performed in February, April, July, September and December 2016 in the edifices and mining structures of Unit 6. GEM concentrations showed a strong variability in time and space, mostly depending on ambient temperature and the operational activities that were carried out in each building. The Unit 2 surveys carried out in the hotter period (from June to September) showed GEM concentrations up to 27,500 ng·m⁻³, while in Unit 6 they were on average much higher and occasionally saturated the GEM measurement device (>50,000 ng·m⁻³). Concentrations of total (in mg·kg⁻¹) and leached (in μg·L⁻¹) mercury measured in different building materials (up to 46,580 mg·kg⁻¹ and 4,470 μg·L⁻¹, respectively) were highly variable, being related to the edifice or mining structure from which they were collected. The results obtained in this study are of relevant interest for the operational cleaning to be carried out during reclamation activities.

  12. Laboratory demonstration of lightning strike pattern on different roof tops installed with Franklin Rods

    NASA Astrophysics Data System (ADS)

    Ullah, Irshad; Baharom, MNR; Ahmed, H.; Luqman, HM.; Zainal, Zainab

    2017-11-01

    Protection against lightning is always a challenging task for researchers. Understanding the consequences of lightning strikes on buildings of different shapes requires comprehensive knowledge in order to inform the public. This paper is mainly concerned with the lightning strike pattern on buildings of different shapes. The work is based on practical experiments carried out in a high-voltage laboratory. Different shapes of scaled structures were selected in order to investigate the equal distribution of lightning voltage. The equal distribution of lightning voltage gives the maximum probability of a lightning strike to the air terminal of the selected shapes. Building shape plays a very important role in lightning protection. The roof tops have different geometries, and the Franklin rod installation also varies with the shape of the roof top. In accordance with the ambient weather conditions of Malaysia, a high-voltage impulse was applied to the lightning rods installed on the different geometrical shapes. An equal distribution of the high-voltage impulse was obtained because the geometry of the scaled structures is identical and the air gap for all the tested objects was kept the same. This equal distribution of the lightning voltage also shows that the probability of a lightning strike is highest at the corners and edges of the building structure.

  13. Attentional modulation of neuronal variability in circuit models of cortex

    PubMed Central

    Kanashiro, Tatjana; Ocker, Gabriel Koch; Cohen, Marlene R; Doiron, Brent

    2017-01-01

    The circuit mechanisms behind shared neural variability (noise correlation) and its dependence on neural state are poorly understood. Visual attention is well-suited to constrain cortical models of response variability because attention both increases firing rates and their stimulus sensitivity, as well as decreases noise correlations. We provide a novel analysis of population recordings in rhesus primate visual area V4 showing that a single biophysical mechanism may underlie these diverse neural correlates of attention. We explore model cortical networks where top-down mediated increases in excitability, distributed across excitatory and inhibitory targets, capture the key neuronal correlates of attention. Our models predict that top-down signals primarily affect inhibitory neurons, whereas excitatory neurons are more sensitive to stimulus specific bottom-up inputs. Accounting for trial variability in models of state dependent modulation of neuronal activity is a critical step in building a mechanistic theory of neuronal cognition. DOI: http://dx.doi.org/10.7554/eLife.23978.001 PMID:28590902

  14. United Nations Office on Drugs and Crime International Network of Drug Dependence Treatment and Rehabilitation Resource Centres: Treatnet.

    PubMed

    Tomás-Rosselló, Juana; Rawson, Richard A; Zarza, Maria J; Bellows, Anne; Busse, Anja; Saenz, Elizabeth; Freese, Thomas; Shawkey, Mansour; Carise, Deni; Ali, Robert; Ling, Walter

    2010-10-01

    Key to the dissemination of evidence-based addiction treatments is the exchange of experiences and mutual support among treatment practitioners, as well as the availability of accurate addiction training materials and effective trainers. To address the shortage of such resources, the United Nations Office on Drugs and Crime (UNODC) created Treatnet, a network of 20 drug dependence treatment resource centers around the world. Treatnet's primary goal is to promote the use of effective addiction treatment practices. Phase I of this project included (1) selecting and establishing a network of geographically distributed centers; (2) conducting a capacity-building program consisting of a training needs assessment, development of training packages, and the training of 2 trainers per center in 1 content area each; and (3) creating good-practice documents. Data on the training activities conducted by the trainers during their first 6 months in the field are presented. Plans for Phase II of the Treatnet project are also discussed.

  15. Calculation of absolute protein-ligand binding free energy using distributed replica sampling.

    PubMed

    Rodinger, Tomas; Howell, P Lynne; Pomès, Régis

    2008-10-21

    Distributed replica sampling [T. Rodinger et al., J. Chem. Theory Comput. 2, 725 (2006)] is a simple and general scheme for Boltzmann sampling of conformational space by computer simulation in which multiple replicas of the system undergo a random walk in reaction coordinate or temperature space. Individual replicas are linked through a generalized Hamiltonian containing an extra potential energy term or bias which depends on the distribution of all replicas, thus enforcing the desired sampling distribution along the coordinate or parameter of interest regardless of free energy barriers. In contrast to replica exchange methods, efficient implementation of the algorithm does not require synchronicity of the individual simulations. The algorithm is inherently suited for large-scale simulations using shared or heterogeneous computing platforms such as a distributed network. In this work, we build on our original algorithm by introducing Boltzmann-weighted jumping, which allows moves of a larger magnitude and thus enhances sampling efficiency along the reaction coordinate. The approach is demonstrated using a realistic and biologically relevant application; we calculate the standard binding free energy of benzene to the L99A mutant of T4 lysozyme. Distributed replica sampling is used in conjunction with thermodynamic integration to compute the potential of mean force for extracting the ligand from protein and solvent along a nonphysical spatial coordinate. Dynamic treatment of the reaction coordinate leads to faster statistical convergence of the potential of mean force than a conventional static coordinate, which suffers from slow transitions on a rugged potential energy surface.
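
    The core idea, a bias that depends on the instantaneous distribution of all replicas so that the ensemble is pushed toward the desired coverage of the reaction coordinate, can be caricatured in a few lines. The toy below is only an illustration of that idea on a one-dimensional double-well potential; it is not the published distributed-replica algorithm, its generalized Hamiltonian, or the Boltzmann-weighted jumping scheme, and the potential, bias strength and move sizes are arbitrary choices.

      # Hedged toy: each replica feels a penalty proportional to how crowded its histogram
      # bin is, so the ensemble spreads along the coordinate. Not the published method.
      import numpy as np

      rng = np.random.default_rng(1)
      def U(x):                      # a rugged double-well stand-in for a potential of mean force
          return 4.0 * (x**2 - 1.0) ** 2 + 0.5 * np.cos(8 * np.pi * x)

      n_rep, n_steps, kT, strength = 16, 5000, 1.0, 2.0
      edges = np.linspace(-2.0, 2.0, 21)
      x = rng.uniform(-2.0, 2.0, n_rep)

      def bias(positions):
          # Penalize crowded histogram bins so replicas spread along the coordinate.
          counts, _ = np.histogram(positions, bins=edges)
          return strength * counts / n_rep, counts

      for _ in range(n_steps):
          i = rng.integers(n_rep)                    # replicas update asynchronously
          trial = np.clip(x[i] + rng.normal(0, 0.1), -2.0, 2.0)
          b, counts = bias(x)
          bin_old = min(np.searchsorted(edges, x[i], side="right") - 1, len(counts) - 1)
          bin_new = min(np.searchsorted(edges, trial, side="right") - 1, len(counts) - 1)
          dE = (U(trial) + b[bin_new]) - (U(x[i]) + b[bin_old])
          if dE <= 0 or rng.random() < np.exp(-dE / kT):
              x[i] = trial
      print(np.histogram(x, bins=edges)[0])          # replica occupancy per bin along the coordinate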

  16. Calculation of absolute protein-ligand binding free energy using distributed replica sampling

    NASA Astrophysics Data System (ADS)

    Rodinger, Tomas; Howell, P. Lynne; Pomès, Régis

    2008-10-01

    Distributed replica sampling [T. Rodinger et al., J. Chem. Theory Comput. 2, 725 (2006)] is a simple and general scheme for Boltzmann sampling of conformational space by computer simulation in which multiple replicas of the system undergo a random walk in reaction coordinate or temperature space. Individual replicas are linked through a generalized Hamiltonian containing an extra potential energy term or bias which depends on the distribution of all replicas, thus enforcing the desired sampling distribution along the coordinate or parameter of interest regardless of free energy barriers. In contrast to replica exchange methods, efficient implementation of the algorithm does not require synchronicity of the individual simulations. The algorithm is inherently suited for large-scale simulations using shared or heterogeneous computing platforms such as a distributed network. In this work, we build on our original algorithm by introducing Boltzmann-weighted jumping, which allows moves of a larger magnitude and thus enhances sampling efficiency along the reaction coordinate. The approach is demonstrated using a realistic and biologically relevant application; we calculate the standard binding free energy of benzene to the L99A mutant of T4 lysozyme. Distributed replica sampling is used in conjunction with thermodynamic integration to compute the potential of mean force for extracting the ligand from protein and solvent along a nonphysical spatial coordinate. Dynamic treatment of the reaction coordinate leads to faster statistical convergence of the potential of mean force than a conventional static coordinate, which suffers from slow transitions on a rugged potential energy surface.

  17. A generalized strategy for building resident database interfaces

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha; Wanderman, Ken

    1990-01-01

    A strategy for building resident interfaces to host heterogeneous distributed data base management systems is developed. The strategy is used to construct several interfaces. A set of guidelines is developed for users to construct their own interfaces.

  18. Ventilation System Effectiveness and Tested Indoor Air Quality Impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudd, Armin; Bergey, Daniel

    In this project, Building America research team Building Science Corporation tested the effectiveness of ventilation systems at two unoccupied, single-family, detached lab homes at the University of Texas - Tyler. Five ventilation system tests were conducted with various whole-building ventilation systems. Multizone fan pressurization testing characterized building and zone enclosure leakage. PFT testing showed multizone air change rates and interzonal airflow. Cumulative particle counts for six particle sizes, and formaldehyde and other Top 20 VOC concentrations, were measured in multiple zones. The testing showed that single-point exhaust ventilation was inferior as a whole-house ventilation strategy. This was because the source of outside air was not directly from outside, the ventilation air was not distributed, and no provision existed for air filtration. Indoor air recirculation by a central air distribution system can help improve the exhaust ventilation system by way of air mixing and filtration. In contrast, the supply and balanced ventilation systems showed that there is a significant benefit to drawing outside air from a known outside location, and filtering and distributing that air. Compared to the exhaust systems, the CFIS and ERV systems showed better ventilation air distribution and lower concentrations of particulates, formaldehyde and other VOCs. System improvement percentages were estimated based on four system factor categories: balance, distribution, outside air source, and recirculation filtration. Recommended system factors could be applied to reduce ventilation fan airflow rates relative to ASHRAE Standard 62.2 to save energy and reduce moisture control risk in humid climates. HVAC energy savings were predicted to be 8-10%, or $50-$75/year.

  19. Lighting system combining daylight concentrators and an artificial source

    DOEpatents

    Bornstein, Jonathan G.; Friedman, Peter S.

    1985-01-01

    A combined lighting system for a building interior includes a stack of luminescent solar concentrators (LSC), an optical conduit made of preferably optical fibers for transmitting daylight from the LSC stack, a collimating lens set at an angle, a fixture for receiving the daylight at one end and for distributing the daylight as illumination inside the building, an artificial light source at the other end of the fixture for directing artificial light into the fixture for distribution as illumination inside the building, an automatic dimmer/brightener for the artificial light source, and a daylight sensor positioned near to the LSC stack for controlling the automatic dimmer/brightener in response to the daylight sensed. The system also has a reflector positioned behind the artificial light source and a fan for exhausting heated air out of the fixture during summer and for forcing heated air into the fixture for passage into the building interior during winter.

  20. The Relationship between Building Teacher Leadership Capacity and Campus Culture

    ERIC Educational Resources Information Center

    Harris, Dawn R.; Kemp-Graham, Kriss Y.

    2017-01-01

    The purpose of this explanatory sequential mixed methods research study was to explore the relationship between building teacher leadership capacity and campus culture in a suburban East Texas school district. Developing teacher leaders by building leadership capacity depends on administrators' abilities to develop leaders from within the existing…

  1. School Building in Early Development. Part 3. Project Implementation.

    ERIC Educational Resources Information Center

    Giertz, L. M.; Dijkgraaf, C.

    1977-01-01

    The first two publications in this series dealt generally with school building problems in developing nations. This third part offers more direct guidance. Described in some detail are organizational fundamentals and tools that depend on universal similarities in building practice and have, therefore, been recommended for use internationally. A…

  2. Building Information Modeling (BIM) Roadmap: Supplement 2 - BIM Implementation Plan for Military Construction Projects, Bentley Platform

    DTIC Science & Technology

    2011-01-01

    ERDC TR-06-10, Supplement 2 (January 2011). Building Information Modeling (BIM) Roadmap, Supplement 2 - BIM Implementation Plan for Military Construction Projects, Bentley Platform. Approved for public release; distribution is unlimited. Abstract: Building Information Modeling (BIM) technology provides the communities of practice in…

  3. Indoor environment program. 1994 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daisey, J.M.

    1995-04-01

    Buildings use approximately one-third of the energy consumed in the United States. The potential energy savings derived from reduced infiltration and ventilation in buildings are substantial, since energy use associated with conditioning and distributing ventilation air is about 5.5 EJ per year. However, since ventilation is the dominant mechanism for removing pollutants from indoor sources, reduction of ventilation can have adverse effects on indoor air quality, and on the health, comfort, and productivity of building occupants. The Indoor Environment Program in LBL's Energy and Environment Division was established in 1977 to conduct integrated research on ventilation, indoor air quality, and energy use and efficiency in buildings for the purpose of reducing energy liabilities associated with airflows into, within, and out of buildings while maintaining or improving occupant health and comfort. The Program is part of LBL's Center for Building Science. Research is conducted on building energy use and efficiency, ventilation and infiltration, and thermal distribution systems; on the nature, sources, transport, transformation, and deposition of indoor air pollutants; and on exposure and health risks associated with indoor air pollutants. Pollutants of particular interest include radon; volatile, semivolatile, and particulate organic compounds; and combustion emissions, including environmental tobacco smoke, CO, and NOx.

  4. Extracting DNA words based on the sequence features: non-uniform distribution and integrity.

    PubMed

    Li, Zhi; Cao, Hongyan; Cui, Yuehua; Zhang, Yanbo

    2016-01-25

    DNA sequence can be viewed as an unknown language with words as its functional units. Given that most sequence alignment algorithms, such as motif discovery algorithms, depend on the quality of background information about the sequences, it is necessary to develop an ab initio algorithm for extracting the "words" based only on the DNA sequences. We considered non-uniform distribution and integrity to be two important features of a word, and on this basis we developed an ab initio algorithm to extract "DNA words" that have potential functional meaning. A Kolmogorov-Smirnov test was used to test for consistency with a uniform distribution along the DNA sequences, and integrity was judged by sequence and position alignment. Two random base sequences were adopted as a negative control, and an English book was used as a positive control to verify our algorithm. We applied our algorithm to the genomes of Saccharomyces cerevisiae and 10 strains of Escherichia coli to show the utility of the method. The results provide strong evidence that the algorithm is a promising tool for building a DNA dictionary ab initio. Our method provides a fast way for large-scale screening of important DNA elements and offers potential insights into the understanding of a genome.
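
    The non-uniform-distribution criterion lends itself to a compact illustration: collect the positions at which a candidate word occurs, rescale them to [0, 1], and ask a one-sample Kolmogorov-Smirnov test whether they are consistent with a uniform distribution. The sketch below does only this; the toy sequence, the significance threshold and the minimum-occurrence cutoff are illustrative assumptions, and the paper's integrity criterion is not shown.

      # Hedged sketch of the non-uniformity check for a candidate DNA "word".
      from scipy.stats import kstest

      def word_positions(sequence, word):
          """Start positions (allowing overlaps) of `word` in `sequence`."""
          pos, start = [], sequence.find(word)
          while start != -1:
              pos.append(start)
              start = sequence.find(word, start + 1)
          return pos

      def is_nonuniform(sequence, word, alpha=0.05):
          pos = word_positions(sequence, word)
          if len(pos) < 5:          # too few occurrences to test (illustrative cutoff)
              return False
          scaled = [p / (len(sequence) - len(word)) for p in pos]   # map positions to [0, 1]
          return kstest(scaled, "uniform").pvalue < alpha

      # Toy example: "GAATTC" clustered in the first part of a simple background sequence.
      seq = "GAATTC" * 30 + "ACGT" * 120
      print(is_nonuniform(seq, "GAATTC"))   # True: occurrences are clustered, not uniform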

  5. Statistical Analysis of Hurst Exponents of Essential/Nonessential Genes in 33 Bacterial Genomes

    PubMed Central

    Liu, Xiao; Wang, Baojin; Xu, Luo

    2015-01-01

    Methods for identifying essential genes currently depend predominantly on biochemical experiments. However, there is demand for improved computational methods for determining gene essentiality. In this study, we used the Hurst exponent, a characteristic parameter to describe long-range correlation in DNA, and analyzed its distribution in 33 bacterial genomes. In most genomes (31 out of 33) the significance levels of the Hurst exponents of the essential genes were significantly higher than for the corresponding full-gene-set, whereas the significance levels of the Hurst exponents of the nonessential genes remained unchanged or increased only slightly. All of the Hurst exponents of essential genes followed a normal distribution, with one exception. We therefore propose that the distribution feature of Hurst exponents of essential genes can be used as a classification index for essential gene prediction in bacteria. For computer-aided design in the field of synthetic biology, this feature can build a restraint for pre- or post-design checking of bacterial essential genes. Moreover, considering the relationship between gene essentiality and evolution, the Hurst exponents could be used as a descriptive parameter related to evolutionary level, or be added to the annotation of each gene. PMID:26067107
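
    The Hurst exponent of a gene can be estimated in several ways; a common one is rescaled-range (R/S) analysis of a numeric walk derived from the sequence. The sketch below uses a purine/pyrimidine +1/-1 mapping and a least-squares slope of log(R/S) against log(window size); the mapping, window sizes and estimator details are illustrative assumptions and may differ from the procedure used in the study.

      # Hedged sketch: Hurst exponent of a DNA-derived walk via rescaled-range analysis.
      import numpy as np

      def hurst_rs(series, window_sizes=(16, 32, 64, 128, 256)):
          """Estimate the Hurst exponent from the slope of log(R/S) versus log(window)."""
          series = np.asarray(series, dtype=float)
          log_n, log_rs = [], []
          for n in window_sizes:
              if n > len(series):
                  continue
              rs_vals = []
              for start in range(0, len(series) - n + 1, n):
                  w = series[start:start + n]
                  dev = np.cumsum(w - w.mean())       # cumulative deviation from the window mean
                  r = dev.max() - dev.min()           # range R
                  s = w.std()                         # standard deviation S
                  if s > 0:
                      rs_vals.append(r / s)
              if rs_vals:
                  log_n.append(np.log(n))
                  log_rs.append(np.log(np.mean(rs_vals)))
          slope, _ = np.polyfit(log_n, log_rs, 1)
          return slope

      gene = "ATGACCGGTAAATTTCCCGGGAAATTT" * 40          # toy sequence, not a real gene
      walk = [1 if b in "AG" else -1 for b in gene]      # purine = +1, pyrimidine = -1
      print(round(hurst_rs(walk), 2))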

  6. Visualization of the structural changes in plywood and gypsum board during the growth of Chaetomium globosum and Stachybotrys chartarum.

    PubMed

    Lewinska, Anna M; Hoof, Jakob B; Peuhkuri, Ruut H; Rode, Carsten; Lilje, Osu; Foley, Matthew; Trimby, Patrick; Andersen, Birgitte

    2016-10-01

    Fungal growth in indoor environments is associated with many negative health effects. Many studies focus on brown- and white-rot fungi and their effect on wood, but there is none that reveals the influence of soft-rot fungi, such as Stachybotrys spp. and Chaetomium spp., on the structure of building materials such as plywood and gypsum wallboard. This study focuses on using micro-computed tomography (microCT) to investigate changes of the structure of plywood and gypsum wallboard during fungal degradation by S. chartarum and C. globosum. Changes in the materials as a result of dampness and fungal growth were determined by measuring porosity and pore shape via microCT. The results show that the composition of the building material influenced the level of penetration by fungi as shown by scanning electron microscopy (SEM). Plywood appeared to be the most affected, with the penetration of moisture and fungi throughout the whole thickness of the sample. Conversely, fungi grew only on the top cardboard in the gypsum wallboard and they did not have significant influence on the gypsum wallboard structure. The majority of the observed changes in gypsum wallboard occurred due to moisture. This paper suggests that the mycelium distribution within building materials and the structural changes, caused by dampness and fungal growth, depend on the type of the material. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. 16. Perimeter acquisition radar building room #102, electrical equipment room; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Perimeter acquisition radar building room #102, electrical equipment room; the prime power distribution system. Excellent example of pendulum-type shock isolation. The grey cabinet and barrel assembly is part of the polychlorinated biphenyl (PCB) retrofill project - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  8. The longevity of lava dome eruptions

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Ogburn, Sarah E.; Calder, Eliza S.

    2016-02-01

    Understanding the duration of past, ongoing, and future volcanic eruptions is an important scientific goal and a key societal need. We present a new methodology for forecasting the duration of ongoing and future lava dome eruptions based on a database (DomeHaz) recently compiled by the authors. The database includes duration and composition for 177 such eruptions, with "eruption" defined as the period encompassing individual episodes of dome growth along with associated quiescent periods during which extrusion pauses but unrest continues. In a key finding, we show that probability distributions for dome eruption durations are both heavy tailed and composition dependent. We construct objective Bayesian statistical models featuring heavy-tailed Generalized Pareto distributions with composition-specific parameters to make forecasts about the durations of new and ongoing eruptions that depend on both eruption duration to date and composition. Our Bayesian predictive distributions reflect both uncertainty about model parameter values (epistemic uncertainty) and the natural variability of the geologic processes (aleatoric uncertainty). The results are illustrated by presenting likely trajectories for 14 dome-building eruptions ongoing in 2015. Full representation of the uncertainty is presented for two key eruptions, Soufriére Hills Volcano in Montserrat (10-139 years, median 35 years) and Sinabung, Indonesia (1-17 years, median 4 years). Uncertainties are high but, importantly, quantifiable. This work provides for the first time a quantitative and transferable method and rationale on which to base long-term planning decisions for lava dome-forming volcanoes, with wide potential use and transferability to forecasts of other types of eruptions and other adverse events across the geohazard spectrum.
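
    Because the duration model is a Generalized Pareto distribution, a forecast for an ongoing eruption reduces to a conditional survival probability: the chance that the total duration exceeds the elapsed time plus some additional span, given that it has already exceeded the elapsed time. The sketch below computes that ratio; the shape and scale values are illustrative placeholders rather than the composition-specific posteriors of the paper, and parameter (epistemic) uncertainty is ignored.

      # Hedged sketch: conditional survival forecast under a Generalized Pareto duration model.
      from scipy.stats import genpareto

      def prob_continues(extra_years, elapsed_years, shape, scale):
          """P(duration > elapsed + extra | duration > elapsed) for a GPD duration model."""
          sf = genpareto.sf  # survival function, 1 - CDF
          return sf(elapsed_years + extra_years, shape, scale=scale) / sf(elapsed_years, shape, scale=scale)

      # Illustrative numbers only: an eruption already 5 years long under a heavy-tailed model.
      shape, scale = 0.5, 10.0
      for extra in (1, 5, 20):
          print(extra, round(prob_continues(extra, 5.0, shape, scale), 2))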

  9. Valence-quark distribution functions in the kaon and pion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Chang, Lei; Roberts, Craig D.

    2016-04-18

    We describe expressions for pion and kaon dressed-quark distribution functions that incorporate contributions from gluons which bind quarks into these mesons and hence overcome a flaw of the commonly used handbag approximation. The distributions therewith obtained are purely valence in character, ensuring that dressed quarks carry all the meson's momentum at a characteristic hadronic scale and vanish as (1-x)^2 when Bjorken-x → 1. Comparing such distributions within the pion and kaon, it is apparent that the size of SU(3)-flavor symmetry breaking in meson parton distribution functions is modulated by the flavor dependence of dynamical chiral symmetry breaking. Corrections to these leading-order formulas may be divided into two classes, responsible for shifting dressed-quark momentum into glue and sea quarks. Working with available empirical information, we build an algebraic framework that is capable of expressing the principal impact of both classes of corrections. This enables a realistic comparison with experiment which allows us to identify and highlight basic features of measurable pion and kaon valence-quark distributions. We find that roughly two thirds of the pion's light-front momentum is carried by valence dressed quarks at a characteristic hadronic scale, whereas this fraction rises to 95% in the kaon. Evolving distributions with these features to a scale typical of available Drell-Yan data produces a kaon-to-pion ratio of u-quark distributions that is in agreement with the single existing data set, and predicts a u-quark distribution within the pion that agrees with a modern reappraisal of πN Drell-Yan data. Precise new data are essential in order to validate this reappraisal and because a single modest-quality measurement of the kaon-to-pion ratio cannot be considered definitive.
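
    The valence picture quoted above can be checked numerically for a simple model form: a distribution q_v(x) proportional to x^a (1-x)^2, normalized to the valence number sum rule, carries a light-front momentum fraction <x> per quark. The sketch below evaluates that fraction; the exponent a = 0.5 is an illustrative choice (it happens to give a quark-plus-antiquark share of about two thirds), not the dressed-quark result derived in the paper.

      # Hedged sketch: momentum fraction of a model valence distribution q_v(x) ~ x^a (1-x)^2.
      from scipy.integrate import quad

      def momentum_fraction(a):
          norm, _ = quad(lambda x: x**a * (1 - x) ** 2, 0.0, 1.0)        # number sum rule normalization
          xq, _ = quad(lambda x: x * x**a * (1 - x) ** 2 / norm, 0.0, 1.0)
          return xq

      a = 0.5  # illustrative exponent, not a fitted value
      print(round(2 * momentum_fraction(a), 3))   # quark + antiquark share, about 0.667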

  10. Oligomer formation in the troposphere: from experimental knowledge to 3-D modeling

    NASA Astrophysics Data System (ADS)

    Lemaire, V.; Coll, I.; Couvidat, F.; Mouchel-Vallon, C.; Seigneur, C.; Siour, G.

    2015-10-01

    The organic fraction of atmospheric aerosols has proven to be a critical element of air quality and climate issues. However, its composition and the aging processes it undergoes remain insufficiently understood. This work builds on laboratory knowledge to simulate the formation of oligomers from biogenic secondary organic aerosol (BSOA) in the troposphere at the continental scale. We compare the results of two different modeling approaches, a 1st-order kinetic process and a pH-dependent parameterization, both implemented in the CHIMERE air quality model (AQM), to simulate the spatial and temporal distribution of oligomerized SOA over western Europe. Our results show that there is a strong dependence of the results on the selected modeling approach: while the irreversible kinetic process leads to the oligomerization of about 50 % of the total BSOA mass, the pH-dependent approach shows a broader range of impacts, with a strong dependency on environmental parameters (pH and nature of aerosol) and the possibility for the process to be reversible. In parallel, we investigated the sensitivity of each modeling approach to the representation of SOA precursor solubility (Henry's law constant values). Finally, the pros and cons of each approach for the representation of SOA aging are discussed and recommendations are provided to improve current representations of oligomer formation in AQMs.
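
    The simpler of the two approaches compared above treats oligomerization as an irreversible first-order conversion of monomeric BSOA mass. Under that assumption the oligomerized fraction follows directly from the rate constant, as in the sketch below; the rate constant used here is an arbitrary illustrative value, and the pH-dependent, potentially reversible parameterization is not shown.

      # Hedged sketch: irreversible first-order oligomerization of monomeric BSOA mass.
      import math

      def oligomerized_fraction(k_per_hour, hours):
          """Fraction of the initial monomer mass converted after `hours` (analytic solution)."""
          return 1.0 - math.exp(-k_per_hour * hours)

      for t in (6, 24, 72):
          print(t, round(oligomerized_fraction(k_per_hour=0.01, hours=t), 3))  # k is illustrative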

  11. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  12. Appendices of an appraisal for the use of geothermal energy in state-owned buildings in Colorado. Section D. Durango

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, R.T.; Coe, B.A.; Dick, J.D.

    1981-01-01

    Four state-owned building complexes have been evaluated within the city of Durango: the State Fish Hatchery, Fort Lewis College, the new State Highway Department Building near the Bodo Industrial Park, and the National Guard Building. Three of the state facilities in Durango are evaluated for geothermal systems on the assumption of taking geothermal water from a trunk line originating at the area north of Durango: the State Fish Hatchery, Fort Lewis College and the new State Highway Department Building. The National Guard Building is evaluated on the basis of a water-to-air heat pump, with warm water derived from a hypothetical shallow aquifer immediately below the building site. Two geothermal options were separately evaluated for Fort Lewis College: a central heat exchanger system for delivery of 145°F heating water to the campus buildings, and a central heat pump system for boosting the heating water to 200°F prior to delivery to the buildings; both systems require the installation of a distribution piping network for the entire campus area. Retrofit engineering for the State Fish Hatchery provides for the installation of a small-scale central distribution piping system to the several buildings, a central heat exchanger coupled to the geothermal trunk line, and the use of various fan coil and unit heaters for space heating. An option is provided for discharge-mixing the geothermal water into the fish ponds and runs in order to raise the hatchery water temperature a couple of degrees to increase fish production and yield. The heating system for the new State Highway Department Building is redesigned to replace the natural-gas-fired forced-air furnaces with a heat exchanger, hot water fan coils and unit heaters.

  13. Article and method of forming an article

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, Benjamin Paul; Kottilingam, Srikanth Chandrudu; Dutta, Sandip

    Provided are an article and a method of forming an article. The method includes providing a metallic powder, heating the metallic powder to a temperature sufficient to join at least a portion of the metallic powder to form an initial layer, sequentially forming additional layers in a build direction by providing a distributed layer of the metallic powder over the initial layer and heating the distributed layer of the metallic powder, repeating the steps of sequentially forming the additional layers in the build direction to form a portion of the article having a hollow space formed in the build direction, and forming an overhang feature extending into the hollow space. The article includes an article formed by the method described herein.

  14. Large-Eddy Simulation on Plume Dispersion within Regular Arrays of Cubic Buildings

    NASA Astrophysics Data System (ADS)

    Nakayama, H.; Jurcakova, K.; Nagai, H.

    2010-09-01

    There is a potential problem that hazardous and flammable materials are accidentally or intentionally released into the atmosphere, either within or close to populated urban areas. For the assessment of human health hazards from toxic substances, the existence of high concentration peaks in a plume should be considered. For the safety analysis of flammable gas, certain critical threshold levels should be evaluated. Therefore, in such a situation, not only average levels but also instantaneous magnitudes of concentration should be accurately predicted. However, plume dispersion is an extremely complicated process strongly influenced by the existence of buildings. In complex turbulent flows, such as impinging, separated and circulating flows around buildings, plume behavior can no longer be accurately predicted using an empirical Gaussian-type plume model. Therefore, we perform Large-Eddy Simulations (LES) of turbulent flows and plume dispersion within and over regular arrays of cubic buildings with various roughness densities and investigate the influence of the building arrangement pattern on the characteristics of mean and fluctuating concentrations. The basic equations for the LES model are composed of the spatially filtered continuity equation, Navier-Stokes equation and transport equation of concentration. The standard Smagorinsky model (Smagorinsky, 1963), which has proven adequate for environmental flows, is used, and its constant is set to 0.12 for estimating the eddy viscosity. The turbulent Schmidt number is 0.5. In our LES model, two computational regions are set up. One is a driver region for the generation of inflow turbulence and the other is a main region for LES of plume dispersion within a regular array of cubic buildings. First, inflow turbulence is generated using Kataoka's method (2002) in the driver region, and its data are then imposed at the inlet of the main computational region at each time step. In this study, cubic building arrays with λf = 0.16, 0.25 and 0.33 are set up (λf: the building frontal area index). These surface geometries consist of 20×6, 25×7 and 28×9 arrays in the streamwise and spanwise directions, respectively. Three cases of a plume source located at the ground surface behind the building in the 6th, 7th and 8th row of the building array are tested. It is found that the patterns of the dispersion behavior depending on roughness density are successfully simulated, and the spatial distributions of mean and fluctuating concentrations within and over the building arrays are also captured, in comparison with the wind tunnel experiments conducted by Bezpalcová (2008).
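
    The subgrid closure quoted above is compact enough to write down directly: the eddy viscosity is nu_t = (Cs * delta)^2 * |S| with |S| = sqrt(2 S_ij S_ij), and the eddy diffusivity for concentration follows from the turbulent Schmidt number. The sketch below evaluates it for a single grid point; the velocity-gradient values are arbitrary illustrative numbers, not output of the simulations described here.

      # Hedged sketch: Smagorinsky eddy viscosity and eddy diffusivity at one grid point.
      import numpy as np

      def smagorinsky_nu_t(grad_u, delta, cs=0.12):
          """grad_u: 3x3 array of du_i/dx_j [1/s]; delta: filter width [m]; cs: Smagorinsky constant."""
          s_ij = 0.5 * (grad_u + grad_u.T)                 # strain-rate tensor
          s_mag = np.sqrt(2.0 * np.sum(s_ij * s_ij))       # |S| = sqrt(2 S_ij S_ij)
          return (cs * delta) ** 2 * s_mag                 # eddy viscosity [m^2/s]

      grad_u = np.array([[0.10, 0.05, 0.00],               # illustrative velocity gradients
                         [0.02, -0.08, 0.01],
                         [0.00, 0.03, -0.02]])
      nu_t = smagorinsky_nu_t(grad_u, delta=2.0)
      kappa_t = nu_t / 0.5                                 # eddy diffusivity with Sc_t = 0.5
      print(round(nu_t, 5), round(kappa_t, 5))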

  15. Quantifying the Dependencies of Rooftop Temperatures on Albedo

    NASA Technical Reports Server (NTRS)

    Dominquez, Anthony; Kleissl, Jan; Luvall, Jeff

    2009-01-01

    The thermal properties of building materials directly affect the conditions inside buildings. Heat transfer is not a primary design driver in building design. Rooftop modifications lower heat transfer, which lowers energy consumption and costs. The living environmental laboratory attitude at UCSD makes it the perfect place to test the success of these modifications.

  16. High-School Buildings and Grounds. Bulletin, 1922, No. 23

    ERIC Educational Resources Information Center

    Bureau of Education, Department of the Interior, 1922

    1922-01-01

    The success of any high school depends largely upon the planning of its building. The wise planning of a high-school building requires familiarity with school needs and processes, knowledge of the best approved methods of safety, lighting, sanitation, and ventilation, and ability to solve the educational, structural, and architectural problems…

  17. The Role of Leadership in Facilitating Organisational Learning and Collective Capacity Building

    ERIC Educational Resources Information Center

    Piranfar, Hosein

    2007-01-01

    The paper examines the role of leadership in facilitating collective learning and capacity building by utilising ideas from the fields of evolutionary learning, operations strategy, quality, project and risk management. Two contrasting cases are chosen to show how success and failure can depend upon collective capacity building through…

  18. Footprint Map Partitioning Using Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Xiong, B.; Oude Elberink, S.; Vosselman, G.

    2016-06-01

    Nowadays many cities and countries are creating 3D building models for better daily management and smarter decision making. The newly created 3D models are required to be consistent with existing 2D footprint maps. Therefore, the 2D maps are usually combined with height data for the task of 3D reconstruction. Buildings are often composed of parts that are discontinuous in height. Building parts can be reconstructed independently and combined into a complete building. Therefore, most of the state-of-the-art work on 3D building reconstruction first decomposes a footprint map into parts. However, those works usually change the footprint maps for easier partitioning and cannot detect building parts that lie fully inside the footprint polygon. In order to solve those problems, we introduce two methodologies, one more dependent on height data and the other more dependent on footprints. We also experimentally evaluate the two methodologies and compare their advantages and disadvantages. The experiments use Airborne Laser Scanning (ALS) data and two vector maps, one at 1:10,000 scale and another at 1:500 scale.

  19. Regional distribution and losses of end-of-life steel throughout multiple product life cycles-Insights from the global multiregional MaTrace model.

    PubMed

    Pauliuk, Stefan; Kondo, Yasushi; Nakamura, Shinichiro; Nakajima, Kenichi

    2017-01-01

    Substantial amounts of post-consumer scrap are exported to other regions or lost during recovery and remelting, and both export and losses constrain the goal of regionally closed material cycles. To quantify the challenges and trade-offs associated with closed-loop metal recycling, we looked at the material cycles from the perspective of a single material unit and traced a unit of material through several product life cycles. Focusing on steel, we used current process parameters, loss rates, and trade patterns of the steel cycle to study how steel that was originally contained in high-quality applications with stringent purity requirements, such as machinery or vehicles, gets subsequently distributed across different regions and product groups with less stringent purity requirements, such as building and construction. We applied MaTrace Global, a supply-driven multiregional model of steel flows coupled to a dynamic stock model of steel use. We found that, depending on region and product group, up to 95% of the steel consumed today will leave the use phase of that region by 2100, and that up to 50% can get lost in obsolete stocks, landfills, or slag piles by 2100. The high losses resulting from business-as-usual scrap recovery and recycling can be reduced, both by diverting post-consumer scrap into long-lived applications such as buildings and by improving the recovery rates in the waste management and remelting industries. Because the lifetimes of high-quality (cold-rolled) steel applications are shorter and remelting occurs more often than for buildings and infrastructure, we found and quantified a trade-off between low losses and high-quality applications in the steel cycle. Furthermore, we found that with current trade patterns, reduced overall losses will lead to higher fractions of secondary steel being exported to other regions. Current loss rates, product lifetimes, and trade patterns impede the closure of the steel cycle.
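
    As a toy illustration of the supply-driven tracing idea described above, the sketch below follows one unit of steel through repeated end-of-life and remelting steps in a single region, with a fixed recovery rate and fixed allocation of secondary steel to product groups. All lifetimes, rates and shares are invented placeholders, not MaTrace Global parameters, and the regional trade dimension is omitted.

```python
import numpy as np

# Hypothetical product groups, lifetimes, recovery rate and allocation of secondary steel.
products = ["vehicles", "machinery", "buildings"]
lifetime = np.array([15.0, 25.0, 60.0])   # mean service life in years (illustrative)
recovery = 0.85                           # share of end-of-life steel recovered and remelted
alloc = np.array([0.2, 0.2, 0.6])         # destination shares of remelted (secondary) steel

def trace_unit(initial=np.array([0.5, 0.5, 0.0]), years=100):
    """Follow one unit of steel with exponential outflows; yearly time steps."""
    in_use = initial.astype(float)
    lost = 0.0
    for _ in range(years):
        outflow = in_use / lifetime                # end-of-life flow this year
        in_use = in_use - outflow
        scrap = outflow.sum()
        lost += scrap * (1.0 - recovery)           # obsolete stocks, landfill, slag
        in_use += scrap * recovery * alloc         # secondary steel re-enters use
    return in_use, lost

shares, losses = trace_unit()
print(dict(zip(products, shares.round(3))), "cumulative losses:", round(losses, 3))
```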

  20. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  1. A New Distributed Optimization for Community Microgrids Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Tomsovic, Kevin

    This paper proposes a distributed optimization model for community microgrids considering the building thermal dynamics and customer comfort preferences. The microgrid central controller (MCC) minimizes the total cost of operating the community microgrid, including fuel cost, purchasing cost, battery degradation cost and voluntary load shedding cost based on the customers' consumption, while the building energy management systems (BEMS) minimize their electricity bills as well as the cost associated with customer discomfort due to room temperature deviation from the set point. The BEMSs and the MCC exchange information on energy consumption and prices. When the optimization converges, the distributed generation scheduling, energy storage charging/discharging and customers' consumption, as well as the energy prices, are determined. In particular, we integrate the detailed thermal dynamic characteristics of buildings into the proposed model. The heating, ventilation and air-conditioning (HVAC) systems can be scheduled intelligently to reduce the electricity cost while maintaining the indoor temperature in the comfort range set by customers. Numerical simulation results show the effectiveness of the proposed model.
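
    The abstract describes an iterative exchange of prices and consumption between the central controller and the building energy management systems. The sketch below shows that coordination pattern in its simplest form: the controller adjusts a single price until the buildings' price-responsive demand matches the available supply. The closed-form building response and all numbers are illustrative placeholders, not the model in the paper.

```python
import numpy as np

# Toy distributed scheduling: the central controller (MCC) posts a price signal,
# each building EMS responds with its preferred consumption, and the price is
# updated until total demand matches the available local supply.
supply = 90.0                                   # kW available from local generation
comfort_load = np.array([40.0, 35.0, 30.0])     # kW each building would draw if power were free
flexibility = np.array([8.0, 5.0, 3.0])         # kW reduction per $/kWh of price (discomfort trade-off)

price, step = 0.0, 0.005
for it in range(200):
    # Each BEMS responds to the posted price (closed form of a simple quadratic sub-problem).
    demand = np.maximum(comfort_load - flexibility * price, 0.0)
    mismatch = demand.sum() - supply
    if abs(mismatch) < 1e-3:
        break
    price += step * mismatch                    # MCC raises the price if demand exceeds supply

print(f"price {price:.3f} $/kWh, demand {demand.round(2)} kW, total {demand.sum():.2f} kW")
```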

  2. Skyshine study for next generation of fusion devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gohar, Y.; Yang, S.

    1987-02-01

    A shielding analysis for the next generation of fusion devices (ETR/INTOR) was performed to study the dose equivalent outside the reactor building during operation, including the contribution from neutrons and photons scattered back by collisions with air nuclei (the skyshine component). Two different three-dimensional geometrical models for a tokamak fusion reactor based on INTOR design parameters were developed for this study. In the first geometrical model, the reactor geometry and the spatial distribution of the deuterium-tritium neutron source were simplified for a parametric survey. The second geometrical model employed an explicit representation of the toroidal geometry of the reactor chamber and the spatial distribution of the neutron source. The MCNP general Monte Carlo code for neutron and photon transport was used to perform all the calculations. The energy distribution of the neutron source was used explicitly in the calculations with ENDF/B-V data. The dose equivalent results were analyzed as a function of the concrete roof thickness of the reactor building and the location outside the reactor building.

  3. Bioclimate and city planning - open space planning

    NASA Astrophysics Data System (ADS)

    Mertens, Elke

    The planning and use of open spaces in urban areas very much depend on the shading cast by the surrounding building structures. This article presents a method for investigating the sunlight and bioclimatic conditions as a function of the surrounding buildings. It is illustrated for typical courtyards in Berlin, Germany, as one type of open space. The programme HelioDat determines the shading of any spot in an open space and gives the possible duration of direct sunlight at the selected spot for each day of the year. The sunlight conditions in the courtyards differ considerably from one another, depending on their size and the height of the surrounding buildings. The calculation of the PMV on the basis of the HelioDat results determines the bioclimatic situation in the courtyards discussed. Although the HelioDat results are only one input alongside the weather conditions and the personal characteristics of the test person, the bioclimatic conditions correlate strongly with the sunlight conditions. In a projected building structure, the sunlight conditions vary considerably between the present situation and the two architectural alternatives. Since the bioclimatic situation is correlated with the sunlight conditions, this example demonstrates the importance of investigating the sunlight conditions and the bioclimate already during the planning process for buildings.

  4. Three-dimensional vapor intrusion modeling approach that combines wind and stack effects on indoor, atmospheric, and subsurface domains.

    PubMed

    Shirazi, Elham; Pennell, Kelly G

    2017-12-13

    Vapor intrusion (VI) exposure risks are difficult to characterize due to the role of atmospheric, building and subsurface processes. This study presents a three-dimensional VI model that extends the common subsurface fate and transport equations to incorporate wind and stack effects on indoor air pressure, building air exchange rate (AER) and indoor contaminant concentration to improve VI exposure risk estimates. The model incorporates three modeling programs: (1) COMSOL Multiphysics to model subsurface fate and transport processes, (2) CFD0 to model atmospheric air flow around the building, and (3) CONTAM to model indoor air quality. The combined VI model predicts AER values, zonal indoor air pressures and zonal indoor air contaminant concentrations as a function of wind speed, wind direction and outdoor and indoor temperature. Steady state modeling results for a single-story building with a basement demonstrate that wind speed, wind direction and opening locations in a building play important roles in changing the AER, indoor air pressure, and indoor air contaminant concentration. Calculated indoor air pressures ranged from approximately -10 Pa to +4 Pa depending on weather conditions and building characteristics. AER values, mass entry rates and indoor air concentrations vary depending on weather conditions and building characteristics. The presented modeling approach can be used to investigate the relationship between building features, AER, building pressures, soil gas concentrations, indoor air concentrations and VI exposure risks.

  5. Building 736, second floor, view to southwest showing original window ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Building 736, second floor, view to southwest showing original window on the left, and electrical controls and entry ladder on the right - Mare Island Naval Shipyard, Electrical Distribution Centers, Railroad Avenue near Eighteenth Street, Vallejo, Solano County, CA

  6. Linked sustainability challenges and trade-offs among fisheries, aquaculture and agriculture.

    PubMed

    Blanchard, Julia L; Watson, Reg A; Fulton, Elizabeth A; Cottrell, Richard S; Nash, Kirsty L; Bryndum-Buchholz, Andrea; Büchner, Matthias; Carozza, David A; Cheung, William W L; Elliott, Joshua; Davidson, Lindsay N K; Dulvy, Nicholas K; Dunne, John P; Eddy, Tyler D; Galbraith, Eric; Lotze, Heike K; Maury, Olivier; Müller, Christoph; Tittensor, Derek P; Jennings, Simon

    2017-09-01

    Fisheries and aquaculture make a crucial contribution to global food security, nutrition and livelihoods. However, the UN Sustainable Development Goals separate marine and terrestrial food production sectors and ecosystems. To sustainably meet increasing global demands for fish, the interlinkages among goals within and across fisheries, aquaculture and agriculture sectors must be recognized and addressed along with their changing nature. Here, we assess and highlight development challenges for fisheries-dependent countries based on analyses of interactions and trade-offs between goals focusing on food, biodiversity and climate change. We demonstrate that some countries are likely to face double jeopardies in both fisheries and agriculture sectors under climate change. The strategies to mitigate these risks will be context-dependent, and will need to directly address the trade-offs among Sustainable Development Goals, such as halting biodiversity loss and reducing poverty. Countries with low adaptive capacity but increasing demand for food require greater support and capacity building to transition towards reconciling trade-offs. Necessary actions are context-dependent and include effective governance, improved management and conservation, maximizing societal and environmental benefits from trade, increased equitability of distribution and innovation in food production, including continued development of low input and low impact aquaculture.

  7. Modeling the Delivery Physiology of Distributed Learning Systems.

    ERIC Educational Resources Information Center

    Paquette, Gilbert; Rosca, Ioan

    2003-01-01

    Discusses instructional delivery models and their physiology in distributed learning systems. Highlights include building delivery models; types of delivery models, including distributed classroom, self-training on the Web, online training, communities of practice, and performance support systems; and actors (users) involved, including experts,…

  8. 46 CFR 386.15 - Distribution of handbills.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 46 (Shipping), Maritime Administration, Department of Transportation; Miscellaneous Regulations Governing Public Buildings and Grounds at the United States Merchant Marine Academy; § 386.15 Distribution of handbills. The...

  9. Conceptual Model Scenarios for the Vapor Intrusion Pathway

    EPA Pesticide Factsheets

    This report provides simplified simulation examples to illustrate graphically how subsurface conditions and building-specific characteristics determine the chemical distribution and indoor air concentration relative to a source concentration.

  10. Assessment of Distributed Generation Potential in Japanese Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Nan; Marnay, Chris; Firestone, Ryan

    2005-05-25

    To meet growing energy demands, energy efficiency, renewable energy, and on-site generation coupled with effective utilization of exhaust heat will all be required. Additional benefit can be achieved by integrating these distributed technologies into distributed energy resource (DER) systems (or microgrids). This research investigates a method of choosing economically optimal DER, expanding on prior studies at the Berkeley Lab using the DER design optimization program, the Distributed Energy Resources Customer Adoption Model (DER-CAM). DER-CAM finds the optimal combination of installed equipment from available DER technologies, given prevailing utility tariffs, site electrical and thermal loads, and a menu of available equipment. It provides a global optimization, albeit idealized, that shows how the site energy loads can be served at minimum cost by selection and operation of on-site generation, heat recovery, and cooling. Five prototype Japanese commercial buildings are examined and DER-CAM applied to select the economically optimal DER system for each. The five building types are office, hospital, hotel, retail, and sports facility. Based on the optimization results, energy and emission reductions are evaluated. Furthermore, a Japan-U.S. comparison study of policy, technology, and utility tariffs relevant to DER installation is presented. Significant decreases in fuel consumption, carbon emissions, and energy costs were seen in the DER-CAM results. Savings were most noticeable in the sports facility (a very favourable CHP site), followed by the hospital, hotel, and office building.
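
    As a much-simplified stand-in for the kind of least-cost selection and dispatch problem DER-CAM solves, the toy linear program below meets a daily electric load from grid purchases and a capacity-limited on-site CHP unit. Prices, demand and capacity are invented; a real DER-CAM run optimizes many technologies, tariffs and time steps simultaneously.

```python
from scipy.optimize import linprog

# Meet a daily electricity demand at minimum cost from utility purchases and an
# on-site CHP unit of limited capacity. All numbers are illustrative placeholders.
demand_kwh = 1000.0          # daily electric load
chp_cap_kwh = 600.0          # maximum daily CHP output
c = [0.20, 0.12]             # $/kWh: [grid purchase, CHP generation incl. fuel]

# Variables x = [grid_kwh, chp_kwh]; linprog uses A_ub @ x <= b_ub.
A_ub = [[-1.0, -1.0],        # -(grid + chp) <= -demand  (serve the whole load)
        [0.0, 1.0]]          #  chp <= capacity
b_ub = [-demand_kwh, chp_cap_kwh]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
grid_kwh, chp_kwh = res.x
print(f"grid {grid_kwh:.0f} kWh, CHP {chp_kwh:.0f} kWh, daily cost ${res.fun:.2f}")
```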

  11. Infilling and quality checking of discharge, precipitation and temperature data using a copula based approach

    NASA Astrophysics Data System (ADS)

    Anwar, Faizan; Bárdossy, András; Seidel, Jochen

    2017-04-01

    Estimating missing values in a time series of a hydrological variable is an everyday task for a hydrologist. Existing methods such as inverse distance weighting, multivariate regression, and kriging, though simple to apply, provide no indication of the quality of the estimated value and depend mainly on the values of neighboring stations at a given step in the time series. Copulas have the advantage of representing the pure dependence structure between two or more variables (given the relationship between them is monotonic). They remove the need to transform the data before use or to specify functions that model the relationship between the considered variables. A copula-based approach is suggested to infill discharge, precipitation, and temperature data. As a first step the normal copula is used; subsequently, the necessity of using non-normal / non-symmetrical dependence is investigated. Discharge and temperature are treated as regular continuous variables and can be used without processing for infilling and quality checking. Due to its mixed distribution, precipitation has to be treated differently. This is done by assigning a discrete probability to the zeros and treating the rest as a continuous distribution. Building on the work of others, along with infilling, the normal copula is also utilized to identify values in a time series that might be erroneous. This is done by treating the available value as missing, infilling it using the normal copula and checking whether it lies within a confidence band (5 to 95% in our case) of the obtained conditional distribution. Hydrological data from two catchments, the Upper Neckar River (Germany) and the Santa River (Peru), are used to demonstrate the application for datasets with different data quality. The Python code used here is also made available on GitHub. The required input is the time series of a given variable at different stations.
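
    A minimal bivariate version of the normal-copula infilling and 5-95% plausibility band described above is sketched below, using a single neighbouring station and ignoring the special treatment of precipitation zeros. The synthetic data, the Weibull plotting positions and the use of empirical quantiles for the back-transform are assumptions of this sketch, not necessarily the choices made in the study.

```python
import numpy as np
from scipy.stats import norm, rankdata

def to_normal_scores(x):
    """Empirical CDF (Weibull plotting positions) mapped to standard normal scores."""
    u = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(u)

def infill_normal_copula(target, neighbour, z_nbr_new, band=(0.05, 0.95)):
    """Estimate a missing target value and a plausibility band from one neighbour.

    target, neighbour : paired observations where both stations have data
    z_nbr_new         : normal score of the neighbour at the missing time step
    """
    z_t, z_n = to_normal_scores(target), to_normal_scores(neighbour)
    rho = np.corrcoef(z_t, z_n)[0, 1]                 # dependence in Gaussian space

    # Conditional distribution Z_target | Z_neighbour = z is N(rho*z, 1 - rho^2).
    mu, sd = rho * z_nbr_new, np.sqrt(1.0 - rho**2)
    q = norm.ppf([band[0], 0.5, band[1]], loc=mu, scale=sd)

    # Map conditional quantiles back to data space via the target's empirical quantiles.
    probs = norm.cdf(q)
    lo, med, hi = np.quantile(np.asarray(target, float), probs)
    return med, (lo, hi)

# Example with synthetic, correlated daily discharges (illustrative only).
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=500)
station_a, station_b = np.exp(z[:, 0]), np.exp(z[:, 1])
value, (low, high) = infill_normal_copula(station_a, station_b, to_normal_scores(station_b)[-1])
print(f"infilled {value:.2f}  (5-95% band: {low:.2f} to {high:.2f})")
```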

  12. Estimation of effective temperatures in quantum annealers for sampling applications: A case study with possible applications in deep learning

    NASA Astrophysics Data System (ADS)

    Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro

    2016-08-01

    An increase in the efficiency of sampling from Boltzmann distributions would have a significant impact on deep learning and other machine-learning applications. Recently, quantum annealers have been proposed as a potential candidate to speed up this task, but several limitations still bar these state-of-the-art technologies from being used effectively. One of the main limitations is that, while the device may indeed sample from a Boltzmann-like distribution, quantum dynamical arguments suggest it will do so with an instance-dependent effective temperature, different from its physical temperature. Unless this unknown temperature can be unveiled, it might not be possible to effectively use a quantum annealer for Boltzmann sampling. In this work, we propose a strategy to overcome this challenge with a simple effective-temperature estimation algorithm. We provide a systematic study assessing the impact of the effective temperatures in the learning of a special class of a restricted Boltzmann machine embedded on quantum hardware, which can serve as a building block for deep-learning architectures. We also provide a comparison to k -step contrastive divergence (CD-k ) with k up to 100. Although assuming a suitable fixed effective temperature also allows us to outperform one-step contrastive divergence (CD-1), only when using an instance-dependent effective temperature do we find a performance close to that of CD-100 for the case studied here.

  13. Assessment of the Seismic Risk in the City of Yerevan and its Mitigation by Application of Innovative Seismic Isolation Technologies

    NASA Astrophysics Data System (ADS)

    Melkumyan, Mikayel G.

    2011-03-01

    It is obvious that the problem of precise assessment and/or analysis of seismic hazard (SHA) is quite a serious issue, and seismic risk reduction considerably depends on it. It is well known that there are two approaches in seismic hazard analysis, namely, deterministic (DSHA) and probabilistic (PSHA). The latter utilizes statistical estimates of earthquake parameters. However, they may not exist in a specific region, and using PSHA it is difficult to take into account local aspects, such as specific regional geology and site effects, with sufficient precision. For this reason, DSHA is preferable in many cases. After the destructive 1988 Spitak earthquake, the SHA of the territory of Armenia has been revised and increased. The distribution pattern for seismic risk in Armenia is given. Maximum seismic risk is concentrated in the region of the capital, the city of Yerevan, where 40% of the republic's population resides. We describe the method used for conducting seismic resistance assessment of the existing reinforced concrete (R/C) buildings. Using this assessment, as well as GIS technology, the coefficients characterizing the seismic risk of destruction were calculated for almost all buildings of Yerevan City. The results of the assessment are presented. It is concluded that, presently, there is a particularly pressing need for strengthening existing buildings. We then describe non-conventional approaches to upgrading the earthquake resistance of existing multistory R/C frame buildings by means of Additional Isolated Upper Floor (AIUF) and of existing stone and frame buildings by means of base isolation. In addition, innovative seismic isolation technologies were developed and implemented in Armenia for construction of new multistory multifunctional buildings. The advantages of these technologies are listed in the paper. It is worth noting that the aforementioned technologies were successfully applied for retrofitting an existing 100-year-old bank building in Irkutsk (Russia), for retrofit design of an existing 177-year-old municipality building in Iasi (Romania) and for construction of a new clinic building in Stepanakert (Nagorno Karabakh). Short descriptions of these projects are presented. Since 1994 the total number of base and roof isolated buildings constructed, retrofitted or under construction in Armenia, has reached 32. Statistics of seismically isolated buildings are given in the paper. The number of base isolated buildings per capita in Armenia is one of the highest in the world. In Armenia, for the first time in history, retrofitting of existing buildings by base isolation was carried out without interruption in the use of the buildings. The description of different base isolated buildings erected in Armenia, as well as the description of the method of retrofitting of existing buildings which is patented in Armenia (M. G. Melkumyan, patent of the Republic of Armenia No. 579), are also given in the paper.

  14. THE RELATIONSHIP BETWEEN THE CORROSION OF DISTRIBUTION SYSTEM MATERIALS, AND OXIDANT AND REDOX POTENTIAL

    EPA Science Inventory

    Scale build-up, corrosion rate, and metal release associated with drinking water distribution system pipes have been suggested to relate to the oxidant type and concentration. Conversely, different distribution system metals may exert different oxidant demands. The impact of ox...

  15. Shear-lag effect and its effect on the design of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Thanh Dat, Bui; Traykov, Alexander; Traykova, Marina

    2018-03-01

    For super high-rise buildings, the analysis and selection of suitable structural solutions are very important. The structure must not only carry the gravity loads (self-weight, live load, etc.), but also resist lateral loads (wind and earthquake loads). As buildings become taller, the demand on different structural systems dramatically increases. The article considers the division of the structural systems of tall buildings into two main categories - interior structures, for which the major part of the lateral load resisting system is located within the interior of the building, and exterior structures, for which the major part of the lateral load resisting system is located at the building perimeter. The basic types of each of the main structural categories are described. In particular, the framed tube structures, which belong to the second main category of exterior structures, seem to be very efficient. That type of structural system allows tall buildings to resist the lateral loads. However, such tube systems are affected by the shear-lag effect - a nonlinear distribution of stresses across the sides of the section, commonly found in box girders under lateral loads. Based on a numerical example, some general conclusions about the influence of the shear-lag effect on the frequencies, periods, and the distribution and variation of the magnitude of the internal forces in the structure are presented.

  16. What do we gain from simplicity versus complexity in species distribution models?

    USGS Publications Warehouse

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.

  17. Collaborative Knowledge Building with Wikis: The Impact of Redundancy and Polarity

    ERIC Educational Resources Information Center

    Moskaliuk, Johannes; Kimmerle, Joachim; Cress, Ulrike

    2012-01-01

    Wikis as shared digital artifacts may enable users to participate in processes of knowledge building. To what extent and with which quality knowledge building can take place is assumed to depend on the interrelation between people's prior knowledge and the information available in a wiki. In two experimental studies we examined the impact on…

  18. Continuation Power Flow Analysis for PV Integration Studies at Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiyu; Zhu, Xiangqi; Lubkeman, David L.

    2017-10-30

    This paper presents a method for conducting continuation power flow simulation on high-solar-penetration distribution feeders. A load disaggregation method is developed to disaggregate the daily feeder load profiles collected at substations down to each load node, where the electricity consumption of residential houses and commercial buildings is modeled using actual data collected from single-family houses and commercial buildings. This allows the modeling of power flow and voltage profile along a distribution feeder in a continuous fashion for a 24-hour period at minute-by-minute resolution. By separating the feeder into load zones based on the distance between the load node and the feeder head, we studied the impact of PV penetration on distribution grid operation in different seasons and under different weather conditions for different PV placements.
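
    The load disaggregation step can be illustrated with a simple proportional allocation: prototype building profiles and customer counts define a load shape for each node, and the nodes are scaled so that their sum reproduces the measured feeder-head profile. The profiles, counts and synthetic feeder measurement below are placeholders, not the data used in the paper.

```python
import numpy as np

# Proportional disaggregation of a feeder-head profile to load nodes using
# prototype building profiles (kW per customer) and customer counts per node.
minutes = np.arange(24 * 60)
residential = 1.0 + 0.6 * np.sin(2 * np.pi * (minutes - 18 * 60) / (24 * 60))  # evening peak
commercial = 1.5 + 1.0 * np.exp(-((minutes - 13 * 60) / 180.0) ** 2)           # midday peak

customers = {"node_1": (40, 2), "node_2": (25, 6), "node_3": (60, 0)}           # (res, com) counts
node_shape = {n: r * residential + c * commercial for n, (r, c) in customers.items()}
total_shape = sum(node_shape.values())

# Stand-in for the minute-resolution profile recorded at the substation.
feeder_head_kw = 1.05 * total_shape

# Scale each node so that the node profiles sum exactly to the feeder-head profile.
node_load = {n: shape / total_shape * feeder_head_kw for n, shape in node_shape.items()}
print({n: round(p.max(), 1) for n, p in node_load.items()})                     # peak kW per node
```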

  19. A wind tunnel study on the effect of trees on PM2.5 distribution around buildings.

    PubMed

    Ji, Wenjing; Zhao, Bin

    2018-03-15

    Vegetation, especially trees, is effective in reducing the concentration of particulate matter. Trees can efficiently capture particles, improve urban air quality, and may further decrease the introduction of outdoor particles to indoor air. The objective of this study is to investigate the effects of trees on particle distribution and removal around buildings using wind tunnel experiments. The wind tunnel is 18m long, 12m wide, and 3.5m high. Trees were modeled using real cypress branches to mimic trees planted around buildings. At the inlet of the wind tunnel, a "line source" of particles was released, simulating air laden with particulate matter. Experiments with the cypress tree and tree-free models were conducted to compare particle concentrations around the buildings. The results indicate that cypress trees clearly reduce PM 2.5 concentrations compared with the tree-free model. The cypress trees enhanced the PM 2.5 removal rate by about 20%. The effects of trees on PM 2.5 removal and distribution vary at different heights. At the base of the trees, their effect on reducing PM 2.5 concentrations is the most significant. At a great height above the treetops, the effect is almost negligible.

  20. [Spatial distribution characteristics of urban potential population in Shenyang City based on QuickBird image and GIS].

    PubMed

    Li, Jun-Ying; Hu, Yuan-Man; Chen, Wei; Liu, Miao; Hu, Jian-Bo; Zhong, Qiao-Lin; Lu, Ning

    2012-06-01

    Population is the most active factor affecting city development. To understand the distribution characteristics of urban population is of significance for making city policy decisions and for optimizing the layout of various urban infrastructures. In this paper, the information of the residential buildings in Shenyang urban area was extracted from the QuickBird remote sensing images, and the spatial distribution characteristics of the population within the Third-Ring Road of the City were analyzed, according to the social and economic statistics data. In 2010, the population density in different types of residential buildings within the Third-Ring Road of the City decreased in the order of high-storey block, mixed block, mixed garden, old multi-storey building, high-storey garden, multi-storey block, multi-storey garden, villa block, shanty, and villa garden. The vacancy rate of the buildings within the Third-Ring Road was more than 30%, meaning that the real estate market was seriously overstocked. Among the five Districts of Shenyang City, Shenhe District had the highest potential population density, while Tiexi District and Dadong District had a lower one. The gravity center of the City and its five Districts was also analyzed, which could provide basic information for locating commercial facilities and planning city infrastructure.

  1. Green Building Premium Cost Analysis in Indonesia Using Work Breakdown Structure Method

    NASA Astrophysics Data System (ADS)

    Basten, V.; Latief, Y.; Berawi, M. A.; Riswanto; Muliarto, H.

    2018-03-01

    The concept of green building in the construction industry is indispensable for mitigating environmental issues such as waste, pollution, and carbon emissions. Several countries have green building rating tools; Indonesia in particular has the Greenship rating tool, but the number of green buildings remains relatively low. Since the development of building construction depends on the initiative of building investors or owners, this research was conducted to identify the building aspects that most strongly affect the attractiveness of the green building concept. The method used in this research is the work breakdown structure method, which details the green building activities. These activities are then processed to obtain the cost elements required to achieve a green building that performs better than a conventional building. The final result of the study is the identification of the most significant work packages for green building construction in the Indonesian city case study.

  2. Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    Research and Development Center, Construction Engineering Research Laboratory (ERDC-CERL) to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains... developed experimental BIM models using commercial off-the-shelf (COTS) software. Those models represent three types of typical low-rise Army

  3. The Make 2D-DB II package: conversion of federated two-dimensional gel electrophoresis databases into a relational format and interconnection of distributed databases.

    PubMed

    Mostaguir, Khaled; Hoogland, Christine; Binz, Pierre-Alain; Appel, Ron D

    2003-08-01

    The Make 2D-DB tool has been previously developed to help build federated two-dimensional gel electrophoresis (2-DE) databases on one's own web site. The purpose of our work is to extend the strength of the first package and to build a more efficient environment. Such an environment should be able to fulfill the different needs and requirements arising from both the growing use of 2-DE techniques and the increasing amount of distributed experimental data.

  4. DSSTOX WEBSITE LAUNCH: IMPROVING PUBLIC ACCESS TO DATABASES FOR BUILDING STRUCTURE-TOXICITY PREDICTION MODELS

    EPA Science Inventory

    DSSTox Website Launch: Improving Public Access to Databases for Building Structure-Toxicity Prediction Models
    Ann M. Richard
    US Environmental Protection Agency, Research Triangle Park, NC, USA

    Distributed: Decentralized set of standardized, field-delimited databases,...

  5. Needs Analysis in Belgium's Flemish Community.

    ERIC Educational Resources Information Center

    Leemans, Geert

    1999-01-01

    Describes the methodology used by a Belgian community to determine the community's building needs at all levels of education. Explains how the inquiry evaluated building stock, needs, and effects; and offers recommendations for increasing investment funds, distributing resources, relying on experts' reports, budgeting for resources, and increasing…

  6. 16. BUILDING NO. 445, PHYSICS LAB (FORMERLY GUN BAG LOADING), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. BUILDING NO. 445, PHYSICS LAB (FORMERLY GUN BAG LOADING), INTERIOR, SECOND LEVEL. LOOKING UP AT POWDER AND DISTRIBUTION TUBES. ELEVATOR SHAFT ON LEFT. - Picatinny Arsenal, 400 Area, Gun Bag Loading District, State Route 15 near I-80, Dover, Morris County, NJ

  7. Exposure Modeling for Polychlorinated Biphenyls in School Buildings

    EPA Science Inventory

    There is limited research on characterizing exposures from PCB sources for occupants of school buildings. PCB measurement results from six schools were used to estimate potential exposure distributions for four age groups (4-5, 6-10, 11-14, 14-18 year-olds) using the Stochastic...

  8. The Principal's Roles in Building Capacity for Change

    ERIC Educational Resources Information Center

    Butler, Sara Griffiths

    2017-01-01

    This qualitative dissertation explores how the principal's leadership roles build capacity for change and how chaos theory contributes to this understanding. The roles studied include distributive leadership, moral leadership, social justice leadership, democratic leadership, and instructional leadership. The tenets of chaos theory examined…

  9. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.
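
    A toy version of the scoring pipeline described above is sketched below: Latin hypercube sampling over a few building parameters produces an EUI distribution for one use type, which is then turned into a stepped 10-point scale. The EUI formula, parameter ranges and bin construction are placeholders, not the Asset Score engine.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample over three envelope/system parameters (toy ranges).
sampler = qmc.LatinHypercube(d=3, seed=42)
lo, hi = [0.2, 0.5, 2.0], [1.2, 3.5, 10.0]          # wall U-value W/m2K; ACH; lighting W/m2
params = qmc.scale(sampler.random(n=5000), lo, hi)

def toy_source_eui(u_wall, ach, lpd):
    """Very rough annual source EUI (kBtu/ft2) as a linear mix of the three drivers."""
    return 25.0 + 30.0 * u_wall + 8.0 * ach + 3.0 * lpd

eui_dist = toy_source_eui(*params.T)

def asset_score(eui, dist, points=10):
    """Lower EUI gives a higher score; stepped scale from equal-probability bins."""
    edges = np.quantile(dist, np.linspace(0, 1, points + 1))
    return points - np.searchsorted(edges[1:-1], eui, side="right")

print(asset_score(55.0, eui_dist), asset_score(110.0, eui_dist))
```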

  10. Understanding Human Perception of Building Categories in Virtual 3d Cities - a User Study

    NASA Astrophysics Data System (ADS)

    Tutzauer, P.; Becker, S.; Niese, T.; Deussen, O.; Fritsch, D.

    2016-06-01

    Virtual 3D cities are becoming increasingly important as a means of visually communicating diverse urban-related information. To get a deeper understanding of a human's cognitive experience of virtual 3D cities, this paper presents a user study on the human ability to perceive building categories (e.g. residential home, office building, building with shops etc.) from geometric 3D building representations. The study reveals various dependencies between geometric properties of the 3D representations and the perceptibility of the building categories. Knowledge about which geometries are relevant, helpful or obstructive for perceiving a specific building category is derived. The importance and usability of such knowledge is demonstrated based on a perception-guided 3D building abstraction process.

  11. Towards a comprehensive city emission function (CCEF)

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav

    2018-01-01

    The comprehensive city emission function (CCEF) is developed for heterogeneous light-emitting or light-blocking urban environments, embracing any combination of input parameters that characterize linear dimensions in the system (size and distances between buildings or luminaires), properties of light-emitting elements (such as luminous building façades and street lighting), ground reflectance and total uplight fraction, all of these defined for an arbitrarily sized 2D area. The analytical formula obtained is not restricted to a single model class, as it can capture any specific light-emission feature for a wide range of cities. The CCEF method is numerically fast in contrast to what can be expected of other probabilistic approaches that rely on repeated random sampling. Hence the present solution has great potential in light-pollution modeling and can be included in larger numerical models. Our theoretical findings promise great progress in light-pollution modeling, as this is the first time an analytical solution to the city emission function (CEF) has been developed that depends on the statistical mean size and height of city buildings, inter-building separation, prevailing heights of light fixtures, lighting density, and other factors such as luminaire light output and light distribution, including the amount of uplight, and representative city size. The model is validated for sensitivity and specificity pertinent to combinations of input parameters in order to test its behavior under various conditions, including those that can occur in complex urban environments. It is demonstrated that the solution model succeeds in reproducing a light emission peak at some elevated zenith angles and is consistent with reduced rather than enhanced emission in directions nearly parallel to the ground.

  12. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output.
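
    The general workflow (Monte Carlo perturbation of inputs, per-site coefficient of variation, Spearman rank correlation and tertile-based weighted kappa against measurements) can be sketched as below. The stand-in exposure model and all error magnitudes are invented for illustration and do not represent the propagation model used in the study.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n_sites, n_draws = 252, 500

# Stand-in "true" site attributes and a toy exposure model (power-law decay with
# building damping); these replace the real propagation model for illustration only.
power = rng.uniform(10, 60, n_sites)       # antenna power, W
dist = rng.uniform(50, 400, n_sites)       # distance between antenna and receptor, m
damping = rng.uniform(1, 10, n_sites)      # building damping, dB

def exposure(p, d, a_db):
    return p / d**2 * 10 ** (-a_db / 10.0)

measured = exposure(power, dist, damping) * rng.lognormal(0.0, 0.2, n_sites)

# Monte Carlo: perturb the inputs with assumed error models and rerun the model.
pred = np.empty((n_draws, n_sites))
for k in range(n_draws):
    pred[k] = exposure(power * rng.lognormal(0.0, 0.1, n_sites),              # uncertain power
                       np.clip(dist + rng.normal(0, 20, n_sites), 10, None),  # uncertain geometry
                       damping + rng.normal(0, 2, n_sites))                   # uncertain damping

cv = pred.std(axis=0) / pred.mean(axis=0)          # per-site coefficient of variation
rho, _ = spearmanr(pred.mean(axis=0), measured)    # ranking performance

tertile = lambda x: np.digitize(x, np.quantile(x, [1 / 3, 2 / 3]))  # 0, 1, 2 exposure classes
kappa = cohen_kappa_score(tertile(pred.mean(axis=0)), tertile(measured), weights="linear")
print(f"median CV {np.median(cv):.2f}, Spearman rho {rho:.2f}, weighted kappa {kappa:.2f}")
```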

  13. The influence of wind speed on airflow and fine particle transport within different building layouts of an industrial city.

    PubMed

    Mei, Dan; Wen, Meng; Xu, Xuemei; Zhu, Yuzheng; Xing, Futang

    2018-04-20

    In the atmospheric environment, differences in the layout of urban buildings have a powerful influence on accelerating or inhibiting the dispersion of particulate matter (PM). In industrial cities, buildings of variable heights can obstruct the diffusion of PM from industrial stacks. In this study, PM dispersed within building groups was simulated using the Reynolds-averaged Navier-Stokes equations coupled with a Lagrangian approach. Four typical street building arrangements were used: (a) a low-rise building block with height/base ratio H/b = 1 (b = 20 m); (b) a step-up building layout (H/b = 1, 2, 3, 4); (c) a step-down building layout (H/b = 4, 3, 2, 1); (d) a high-rise building block (H/b = 5). Profiles of stream functions and turbulence intensity were used to examine the effect of the various building layouts on atmospheric airflow. The particle suspension fraction and concentration distribution were used to evaluate the effect of wind speed on fine particle transport. These parameters showed that step-up building layouts accelerated top airflow and diffused more particles into street canyons, likely having adverse effects on resident health. In the renewal of old industrial areas, the step-down building arrangement, which can hinder PM dispersion from high-level stacks, should be constructed preferentially. High turbulence intensity results in the formation of a strong vortex that hinders particles from entering the street canyons. It is found that an increase in wind speed enhanced particle transport and reduced local particle concentrations; however, it did not affect the relative location of high particle concentration zones, which are related to building height and layout. This study has demonstrated that the height variation and layout of urban architecture affect the local concentration distribution of particulate matter in the atmosphere and, for the first time, that wind velocity has distinct effects on PM transport in various building groups. The findings may have general implications for optimizing building layouts based on particle transport characteristics during the renewal of industrial cities. For city planners, the results and conclusions are useful for improving local air quality. The study method can also be used to calculate the explosion risk of industrial dust for people who live in industrial cities.

  14. Autonomic Management in a Distributed Storage System

    NASA Astrophysics Data System (ADS)

    Tauber, Markus

    2010-07-01

    This thesis investigates the application of autonomic management to a distributed storage system. Effects on performance and resource consumption were measured in experiments, which were carried out in a local area test-bed. The experiments were conducted with components of one specific distributed storage system, but seek to be applicable to a wide range of such systems, in particular those exposed to varying conditions. The perceived characteristics of distributed storage systems depend on their configuration parameters and on various dynamic conditions. For a given set of conditions, one specific configuration may be better than another with respect to measures such as resource consumption and performance. Here, configuration parameter values were set dynamically and the results compared with a static configuration. It was hypothesised that under non-changing conditions this would allow the system to converge on a configuration that was more suitable than any that could be set a priori. Furthermore, the system could react to a change in conditions by adopting a more appropriate configuration. Autonomic management was applied to the peer-to-peer (P2P) and data retrieval components of ASA, a distributed storage system. The effects were measured experimentally for various workload and churn patterns. The management policies and mechanisms were implemented using a generic autonomic management framework developed during this work. The experimental evaluations of autonomic management show promising results, and suggest several future research topics. The findings of this thesis could be exploited in building other distributed storage systems that focus on harnessing storage on user workstations, since these are particularly likely to be exposed to varying, unpredictable conditions.

  15. Fault-tolerant computer study. [logic designs for building block circuits

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.; Avizienis, A. A.; Ercegovac, M. D.

    1981-01-01

    A set of building block circuits is described which can be used with commercially available microprocessors and memories to implement fault tolerant distributed computer systems. Each building block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self checking computer module with self contained input output and interfaces to redundant communications buses. Fault tolerance is achieved by connecting self checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology which led to the definition of the building block circuits are discussed.

  16. Modular Exposure Disaggregation Methodologies for Catastrophe Modelling using GIS and Remotely-Sensed Data

    NASA Astrophysics Data System (ADS)

    Foulser-Piggott, R.; Saito, K.; Spence, R.

    2012-04-01

    Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.

  17. Predicting the Texas Windstorm Insurance Association claim payout of commercial buildings from Hurricane Ike

    NASA Astrophysics Data System (ADS)

    Kim, J. M.; Woods, P. K.; Park, Y. J.; Son, K.

    2013-08-01

    Following growing public awareness of the danger from hurricanes and tremendous demands for analysis of loss, many researchers have conducted studies to develop hurricane damage analysis methods. Although researchers have identified the significant indicators, there currently is no comprehensive research for identifying the relationship among the vulnerabilities, natural disasters, and economic losses associated with individual buildings. To address this lack of research, this study will identify vulnerabilities and hurricane indicators, develop metrics to measure the influence of economic losses from hurricanes, and visualize the spatial distribution of vulnerability to evaluate overall hurricane damage. This paper has utilized the Geographic Information System to facilitate collecting and managing data, and has combined vulnerability factors to assess the financial losses suffered by Texas coastal counties. A multiple linear regression method has been applied to develop hurricane economic damage predicting models. To reflect the pecuniary loss, insured loss payment was used as the dependent variable to predict the actual financial damage. Geographical vulnerability indicators, built environment vulnerability indicators, and hurricane indicators were all used as independent variables. Accordingly, the models and findings may possibly provide vital references for government agencies, emergency planners, and insurance companies hoping to predict hurricane damage.
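
    A bare-bones version of the multiple linear regression setup described above is sketched below, regressing (log) insured loss on a few hypothetical geographic, built-environment and hurricane indicators. The predictors, the synthetic data and the log transforms are assumptions of this sketch, not the TWIA dataset or the published model.

```python
import numpy as np

# Ordinary least squares in the spirit of the damage model described above.
rng = np.random.default_rng(7)
n = 400
wind = rng.uniform(30, 50, n)            # peak gust at the site, m/s
coast_km = rng.uniform(0.1, 30, n)       # distance to the coastline
value = rng.lognormal(13, 0.5, n)        # insured building value, $
year = rng.integers(1950, 2008, n)       # year built

# Synthetic "claim payout" with multiplicative noise (illustrative data only).
loss = (0.002 * value * (wind / 40) ** 3 * np.exp(-coast_km / 20)
        * (1.0 - 0.004 * (year - 1950)) * rng.lognormal(0, 0.3, n))

# Design matrix with an intercept column; log-transform the skewed variables.
X = np.column_stack([np.ones(n), wind, coast_km, np.log(value), year])
beta, *_ = np.linalg.lstsq(X, np.log(loss), rcond=None)

resid = np.log(loss) - X @ beta
r2 = 1 - np.sum(resid**2) / np.sum((np.log(loss) - np.log(loss).mean()) ** 2)
print("coefficients:", beta.round(4), " R^2:", round(r2, 3))
```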

  18. Ocean acidification and its potential effects on marine ecosystems.

    PubMed

    Guinotte, John M; Fabry, Victoria J

    2008-01-01

    Ocean acidification is rapidly changing the carbonate system of the world oceans. Past mass extinction events have been linked to ocean acidification, and the current rate of change in seawater chemistry is unprecedented. Evidence suggests that these changes will have significant consequences for marine taxa, particularly those that build skeletons, shells, and tests of biogenic calcium carbonate. Potential changes in species distributions and abundances could propagate through multiple trophic levels of marine food webs, though research into the long-term ecosystem impacts of ocean acidification is in its infancy. This review attempts to provide a general synthesis of known and/or hypothesized biological and ecosystem responses to increasing ocean acidification. Marine taxa covered in this review include tropical reef-building corals, cold-water corals, crustose coralline algae, Halimeda, benthic mollusks, echinoderms, coccolithophores, foraminifera, pteropods, seagrasses, jellyfishes, and fishes. The risk of irreversible ecosystem changes due to ocean acidification should enlighten the ongoing CO(2) emissions debate and make it clear that the human dependence on fossil fuels must end quickly. Political will and significant large-scale investment in clean-energy technologies are essential if we are to avoid the most damaging effects of human-induced climate change, including ocean acidification.

  19. Decommissioning a phosphoric acid production plant: a radiological protection case study.

    PubMed

    Stamatis, V; Seferlis, S; Kamenopoulou, V; Potiriadis, C; Koukouliou, V; Kehagia, K; Dagli, C; Georgiadis, S; Camarinopoulos, L

    2010-12-01

    During a preliminary survey at the area of an abandoned fertilizer plant, increased levels of radioactivity were measured at places, buildings, constructions and materials. The extent of the contamination was determined and the affected areas were characterized as controlled areas. After the quantitative and qualitative determination of the contaminated materials, the decontamination was planned and performed step by step: the contaminated materials were categorized according to their physical characteristics (scrap metals, plastic pipes, scales and residues, building materials, etc) and according to their level of radioactivity. Depending on the material type, different decontamination and disposal options were proposed; the most appropriate technique was chosen taking into account apart from technical issues, the legal framework, radiation protection issues, the opinion of the local authorities involved as well as the owner's wish. After taking away the biggest amount of the contaminated materials, an iterative process consisting of surveys and decontamination actions was performed in order to remove the residual traces of contamination from the area. During the final survey, no residual surface contamination was detected; some sparsely distributed low level contaminated materials deeply immersed into the soil were found and removed.

  20. Design, Specification and Construction of Specialized Measurement System in the Experimental Building

    NASA Astrophysics Data System (ADS)

    Fedorczak-Cisak, Malgorzata; Kwasnowski, Pawel; Furtak, Marcin; Hayduk, Grzegorz

    2017-10-01

    Experimental buildings for “in situ” research are a very important tool for collecting data on the energy efficiency of energy-saving technologies. One of the most advanced buildings of this type in Poland is the Malopolska Laboratory of Energy-Saving Buildings at Cracow University of Technology. The building itself is used by scientists as a research object and research tool to test energy-saving technologies. It is equipped with a specialized measuring system consisting of approximately 3 000 different sensors distributed in the technical installations and structural elements of the building (walls, ceilings, cornices) and in the ground. The authors of the paper present the innovative design and technology of this specialized instrumentation and discuss issues arising during the implementation and use of the building.

  1. Predictive Modeling of Risk Associated with Temperature Extremes over Continental US

    NASA Astrophysics Data System (ADS)

    Kravtsov, S.; Roebber, P.; Brazauskas, V.

    2016-12-01

    We build an extremely statistically accurate, essentially bias-free empirical emulator of atmospheric surface temperature and apply it for meteorological risk assessment over the domain of continental US. The resulting prediction scheme achieves an order-of-magnitude or larger gain of numerical efficiency compared with the schemes based on high-resolution dynamical atmospheric models, leading to unprecedented accuracy of the estimated risk distributions. The empirical model construction methodology is based on our earlier work, but is further modified to account for the influence of large-scale, global climate change on regional US weather and climate. The resulting estimates of the time-dependent, spatially extended probability of temperature extremes over the simulation period can be used as a risk management tool by insurance companies and regulatory governmental agencies.

  2. Information Processing in Living Systems

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper; Bialek, William

    2016-03-01

    Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.

  3. Finite element based simulation on friction stud welding of metal matrix composites to steel

    NASA Astrophysics Data System (ADS)

    Hynes, N. Rajesh Jesudoss; Tharmaraj, R.; Velu, P. Shenbaga; Kumar, R.

    2016-05-01

    Friction welding is a solid-state joining technique used for joining similar and dissimilar materials with high integrity. The technique is being successfully applied in the aerospace, automobile, and shipbuilding industries, and is attracting more and more research interest. The quality of friction stud welded joints depends on the frictional heat generated at the interface. Hence, a thermal analysis of friction stud welding of the stainless steel (AISI 304) and aluminium silicon carbide (AlSiC) combination is carried out in the present work. In this study, numerical simulation is carried out using ANSYS software and the temperature profiles are predicted at various increments of time. The developed numerical model is found to be adequate for predicting the temperature distribution of friction stud welded aluminium silicon carbide/stainless steel joints.

  4. Characterizing Geological Facies using Seismic Waveform Classification in Sarawak Basin

    NASA Astrophysics Data System (ADS)

    Zahraa, Afiqah; Zailani, Ahmad; Prasad Ghosh, Deva

    2017-10-01

    Numerous efforts have been made over the years to build relationships between geology and geophysics using different techniques. The integration of these two most important data types in the oil and gas industry can be used to reduce uncertainty in exploration and production, especially for reservoir productivity enhancement and stratigraphic identification. This paper focuses on classifying seismic waveforms into different classes using a neural network and on linking those classes to geological facies, which are established using knowledge of the lithology and log motifs of well data. Seismic inversion is used as the input to the neural network to act as a direct lithology indicator, reducing the dependency on well calibration. The interpretation of the seismic facies classification map provides a better understanding of the lithology distribution and depositional environment and helps to identify significant reservoir rock.
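
    To make the classification step concrete, the sketch below clusters fixed-length amplitude windows extracted around a horizon into a small number of waveform classes. The study uses a neural-network classifier on inverted seismic data; k-means is used here only as a simple, self-contained stand-in, and the input array name is hypothetical.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical input: one row per trace, columns are samples in the window.
        traces = np.load("horizon_windows.npy")
        X = StandardScaler().fit_transform(traces)

        n_classes = 6                    # number of waveform classes to map
        labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(X)

        # The labels would then be mapped back to (inline, crossline) positions and
        # compared with lithology and log motifs at the wells to name each facies.
        print(np.bincount(labels))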

  5. The cosmological Janus model: comparison with observational data

    NASA Astrophysics Data System (ADS)

    Petit, Jean-Pierre; Dagostini, Gilles

    2017-01-01

    In 2014 we presented a model based on a system of two coupled field equations describing two populations of particles, one of positive mass and the other of negative mass. Analysis of this system in the Newtonian approximation shows that masses of the same sign attract according to Newton's law, while masses of opposite signs repel according to an anti-Newton law. This eliminates the runaway phenomenon. The time-dependent exact solution of this system is used to build the bolometric magnitude distribution as a function of redshift. Comparing the prediction of our model, which requires adjustment of a single parameter, with the data from 740 supernovae highlighting the acceleration of the universe gives an excellent agreement. The comparison is then made with the multi-parametric ΛCDM model.

  6. Supernovae, neutrinos and the chirality of amino acids.

    PubMed

    Boyd, Richard N; Kajino, Toshitaka; Onaka, Takashi

    2011-01-01

    A mechanism for creating an enantioenrichment in the amino acids, the building blocks of the proteins, that involves global selection of one handedness by interactions between the amino acids and neutrinos from core-collapse supernovae is defined. The chiral selection involves the dependence of the interaction cross sections on the orientations of the spins of the neutrinos and the (14)N nuclei in the amino acids, or in precursor molecules, which in turn couple to the molecular chirality. It also requires an asymmetric distribution of neutrinos emitted from the supernova. The subsequent chemical evolution and galactic mixing would ultimately populate the Galaxy with the selected species. The resulting amino acids could either be the source thereof on Earth, or could have triggered the chirality that was ultimately achieved for Earth's proteinaceous amino acids.

  7. Mechanical properties of carbon steel depending on the rate of the dose build-up of nitrogen and argon ions

    NASA Astrophysics Data System (ADS)

    Vorob'ev, V. L.; Bykov, P. V.; Bayankin, V. Ya.; Shushkov, A. A.; Vakhrushev, A. V.

    2014-08-01

    The effect of pulsed irradiation with argon and nitrogen ions on the mechanical properties, morphology, and structure of the surface layers of carbon steel St3 (0.2% C, 0.4% Mn, 0.15% Si, and Fe for balance) has been investigated as a function of the rate of dose build-up at average ion current densities of 10, 20, and 40 μA/cm2. It has been established that the fatigue life and microhardness of the surface layers increase over the entire studied range of dose build-up rates. This seems to be due to the hardening of the surface layers, which resulted from the generation of radiation defects and the irradiation-dynamic effect of fast ions. The sample irradiated by argon ions at the lowest of the selected dose build-up rates, j_av = 10 μA/cm2, withstands the largest number of cycles to failure.

  8. Providing the Persistent Data Storage in a Software Engineering Environment Using Java/CORBA and a DBMS

    NASA Technical Reports Server (NTRS)

    Dhaliwal, Swarn S.

    1997-01-01

    An investigation was undertaken to build the software foundation for the WHERE (Web-based Hyper-text Environment for Requirements Engineering) project. The TCM (Toolkit for Conceptual Modeling) was chosen as the foundation software for the WHERE project, which aims to provide an environment for facilitating collaboration among geographically distributed people involved in the Requirements Engineering process. The TCM is a collection of diagram and table editors and has been implemented in the C++ programming language. The C++ implementation of the TCM was translated into Java in order to allow the editors to be used for building various functionality of the WHERE project; the WHERE project intends to use the Web as its communication backbone. One of the limitations of the translated software (TcmJava), which militated against its use in the WHERE project, was the persistent data management mechanism it inherited from the original TCM, which was designed for standalone applications. Before TcmJava editors could be used as part of the multi-user, geographically distributed applications of the WHERE project, a persistent storage mechanism had to be built which would allow data communication over the Internet, using the capabilities of the Web. An approach involving features of Java, CORBA (Common Object Request Broker Architecture), the Web, a middleware layer (Java Relational Binding, JRB), and a database server was used to build the persistent data management infrastructure for the WHERE project. The developed infrastructure allows a TcmJava editor to be downloaded and run from a network host by using a JDK 1.1 (Java Developer's Kit) compatible Web browser. The editor establishes a connection with a server by using the ORB (Object Request Broker) software and stores/retrieves data in/from the server. The server consists of one or more CORBA objects, depending upon whether the data is to be made persistent on a single server or on multiple servers. The CORBA object providing the persistent data server is implemented using the Java programming language. It uses the JRB to store/retrieve data in/from a relational database server. The persistent data management system provides transaction and user management facilities which allow multi-user, distributed access to the stored data in a secure manner.

  9. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently of each other. These processes are textured urban terrain reconstruction and road verification. The first process performs a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input, together with initial road database entries, for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
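
    A minimal illustration of turning the two per-method distributions into the three output states. Here the applicability mass simply discounts the road-state evidence, and the discounted mass is assigned to the whole frame of discernment, i.e. "unknown"; this is standard Dempster-Shafer discounting, not necessarily the exact combination rule used by the authors.

        def to_three_states(p_correct, p_incorrect, p_applicable):
            """Map (correct/incorrect) and (applicable/not applicable) to three states."""
            assert abs(p_correct + p_incorrect - 1.0) < 1e-9
            return {
                "correct": p_applicable * p_correct,
                "incorrect": p_applicable * p_incorrect,
                "unknown": 1.0 - p_applicable,   # mass left on {correct, incorrect}
            }

        # Strong evidence that the road is correct, but the road model only half applies.
        print(to_three_states(p_correct=0.9, p_incorrect=0.1, p_applicable=0.5))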

  10. Audio distribution and Monitoring Circuit

    NASA Technical Reports Server (NTRS)

    Kirkland, J. M.

    1983-01-01

    Versatile circuit accepts and distributes TV audio signals. Three-meter audio distribution and monitoring circuit provides flexibility in monitoring, mixing, and distributing audio inputs and outputs at various signal and impedance levels. Program material is simultaneously monitored on three channels, or single-channel version built to monitor transmitted or received signal levels, drive speakers, interface to building communications, and drive long-line circuits.

  11. Waste-to-Energy Projects at Army Installations

    DTIC Science & Technology

    2011-01-13

    JAN 2011. US Army Corps of Engineers, BUILDING STRONG®. Distribution Statement A: approved for public release; distribution is unlimited. Supplementary notes: presented at the DOE ...

  12. A Petri Net model for distributed energy system

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Electrical networks need to evolve to become more intelligent, more flexible, and less costly. The smart grid, the next generation of the power grid, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid is a challenge for system protection, optimization, and energy efficiency. Proper modeling and analysis are needed to build an extensive distributed energy system and an intelligent electricity infrastructure. In this paper, a complete model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN), and simulation of the created model is also explored. The simulation has allowed an analysis of how closely the behavior of the model matches that of a real smart grid.
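
    The toy simulator below shows the basic mechanics of a stochastic Petri net of the GSPN kind: places hold tokens and timed transitions fire after exponentially distributed delays. The tiny generator-to-load net is made up for illustration and is not the smart-grid model of the paper.

        import random

        class StochasticPetriNet:
            def __init__(self, marking, transitions):
                self.marking = dict(marking)        # place name -> token count
                self.transitions = transitions      # name -> (inputs, outputs, rate)

            def enabled(self):
                return [t for t, (ins, _, _) in self.transitions.items()
                        if all(self.marking[p] >= n for p, n in ins.items())]

            def step(self):
                """Race the enabled transitions; fire the one with the smallest sampled delay."""
                enabled = self.enabled()
                if not enabled:
                    return None, None
                delays = {t: random.expovariate(self.transitions[t][2]) for t in enabled}
                winner = min(delays, key=delays.get)
                ins, outs, _ = self.transitions[winner]
                for p, n in ins.items():
                    self.marking[p] -= n
                for p, n in outs.items():
                    self.marking[p] += n
                return winner, delays[winner]

        net = StochasticPetriNet(
            marking={"generated": 3, "delivered": 0},
            transitions={"transmit": ({"generated": 1}, {"delivered": 1}, 2.0)})

        clock = 0.0
        while True:
            fired, dt = net.step()
            if fired is None:
                break
            clock += dt
            print(f"t={clock:.2f} fired {fired}, marking={net.marking}")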

  13. 75 FR 4423 - Distribution of the 2004 Through 2007 Satellite Royalty Funds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... Madison Memorial Building, LM-401, 101 Independence Avenue, SE, Washington, DC 20559-6000. If delivered by... Board, Library of Congress, James Madison Memorial Building, LM-403, 101 Independence Avenue, SE... Claimants Group, Music Claimants (American Society of Composers, Authors and Publishers, Broadcast Music...

  14. Laboratory Buildings.

    ERIC Educational Resources Information Center

    Barnett, Jonathan

    The need for flexibility in science research facilities is discussed, with emphasis on the effect of that need on the design of laboratories. The relationship of office space, bench space, and special equipment areas, and the location and distribution of piping and air conditioning, are considered particularly important. This building type study…

  15. VIEW OF BUILDING 124, THE WATER TREATMENT PLANT, LOOKING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF BUILDING 124, THE WATER TREATMENT PLANT, LOOKING NORTHEAST. THE ROCKY FLATS PLANT WATER SUPPLY, TREATMENT, STORAGE, AND DISTRIBUTION SYSTEM HAS OPERATED CONTINUOUSLY SINCE 1953 - Rocky Flats Plant, Water Treatment Plant, West of Third Street, north of Cedar Avenue, Golden, Jefferson County, CO

  16. Reflections from USDA Forest Service employees on institutional constraints to engaging and serving their local communities

    Treesearch

    Mae A. Davenport; Dorothy H. Anderson; Jessica E. Leahy; Pamela J. Jakes

    2007-01-01

    Although community relationship building has been recognized since the early 1980s as integral to forest management, it has not been widely supported or adopted. Today, relationship building depends largely on the innovation and commitment of forest supervisors and staff. The institutional environment and its culture play an important role in building capacity for...

  17. Building Energy Modeling and Control Methods for Optimization and Renewables Integration

    NASA Astrophysics Data System (ADS)

    Burger, Eric M.

    This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. This model is capable of approximating unmodeled dynamics within a building by learning from sensor data. Control techniques encompass the application of optimal control theory, model predictive control, and convex distributed optimization to TCLs. First, we present the alternative control trajectory (ACT) representation, a novel method for the approximate optimization of non-convex discrete systems. This approach enables the optimal control of a population of non-convex agents using distributed convex optimization techniques. Second, we present a distributed convex optimization algorithm for the control of a TCL population. Experimental results demonstrate the application of this algorithm to the problem of renewable energy generation following. This dissertation contributes to the development of intelligent energy management systems for buildings by presenting a suite of novel and adaptable modeling and control techniques. Applications focus on optimizing the performance of building operations and on facilitating the integration of renewable energy resources.
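
    As a point of reference for the modeling work described above, the sketch below simulates a standard first-order thermostatically controlled load (an air conditioner cycling within a deadband). The parameters are illustrative textbook values, not those of the dissertation's identified non-linear model.

        import math

        R, C, P = 2.0, 10.0, 14.0       # thermal resistance [C/kW], capacitance [kWh/C], cooling power [kW]
        eta = 2.5                       # coefficient of performance
        T_set, deadband = 22.0, 1.0     # thermostat setpoint and deadband [C]
        T_out, h = 32.0, 1.0 / 60.0     # outdoor temperature [C], time step [h]
        a = math.exp(-h / (R * C))      # exact discretization of the first-order dynamics

        T, on = 24.0, False
        for k in range(24 * 60):        # one day at one-minute resolution
            # Thermostat hysteresis: switch on above the deadband, off below it.
            if T > T_set + deadband / 2:
                on = True
            elif T < T_set - deadband / 2:
                on = False
            T = a * T + (1 - a) * (T_out - (R * P if on else 0.0))
            electric_kw = P / eta if on else 0.0
            if k % 120 == 0:
                print(f"minute {k:4d}: T={T:5.2f} C, electric power={electric_kw:4.1f} kW")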

  18. Simulating the effect of slab features on vapor intrusion of crack entry

    PubMed Central

    Yao, Yijun; Pennell, Kelly G.; Suuberg, Eric M.

    2012-01-01

    In vapor intrusion screening models, a most widely employed assumption in simulating the entry of contaminant into a building is that of a crack in the building foundation slab. Some modelers employed a perimeter crack hypothesis while others chose not to identify the crack type. However, few studies have systematically investigated the influence on vapor intrusion predictions of slab crack features, such as the shape and distribution of slab cracks and related to this overall building foundation footprint size. In this paper, predictions from a three-dimensional model of vapor intrusion are used to compare the contaminant mass flow rates into buildings with different foundation slab crack features. The simulations show that the contaminant mass flow rate into the building does not change much for different assumed slab crack shapes and locations, and the foundation footprint size does not play a significant role in determining contaminant mass flow rate through a unit area of crack. Moreover, the simulation helped reveal the distribution of subslab contaminant soil vapor concentration beneath the foundation, and the results suggest that in most cases involving no biodegradation, the variation in subslab concentration should not exceed an order of magnitude, and is often significantly less than this. PMID:23359620

  19. Building vulnerability and human casualty estimation for a pyroclastic flow: a model and its application to Vesuvius

    NASA Astrophysics Data System (ADS)

    Spence, Robin J. S.; Baxter, Peter J.; Zuccaro, Giulio

    2004-05-01

    Pyroclastic flows clearly present a serious threat to life for the inhabitants of settlements on the slopes of volcanoes with a history of explosive eruptions; but it is increasingly realised that buildings can provide a measure of protection to occupants trapped by such flows. One important example is Vesuvius, whose eruption history includes many events which were lethal for the inhabitants of the neighbouring Vesuvian villages. Recent computational fluid dynamics computer modelling for Vesuvius [Todesco et al., Bull. Volcanol. 64 (2002) 155-177] has enabled a realistic picture of an explosive eruption to be modelled, tracing the time-dependent development of the physical parameters of a simulated flow at a large three-dimensional mesh of points, based on assumed conditions of temperature, mass-flow rate and particle size distribution at the vent. The output includes mapping of temperature, mixture density and mixture velocity over the whole adjacent terrain. But to date this information has not been used to assess the impacts of such flows on buildings and their occupants. In the project reported in this paper, estimates of the near-ground flow parameters were used to assess the impact of a particular simulated pyroclastic flow (modelled roughly on the 1631 eruption) on the buildings and population in four of the Vesuvian villages considered most at risk. The study had five components. First, a survey of buildings and the urban environment was conducted to identify the incidence of characteristics and elements likely to affect human vulnerability, and to classify the building stock. The survey emphasised particularly the number, location and type of openings characteristic of the major classes of the local building stock. In the second part of the study, this survey formed the basis for estimates of the probable impact of the pyroclastic flow on the envelope and internal air conditions of typical buildings. In the third part, a number of distinct ways in which human casualties would occur were identified, and estimates were made of the relationship between casualty rates and environmental conditions for each casualty type. In the fourth part of the study, the assumed casualty rates were used to estimate the proportions of occupants who would be killed or seriously injured for the assumed pyroclastic flow scenario in the Vesuvian villages studied, and their distribution by distance from the vent. It was estimated that in a daytime eruption, 25 min after the start of the eruption, there would be 480 deaths and a further 190 serious injuries, for every 1000 remaining in the area. In a night-time scenario, there would be 360 deaths with a further 230 serious injuries per 1000 after the same time interval. Finally, a set of risk factors for casualties was identified, and factors were discussed and ranked for their mitigation impact in the eruption scenario. The most effective mitigation action would of course be total evacuation before the start of the eruption. But if this were not achieved, barred window openings or sealed openings to slow the ingress of hot gases, together with a reduction of the fire load, could be effective means of reducing casualty levels.

  20. Bayesian prediction of future ice sheet volume using local approximation Markov chain Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.; Heimbach, P.; Marzouk, Y.

    2017-12-01

    We develop a Bayesian inverse modeling framework for predicting future ice sheet volume with associated formal uncertainty estimates. Marine ice sheets are drained by fast-flowing ice streams, which we simulate using a flowline model. Flowline models depend on geometric parameters (e.g., basal topography), parameterized physical processes (e.g., calving laws and basal sliding), and climate parameters (e.g., surface mass balance), most of which are unknown or uncertain. Given observations of ice surface velocity and thickness, we define a Bayesian posterior distribution over static parameters, such as basal topography. We also define a parameterized distribution over variable parameters, such as future surface mass balance, which we assume are not informed by the data. Hyperparameters are used to represent climate change scenarios, and sampling their distributions mimics internal variation. For example, a warming climate corresponds to increasing mean surface mass balance but an individual sample may have periods of increasing or decreasing surface mass balance. We characterize the predictive distribution of ice volume by evaluating the flowline model given samples from the posterior distribution and the distribution over variable parameters. Finally, we determine the effect of climate change on future ice sheet volume by investigating how changing the hyperparameters affects the predictive distribution. We use state-of-the-art Bayesian computation to address computational feasibility. Characterizing the posterior distribution (using Markov chain Monte Carlo), sampling the full range of variable parameters and evaluating the predictive model is prohibitively expensive. Furthermore, the required resolution of the inferred basal topography may be very high, which is often challenging for sampling methods. Instead, we leverage regularity in the predictive distribution to build a computationally cheaper surrogate over the low dimensional quantity of interest (future ice sheet volume). Continual surrogate refinement guarantees asymptotic sampling from the predictive distribution. Directly characterizing the predictive distribution in this way allows us to assess the ice sheet's sensitivity to climate variability and change.

  1. Analysis of domestic refrigerator temperatures and home storage time distributions for shelf-life studies and food safety risk assessment.

    PubMed

    Roccato, Anna; Uyttendaele, Mieke; Membré, Jeanne-Marie

    2017-06-01

    In the framework of food safety, when mimicking the consumer phase, the storage time and temperature used are mainly considered as single point estimates instead of probability distributions. This single-point approach does not take into account the variability within a population and could lead to an overestimation of the parameters. Therefore, the aim of this study was to analyse data on domestic refrigerator temperatures and storage times of chilled food in European countries in order to draw general rules which could be used either in shelf-life testing or in risk assessment. In relation to domestic refrigerator temperatures, 15 studies provided pertinent data. Twelve studies presented normal distributions, according to the authors or from the data fitted to distributions. Analysis of the temperature distributions revealed that the countries separated into two groups: northern European countries and southern European countries. The overall variability of European domestic refrigerators is described by a normal distribution: N(7.0, 2.7)°C for the southern countries and N(6.1, 2.8)°C for the northern countries. Concerning storage times, seven papers were pertinent. Analysis indicated that the storage time was likely to end in the first days or weeks (depending on the product use-by date) after purchase. Data fitting showed the exponential distribution was the most appropriate distribution to describe the time that food spends at the consumer's home. The storage time was described by an exponential distribution with a mean corresponding to the use-by date period divided by 4. In conclusion, knowing that collecting data is time and money consuming, in the absence of data, and at least for the European market and for refrigerated products, building a domestic refrigerator temperature distribution using a Normal law and a time-to-consumption distribution using an Exponential law would be appropriate. Copyright © 2017 Elsevier Ltd. All rights reserved.
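
    A short Monte Carlo sketch of how the reported distributions can be used in a consumer-phase scenario; the 12-day use-by period is a hypothetical example.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        temps_north = rng.normal(6.1, 2.8, n)        # northern European refrigerators [C]
        temps_south = rng.normal(7.0, 2.7, n)        # southern European refrigerators [C]
        use_by_days = 12                             # hypothetical use-by period
        storage_days = rng.exponential(use_by_days / 4, n)

        print("share of southern refrigerators above 8 C:", (temps_south > 8).mean())
        print("share of northern refrigerators above 8 C:", (temps_north > 8).mean())
        print("share of products stored beyond the use-by date:", (storage_days > use_by_days).mean())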

  2. Organization and dissemination of multimedia medical databases on the WWW.

    PubMed

    Todorovski, L; Ribaric, S; Dimec, J; Hudomalj, E; Lunder, T

    1999-01-01

    In the paper, we focus on the problem of building and disseminating multimedia medical databases on the World Wide Web (WWW). The current results of the ongoing project of building a prototype dermatology images database and its WWW presentation are presented. The dermatology database is part of an ambitious plan concerning an organization of a network of medical institutions building distributed and federated multimedia databases of a much wider scale.

  3. 1. DEPENDENCY Both pointed and flat shingles appear to be ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. DEPENDENCY Both pointed and flat shingles appear to be original. Original purpose of this building was not recorded at the time of this survey. - Annandale Plantation, Dependency, State Routes 30 & 18 vicinity, Georgetown, Georgetown County, SC

  4. Lamination effects on a 3D model of the magnetic core of power transformers

    NASA Astrophysics Data System (ADS)

    Poveda-Lerma, Antonio; Serrano-Callergues, Guillermo; Riera-Guasp, Martin; Pineda-Sanchez, Manuel; Puche-Panadero, Ruben; Perez-Cruz, Juan

    2017-12-01

    In this paper the lamination effect on the model of a power transformer core with a stacked E-I structure is analyzed. The distribution of the magnetic flux in the laminations depends on the stacking method. In this work it is shown, using a 3D FEM model and an experimental prototype, that the non-uniform distribution of the flux in a laminated E-I core with an alternate-lap joint stack substantially increases the average value of the magnetic flux density in the core, compared with a butt joint stack. Both the simulated model and the experimental tests show that the presence of constructive air gaps in the E-I junctions gives rise to a zig-zag flux in the depth direction. This inter-lamination flux reduces the magnetic flux density in the I-pieces and substantially increases the magnetic flux density in the E-pieces, with highly saturated points that traditional 2D analysis cannot reproduce. The relation between the number of laminations included in the model and the computational resources needed to build it is also evaluated in this work.

  5. System performance predictions for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hojnicki, Jeffrey S.; Green, Robert D.; Follo, Jeffrey C.

    1993-01-01

    Space Station Freedom Electric Power System (EPS) capability to effectively deliver power to housekeeping and user loads continues to strongly influence Freedom's design and planned approaches for assembly and operations. The EPS design consists of silicon photovoltaic (PV) arrays, nickel-hydrogen batteries, and direct current power management and distribution hardware and cabling. To properly characterize the inherent EPS design capability, detailed system performance analyses must be performed for early stages as well as for the fully assembled station up to 15 years after beginning of life. Such analyses were repeatedly performed using the FORTRAN code SPACE (Station Power Analysis for Capability Evaluation) developed at the NASA Lewis Research Center over a 10-year period. SPACE combines orbital mechanics routines, station orientation/pointing routines, PV array and battery performance models, and a distribution system load-flow analysis to predict EPS performance. Time-dependent, performance degradation, low earth orbit environmental interactions, and EPS architecture build-up are incorporated in SPACE. Results from two typical SPACE analytical cases are presented: (1) an electric load driven case and (2) a maximum EPS capability case.

  6. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.

    PubMed

    Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
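
    At an abstract level, the target computation is sampling from a Boltzmann distribution over binary variables; the paper realizes this with networks of LIF neurons. The plain Gibbs sampler below only illustrates what distribution is being sampled, with made-up weights and biases.

        import numpy as np

        rng = np.random.default_rng(1)
        W = np.array([[0.0, 1.5, -1.0],
                      [1.5, 0.0,  0.5],
                      [-1.0, 0.5, 0.0]])   # symmetric couplings, zero diagonal
        b = np.array([-0.5, 0.2, 0.1])     # biases

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        z = rng.integers(0, 2, size=3)      # binary state vector
        counts = np.zeros(3)
        n_steps = 50_000
        for step in range(n_steps):
            k = step % 3                    # update one unit at a time
            z[k] = rng.random() < sigmoid(W[k] @ z - W[k, k] * z[k] + b[k])
            counts += z

        print("estimated marginals P(z_i = 1):", counts / n_steps)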

  7. Use of Isotope Ratio Mass Spectrometry (IRMS) Determination ((18)O/(16)O) to Assess the Local Origin of Fish and Asparagus in Western Switzerland.

    PubMed

    Rossier, Joël S; Maury, Valérie; de Voogd, Blaise; Pfammatter, Elmar

    2014-10-01

    Here we present the use of isotope ratio mass spectrometry (IRMS) for the detection of mislabelling of food produced in Switzerland. The system is based on the analysis of the oxygen isotope distribution in water (δ(18)O). Depending on the location on the earth, lake or groundwater has a specific isotopic distribution, which can serve as a fingerprint to verify whether a product was grown with the corresponding water. This report presents the IRMS technique and the results obtained in detecting the origin of fish grown in selected Swiss lakes as well as asparagus grown in Valais soil. Strengths and limitations of the method are presented for both products: on the one hand, the technique is relatively universal for any product which contains significant water; on the other hand, it necessitates a rather heavy workload to build up a database of water δ(18)O values of products of different origins. This analytical tool is part of the concept of combating fraud currently in use in Switzerland.

  8. A suffix arrays based approach to semantic search in P2P systems

    NASA Astrophysics Data System (ADS)

    Shi, Qingwei; Zhao, Zheng; Bao, Hu

    2007-09-01

    Building a semantic search system on top of peer-to-peer (P2P) networks is becoming an attractive and promising alternative for reasons of scalability, data freshness, and search cost. In this paper, we present a Suffix Arrays based algorithm for Semantic Search (SASS) in P2P systems, which generates a distributed Semantic Overlay Network (SON) construction for full-text search in P2P networks. For each node in the P2P network, SASS distributes document indices based on a set of suffix arrays, by which clusters are created depending on the words or phrases shared between documents; therefore, the search cost for a given query is decreased by scanning only semantically related documents. In contrast to recently announced SON schemes designed around metadata or predefined classes, SASS is an unsupervised approach for decentralized generation of SONs. SASS is also an incremental, linear-time algorithm, which efficiently handles the problem of node updates in P2P networks. Our simulation results demonstrate that SASS yields high search efficiency in dynamic environments.
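
    To make the underlying indexing idea concrete, the sketch below builds a suffix array naively and answers an exact-match query by binary search over the sorted suffixes. The distributed SON construction and P2P routing of SASS are not reproduced here.

        def build_suffix_array(text):
            # Naive O(n^2 log n) construction; sufficient for illustration.
            return sorted(range(len(text)), key=lambda i: text[i:])

        def occurrences(text, sa, pattern):
            """Start positions of `pattern` in `text`, via binary search on the suffix array."""
            lo, hi = 0, len(sa)
            while lo < hi:                                   # first suffix >= pattern
                mid = (lo + hi) // 2
                if text[sa[mid]:] < pattern:
                    lo = mid + 1
                else:
                    hi = mid
            start, hi = lo, len(sa)
            while lo < hi:                                   # end of the block prefixed by pattern
                mid = (lo + hi) // 2
                if text[sa[mid]:sa[mid] + len(pattern)] == pattern:
                    lo = mid + 1
                else:
                    hi = mid
            return sorted(sa[start:lo])

        doc = "distributed semantic search over distributed documents"
        sa = build_suffix_array(doc)
        print(occurrences(doc, sa, "distributed"))           # prints both start positions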

  9. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons

    PubMed Central

    Probst, Dimitri; Petrovici, Mihai A.; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems. PMID:25729361

  10. Microstructure characterization of multi-phase composites and utilization of phase change materials and recycled rubbers in cementitious materials

    NASA Astrophysics Data System (ADS)

    Meshgin, Pania

    2011-12-01

    This research focuses on two important subjects: (1) Characterization of heterogeneous microstructure of multi-phase composites and the effect of microstructural features on effective properties of the material. (2) Utilizations of phase change materials and recycled rubber particles from waste tires to improve thermal properties of insulation materials used in building envelopes. Spatial pattern of multi-phase and multidimensional internal structures of most composite materials are highly random. Quantitative description of the spatial distribution should be developed based on proper statistical models, which characterize the morphological features. For a composite material with multi-phases, the volume fraction of the phases as well as the morphological parameters of the phases have very strong influences on the effective property of the composite. These morphological parameters depend on the microstructure of each phase. This study intends to include the effect of higher order morphological details of the microstructure in the composite models. The higher order statistics, called two-point correlation functions characterize various behaviors of the composite at any two points in a stochastic field. Specifically, correlation functions of mosaic patterns are used in the study for characterizing transport properties of composite materials. One of the most effective methods to improve energy efficiency of buildings is to enhance thermal properties of insulation materials. The idea of using phase change materials and recycled rubber particles such as scrap tires in insulation materials for building envelopes has been studied.

  11. Correlation of classroom typologies to lighting energy performance of academic building in warm-humid climate (case study: ITS Campus Sukolilo Surabaya)

    NASA Astrophysics Data System (ADS)

    Ekasiwi, S. N. N.; Antaryama, I. G. N.; Krisdianto, J.; Ulum, M. S.

    2018-03-01

    Classrooms in educational buildings require certain lighting conditions to serve teaching and learning activities during the daytime. The most typical design uses double-sided openings in order to get a good daylight distribution in the classroom. Artificial lighting is then needed to compensate when daylight conditions are poor. A short observation indicated that during lectures the lights were turned on even in the daytime, which might result in wasted electrical energy. The aim of the study is to examine which classroom types provide a comfortable lighting environment while saving energy. This paper reports preliminary results of the study obtained from field observation and measurements. The energy use and the usage pattern of artificial lighting during lectures were recorded, and the data were then evaluated to assess the suitability of the existing energy use against building energy standards. Daylighting design aspects have to be the first consideration. However, classrooms with similar WWR (window-to-wall ratio) values may differ in Daylight Factor (DF), depending on the room depth. Similar increases in WWR and in the ratio of openings to floor area do not directly correspond to an increase in DF; the outdoor daylight access and the room depth are the influencing factors. Despite the similarity of physical type, the usage pattern of the classroom determines the use of electrical energy for lighting. The results indicate the factors influencing lighting energy performance in correlation with classroom typologies.

  12. Infrastructure and distributed learning methodology for privacy-preserving multi-centric rapid learning health care: euroCAT.

    PubMed

    Deist, Timo M; Jochems, A; van Soest, Johan; Nalbantov, Georgi; Oberije, Cary; Walsh, Seán; Eble, Michael; Bulens, Paul; Coucke, Philippe; Dries, Wim; Dekker, Andre; Lambin, Philippe

    2017-06-01

    Machine learning applications for personalized medicine are highly dependent on access to sufficient data. For personalized radiation oncology, datasets representing the variation in the entire cancer patient population need to be acquired and used to learn prediction models. Ethical and legal boundaries to ensure data privacy hamper collaboration between research institutes. We hypothesize that data sharing is possible without identifiable patient data leaving the radiation clinics and that building machine learning applications on distributed datasets is feasible. We developed and implemented an IT infrastructure in five radiation clinics across three countries (Belgium, Germany, and The Netherlands). We present here a proof-of-principle for future 'big data' infrastructures and distributed learning studies. Lung cancer patient data was collected in all five locations and stored in local databases. Exemplary support vector machine (SVM) models were learned using the Alternating Direction Method of Multipliers (ADMM) from the distributed databases to predict post-radiotherapy dyspnea grade [Formula: see text]. The discriminative performance was assessed by the area under the curve (AUC) in a five-fold cross-validation (learning on four sites and validating on the fifth). The performance of the distributed learning algorithm was compared to centralized learning where datasets of all institutes are jointly analyzed. The euroCAT infrastructure has been successfully implemented in five radiation clinics across three countries. SVM models can be learned on data distributed over all five clinics. Furthermore, the infrastructure provides a general framework to execute learning algorithms on distributed data. The ongoing expansion of the euroCAT network will facilitate machine learning in radiation oncology. The resulting access to larger datasets with sufficient variation will pave the way for generalizable prediction models and personalized medicine.
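
    A much-simplified sketch of the core idea, learning one shared linear classifier over data that never leaves each site, using consensus ADMM on synthetic data. A smooth squared-hinge ("L2-SVM") loss is used so that a generic optimizer can solve the local update; the euroCAT infrastructure, its exact SVM formulation, and its clinical data are not reproduced here.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        K, d, rho, lam = 5, 4, 1.0, 0.1              # sites, features, ADMM penalty, ridge weight

        # Synthetic local datasets standing in for the five clinics.
        w_true = rng.normal(size=d)
        sites = []
        for _ in range(K):
            X = rng.normal(size=(80, d))
            y = np.sign(X @ w_true + 0.3 * rng.normal(size=80))
            sites.append((X, y))

        def local_objective(w, X, y, z, u):
            # Squared-hinge loss plus the ADMM proximal term; evaluated only at the local site.
            margins = 1.0 - y * (X @ w)
            return np.square(np.clip(margins, 0.0, None)).sum() + 0.5 * rho * np.sum((w - z + u) ** 2)

        z = np.zeros(d)
        W = [np.zeros(d) for _ in range(K)]
        U = [np.zeros(d) for _ in range(K)]
        for _ in range(30):
            for k, (X, y) in enumerate(sites):       # local updates: data stays at each site
                W[k] = minimize(local_objective, W[k], args=(X, y, z, U[k]), method="L-BFGS-B").x
            z = rho * sum(w + u for w, u in zip(W, U)) / (lam + K * rho)   # consensus update
            U = [u + w - z for w, u in zip(W, U)]    # dual updates

        print("consensus weights:", np.round(z, 2), "true direction:", np.round(w_true, 2))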

  13. View of the current distribution "bus" atop switching cabinets within ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of the current distribution "bus" atop switching cabinets within the former transformer building. Looking northwest - Childs-Irving Hydroelectric Project, Childs System, Childs Powerhouse, Forest Service Road 708/502, Camp Verde, Yavapai County, AZ

  14. NREL and Panasonic | Energy Systems Integration Facility | NREL

    Science.gov Websites

    The tool combines NREL's building energy system models with distribution system modeling for the first time, and Panasonic will perform cost-benefit analyses.

  15. QuTiP: An open-source Python framework for the dynamics of open quantum systems

    NASA Astrophysics Data System (ADS)

    Johansson, J. R.; Nation, P. D.; Nori, Franco

    2012-08-01

    We present an object-oriented open-source framework for solving the dynamics of open quantum systems written in Python. Arbitrary Hamiltonians, including time-dependent systems, may be built up from operators and states defined by a quantum object class, and then passed on to a choice of master equation or Monte Carlo solvers. We give an overview of the basic structure for the framework before detailing the numerical simulation of open system dynamics. Several examples are given to illustrate the build up to a complete calculation. Finally, we measure the performance of our library against that of current implementations. The framework described here is particularly well suited to the fields of quantum optics, superconducting circuit devices, nanomechanics, and trapped ions, while also being ideal for use in classroom instruction. Catalogue identifier: AEMB_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 16 482 No. of bytes in distributed program, including test data, etc.: 213 438 Distribution format: tar.gz Programming language: Python Computer: i386, x86-64 Operating system: Linux, Mac OSX, Windows RAM: 2+ Gigabytes Classification: 7 External routines: NumPy (http://numpy.scipy.org/), SciPy (http://www.scipy.org/), Matplotlib (http://matplotlib.sourceforge.net/) Nature of problem: Dynamics of open quantum systems. Solution method: Numerical solutions to Lindblad master equation or Monte Carlo wave function method. Restrictions: Problems must meet the criteria for using the master equation in Lindblad form. Running time: A few seconds up to several tens of minutes, depending on size of underlying Hilbert space.
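
    A minimal usage sketch in the style the paper describes: operators and states are combined into a Hamiltonian and a collapse operator, then passed to the Lindblad master-equation solver. The damped-oscillator parameters are made up.

        import numpy as np
        from qutip import basis, destroy, mesolve

        N = 20                          # Fock-space truncation
        a = destroy(N)
        H = 2 * np.pi * a.dag() * a     # harmonic oscillator Hamiltonian
        kappa = 0.2                     # photon loss rate
        psi0 = basis(N, 5)              # start with five excitations
        tlist = np.linspace(0.0, 25.0, 101)

        result = mesolve(H, psi0, tlist, c_ops=[np.sqrt(kappa) * a], e_ops=[a.dag() * a])
        print("final photon number:", result.expect[0][-1])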

  17. Characterization of an aluminum alloy hemispherical shell fabricated via direct metal laser melting

    DOE PAGES

    Holesinger, T. G.; Carpenter, J. S.; Lienert, T. J.; ...

    2016-01-11

    The ability of additive manufacturing to directly fabricate complex shapes provides characterization challenges for part qualification. The orientation of the microstructures produced by these processes will change relative to the surface normal of a complex part. In this work, the microscopy and x-ray tomography of an AlSi10Mg alloy hemispherical shell fabricated using powder bed metal additive manufacturing are used to illustrate some of these challenges. The shell was manufactured using an EOS M280 system in combination with EOS-specified powder and process parameters. The layer-by-layer process of building the shell with the powder bed additive manufacturing approach results in a position-dependentmore » microstructure that continuously changes its orientation relative to the shell surface normal. X-ray tomography was utilized to examine the position-dependent size and distribution of porosity and surface roughness in the 98.6% dense part. Optical and electron microscopy were used to identify global and local position-dependent structures, grain morphologies, chemistry, and precipitate sizes and distributions. The rapid solidification processes within the fusion zone (FZ) after the laser transit results in a small dendrite size. Cell spacings taken from the structure in the middle of the FZ were used with published relationships to estimate a cooling rate of ~9 × 10 5 K/s. Uniformly-distributed, nanoscale Si precipitates were found within the primary α-Al grains. A thin, distinct boundary layer containing larger α-Al grains and extended regions of the nanocrystalline divorced eutectic material surrounds the FZ. Moreover, subtle differences in the composition between the latter layer and the interior of the FZ were noted with scanning transmission electron microscopy (STEM) spectral imaging.« less

  18. Effect of energy-efficient measures in building construction on indoor radon in Russia.

    PubMed

    Vasilyev, A; Yarmoshenko, I

    2017-04-28

    The effect of the implementation of energy-efficiency measures in building construction was studied. The analysis includes a study of indoor radon in energy-efficient buildings in Ekaterinburg, Russia, and the results of radiation measurements in 83 regions of Russia conducted within the regional programmes. A forecast distribution of radon concentration in Ekaterinburg was built with regard to the city development programme. With Ekaterinburg taken as a representative case, a forecast distribution of radon concentration in Russia in 2030 was built. In comparison with 2000, the average radon concentration increases by a factor of 1.42 by 2030, and the percentage above the reference level of 300 Bq/m3 increases by a factor of 4. It is necessary to take such an increase seriously and to prepare appropriate measures for optimization of protection against indoor radon. Despite the high uncertainty, the reconstructed distribution of radon concentration can be applied to justify measures to be incorporated in a radon mitigation strategy. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
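
    An illustrative calculation of the kind of quantity reported above, assuming indoor radon concentrations follow a lognormal distribution (a common assumption; the paper's own reconstructed distributions are not reproduced here). The geometric means and geometric standard deviation below are made up.

        import numpy as np
        from scipy.stats import lognorm

        def radon_stats(geometric_mean, geometric_sd):
            dist = lognorm(s=np.log(geometric_sd), scale=geometric_mean)
            return dist.mean(), dist.sf(300.0)       # arithmetic mean, P(C > 300 Bq/m3)

        for label, gm, gsd in [("year 2000 (hypothetical)", 40.0, 2.2),
                               ("year 2030 (hypothetical)", 57.0, 2.2)]:
            mean, frac = radon_stats(gm, gsd)
            print(f"{label}: mean = {mean:.0f} Bq/m3, share above 300 Bq/m3 = {100 * frac:.2f}%")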

  19. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2012-04-01

    In this presentation interventions on seismically vulnerable early reinforced concrete skeleton buildings, from the interwar time, at different performance levels, from avoiding collapse up to assuring immediate post-earthquake functionality are considered. Between these two poles there are degrees of damage depending on the performance aim set. The costs of the retrofit and post-earthquake repair differ depending on the targeted performance. Not only an earthquake has impact on a heritage building, but also the retrofit measure, for example on its appearance or its functional layout. This way criteria of the structural engineer, the investor, the architect/conservator/urban planner and the owner/inhabitants from the neighbourhood are considered for taking a benefit-cost decision. Benefit-cost analysis based decision is an element in a risk management process. A solution must be found on how much change to accept for retrofit and how much repairable damage to take into account. There are two impact studies. Numerical simulation was run for the building typology considered for successive earthquakes, selected in a deterministic way (1977, 1986 and two for 1991 from Vrancea, Romania and respectively 1978 Thessaloniki, Greece), considering also the case when retrofit is done between two earthquakes. The typology of buildings itself was studied not only for Greece and Romania, but for numerous European countries, including Italy. The typology was compared to earlier reinforced concrete buildings, with Hennebique system, in order to see to which amount these can belong to structural heritage and to shape the criteria of the architect/conservator. Based on the typology study two model buildings were designed, and for one of these different retrofit measures (side walls, structural walls, steel braces, steel jacketing) were considered, while for the other one of these retrofit techniques (diagonal braces, which permits adding also active measures such as energy dissipaters) to different amount and location in the building was considered. Device computations, a civil engineering method for building economics (and which was, before statistics existed, also the method for computing the costs of general upgrade of buildings), were done for the retrofit and for the repair measures, being able to be applied for different countries, also ones where there is no database on existing projects in seismic retrofit. The building elements for which the device computations were done are named "retrofit elements" and they can be new elements, modified elements or replaced elements of the initial building. The addition of the devices is simple, as the row in project management was, but, for the sake of comparison, also complex project management computed in other works was compared for innovative measures such as FRP (with glass and fibre). The theoretical costs for model measures were compared to the way costs of real retrofit for this building type (with reinforced concrete jacketing and FRP) are computed in Greece. The theoretical proposed measures were generally compared to those applied in practice, in Romania and Italy as well. A further study will include these, as in Italy diagonal braces with dissipation had been used. The typology of braces is relevant also for the local seismic culture, maybe outgoing for another type of skeleton structures the distribution of which has been studied: the timber skeleton. A subtype of Romanian reinforced concrete skeleton buildings includes diagonal braces. 
In order to assess the costs of rebuilding or of a general upgrade without retrofit, architectural methods for building economics based on floor surface are considered. Diagrams have been built to show how the total cost varies as the sum of the preventive retrofit and the post-earthquake repair, and tables to compare it with the cost of rebuilding, starting from the model of adding day-lighting in atria of buildings. The moment at which a repair measure has to be applied, as a function of the recurrence period of earthquakes, plays a role analogous to the depth of the atria. Depending on how strong the expected earthquake is, a more extensive retrofit is required in order to decrease repair costs. A further study would allow converting the device computations into floor-surface costs, not only to implement them in an ICT environment by means of an ontology and BIM, but also to convert them to the urban scale. For the latter, probabilistic application of structural mechanics models instead of observation-based statistics can be considered. First, however, the socio-economic models of construction management games will be considered, both computer games and hard-copy board games, starting with SimCity, which initially included the 1906 San Francisco earthquake, in order to see how the resources needed can be modelled. All criteria together build the taxonomy of the decision. Among them, different ways to carry out the cost-benefit analysis exist, from a weighted tree to pair-wise comparison. The taxonomy was modelled as a decision tree, which builds the basis for an ontology.
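    As an illustration of how the total-cost comparison described above can be set up, the following minimal sketch sums a preventive retrofit cost and an expected post-earthquake repair cost for each retrofit option and compares the result with rebuilding; all figures and option names are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch only: the cost figures and damage fractions below are
# hypothetical placeholders, not values from the study.

REBUILD_COST_PER_M2 = 1500.0   # assumed rebuilding cost (EUR/m2)

# Hypothetical retrofit options: (upfront retrofit cost per m2,
# expected repair cost per m2 after the design earthquake)
options = {
    "no retrofit":      (0.0,   900.0),
    "steel braces":     (180.0, 300.0),
    "structural walls": (260.0, 150.0),
    "jacketing + FRP":  (340.0,  80.0),
}

def total_cost(retrofit, repair, floor_area_m2):
    """Total cost = preventive retrofit + expected post-earthquake repair."""
    return (retrofit + repair) * floor_area_m2

area = 1200.0  # floor surface of the model building, m2
for name, (retrofit, repair) in options.items():
    cost = total_cost(retrofit, repair, area)
    verdict = "cheaper" if cost < REBUILD_COST_PER_M2 * area else "dearer"
    print(f"{name:18s} total {cost:10.0f} EUR ({verdict} than rebuilding)")
```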

  20. Performance related issues in distributed database systems

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    The key elements of research performed during the year-long effort of this project are: investigating the effects of heterogeneity in distributed real-time systems; studying the requirements to TRAC towards building a heterogeneous database system; studying the effects of performance modeling on distributed database performance; and experimenting with an ORACLE-based heterogeneous system.

  1. Distributed Leadership with the Aim of "Reculturing": A Departmental Case Study

    ERIC Educational Resources Information Center

    Melville, Wayne; Jones, Doug; Campbell, Todd

    2014-01-01

    This article considers a secondary science department that has, since 2000, developed distributed leadership as a form of human capacity building. Using a longitudinal ethnographic case study allowed us to consider how distributed leadership can be nurtured and developed in a department. Our analysis centres on two key issues: the nature and…

  2. Leveraging Fourth and Sixth Graders' Experiences to Reveal Understanding of the Forms and Features of Distributed Causality

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Derbiszewska, Katarzyna; Solis, S. Lynneth

    2017-01-01

    Research has focused on students' difficulties understanding phenomena in which agency is distributed across actors whose individual-level behaviors converge to result in collective outcomes. Building on Levy and Wilensky (2008), this study identified features of distributed causality students understand and that may offer affordances for…

  3. 40 CFR 35.2040 - Grant application.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...

  4. 40 CFR 35.2040 - Grant application.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...

  5. 40 CFR 35.2040 - Grant application.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...

  6. 40 CFR 35.2040 - Grant application.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...

  7. 40 CFR 35.2040 - Grant application.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...

  8. Teach Students about Civics through Schoolwide Governance

    ERIC Educational Resources Information Center

    Brasof, Marc; Spector, Anne

    2016-01-01

    Building democracies in K-8 schools is a promising approach to increasing young people and educators' civic knowledge, skills and dispositions. The Rendell Center for Civics and Civics Engagement leveraged strategies and concepts from the fields of civic education, student voice, and distributed leadership to build a youth-adult school governance…

  9. 49 CFR 1242.26 - Miscellaneous building and structures (account XX-19-28).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Miscellaneous building and structures (account XX... XX-19-28). Separate common expenses as specific facts indicate or according to distribution of common expenses listed in § 1242.10, Administration-Track (account XX-19-02). ...

  10. 45 CFR 73.735-305 - Conduct in Federal buildings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Conduct in Federal buildings. 73.735-305 Section 73.735-305 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF... contributions, engage in commercial soliciting and vending, display or distribute commercial advertisements, or...

  11. 45 CFR 73.735-305 - Conduct in Federal buildings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Conduct in Federal buildings. 73.735-305 Section 73.735-305 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION STANDARDS OF... contributions, engage in commercial soliciting and vending, display or distribute commercial advertisements, or...

  12. 45 CFR 73.735-305 - Conduct in Federal buildings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Conduct in Federal buildings. 73.735-305 Section 73.735-305 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION STANDARDS OF... contributions, engage in commercial soliciting and vending, display or distribute commercial advertisements, or...

  13. 16 CFR 802.5 - Acquisitions of investment rental property assets.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .../entertainment complex which it will rent to professional sports teams and promoters of special events for... warehouse/distribution center, a retail tire and automobile parts store, an office building, and a small... unavailable. The exemptions in § 802.2 for warehouses, rental retail space, office buildings, and undeveloped...

  14. 16 CFR 802.5 - Acquisitions of investment rental property assets.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .../entertainment complex which it will rent to professional sports teams and promoters of special events for... warehouse/distribution center, a retail tire and automobile parts store, an office building, and a small... unavailable. The exemptions in § 802.2 for warehouses, rental retail space, office buildings, and undeveloped...

  15. 16 CFR 802.5 - Acquisitions of investment rental property assets.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .../entertainment complex which it will rent to professional sports teams and promoters of special events for... warehouse/distribution center, a retail tire and automobile parts store, an office building, and a small... unavailable. The exemptions in § 802.2 for warehouses, rental retail space, office buildings, and undeveloped...

  16. 16 CFR 802.5 - Acquisitions of investment rental property assets.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .../entertainment complex which it will rent to professional sports teams and promoters of special events for... warehouse/distribution center, a retail tire and automobile parts store, an office building, and a small... unavailable. The exemptions in § 802.2 for warehouses, rental retail space, office buildings, and undeveloped...

  17. 16 CFR 802.5 - Acquisitions of investment rental property assets.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .../entertainment complex which it will rent to professional sports teams and promoters of special events for... warehouse/distribution center, a retail tire and automobile parts store, an office building, and a small... unavailable. The exemptions in § 802.2 for warehouses, rental retail space, office buildings, and undeveloped...

  18. 15. BUILDING NO. 445, PHYSICS LAB (FORMERLY GUN BAG LOADING), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. BUILDING NO. 445, PHYSICS LAB (FORMERLY GUN BAG LOADING), INTERIOR, FOURTH LEVEL. POWDER HOPPER AT TOP OF ELEVATOR SHAFT. POWDER DISTRIBUTED FROM HERE TO LOADING ROOMS BY TUBES. - Picatinny Arsenal, 400 Area, Gun Bag Loading District, State Route 15 near I-80, Dover, Morris County, NJ

  19. Three Precast Buildings from the Office of Marcel Breuer and Associates

    ERIC Educational Resources Information Center

    Architectural Record, 1973

    1973-01-01

    Two campus structures -- a student-faculty center at the University of Massachusetts and a technological complex at New York University -- and a downtown high-rise office building in Cleveland all have precast panel walls, which play a major role in the mechanical and electrical distribution systems. (Author)

  20. ASSESSMENT OF VENTILATION SYSTEMS AS RELATED TO THE BASE STUDY

    EPA Science Inventory

    The U.S. EPA's Office of Radiation and Indoor Air studied 100 public and private office buildings across the U.S. from 1994 1998. The purpose of the study, entitled the Building Assessment Survey and Evaluation (BASE) Study, was to: (a) provide a distribution of indoor air quali...

  1. Creating an Organic Knowledge-Building Environment within an Asynchronous Distributed Learning Context.

    ERIC Educational Resources Information Center

    Moller, Leslie; Prestera, Gustavo E.; Harvey, Douglas; Downs-Keller, Margaret; McCausland, Jo-Ann

    2002-01-01

    Discusses organic architecture and suggests that learning environments should be designed and constructed using an organic approach, so that learning is not viewed as a distinct human activity but incorporated into everyday performance. Highlights include an organic knowledge-building model; information objects; scaffolding; discourse action…

  2. From Maximum Entropy Models to Non-Stationarity and Irreversibility

    NASA Astrophysics Data System (ADS)

    Cofre, Rodrigo; Cessac, Bruno; Maldonado, Cesar

    The maximum entropy distribution can be obtained from a variational principle. This is important as a matter of principle and for the purpose of finding approximate solutions. One can exploit this fact to obtain relevant information about the underlying stochastic process. We report here on recent progress in three aspects of this approach. (1) Biological systems are expected to show some degree of irreversibility in time. Based on the transfer matrix technique to find the spatio-temporal maximum entropy distribution, we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. (2) The maximum entropy solution is characterized by a functional called the Gibbs free energy (the solution of the variational principle). The Legendre transformation of this functional is the rate function, which controls the speed of convergence of empirical averages to their ergodic mean. We show how the correct description of this functional is decisive for a more rigorous characterization of first and higher order phase transitions. (3) We assess the impact of a weak time-dependent external stimulus on the collective statistics of spiking neuronal networks. We show how to evaluate this impact on any higher order spatio-temporal correlation. RC supported by ERC advanced Grant ``Bridges'', BC: KEOPS ANR-CONICYT, Renvision and CM: CONICYT-FONDECYT No. 3140572.
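    As a minimal illustration of obtaining a maximum entropy distribution from the variational principle, the sketch below minimizes the Legendre dual (log-partition function minus the constraint term) for a single mean constraint on a finite state space; the state space and target mean are assumptions for the example and have nothing to do with the spiking-network data discussed in the abstract.

```python
# Minimal illustration (not the authors' transfer-matrix code): the maximum
# entropy distribution over a finite state space subject to a mean constraint,
# obtained by minimizing the dual free energy F(lam) = log Z(lam) - lam * mean.
import numpy as np
from scipy.optimize import minimize_scalar

states = np.arange(1, 7)        # e.g. faces of a die (assumed state space)
target_mean = 4.5               # assumed observed average

def free_energy(lam):
    logZ = np.log(np.exp(lam * states).sum())
    return logZ - lam * target_mean

res = minimize_scalar(free_energy)          # Legendre-dual minimization (convex)
lam = res.x
p = np.exp(lam * states)
p /= p.sum()                                # resulting Gibbs distribution
print("lambda =", round(lam, 4))
print("max-entropy probabilities:", np.round(p, 4))
print("check mean:", round((p * states).sum(), 4))
```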

  3. Gaseous Elemental Mercury and Total and Leached Mercury in Building Materials from the Former Hg-Mining Area of Abbadia San Salvatore (Central Italy)

    PubMed Central

    Vaselli, Orlando; Nisi, Barbara; Rappuoli, Daniele; Cabassi, Jacopo; Tassi, Franco

    2017-01-01

    Mercury has a strong environmental impact since both its organic and inorganic forms are toxic, and it represents a pollutant of global concern. Liquid Hg is highly volatile and can be released during natural and anthropogenic processes in the hydrosphere, biosphere and atmosphere. In this study, the distribution of Gaseous Elemental Mercury (GEM) and the total and leached mercury concentrations on paint, plaster, roof tiles, concrete, metals, dust and wood structures were determined in the main buildings and structures of the former Hg-mining area of Abbadia San Salvatore (Siena, Central Italy). The mining complex (divided into seven units) covers a surface of about 65 ha and contains mining structures and managers’ and workers’ buildings. Nine surveys of GEM measurements were carried out from July 2011 to August 2015 for the buildings and structures located in Units 2, 3 and 6, the latter being the area where liquid mercury was produced. Measurements were also performed in February, April, July, September and December 2016 in the edifices and mining structures of Unit 6. GEM concentrations showed a strong variability in time and space mostly depending on ambient temperature and the operational activities that were carried out in each building. The Unit 2 surveys carried out in the hotter period (from June to September) showed GEM concentrations up to 27,500 ng·m−3, while in Unit 6, they were on average much higher, and occasionally, they saturated the GEM measurement device (>50,000 ng·m−3). Concentrations of total (in mg·kg−1) and leached (in μg·L−1) mercury measured in different building materials (up to 46,580 mg·kg−1 and 4470 mg·L−1, respectively) were highly variable, being related to the edifice or mining structure from which they were collected. The results obtained in this study are of relevant interest for operational cleanings to be carried out during reclamation activities. PMID:28420130

  4. Evaluation of tsunami risk in Heraklion city, Crete, Greece, by using GIS methods

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioanna; Fokaefs, Anna; Novikova, Tatyana; Papadopoulos, Gerasimos A.; Vaitis, Michalis

    2016-04-01

    The Hellenic Arc is the most active seismotectonic structure in the Mediterranean region. The island of Crete occupies the central segment of the arc which is characterized by high seismic and tsunami activity. Several tsunamis generated by large earthquakes, volcanic eruptions and landslides were reported that hit the capital city of Heraklion in the historical past. We focus our tsunami risk study in the northern coastal area of Crete (ca. 6 km in length and 1 km in maximum width) which includes the western part of the city of Heraklion and a large part of the neighboring municipality of Gazi. The evaluation of tsunami risk included calculations and mapping with QGIS of (1) cost for repairing buildings after tsunami damage, (2) population exposed to tsunami attack, (3) optimum routes and times for evacuation. To calculate the cost for building reparation after a tsunami attack we have determined the tsunami inundation zone in the study area after numerical simulations for extreme tsunami scenarios. The geographical distribution of buildings per building block, obtained from the 2011 census data of the Hellenic Statistical Authority (EL.STAT) and satellite data, was mapped. By applying the SCHEMA Damage Tool we assessed the building vulnerability to tsunamis according to the types of buildings and their expected damage from the hydrodynamic impact. A set of official cost rates varying with the building types and the damage levels, following standards set by the state after the strong damaging earthquakes in Greece in 2014, was applied to calculate the cost of rebuilding or repairing buildings damaged by the tsunami. In the investigation of the population exposed to tsunami inundation we have used the interpolation method to smooth out the population geographical distribution per building block within the inundation zone. Then, the population distribution was correlated with tsunami hydrodynamic parameters in the inundation zone. The last approach of tsunami risk assessment refers to the selection of optimal routes and times needed for evacuation from certain points within the inundation zone to a number of shelters outside the zone. The three different approaches were evaluated as for their overall contribution in the development of a plan for the tsunami risk mitigation. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.
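    A minimal sketch of the cost-aggregation step described above (damage-dependent cost rates multiplied by building floor areas and summed per building block); the building types, damage levels, cost rates and inventory below are hypothetical, not the SCHEMA classes or the official Greek rates used in the study.

```python
# Illustrative sketch of the repair-cost step only (hypothetical types, damage
# levels and cost rates; the study uses the SCHEMA damage tool and official rates).
cost_rate = {                      # EUR per m2, keyed by (building type, damage level)
    ("masonry",  "partial"):  250.0,
    ("masonry",  "collapse"): 900.0,
    ("rc_frame", "partial"):  180.0,
    ("rc_frame", "collapse"): 750.0,
}

buildings = [                      # hypothetical inventory of one building block
    {"type": "masonry",  "damage": "partial",  "floor_area_m2": 140.0},
    {"type": "rc_frame", "damage": "collapse", "floor_area_m2": 320.0},
    {"type": "rc_frame", "damage": "partial",  "floor_area_m2": 210.0},
]

block_cost = sum(
    cost_rate[(b["type"], b["damage"])] * b["floor_area_m2"] for b in buildings
)
print(f"estimated repair/rebuild cost for this block: {block_cost:,.0f} EUR")
```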

  5. The Theory Question in Research Capacity Building in Education: Towards an Agenda for Research and Practice

    ERIC Educational Resources Information Center

    Biesta, Gert; Allan, Julie; Edwards, Richard

    2011-01-01

    The question of capacity building in education has predominantly been approached with regard to the methods and methodologies of educational research. Far less attention has been given to capacity building in relation to theory. In many ways the latter is as pressing an issue as the former, given that good research depends on a combination of high…

  6. Estimating foundation water vapor release using a simple moisture balance and AIM-2 : case study of a contemporary wood-frame house

    Treesearch

    C. R. Boardman; Samuel V. Glass; Charles G. Carll

    2010-01-01

    Proper management of indoor humidity in buildings is an essential aspect of durability. Following dissipation of moisture from construction materials, humidity levels during normal operation are generally assumed to primarily depend on the building volume, the number of building occupants and their behavior, the air exchange rate, and the water vapor content of outdoor...

  7. Nucleotide pools dictate the identity and frequency of ribonucleotide incorporation in mitochondrial DNA.

    PubMed

    Berglund, Anna-Karin; Navarrete, Clara; Engqvist, Martin K M; Hoberg, Emily; Szilagyi, Zsolt; Taylor, Robert W; Gustafsson, Claes M; Falkenberg, Maria; Clausen, Anders R

    2017-02-01

    Previous work has demonstrated the presence of ribonucleotides in human mitochondrial DNA (mtDNA) and in the present study we use a genome-wide approach to precisely map the location of these. We find that ribonucleotides are distributed evenly between the heavy- and light-strand of mtDNA. The relative levels of incorporated ribonucleotides reflect that DNA polymerase γ discriminates the four ribonucleotides differentially during DNA synthesis. The observed pattern is also dependent on the mitochondrial deoxyribonucleotide (dNTP) pools and disease-causing mutations that change these pools alter both the absolute and relative levels of incorporated ribonucleotides. Our analyses strongly suggest that DNA polymerase γ-dependent incorporation is the main source of ribonucleotides in mtDNA and argues against the existence of a mitochondrial ribonucleotide excision repair pathway in human cells. Furthermore, we clearly demonstrate that when dNTP pools are limiting, ribonucleotides serve as a source of building blocks to maintain DNA replication. Increased levels of embedded ribonucleotides in patient cells with disturbed nucleotide pools may contribute to a pathogenic mechanism that affects mtDNA stability and impair new rounds of mtDNA replication.

  8. The solution of private problems for optimization heat exchangers parameters

    NASA Astrophysics Data System (ADS)

    Melekhin, A.

    2017-11-01

    The relevance of the topic stems from the problem of resource economy in building heating systems. To address it, we have developed an integrated research method that allows optimization of heat exchanger parameters. The method solves a multicriteria optimization problem with nonlinear programming software, taking as input an array of temperatures obtained by thermography. The author has developed a mathematical model of the heat exchange process on the heat-transfer surfaces of the apparatus, solved the multicriteria optimization problem, and checked its adequacy against an experimental stand with visualization of the thermal fields. The results give an optimal range of controlled parameters influencing the heat exchange process for minimal metal consumption and maximum heat output of the finned heat exchanger, generalized relationships for the temperature distribution on the heat-release surface of the heat exchanger, and the convergence between results computed from the theoretical dependencies and from the solution of the mathematical model.

  9. Phenomenology from SIDIS and e+e- multiplicities: multiplicities and phenomenology - part I

    NASA Astrophysics Data System (ADS)

    Bacchetta, Alessandro; Echevarria, Miguel G.; Radici, Marco; Signori, Andrea

    2015-01-01

    This study is part of a project to investigate the transverse momentum dependence in parton distribution and fragmentation functions, analyzing (semi-)inclusive high-energy processes within a proper QCD framework. We calculate the transverse-momentum-dependent (TMD) multiplicities for e+e- annihilation into two hadrons (considering different combinations of pions and kaons) aiming to investigate the impact of intrinsic and radiative partonic transverse momentum and their mixing with flavor. Different descriptions of the non-perturbative evolution kernel (see, e.g., Refs. [1-5]) are available on the market and there are 200 sets of flavor configurations for the unpolarized TMD fragmentation functions (FFs) resulting from a Monte Carlo fit of Semi-Inclusive Deep-Inelastic Scattering (SIDIS) data at Hermes (see Ref. [6]). We build our predictions of e+e- multiplicities relying on this rich phenomenology. The comparison of these calculations with future experimental data (from Belle and BaBar collaborations) will shed light on non-perturbative aspects of hadron structure, opening important insights into the physics of spin, flavor and momentum structure of hadrons.

  10. Experimental study on water content detection of traditional masonry based on infrared thermal image

    NASA Astrophysics Data System (ADS)

    Zhang, Baoqing; Lei, Zukang

    2017-10-01

    Infrared thermal imaging was used in seepage tests on two kinds of brick masonry. The relationship between the one-dimensional surface temperature distribution and the one-dimensional surface moisture content of the two masonries was determined after seepage, together with a method for locating the minimum-temperature zone and the point of highest water content from the regression equation. The regression between temperature and moisture content expresses the moisture state of the brick masonry quantitatively and establishes an initial analysis method for moisture-related deterioration of masonry buildings; the infrared technique is then applied to the protection of historic buildings.
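    A minimal sketch of the regression step linking surface temperature to moisture content; the calibration pairs are assumed values for illustration, not the measurements reported in the paper.

```python
# Minimal sketch of the regression step (hypothetical calibration data): fit
# moisture content against IR surface temperature and use the fit to estimate
# moisture at a new temperature reading.
import numpy as np

temperature_c = np.array([14.2, 15.0, 15.8, 16.5, 17.3, 18.1])  # assumed IR readings
moisture_pct  = np.array([18.5, 15.9, 13.2, 11.0,  8.4,  6.1])  # assumed reference values

slope, intercept = np.polyfit(temperature_c, moisture_pct, deg=1)
print(f"moisture ≈ {slope:.2f} * T + {intercept:.2f}")

t_new = 16.0
print(f"estimated moisture at {t_new} °C: {slope * t_new + intercept:.1f} %")
```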

  11. Time evolution of pore system in lime - Pozzolana composites

    NASA Astrophysics Data System (ADS)

    Doleželová, Magdaléna; Čáchová, Monika; Scheinherrová, Lenka; Keppert, Martin

    2017-11-01

    Lime-pozzolana mortars and plasters are used in restoration works on built cultural heritage, but these materials also follow the trend towards energy-efficient solutions in civil engineering. Porosity and pore size distribution are among the crucial parameters influencing the engineering properties of porous materials. The pore size distribution of lime-based systems changes over time due to chemical processes occurring in the material. The present paper describes the time evolution of the pore system in lime-pozzolana composites; the results are useful for predicting the performance of lime-pozzolana systems in building structures.

  12. USE OF MODELS FOR GAMMA SHIELDING STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clifford, C.E.

    1962-02-01

    The use of models for shielding studies of buildings exposed to gamma radiation was evaluated by comparing the dose distributions produced in a blockhouse with movable inside walls, exposed to 0.66 MeV gamma radiation, with the corresponding distributions in an iron 1:10 scale model. The effects of air and ground scaling on the readings in the model were also investigated. Iron appeared to be a suitable model material for simple closed buildings, but for more complex structures it appeared that the use of iron models would progressively overestimate the gamma shielding protection as the complexity increased. (auth)

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auslander, David; Culler, David; Wright, Paul

    The goal of the 2.5 year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (openADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load during the study period was 1175 kW. Several new tools facilitated this work, such as the Smart Energy Box, the distributed load controller or Energy Information Gateway, the web-based DR controller (dubbed the Central Load-Shed Coordinator or CLSC), and the Demand Response Capacity Assessment & Operation Assistance Tool (DRCAOT). In addition, an innovative data aggregator called sMAP (simple Measurement and Actuation Profile) allowed data from different sources to be collected in a compact form and facilitated detailed analysis of the building systems' operation. A smart phone application (RAP or Rapid Audit Protocol) facilitated an inventory of the building's plug loads. Carbon dioxide sensors located in conference rooms and classrooms allowed demand controlled ventilation. The extensive submetering and nimble access to this data provided great insight into the details of the building operation as well as quick diagnostics and analyses of tests. For example, students discovered a short-cycling chiller, a stuck damper, and a leaking cooling coil in the first field tests. For our final field tests, we were able to see how each zone was affected by the DR strategies (e.g., the offices on the 7th floor grew very warm quickly) and fine-tune the strategies accordingly.
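    As a hedged sketch of the kind of decision a central load-shed coordinator makes during a demand-response event (not the actual DIADR/CLSC implementation), the following picks sheddable measures in priority order until a requested peak reduction is reached; the load inventory, priorities and kW values are hypothetical.

```python
# Hedged sketch (not the DIADR/CLSC code): given a DR event target, pick
# sheddable measures in priority order until the requested reduction is met.
def plan_shed(loads, target_kw):
    """loads: list of (name, sheddable_kw, priority); lower priority sheds first."""
    plan, shed = [], 0.0
    for name, kw, _ in sorted(loads, key=lambda l: l[2]):
        if shed >= target_kw:
            break
        plan.append(name)
        shed += kw
    return plan, shed

loads = [("plug-load circuits", 40.0, 1),              # hypothetical inventory
         ("open-office lighting dim 30%", 60.0, 2),
         ("global temperature setpoint +2F", 270.0, 3),
         ("nanofab HVAC (excluded)", 0.0, 9)]

peak_kw = 1175.0
plan, shed = plan_shed(loads, target_kw=0.30 * peak_kw)   # 30% reduction goal
print(plan, f"-> {shed:.0f} kW shed of {0.30 * peak_kw:.0f} kW target")
```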

  14. Thermal modelling of normal distributed nanoparticles through thickness in an inorganic material matrix

    NASA Astrophysics Data System (ADS)

    Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.

    2017-10-01

    Nanoscale materials showing superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion in order to reduce or even eliminate thermal bridges; doing so makes it possible to reach an industrially relevant process that balances the high material cost against the superior thermal insulation properties. Therefore, a methodology is required to measure and describe these nanoscale distributions within the inorganic matrix material. These distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy generates a thermal model allowing prediction of the thermal behavior of the nanoscale particles and their distributions. The thermal model will be validated by the hot wire technique. For the moment, a good correlation is found between the numerical results and experimental data for nanoparticles randomly distributed in all directions.

  15. Proceedings of the workshop on the dynamic response of environmental control processes in buildings, Lafayette, Indiana, March 13-15, 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tree, D.R.; McBride, M.F.

    The purpose of this workshop was to consider how energy use in buildings can be reduced while maintaining the comfort of the occupants. It was postulated that optimization of energy use in buildings can be achieved through the use of operating strategies which consider the dynamic characteristics of comfort, the design and construction of the building, and the environmental control system. Working sessions were presented on equipment, controls, structures, human factors, circulation/distribution, design, operation and use pattern, management and codes, and energy storage. (LCL)

  16. Final Report. Montpelier District Energy Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Jessie; Motyka, Kurt; Aja, Joe

    2015-03-30

    The City of Montpelier, in collaboration with the State of Vermont, developed a central heat plant fueled with locally harvested wood-chips and a thermal energy distribution system. The project provides renewable energy to heat a complex of state buildings and a mix of commercial, private and municipal buildings in downtown Montpelier. The State of Vermont operates the central heat plant and the system to heat the connected state buildings. The City of Montpelier accepts energy from the central heat plant and operates a thermal utility to heat buildings in downtown Montpelier which elected to take heat from the system.

  17. Energy Signal Tool for Decision Support in Building Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henze, G. P.; Pavlak, G. S.; Florita, A. R.

    2014-12-01

    A prototype energy signal tool is demonstrated for operational whole-building and system-level energy use evaluation. The purpose of the tool is to give a summary of building energy use which allows a building operator to quickly distinguish normal and abnormal energy use. Toward that end, energy use status is displayed as a traffic light, which is a visual metaphor for energy use that is either substantially different from expected (red and yellow lights) or approximately the same as expected (green light). Which light to display for a given energy end use is determined by comparing expected to actual energy use. Because expected energy use is necessarily uncertain, we cannot choose the appropriate light with certainty. Instead, the energy signal tool chooses the light by minimizing the expected cost of displaying the wrong light. The expected energy use is represented by a probability distribution. Energy use is modeled by a low-order lumped parameter model. Uncertainty in energy use is quantified by a Monte Carlo exploration of the influence of model parameters on energy use. Distributions over model parameters are updated over time via Bayes' theorem. The simulation study was devised to assess whole-building energy signal accuracy in the presence of uncertainty and faults at the submetered level, which may lead to tradeoffs at the whole-building level that are not detectable without submetering.
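    A minimal sketch of the signal-selection logic described above: given samples of expected energy use and an assumed misclassification cost matrix, display the light with the lowest expected cost. The thresholds, costs and distribution parameters are assumptions for illustration, not the tool's calibrated values.

```python
# Minimal sketch of the traffic-light decision (hypothetical costs and
# distribution; the tool itself models energy use and updates parameters via
# Bayes' theorem): pick the light that minimizes expected misclassification cost.
import numpy as np

rng = np.random.default_rng(0)
expected_kwh = rng.normal(1000.0, 80.0, size=10_000)  # samples of expected use
actual_kwh = 1180.0                                    # metered value

ratio = actual_kwh / expected_kwh
dev = np.abs(ratio - 1.0)
p_green  = np.mean(dev <= 0.05)                        # roughly as expected
p_yellow = np.mean((dev > 0.05) & (dev <= 0.15))
p_red    = 1.0 - p_green - p_yellow
p_state = np.array([p_green, p_yellow, p_red])

# cost[i, j] = cost of showing light i when the true state is j (assumed values)
cost = np.array([[0, 2, 10],
                 [1, 0,  3],
                 [5, 1,  0]])
expected_cost = cost @ p_state
lights = ["green", "yellow", "red"]
print("display:", lights[int(np.argmin(expected_cost))])
```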

  18. A Petri Net model for distributed energy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konopko, Joanna

    2015-12-31

    Electrical networks need to evolve to become more intelligent, more flexible and less costly. The smart grid, the next generation of the power grid, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid poses challenges for system protection, optimization and energy efficiency. Proper modeling and analysis are needed to build an extensive distributed energy system and an intelligent electricity infrastructure. In this paper, a complete model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN). Simulation of the created model is also explored; it allows analysing how closely the behaviour of the model matches the operation of a real smart grid.

  19. Threshold groundwater ages and young water fractions estimated from 3H, 3He, and 14C

    NASA Astrophysics Data System (ADS)

    Kirchner, James; Jasechko, Scott

    2016-04-01

    It is widely recognized that a water sample taken from a running stream is not described by a single age, but rather by a distribution of ages. It is less widely recognized that the same principle holds true for groundwaters, as indicated by the commonly observed discordances between model ages obtained from different tracers (e.g., 3H vs 14C) in the same sample. Water age distributions are often characterized by their mean residence times (MRT's). However, MRT estimates are highly uncertain because they depend on the shape of the assumed residence time distribution (in particular on the thickness of the long-time tail), which is difficult or impossible to constrain with data. Furthermore, because MRT's are typically nonlinear functions of age tracer concentrations, they are subject to aggregation bias. That is, MRT estimates derived from a mixture of waters with different ages (and thus different tracer concentrations) will systematically underestimate the mixture's true mean age. Here, building on recent work with stable isotope tracers in surface waters [1-3], we present a new framework for using 3H, 3He and 14C to characterize groundwater age distributions. Rather than describing groundwater age distributions by their MRT, we characterize them by the fraction of the distribution that is younger or older than a threshold age. The threshold age that separates "young" from "old" water depends on the characteristics of the specific tracer, including its history of atmospheric inputs. Our approach depends only on whether a given slice of the age distribution is younger or older than the threshold age, but not on how much younger or older it is. Thus our approach is insensitive to the tails of the age distribution, and is therefore relatively unaffected by uncertainty in the distribution's shape. Here we show that concentrations of 3H, 3He, and 14C are almost linearly related to the fractions of water that are younger or older than specified threshold ages. These "young" and "old" water fractions are therefore immune to the aggregation bias that afflicts MRT estimates. They are also relatively insensitive to the shape of the assumed residence time distribution. We apply this approach to 3H and 14C measurements from ˜5000 wells in ˜200 aquifers around the world. Our results show that even very old groundwaters, with 14C ages of thousands of years, often contain significant amounts of much younger water, with a substantial fraction of their age distributions younger than ˜100 years old. Thus despite being very old on average, these groundwaters may also be vulnerable to relatively recent contamination. [1] Kirchner J.W., Aggregation in environmental systems: Catchment mean transit times and young water fractions under hydrologic nonstationarity, Hydrology and Earth System Sciences, in press. [2] Kirchner J.W., Aggregation in environmental systems: Seasonal tracer cycles quantify young water fractions, but not mean transit times, in spatially heterogeneous catchments, Hydrology and Earth System Sciences, in press. [3] Jasechko S., Kirchner J.W., Welker J.M., and McDonnell J.J., Substantial young streamflow in global rivers, Nature Geoscience, in press.
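    A minimal worked example of the near-linear relation mentioned above, using a simple two-endmember mixing estimate of the young water fraction; the endmember concentrations are assumed for illustration and are not values from the study.

```python
# Illustrative two-endmember estimate (assumed endmember concentrations): the
# tracer concentration of a mixture is roughly linear in the fraction of water
# younger than the threshold age, so
#   F_young ≈ (C_sample - C_old) / (C_young - C_old).
def young_water_fraction(c_sample, c_young, c_old):
    return (c_sample - c_old) / (c_young - c_old)

# e.g. tritium: modern recharge vs tritium-dead old groundwater (assumed values)
c_young, c_old = 6.0, 0.0        # tritium units
print(young_water_fraction(c_sample=1.5, c_young=c_young, c_old=c_old))  # -> 0.25
```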

  20. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard.

    PubMed

    Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 A, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.

  1. 76 FR 20049 - Notice of Revised Determination on Reconsideration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... Distribution Center, 100 T.G. Vaughan, Jr. Road, Galax, Virginia On December 3, 2010, the Department of Labor... Company, Inc., T.G. Vaughan Distribution Center, 100 T.G. Vaughan, Jr. Road, Galax, Virginia (TA-W-74,551B... to B.C. Vaughan Factory/Chestnut Creek Veneer Building (TA-W-74,551A) and T.G. Vaughan Distribution...

  2. Soft-Input Soft-Output Modules for the Construction and Distributed Iterative Decoding of Code Networks

    NASA Technical Reports Server (NTRS)

    Benedetto, S.; Divsalar, D.; Montorsi, G.; Pollara, F.

    1998-01-01

    Soft-input soft-output building blocks (modules) are presented to construct and iteratively decode in a distributed fashion code networks, a new concept that includes, and generalizes, various forms of concatenated coding schemes.

  3. 49 CFR 1242.22 - Shop buildings-locomotives (account XX-19-24).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Shop buildings-locomotives (account XX-19-24... Structures § 1242.22 Shop buildings—locomotives (account XX-19-24). Separate common expenses according to distribution of common expenses in the following accounts: Machinery Repair (XX-26-40) Locomotive—Repair and...

  4. 31 CFR 12.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance: Treasury 1 2013-07-01 2013-07-01 false Purpose. 12.1 Section 12.1 Money and Finance: Treasury Office of the Secretary of the Treasury RESTRICTION OF SALE AND DISTRIBUTION OF TOBACCO... Minors in Federal Buildings Act,” Public Law 104-52, Section 636, with respect to buildings under the...

  5. 31 CFR 12.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 1 2014-07-01 2014-07-01 false Purpose. 12.1 Section 12.1 Money and Finance: Treasury Office of the Secretary of the Treasury RESTRICTION OF SALE AND DISTRIBUTION OF TOBACCO... Minors in Federal Buildings Act,” Public Law 104-52, Section 636, with respect to buildings under the...

  6. Solar-Heated and Cooled Office Building--Columbus, Ohio

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Final report documents solar-energy system installed in office building to provide space heating, space cooling and domestic hot water. Collectors mounted on roof track Sun and concentrate rays on fluid-circulating tubes. Collected energy is distributed to hot-water-fired absorption chiller and space-heating and domestic-hot-water preheating systems.

  7. Minimum separation distances for natural gas pipeline and boilers in the 300 area, Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daling, P.M.; Graham, T.M.

    1997-08-01

    The U.S. Department of Energy (DOE) is proposing actions to reduce energy expenditures and improve energy system reliability at the 300 Area of the Hanford Site. These actions include replacing the centralized heating system with heating units for individual buildings or groups of buildings, constructing a new natural gas distribution system to provide a fuel source for many of these units, and constructing a central control building to operate and maintain the system. The individual heating units will include steam boilers that are to be housed in individual annex buildings located at some distance away from nearby 300 Area nuclear facilities. This analysis develops the basis for siting the package boilers and natural gas distribution systems to be used to supply steam to 300 Area nuclear facilities. The effects of four potential fire and explosion scenarios involving the boiler and natural gas pipeline were quantified to determine minimum separation distances that would reduce the risks to nearby nuclear facilities. The resulting minimum separation distances are shown in Table ES.1.

  8. Vertical distribution of Aedes mosquitoes in multiple storey buildings in Selangor and Kuala Lumpur, Malaysia.

    PubMed

    Lau, K W; Chen, C D; Lee, H L; Izzul, A A; Asri-Isa, M; Zulfadli, M; Sofian-Azirun, M

    2013-03-01

    The aim of the present study was to determine the vertical distribution and abundance of Aedes mosquitoes in multiple storey buildings in Selangor and Kuala Lumpur, Malaysia. Ovitrap surveillance was conducted for 4 continuous weeks in multiple storey buildings in 4 residential areas located in Selangor [Kg. Baiduri (KB)] and Kuala Lumpur [Student Hostel of University of Malaya (UM), Kg. Kerinchi (KK) and Hang Tuah (HT)]. The results implied that Aedes mosquitoes could be found from the ground floor to the highest floor of multiple storey buildings, and data from different elevations did not show significant differences. The ovitrap index for UM, KB, HT and KK ranged from 0 - 29.17%, 0 - 55.56%, 8.33 - 83.33% and 0 - 91.17%, respectively. Aedes aegypti and Aedes albopictus were found breeding in HT, KK and KB, while only Ae. albopictus was obtained from UM. The results indicate that the invasion of Aedes mosquitoes in high-rise apartments could facilitate the transmission of dengue virus, and new approaches to vector control in this type of residential area should be developed.

  9. Consumer effects on the vital rates of their resource can determine the outcome of competition between consumers.

    PubMed

    Lee, Charlotte T; Miller, Tom E X; Inouye, Brian D

    2011-10-01

    Current competition theory does not adequately address the fact that competitors may affect the survival, growth, and reproductive rates of their resources. Ecologically important interactions in which consumers affect resource vital rates range from parasitism and herbivory to mutualism. We present a general model of competition that explicitly includes consumer-dependent resource vital rates. We build on the classic MacArthur model of competition for multiple resources, allowing direct comparison with expectations from established concepts of resource-use overlap. Consumers share a stage-structured resource population but may use the different stages to different extents, as they do the different independent resources in the classic model. Here, however, the stages are dynamically linked via consumer-dependent vital rates. We show that consumers' effects on resource vital rates result in two important departures from classic results. First, consumers can coexist despite identical use of resource stages, provided each competitor shifts the resource stage distribution toward stages that benefit other species. Second, consumers specializing on different resource stages can compete strongly, possibly resulting in competitive exclusion despite a lack of resource stage-use overlap. Our model framework demonstrates the critical role that consumer-dependent resource vital rates can play in competitive dynamics in a wide range of biological systems.

  10. Optimizing Distributed Energy Resources and building retrofits with the strategic DER-CAModel

    DOE PAGES

    Stadler, M.; Groissböck, M.; Cardoso, G.; ...

    2014-08-05

    The pressing need to reduce the import of fossil fuels as well as the need to dramatically reduce CO2 emissions in Europe motivated the European Commission (EC) to implement several regulations directed to building owners. Most of these regulations focus on increasing the number of energy efficient buildings, both new and retrofitted, since retrofits play an important role in energy efficiency. Overall, this initiative results from the realization that buildings will have a significant impact in fulfilling the 20/20/20-goals of reducing the greenhouse gas emissions by 20%, increasing energy efficiency by 20%, and increasing the share of renewables to 20%, all by 2020. The Distributed Energy Resources Customer Adoption Model (DER-CAM) is an optimization tool used to support DER investment decisions, typically by minimizing total annual costs or CO2 emissions while providing energy services to a given building or microgrid site. This document shows enhancements made to DER-CAM to consider building retrofit measures along with DER investment options. Specifically, building shell improvement options have been added to DER-CAM as alternative or complementary options to investments in other DER such as PV, solar thermal, combined heat and power, or energy storage. The extension of the mathematical formulation required by the new features introduced in DER-CAM is presented and the resulting model is demonstrated at an Austrian campus building by comparing DER-CAM results with and without building shell improvement options. Strategic investment results are presented and compared to the observed investment decision at the test site. Results obtained considering building shell improvement options suggest an optimal weighted average U-value of about 0.53 W/(m²K) for the test site. This result is approximately 25% higher than what is currently observed in the building, suggesting that the retrofits made in 2002 were not optimal. Furthermore, the results obtained with DER-CAM illustrate the complexity of interactions between DER and passive measure options, showcasing the need for a holistic optimization approach to effectively optimize energy costs and CO2 emissions. Lastly, the simultaneous optimization of building shell improvements and DER investments enables building owners to take one step further towards nearly zero energy buildings (nZEB) or nearly zero carbon emission buildings (nZCEB), and therefore support the 20/20/20 goals.
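    In the spirit of the cost minimization described above, here is a hedged toy linear program that trades off PV capacity, a generic shell-improvement measure and grid purchases to meet an annual load at minimum annualized cost; the prices, yields and bounds are invented for illustration and this is not the DER-CAM formulation.

```python
# Toy illustration only (hypothetical prices, yields and limits; not DER-CAM):
# choose PV capacity and a shell-improvement level to minimize annualized cost
# while still meeting the building's annual energy demand.
import numpy as np
from scipy.optimize import linprog

baseline_load = 1_900_000.0      # kWh/a, assumed annual load
pv_yield      = 1_100.0          # kWh per kW of PV per year (assumed)
shell_saving  = 15_000.0         # kWh saved per unit of shell improvement (assumed)

# annualized cost coefficients: [EUR/kW PV, EUR/unit shell, EUR/kWh from grid]
c = np.array([95.0, 1_200.0, 0.18])

# grid + pv_yield*x_pv + shell_saving*x_shell = baseline_load
A_eq = np.array([[pv_yield, shell_saving, 1.0]])
b_eq = np.array([baseline_load])

bounds = [(0, 400),   # at most 400 kW of PV (assumed roof limit)
          (0, 20),    # at most 20 shell-improvement units (assumed)
          (0, None)]  # grid purchase

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
pv_kw, shell_units, grid_kwh = res.x
print(f"PV: {pv_kw:.0f} kW, shell units: {shell_units:.1f}, grid: {grid_kwh:,.0f} kWh/a")
print(f"annualized cost: {res.fun:,.0f} EUR")
```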

  12. Issues in the reconstruction of environmental doses on the basis of thermoluminescence measurements in the Techa riverside

    NASA Technical Reports Server (NTRS)

    Bougrov, N. G.; Goksu, H. Y.; Haskell, E.; Degteva, M. O.; Meckbach, R.; Jacob, P.; Neta, P. I. (Principal Investigator)

    1998-01-01

    The potential of thermoluminescence measurements of bricks from the contaminated area of the Techa river valley, Southern Urals, Russia, for reconstructing external exposures of affected population groups has been studied. Thermoluminescence dating of background samples was used to evaluate the age of old buildings available on the river banks. The anthropogenic gamma dose accrued in exposed samples is determined by subtracting the natural radiation background dose for the corresponding age from the accumulated dose measured by thermoluminescence. For a site in the upper Techa river region, where the levels of external exposures were extremely high, the depth-dose distribution in bricks and the dependence of accidental dose on the height of the sampling position were determined. For the same site, Monte Carlo simulations of radiation transport were performed for different source configurations corresponding to the situation before and after the construction of a reservoir on the river and evacuation of the population in 1956. A comparison of the results provides an understanding of the features of the measured depth-dose distributions and height dependencies in terms of the source configurations and shows that bricks from the higher sampling positions are likely to have accrued a larger fraction of anthropogenic dose from the time before the construction of the reservoir. The applicability of the thermoluminescent dosimetry method to environmental dose reconstruction in the middle Techa region, where the external exposure was relatively low, was also investigated.
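    A worked example of the subtraction described above (anthropogenic dose equals accumulated TL dose minus the natural background accrued over the brick's age); all numbers are assumed for illustration, not measurements from the Techa study.

```python
# Worked example with assumed values (not measurements from the study):
# anthropogenic dose = accumulated TL dose - natural background accrued over age.
background_rate_mgy_per_yr = 3.0      # assumed natural background dose rate (mGy/a)
brick_age_yr = 80.0                   # age from TL dating of background samples
accumulated_dose_mgy = 650.0          # assumed total dose measured by TL

anthropogenic_dose_mgy = accumulated_dose_mgy - background_rate_mgy_per_yr * brick_age_yr
print(f"anthropogenic gamma dose ≈ {anthropogenic_dose_mgy:.0f} mGy")  # -> 410 mGy
```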

  13. Mixing zone and drinking water intake dilution factor and wastewater generation distributions to enable probabilistic assessment of down-the-drain consumer product chemicals in the U.S.

    PubMed

    Kapo, Katherine E; McDonough, Kathleen; Federle, Thomas; Dyer, Scott; Vamshi, Raghu

    2015-06-15

    Environmental exposure and associated ecological risk related to down-the-drain chemicals discharged by municipal wastewater treatment plants (WWTPs) are strongly influenced by in-stream dilution of receiving waters which varies by geography, flow conditions and upstream wastewater inputs. The iSTREEM® model (American Cleaning Institute, Washington D.C.) was utilized to determine probabilistic distributions for no decay and decay-based dilution factors in mean annual and low (7Q10) flow conditions. The dilution factors derived in this study are "combined" dilution factors which account for both hydrologic dilution and cumulative upstream effluent contributions that will differ depending on the rate of in-stream decay due to biodegradation, volatilization, sorption, etc. for the chemical being evaluated. The median dilution factors estimated in this study (based on various in-stream decay rates from zero decay to a 1 h half-life) for WWTP mixing zones dominated by domestic wastewater flow ranged from 132 to 609 at mean flow and 5 to 25 at low flow, while median dilution factors at drinking water intakes (mean flow) ranged from 146 to 2×10^7 depending on the in-stream decay rate. WWTPs within the iSTREEM® model were used to generate a distribution of per capita wastewater generated in the U.S. The dilution factor and per capita wastewater generation distributions developed by this work can be used to conduct probabilistic exposure assessments for down-the-drain chemicals in influent wastewater, wastewater treatment plant mixing zones and at drinking water intakes in the conterminous U.S. In addition, evaluation of types and abundance of U.S. wastewater treatment processes provided insight into treatment trends and the flow volume treated by each type of process. Moreover, removal efficiencies of chemicals can differ by treatment type. Hence, the availability of distributions for per capita wastewater production, treatment type, and dilution factors at a national level provides a series of practical and powerful tools for building probabilistic exposure models. Copyright © 2015 Elsevier B.V. All rights reserved.
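    A hedged sketch of how such dilution-factor and per-capita wastewater distributions can feed a probabilistic exposure estimate; the lognormal parameters, chemical load and removal efficiency below are assumptions for illustration, not the iSTREEM-derived distributions from the paper.

```python
# Sketch of a probabilistic exposure estimate using a dilution-factor
# distribution (all parameters below are assumptions for illustration).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

per_capita_wastewater_l = rng.lognormal(mean=np.log(300.0), sigma=0.3, size=n)  # L/person/day
dilution_factor         = rng.lognormal(mean=np.log(300.0), sigma=1.0, size=n)  # mixing-zone DF

chemical_use_mg_per_capita_day = 5.0     # assumed per-capita consumer-product load
removal_in_wwtp = 0.90                   # assumed treatment removal efficiency

effluent_conc_ug_l = (chemical_use_mg_per_capita_day * (1 - removal_in_wwtp)
                      / per_capita_wastewater_l * 1_000.0)   # mg/L -> ug/L
instream_conc_ug_l = effluent_conc_ug_l / dilution_factor

print("median in-stream concentration: %.4f ug/L" % np.median(instream_conc_ug_l))
print("90th percentile:                %.4f ug/L" % np.percentile(instream_conc_ug_l, 90))
```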

  14. VOLTTRON™: Tech-to-Market Best-Practices Guide for Small- and Medium-Sized Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cort, Katherine A.; Haack, Jereme N.; Katipamula, Srinivas

    VOLTTRON™ is an open-source distributed control and sensing platform developed by Pacific Northwest National Laboratory for the U.S. Department of Energy. It was developed to be used by the Office of Energy Efficiency and Renewable Energy to support transactive controls research and deployment activities. VOLTTRON is designed to be an overarching integration platform that could be used to bring together vendors, users, and developers and enable rapid application development and testing. The platform is designed to support modern control strategies, including the use of agent- and transaction-based controls. It also is designed to support the management of a wide range of applications, including heating, ventilation, and air-conditioning systems; electric vehicles; and distributed-energy and whole-building loads. This report was completed as part of the Building Technologies Office’s Technology-to-Market Initiative for VOLTTRON’s Market Validation and Business Case Development efforts. The report provides technology-to-market guidance and best practices related to VOLTTRON platform deployments and commercialization activities for use by entities serving small- and medium-sized commercial buildings. The report characterizes the platform ecosystem within the small- and medium-sized commercial building market and articulates the value proposition of VOLTTRON for three core participants in this ecosystem: 1) platform owners/adopters, 2) app developers, and 3) end-users. The report also identifies key market drivers and opportunities for open platform deployments in the small- and medium-sized commercial building market. Possible pathways to the market are described: laboratory testing to market adoption to commercialization. We also identify and address various technical and market barriers that could hinder deployment of VOLTTRON. Finally, we provide “best practice” tech-to-market guidance for building energy-related deployment efforts serving small- and medium-sized commercial buildings.

  15. Sound propagation in urban areas: a periodic disposition of buildings.

    PubMed

    Picaut, J; Hardy, J; Simon, L

    1999-10-01

    A numerical simulation of background noise propagation is performed for a network of hexagonal buildings. The obtained results suggest that the prediction of background noise in urban spaces is possible by means of a modified diffusion equation using two parameters: the diffusion coefficient that expresses the spreading out of noise resulting from diffuse scattering and multiple reflections by buildings, and an attenuation term accounting for the wall absorption, atmospheric attenuation, and absorption by the open top. The dependence of the diffusion coefficient with geometrical shapes and the diffusive nature of the buildings are investigated in the case of a periodic disposition of hexagonal buildings.

  16. Building mud castles: a perspective from brick-laying termites.

    PubMed

    Zachariah, Nikita; Das, Aritra; Murthy, Tejas G; Borges, Renee M

    2017-07-05

    Animal constructions such as termite mounds have received scrutiny by architects, structural engineers, soil scientists and behavioural ecologists but their basic building blocks remain uncharacterized and the criteria used for material selection unexplored. By conducting controlled experiments on Odontotermes obesus termites, we characterize the building blocks of termite mounds and determine the key elements defining material choice and usage by these accomplished engineers. Using biocement and a self-organized process, termites fabricate, transport and assemble spherical unitary structures called boluses that have a bimodal size distribution, achieving an optimal packing solution for mound construction. Granular, hydrophilic, osmotically inactive, non-hygroscopic materials with surface roughness, rigidity and containing organic matter are the easiest to handle and are crucial determinants of mass transfer during mound construction. We suggest that these properties, along with optimal moisture availability, are important predictors of the global geographic distribution of termites.

  17. Building better water models using the shape of the charge distribution of a water molecule

    NASA Astrophysics Data System (ADS)

    Dharmawardhana, Chamila Chathuranga; Ichiye, Toshiko

    2017-11-01

    The unique properties of liquid water apparently arise from more than just the tetrahedral bond angle between the nuclei of a water molecule since simple three-site models of water are poor at mimicking these properties in computer simulations. Four- and five-site models add partial charges on dummy sites and are better at modeling these properties, which suggests that the shape of charge distribution is important. Since a multipole expansion of the electrostatic potential describes a charge distribution in an orthogonal basis set that is exact in the limit of infinite order, multipoles may be an even better way to model the charge distribution. In particular, molecular multipoles up to the octupole centered on the oxygen appear to describe the electrostatic potential from electronic structure calculations better than four- and five-site models, and molecular multipole models give better agreement with the temperature and pressure dependence of many liquid state properties of water while retaining the computational efficiency of three-site models. Here, the influence of the shape of the molecular charge distribution on liquid state properties is examined by correlating multipoles of non-polarizable water models with their liquid state properties in computer simulations. This will aid in the development of accurate water models for classical simulations as well as in determining the accuracy needed in quantum mechanical/molecular mechanical studies and ab initio molecular dynamics simulations of water. More fundamentally, this will lead to a greater understanding of how the charge distribution of a water molecule leads to the unique properties of liquid water. In particular, these studies indicate that p-orbital charge out of the molecular plane is important.
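
    For reference, the molecular multipoles discussed here are the coefficients of the standard multipole expansion of the electrostatic potential outside a charge distribution; a sketch of that expansion (Gaussian units, traceless quadrupole, truncated before the octupole term that the models described retain) is:

      \[
        \phi(\mathbf{r}) \;=\; \frac{q}{r}
        \;+\; \frac{\boldsymbol{\mu}\cdot\hat{\mathbf{r}}}{r^{2}}
        \;+\; \frac{1}{2}\,\sum_{i,j}\Theta_{ij}\,\frac{\hat{r}_{i}\hat{r}_{j}}{r^{3}}
        \;+\; \cdots
      \]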

  18. Obtaining Parts

    Science.gov Websites

    Parts and suppliers for the Berkeley Detector (The Cosmic Connection): the scintillator is available from Eljen Technology, one of the companies that have helped previous builders obtain the components needed to build the Berkeley Detector. The estimated cost to build a detector varies from $1500 to $2700, depending …

  19. Kinetochores: if you build it, they will come.

    PubMed

    DeLuca, Jennifer G; Salmon, E D

    2004-11-09

    Successful mitosis depends critically on the segregation of chromosomes by kinetochore microtubules. A recent paper describes a conserved protein network from Caenorhabditis elegans that is composed of three classes of molecules, each of which contributes uniquely to the building of the kinetochore-microtubule attachment site.

  20. The characteristic of the building damage from historical large earthquakes in Kyoto

    NASA Astrophysics Data System (ADS)

    Nishiyama, Akihito

    2016-04-01

    The Kyoto city, which is located in the northern part of Kyoto basin in Japan, has a long history of >1,200 years since the city was initially constructed. The city has been a populated area with many buildings and the center of politics, economy and culture in Japan for nearly 1,000 years. Some of these buildings are now inscribed as World Cultural Heritage. The Kyoto city has experienced six damaging large earthquakes during the historical period, in 976, 1185, 1449, 1596, 1662, and 1830. Among these, the last three earthquakes, which caused severe damage in Kyoto, occurred during the period in which the urban area had expanded. These earthquakes are considered to be inland earthquakes which occurred around the Kyoto basin. The damage distribution in Kyoto from historical large earthquakes is strongly controlled by ground conditions and the earthquake resistance of buildings rather than by distance from the estimated source fault. Therefore, it is necessary to consider not only the strength of ground shaking but also the condition of buildings, such as the elapsed years since construction or last repair, in order to estimate the seismic intensity distribution from historical earthquakes in Kyoto more accurately and reliably. The obtained seismic intensity map would be helpful for reducing and mitigating disaster from future large earthquakes.

  1. Economic aspects of possible residential heating conservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkowicz, M.; Szul, A.

    1995-12-31

    The paper presents methods of evaluating the energy- and economy-related effects of different actions aimed at conservation in residential buildings. It also identifies a method of selecting the most effective way of distributing funds assigned to weatherization, as well as the necessary improvements to be implemented within the heating node and the internal heating system of the building. The analysis of data gathered for four 11-story residential buildings of "Zeran" type, the subject of the Conservation Demonstrative Project, included a differentiated scope of weatherization efforts and various actions aimed at system upgrading. Based upon the discussion of the split of heat losses in a building, as well as the established energy savings for numerous upgrading options, the main problem has been defined: the optimal distribution of financial means among the discussed measures when the total amount of funds assigned for modifications is fixed. A method based upon the principle of relative increments has been suggested. The economic and energy specifications of the building and its components required for this method have also been elaborated. The application of this method made it possible to define the suggested optimal scope of actions within the entire fund assigned for comprehensive weatherization.

  2. Asbestos-containing materials and airborne asbestos levels in industrial buildings in Korea.

    PubMed

    Choi, Sangjun; Suk, Mee-Hee; Paik, Nam Won

    2010-03-01

    Recently in Korea, the treatment of asbestos-containing materials (ACM) in building has emerged as one of the most important environmental health issues. This study was conducted to identify the distribution and characteristics of ACM and airborne asbestos concentrations in industrial buildings in Korea. A total of 1285 presumed asbestos-containing material (PACM) samples were collected from 80 workplaces across the nation, and 40% of the PACMs contained more than 1% of asbestos. Overall, 94% of the surveyed workplaces contained ACM. The distribution of ACM did not show a significant difference by region, employment size, or industry. The total ACM area in the buildings surveyed was 436,710 m2. Ceiling tile ACM accounted for 61% (267,093 m2) of the total ACM area, followed by roof ACM (32%), surfacing ACM (6.1%), and thermal system insulation (TSI). In terms of asbestos type, 98% of total ACM was chrysotile, while crocidolite was not detected. A comparison of building material types showed that the material with the highest priority for regular management is ceiling tile, followed by roof, TSI, and surfacing material. The average airborne concentration of asbestos sampled without disturbing in-place ACM was 0.0028 fibers/cc by PCM, with all measurements below the standard of recommendation for indoor air quality in Korea (0.01 fibers/cc).

  3. Supernovae, Neutrinos and the Chirality of Amino Acids

    PubMed Central

    Boyd, Richard N.; Kajino, Toshitaka; Onaka, Takashi

    2011-01-01

    A mechanism for creating an enantioenrichment in the amino acids, the building blocks of the proteins, that involves global selection of one handedness by interactions between the amino acids and neutrinos from core-collapse supernovae is defined. The chiral selection involves the dependence of the interaction cross sections on the orientations of the spins of the neutrinos and the 14N nuclei in the amino acids, or in precursor molecules, which in turn couple to the molecular chirality. It also requires an asymmetric distribution of neutrinos emitted from the supernova. The subsequent chemical evolution and galactic mixing would ultimately populate the Galaxy with the selected species. The resulting amino acids could either be the source thereof on Earth, or could have triggered the chirality that was ultimately achieved for Earth’s proteinaceous amino acids. PMID:21747686

  4. Variation principle in calculating the flow of a two-phase mixture in the pipes of the cooling systems in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Aksenov, Andrey; Malysheva, Anna

    2018-03-01

    The paper gives an analytical solution to one of the pressing problems of modern hydromechanics and heat engineering: the distribution of the gas and liquid phases over the channel cross-section, the thickness of the annular layer, and their relation to the mass content of the gas phase in a gas-liquid flow. The analytical method is based on the fundamental laws of theoretical mechanics and thermophysics concerning the minimum of energy dissipation and the minimum rate of increase of the system entropy, which determine the stability of stationary states and processes. The obtained dependencies describe the physical laws of the motion of two-phase media and can be used in hydraulic calculations during the design and operation of refrigeration and air conditioning systems.

  5. Combinatorial Enzyme Design Probes Allostery and Cooperativity in the Trypsin Fold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, Michael J.; Di Cera, Enrico

    2010-06-14

    Converting one enzyme into another is challenging due to the uneven distribution of important amino acids for function in both protein sequence and structure. We report a strategy for protein engineering allowing an organized mixing and matching of genetic material that leverages lower throughput with increased quality of screens. Our approach successfully tested the contribution of each surface-exposed loop in the trypsin fold alone and the cooperativity of their combinations towards building the substrate selectivity and Na+-dependent allosteric activation of the protease domain of human coagulation factor Xa into a bacterial trypsin. As the created proteases lack additional protein domains and the protein co-factor activation mechanism requisite for the complexity of blood coagulation, they are stepping-stones towards further understanding and engineering of artificial clotting factors.

  6. Role of Y-Al Oxides During Extended Recovery Process of a Ferritic ODS Alloy

    NASA Astrophysics Data System (ADS)

    Capdevila, C.; Pimentel, G.; Aranda, M. M.; Rementeria, R.; Dawson, K.; Urones-Garrote, E.; Tatlock, G. J.; Miller, M. K.

    2015-08-01

    The microstructural stability of Y-Al oxides during the recrystallization of an Fe-Cr-Al oxide dispersion strengthened alloy is studied in this work. The goal is to determine the specific distribution pattern of the oxides depending on where they are located: in the matrix or at the grain boundaries. It was concluded that those located at the grain boundaries coarsen faster than the ones in the matrix, although no significant differences in composition and/or crystal structure were observed. However, the recrystallization heat treatment leads to the dissolution of the Y2O3 and its combination with Al to form YAlO3 perovskite oxide particles, mainly located at the grain boundaries. Finally, atom probe tomography analysis revealed a significant Ti build-up at the grain boundaries that might affect subsequent migration during recrystallization.

  7. Dylan Cutler | NREL

    Science.gov Websites

    Research focuses on the integration and optimization of distributed energy resources, specifically cost-optimal sizing and dispatch of distributed energy resources and the integration of building and utility control systems, including work with the campus team focusing on NREL's own control system integration and energy informatics.

  8. PARTICLE SIZE DISTRIBUTIONS FOR AN OFFICE AEROSOL

    EPA Science Inventory

    The article discusses an evaluation of the effect of percent outdoor air supplied and occupation level on the particle size distributions and mass concentrations for a typical office building. (NOTE: As attention has become focused on indoor air pollution control, it has become i...

  9. The characteristic of the earthquake damage in Kyoto during the historical period

    NASA Astrophysics Data System (ADS)

    Nishiyama, Akihito

    2017-04-01

    The Kyoto city is located in the northern part of the Kyoto basin, central Japan, and has a history of more than 1200 years. Kyoto has long been a populated area with many buildings, and the center of politics, economics and culture in Japan. Due to historical large earthquakes, the Kyoto city was severely damaged, with collapsed buildings and human casualties. In the historical period, the Kyoto city has experienced six damaging large earthquakes, in 976, 1185, 1449, 1596, 1662 and 1830. Among them, Kyoto experienced three damaging large earthquakes from the end of the 16th century to the middle of the 19th century, when the urban area was being expanded. All of these earthquakes are considered to be not earthquakes within the Kyoto basin but inland earthquakes that occurred in the surrounding area. The earthquake damage in Kyoto during the historical period is strongly controlled by ground conditions and the earthquake resistance of buildings rather than by distance from the estimated source fault. To better estimate seismic intensity based on building damage, it is necessary to consider the state of buildings (e.g., elapsed years since construction, histories of repairs and/or reinforcements, building structures) as well as the strength of ground shaking. By considering the strength of buildings at the time of an earthquake occurrence, the seismic intensity distribution due to historical large earthquakes can be estimated with higher reliability than before. The estimated seismic intensity distribution maps for such historical earthquakes can be utilized for developing strong ground motion prediction in the Kyoto basin.

  10. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
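
    A minimal Python sketch of how such a stochastic overtime schedule can be generated from the two distributions named above (a binomial count of overtime workers and exponential overtime durations); the occupant count, overtime probability and mean duration are invented for illustration, not taken from the measured office data.

      import numpy as np

      rng = np.random.default_rng(4)

      n_occupants = 60        # hypothetical office population
      p_overtime = 0.15       # assumed probability an occupant works overtime on a day
      mean_duration_h = 1.8   # assumed mean overtime duration in hours

      def overtime_schedule(n_days=20):
          """Per day: number of overtime workers and each worker's overtime hours."""
          schedule = []
          for _ in range(n_days):
              n_ot = rng.binomial(n_occupants, p_overtime)        # how many stay late
              durations = rng.exponential(mean_duration_h, n_ot)  # how long each stays
              schedule.append(durations)
          return schedule

      days = overtime_schedule()
      print("day 1: %d overtime workers, %.1f person-hours"
            % (len(days[0]), days[0].sum()))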

  11. Dendrite and Axon Specific Geometrical Transformation in Neurite Development

    PubMed Central

    Mironov, Vasily I.; Semyanov, Alexey V.; Kazantsev, Victor B.

    2016-01-01

    We propose a model of neurite growth to explain the differences in dendrite and axon specific neurite development. The model implements basic molecular kinetics, e.g., building protein synthesis and transport to the growth cone, and includes explicit dependence of the building kinetics on the geometry of the neurite. The basic assumption was that the radius of the neurite decreases with length. We found that the neurite dynamics crucially depended on the relationship between the rate of active transport and the rate of morphological changes. If these rates were in the balance, then the neurite displayed axon specific development with a constant elongation speed. For dendrite specific growth, the maximal length was rapidly saturated by degradation of building protein structures or limited by proximal part expansion reaching the characteristic cell size. PMID:26858635

  12. Expressivism, Relativism, and the Analytic Equivalence Test

    PubMed Central

    Frápolli, Maria J.; Villanueva, Neftalí

    2015-01-01

    The purpose of this paper is to show that, pace (Field, 2009), MacFarlane’s assessment relativism and expressivism should be sharply distinguished. We do so by arguing that relativism and expressivism exemplify two very different approaches to context-dependence. Relativism, on the one hand, shares with other contemporary approaches a bottom-up, building-block model, while expressivism is part of a different tradition, one that might include Lewis’ epistemic contextualism and Frege’s content individuation, with which it shares an organic model of dealing with context-dependence. The building-block model and the organic model, and thus relativism and expressivism, are set apart with the aid of a particular test: only the building-block model is compatible with the idea that there might be analytically equivalent, and yet different, propositions. PMID:26635690

  13. Cluster-based control of a separating flow over a smoothly contoured ramp

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Noack, Bernd R.; Spohn, Andreas; Cattafesta, Louis N.; Morzyński, Marek

    2017-12-01

    The ability to manipulate and control fluid flows is of great importance in many scientific and engineering applications. The proposed closed-loop control framework addresses a key issue of model-based control: The actuation effect often results from slow dynamics of strongly nonlinear interactions which the flow reveals at timescales much longer than the prediction horizon of any model. Hence, we employ a probabilistic approach based on a cluster-based discretization of the Liouville equation for the evolution of the probability distribution. The proposed methodology frames high-dimensional, nonlinear dynamics into low-dimensional, probabilistic, linear dynamics which considerably simplifies the optimal control problem while preserving nonlinear actuation mechanisms. The data-driven approach builds upon a state space discretization using a clustering algorithm which groups kinematically similar flow states into a low number of clusters. The temporal evolution of the probability distribution on this set of clusters is then described by a control-dependent Markov model. This Markov model can be used as predictor for the ergodic probability distribution for a particular control law. This probability distribution approximates the long-term behavior of the original system on which basis the optimal control law is determined. We examine how the approach can be used to improve the open-loop actuation in a separating flow dominated by Kelvin-Helmholtz shedding. For this purpose, the feature space, in which the model is learned, and the admissible control inputs are tailored to strongly oscillatory flows.
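
    A minimal Python sketch of the modelling chain described (cluster the snapshots, estimate a Markov transition matrix between clusters, read off the ergodic probability distribution); the snapshot matrix here is a random placeholder rather than flow data, and the control dependence of the Markov model is omitted.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)

      # Placeholder "snapshots": rows are time steps, columns are flow features.
      snapshots = rng.normal(size=(2000, 12))

      n_clusters = 10
      labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(snapshots)

      # Empirical Markov transition matrix between consecutive cluster labels.
      P = np.zeros((n_clusters, n_clusters))
      for a, b in zip(labels[:-1], labels[1:]):
          P[a, b] += 1
      P /= P.sum(axis=1, keepdims=True)

      # Ergodic (stationary) distribution: left eigenvector of P for eigenvalue 1.
      vals, vecs = np.linalg.eig(P.T)
      pi = np.real(vecs[:, np.argmax(np.real(vals))])
      pi /= pi.sum()
      print("approximate long-term cluster probabilities:", np.round(pi, 3))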

  14. Feasibility study for distributed dose monitoring in ionizing radiation environments with standard and custom-made optical fibers

    NASA Astrophysics Data System (ADS)

    Van Uffelen, Marco; Berghmans, Francis; Brichard, Benoit; Borgermans, Paul; Decréton, Marc C.

    2002-09-01

    Optical fibers have stimulated much interest for many years for their potential use in various nuclear environments, both for radiation-tolerant and EMI-free data communication as well as for distributed sensing. Besides monitoring temperature and stress, measuring ionizing doses with optical fibers is particularly essential in applications such as long-term nuclear waste disposal monitoring, and for real-time aging monitoring of power and signal cables installed inside a reactor containment building. Two distinct options exist to perform optical fiber dosimetry: first, find an accurate model for a restricted application field that accounts for all the parameters that influence the radiation response of a standard fiber, or second, develop a dedicated fiber with a response that depends solely on the deposited energy. Using various models presented in the literature, we evaluate both standard commercially available and custom-made optical fibers under gamma radiation, particularly for distributed dosimetry applications with an optical time domain reflectometer (OTDR). We therefore present the radiation-induced attenuation at near-infrared telecom wavelengths up to MGy total dose levels, with dose rates ranging from about 1 Gy/h up to 1 kGy/h, while temperature was raised step-wise from 25 °C to 85 °C. Our results allow us to determine and compare the practical limitations of distributed dose measurements with both fiber types in terms of temperature sensitivity, dose estimation accuracy and spatial resolution.

  15. Supervised Outlier Detection in Large-Scale Mvs Point Clouds for 3d City Modeling Applications

    NASA Astrophysics Data System (ADS)

    Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.

    2018-05-01

    We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results. Although most outliers can be identified correctly (high recall), many inliers are erroneously removed (low precision), too. This aggravates 3D object reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e. we train separate inlier-outlier classifiers per semantic class (building facades, roof, ground, vegetation, fields, and water). Performance of the learned filtering is evaluated on several large SfM point clouds of cities. The results confirm our underlying assumption that discriminatively learning inlier-outlier distributions improves precision over global heuristics by up to ≈ 12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to ≈ 10 percentage points, being able to remove very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
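
    A minimal Python sketch of the per-point inlier/outlier classification step, using scikit-learn's RandomForestClassifier; the features and labels are synthetic stand-ins for the real per-point MVS features, and the semantically informed variant would simply train one such classifier per semantic class.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(6)

      # Synthetic per-point features (stand-ins for local density, planarity, color
      # statistics, ...) and binary labels: 1 = inlier, 0 = outlier.
      n_points = 10_000
      X = rng.normal(size=(n_points, 5))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n_points) > 0).astype(int)

      train = rng.random(n_points) < 0.7
      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      clf.fit(X[train], y[train])

      pred = clf.predict(X[~train])
      precision = (pred & y[~train]).sum() / max(pred.sum(), 1)
      print("inlier precision on held-out points: %.2f" % precision)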

  16. USING TIME VARIANT VOLTAGE TO CALCULATE ENERGY CONSUMPTION AND POWER USE OF BUILDING SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makhmalbaf, Atefe; Augenbroe, Godfried

    2015-12-09

    Buildings are the main consumers of electricity across the world. However, in the research and studies related to building performance assessment, the focus has been on evaluating the energy efficiency of buildings whereas the instantaneous power efficiency has been overlooked as an important aspect of total energy consumption. As a result, we never developed adequate models that capture both thermal and electrical characteristics (e.g., voltage) of building systems to assess the impact of variations in the power system and emerging technologies of the smart grid on buildings energy and power performance and vice versa. This paper argues that the power performance of buildings as a function of electrical parameters should be evaluated in addition to systems’ mechanical and thermal behavior. The main advantage of capturing electrical behavior of building load is to better understand instantaneous power consumption and more importantly to control it. Voltage is one of the electrical parameters that can be used to describe load. Hence, voltage dependent power models are constructed in this work and they are coupled with existing thermal energy models. Lack of models that describe electrical behavior of systems also adds to the uncertainty of energy consumption calculations carried out in building energy simulation tools such as EnergyPlus, a common building energy modeling and simulation tool. To integrate voltage-dependent power models with thermal models, the thermal cycle (operation mode) of each system was fed into the voltage-based electrical model. Energy consumption of systems used in this study were simulated using EnergyPlus. Simulated results were then compared with estimated and measured power data. The mean square error (MSE) between simulated, estimated, and measured values were calculated. Results indicate that estimated power has lower MSE when compared with measured data than simulated results. Results discussed in this paper will illustrate the significance of enhancing building energy models with electrical characteristics. This would support different studies such as those related to modernization of the power system that require micro scale building-grid interaction, evaluating building energy efficiency with power efficiency considerations, and also design and control decisions that rely on accuracy of building energy simulation results.
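
    To make the idea of a voltage-dependent power model concrete, the Python sketch below couples a generic exponential load model with a simple on/off thermal cycle; the rated power, exponent and voltage series are illustrative assumptions, not the models or parameters used in the paper.

      def voltage_dependent_power(p_rated_kw, v_actual, v_nominal=240.0, exponent=1.5):
          """Exponential load model: P = P_rated * (V / V_nominal) ** n_p.
          n_p is ~0 for constant-power, ~1 for constant-current, ~2 for resistive loads."""
          return p_rated_kw * (v_actual / v_nominal) ** exponent

      # Couple with a thermal cycle: power is drawn only when the equipment is "on".
      thermal_cycle = [1, 1, 0, 1, 0, 0, 1, 1]             # hourly on/off operation mode
      voltages = [238, 242, 240, 235, 240, 241, 245, 239]  # time-variant voltage (V)

      energy_kwh = sum(on * voltage_dependent_power(3.5, v)
                       for on, v in zip(thermal_cycle, voltages))
      print("voltage-adjusted energy over 8 h: %.2f kWh" % energy_kwh)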

  17. Building America Case Study: Ventilation System Effectiveness and Tested Indoor Air Quality Impacts, Tyler, Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ventilation system effectiveness testing was conducted at two unoccupied, single-family, detached lab homes at the University of Texas - Tyler. Five ventilation system tests were conducted with various whole-building ventilation systems. Multizone fan pressurization testing characterized building and zone enclosure leakage. PFT testing showed multizone air change rates and interzonal airflow filtration. Indoor air recirculation by a central air distribution system can help improve the exhaust ventilation system by way of air mixing and filtration. In contrast, the supply and balanced ventilation systems showed that there is a significant benefit to drawing outside air from a known outside location, and filtering and distributing that air. Compared to the Exhaust systems, the CFIS and ERV systems showed better ventilation air distribution and lower concentrations of particulates, formaldehyde and other VOCs. System improvement percentages were estimated based on four System Factor Categories: Balance, Distribution, Outside Air Source, and Recirculation Filtration. Recommended System Factors could be applied to reduce ventilation fan airflow rates relative to ASHRAE Standard 62.2 to save energy and reduce moisture control risk in humid climates. HVAC energy savings were predicted to be 8-10%, or $50-$75/year. Cumulative particle counts for six particle sizes, and formaldehyde and other Top 20 VOC concentrations were measured in multiple zones. The testing showed that single-point exhaust ventilation was inferior as a whole-house ventilation strategy.

  18. Methodology to build medical ontology from textual resources.

    PubMed

    Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine

    2006-01-01

    In the medical field, it is now established that the maintenance of unambiguous thesauri goes through ontologies. Our research task is to help pneumologists code acts and diagnoses with software that represents medical knowledge through a domain ontology. In this paper, we describe our general methodology, aimed at knowledge engineers, for building various types of medical ontologies based on terminology extraction from texts. The hypothesis is to apply natural language processing tools to textual patient discharge summaries to develop the resources needed to build an ontology in pneumology. Results indicate that the joint use of distributional analysis and lexico-syntactic patterns performed satisfactorily for building such ontologies.

  19. Applications of the Theory of Distributed and Real Time Systems to the Development of Large-Scale Timing Based Systems.

    DTIC Science & Technology

    1996-04-01

    Members of MIT's Theory of Distributed Systems group have continued their work on modelling, designing, verifying and analyzing distributed and real-time systems. The focus is on the study of 'building-blocks' for the construction of reliable and efficient systems. Our work falls into three...

  20. Baghdad ER - Revisited

    DTIC Science & Technology

    2009-09-01

    delicious food. The next big event was a body building contest involving both our men and women. Not only did several China Dragons attend, but our... debate on key issues. This report is cleared for public release; distribution is unlimited.

  1. Programming your way out of the past: ISIS and the META Project

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Marzullo, Keith

    1989-01-01

    The ISIS distributed programming system and the META Project are described. The ISIS programming toolkit is an aid to low-level programming that makes it easy to build fault-tolerant distributed applications that exploit replication and concurrent execution. The META Project is reexamining high-level mechanisms such as the filesystem, shell language, and administration tools in distributed systems.

  2. The relationship between building design and residents' quality of life in extra care housing schemes.

    PubMed

    Orrell, Alison; McKee, Kevin; Torrington, Judith; Barnes, Sarah; Darton, Robin; Netten, Ann; Lewis, Alan

    2013-05-01

    Well-designed housing is recognised as an important factor in promoting a good quality of life. Specialised housing models incorporating care services, such as extra care housing (ECH) schemes, are seen as enabling older people to maintain a good quality of life despite the increasing health problems that can accompany ageing. Despite the variation in ECH building design, little is known about the impact of ECH building design on the quality of life of building users. The evaluation of older people's living environments (EVOLVE) study collected cross-sectional data on building design and quality of life in 23 ECH schemes in England, UK. Residents' quality of life was assessed using the schedule for the evaluation of individual quality of life-direct weighting (SEIQoL-DW) and the four CASP-19 domains of control, autonomy, self-realisation and pleasure. Building design was measured on 12 user-related domains by means of a new tool, the EVOLVE tool. Using multilevel linear regression, significant associations were found between several aspects of building design and quality of life. Furthermore, there was evidence that the relationship between building design and quality of life was partly mediated by the dependency of participants and scheme size (number of living units). Our findings suggest that good quality building design in ECH can support the quality of life of residents, but that designing features that support the needs of both relatively independent and frail users is problematic, with the needs of highly dependent users not currently supported as well as could be hoped by ECH schemes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. The distribution of common construction materials at risk to acid deposition in the United States

    NASA Astrophysics Data System (ADS)

    Lipfert, Frederick W.; Daum, Mary L.

    Information on the geographic distribution of various types of exposed materials is required to estimate the economic costs of damage to construction materials from acid deposition. This paper focuses on the identification, evaluation and interpretation of data describing the distributions of exterior construction materials, primarily in the United States. This information could provide guidance on how data needed for future economic assessments might be acquired in the most cost-effective ways. Materials distribution surveys from 16 cities in the U.S. and Canada and five related databases from government agencies and trade organizations were examined. Data on residential buildings are more commonly available than on nonresidential buildings; little geographically resolved information on distributions of materials in infrastructure was found. Survey results generally agree with the appropriate ancillary databases, but the usefulness of the databases is often limited by their coarse spatial resolution. Information on those materials which are most sensitive to acid deposition is especially scarce. Since a comprehensive error analysis has never been performed on the data required for an economic assessment, it is not possible to specify the corresponding detailed requirements for data on the distributions of materials.

  4. Palaeonummulites venosus: Natural growth rates and quantification by means of CT investigation

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Woeger, Julia; Hohenegger, Johann; Briguglio, Antonino

    2016-04-01

    Symbiont-bearing larger benthic Foraminifera (LBF) are long-living marine (possibly >1 year), single-celled organisms with complex calcium carbonate shells. The reproduction period, longevity and chamber building rate of LBF are important for population dynamics studies. Growth experiments in laboratory cultures were not expected to be usable for estimating chamber building rates and longevity, even when the laboratory conditions simulate natural conditions, so individual and population growth must be studied under natural conditions. The 'natural laboratory' method was therefore developed to calculate the averaged chamber building rate and averaged longevity of species based on monthly sampling at fixed sampling stations, and to compare the results with laboratory cultures that simulate environmental conditions as closely as possible to the natural environment. In this study, samples of living individuals were collected in 16 monthly intervals at 50 m depth in front of Sesoko Island, Okinawa, Japan. We used micro-computed tomography (microCT) to determine the chamber number of every specimen from the samples dried immediately after sampling. Single non-dried specimens were cultured, and the time of chamber building was obtained from microphotographs taken for every specimen at 2- to 4-day intervals. The investigation of Palaeonummulites venosus using the natural laboratory method is based on the decomposition of the monthly frequency distributions into normally distributed components. The shift in the component parameters (mean and standard deviation) was then used to calculate the maximum chamber number, and the Michaelis-Menten function was applied to estimate the chamber building rate under natural conditions. This revealed two reproduction periods, the first starting in May and the second in September, both showing the same chamber building rate, with the first showing a slightly stronger increase in the initial part. Longevity appears to be approximately one year. The several reproduction periods explain the presence of small and large specimens in the same sample and the bimodal distributions. The cultured individuals show a much lower chamber building rate, often with a long period of no chamber production just after sampling, the result of sampling shock. This is the first demonstration that chamber building rates and longevities cannot be based on laboratory investigations.
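
    A minimal Python sketch of fitting a Michaelis-Menten type growth curve of chamber number against time with SciPy; the observations below are invented for illustration and are not the Sesoko Island measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical observations: mean chamber number at successive samplings (days).
      t_days = np.array([30, 60, 90, 120, 180, 240, 300, 360], dtype=float)
      chambers = np.array([8, 15, 21, 26, 33, 38, 41, 43], dtype=float)

      def michaelis_menten(t, n_max, k):
          """Chamber number approaching n_max, with half-saturation time k (days)."""
          return n_max * t / (k + t)

      (n_max, k), _ = curve_fit(michaelis_menten, t_days, chambers, p0=(50.0, 100.0))
      initial_rate = n_max / k   # chamber building rate near t = 0 (chambers per day)
      print("n_max = %.1f chambers, k = %.0f days, initial rate = %.2f/day"
            % (n_max, k, initial_rate))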

  5. Spending Paradox

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2005-01-01

    How much does a school cost? It's a straightforward question, but the answer for many education administrators is a frustrating, "It depends." The cost of a school facility surely includes what an education institution pays for the building site, design plans, construction materials, workers who assemble the building, and furniture and equipment…

  6. Pathways from Poverty: Economic Development and Institution-Building on American Indian Reservations.

    ERIC Educational Resources Information Center

    Cornell, Stephen; Kalt, Joseph P.

    1990-01-01

    Comparative analysis of economic development on 15 American Indian reservations plus supplementary data on 100 reservations suggest that successful development depends on tribal sovereignty coupled with aggressive assertions of Indian control, effective social institution-building, and appropriate development choices tested against tribal cultural…

  7. Estimating air chemical emissions from research activities using stack measurement data.

    PubMed

    Ballinger, Marcel Y; Duchsherer, Cheryl J; Woodruff, Rodger K; Larson, Timothy V

    2013-03-01

    Current methods of estimating air emissions from research and development (R&D) activities use a wide range of release fractions or emission factors with bases ranging from empirical to semi-empirical. Although considered conservative, the uncertainties and confidence levels of the existing methods have not been reported. Chemical emissions were estimated from sampling data taken from four research facilities over 10 years. The approach was to use a Monte Carlo technique to create distributions of annual emission estimates for target compounds detected in source test samples. Distributions were created for each year and building sampled for compounds with sufficient detection frequency to qualify for the analysis. The results using the Monte Carlo technique without applying a filter to remove negative emission values showed almost all distributions spanning zero, and 40% of the distributions having a negative mean. This indicates that emissions are so low as to be indistinguishable from building background. Application of a filter to allow only positive values in the distribution provided a more realistic value for emissions and increased the distribution mean by an average of 16%. Release fractions were calculated by dividing the emission estimates by a building chemical inventory quantity. Two variations were used for this quantity: chemical usage, and chemical usage plus one-half standing inventory. Filters were applied so that only release fraction values from zero to one were included in the resulting distributions. Release fractions had a wide range among chemicals and among data sets for different buildings and/or years for a given chemical. Regressions of release fractions to molecular weight and vapor pressure showed weak correlations. Similarly, regressions of mean emissions to chemical usage, chemical inventory, molecular weight, and vapor pressure also gave weak correlations. These results highlight the difficulties in estimating emissions from R&D facilities using chemical inventory data. Air emissions from research operations are difficult to estimate because of the changing nature of research processes and the small quantity and wide variety of chemicals used. Analysis of stack measurements taken over multiple facilities and a 10-year period using a Monte Carlo technique provided a method to quantify the low emissions and to estimate release fractions based on chemical inventories. The variation in release fractions did not correlate well with factors investigated, confirming the complexities in estimating R&D emissions.
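
    A minimal Python sketch of the Monte Carlo filtering idea described (draw emission estimates from measurement uncertainty, keep only positive values, and divide by a chemical inventory quantity to obtain release fractions); every number below is a hypothetical placeholder rather than a value from the four facilities studied.

      import numpy as np

      rng = np.random.default_rng(1)
      n_draws = 50_000

      # Hypothetical stack-sampling result for one compound: measured concentration
      # (ug/m3) with analytical uncertainty, and an annual exhaust volume (m3/yr).
      conc_mean, conc_sd = 0.8, 0.6   # near the detection limit, so draws can go negative
      exhaust_volume = 2.0e8

      conc_draws = rng.normal(conc_mean, conc_sd, n_draws)
      emissions = conc_draws * exhaust_volume * 1e-9          # kg/yr

      # Filter: keep positive values only (negatives are indistinguishable from background).
      positive_emissions = emissions[emissions > 0]

      chemical_usage_kg = 120.0                               # hypothetical inventory quantity
      release_fraction = positive_emissions / chemical_usage_kg
      release_fraction = release_fraction[release_fraction <= 1.0]

      print("mean emission (kg/yr): %.3f" % positive_emissions.mean())
      print("median release fraction: %.5f" % np.median(release_fraction))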

  8. A Nakanishi-based model illustrating the covariant extension of the pion GPD overlap representation and its ambiguities

    NASA Astrophysics Data System (ADS)

    Chouika, N.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.

    2018-05-01

    A systematic approach for the model building of Generalized Parton Distributions (GPDs), based on their overlap representation within the DGLAP kinematic region and a further covariant extension to the ERBL one, is applied to the valence-quark pion's case, using light-front wave functions inspired by the Nakanishi representation of the pion Bethe-Salpeter amplitudes (BSA). This simple but fruitful pion GPD model illustrates the general model building technique and, in addition, allows for the ambiguities related to the covariant extension, grounded on the Double Distribution (DD) representation, to be constrained by requiring a soft-pion theorem to be properly observed.

  9. Coordinated Collaboration between Heterogeneous Distributed Energy Resources

    DOE PAGES

    Abdollahy, Shahin; Lavrova, Olga; Mammoli, Andrea

    2014-01-01

    A power distribution feeder, where a heterogeneous set of distributed energy resources is deployed, is examined by simulation. The energy resources include PV, battery storage, a natural gas GenSet, fuel cells, and active thermal storage for commercial buildings. The resource scenario considered is one that may exist in a not too distant future. Two cases of interaction between different resources are examined. One interaction involves a GenSet used to partially offset the duty cycle of a smoothing battery connected to a large PV system. The other example involves the coordination of twenty thermal storage devices, each associated with a commercial building. Storage devices are intended to provide maximum benefit to the building, but it is shown that this can have a deleterious effect on the overall system unless the action of the individual storage devices is coordinated. A network-based approach is also introduced to assign an effectiveness metric to all available resources that take part in coordinated operation. The main finding is that it is possible to achieve synergy between DERs on a system; however, this requires a unified strategy to coordinate the action of all devices in a decentralized way.

  10. Nest building is a novel method for indexing severity of alcohol withdrawal in mice.

    PubMed

    Greenberg, G D; Huang, L C; Spence, S E; Schlumbohm, J P; Metten, P; Ozburn, A R; Crabbe, J C

    2016-04-01

    Withdrawal after chronic ethanol (EtOH) affects body temperature, goal-directed behavior and motor function in mice and increases general central nervous system excitability. Nest-building tests have been used to assay these states but to this point have not been employed as measures of EtOH withdrawal severity. We first refined nest-scoring methods using a genetically heterogeneous stock of mice (HS/Npt). Mice were then made physically dependent following three days of chronic EtOH vapor inhalation to produce average blood EtOH concentrations (BECs) of 1.89 mg/mL. EtOH withdrawal affected the progression of nest building over time when mice were tested 2-4 days after removal from three days of chronic exposure to EtOH. In a separate group of mice, chronic EtOH vapor inhalation (BECs 1.84 mg/mL) suppressed nest building over days 1-2 but not days 2-3 of withdrawal. In a following experiment, EtOH withdrawal dose-dependently slowed recovery of nest building for up to 32 h. Finally, we determined that long-lasting nest-building deficits extend to mice undergoing withdrawal from a high dose (4 g/kg) of acute EtOH. Sex differences for nest building were absent following EtOH exposure. In mice naïve to EtOH treatments, male mice had lower pre-test body temperatures and increased nest scores across a two-day testing period compared to females. These results suggest that nest building can be used to assess chronic and acute EtOH withdrawal severity in mice. Published by Elsevier B.V.

  11. Rapidly locating and characterizing pollutant releases in buildings.

    PubMed

    Sohn, Michael D; Reynolds, Pamela; Singh, Navtej; Gadgil, Ashok J

    2002-12-01

    Releases of airborne contaminants in or near a building can lead to significant human exposures unless prompt response measures are taken. However, possible responses can include conflicting strategies, such as shutting the ventilation system off versus running it in a purge mode or having occupants evacuate versus sheltering in place. The proper choice depends in part on knowing the source locations, the amounts released, and the likely future dispersion routes of the pollutants. We present an approach that estimates this information in real time. It applies Bayesian statistics to interpret measurements of airborne pollutant concentrations from multiple sensors placed in the building and computes best estimates and uncertainties of the release conditions. The algorithm is fast, capable of continuously updating the estimates as measurements stream in from sensors. We demonstrate the approach using a hypothetical pollutant release in a five-room building. Unknowns to the interpretation algorithm include location, duration, and strength of the source, and some building and weather conditions. Two sensor sampling plans and three levels of data quality are examined. Data interpretation in all examples is rapid; however, locating and characterizing the source with high probability depends on the amount and quality of data and the sampling plan.
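
    A minimal Python sketch of the Bayesian updating idea (a posterior over a candidate release parameter updated as sensor measurements stream in); the one-parameter forward model, noise level and measurements are hypothetical placeholders, not the five-room model of the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      # Candidate source strengths (g) for a release in one assumed room; uniform prior.
      strengths = np.linspace(1.0, 100.0, 200)
      posterior = np.full(strengths.size, 1.0 / strengths.size)

      def predicted_concentration(strength, t):
          """Hypothetical forward model: sensor concentration versus time after release."""
          return strength * 0.05 * np.exp(-t / 30.0)

      sigma = 0.5                        # assumed sensor noise (concentration units)
      times = np.array([5.0, 10.0, 15.0])
      true_strength = 40.0
      measurements = predicted_concentration(true_strength, times) + rng.normal(0, sigma, times.size)

      # Bayesian update: posterior is proportional to prior times the measurement likelihood.
      for t, y in zip(times, measurements):
          likelihood = np.exp(-0.5 * ((y - predicted_concentration(strengths, t)) / sigma) ** 2)
          posterior *= likelihood
          posterior /= posterior.sum()

      print("posterior-mode release strength: %.1f g" % strengths[np.argmax(posterior)])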

  12. Long-Term Monitoring of Mini-Split Ductless Heat Pumps in the Northeast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueno, K.; Loomis, H.

    Transformations, Inc. has extensive experience building their high performance housing at a variety of Massachusetts locations, in both a production and custom home setting. The majority of their construction uses mini-split heat pumps (MSHPs) for space conditioning. This research covered the long-term performance of MSHPs in Zone 5A; it is the culmination of up to 3 years' worth of monitoring in a set of eight houses. This research examined electricity use of MSHPs, distributions of interior temperatures and humidity when using simplified (two-point) heating systems in high-performance housing, and the impact of open-door/closed-door status on temperature distributions. The use of simplified space conditioning distribution (through use of MSHPs) provides significant first cost savings, which are used to offset the increased investment in the building enclosure.

  13. Airborne agent concentration analysis

    DOEpatents

    Gelbard, Fred

    2004-02-03

    A method and system for inferring airborne contaminant concentrations in rooms without contaminant sensors, based on data collected by contaminant sensors in other rooms of a building, using known airflow interconnectivity data. The method solves a least squares problem that minimizes the difference between measured and predicted contaminant sensor concentrations with respect to an unknown contaminant release time. Solutions are constrained to providing non-negative initial contaminant concentrations in all rooms. The method can be used to identify a near-optimal distribution of sensors within the building when the number of available sensors is less than the total number of rooms. This is achieved by requiring a system-sensor matrix that is non-singular, and by selecting the distribution which yields the lowest condition number of all the distributions considered. The method can predict one or more contaminant initial release points from the collected data.
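
    A simplified Python sketch of the two ingredients named in the record, non-negative least squares inversion and condition-number-based sensor placement; it ignores the unknown release time and uses a made-up airflow interconnectivity matrix, so it illustrates the technique only and is not the patented method.

      import numpy as np
      from itertools import combinations
      from scipy.optimize import nnls

      rng = np.random.default_rng(3)

      n_rooms = 5
      # Hypothetical airflow interconnectivity (system) matrix relating initial room
      # concentrations to later sensor readings.
      A_full = np.eye(n_rooms) + 0.1 * rng.random((n_rooms, n_rooms))

      # Choose the 3-sensor placement whose system-sensor submatrix has the lowest
      # condition number (best conditioned, hence non-singular).
      best = min(combinations(range(n_rooms), 3),
                 key=lambda rows: np.linalg.cond(A_full[np.ix_(rows, rows)]))
      print("best-conditioned sensor rooms:", best)

      # Infer non-negative initial concentrations in the sensed rooms from noisy readings.
      A = A_full[np.ix_(best, best)]
      true_c0 = np.array([10.0, 0.0, 0.0])
      readings = A @ true_c0 + rng.normal(0, 0.05, 3)
      c0_est, _ = nnls(A, readings)
      print("estimated initial concentrations:", np.round(c0_est, 2))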

  14. Technology Solutions Case Study: Ventilation System Effectiveness and Tested Indoor Air Quality Impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Rudd and D. Bergey

    Ventilation system effectiveness testing was conducted at two unoccupied, single-family, detached lab homes at the University of Texas - Tyler. Five ventilation system tests were conducted with various whole-building ventilation systems. Multizone fan pressurization testing characterized building and zone enclosure leakage. PFT testing showed multizone air change rates and interzonal airflow filtration. Indoor air recirculation by a central air distribution system can help improve the exhaust ventilation system by way of air mixing and filtration. In contrast, the supply and balanced ventilation systems showed that there is a significant benefit to drawing outside air from a known outside location, and filtering and distributing that air. Compared to the Exhaust systems, the CFIS and ERV systems showed better ventilation air distribution and lower concentrations of particulates, formaldehyde and other VOCs.

  15. The distribution of indoor radon in Transylvania (Romania) - influence of the natural and anthropogenic factors

    NASA Astrophysics Data System (ADS)

    Cucos Dinu, Alexandra; Baciu, Calin; Dicu, Tiberius; Papp, Botond; Moldovan, Mircea; Bety Burghele, Denissa; Tenter, Ancuta; Szacsvai, Kinga

    2017-04-01

    Exposure to radon in homes and workplaces is now recognized as the most important natural factor in causing lung cancer. Radon activity is usually higher inside buildings than in the outside atmosphere, as radon may be released from building materials and from the soil beneath the constructions, and the concentration builds up indoors due to low air renewal rates. Indoor radon levels can vary by one to multiple orders of magnitude over time and space, as they depend on several natural and anthropogenic factors, such as the radon concentration in the soil under the construction, the weather conditions, the degree of containment in the areas where individuals are exposed, building materials, outside air, tap water and even city gas, the architecture, equipment (chimney, mechanical ventilation systems, etc.), the environmental parameters of the building (temperature, pressure, etc.), and the occupants' lifestyle. The study presents the distribution of indoor radon in Transylvania, Romania, together with measurements of radon in soil and soil water. Indoor radon measurements were performed using CR-39 track detectors exposed for 3 months at ground-floor level of dwellings, according to the NRPB Measurement Protocol. Radon concentrations in soil and water were measured using the LUK3C device. A complete map has been plotted to date, based on 3300 indoor radon measurements covering an area of about 42% of the Romanian territory. The indoor radon concentrations ranged from 5 to 3287 Bq m-3, with an updated preliminary arithmetic mean of 179 Bq m-3 and a geometric mean of 122 Bq m-3. In about 11% of the investigated grid cells the indoor radon concentrations exceed the threshold of 300 Bq m-3. The soil gas radon concentration varies from 0.8 to 169 kBq m-3, with a geometric mean of 26 kBq m-3. For water samples, the results show radon concentrations within the range of 0.3 - 352.2 Bq L-1, with a geometric mean of 7.7 Bq L-1. A weak correlation between the three sets of values (residential, soil, water) was observed, both for individual values and for values averaged at the grid-cell or county level. The highest concentrations of indoor radon were found in Bihor, Mures, Brasov, and Cluj. In these regions further investigation is needed into the factors influencing the accumulation of radon at high concentrations in indoor air, such as soil type and geology, ventilation, or constructive and architectural features. Acknowledgements: The research is supported by the project ID P_37_229, Contract No. 22/01.09.2016, with the title „Smart Systems for Public Safety through Control and Mitigation of Residential Radon linked with Energy Efficiency Optimization of Buildings in Romanian Major Urban Agglomerations SMART-RAD-EN" of the POC Programme.

  16. Survey of Secondary School Principals: Building Engineer Reporting Line Change. Report No. 8425.

    ERIC Educational Resources Information Center

    Farber, Irvin J.; Lytle, James H.

    This paper reports the results of a questionnaire distributed to all Philadelphia secondary school principals (with returns from 68 percent), eliciting their reactions to various aspects of the transfer to them of line authority for building engineers. Responses indicate that the process of assuming supervisory responsibility was not yet complete,…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes one such reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have BIM, a building automation system (BAS), and a computerized maintenance management system (CMMS). The process can be deployed using an open-source transactional network platform, VOLTTRON™, designed for distributed sensing and control, which supports both energy efficiency and grid services.

  18. 51. Interior of launch support building, minuteman power processor at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    51. Interior of launch support building, minuteman power processor at lower left, power distribution panel at center, old diesel control panel at lower right, diesel battery at upper right, view towards west - Ellsworth Air Force Base, Delta Flight, Launch Facility, On County Road T512, south of Exit 116 off I-90, Interior, Jackson County, SD

  19. Technological Supports for Onsite and Distance Education and Students' Perceptions of Acquisition of Thinking and Team-Building Skills

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.; Morin, Danielle

    2010-01-01

    This paper compares students' perceptions of support provided in the acquisition of various thinking and team-building skills, resulting from the various activities, resources and technologies (ART) integrated into an upper level Distributed Computing (DC) course. The findings indicate that students perceived strong support for their acquisition…

  20. 36 CFR § 702.13 - Soliciting, vending, debt collection, and distribution of handbills.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... LIBRARY OF CONGRESS CONDUCT ON LIBRARY PREMISES § 702.13 Soliciting, vending, debt collection, and... article for sale, or the collecting of private debts on the grounds or within the buildings of the Library...) Peddlers and solicitors will not be permitted to enter Library buildings unless they have a specific...

  1. Data Sharing in DHT Based P2P Systems

    NASA Astrophysics Data System (ADS)

    Roncancio, Claudia; Del Pilar Villamil, María; Labbé, Cyril; Serrano-Alvarado, Patricia

    The evolution of peer-to-peer (P2P) systems triggered the building of large-scale distributed applications. The main application domain is data sharing across a very large number of highly autonomous participants. Building such data sharing systems is particularly challenging because of the “extreme” characteristics of P2P infrastructures: massive distribution, high churn rate, no global control, potentially untrusted participants... This article focuses on declarative querying support, query optimization and data privacy on a major class of P2P systems, those based on Distributed Hash Tables (P2P DHT). The usual approaches and the algorithms used by classic distributed systems and databases for providing data privacy and querying services are not well suited to P2P DHT systems. A considerable amount of work was required to adapt them for the new challenges such systems present. This paper describes the most important solutions found. It also identifies important future research trends in data management in P2P DHT systems.
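
    A minimal sketch of the put/get primitive that underlies P2P DHT systems of the kind surveyed above, assuming a simple consistent-hashing ring; the node identifiers, hash width, and purely local in-memory stores are illustrative assumptions rather than details taken from the article:

        import hashlib
        from bisect import bisect_right

        def key_hash(key: str, bits: int = 16) -> int:
            """Map a key onto the identifier ring (truncated SHA-1, illustrative width)."""
            return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (1 << bits)

        class ToyDHT:
            """Consistent-hashing sketch: each key is stored on the first node whose
            identifier follows the key hash on the ring (its successor)."""

            def __init__(self, node_ids):
                self.nodes = sorted(node_ids)             # node identifiers on the ring
                self.store = {n: {} for n in self.nodes}  # per-node key/value store

            def _successor(self, h: int) -> int:
                idx = bisect_right(self.nodes, h) % len(self.nodes)
                return self.nodes[idx]

            def put(self, key: str, value):
                self.store[self._successor(key_hash(key))][key] = value

            def get(self, key: str):
                return self.store[self._successor(key_hash(key))].get(key)

        # Three autonomous peers share a key/value index without global control.
        dht = ToyDHT(node_ids=[1200, 23000, 51000])
        dht.put("doc:42", "replica-location-A")
        print(dht.get("doc:42"))

    In a real DHT the lookup is routed across the overlay in O(log N) hops rather than resolved locally, which is exactly why range queries, joins and, more generally, declarative querying are non-trivial on such systems.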

  2. Multivariate statistical analysis of radiological data of building materials used in Tiruvannamalai, Tamilnadu, India.

    PubMed

    Ravisankar, R; Vanasundari, K; Suganya, M; Raghu, Y; Rajalakshmi, A; Chandrasekaran, A; Sivakumar, S; Chandramohan, J; Vijayagopal, P; Venkatraman, B

    2014-02-01

    Using γ spectrometry, the concentration of the naturally occurring radionuclides (226)Ra, (232)Th and (40)K has been measured in soil, sand, cement, clay and bricks, which are used as building materials in Tiruvannamalai, Tamilnadu, India. The radium equivalent activity (Raeq), the criterion formula (CF), indoor gamma absorbed dose rate (DR), annual effective dose (HR), activity utilization index (AUI), alpha index (Iα), gamma index (Iγ), external radiation hazard index (Hex), internal radiation hazard index (Hin), representative level index (RLI), excess lifetime cancer risk (ELCR) and annual gonadal dose equivalent (AGDE) associated with the natural radionuclides are calculated to assess the radiation hazard of the natural radioactivity in the building materials. From the analysis, it is found that these materials used for the construction of dwellings are safe for the inhabitants. The radiological data were processed using multivariate statistical methods to determine the similarities and correlation among the various samples. The frequency distributions for all radionuclides were analyzed. The data set consisted of 15 measured variables. The Pearson correlation coefficient reveals that the (226)Ra distribution in building materials is controlled by the variation of the (40)K concentration. Principal component analysis (PCA) yields a two-component representation of the acquired data from the building materials in Tiruvannamalai, wherein 94.9% of the total variance is explained. The resulting dendrogram of hierarchical cluster analysis (HCA) classified the 30 building materials into four major groups using 15 variables. Copyright © 2013 Elsevier Ltd. All rights reserved.
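
    As a worked illustration of two of the indices listed above, the sketch below evaluates the radium equivalent activity and the external hazard index using the coefficients commonly adopted in the building-materials literature; the assumption that this paper uses the same coefficients, and the sample activity values themselves, are illustrative:

        def radium_equivalent(c_ra, c_th, c_k):
            """Radium equivalent activity (Bq/kg) from 226Ra, 232Th and 40K activities."""
            return c_ra + 1.43 * c_th + 0.077 * c_k

        def external_hazard_index(c_ra, c_th, c_k):
            """External hazard index Hex; values at or below 1 are usually taken as acceptable."""
            return c_ra / 370.0 + c_th / 259.0 + c_k / 4810.0

        # Illustrative (not measured) activity concentrations for one sample, in Bq/kg.
        sample = {"c_ra": 35.0, "c_th": 30.0, "c_k": 400.0}
        print(radium_equivalent(**sample))      # ~109 Bq/kg
        print(external_hazard_index(**sample))  # ~0.29

    For this illustrative sample both values fall below the commonly cited limits (370 Bq/kg for Raeq and 1 for Hex), which is the kind of comparison used to conclude that materials are safe for dwellings.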

  3. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    USGS Publications Warehouse

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities (the WG02 model). The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in the WG02 model. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions adopted in the loss calculations. This is a sensitivity study aimed at future regional earthquake source modelers, so that they may be informed of the effects on loss introduced by modeling assumptions and epistemic uncertainty in the WG02 earthquake source model.

  4. Indoor particle dynamics in a school office: determination of particle concentrations, deposition rates and penetration factors under naturally ventilated conditions.

    PubMed

    Cong, X C; Zhao, J J; Jing, Z; Wang, Q G; Ni, P F

    2018-05-09

    Recently, the problem of indoor particulate matter pollution has received much attention. An increasing number of epidemiological studies show that the concentration of atmospheric particulate matter has a significant effect on human health, even at very low concentrations. Most of these investigations have relied upon outdoor particle concentrations as surrogates of human exposures. However, considering that the concentration distribution of the indoor particulate matter is largely dependent on the extent to which these particles penetrate the building and on the degree of suspension in the indoor air, human exposures to particles of outdoor origin may not be equal to outdoor particle concentration levels. Therefore, it is critical to understand the relationship between the particle concentrations found outdoors and those found in indoor micro-environments. In this study, experiments were conducted using a naturally ventilated office located in Qingdao, China. The indoor and outdoor particle concentrations were measured at the same time using an optical counter with four size ranges. The particle size distribution ranged from 0.3 to 2.5 μm, and the experimental period was from April to September, 2016. Based on the experimental data, a time-dependent dynamic mass-balance model was used to estimate the penetration rate and deposition rate at air exchange rates of 0.03-0.25 h-1. The values of the penetration rate and deposition velocity of indoor particles were determined to range from 0.45 to 0.82 h-1 and 1.71 to 2.82 m/h, respectively. In addition, the particulate pollution exposure in the indoor environment was analyzed to estimate the exposure hazard from indoor particulate matter pollution, which is important for human exposure to particles and associated health effects. The conclusions from this study can provide a better understanding of the dynamics and behavior of airborne particles entering buildings, and can also highlight effective methods to reduce exposure to particles in office buildings.
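
    The penetration and deposition estimates above rest on a single-zone indoor/outdoor mass balance; the sketch below shows that standard model, assuming deposition is expressed directly as a first-order loss rate (the paper reports a deposition velocity, which would be converted through the room's surface-to-volume ratio) and using illustrative parameter values:

        import numpy as np

        def indoor_concentration(c_out, lam, P, k, c0=0.0, dt_h=0.01):
            """Forward-Euler integration of the single-zone mass balance
            dC_in/dt = P*lam*C_out(t) - (lam + k)*C_in(t),
            with lam = air exchange rate (1/h), P = penetration factor and
            k = deposition loss rate (1/h); c_out is sampled every dt_h hours."""
            c_in = np.empty(len(c_out))
            c_in[0] = c0
            for i in range(1, len(c_out)):
                dcdt = P * lam * c_out[i - 1] - (lam + k) * c_in[i - 1]
                c_in[i] = c_in[i - 1] + dcdt * dt_h
            return c_in

        # Illustrative run: constant outdoor level of 50 ug/m3 over 24 h.
        t = np.arange(0, 24, 0.01)
        c_out = np.full_like(t, 50.0)
        c_in = indoor_concentration(c_out, lam=0.15, P=0.6, k=0.5, c0=5.0)
        print(round(float(c_in[-1]), 1))  # approaches the steady state P*lam*C_out/(lam + k), about 6.9

    Fitting P and k to measured indoor and outdoor time series, as the study does, amounts to choosing the parameter pair that makes the modelled indoor concentration best match the observations.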

  5. Impact of water quality on chlorine demand of corroding copper.

    PubMed

    Lytle, Darren A; Liggett, Jennifer

    2016-04-01

    Copper is widely used in drinking water premise plumbing system materials. In buildings such as hospitals, large and complicated plumbing networks make it difficult to maintain good water quality. Sustaining safe disinfectant residuals throughout a building to protect against waterborne pathogens such as Legionella is particularly challenging since copper and other reactive distribution system materials can exert considerable demands. The objective of this work was to evaluate the impact of pH and orthophosphate on the consumption of free chlorine associated with corroding copper pipes over time. A copper test-loop pilot system was used to control test conditions and systematically meet the study objectives. Chlorine consumption trends attributed to abiotic reactions with copper over time were different for each pH condition tested, and the total amount of chlorine consumed over the test runs increased with increasing pH. Orthophosphate eliminated chlorine consumption trends with elapsed time (i.e., chlorine demand was consistent across entire test runs). Orthophosphate also greatly reduced the total amount of chlorine consumed over the test runs. Interestingly, the total amount of chlorine consumed and the consumption rate were not pH dependent when orthophosphate was present. The findings reflect the complex and competing reactions at the copper pipe wall including corrosion, oxidation of Cu(I) minerals and ions, and possible oxidation of Cu(II) minerals, and the change in chlorine species all as a function of pH. The work has practical applications for maintaining chlorine residuals in premise plumbing drinking water systems including large buildings such as hospitals. Published by Elsevier Ltd.

  6. Multi-layered fabrication of large area PDMS flexible optical light guide sheets

    NASA Astrophysics Data System (ADS)

    Green, Robert; Knopf, George K.; Bordatchev, Evgueni V.

    2017-02-01

    Large area polydimethylsiloxane (PDMS) flexible optical light guide sheets can be used to create a variety of passive light harvesting and illumination systems for wearable technology, advanced indoor lighting, non-planar solar light collectors, customized signature lighting, and enhanced safety illumination for motorized vehicles. These thin optically transparent micro-patterned polymer sheets can be draped over a flat or arbitrarily curved surface. The light guiding behavior of the optical light guides depends on the geometry and spatial distribution of micro-optical structures, thickness and shape of the flexible sheet, refractive indices of the constituent layers, and the wavelength of the incident light. A scalable fabrication method that combines soft-lithography, closed thin cavity molding, partial curing, and centrifugal casting is described in this paper for building thin large area multi-layered PDMS optical light guide sheets. The proposed fabrication methodology enables the formation of internal micro-optical structures (MOSs) in the monolithic PDMS light guide by building the optical system layer-by-layer. Each PDMS layer in the optical light guide can have a similar, or slightly different, index of refraction, permitting total internal reflection within the optical sheet. The individual molded layers may also be defect free or micro-patterned with microlenses or reflecting micro-features. In addition, the bond between adjacent layers is ensured because each layer is only partially cured before the next functional layer is added. To illustrate the scalable layer-by-layer fabrication method, a three-layer mechanically flexible illuminator with an embedded LED strip is constructed and demonstrated.

  7. An international survey of building energy codes and their implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Roshchanka, Volha; Graham, Peter

    Buildings are key to low-carbon development everywhere, and many countries have introduced building energy codes to improve energy efficiency in buildings. Yet, building energy codes can only deliver results when the codes are implemented. For this reason, studies of building energy codes need to consider implementation of building energy codes in a consistent and comprehensive way. This research identifies elements and practices in implementing building energy codes, covering codes in 22 countries that account for 70% of global energy demand from buildings. Access to benefits of building energy codes depends on comprehensive coverage of buildings by type, age, size, and geographic location; an implementation framework that involves a certified agency to inspect construction at critical stages; and independently tested, rated, and labeled building energy materials. Training and supporting tools are another element of successful code implementation, and their role is growing in importance, given the increasing flexibility and complexity of building energy codes. Some countries have also introduced compliance evaluation and compliance checking protocols to improve implementation. This article provides examples of practices that countries have adopted to assist with implementation of building energy codes.

  8. A long-term, integrated impact assessment of alternative building energy code scenarios in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sha; Eom, Jiyong; Evans, Meredydd

    2014-04-01

    China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption. Beginning in the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones each with three building types, nested in a long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13% - 22% depending on building code scenarios, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.

  9. Impact of water quality on chlorine demand of corroding copper

    EPA Pesticide Factsheets

    Copper is widely used in drinking water premise plumbing system materials. In buildings such as hospitals, large and complicated plumbing networks make it difficult to maintain good water quality. Sustaining safe disinfectant residuals throughout a building to protect against waterborne pathogens such as Legionella is particularly challenging since copper and other reactive distribution system materials can exert considerable demands. The objective of this work was to evaluate the impact of pH and orthophosphate on the consumption of free chlorine associated with corroding copper pipes over time. A copper test-loop pilot system was used to control test conditions and systematically meet the study objectives. Chlorine consumption trends attributed to abiotic reactions with copper over time were different for each pH condition tested, and the total amount of chlorine consumed over the test runs increased with increasing pH. Orthophosphate eliminated chlorine consumption trends with elapsed time (i.e., chlorine demand was consistent across entire test runs). Orthophosphate also greatly reduced the total amount of chlorine consumed over the test runs. Interestingly, the total amount of chlorine consumed and the consumption rate were not pH dependent when orthophosphate was present. The findings reflect the complex and competing reactions at the copper pipe wall including corrosion, oxidation of Cu(I) minerals and ions, and possible oxidation of Cu(II) minerals, and the change in

  10. A network-based framework for assessing infrastructure resilience: a case study of the London metro system.

    PubMed

    Chopra, Shauhrat S; Dillon, Trent; Bilec, Melissa M; Khanna, Vikas

    2016-05-01

    Modern society is increasingly dependent on the stability of a complex system of interdependent infrastructure sectors. It is imperative to build the resilience of large-scale infrastructures such as metro systems to address the threat of natural disasters and man-made attacks in urban areas. Analysis is needed to ensure that these systems are capable of withstanding and containing unexpected perturbations, and to develop heuristic strategies for guiding the design of more resilient networks in the future. We present a comprehensive, multi-pronged framework that analyses information on network topology, spatial organization and passenger flow to understand the resilience of the London metro system. Topology of the London metro system is not fault tolerant in terms of maintaining connectivity at the periphery of the network since it does not exhibit small-world properties. The passenger strength distribution follows a power law, suggesting that while the London metro system is robust to random failures, it is vulnerable to disruptions on a few critical stations. The analysis further identifies particular sources of structural and functional vulnerabilities that need to be mitigated for improving the resilience of the London metro network. The insights from our framework provide useful strategies to build resilience for both existing and upcoming metro systems. © 2016 The Author(s).
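
    A minimal sketch of the kind of topological screening described above, using networkx on a toy graph; the metrics shown (clustering, characteristic path length, betweenness centrality) are standard ingredients of small-world and criticality analyses, and the toy network is an assumption, not the London metro data:

        import networkx as nx

        def topological_summary(g: nx.Graph) -> dict:
            """A few indicators used in such resilience studies: clustering and
            path length (small-world ingredients) and the most 'critical'
            station by betweenness centrality."""
            return {
                "avg_clustering": nx.average_clustering(g),
                "avg_shortest_path": nx.average_shortest_path_length(g),
                "most_central_station": max(
                    nx.betweenness_centrality(g).items(), key=lambda kv: kv[1]
                )[0],
            }

        # Illustrative toy network (not the London metro): a small branching line.
        g = nx.Graph()
        g.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("C", "E"), ("E", "F")])
        print(topological_summary(g))

    The same summary recomputed after removing a station (or after weighting edges by passenger flow) gives a first, purely structural measure of how much a disruption at that station would degrade the network.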

  11. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
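
    The empirical model referenced above combines population exposure per shaking-intensity bin with a country-calibrated fatality-rate curve; the sketch below uses the lognormal functional form reported for that model, but the parameter values and exposure figures are hypothetical:

        from math import erf, log, sqrt

        def fatality_rate(mmi, theta, beta):
            """Lognormal fatality-rate curve nu(S) = Phi(ln(S/theta) / beta),
            evaluated with the standard normal CDF written via erf."""
            x = log(mmi / theta) / beta
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def expected_fatalities(exposure_by_mmi, theta, beta):
            """Sum, over intensity bins, of exposed population times the rate."""
            return sum(pop * fatality_rate(mmi, theta, beta)
                       for mmi, pop in exposure_by_mmi.items())

        # Hypothetical exposure (people per MMI bin) and country parameters.
        exposure = {6.0: 200_000, 7.0: 80_000, 8.0: 10_000}
        print(round(expected_fatalities(exposure, theta=12.0, beta=0.2), 1))

    In the operational system the exposure comes from ShakeMap intensities overlaid on population grids, and the curve parameters are calibrated per country from losses in past earthquakes.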

  12. Scalable Indoor Localization via Mobile Crowdsourcing and Gaussian Process

    PubMed Central

    Chang, Qiang; Li, Qun; Shi, Zesen; Chen, Wei; Wang, Weiping

    2016-01-01

    Indoor localization using Received Signal Strength Indication (RSSI) fingerprinting has been extensively studied for decades. The positioning accuracy is highly dependent on the density of the signal database. In areas without calibration data, however, this algorithm breaks down. Building and updating a dense signal database is labor intensive, expensive, and even impossible in some areas. Researchers are continually searching for better algorithms to create and update dense databases more efficiently. In this paper, we propose a scalable indoor positioning algorithm that works both in surveyed and unsurveyed areas. We first propose Minimum Inverse Distance (MID) algorithm to build a virtual database with uniformly distributed virtual Reference Points (RP). The area covered by the virtual RPs can be larger than the surveyed area. A Local Gaussian Process (LGP) is then applied to estimate the virtual RPs’ RSSI values based on the crowdsourced training data. Finally, we improve the Bayesian algorithm to estimate the user’s location using the virtual database. All the parameters are optimized by simulations, and the new algorithm is tested on real-case scenarios. The results show that the new algorithm improves the accuracy by 25.5% in the surveyed area, with an average positioning error below 2.2 m for 80% of the cases. Moreover, the proposed algorithm can localize the users in the neighboring unsurveyed area. PMID:26999139
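
    A simplified sketch of the core idea of estimating a dense virtual fingerprint database from sparse crowdsourced observations; it uses a plain scikit-learn Gaussian process rather than the paper's MID/LGP construction, and the coordinates, path-loss model, and grid are synthetic assumptions:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Synthetic crowdsourced data: (x, y) positions in metres and noisy RSSI
        # (dBm) observations for a single access point placed at (10, 10).
        rng = np.random.default_rng(0)
        train_xy = rng.uniform(0, 20, size=(60, 2))
        true_rssi = -40 - 20 * np.log10(1 + np.linalg.norm(train_xy - [10, 10], axis=1))
        train_rssi = true_rssi + rng.normal(0, 2, size=60)

        # Fit a Gaussian process and predict RSSI on a uniform grid of virtual
        # reference points; the predictions become the "virtual database".
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0) + WhiteKernel(2.0),
                                      normalize_y=True)
        gp.fit(train_xy, train_rssi)

        gx, gy = np.meshgrid(np.linspace(0, 20, 11), np.linspace(0, 20, 11))
        virtual_rps = np.column_stack([gx.ravel(), gy.ravel()])
        virtual_rssi, virtual_std = gp.predict(virtual_rps, return_std=True)
        print(virtual_rps.shape, round(float(virtual_std.mean()), 2))

    The predictive standard deviation returned alongside the mean is what allows the virtual grid to extend somewhat beyond the surveyed area while flagging where the estimates become unreliable.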

  13. Limestone characterization to model damage from acidic precipitation: Effect of pore structure on mass transfer

    USGS Publications Warehouse

    Leith, S.D.; Reddy, M.M.; Irez, W.F.; Heymans, M.J.

    1996-01-01

    The pore structure of Salem limestone is investigated, and conclusions regarding the effect of the pore geometry on modeling moisture and contaminant transport are discussed based on thin section petrography, scanning electron microscopy, mercury intrusion porosimetry, and nitrogen adsorption analyses. These investigations are compared to and shown to complement permeability and capillary pressure measurements for this common building stone. Salem limestone exhibits a bimodal pore size distribution in which the larger pores provide routes for convective mass transfer of contaminants into the material and the smaller pores lead to high surface area adsorption and reaction sites. Relative permeability and capillary pressure measurements of the air/water system indicate that Salem limestone exhibits high capillarity and low effective permeability to water. Based on stone characterization, aqueous diffusion and convection are believed to be the primary transport mechanisms for pollutants in this stone. The extent of contaminant accumulation in the stone depends on the mechanism of partitioning between the aqueous and solid phases. The described characterization techniques and modeling approach can be applied to many systems of interest such as acidic damage to limestone, mass transfer of contaminants in concrete and other porous building materials, and modeling pollutant transport in subsurface moisture zones.

  14. Evidence for Multiple Phototransduction Pathways in a Reef-Building Coral

    PubMed Central

    Mason, Benjamin; Schmale, Michael; Gibbs, Patrick; Miller, Margaret W.; Wang, Qiang; Levay, Konstantin; Shestopalov, Valery; Slepak, Vladlen Z.

    2012-01-01

    Photosensitive behaviors and circadian rhythms are well documented in reef-building corals and their larvae, but the mechanisms responsible for photoreception have not been described in these organisms. Here we report the cloning, immunolocalization, and partial biochemical characterization of three opsins and four G proteins expressed in planulae of the Caribbean elkhorn coral, Acropora palmata. All three opsins (acropsins 1–3) possess conserved seven-pass transmembrane structure, and localize to distinct regions of coral planulae. Acropsin 1 was localized in the larval endoderm, while acropsin 2 was localized in solitary cells of the ectoderm. These rod-like cells displayed a remarkably polarized distribution, concentrated in the aboral end. We also cloned four A. palmata G protein alpha subunits. Three were homologs of vertebrate Gi, Go, and Gq. The fourth is presumably a novel G protein, which displays only 40% identity with the nearest known G protein, and we termed it Gc for “cnidarian”. We show that Gc and Gq can be activated by acropsins in a light-dependent manner in vitro. This indicates that at least acropsins 1 and 3 can form functional photoreceptors and potentially may play a role in color preference during settlement, vertical positioning and other light-guided behaviors observed in coral larvae. PMID:23227169

  15. Self-Monitoring and Knowledge-Building in Learning by Teaching

    ERIC Educational Resources Information Center

    Roscoe, Rod D.

    2014-01-01

    Prior research has established that learning by teaching depends upon peer tutors' engagement in knowledge-building, in which tutors integrate their knowledge and generate new knowledge through reasoning. However, many tutors adopt a "knowledge-telling bias" defined by shallow summarizing of source materials and didactic lectures.…

  16. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The…

  17. Land use and urban morphology parameters for Vienna required for initialisation of the urban canopy model TEB derived via the concept of "local climate zones"

    NASA Astrophysics Data System (ADS)

    Trimmel, Heidelinde; Weihs, Philipp; Oswald, Sandro M.; Masson, Valéry; Schoetter, Robert

    2017-04-01

    Urban settlements are generally known for their high fractions of impermeable surfaces, large heat capacity and low humidity compared to rural areas, which results in the well-known phenomenon of the urban heat island. Urbanized areas are growing, which can amplify the intensity and frequency of situations with heat stress. The distribution of the urban heat island is not uniform, however, because the urban environment is highly diverse in its morphology: building heights, building contiguity and the configuration of open spaces and trees vary, causing changes in the aerodynamic resistance for heat transfer and in the drag coefficients for momentum. Cities are furthermore characterized by highly variable physical surface properties such as albedo, emissivity, heat capacity and thermal conductivity. The distribution of the urban heat island is influenced by these morphological and physical parameters as well as by the distribution of unsealed soil and vegetation, which affect the urban climate at the micro- and mesoscale. For the greater Vienna area, high-resolution vector and raster geodatasets were processed to derive land use surface fractions and building morphology parameters at block scale, following the methodology of Cordeau (2016). A dataset of building age and typology was cross-checked and extended using visual and thermal satellite bands and linked to a database joining building age and typology with typical physical building parameters obtained from different studies (Berger et al. 2012; Amtmann and Altmann-Mavaddat 2014) and from the OIB (Österreichisches Institut für Bautechnik). Using the dominant parameters obtained from these high-resolution, mainly ground-based datasets (building height, built area fraction, unsealed fraction, sky view factor), a local climate zone classification was produced algorithmically, with threshold values chosen according to Stewart and Oke (2012). This approach is compared to results obtained with the methodology of Bechtel et al. (2015), which is based on machine-learning algorithms applied to satellite imagery and expert knowledge. The data on urban land use and morphology are used to initialise the town energy balance scheme TEB, but are also useful for other urban canopy models and for studies related to urban planning or modelling of the urban system. The sensitivity of canyon air and surface temperatures, specific humidity and horizontal wind simulated by the town energy balance scheme TEB (Masson, 2000) to the dominant parameters, within the range determined for the present urban structure of Vienna and the expected changes (MA 18 2011, 2014a, 2014b; PGO 2011; Amtmann and Altmann-Mavaddat 2014), was calculated for different land cover zones. While building heights have a standard deviation of 3.2 m, about 15% of the maximum block-average building height, the built and unsealed surface fractions vary more strongly, with standard deviations of around 30%. The pre-1919 building structure of Vienna is rather uniform and easier to describe, whereas the later building stock is more diverse in both morphological and physical parameters. The largest uncertainties are therefore expected at the urban fringes, where most new development is also expected; the analysis will focus on these areas.
Amtmann M, Altmann-Mavaddat N (2014) Eine Typologie österreichischer Wohngebäude. Österreichische Energieagentur - Austrian Energy Agency, TABULA/EPISCOPE
Bechtel B, Alexander P, Böhner J, et al (2015) Mapping Local Climate Zones for a Worldwide Database of the Form and Function of Cities. ISPRS Int J Geo-Inf 4:199-219. doi: 10.3390/ijgi4010199
Berger T, Formayer H, Smutny R, Neururer C, Passawa R (2012) Auswirkungen des Klimawandels auf den thermischen Komfort in Bürogebäuden. Berichte aus Energie- und Umweltforschung
Cordeau E (2016) Les îlots morphologiques urbains (IMU). IAU îdF
Magistratsabteilung 18 - Stadtentwicklung und Stadtplanung, Wien - MA 18 (2011) Siedlungsformen für die Stadterweiterung
MA 18 (2014a) Smart City Wien - Rahmenstrategie
MA 18 (2014b) Stadtentwicklungsplan STEP 2025, www.step.wien.at
Masson V (2000) A physically-based scheme for the urban energy budget in atmospheric models. Bound-Layer Meteorol 94:357-397. doi: 10.1023/A:1002463829265
PGO (Planungsgemeinschaft Ost) (2011) stadtregion+, Planungskooperation zur räumlichen Entwicklung der Stadtregion Wien Niederösterreich Burgenland
Stewart ID, Oke TR (2012) Local climate zones for urban temperature studies. Bull Am Meteorol Soc 93:1879-1900.
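
    A reduced sketch of the threshold-based local climate zone assignment described above; it uses only two of the parameters (block-mean building height and built area fraction) and approximate cut-offs in the spirit of Stewart and Oke (2012), so both the thresholds and the example block values are illustrative:

        def classify_built_lcz(mean_height_m, built_fraction):
            """Crude assignment of the built local climate zones from two
            block-scale parameters; real schemes also use sky view factor,
            aspect ratio, surface materials and vegetation fractions."""
            if built_fraction >= 0.4:
                density = "compact"
            elif built_fraction >= 0.2:
                density = "open"
            else:
                return "sparsely built or natural"
            if mean_height_m > 25:
                height = "high-rise"
            elif mean_height_m >= 10:
                height = "midrise"
            else:
                height = "low-rise"
            return f"{density} {height}"

        # Hypothetical block-scale values for an inner-district Viennese block.
        print(classify_built_lcz(mean_height_m=18.0, built_fraction=0.55))  # compact midrise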

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuscamman, Stephanie J.

    This section describes ways in which an urban environment can affect the distribution of airborne radiological material. In an urban area, winds at street level are significantly more variable and complex than the prevailing winds above the buildings. Elevated winds may be uniform and representative of the general flow over the surrounding area, but buildings influence the local flow such that the winds below the building heights vary significantly in location and time (Hanna et al 2006). For a release of material near an individual building, the complex effect of the building on the airflow may locally enhance the air concentration of released material in some regions near the building and reduce it in others compared to a release in open terrain. However, the overall effect of an individual building is to induce a rapid enlargement and dilution of an incident plume from an isolated source upwind of the building (Hosker 1984). A plume spreading through an urban environment of multiple buildings will experience enhanced mixing and greater spreading of the contaminant plume in both the vertical and horizontal directions, compared to the same release in open terrain.

  19. Retrofit of a Multifamily Mass Masonry Building in New England

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueno, K.; Kerrigan, P.; Wytrykowska, H.

    2013-08-01

    Merrimack Valley Habitat for Humanity (MVHfH) has partnered with Building Science Corporation to provide high performance affordable housing for 10 families in the retrofit of an existing brick building (a former convent) into condominiums. The research performed for this project provides information regarding advanced retrofit packages for multi-family masonry buildings in Cold climates. In particular, this project demonstrates safe, durable, and cost-effective solutions that will potentially benefit millions of multi-family brick buildings throughout the East Coast and Midwest (Cold climates). The retrofit packages provide insight on the opportunities for and constraints on retrofitting multifamily buildings with ambitious energy performance goals but a limited budget. The condominium conversion project will contribute to several areas of research on enclosures, space conditioning, and water heating. Enclosure items include insulation of mass masonry building on the interior, airtightness of these types of retrofits, multi-unit building compartmentalization, window selection, and roof insulation strategies. Mechanical system items include combined hydronic and space heating systems with hydronic distribution in small (low load) units, and ventilation system retrofits for multifamily buildings.

  20. Retrofit of a MultiFamily Mass Masonry Building in New England

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueno, K.; Kerrigan, P.; Wytrykowska, H.

    2013-08-01

    Merrimack Valley Habitat for Humanity (MVHfH) has partnered with Building Science Corporation to provide high performance affordable housing for 10 families in the retrofit of an existing brick building (a former convent) into condominiums. The research performed for this project provides information regarding advanced retrofit packages for multi-family masonry buildings in Cold climates. In particular, this project demonstrates safe, durable, and cost-effective solutions that will potentially benefit millions of multi-family brick buildings throughout the East Coast and Midwest (Cold climates). The retrofit packages provide insight on the opportunities for and constraints on retrofitting multifamily buildings with ambitious energy performance goals but a limited budget. The condominium conversion project will contribute to several areas of research on enclosures, space conditioning, and water heating. Enclosure items include insulation of mass masonry building on the interior, airtightness of these types of retrofits, multi-unit building compartmentalization, window selection, and roof insulation strategies. Mechanical system items include combined hydronic and space heating systems with hydronic distribution in small (low load) units, and ventilation system retrofits for multifamily buildings.

  1. Distributed Knowledge Base Systems for Diagnosis and Information Retrieval.

    DTIC Science & Technology

    1983-11-01

    social system metaphors State University. for distributed problem solving: Introduction to the issue. IEEE Newell, A. and Simon, H. A. (1972) Human...experts and Sriram Mahalingam who helped think out the problems associated with building Auto-Mech. Research on diagnostic expert systems for the

  2. APPLICATION OF A FULLY DISTRIBUTED WASHOFF AND TRANSPORT MODEL FOR A GULF COAST WATERSHED

    EPA Science Inventory

    Advances in hydrologic modeling have been shown to improve the accuracy of rainfall runoff simulation and prediction. Building on the capabilities of distributed hydrologic modeling, a water quality model was developed to simulate buildup, washoff, and advective transport of a co...

  3. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    PubMed Central

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution. PMID:18094468

  4. Brief communication: coaxial lines for multiphase power distribution.

    PubMed

    Barnes, F S; Harwick, P; Banerjee, A

    1996-01-01

    A coaxial cable can be used to reduce the magnetic and electric fields that extend into environments in the vicinity of transmission lines and distribution lines and in-house or building wiring for power distribution systems. The use of the coaxial geometry may prove useful in cases where there are environmental concerns with respect to health effects and in cases where there is a need to run high-speed data communications in close proximity to power distribution systems.

  5. A distributed component framework for science data product interoperability

    NASA Technical Reports Server (NTRS)

    Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.

    2000-01-01

    Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.

  6. Cryptanalysis of the Sodark Family of Cipher Algorithms

    DTIC Science & Technology

    2017-09-01

    A software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements... Approved for public release; distribution is unlimited. The...second- and third-generation automatic link establishment (ALE) systems for high frequency radios. Radios utilizing ALE technology are in use by a

  7. The building loads analysis system thermodynamics (BLAST) program, Version 2. 0: input booklet. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sowell, E.

    1979-06-01

    The Building Loads Analysis and System Thermodynamics (BLAST) program is a comprehensive set of subprograms for predicting energy consumption in buildings. There are three major subprograms: (1) the space load predicting subprogram, which computes hourly space loads in a building or zone based on user input and hourly weather data; (2) the air distribution system simulation subprogram, which uses the computed space load and user inputs describing the building air-handling system to calculate hot water or steam, chilled water, and electric energy demands; and (3) the central plant simulation program, which simulates boilers, chillers, onsite power generating equipment and solar energy systems and computes monthly and annual fuel and electrical power consumption and plant life cycle cost.

  8. The potential for distributed generation in Japanese prototype buildings: A DER-CAM analysis of policy, tariff design, building energy use, and technology development (English Version)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Nan; Marnay, Chris; Firestone, Ryan

    The August 2003 blackout of the northeastern U.S. and Canada caused great economic losses and inconvenience to New York City and other affected areas. The blackout was a warning to the rest of the world that the ability of conventional power systems to meet growing electricity demand is questionable. Failure of large power systems can lead to serious emergencies. Introduction of on-site generation, renewable energy such as solar and wind power and the effective utilization of exhaust heat is needed to meet the growing energy demands of the residential and commercial sectors. Additional benefit can be achieved by integrating these distributed technologies into distributed energy resource (DER) systems. This work demonstrates a method for choosing and designing economically optimal DER systems. An additional purpose of this research is to establish a database of energy tariffs, DER technology cost and performance characteristics, and building energy consumption for Japan. This research builds on prior DER studies at the Ernest Orlando Lawrence Berkeley National Laboratory (LBNL) and with their associates in the Consortium for Electric Reliability Technology Solutions (CERTS) and operation, including the development of the microgrid concept, and the DER selection optimization program, the Distributed Energy Resources Customer Adoption Model (DER-CAM). DER-CAM is a tool designed to find the optimal combination of installed equipment and an idealized operating schedule to minimize a site's energy bills, given performance and cost data on available DER technologies, utility tariffs, and site electrical and thermal loads over a test period, usually an historic year. Since hourly electric and thermal energy data are rarely available, they are typically developed by building simulation for each of six end use loads used to model the building: electric-only loads, space heating, space cooling, refrigeration, water heating, and natural-gas-only loads. DER-CAM provides a global optimization, albeit idealized, that shows how the necessary useful energy loads can be provided for at minimum cost by selection and operation of on-site generation, heat recovery, cooling, and efficiency improvements. This study examines five prototype commercial buildings and uses DER-CAM to select the economically optimal DER system for each. The five building types are office, hospital, hotel, retail, and sports facility. Each building type was considered for both 5,000 and 10,000 square meter floor sizes. The energy consumption of these building types is based on building energy simulation and published literature. Based on the optimization results, energy conservation and the emissions reduction were also evaluated. Furthermore, a comparison study between Japan and the U.S. has been conducted covering the policy, technology and the utility tariff effects on DER systems installations. This study begins with an examination of existing DER research. Building energy loads were then generated through simulation (DOE-2) and scaled to match available load data in the literature. Energy tariffs in Japan and the U.S. were then compared: electricity prices did not differ significantly, while commercial gas prices in Japan are much higher than in the U.S. For smaller DER systems, the installation costs in Japan are more than twice those in the U.S., but this difference becomes smaller with larger systems. In Japan, DER systems are eligible for a 1/3 rebate of installation costs, while subsidies in the U.S.
vary significantly by region and application. For 10,000 m² buildings, significant decreases in fuel consumption, carbon emissions, and energy costs were seen in the economically optimal results. This was most noticeable in the sports facility, followed by the hospital and hotel. This research demonstrates that office buildings can benefit from CHP, in contrast to popular opinion. For hospitals and sports facilities, the use of waste heat is particularly effective for water and space heating. For the other building types, waste heat is most effectively used for both heating and cooling. The same examination was done for the 5,000 m² buildings. Although CHP installation capacity is smaller and the payback periods are longer, economic, fuel efficiency, and environmental benefits are still seen. While these benefits remain even when subsidies are removed, the increased installation costs lead to lower levels of installation capacity and thus lower benefits.

  9. Fungal colonization of fiberglass insulation in the air distribution system of a multi-story office building: VOC production and possible relationship to a sick building syndrome.

    PubMed

    Ahearn, D G; Crow, S A; Simmons, R B; Price, D L; Noble, J A; Mishra, S K; Pierson, D L

    1996-05-01

    Complaints characteristic of those for sick building syndrome prompted mycological investigations of a modern multi-story office building on the Gulf coast in the Southeastern United States (Houston-Galveston area). The air handling units and fiberglass duct liner of the heating, ventilating and air conditioning system of the building, without a history of catastrophic or chronic water damage, demonstrated extensive colonization with Penicillium spp and Cladosporium herbarum. Although dense fungal growth was observed on surfaces within the heating-cooling system, most air samples yielded fewer than 200 CFU m-3. Several volatile compounds found in the building air were released also from colonized fiberglass. Removal of colonized insulation from the floor receiving the majority of complaints of mouldy air and continuous operation of the units supplying this floor resulted in a reduction in the number of complaints.

  10. Fungal colonization of fiberglass insulation in the air distribution system of a multi-story office building: VOC production and possible relationship to a sick building syndrome

    NASA Technical Reports Server (NTRS)

    Ahearn, D. G.; Crow, S. A.; Simmons, R. B.; Price, D. L.; Noble, J. A.; Mishra, S. K.; Pierson, D. L.

    1996-01-01

    Complaints characteristic of those for sick building syndrome prompted mycological investigations of a modern multi-story office building on the Gulf coast in the Southeastern United States (Houston-Galveston area). The air handling units and fiberglass duct liner of the heating, ventilating and air conditioning system of the building, without a history of catastrophic or chronic water damage, demonstrated extensive colonization with Penicillium spp and Cladosporium herbarum. Although dense fungal growth was observed on surfaces within the heating-cooling system, most air samples yielded fewer than 200 CFU m-3. Several volatile compounds found in the building air were released also from colonized fiberglass. Removal of colonized insulation from the floor receiving the majority of complaints of mouldy air and continuous operation of the units supplying this floor resulted in a reduction in the number of complaints.

  11. Building Area Extraction from Polarimetric SAR Data via Stationarity Detection and Circular-Pol Correlation Coefficient

    NASA Astrophysics Data System (ADS)

    Xiang, Deliang; Su, Yi; Ban, Yifeng

    2015-04-01

    Building area extraction is a challenging problem because buildings have complex geometries and may be misclassified as forests or mountains with volume scattering, owing to their significant cross-pol backscatter and lack of reflection symmetry; this is especially true of slant-oriented buildings. In this paper, the time-frequency decomposition technique is adopted to acquire subaperture images, which correspond to the same scene responses under different azimuthal look angles. A stationarity detection approach based on the polarimetric G0 distribution is proposed to extract ortho-oriented buildings, and the circular polarization correlation coefficient is optimal for characterizing slant-oriented buildings. We test the method on an L-band ESAR image. The results demonstrate that the proposed method can effectively extract both ortho-oriented and slant-oriented buildings, and that the overall detection accuracy as well as the kappa value is 10%-20% higher than for the compared methods.

  12. Transactive Control of Commercial Buildings for Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Corbin, Charles D.; Kalsi, Karanjit

    Transactive control is a type of distributed control strategy that uses a market mechanism to engage self-interested responsive loads to achieve power balance in the electrical power grid. In this paper, we propose a transactive control approach for commercial building Heating, Ventilation, and Air-Conditioning (HVAC) systems for demand response. We first describe the system models, and identify their model parameters using data collected from the Systems Engineering Building (SEB) located on our Pacific Northwest National Laboratory (PNNL) campus. We next present a transactive control market structure for commercial building HVAC systems, and describe its agent bidding and market clearing strategies. Several case studies are performed in a simulation environment using the Building Control Virtual Test Bed (BCVTB) and a calibrated SEB EnergyPlus model. We show that the proposed transactive control approach is very effective at peak clipping, load shifting, and strategic conservation for commercial building HVAC systems.
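
    A minimal sketch of a uniform-price double auction of the kind a transactive market clearing can use; this is a generic illustration rather than the clearing logic of the implementation described above, and the bids, offers, and midpoint-price convention are assumptions:

        def clear_market(demand_bids, supply_offers):
            """Sort demand bids (price, quantity) descending and supply offers
            ascending, then match quantity until the next bid price falls
            below the next offer price; returns (clearing price, cleared quantity)."""
            demand = sorted(demand_bids, reverse=True)   # willingness to pay, high to low
            supply = sorted(supply_offers)               # willingness to sell, low to high
            cleared, price = 0.0, None
            di, si = 0, 0
            d_left, s_left = demand[0][1], supply[0][1]
            while di < len(demand) and si < len(supply) and demand[di][0] >= supply[si][0]:
                q = min(d_left, s_left)
                cleared += q
                price = 0.5 * (demand[di][0] + supply[si][0])  # midpoint convention
                d_left -= q
                s_left -= q
                if d_left == 0:
                    di += 1
                    d_left = demand[di][1] if di < len(demand) else 0.0
                if s_left == 0:
                    si += 1
                    s_left = supply[si][1] if si < len(supply) else 0.0
            return price, cleared

        # Zone agents bid for cooling power (price in $/kWh, quantity in kW);
        # the building offers chilled-air capacity in two price tiers.
        bids = [(0.30, 5.0), (0.22, 8.0), (0.15, 6.0)]
        offers = [(0.10, 10.0), (0.25, 10.0)]
        print(clear_market(bids, offers))  # -> (0.16, 10.0)

    Running such a clearing every market interval, with bid prices tied to each zone's deviation from its temperature set point, is what lets the aggregate HVAC load shift or shed in response to a grid-level price signal.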

  13. Using sketch-map coordinates to analyze and bias molecular dynamics simulations

    PubMed Central

    Tribello, Gareth A.; Ceriotti, Michele; Parrinello, Michele

    2012-01-01

    When examining complex problems, such as the folding of proteins, coarse grained descriptions of the system drive our investigation and help us to rationalize the results. Oftentimes collective variables (CVs), derived through some chemical intuition about the process of interest, serve this purpose. Because finding these CVs is the most difficult part of any investigation, we recently developed a dimensionality reduction algorithm, sketch-map, that can be used to build a low-dimensional map of a phase space of high-dimensionality. In this paper we discuss how these machine-generated CVs can be used to accelerate the exploration of phase space and to reconstruct free-energy landscapes. To do so, we develop a formalism in which high-dimensional configurations are no longer represented by low-dimensional position vectors. Instead, for each configuration we calculate a probability distribution, which has a domain that encompasses the entirety of the low-dimensional space. To construct a biasing potential, we exploit an analogy with metadynamics and use the trajectory to adaptively construct a repulsive, history-dependent bias from the distributions that correspond to the previously visited configurations. This potential forces the system to explore more of phase space by making it desirable to adopt configurations whose distributions do not overlap with the bias. We apply this algorithm to a small model protein and succeed in reproducing the free-energy surface that we obtain from a parallel tempering calculation. PMID:22427357
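
    A reduced sketch of the history-dependent repulsive bias described above, accumulated in a two-dimensional sketch-map space; for simplicity the visited configurations are represented here by points with Gaussian hills (as in metadynamics) rather than by the probability distributions the paper introduces, and all numerical values are illustrative:

        import numpy as np

        def bias_energy(s, visited, sigma=0.3, w=1.0):
            """Repulsive bias at the low-dimensional point s: a sum of Gaussians
            of width sigma and height w centred on previously visited points."""
            visited = np.asarray(visited, dtype=float)
            if visited.size == 0:
                return 0.0
            d2 = np.sum((visited - s) ** 2, axis=1)
            return float(w * np.sum(np.exp(-d2 / (2.0 * sigma ** 2))))

        # Accumulate the bias along a short artificial trajectory in 2-D.
        trajectory = [np.array([0.0, 0.0]), np.array([0.1, 0.0]), np.array([1.0, 1.0])]
        history = []
        for s in trajectory:
            print(round(bias_energy(s, history), 3))  # bias felt before this point is added
            history.append(s)

    Because revisited regions accumulate bias, configurations whose low-dimensional representations overlap with the history become energetically unfavourable, pushing the simulation towards unexplored parts of the map.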

  14. In vitro degradation of ZnO flowered coated Zn-Mg alloys in simulated physiological conditions.

    PubMed

    Alves, Marta M; Prosek, Tomas; Santos, Catarina F; Montemor, Maria F

    2017-01-01

    Flowered coatings composed of ZnO crystals were successfully electrodeposited on Zn-Mg alloys. The distinct coating morphologies were found to depend on the distribution of solid interfaces, with a smaller number of bigger flowers (ø 46μm) obtained on the Zn-Mg alloy containing 1wt.% Mg (Zn-1Mg) contrasting with a higher number of smaller flowers (ø 38μm) achieved on the Zn-Mg alloy with 2wt.% Mg (Zn-2Mg). To assess the in vitro behaviour of these novel resorbable materials, a detailed evaluation of the degradation behaviour, in simulated physiological conditions, was performed by electrochemical impedance spectroscopy (EIS). The contrasting corrosion behaviours observed resulted in the build-up of distinct corrosion layers. The products forming these layers, preferentially detected at the flowers, were identified and their spatial distribution disclosed by EDS and Raman spectroscopy techniques. The presence of smithsonite, simonkolleite, hydrozincite, skorpionite and hydroxyapatite was assigned to both corrosion layers. However, the distinct spatial distributions observed may impact the biocompatibility of these resorbable materials, with the bone analogue compounds (hydroxyapatite and skorpionite) detected in-between the ZnO crystals and on the top corrosion layer of the Zn-1Mg flowers clearly contrasting with the hindered layer formed at the interface of the substrate with the flowers on Zn-2Mg. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Building-Based Analysis of the Spatial Provision of Urban Parks in Shenzhen, China.

    PubMed

    Gao, Wenxiu; Lyu, Qiang; Fan, Xiang; Yang, Xiaochun; Liu, Jiangtao; Zhang, Xirui

    2017-12-06

    Urban parks provide important environmental, social, and economic benefits to people and urban areas. The literature demonstrates that proximity to urban parks is one of the key factors influencing people's willingness to use them. Therefore, the provision of urban parks near residential areas and workplaces is one of the key factors influencing quality of life. This study designed a solution based on the spatial association between urban parks and buildings where people live or work to identify whether people in different buildings have nearby urban parks available for their daily lives. A building density map based on building floor area (BFA) was used to illustrate the spatial distribution of urban parks, and five indices were designed to measure the scales, service coverage and potential service loads of urban parks and to reveal areas lacking urban park services within an acceptable walking distance. With this solution, we investigated the provision of urban parks in ten districts of Shenzhen in China, which has grown from several small villages to a megacity in only 30 years. The results indicate that the spatial provision of urban parks in Shenzhen is not sufficient, since people in about 65% of the buildings cannot reach an urban park within a 10-min walk. The distribution and service coverage of the existing urban parks are not balanced at the district level. In some districts, the existing urban parks have good numbers of potential users and even carry large service loads, while in others the building densities surrounding the existing parks are quite low and, at the same time, there are no urban parks near some high-density areas.

  16. Building-Based Analysis of the Spatial Provision of Urban Parks in Shenzhen, China

    PubMed Central

    Gao, Wenxiu; Lyu, Qiang; Fan, Xiang; Liu, Jiangtao; Zhang, Xirui

    2017-01-01

    Urban parks provide important environmental, social, and economic benefits to people and urban areas. The literature demonstrates that proximity to urban parks is one of the key factors influencing people's willingness to use them. Therefore, the provision of urban parks near residential areas and workplaces is one of the key factors influencing quality of life. This study designed a solution based on the spatial association between urban parks and the buildings where people live or work to identify whether people in different buildings have nearby urban parks available for their daily lives. A building density map based on building floor area (BFA) was used to illustrate the spatial distribution of urban parks, and five indices were designed to measure the scales, service coverage and potential service loads of urban parks and to reveal areas lacking urban park services within an acceptable walking distance. With this solution, we investigated the provision of urban parks in ten districts of Shenzhen, China, which has grown from several small villages to a megacity in only 30 years. The results indicate that the spatial provision of urban parks in Shenzhen is not sufficient, since people in about 65% of the buildings cannot reach an urban park within a 10-min walk. The distribution and service coverage of the existing urban parks are not balanced at the district level. In some districts, the existing urban parks have good numbers of potential users and even carry large service loads, while in other districts the building densities surrounding the existing parks are quite low and, at the same time, there are no urban parks near some high-density areas. PMID:29211046

  17. UMCS feasibility study for Fort George G. Meade. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-12-01

    Fort George G. Meade selected eighty-three (83) buildings from the approximately 1,500 buildings on the base to be included in the UMCS Feasibility Study. The purpose of the study is to evaluate the feasibility of replacing the existing analog-based Energy Monitoring and Control System (EMCS) with a new distributed-processing Monitoring and Control System (UMCS).

  18. UMCS feasibility study for Fort George G. Meade volume 1. Feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-12-01

    Fort George G. Meade selected 83 buildings from the approximately 1,500 buildings on the base to be included in the UMCS Feasibility Study. The purpose of the study is to evaluate the feasibility of replacing the existing analog-based Energy Monitoring and Control System (EMCS) with a new distributed-processing Monitoring and Control System (UMCS).

  19. The Value of Green to the Army

    DTIC Science & Technology

    2011-03-16

    [No abstract available: the indexed text consists of fragments of the report's acronym list (OSD, PBS, PEMS, PNNL) and reference list (e.g., Eichholtz, Kok, and Quigley). Report ERDC/CERL SR-11-2, March 2011; approved for public release, distribution unlimited.]

  20. Identifying Critical Factors in the Cost-Effectiveness of Solar and Battery Storage in Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLaren, Joyce A.; Anderson, Katherine H.; Laws, Nicholas D.

    This analysis elucidates the emerging market for distributed solar paired with battery energy storage in commercial buildings across the United States. It provides insight into the near-term and future solar and solar-plus-storage market opportunities as well as the variables that impact the expected savings from installing behind-the-meter systems.

  1. 1400164

    NASA Image and Video Library

    2014-03-10

    ELEASA WILSON, KRAIG TERSIGNI, JUSTIN CARTLEDGE MISSION OPERATIONS LABORATORY - LABORATORY TRAINING COMPLEX (LTC), BUILDING 4663, EXPRESS RACK TRAINING, EMERALD BRICK (POWER DISTRIBUTION FOR EXPRESS RACK LAPTOP).

  2. Modelling heavy metals build-up on urban road surfaces for effective stormwater reuse strategy implementation.

    PubMed

    Hong, Nian; Zhu, Panfeng; Liu, An

    2017-12-01

    Urban road stormwater is an alternative water resource that can help mitigate water shortage issues worldwide. Heavy metals deposited (build-up) on urban road surfaces can enter road stormwater runoff, undermining stormwater reuse safety. As heavy metal build-up loads show high spatial variability and are strongly influenced by surrounding land uses, it is essential to develop an approach to identify hot-spots where stormwater runoff could carry high heavy metal concentrations and hence cannot be reused unless properly treated. This study developed a robust modelling approach for estimating heavy metal build-up loads on urban roads from land use fractions (the percentages of land uses within a given area) using an artificial neural network (ANN) model. Based on the modelling results, a series of heavy metal load spatial distribution maps and a comprehensive ecological risk map were generated. These maps provide a visualization platform to identify priority areas where the stormwater can be safely reused. Additionally, these maps can be utilized as an urban land use planning tool in the context of effective stormwater reuse strategy implementation. Copyright © 2017 Elsevier Ltd. All rights reserved.
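
    As a rough illustration of the general approach (not the authors' calibrated model), a small feed-forward network can be fitted to site-level build-up loads with land use fractions as predictors. The synthetic data, column meanings and network size below are assumptions of the sketch.

```python
# Sketch: regress a heavy-metal build-up load on land-use fractions with a small
# feed-forward network. Synthetic stand-in data, not the paper's dataset or model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in site data: fractions of residential / commercial / industrial land use.
land_use = rng.dirichlet([2.0, 2.0, 2.0], size=300)
# Stand-in Zn build-up load (mg/m^2), higher for industrial-dominated catchments.
load = 2.0 + 6.0 * land_use[:, 1] + 15.0 * land_use[:, 2] + rng.normal(0.0, 0.5, 300)

X_train, X_test, y_train, y_test = train_test_split(land_use, load, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 2))
# Predictions over a grid of land-use fractions could then be mapped to flag hot-spots.
```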

  3. Domotics Project Housing Block.

    PubMed

    Morón, Carlos; Payán, Alejandro; García, Alfonso; Bosquet, Francisco

    2016-05-23

    This document develops the study of an implementation project for a home automation system in a dwelling located in the town of Galapagar, Madrid. This house, which is going to be occupied by a four-member family, consists of 67 constructed square meters distributed among lounge, kitchen, three bedrooms, bath, bathroom and terrace, this being a common arrangement in Spain. Thus, this study will allow conclusions to be drawn about the suitability of home automation for a wide share of housing in Spain. In this document, three home automation proposals are developed based on the requirements of the client and the different home automation levels that the Spanish House and Building Automation Association has established, in addition to two parallel proposals relating to safety and technical alarms. The proposed systems are described by means of product datasheets and descriptions, distribution plans, measurements, budgets and flow charts that describe the functioning of the system in each case. An evaluation of each system is included, based on the conclusions of other studies on this matter, estimating the expected energy savings of each design, given the current cost of lighting, water and gas, as well as the expected economic amortization period.

  4. Building an experimental model of the human body with non-physiological parameters.

    PubMed

    Labuz, Joseph M; Moraes, Christopher; Mertz, David R; Leung, Brendan M; Takayama, Shuichi

    2017-03-01

    New advances in engineering and biomedical technology have enabled recent efforts to capture essential aspects of human physiology in microscale, in-vitro systems. The application of these advances to experimentally model complex processes in an integrated platform - commonly called a 'human-on-a-chip (HOC)' - requires that relevant compartments and parameters be sized correctly relative to each other and to the system as a whole. Empirical observation, theoretical treatments of resource distribution systems and natural experiments can all be used to inform rational design of such a system, but technical and fundamental challenges (e.g. small system blood volumes and context-dependent cell metabolism, respectively) pose substantial, unaddressed obstacles. Here, we put forth two fundamental principles for HOC design: that inducing in-vivo-like cellular metabolic rates is necessary and may be accomplished in-vitro by limiting O2 availability, and that the effects of increased blood volumes on drug concentration can be mitigated through pharmacokinetics-based treatments of solute distribution. Combining these principles with natural observation and engineering workarounds, we derive a complete set of design criteria for a practically realizable, physiologically faithful, five-organ millionth-scale (× 10⁻⁶) microfluidic model of the human body.

  5. Differentially Private Synthesization of Multi-Dimensional Data using Copula Functions

    PubMed Central

    Li, Haoran; Xiong, Li; Jiang, Xiaoqian

    2014-01-01

    Differential privacy has recently emerged in private statistical data release as one of the strongest privacy guarantees. Most of the existing techniques that generate differentially private histograms or synthetic data only work well for single dimensional or low-dimensional histograms. They become problematic for high dimensional and large domain data due to increased perturbation error and computation complexity. In this paper, we propose DPCopula, a differentially private data synthesization technique using Copula functions for multi-dimensional data. The core of our method is to compute a differentially private copula function from which we can sample synthetic data. Copula functions are used to describe the dependence between multivariate random vectors and allow us to build the multivariate joint distribution using one-dimensional marginal distributions. We present two methods for estimating the parameters of the copula functions with differential privacy: maximum likelihood estimation and Kendall’s τ estimation. We present formal proofs for the privacy guarantee as well as the convergence property of our methods. Extensive experiments using both real datasets and synthetic datasets demonstrate that DPCopula generates highly accurate synthetic multi-dimensional data with significantly better utility than state-of-the-art techniques. PMID:25405241
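
    One ingredient of such a scheme can be sketched as follows: estimate Kendall's τ, perturb it with Laplace noise, convert it to a Gaussian-copula correlation via ρ = sin(πτ/2), and sample synthetic pairs. This is a simplified illustration, not the DPCopula algorithm itself; the sensitivity bound, the ε value and the non-private empirical marginals are assumptions of the sketch.

```python
# Sketch: two-column synthetic data from a Gaussian copula whose dependence
# parameter is a differentially private Kendall's tau. Simplified illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, epsilon = 500, 1.0
x = rng.gamma(2.0, 2.0, size=n)                  # stand-ins for two private columns
y = 0.5 * x + rng.normal(0.0, 1.0, size=n)

tau, _ = stats.kendalltau(x, y)
tau_dp = tau + rng.laplace(scale=4.0 / (n * epsilon))   # assumed sensitivity ~ 4/n
tau_dp = float(np.clip(tau_dp, -0.99, 0.99))
rho = np.sin(np.pi * tau_dp / 2.0)               # Kendall's tau -> Gaussian copula correlation

z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)                            # copula samples on the unit square
# Push through marginal quantiles (empirical here; a DP release of the marginals
# would be needed for end-to-end privacy, which this sketch omits).
synthetic = np.column_stack([np.quantile(x, u[:, 0]), np.quantile(y, u[:, 1])])
print(synthetic[:3])
```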

  6. Detection of abnormal item based on time intervals for recommender systems.

    PubMed

    Gao, Min; Yuan, Quan; Ling, Bin; Xiong, Qingyu

    2014-01-01

    With the rapid development of e-business, personalized recommendation has become a core competence for enterprises to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has advanced greatly. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To solve these problems, an approach for the detection of abnormal items is proposed in this paper. First, two common features of all attack models are analyzed. A revised bottom-up discretized approach based on time intervals and these features is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies based on the chi-square statistic (χ2). We evaluated our approach on four types of items, defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling attack detection by narrowing down the suspicious users.
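
    The core check, comparing an item's rating distribution across time windows with a chi-square statistic, can be sketched as below; the window contents, the +1 smoothing and the significance threshold are illustrative choices, not the paper's full bottom-up discretization procedure.

```python
# Sketch: flag an item whose rating distribution shifts sharply between two
# time windows, using a chi-square test on the rating-count table.
import numpy as np
from scipy.stats import chi2_contingency

def window_counts(ratings, n_levels=5):
    """Count how many 1..n_levels star ratings fall in the window."""
    return np.bincount(np.asarray(ratings, dtype=int), minlength=n_levels + 1)[1:]

def looks_anomalous(window_a, window_b, alpha=0.01):
    """True if the two windows' rating distributions differ significantly."""
    table = np.vstack([window_counts(window_a), window_counts(window_b)]) + 1  # +1 avoids empty cells
    _, p_value, _, _ = chi2_contingency(table)
    return p_value < alpha

# Toy example: a burst of 5-star ratings appears in the second window.
normal = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
burst = [5] * 12 + [4, 5, 5]
print(looks_anomalous(normal, burst))   # likely True
```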

  7. Building an experimental model of the human body with non-physiological parameters

    PubMed Central

    Labuz, Joseph M.; Moraes, Christopher; Mertz, David R.; Leung, Brendan M.; Takayama, Shuichi

    2017-01-01

    New advances in engineering and biomedical technology have enabled recent efforts to capture essential aspects of human physiology in microscale, in-vitro systems. The application of these advances to experimentally model complex processes in an integrated platform — commonly called a ‘human-on-a-chip (HOC)’ — requires that relevant compartments and parameters be sized correctly relative to each other and to the system as a whole. Empirical observation, theoretical treatments of resource distribution systems and natural experiments can all be used to inform rational design of such a system, but technical and fundamental challenges (e.g. small system blood volumes and context-dependent cell metabolism, respectively) pose substantial, unaddressed obstacles. Here, we put forth two fundamental principles for HOC design: that inducing in-vivo-like cellular metabolic rates is necessary and may be accomplished in-vitro by limiting O2 availability, and that the effects of increased blood volumes on drug concentration can be mitigated through pharmacokinetics-based treatments of solute distribution. Combining these principles with natural observation and engineering workarounds, we derive a complete set of design criteria for a practically realizable, physiologically faithful, five-organ millionth-scale (× 10−6) microfluidic model of the human body. PMID:28713851

  8. The application of quasi-steady approximation in atomic kinetics in simulation of hohlraum radiation drive

    NASA Astrophysics Data System (ADS)

    Ren, Guoli; Pei, Wenbing; Lan, Ke; Li, Xin; Hohlraum Physics Team

    2014-10-01

    In current routine 2D simulations of hohlraum physics, we adopt the principal-quantum-number (n-level) average atom model (AAM) for the NLTE plasma description. A more sophisticated atomic kinetics description would be a better choice, but the in-line calculation consumes far more resources. By distinguishing the much faster bound-bound atomic processes from the relatively slow bound-free atomic processes, we found a method to build up a bound-electron distribution (n-level or nl-level) using the in-line n-level calculated plasma conditions (such as temperature, density, and average ionization degree). We name this method ``quasi-steady approximation.'' Using this method and the plasma conditions calculated under the n-level model, we re-build the nl-level bound-electron distribution (Pnl) and acquire a new hohlraum radiative drive by post-processing. Comparison with the n-level post-processed hohlraum drive shows that we get an almost identical radiation flux but with more detailed frequency-dependent structure. We also apply this method to the benchmark gold sphere experiment: the constructed nl-level radiation drive resembles the experimental and DCA results, while the n-level radiation drive does not.

  9. Field size dependent mapping of medical linear accelerator radiation leakage

    NASA Astrophysics Data System (ADS)

    Vũ Bezin, Jérémi; Veres, Attila; Lefkopoulos, Dimitri; Chavaudra, Jean; Deutsch, Eric; de Vathaire, Florent; Diallo, Ibrahima

    2015-03-01

    The purpose of this study was to investigate the suitability of a graphics-library-based model for the assessment of linear accelerator radiation leakage. Transmission through the shielding elements was evaluated using the build-up-factor-corrected exponential attenuation law, and the contribution from the electron guide was estimated using the approximation of a linear isotropic radioactive source. Model parameters were estimated by fitting a series of thermoluminescent dosimeter leakage measurements, acquired up to 100 cm from the beam central axis along three directions. The distribution of leakage data at the patient plane reflected the architecture of the shielding elements. Thus, the maximum leakage dose was found under the collimator when only one jaw shielded the primary beam and was about 0.08% of the dose at the isocentre. Overall, we observe that the main contributor to the leakage dose, according to our model, was the electron beam guide. Concerning the discrepancies between the measurements used to calibrate the model and the calculations from the model, the average difference was about 7%. Finally, graphics-library modelling is a readily usable and suitable way to estimate the leakage dose distribution on a personal computer. Such data could be useful for dosimetric evaluations in late effect studies.
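
    The two model ingredients named above (build-up-corrected exponential attenuation and a linear isotropic source) can be sketched numerically as follows; all numerical parameters are placeholders rather than the fitted accelerator model reported in the paper.

```python
# Sketch: leakage dose from (a) shielding transmission with a build-up factor and
# (b) a finite line source integrated as a sum of isotropic point emitters.
# All parameter values are placeholders, not the fitted accelerator model.
import numpy as np

def transmission(mu, thickness_cm, buildup=1.5):
    """Build-up-corrected exponential attenuation through a shield."""
    return buildup * np.exp(-mu * thickness_cm)

def line_source_dose(points_xyz, strength_per_cm, length_cm=100.0, n=200):
    """Approximate a linear isotropic source by summing 1/r^2 point contributions."""
    z = np.linspace(0.0, length_cm, n)
    src = np.column_stack([np.zeros(n), np.zeros(n), z])    # source along the z-axis
    dl = length_cm / n
    doses = []
    for p in np.atleast_2d(points_xyz):
        r2 = np.sum((src - p) ** 2, axis=1)
        doses.append(np.sum(strength_per_cm * dl / (4.0 * np.pi * r2)))
    return np.array(doses)

point = [100.0, 0.0, 50.0]                      # 1 m off-axis, mid-guide height (cm)
dose = line_source_dose(point, strength_per_cm=1.0) * transmission(mu=0.5, thickness_cm=5.0)
print(f"relative leakage dose at {point}: {dose[0]:.2e}")
```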

  10. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields whose potential discoveries depend on their ability to distribute, process and analyze globally distributed Petascale to Exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams is focused on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in recently completed and previous projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goal of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.

  11. Domotics Project Housing Block

    PubMed Central

    Morón, Carlos; Payán, Alejandro; García, Alfonso; Bosquet, Francisco

    2016-01-01

    This document develops the study of an implementation project for a home automation system in a dwelling located in the town of Galapagar, Madrid. This house, which is going to be occupied by a four-member family, consists of 67 constructed square meters distributed among lounge, kitchen, three bedrooms, bath, bathroom and terrace, this being a common arrangement in Spain. Thus, this study will allow conclusions to be drawn about the suitability of home automation for a wide share of housing in Spain. In this document, three home automation proposals are developed based on the requirements of the client and the different home automation levels that the Spanish House and Building Automation Association has established, in addition to two parallel proposals relating to safety and technical alarms. The proposed systems are described by means of product datasheets and descriptions, distribution plans, measurements, budgets and flow charts that describe the functioning of the system in each case. An evaluation of each system is included, based on the conclusions of other studies on this matter, estimating the expected energy savings of each design, given the current cost of lighting, water and gas, as well as the expected economic amortization period. PMID:27223285

  12. Quantum computation and analysis of Wigner and Husimi functions: toward a quantum image treatment.

    PubMed

    Terraneo, M; Georgeot, B; Shepelyansky, D L

    2005-06-01

    We study the efficiency of quantum algorithms which aim at obtaining phase-space distribution functions of quantum systems. Wigner and Husimi functions are considered. Different quantum algorithms are envisioned to build these functions, and compared with the classical computation. Different procedures to extract more efficiently information from the final wave function of these algorithms are studied, including coarse-grained measurements, amplitude amplification, and measure of wavelet-transformed wave function. The algorithms are analyzed and numerically tested on a complex quantum system showing different behavior depending on parameters: namely, the kicked rotator. The results for the Wigner function show in particular that the use of the quantum wavelet transform gives a polynomial gain over classical computation. For the Husimi distribution, the gain is much larger than for the Wigner function and is larger with the help of amplitude amplification and wavelet transforms. We discuss the generalization of these results to the simulation of other quantum systems. We also apply the same set of techniques to the analysis of real images. The results show that the use of the quantum wavelet transform allows one to lower dramatically the number of measurements needed, but at the cost of a large loss of information.

  13. Bayesian assessment of moving group membership: importance of models and prior knowledge

    NASA Astrophysics Data System (ADS)

    Lee, Jinhee; Song, Inseok

    2018-04-01

    Young nearby moving groups are important and useful in many fields of astronomy, such as studying exoplanets, low-mass stars, and the stellar evolution of early planetary systems over tens of millions of years, which has led to intensive searches for their members. Identification of members depends sensitively on the models used; therefore, careful examination of the models is required. In this study, we investigate the effects of the models used in moving group membership calculations based on a Bayesian framework (e.g. BANYAN II), focusing on the beta Pictoris moving group (BPMG). Three improvements for building models are suggested: (1) updating the list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZ and UVW. The effect of each change is investigated, and we suggest using all of these improvements simultaneously in future membership probability calculations. Using this improved moving-group membership calculation and a careful examination of ages, 57 bona fide members of BPMG are confirmed, including 12 new members. We additionally suggest 17 highly probable members.
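
    The kind of membership probability such frameworks return can be illustrated as a two-component model in XYZ-UVW space, weighing a moving-group Gaussian against a field Gaussian. The means, covariances and prior below are placeholders, not the calibrated BPMG or field models of BANYAN II.

```python
# Sketch: Bayesian membership probability for one star, comparing a moving-group
# model against a field model in XYZ (pc) + UVW (km/s) space. Placeholder models only.
import numpy as np
from scipy.stats import multivariate_normal

dim = 6
group = multivariate_normal(mean=np.zeros(dim), cov=np.diag([15, 15, 15, 2, 2, 2]) ** 2)
field = multivariate_normal(mean=np.zeros(dim), cov=np.diag([300, 300, 300, 40, 40, 40]) ** 2)
prior_group = 0.01                        # assumed prior fraction of group members

def membership_probability(xyzuvw):
    """Posterior probability that a star with these kinematics belongs to the group."""
    like_g = group.pdf(xyzuvw) * prior_group
    like_f = field.pdf(xyzuvw) * (1.0 - prior_group)
    return like_g / (like_g + like_f)

star = np.array([10.0, -5.0, 20.0, -1.0, 0.5, -2.0])
print(f"P(member) = {membership_probability(star):.3f}")
```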

  14. Engaging Staff in the Development of Distributed Leadership

    ERIC Educational Resources Information Center

    Street, Gary Wayne

    2011-01-01

    During the 2010-11 school-year at Scootney Springs Elementary, an action research project with 8 teachers was initiated to create a culture of distributed leadership in the school building. Throughout phase 1 of the study, we collected data from interviews, surveys, checklists, meeting minutes, observations, and documented discussions with the…

  15. Enhanced representation of soil NO emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    EPA Science Inventory

    Modeling of soil nitric oxide (NO) emissions is highly uncertain and may misrepresent its spatial and temporal distribution. This study builds upon a recently introduced parameterization to improve the timing and spatial distribution of soil NO emission estimates in the Community...

  16. DYNAMIC ENERGY SAVING IN BUILDINGS WITH UNDERFLOOR AIR DISTRIBUTION SYSTEM – EXPERIMENTAL AND SIMULATION STUDIES

    EPA Science Inventory

    The present study is aimed at seeking a better understanding of the thermodynamics involved in the air distribution strategies associated with UFAD systems and their impact on energy-saving dynamics.
    Thus objectives are:

  1. Causes and Solutions for High Energy Consumption in Traditional Buildings Located in Hot Climate Regions

    NASA Astrophysics Data System (ADS)

    Barayan, Olfat Mohammad

    A considerable amount of money is spent on high energy consumption in traditional buildings located in hot climate regions. High energy consumption is significantly influenced by several factors, including building materials, orientation, mass, and opening sizes. This paper aims to identify these causes and find practical solutions to reduce the annual cost of bills. For the purposes of this study, a simulation research method was followed. A comparison between two Revit models was also made to point out the major cause of high energy consumption. By analysing different orientations, wall insulation, and window glazing, and applying other high-performance building techniques, it was concluded that appropriate building materials play a vital role in energy cost. Therefore, the ability to reduce the energy cost by more than 50% in traditional buildings depends on a careful balance of building materials, mass, orientation, and type of window glazing.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R-factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  3. Indoor-Outdoor Air Leakage of Apartments and Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, P.N.; Shehabi, A.; Chan, R.W.

    We compiled and analyzed available data concerning indoor-outdoor air leakage rates and building leakiness parameters for commercial buildings and apartments. We analyzed the data, and reviewed the related literature, to determine the current state of knowledge of the statistical distribution of air exchange rates and related parameters for California buildings, and to identify significant gaps in current knowledge and data. Very few data were found from California buildings, so we compiled data from other states and some other countries. Even when data from other developed countries were included, data were sparse and few conclusive statements were possible. Little systematic variation in building leakage with construction type, building activity type, height, size, or location within the U.S. was observed. Commercial buildings and apartments seem to be about twice as leaky as single-family houses, per unit of building envelope area. Although further work collecting and analyzing leakage data might be useful, we suggest that a more important issue may be the transport of pollutants between units in apartments and mixed-use buildings, an under-studied phenomenon that may expose occupants to high levels of pollutants such as tobacco smoke or dry cleaning fumes.

  4. Detection of collapsed buildings from lidar data due to the 2016 Kumamoto earthquake in Japan

    NASA Astrophysics Data System (ADS)

    Moya, Luis; Yamazaki, Fumio; Liu, Wen; Yamada, Masumi

    2018-01-01

    The 2016 Kumamoto earthquake sequence was triggered by an Mw 6.2 event at 21:26 on 14 April. Approximately 28 h later, at 01:25 on 16 April, an Mw 7.0 event (the mainshock) followed. The epicenters of both events were located near the residential area of Mashiki and affected the region nearby. Due to very strong seismic ground motion, the earthquake produced extensive damage to buildings and infrastructure. In this paper, collapsed buildings were detected using a pair of digital surface models (DSMs), taken before and after the 16 April mainshock by airborne light detection and ranging (lidar) flights. Different methods were evaluated to identify collapsed buildings from the DSMs. The change in average elevation within a building footprint was found to be the most important factor. Finally, the distribution of collapsed buildings in the study area was presented, and the result was consistent with that of a building damage survey performed after the earthquake.
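
    The key quantity, the change in average elevation inside each building footprint between the pre- and post-event DSMs, can be sketched with numpy as follows; the boolean-mask footprint format and the 1 m collapse threshold are assumptions for illustration, not the study's calibrated criterion.

```python
# Sketch: flag possibly collapsed buildings from the drop in mean DSM elevation
# within each footprint. Threshold and mask format are illustrative assumptions.
import numpy as np

def mean_height_change(dsm_pre, dsm_post, footprint_mask):
    """Average (post - pre) elevation over one building footprint."""
    return float(np.nanmean(dsm_post[footprint_mask] - dsm_pre[footprint_mask]))

def collapsed(dsm_pre, dsm_post, footprints, drop_threshold_m=-1.0):
    """Indices of footprints whose mean elevation dropped below the threshold."""
    return [i for i, mask in enumerate(footprints)
            if mean_height_change(dsm_pre, dsm_post, mask) < drop_threshold_m]

# Toy example: two buildings on a 20x20 grid; the second one "collapses".
pre = np.full((20, 20), 8.0)
post = pre.copy()
masks = [np.zeros_like(pre, dtype=bool) for _ in range(2)]
masks[0][2:8, 2:8] = True
masks[1][12:18, 12:18] = True
post[12:18, 12:18] = 1.5
print(collapsed(pre, post, masks))            # -> [1]
```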

  5. Integrating the Web and continuous media through distributed objects

    NASA Astrophysics Data System (ADS)

    Labajo, Saul P.; Garcia, Narciso N.

    1998-09-01

    The Web has rapidly grown to become the standard for document interchange on the Internet. At the same time, interest in transmitting continuous media flows over the Internet, and in associated applications such as multimedia on demand, is also growing. Integrating both kinds of systems should allow building real hypermedia systems where any media object can be linked from any other, taking into account temporal and spatial synchronization. A way to achieve this integration is using the CORBA architecture, a standard for open distributed systems. There are also recent efforts to integrate Web and CORBA systems. We use this architecture to build a service for the distribution of data flows endowed with timing restrictions. To integrate it with the Web, we use, on one side, Java applets that can use the CORBA architecture and are embedded in HTML pages; on the other side, we also benefit from the efforts to integrate CORBA and the Web.

  6. Molecular profiling of fungal communities in moisture damaged buildings before and after remediation--a comparison of culture-dependent and culture-independent methods.

    PubMed

    Pitkäranta, Miia; Meklin, Teija; Hyvärinen, Anne; Nevalainen, Aino; Paulin, Lars; Auvinen, Petri; Lignell, Ulla; Rintala, Helena

    2011-10-21

    Indoor microbial contamination due to excess moisture is an important contributor to human illness in both residential and occupational settings. However, the census of microorganisms in the indoor environment is limited by the use of selective, culture-based detection techniques. By using clone library sequencing of the full-length internal transcribed spacer region combined with quantitative polymerase chain reaction (qPCR) for 69 fungal species or assay groups, and cultivation, we have been able to generate a more comprehensive description of the total indoor mycoflora. Using this suite of methods, we assessed the impact of moisture damage on the fungal community composition of settled dust and building material samples (n = 8 and 16, respectively). Water-damaged buildings (n = 2) were examined pre- and post-remediation, and compared with undamaged reference buildings (n = 2). Culture-dependent and culture-independent methods were consistent regarding the dominant fungal taxa in dust, but sequencing revealed a five to ten times higher diversity at the genus level than culture or qPCR. Previously unknown, verified fungal phylotypes were detected in dust, accounting for 12% of all diversity. Fungal diversity, especially within the classes Dothideomycetes and Agaricomycetes, tended to be higher in the water-damaged buildings. Fungal phylotypes detected in building materials were present in dust samples, but their proportion of total fungi was similar for damaged and reference buildings. The quantitative correlation between clone library phylotype frequencies and qPCR counts was moderate (r = 0.59, p < 0.01). We examined a small number of target buildings and found indications of elevated fungal diversity associated with water damage. Some of the fungi in dust were attributable to building growth, but more information on the material-associated communities is needed in order to understand the dynamics of microbial communities between building structures and dust. The sequencing-based method proved indispensable for describing the true fungal diversity in indoor environments. However, drawing conclusions concerning the effect of building conditions on building mycobiota using this methodology was complicated by the wide natural diversity in the dust samples, the incomplete knowledge of material-associated fungi and the semiquantitative nature of sequencing-based methods.

  7. Building America Case Study: Evaluating Through-Wall Air Transfer Fans, Pittsburgh, Pennsylvania (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2014-10-01

    In this project, Building America team IBACOS performed field testing in a newly constructed, unoccupied test house in Pittsburgh, Pennsylvania to evaluate heating, ventilating, and air conditioning (HVAC) distribution systems during heating, cooling, and midseason conditions. Four air-based HVAC distribution systems were assessed: a typical airflow ducted system to the bedrooms, a low airflow ducted system to the bedrooms, a system with transfer fans to the bedrooms, and a system with no ductwork to the bedrooms. The relative ability of each system was considered with respect to relevant Air Conditioning Contractors of America and ASHRAE standards for house temperature uniformity and stability, respectively.

  8. Scalar Fluxes Near a Tall Building in an Aligned Array of Rectangular Buildings

    NASA Astrophysics Data System (ADS)

    Fuka, Vladimír; Xie, Zheng-Tong; Castro, Ian P.; Hayden, Paul; Carpentieri, Matteo; Robins, Alan G.

    2018-04-01

    Scalar dispersion from ground-level sources in arrays of buildings is investigated using wind-tunnel measurements and large-eddy simulation (LES). An array of uniform-height buildings of equal dimensions and an array with an additional single tall building (wind tunnel) or a periodically repeated tall building (LES) are considered. The buildings in the array are aligned and form long streets. The sensitivity of the dispersion pattern to small changes in wind direction is demonstrated. Vertical scalar fluxes are decomposed into the advective and turbulent parts and the influences of wind direction and of the presence of the tall building on the scalar flux components are evaluated. In the uniform-height array turbulent scalar fluxes are dominant, whereas the tall building produces an increase of the magnitude of advective scalar fluxes that yields the largest component. The presence of the tall building causes either an increase or a decrease to the total vertical scalar flux depending on the position of the source with respect to the tall building. The results of the simulations can be used to develop parametrizations for street-canyon dispersion models and enhance their capabilities in areas with tall buildings.

  9. Age-Related Declines in the Fidelity of Newly Acquired Category Representations

    ERIC Educational Resources Information Center

    Davis, Tyler; Love, Bradley C.; Maddox, W. Todd

    2012-01-01

    We present a theory suggesting that the ability to build category representations that reflect the nuances of category structures in the environment depends upon clustering mechanisms instantiated in an MTL-PFC-based circuit. Because function in this circuit declines with age, we predict that the ability to build category representations will be…

  10. Thorough Control

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2008-01-01

    Educators and administrators may differ over what kind of floor surfaces they prefer in their facilities--carpeting or hard flooring. Advocates on each side can make a case for their choices, depending on how a building is being used, the wishes of building occupants, or the budget they have available. However, no matter what kind of flooring a…

  11. The Character of a Coach: Success Depends on Trustworthiness

    ERIC Educational Resources Information Center

    Psencik, Kay

    2015-01-01

    This article describes the role of coach as one that would nurture and build trust in their organization, allow others to see the coach as trustworthy, and build positive energy in the organization. The author offers some qualities that contribute to this trustworthy position, such as: self-awareness, honesty, sincerity, competence, reliability,…

  12. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    ERIC Educational Resources Information Center

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  13. Building Effective Medical Missions with Servant Leadership Skills.

    PubMed

    Johanson, Linda

    Nurses are naturally drawn to service opportunities, such as short-term medical missions (STMM), which hold great potential to benefit health. But STMMs have been criticized as potentially being culturally insensitive, leading to dependency, inadvertently causing harm, or being unsustainable. Utilizing servant leadership skills, nurses can effectively build community, vision, and sustainability into STMM projects.

  14. Copula Models for Sociology: Measures of Dependence and Probabilities for Joint Distributions

    ERIC Educational Resources Information Center

    Vuolo, Mike

    2017-01-01

    Often in sociology, researchers are confronted with nonnormal variables whose joint distribution they wish to explore. Yet, assumptions of common measures of dependence can fail or estimating such dependence is computationally intensive. This article presents the copula method for modeling the joint distribution of two random variables, including…

  15. Spatial pattern of diarrhea based on regional economic and environment by spatial autoregressive model

    NASA Astrophysics Data System (ADS)

    Bekti, Rokhana Dwi; Nurhadiyanti, Gita; Irwansyah, Edy

    2014-10-01

    Information on the spatial pattern of diarrhea cases, especially among toddlers, is very important. It shows the distribution of diarrhea in every region, the relationships among those locations, and regional economic characteristics or environmental behavior. This research therefore uses spatial pattern analysis, including Moran's I, Spatial Autoregressive (SAR) models, and the Local Indicator of Spatial Autocorrelation (LISA). It uses a sample from 23 sub-districts of Bekasi Regency, West Java, Indonesia. Diarrhea cases, regional economic conditions, and the environmental behavior of households show spatial relationships among sub-districts. The SAR model shows that the percentage of Regional Gross Domestic Product has a significant effect on diarrhea at α = 10%, while illiteracy and health center facilities are significant at α = 5%. With the LISA test, sub-districts in southern Bekasi show high dependencies with Cikarang Selatan, Serang Baru, and Setu. This research also builds a data-analysis application based on Java and R to support the analysis.
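
    The global Moran's I used for such spatial-dependence checks can be computed directly, as in the sketch below; the toy case counts and the contiguity matrix are placeholders, not the Bekasi Regency data.

```python
# Sketch: global Moran's I for sub-district case counts with a row-standardized
# binary contiguity matrix. Toy values and neighbour structure are placeholders.
import numpy as np

def morans_i(values, weights):
    """Moran's I with a row-standardized spatial weight matrix."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum(axis=1, keepdims=True)          # row-standardize
    z = x - x.mean()
    num = np.sum(w * np.outer(z, z))
    return (len(x) / w.sum()) * num / np.sum(z ** 2)

cases = [12, 15, 14, 3, 2, 4]                      # toy counts for six sub-districts
adjacency = np.array([[0, 1, 1, 0, 0, 0],          # who borders whom (symmetric)
                      [1, 0, 1, 0, 0, 0],
                      [1, 1, 0, 1, 0, 0],
                      [0, 0, 1, 0, 1, 1],
                      [0, 0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 1, 0]])
print(f"Moran's I = {morans_i(cases, adjacency):.2f}")   # > 0 suggests clustering
```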

  16. State-to-State Internal Energy Relaxation Following the Quantum-Kinetic Model in DSMC

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    A new model for chemical reactions, the Quantum-Kinetic (Q-K) model of Bird, has recently been introduced that does not depend on macroscopic rate equations or values of local flow field data. Subsequently, the Q-K model has been extended to include reactions involving charged species and electronic energy level transitions. Although this is a phenomenological model, it has been shown to accurately reproduce both equilibrium and non-equilibrium reaction rates. The usefulness of this model becomes clear as local flow conditions either exceed the conditions used to build previous models or when they depart from an equilibrium distribution. Presently, the applicability of the relaxation technique is investigated for the vibrational internal energy mode. The Forced Harmonic Oscillator (FHO) theory for vibrational energy level transitions is combined with the Q-K energy level transition model to accurately reproduce energy level transitions at a reduced computational cost compared to the older FHO models.

  17. A design space of visualization tasks.

    PubMed

    Schulz, Hans-Jörg; Nocke, Thomas; Heitzler, Magnus; Schumann, Heidrun

    2013-12-01

    Knowledge about visualization tasks plays an important role in choosing or building suitable visual representations to pursue them. Yet, tasks are a multi-faceted concept and it is thus not surprising that the many existing task taxonomies and models all describe different aspects of tasks, depending on what these task descriptions aim to capture. This results in a clear need to bring these different aspects together under the common hood of a general design space of visualization tasks, which we propose in this paper. Our design space consists of five design dimensions that characterize the main aspects of tasks and that have so far been distributed across different task descriptions. We exemplify its concrete use by applying our design space in the domain of climate impact research. To this end, we propose interfaces to our design space for different user roles (developers, authors, and end users) that allow users of different levels of expertise to work with it.

  18. Multi-scale clustering of functional data with application to hydraulic gradients in wetlands

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Sharp, Julia L.; Peck, Rory G.; Rosenberry, Donald O.

    2011-01-01

    A new set of methods is developed to perform cluster analysis of functions, motivated by a data set consisting of hydraulic gradients at several locations distributed across a wetland complex. The methods build on previous work on clustering of functions, such as Tarpey and Kinateder (2003) and Hitchcock et al. (2007), but explore functions generated from an additive model decomposition (Wood, 2006) of the original time series. Our decomposition targets two aspects of the series, using an adaptive smoother for the trend and a circular spline for the diurnal variation in the series. Different measures for comparing locations are discussed, including a method for efficiently clustering time series of different lengths using a functional data approach. The complicated nature of these wetlands is highlighted by the shifting group memberships depending on which scale of variation and which year of the study are considered.
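
    A stripped-down version of this pipeline, smoothing each gradient series into a trend curve and then clustering the curves hierarchically, can be sketched with scipy; the moving-average smoother and the synthetic series below stand in for the paper's additive-model (trend plus diurnal) decomposition and real wetland data.

```python
# Sketch: smooth each hydraulic-gradient series into a trend curve, then cluster
# the curves hierarchically. Synthetic data; a crude smoother stands in for the
# adaptive trend + circular diurnal spline decomposition used in the paper.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
t = np.linspace(0, 30, 721)                      # ~30 days, hourly sampling

def smooth(series, window=24):
    """Crude trend estimate: centred moving average over one day."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")

# Toy wetland sites: two rising-trend sites and two falling-trend sites, plus noise.
sites = [0.020 * t + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size),
         0.018 * t + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size),
         -0.015 * t + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size),
         -0.020 * t + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)]

trends = np.vstack([smooth(s) for s in sites])
labels = fcluster(linkage(pdist(trends), method="ward"), t=2, criterion="maxclust")
print(labels)                                    # e.g. [1 1 2 2]
```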

  19. Indanedione based binary chromophore supramolecular systems as a NLO active polymer composites

    NASA Astrophysics Data System (ADS)

    Rutkis, M.; Tokmakovs, A.; Jecs, E.; Kreicberga, J.; Kampars, V.; Kokars, V.

    2010-06-01

    A novel route to obtain EO materials is proposed: supramolecular assembly of neutral-ground-state (NGS) and zwitterionic (ZWI) NLO chromophores in a binary chromophore organic glass (BCOG) host-guest system. On the basis of our Langevin Dynamics (LD) molecular modeling combined with quantum chemical calculations, we have shown that the anticipated enhancement of the NLO efficiency of the BCOG material is possible via electrostatic supramolecular assembly of the NGS with the ZWI chromophore in an antiparallel manner. The binding energy of such a complex could depend more on the molecular compatibility of the components and the local (atomic) charge distribution than on the overall molecular dipole moments. According to our LD simulations, these supramolecular bound structures of NGS and ZWI chromophores can sustain thermally assisted electric field poling. For one of the experimentally investigated systems, built from a dimethylaminobenzylidene 1,3-indanedione-containing host and a zwitterionic indanedione-1,3 pyridinium betaine guest, an almost twofold enhancement of NLO efficiency was observed.

  20. Microbial specialists in below-grade foundation walls in Scandinavia.

    PubMed

    Nunez, M; Hammer, H

    2014-10-01

    Below-grade foundation walls are often exposed to excessive moisture by water infiltration, condensation, leakage, or lack of ventilation. Microbial growth in these structures depends largely on environmental factors, elapsed time, and the type of building materials and construction setup. The ecological preferences of Actinomycetes (Actinobacteria) and the molds Ascotricha chartarum, Myxotrichum chartarum (Ascomycota), Geomyces pannorum, and Monocillium sp. (Hyphomycetes) have been addressed based on analyses of 1764 samples collected in below-grade spaces during the period of 2001-2012. Our results show a significant correlation between these taxa and moist foundation walls as ecological niches. Substrate preference was the strongest predictor of taxa distribution within the wall, but the taxa's physiological needs, together with gradients of abiotic factors within the wall structure, also played a role. Our study describes for the first time how the wall environment affects microbial growth. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
