Distributed dynamic simulations of networked control and building performance applications.
Yahiaoui, Azzedine
2018-02-01
The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while minimizing energy consumption; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of relying on costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design.
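The run-time coupling the abstract describes, two simulation tools exchanging values over a network once per timestep, can be sketched minimally as follows. This is a hypothetical illustration, not the paper's environment: a toy "building" model and a toy bang-bang "controller" run in lockstep over a local TCP socket, and the setpoint, thermal gains, and message format are all assumptions.

```python
import socket
import threading

SETPOINT = 21.0  # deg C, assumed for this sketch

def controller(srv, steps):
    """Toy 'control' tool: receives a zone temperature each timestep
    and replies with an on/off heating signal (bang-bang control)."""
    conn, _ = srv.accept()
    with conn, srv:
        for _ in range(steps):
            temp = float(conn.recv(64).decode())
            conn.sendall(b"1\n" if temp < SETPOINT else b"0\n")

def building(port, steps, t0=18.0):
    """Toy 'building performance' tool: a first-order zone model that
    exchanges one message pair with the controller per timestep."""
    temps = [t0]
    with socket.create_connection(("127.0.0.1", port)) as sock:
        for _ in range(steps):
            sock.sendall(f"{temps[-1]:.3f}\n".encode())
            heat = float(sock.recv(64).decode())
            # 1 K/step heater gain; losses pull toward a 15 C ambient
            temps.append(temps[-1] + heat + 0.1 * (15.0 - temps[-1]))
    return temps

srv = socket.create_server(("127.0.0.1", 0))       # ephemeral port
port = srv.getsockname()[1]
th = threading.Thread(target=controller, args=(srv, 40))
th.start()
trajectory = building(port, 40)
th.join()
print(f"final zone temperature: {trajectory[-1]:.2f} C")
```

Because each side sends exactly one message and then blocks for the reply, the two tools stay synchronized without any explicit time-management protocol, which is the essence of run-time coupling.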
Statistical distribution of building lot frontage: application for Tokyo downtown districts
NASA Astrophysics Data System (ADS)
Usui, Hiroyuki
2018-03-01
The frontage of a building lot is a determinant factor of the residential environment. The statistical distribution of building lot frontages shows how the perimeters of urban blocks are shared by building lots for a given density of buildings and roads. For practitioners in urban planning, this is indispensable for identifying districts likely to contain a high percentage of building lots with narrow frontage after subdivision, and for reconsidering appropriate criteria for the density of buildings and roads as residential environment indices. In the literature, however, the statistical distribution of building lot frontages and the density of buildings and roads have not been fully researched. In this paper, based on an empirical study of the downtown districts of Tokyo, it is found that (1) a log-normal distribution fits the observed distribution of building lot frontages better than a gamma distribution, which is the model of the size distribution of Poisson Voronoi cells on closed curves; (2) the distribution of building lot frontages follows a log-normal distribution whose parameters are the gross building density, road density, average road width, the coefficient of variation of building lot frontage, and the ratio of the number of building lot frontages to the number of buildings; and (3) the values of the coefficient of variation of building lot frontages and of the ratio of the number of building lot frontages to the number of buildings are approximately 0.60 and 1.19, respectively.
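The log-normal model at the heart of this abstract can be illustrated with a short sketch. In the snippet below the 6 m mean frontage is an assumed value, while CV = 0.60 is the paper's empirical figure; synthetic frontages are simulated and the parameters recovered by the standard maximum-likelihood fit for a log-normal, i.e. the mean and standard deviation of the log-frontages.

```python
import math
import random

random.seed(42)

def lognormal_params_from_mean_cv(mean, cv):
    """Convert an arithmetic mean and coefficient of variation (CV)
    into the (mu, sigma) parameters of a log-normal distribution."""
    sigma2 = math.log(1.0 + cv * cv)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

# Assumed mean frontage of 6 m; CV = 0.60 is the paper's Tokyo figure.
mu, sigma = lognormal_params_from_mean_cv(6.0, 0.60)
frontages = [random.lognormvariate(mu, sigma) for _ in range(50_000)]

# MLE fit of a log-normal: mean and (population) std of the logs.
logs = [math.log(x) for x in frontages]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / len(logs))
cv_hat = math.sqrt(math.exp(sigma_hat ** 2) - 1.0)

print(f"fitted mu={mu_hat:.3f} sigma={sigma_hat:.3f} CV={cv_hat:.3f}")
```

The identity CV = sqrt(exp(sigma^2) - 1) is what lets a single empirical CV (0.60) pin down the shape parameter of the frontage distribution.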
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.
2005-12-01
We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma M.; Hendrix, Val; Chertkov, Michael
This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is both useful, and not useful, for the particular field of the distribution grid and buildings interface. While analytics, in general, is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure, or lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data, make predictions, and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid, which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors, such as grid and building operators, at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals, such as total carbon reduction or other economic benefit to customers.
While some basic analysis of these data streams can provide a wealth of information, computational and human limits on performing the analysis are becoming significant as data volumes and multi-objective concerns grow. Efficient applications of analysis and of the machine learning field are considered in this context.
2013-06-01
Building information exchange (COBie), Building Information Modeling (BIM) ...to develop a life-cycle building model have resulted in the definition of a “core” building information model that contains general information de... develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie
Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.
A Framework for Distributed Mixed Language Scientific Applications
NASA Astrophysics Data System (ADS)
Quarrie, D. R.
The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.
40 CFR 35.2040 - Grant application.
Code of Federal Regulations, 2010 CFR
2010-07-01
...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...
40 CFR 35.2040 - Grant application.
Code of Federal Regulations, 2014 CFR
2014-07-01
...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...
40 CFR 35.2040 - Grant application.
Code of Federal Regulations, 2012 CFR
2012-07-01
...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...
40 CFR 35.2040 - Grant application.
Code of Federal Regulations, 2011 CFR
2011-07-01
...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...
40 CFR 35.2040 - Grant application.
Code of Federal Regulations, 2013 CFR
2013-07-01
...: (a) Step 2+3: Combined design and building of a treatment works and building related services and... limitations on award (§§ 35.2100 through 35.2127); (5) Final design drawings and specifications; (6) The..., a preliminary layout of the distribution and drainage systems, and an explanation of the intended...
Building Energy Modeling and Control Methods for Optimization and Renewables Integration
NASA Astrophysics Data System (ADS)
Burger, Eric M.
This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. 
This model is capable of approximating unmodeled dynamics within a building by learning from sensor data. Control techniques encompass the application of optimal control theory, model predictive control, and convex distributed optimization to TCLs. First, we present the alternative control trajectory (ACT) representation, a novel method for the approximate optimization of non-convex discrete systems. This approach enables the optimal control of a population of non-convex agents using distributed convex optimization techniques. Second, we present a distributed convex optimization algorithm for the control of a TCL population. Experimental results demonstrate the application of this algorithm to the problem of renewable energy generation following. This dissertation contributes to the development of intelligent energy management systems for buildings by presenting a suite of novel and adaptable modeling and control techniques. Applications focus on optimizing the performance of building operations and on facilitating the integration of renewable energy resources.
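A minimal sketch of the kind of thermostatically controlled load (TCL) model the dissertation studies is given below: a first-order thermal model with hysteretic on/off (deadband) control, here for an air conditioner. The parameter names follow the common equivalent-thermal-parameter formulation, and the numeric values are illustrative assumptions, not the dissertation's identified parameters.

```python
import math

def simulate_tcl(steps, dt=60.0, theta_a=32.0, theta_set=22.5,
                 deadband=1.0, C=2.0, R=2.0, P=5.6, eta=2.5):
    """Discrete-time TCL model: C (kWh/C) thermal capacitance,
    R (C/kW) resistance, P (kW) rated power, eta coefficient of
    performance, theta_a ambient temperature. Illustrative values."""
    a = math.exp(-dt / (C * R * 3600.0))
    theta, m = theta_set, 0        # indoor temp (C), compressor state
    history = []
    for _ in range(steps):
        gain = theta_a - m * eta * P * R
        theta = a * theta + (1 - a) * gain
        if theta > theta_set + deadband / 2:
            m = 1                  # too warm: compressor switches on
        elif theta < theta_set - deadband / 2:
            m = 0                  # cool enough: compressor switches off
        history.append((theta, m))
    return history

hist = simulate_tcl(steps=500)
temps = [t for t, _ in hist]
duty = sum(m for _, m in hist) / len(hist)
print(f"temp range {min(temps):.2f}..{max(temps):.2f} C, duty {duty:.2f}")
```

The on/off state m is the discrete, non-convex decision variable that motivates the dissertation's alternative control trajectory representation: an aggregator coordinating many such loads must optimize over populations of these switching trajectories.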
A Statistical Texture Feature for Building Collapse Information Extraction of SAR Image
NASA Astrophysics Data System (ADS)
Li, L.; Yang, H.; Chen, Q.; Liu, X.
2018-04-01
Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its versatility and almost all-weather, day-and-night working capability. Because the inherent statistical distribution of speckle in SAR images has not previously been used to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target and thereby identify collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of object texture, but also applies to extracting collapsed building information from single-, dual-, or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is analyzed, which provides decision support for data selection in collapsed building information extraction.
Methods and tools for profiling and control of distributed systems
NASA Astrophysics Data System (ADS)
Sukharev, R.; Lukyanchikov, O.; Nikulchev, E.; Biryukov, D.; Ryadchikov, I.
2018-02-01
This article is devoted to the profiling and control of distributed systems. Distributed systems have a complex architecture: applications are distributed among various computing nodes, and many network operations are performed. It is therefore important today to develop methods and tools for profiling distributed systems. The article analyzes and standardizes methods for profiling distributed systems that focus on simulation to conduct experiments and build a graph model of the system. The theory of queueing networks is used for simulation modeling of distributed systems receiving and processing user requests. To automate this profiling method, a software application was developed with a modular structure similar to that of a SCADA system.
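The queueing-network approach mentioned above can be illustrated at its smallest scale: one node of such a network, an M/M/1 queue, simulated request by request and checked against the analytic mean sojourn time W = 1/(mu - lambda). The arrival and service rates below are assumed for illustration only.

```python
import random

random.seed(7)

def mm1_mean_sojourn(lam, mu, n_customers=200_000):
    """Event-driven simulation of an M/M/1 queue: returns the mean
    time a request spends in the system (waiting plus service)."""
    t_arrival = 0.0
    t_free = 0.0            # time at which the server next becomes free
    total = 0.0
    for _ in range(n_customers):
        t_arrival += random.expovariate(lam)     # Poisson arrivals
        start = max(t_arrival, t_free)           # wait if server busy
        t_free = start + random.expovariate(mu)  # exponential service
        total += t_free - t_arrival
    return total / n_customers

lam, mu = 3.0, 5.0                      # assumed request/service rates
simulated = mm1_mean_sojourn(lam, mu)
theoretical = 1.0 / (mu - lam)          # M/M/1 result: W = 1/(mu - lam)
print(f"simulated W={simulated:.3f}, theory W={theoretical:.3f}")
```

A profiling tool of the kind the article describes composes many such nodes into a network and fits their rates to measurements from the real distributed system.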
Object-oriented Tools for Distributed Computing
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1993-01-01
Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.
Code of Federal Regulations, 2011 CFR
2011-01-01
... lien accommodation or subordination and including the amount and maturity of the proposed loan, a... provisions of applicable model codes for any buildings to be constructed, as required by 7 CFR 1792.104; and...
Data Sharing in DHT Based P2P Systems
NASA Astrophysics Data System (ADS)
Roncancio, Claudia; Del Pilar Villamil, María; Labbé, Cyril; Serrano-Alvarado, Patricia
The evolution of peer-to-peer (P2P) systems triggered the building of large scale distributed applications. The main application domain is data sharing across a very large number of highly autonomous participants. Building such data sharing systems is particularly challenging because of the “extreme” characteristics of P2P infrastructures: massive distribution, high churn rate, no global control, potentially untrusted participants... This article focuses on declarative querying support, query optimization and data privacy on a major class of P2P systems, that based on Distributed Hash Table (P2P DHT). The usual approaches and the algorithms used by classic distributed systems and databases for providing data privacy and querying services are not well suited to P2P DHT systems. A considerable amount of work was required to adapt them for the new challenges such systems present. This paper describes the most important solutions found. It also identifies important future research trends in data management in P2P DHT systems.
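The DHT layer underlying these systems can be sketched in a few lines. The toy below implements only the consistent-hashing placement rule (each key lives on the first node clockwise from its hash on the identifier ring); real P2P DHTs such as Chord or Kademlia add routing tables, replication, and churn handling on top of this idea. The node and key names are hypothetical.

```python
import bisect
import hashlib

def h(key):
    """Map a string onto the DHT identifier ring (160-bit SHA-1 space)."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class ToyDHT:
    """Minimal consistent-hashing store: no routing, no replication."""
    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)   # sorted node ids
        self.store = {n: {} for n in nodes}            # per-node storage

    def _owner(self, key):
        # First node clockwise from the key's position on the ring.
        ids = [nid for nid, _ in self.ring]
        i = bisect.bisect_right(ids, h(key)) % len(self.ring)
        return self.ring[i][1]

    def put(self, key, value):
        self.store[self._owner(key)][key] = value

    def get(self, key):
        return self.store[self._owner(key)].get(key)

dht = ToyDHT([f"node-{i}" for i in range(8)])
dht.put("doc:42", "building data")
print(dht.get("doc:42"), "stored on", dht._owner("doc:42"))
```

The querying and privacy challenges the article surveys arise precisely because this placement rule supports only exact-key lookup: range queries, joins, and access control all have to be rebuilt on top of it.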
The Application of Hardware in the Loop Testing for Distributed Engine Control
NASA Technical Reports Server (NTRS)
Thomas, George L.; Culley, Dennis E.; Brand, Alex
2016-01-01
The essence of a distributed control system is the modular partitioning of control function across a hardware implementation. This type of control architecture requires embedding electronics in a multitude of control element nodes for the execution of those functions, and their integration as a unified system. As the field of distributed aeropropulsion control moves toward reality, questions about building and validating these systems remain. This paper focuses on the development of hardware-in-the-loop (HIL) test techniques for distributed aero engine control, and the application of HIL testing as it pertains to potential advanced engine control applications that may now be possible due to the intelligent capability embedded in the nodes.
Telecommunications administration standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustwiller, K.D.
1996-05-01
The administration of telecommunications is critical to proper maintenance and operation. The intent is to be able to properly support telecommunications for the distribution of all information within a building/campus. This standard will provide a uniform administration scheme that is independent of applications, and will establish guidelines for owners, installers, designers and contractors. This standard will accommodate existing building wiring, new building wiring and outside plant wiring. Existing buildings may not readily adapt to all applications of this standard, but the requirement for telecommunications administration is applicable to all buildings. Administration of the telecommunications infrastructure includes documentation (labels, records, drawings, reports, and work orders) of cables, termination hardware, patching and cross-connect facilities, telecommunications rooms, and other telecommunications spaces (conduits, grounding, and cable pathways are documented by Facilities Engineering). The investment in properly documenting telecommunications is a worthwhile effort. It is necessary to adhere to these standards to ensure quality and efficiency for the operation and maintenance of the telecommunications infrastructure for Sandia National Laboratories.
Cold Climate and Retrofit Applications for Air-to-Air Heat Pumps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baxter, Van D
2015-01-01
Air source heat pumps (ASHP) including air-to-air ASHPs are easily applied to buildings almost anywhere for new construction as well as retrofits or renovations. They are widespread in milder climate regions but their use in cold regions is hampered due to low heating efficiency and capacity at cold outdoor temperatures. Retrofitting air-to-air ASHPs to existing buildings is relatively easy if the building already has an air distribution system. For buildings without such systems alternative approaches are necessary. Examples are ductless, minisplit heat pumps or central heat pumps coupled to small diameter, high velocity (SDHV) air distribution systems. This article presents two subjects: 1) a summary of R&D investigations aimed at improving the cold weather performance of ASHPs, and 2) a brief discussion of building retrofit options using air-to-air ASHP systems.
APPLICATION OF A FULLY DISTRIBUTED WASHOFF AND TRANSPORT MODEL FOR A GULF COAST WATERSHED
Advances in hydrologic modeling have been shown to improve the accuracy of rainfall runoff simulation and prediction. Building on the capabilities of distributed hydrologic modeling, a water quality model was developed to simulate buildup, washoff, and advective transport of a co...
The QuakeSim Project: Web Services for Managing Geophysical Data and Applications
NASA Astrophysics Data System (ADS)
Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet
2008-04-01
We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.
ERIC Educational Resources Information Center
Martins, Rosane Maria; Chaves, Magali Ribeiro; Pirmez, Luci; Rust da Costa Carmo, Luiz Fernando
2001-01-01
Discussion of the need to filter and retrieve relevant information from the Internet focuses on the use of mobile agents, specific software components which are based on distributed artificial intelligence and integrated systems. Surveys agent technology and discusses the agent building package used to develop two applications using IBM's Aglet…
Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop
NASA Astrophysics Data System (ADS)
Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.
2018-04-01
The data center is a concept of data processing and application proposed in recent years: a processing approach built on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster computing nodes and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it calls many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and solving the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability, and superiority of the system design were verified by testing the storage efficiency for different image data and multiple users, and by analyzing how the distributed storage architecture improves the application efficiency of remote sensing images through an actual Hadoop service system.
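The MapReduce pattern such an architecture relies on can be sketched in-process. The snippet below mimics Hadoop's map, shuffle, and reduce stages with plain Python over hypothetical tile records (image id, pyramid level, tile size), totalling storage per pyramid level; it illustrates the programming model only, not the paper's Hadoop system.

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to every record and group emitted values by
    key, mimicking Hadoop's map + shuffle stages in a single process."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's grouped values."""
    return {key: reducer(key, vals) for key, vals in groups.items()}

# Hypothetical tile records: (image_id, pyramid_level, tile_bytes).
tiles = [("scene-A", lvl, 256 * 256) for lvl in (0, 0, 0, 0, 1, 1, 2)]

# Mapper emits (level, size); reducer totals storage per pyramid level.
grouped = map_phase(tiles, lambda r: [(r[1], r[2])])
totals = reduce_phase(grouped, lambda level, sizes: sum(sizes))
print(totals)
```

On a real cluster the shuffle moves each key's values across the network to a reducer node, which is what lets many computing nodes process image blocks and pyramid levels concurrently.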
1996-04-01
...Members of MIT's Theory of Distributed Systems group have continued their work on modelling, designing, verifying and analyzing distributed and real-time systems. The focus is on the study of 'building-blocks' for the construction of reliable and efficient systems. Our work falls into three...
Programming your way out of the past: ISIS and the META Project
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Marzullo, Keith
1989-01-01
The ISIS distributed programming system and the META Project are described. The ISIS programming toolkit is an aid to low-level programming that makes it easy to build fault-tolerant distributed applications that exploit replication and concurrent execution. The META Project is reexamining high-level mechanisms such as the filesystem, shell language, and administration tools in distributed systems.
Web-based Distributed Medical Information System for Chronic Viral Hepatitis
NASA Astrophysics Data System (ADS)
Yang, Ying; Qin, Tuan-fa; Jiang, Jian-ning; Lu, Hui; Ma, Zong-e.; Meng, Hong-chang
2008-11-01
To enable long-term dynamic monitoring of chronically ill patients, especially those with hepatitis B virus (HBV) infection, we built a distributed Medical Information System for Chronic Viral Hepatitis (MISCHV). The Web-based system architecture and its functions are described, and its extensive application and important role are also presented.
Revised congressional budget request, FY 1982. Conservation and renewable energy program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-03-01
Programs dealing with conservation and renewable energy are reprinted from the Revised Congressional Budget Request FY 1982. From Volume 7, Energy Conservation, information is presented on: buildings and community systems; industrial programs; transportation programs; state and local programs; inventors program; energy conversion technology; energy impact assistance; and residential/commercial retrofit. From Volume 2, Energy Supply Research and Development, information and data are presented on: solar building applications; solar industrial applications; solar power applications; solar information systems; SERI facility; solar international activities; alcohol fuels; geothermal; and hydropower. From Volume 6, Energy Production, Demonstration, and Distribution, information and data on solar energy production, demonstration, and distribution are presented. From Volume 3, Energy Supply and R and D Appropriation, information and data on electric energy systems and energy storage systems are included. From Volume 4, information and data are included on the geothermal resources development fund. In Volume 5, Power Marketing Administrations, information and data are presented on estimates by appropriations, positions and staff years by appropriation, staffing distribution, and power marketing administrations. Rescissions and deferrals for FY 1981 are given. (MCW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendes, Goncalo; Feng, Wei; Stadler, Michael
The following paper conducts a regional analysis of the U.S. and Chinese buildings? potential for adopting Distributed Energy Resources (DER). The expected economics of DER in 2020-2025 is modeled for a commercial and a multi-family residential building in different climate zones. The optimal building energy economic performance is calculated using the Distributed Energy Resources Customer Adoption Model (DER CAM) which minimizes building energy costs for a typical reference year of operation. Several DER such as combined heat and power (CHP) units, photovoltaics, and battery storage are considered. The results indicate DER have economic and environmental competitiveness potential, especially for commercialmore » buildings in hot and cold climates of both countries. In the U.S., the average expected energy cost savings in commercial buildings from DER CAM?s suggested investments is 17percent, while in Chinese buildings is 12percent. The electricity tariffs structure and prices along with the cost of natural gas, represent important factors in determining adoption of DER, more so than climate. High energy pricing spark spreads lead to increased economic attractiveness of DER. The average emissions reduction in commercial buildings is 19percent in the U.S. as a result of significant investments in PV, whereas in China, it is 20percent and driven by investments in CHP. Keywords: Building Modeling and Simulation, Distributed Energy Resources (DER), Energy Efficiency, Combined Heat and Power (CHP), CO2 emissions 1. Introduction The transition from a centralized and fossil-based energy paradigm towards the decentralization of energy supply and distribution has been a major subject of research over the past two decades. Various concerns have brought the traditional model into question; namely its environmental footprint, its structural inflexibility and inefficiency, and more recently, its inability to maintain acceptable reliability of supply. 
Under such a troubled setting, distributed energy resources (DER), comprising small, modular, renewable or fossil-based electricity generation units placed at or near the point of energy consumption, have gained much attention as a viable alternative or addition to the current energy system. In 2010, China consumed about 30% of its primary energy in the buildings sector, leading the country to pay great attention to DER development and its applications in buildings. During the 11th Five Year Plan (FYP), China implemented 371 renewable energy building demonstration projects and 210 photovoltaics (PV) building integration projects. By the end of the 12th FYP, China is targeting renewable energy to provide 10% of total building energy and to save 30 million tons of coal equivalent (mtce) of energy with building-integrated renewables. China is also planning to implement one thousand natural gas-based distributed cogeneration demonstration projects with energy utilization rates over 70% in the 12th FYP. All these policy targets require significant DER systems development for building applications. China's fast urbanization makes building energy efficiency a crucial economic issue; however, only limited studies have examined how to design and select suitable building energy technologies in its different regions. In the U.S., buildings consumed 40% of the total primary energy in 2010 [1], and it is estimated that about 14 billion m2 of floor space of the existing building stock will be remodeled over the next 30 years. Most building renovation work has been on building envelopes, lighting, and HVAC systems. Although interest has emerged, less attention is being paid to DER for buildings.
This context has created opportunities for research, development, and progressive deployment of DER, due to its potential to combine the production of power and heat (CHP) near the point of consumption and deliver multiple benefits to customers, such as cost …
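DER-CAM itself is a mixed-integer optimization model; as a much simpler illustration of the cost-minimization idea behind it, the sketch below exhaustively searches DER portfolios for a single building. All option names and numbers here are invented for illustration and are not DER-CAM data or its API.

```python
from itertools import combinations

# Toy annualized economics for three DER options (illustrative assumptions).
BASELINE_COST = 100_000  # annual utility bill without DER, in USD
OPTIONS = {
    # name: (annualized capital cost, annual utility-bill savings)
    "pv":      (12_000, 17_000),
    "chp":     (20_000, 28_000),
    "battery": (8_000,  6_000),
}

def total_cost(selected):
    """Annual cost = baseline bill - savings + annualized capex."""
    capex = sum(OPTIONS[o][0] for o in selected)
    savings = sum(OPTIONS[o][1] for o in selected)
    return BASELINE_COST - savings + capex

def best_portfolio():
    """Exhaustively search every subset of DER options for the cheapest."""
    candidates = [
        frozenset(c)
        for r in range(len(OPTIONS) + 1)
        for c in combinations(OPTIONS, r)
    ]
    return min(candidates, key=total_cost)

best = best_portfolio()
print(sorted(best), total_cost(best))  # ['chp', 'pv'] 87000
```

Real DER adoption models replace this brute-force search with a mixed-integer linear program over hourly loads, tariffs, and equipment constraints; the objective (minimize annual energy cost) is the same.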
Bao, Yi; Chen, Yizheng; Hoehler, Matthew S; Smith, Christopher M; Bundy, Matthew; Chen, Genda
2017-01-01
This paper presents high temperature measurements using a Brillouin scattering-based fiber optic sensor and the application of the measured temperatures, together with building-code-recommended material parameters, in an enhanced thermomechanical analysis of simply supported steel beams subjected to combined thermal and mechanical loading. The distributed temperature sensor captures detailed, nonuniform temperature distributions that agree locally with thermocouple measurements to within a 4.7% average difference at a 95% confidence level. The simulated strains and deflections are validated using measurements from a second distributed fiber optic (strain) sensor and two linear potentiometers, respectively. The results demonstrate that the temperature-dependent material properties specified in the four investigated building codes lead to strain predictions with less than 13% average error at a 95% confidence level, and that the European building code provided the best predictions. However, the European code's implicit consideration of creep is insufficient when the beam temperature exceeds 800°C.
Students' Development of Structure Sense for the Distributive Law
ERIC Educational Resources Information Center
Schüler-Meyer, Alexander
2017-01-01
After being introduced to the distributive law in meaningful contexts, students need to extend its scope of application to unfamiliar expressions. In this article, a process model for the development of structure sense is developed. Building on this model, this article reports on a design research project in which exercise tasks support students…
VOLTTRON™: Tech-to-Market Best-Practices Guide for Small- and Medium-Sized Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cort, Katherine A.; Haack, Jereme N.; Katipamula, Srinivas
VOLTTRON™ is an open-source distributed control and sensing platform developed by Pacific Northwest National Laboratory for the U.S. Department of Energy. It was developed to be used by the Office of Energy Efficiency and Renewable Energy to support transactive controls research and deployment activities. VOLTTRON is designed to be an overarching integration platform that brings together vendors, users, and developers and enables rapid application development and testing. The platform is designed to support modern control strategies, including agent- and transaction-based controls. It is also designed to support the management of a wide range of applications, including heating, ventilation, and air-conditioning systems; electric vehicles; and distributed-energy and whole-building loads. This report was completed as part of the Building Technologies Office’s Technology-to-Market Initiative for VOLTTRON’s Market Validation and Business Case Development efforts. The report provides technology-to-market guidance and best practices related to VOLTTRON platform deployments and commercialization activities for use by entities serving small- and medium-sized commercial buildings. The report characterizes the platform ecosystem within the small- and medium-sized commercial building market and articulates the value proposition of VOLTTRON for three core participants in this ecosystem: 1) platform owners/adopters, 2) app developers, and 3) end users. The report also identifies key market drivers and opportunities for open platform deployments in this market. Possible pathways to the market are described, from laboratory testing to market adoption to commercialization. We also identify and address various technical and market barriers that could hinder deployment of VOLTTRON. Finally, we provide “best practice” tech-to-market guidance for building energy-related deployment efforts serving small- and medium-sized commercial buildings.
Models@Home: distributed computing in bioinformatics using a screensaver based approach.
Krieger, Elmar; Vriend, Gert
2002-02-01
Due to the steadily growing computational demands in bioinformatics and related scientific disciplines, one is forced to make optimal use of the available resources. A straightforward solution is to build a network of idle computers and let each of them work on a small piece of a scientific challenge, as done by Seti@Home (http://setiathome.berkeley.edu), the world's largest distributed computing project. We developed a generally applicable distributed computing solution that uses a screensaver system similar to Seti@Home. The software exploits the coarse-grained nature of typical bioinformatics projects. Three major considerations for the design were: (1) often, many different programs are needed, while the time is lacking to parallelize them. Models@Home can run any program in parallel without modifications to the source code; (2) in contrast to the Seti project, bioinformatics applications are normally more sensitive to lost jobs. Models@Home therefore includes stringent control over job scheduling; (3) to allow use in heterogeneous environments, Linux and Windows based workstations can be combined with dedicated PCs to build a homogeneous cluster. We present three practical applications of Models@Home, running the modeling programs WHAT IF and YASARA on 30 PCs: force field parameterization, molecular dynamics docking, and database maintenance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darula, Stanislav; Kocifaj, Miroslav; Kittler, Richard
2010-12-15
To ensure comfort and healthy conditions in interior spaces, the thermal, acoustic, and daylight factors of the environment have to be considered in building design. For effective energy performance in buildings, new technologies and applications are sought in daylight engineering as well, such as tubular light guides. These allow the transport of natural light into the building core, reducing energy consumption. Many installations with various geometrical and optical properties can be applied in real buildings. The simplest tubular light guide consists of a transparent cupola, a straight tube with a highly reflective inner surface, and a ceiling cover or diffuser redistributing light into the interior. Such a vertical tubular guide is often used on flat roofs. When the roof construction is inclined, a bend in the light guide system has to be installed. In this case the cupola is set on the sloped roof and collects sunlight and skylight from the visible part of the sky hemisphere as well as light reflected from the ground and opposite facades. In comparison with the vertical tube, some additional light losses and distortions of the propagated light have to be expected in bent tubular light guides. A theoretical model of light propagation has already been published; its applications are presented in this study, solving illuminance distributions on the ceiling cover interface and the further illuminance distribution on the working plane in the interior. (author)
A distributed computing model for telemetry data processing
NASA Astrophysics Data System (ADS)
Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.
1994-05-01
We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
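The information-sharing protocol described above combines a server-side cache of telemetry values with push-style distribution to clients. A stripped-down, in-process sketch of that pattern (class and parameter names are invented for illustration, not the NASA protocol):

```python
from collections import defaultdict

class TelemetryHub:
    """Sketch of an information-sharing hub: a server-side cache of the
    latest parameter values plus publish/subscribe fan-out to clients."""
    def __init__(self):
        self.latest = {}                      # parameter -> last value
        self.subscribers = defaultdict(list)  # parameter -> callbacks

    def subscribe(self, parameter, callback):
        self.subscribers[parameter].append(callback)
        if parameter in self.latest:          # replay the current value
            callback(parameter, self.latest[parameter])

    def publish(self, parameter, value):
        self.latest[parameter] = value
        for callback in self.subscribers[parameter]:
            callback(parameter, value)

hub = TelemetryHub()
seen = []
hub.subscribe("cabin_pressure", lambda p, v: seen.append((p, v)))
hub.publish("cabin_pressure", 14.7)
print(seen)  # [('cabin_pressure', 14.7)]
```

In the hybrid model, the same component can act as a server to downstream consumers and as a peer to other hubs; the replay-on-subscribe behavior is what lets late joiners (playback, training) see a consistent current state.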
An authentication infrastructure for today and tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1996-06-01
The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter- and intra-organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.
High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM
NASA Astrophysics Data System (ADS)
Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.
System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.
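The defining feature of the abstract state machine formalism used by CoreASM is that all update rules read the *current* state and their updates are applied simultaneously. A minimal sketch of that step semantics (the rules and state shown are invented examples, not CoreASM syntax):

```python
def asm_step(state, rules):
    """One abstract-state-machine step: every rule reads the current
    state; all updates are collected and then applied simultaneously."""
    updates = {}
    for rule in rules:
        updates.update(rule(state))
    new_state = dict(state)
    new_state.update(updates)
    return new_state

# Two rules that swap a and b; simultaneous semantics makes this correct
# without a temporary variable.
rules = [
    lambda s: {"a": s["b"]},
    lambda s: {"b": s["a"]},
]
state = {"a": 1, "b": 2}
state = asm_step(state, rules)
print(state)  # {'a': 2, 'b': 1}
```

Sequential application of the same two rules would set both locations to the same value; the simultaneous-update semantics is precisely what makes ASM specifications easy to reason about at a high level.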
Exploiting virtual synchrony in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, Thomas A.
1987-01-01
Applications of a virtually synchronous environment are described for distributed programming, which underlies a collection of distributed programming tools in the ISIS2 system. A virtually synchronous environment allows processes to be structured into process groups, and makes events like broadcasts to the group as an entity, group membership changes, and even migration of an activity from one place to another appear to occur instantaneously, in other words, synchronously. A major advantage to this approach is that many aspects of a distributed application can be treated independently without compromising correctness. Moreover, user code that is designed as if the system were synchronous can often be executed concurrently. It is argued that this approach to building distributed and fault tolerant software is more straightforward, more flexible, and more likely to yield correct solutions than alternative approaches.
ETICS: the international software engineering service for the grid
NASA Astrophysics Data System (ADS)
Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.
2008-07-01
The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.
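The "complex dependencies among applications and middleware components" that ETICS takes into account reduce, at their core, to ordering builds by a dependency graph. A tiny sketch using Python's standard topological sorter (component names are invented; the real ETICS metadata model is far richer):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical component graph: component -> set of prerequisites.
deps = {
    "app":        {"middleware", "libs"},
    "middleware": {"libs"},
    "libs":       set(),
}

# static_order() yields each component only after all its prerequisites.
build_order = list(TopologicalSorter(deps).static_order())
print(build_order)  # ['libs', 'middleware', 'app']
```

A build-and-test service extends this with per-node job submission (Metronome, in ETICS's case) and propagates artifacts along the same edges, but the ordering constraint is exactly the topological one.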
Applied Distributed Model Predictive Control for Energy Efficient Buildings and Ramp Metering
NASA Astrophysics Data System (ADS)
Koehler, Sarah Muraoka
Industrial large-scale control problems present an interesting algorithmic design challenge. A number of controllers must cooperate in real-time on a network of embedded hardware with limited computing power in order to maximize system efficiency while respecting constraints and despite communication delays. Model predictive control (MPC) can automatically synthesize a centralized controller which optimizes an objective function subject to a system model, constraints, and predictions of disturbance. Unfortunately, the computations required by model predictive controllers for large-scale systems often limit its industrial implementation only to medium-scale slow processes. Distributed model predictive control (DMPC) enters the picture as a way to decentralize a large-scale model predictive control problem. The main idea of DMPC is to split the computations required by the MPC problem amongst distributed processors that can compute in parallel and communicate iteratively to find a solution. Some popularly proposed solutions are distributed optimization algorithms such as dual decomposition and the alternating direction method of multipliers (ADMM). However, these algorithms ignore two practical challenges: substantial communication delays present in control systems and also problem non-convexity. This thesis presents two novel and practically effective DMPC algorithms. The first DMPC algorithm is based on a primal-dual active-set method which achieves fast convergence, making it suitable for large-scale control applications which have a large communication delay across its communication network. In particular, this algorithm is suited for MPC problems with a quadratic cost, linear dynamics, forecasted demand, and box constraints. We measure the performance of this algorithm and show that it significantly outperforms both dual decomposition and ADMM in the presence of communication delay. 
The second DMPC algorithm is based on an inexact interior point method which is suited for nonlinear optimization problems. The parallel computation of the algorithm exploits iterative linear algebra methods for the main linear algebra computations in the algorithm. We show that the splitting of the algorithm is flexible and can thus be applied to various distributed platform configurations. The two proposed algorithms are applied to two main energy and transportation control problems. The first application is energy efficient building control. Buildings represent 40% of energy consumption in the United States. Thus, it is significant to improve the energy efficiency of buildings. The goal is to minimize energy consumption subject to the physics of the building (e.g. heat transfer laws), the constraints of the actuators as well as the desired operating constraints (thermal comfort of the occupants), and heat load on the system. In this thesis, we describe the control systems of forced air building systems in practice. We discuss the "Trim and Respond" algorithm which is a distributed control algorithm that is used in practice, and show that it performs similarly to a one-step explicit DMPC algorithm. Then, we apply the novel distributed primal-dual active-set method and provide extensive numerical results for the building MPC problem. The second main application is the control of ramp metering signals to optimize traffic flow through a freeway system. This application is particularly important since urban congestion has more than doubled in the past few decades. The ramp metering problem is to maximize freeway throughput subject to freeway dynamics (derived from mass conservation), actuation constraints, freeway capacity constraints, and predicted traffic demand. In this thesis, we develop a hybrid model predictive controller for ramp metering that is guaranteed to be persistently feasible and stable. 
This contrasts with previous work on MPC for ramp metering, where such guarantees are absent. We apply a smoothing method to the hybrid model predictive controller and use the inexact interior point method to solve this nonlinear, non-convex ramp metering problem.
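ADMM, one of the distributed optimization baselines discussed above, splits a shared decision variable across agents that solve local problems in parallel and then reconcile through a consensus step. A minimal two-agent sketch on a toy quadratic (the problem and step sizes are illustrative, not the thesis's building or ramp-metering formulation):

```python
def admm_consensus(a1, a2, rho=1.0, iters=50):
    """Consensus ADMM for min_x 0.5*(x-a1)^2 + 0.5*(x-a2)^2, split
    between two agents; the optimum is the average (a1 + a2) / 2."""
    x1 = x2 = z = 0.0
    u1 = u2 = 0.0
    for _ in range(iters):
        # Local updates (parallelizable): argmin of local cost + penalty.
        x1 = (a1 + rho * (z - u1)) / (1.0 + rho)
        x2 = (a2 + rho * (z - u2)) / (1.0 + rho)
        # Consensus step, then dual (price) updates.
        z = 0.5 * (x1 + u1 + x2 + u2)
        u1 += x1 - z
        u2 += x2 - z
    return z

print(round(admm_consensus(2.0, 6.0), 4))  # 4.0
```

Each iteration requires one round of communication between the agents and the consensus node, which is exactly why the thesis's concern with communication delay matters: algorithms that need fewer, cheaper iterations win on real control networks.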
3D Building Reconstruction by Multiview Images and the Integrated Application with Augmented Reality
NASA Astrophysics Data System (ADS)
Hwang, Jin-Tsong; Chu, Ting-Chen
2016-10-01
This study presents an approach wherein photographs with a high degree of overlap are captured with a digital camera and used to generate three-dimensional (3D) point clouds via feature point extraction and matching. To reconstruct a building model, an unmanned aerial vehicle (UAV) is used to capture photographs from vertical shooting angles above the building. Multiview images are taken from the ground to eliminate the shielding effect on UAV images caused by trees. Point clouds from the UAV and multiview images are generated via Pix4Dmapper. By merging the two sets of point clouds via tie points, the complete building model is reconstructed. The 3D models are reconstructed using AutoCAD 2016 to generate vectors from the point clouds; SketchUp Make 2016 is used to rebuild a complete building model with textures. To apply 3D building models in urban planning and design, a modern approach is to rebuild the digital models; however, replacing the landscape design and building distribution in real time is difficult as the frequency of building replacement increases. One potential solution to these problems is augmented reality (AR). Using Unity3D and Vuforia to design and implement a smartphone application service, a markerless AR view of the building model can be built. This study is aimed at providing technical and design skills related to urban planning, urban design, and building information retrieval using AR.
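Merging two point clouds via tie points means fitting the transform that maps shared points in one cloud onto the other. In 2D this can be illustrated with a similarity transform written in complex arithmetic (the tie points below are invented; Pix4Dmapper fits a full 3D similarity from many tie points by least squares):

```python
def fit_similarity(src, dst):
    """Fit z -> a*z + b (rotation + scale + translation in the plane)
    from two tie-point pairs, using complex numbers as 2D points."""
    (s1, s2), (d1, d2) = src, dst
    a = (d2 - d1) / (s2 - s1)   # rotation and scale
    b = d1 - a * s1             # translation
    return a, b

def merge(points, a, b):
    """Map one point cloud into the other's coordinate frame."""
    return [a * z + b for z in points]

# Hypothetical tie points: one cloud is rotated 90 degrees and shifted.
src_ties = (0 + 0j, 1 + 0j)
dst_ties = (2 + 3j, 2 + 4j)   # the same two points in the target frame
a, b = fit_similarity(src_ties, dst_ties)
print(merge([0.5 + 0j], a, b))  # [(2+3.5j)]
```

With more than the minimum number of tie points, the same model is fit by least squares so that measurement noise in individual ties averages out.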
NASA Astrophysics Data System (ADS)
El Khattabi, El Mehdi; Mharzi, Mohamed; Raefat, Saad; Meghari, Zouhair
2018-05-01
In this paper, the thermal equivalence of the passive elements of a room in a building located in Fez, Morocco, is studied, and the possibility of replacing them with a semi-passive element such as ventilation is appraised. For this aim, a Fortran program taking into account the external meteorological conditions along with different parameters of the building envelope was developed. A new computational approach is adopted to determine the temperature distribution throughout the building's multilayer walls. A novel equation relating the internal temperature to the external conditions and the building envelope has been derived in the transient state.
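Transient temperature distributions through a wall are commonly computed with finite differences. The sketch below is a generic explicit (FTCS) scheme for a single-material wall, not the paper's multilayer Fortran model; node count, boundary temperatures, and the diffusion number are illustrative assumptions:

```python
def step_wall(temps, r):
    """One explicit finite-difference (FTCS) step of 1D heat conduction;
    r = alpha*dt/dx^2 must be <= 0.5 for stability. Boundary nodes are
    held at the imposed surface temperatures."""
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + r * (temps[i-1] - 2*temps[i] + temps[i+1])
    return new

# Wall initially at 20 C; the outside face jumps to 35 C at t = 0.
temps = [35.0] + [20.0] * 9
for _ in range(200):
    temps = step_wall(temps, 0.4)
print([round(t, 1) for t in temps])  # smooth profile from 35 C down to 20 C
```

A multilayer wall generalizes this by giving each node its own diffusivity and enforcing heat-flux continuity at layer interfaces; implicit schemes remove the stability restriction at the cost of solving a tridiagonal system per step.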
Aphinyanaphongs, Yin; Fu, Lawrence D; Aliferis, Constantin F
2013-01-01
Building machine learning models that identify unproven cancer treatments on the Health Web is a promising approach for dealing with the dissemination of false and dangerous information to vulnerable health consumers. Aside from the obvious requirement of accuracy, two issues are of practical importance in deploying these models in real world applications. (a) Generalizability: The models must generalize to all treatments (not just the ones used in the training of the models). (b) Scalability: The models can be applied efficiently to billions of documents on the Health Web. First, we provide methods and related empirical data demonstrating strong accuracy and generalizability. Second, by combining the MapReduce distributed architecture and high dimensionality compression via Markov Boundary feature selection, we show how to scale the application of the models to WWW-scale corpora. The present work provides evidence that (a) a very small subset of unproven cancer treatments is sufficient to build a model to identify unproven treatments on the web; (b) unproven treatments use distinct language to market their claims and this language is learnable; (c) through distributed parallelization and state of the art feature selection, it is possible to prepare the corpora and build and apply models with large scalability.
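The MapReduce-style scaling described above works because each document can be scored independently (map) and the per-document results merged associatively (reduce). A toy sketch of that shape, with an invented marker-word "model" standing in for the paper's trained classifier:

```python
from functools import reduce
from collections import Counter

# Toy stand-in for a learned model: a vocabulary of suspect marker words.
UNPROVEN_MARKERS = {"miracle", "cure-all", "secret"}

def map_doc(doc):
    """Map phase: classify one document independently (parallelizable)."""
    words = set(doc.lower().split())
    label = "unproven" if words & UNPROVEN_MARKERS else "other"
    return Counter({label: 1})

def reduce_counts(c1, c2):
    """Reduce phase: merge partial counts from the mappers."""
    return c1 + c2

docs = [
    "this miracle cure-all heals everything",
    "randomized controlled trial of chemotherapy",
    "the secret treatment they hide",
]
totals = reduce(reduce_counts, map(map_doc, docs), Counter())
print(dict(totals))  # {'unproven': 2, 'other': 1}
```

Because the reduce step is associative and commutative, the same code distributes across thousands of workers without coordination, which is what makes WWW-scale corpora tractable.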
NASA Technical Reports Server (NTRS)
Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)
2016-01-01
A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.
Systematic plan of building Web geographic information system based on ActiveX control
NASA Astrophysics Data System (ADS)
Zhang, Xia; Li, Deren; Zhu, Xinyan; Chen, Nengcheng
2003-03-01
A systematic plan for building a Web Geographic Information System (WebGIS) using ActiveX technology is proposed in this paper. In the proposed plan, ActiveX control technology is adopted in building the client-side application, and two different schemas are introduced to implement communication between controls in users' browsers and the middle application server. One is based on the Distributed Component Object Model (DCOM), the other on sockets. In the former schema, the middle service application is developed as a DCOM object that communicates with the ActiveX control through Object Remote Procedure Call (ORPC) and accesses data in the GIS data server through Open Database Connectivity (ODBC). In the latter, the middle service application is developed in Java; it communicates with the ActiveX control through TCP/IP sockets and accesses data in the GIS data server through Java Database Connectivity (JDBC). The first schema is usually developed in C/C++ and is difficult to develop and deploy. The second is relatively easy to develop, but its data transfer performance depends on Web bandwidth. A sample application was developed using the latter schema, and its performance proved better than that of some other WebGIS applications to some degree.
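The socket-based schema above is a plain request/reply middle tier: the browser control opens a TCP connection, sends a query, and reads a reply that the middle server answers from the GIS data store. A minimal sketch of that round trip (in Python rather than the paper's Java, with an invented in-memory "data server"):

```python
import socket, socketserver, threading

# Toy stand-in for the GIS data server: layer name -> stored summary.
DATA = {"layer:roads": "120 features", "layer:parcels": "87 features"}

class QueryHandler(socketserver.StreamRequestHandler):
    """Middle-tier handler: read one query line, answer it from DATA."""
    def handle(self):
        query = self.rfile.readline().decode().strip()
        reply = DATA.get(query, "not found")
        self.wfile.write((reply + "\n").encode())

def ask(port, query):
    """Client side: what the browser control's socket code would do."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall((query + "\n").encode())
        return s.makefile().readline().strip()

server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), QueryHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print(ask(server.server_address[1], "layer:roads"))  # 120 features
server.shutdown()
```

The real middle tier would answer queries via JDBC instead of a dictionary, but the wire-level pattern, one line of request and one framed reply per connection, is the same; its throughput limit is the Web bandwidth the abstract notes.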
A Linguistic Model in Component Oriented Programming
NASA Astrophysics Data System (ADS)
Crăciunean, Daniel Cristian; Crăciunean, Vasile
2016-12-01
It is a fact that component-oriented programming, when well organized, can bring a large increase in efficiency in the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. This paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application obtained by combining corresponding components. In our model, an aggregation application is a word in a language.
A development framework for distributed artificial intelligence
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Cottman, Bruce H.
1989-01-01
The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.
NASA Astrophysics Data System (ADS)
DiMola, Ashley M.
Buildings account for over 18% of the world's anthropogenic Greenhouse Gas (GHG) emissions. As a result, a technology that can offset GHG emissions associated with buildings has the potential to save over 9 gigatons of GHG emissions per year. High temperature fuel cell and absorption chiller (HTFC/AC) technology offers a relatively low-carbon option for meeting cooling and electric loads for buildings while producing almost no criteria pollutants. GHG emissions in the state of California would decrease by 7.48 million metric tons per year if every commercial building in the State used HTFC/AC technology to meet its power and cooling requirements. In order to realize the benefits of HTFC/AC technology on a wide scale, the distributed generation market needs to be exposed to the technology and informed of its economic viability and real-world potential. This work characterizes the economics associated with HTFC/AC technology using select scenarios that are representative of realistic applications. The financial impacts of various input factors are evaluated and the HTFC/AC simulations are compared to the economics of traditional building utilities. It is shown that, in addition to the emissions reductions derived from the systems, HTFC/AC technology is financially preferable in all of the scenarios evaluated. This work also presents the design of a showcase environment, centered on a beta-test application, that presents (1) system operating data gathered using a custom data acquisition module, and (2) HTFC/AC technology in a clear and approachable manner in order to serve the target audience of market stakeholders.
Point Picking and Distributing on the Disc and Sphere
2015-07-01
An interesting application of geodesic subdivision is the design of domes, buildings, and structures (e.g., Spaceship Earth at Epcot, Walt Disney World).
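The two classic pitfalls in point picking on the disc and sphere can be shown in a short sketch (a minimal illustration with hypothetical function names and a seeded generator of my own, not code from the report): naive uniform-radius sampling clusters points at the disc's center, and naive latitude/longitude sampling clusters points at the sphere's poles.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_points_on_sphere(n):
    # Normalized Gaussian triples are uniform on the unit sphere;
    # naive (uniform latitude, longitude) sampling would cluster at the poles.
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def random_points_on_disc(n):
    # Radius must be sqrt(U): enclosed area grows as r^2, so sampling
    # the radius uniformly would over-weight the center of the disc.
    r = np.sqrt(rng.uniform(size=n))
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack((r * np.cos(theta), r * np.sin(theta)))

sphere_pts = random_points_on_sphere(1000)
disc_pts = random_points_on_disc(1000)
```

Both constructions are standard; the Gaussian-normalization trick works because the multivariate normal distribution is rotationally symmetric.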
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1991-01-01
We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make the recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.
Research of Ancient Architectures in Jin-Fen Area Based on GIS&BIM Technology
NASA Astrophysics Data System (ADS)
Jia, Jing; Zheng, Qiuhong; Gao, Huiying; Sun, Hai
2017-05-01
The number of well-preserved ancient buildings in Shanxi Province, which holds the largest share of ancient architecture in China, is about 18,418; of these, 9,053 are of wood-frame construction. The value of applying BIM (Building Information Modeling) and GIS (Geographic Information System) is gradually being explored and demonstrated in the corresponding fields of ancient architecture: spatial distribution information management, routine maintenance, special conservation and restoration, and the evaluation and simulation of related disasters such as earthquakes. The research objects are ancient architectures in the Jin-Fen area, first investigated by Sicheng LIANG and recorded in his "Chinese Ancient Architectures Survey Report". The research objects include those in Sicheng LIANG's investigation, with further adjustments made through the authors' on-site investigation and literature search and collection. During this research, a Geodatabase of the spatial distribution of the research objects was established using GIS. A BIM component library for ancient buildings was formed by combining on-site investigation data with classic precedent works such as "Yingzao Fashi" (a treatise on architectural methods of the Song Dynasty), the "Yongle Encyclopedia", and "Gongcheng Zuofa Zeli" (case collections of engineering practice by the Ministry of Construction of the Qing Dynasty). A building of the Guangsheng Temple in Hongtong County is selected as an example to elaborate the BIM model construction process based on this component library.
Based on the foregoing results (spatial distribution data, feature attribute data, 3D graphic information, and the parametric building information model), an information management system for ancient architectures in the Jin-Fen area, utilizing GIS and BIM technology, can be constructed to support further research on seismic disaster analysis and seismic performance simulation.
Should we trust build-up/wash-off water quality models at the scale of urban catchments?
Bonhomme, Céline; Petrucci, Guido
2017-01-01
Models of runoff water quality at the scale of an urban catchment usually rely on build-up/wash-off formulations obtained through small-scale experiments. Often, the physical interpretation of the model parameters, valid at the small scale, is transposed to large-scale applications. Testing different levels of spatial variability, the parameter distributions of a water quality model are obtained in this paper through a Markov chain Monte Carlo algorithm and analyzed. The simulated variable is the total suspended solids concentration at the outlet of a periurban catchment in the Paris region (2.3 km²), for which high-frequency turbidity measurements are available. This application suggests that build-up/wash-off models applied at the catchment scale do not maintain their physical meaning, but should be considered as "black-box" models. Copyright © 2016 Elsevier Ltd. All rights reserved.
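The build-up/wash-off formulations the abstract refers to are commonly written as an exponential dry-weather accumulation followed by runoff-driven exponential removal. A minimal sketch under assumed, purely illustrative parameter values (not the calibrated values from the study):

```python
import numpy as np

def buildup(t_dry, b_max=50.0, k_b=0.4):
    # Exponential build-up: surface load (e.g. kg/ha) approaches b_max
    # asymptotically over t_dry dry days at rate k_b (1/day).
    return b_max * (1.0 - np.exp(-k_b * t_dry))

def washoff(b0, runoff, k_w=0.2, w=1.1, dt=1.0):
    # Exponential wash-off: the mass removed in each step is proportional
    # to the remaining load and a power of the runoff intensity q.
    b = b0
    removed = []
    for q in runoff:
        m = b * (1.0 - np.exp(-k_w * q**w * dt))
        b -= m
        removed.append(m)
    return np.array(removed)

b0 = buildup(5.0)                    # load accumulated over 5 dry days
masses = washoff(b0, [2.0] * 10)     # load washed off over a 10-step event
```

With constant runoff, the washed-off mass decays step by step because the remaining surface load is depleted; this is the small-scale behavior whose parameters, the paper argues, lose physical meaning at the catchment scale.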
Integrating the Web and continuous media through distributed objects
NASA Astrophysics Data System (ADS)
Labajo, Saul P.; Garcia, Narciso N.
1998-09-01
The Web has rapidly grown to become the standard for document interchange on the Internet. At the same time, interest in transmitting continuous media flows over the Internet, with associated applications such as multimedia on demand, is also growing. Integrating both kinds of systems should allow building true hypermedia systems in which any media object can be linked from any other, taking temporal and spatial synchronization into account. One way to achieve this integration is the CORBA architecture, a standard for open distributed systems; there are also recent efforts to integrate Web and CORBA systems. We use this architecture to build a service for distributing data flows with timing restrictions. To integrate it with the Web we use, on one side, Java applets that can use the CORBA architecture and are embedded in HTML pages; on the other side, we also benefit from the efforts to integrate CORBA and the Web.
Low-Cost Bio-Based Phase Change Materials as an Energy Storage Medium in Building Envelopes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biswas, Kaushik; Abhari, Mr. Ramin; Shukla, Dr. Nitin
2015-01-01
A promising approach to increasing the energy efficiency of buildings is the implementation of phase change material (PCM) in building envelope systems. Several studies have reported the energy saving potential of PCM in building envelopes. However, wide application of PCMs in building applications has been inhibited, in part, by their high cost. This article describes a novel paraffin product made of naturally occurring fatty acids/glycerides trapped into high density polyethylene (HDPE) pellets and its performance in a building envelope application, with the ultimate goal of commercializing a low-cost PCM platform. The low-cost PCM pellets were mixed with cellulose insulation, installed in external walls and field-tested under natural weatherization conditions for a period of several months. In addition, several PCM samples and PCM-cellulose samples were prepared under controlled conditions for laboratory-scale testing. The laboratory tests were performed to determine the phase change properties of PCM-enhanced cellulose insulation both at microscopic and macroscopic levels. This article presents the data and analysis from the exterior test wall and the laboratory-scale test data. PCM behavior is influenced by the weather and interior conditions, PCM phase change temperature and PCM distribution within the wall cavity, among other factors. Under optimal conditions, the field data showed up to 20% reduction in weekly heat transfer through an external wall due to the PCM compared to cellulose-only insulation.
The potential for microtechnology applications in energy systems: Results of an experts workshop
NASA Astrophysics Data System (ADS)
1995-02-01
Microscale technologies, or microelectromechanical systems (MEMS), are currently under development in the United States and abroad. Examples include microsensors, microactuators (including micromotors), and microscale heat exchangers. Typically, microscale devices have features ranging in size from a few microns to several millimeters, with fabrication methods adapted from those developed for the semiconductor industry. Microtechnologies are already being commercialized; initial markets include the biomedical and transportation industries. Applications are being developed in other industries as well. Researchers at the Pacific Northwest Laboratory (PNL) hypothesize that a significant number of energy applications are possible. These applications range from environmental sensors that support enhanced control of building (or room) temperature and ventilation to microscale heat pumps and microscale heat engines that could collectively provide for kilowatt quantities of energy conversion. If efficient versions of these devices are developed, they could significantly advance the commercialization of distributed energy conversion systems, thereby reducing the energy losses associated with energy distribution. Based upon the potential for energy savings, the U.S. Department of Energy (DOE) Office of Building Technologies (OBT) has proposed a new initiative in energy systems miniaturization. The program would focus on the development of microtechnologies for the manufactured housing sector and would begin in either FY 1997 or FY 1998, ramping up to $5 million per year investment by FY 2001.
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei; ...
2017-06-12
Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium-sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.
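The "arithmetic efficiency models" mentioned as the simpler prior approach amount to multiplying per-stage conversion efficiencies along each delivery path. A toy comparison with assumed stage efficiencies (all numbers here are illustrative assumptions, not the paper's measured or simulated values):

```python
# Conversion-stage efficiencies (illustrative assumptions, not measured values).
ETA = {
    "pv_inverter": 0.96,   # PV DC output -> AC bus (inverter)
    "rectifier":   0.93,   # AC bus -> DC at an internal-DC load
    "dc_dc":       0.98,   # PV DC output -> DC distribution bus (converter)
    "pod":         0.98,   # DC bus -> load point-of-use converter
}

def delivered_fraction(stages):
    # Fraction of generated energy that reaches the load: the product
    # of the efficiencies of every conversion stage along the path.
    f = 1.0
    for s in stages:
        f *= ETA[s]
    return f

ac_path = delivered_fraction(["pv_inverter", "rectifier"])  # PV -> AC bus -> DC load
dc_path = delivered_fraction(["dc_dc", "pod"])              # PV -> DC bus -> DC load
savings = (dc_path - ac_path) / ac_path
```

The DC path wins here simply because it avoids the lossy inverter/rectifier round trip; the paper's Modelica simulations refine this arithmetic picture with time-varying loads, battery cycling, and part-load converter behavior.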
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auslander, David; Culler, David; Wright, Paul
The goal of the 2.5-year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide "deep" demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (openADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh.
The maximum peak load during the study period was 1175 kW. Several new tools facilitated this work, such as the Smart Energy Box, the distributed load controller or Energy Information Gateway, the web-based DR controller (dubbed the Central Load-Shed Coordinator or CLSC), and the Demand Response Capacity Assessment & Operation Assistance Tool (DRCAOT). In addition, an innovative data aggregator called sMAP (simple Measurement and Actuation Profile) allowed data from different sources to be collected in a compact form and facilitated detailed analysis of the building systems' operation. A smart phone application (RAP or Rapid Audit Protocol) facilitated an inventory of the building's plug loads. Carbon dioxide sensors located in conference rooms and classrooms allowed demand-controlled ventilation. The extensive submetering and nimble access to this data provided great insight into the details of the building operation as well as quick diagnostics and analyses of tests. For example, students discovered a short-cycling chiller, a stuck damper, and a leaking cooling coil in the first field tests. For our final field tests, we were able to see how each zone was affected by the DR strategies (e.g., the offices on the 7th floor grew very warm quickly) and fine-tune the strategies accordingly.
There is a need to develop modeling and data analysis tools to increase our understanding of human exposures to air pollutants beyond what can be explained by "limited" field data. Modeling simulations of complex distributions of pollutant concentrations within roadw...
Combating Stovepipes: Implementing Workflow in uPortal
ERIC Educational Resources Information Center
Askren, Mark; Arseniev, Marina
2004-01-01
With a new wave of campus IT solutions comes a familiar challenge: How does one encourage and help build distributed applications without being overrun by stovepipe solutions? Anticipating significant enrollment growth (5 percent per year) for the next several years, double-digit annual growth in funded research, and little chance that resources…
Result Merging Strategies for a Current News Metasearcher.
ERIC Educational Resources Information Center
Rasolofo, Yves; Hawking, David; Savoy, Jacques
2003-01-01
Metasearching of online current news services is a potentially useful Web application of distributed information retrieval techniques. Reports experiences in building a metasearcher designed to provide up-to-date searching over a significant number of rapidly changing current news sites, focusing on how to merge results from the search engines at…
Control of New Copper Corrosion in High-Alkalinity Drinking Water using Orthophosphate - article
Research and field experience have shown that high-alkalinity waters can be associated with elevated copper levels in drinking water. The objective of this study was to document the application of orthophosphate to the distribution system of a building with a copper problem asso...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.
VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves four primary roles: • a reference platform for researchers to quickly develop control applications for transactive energy; • a reference platform with flexible data store support for energy analytics applications, whether in academia or in commercial enterprise; • a platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines; • an accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.
Intelligent Agents for the Digital Battlefield
1998-11-01
A specific outcome of our long-term research will be the development of a collaborative agent technology system, CATS, that will provide the underlying ... software infrastructure needed to build large, heterogeneous, distributed agent applications. CATS will provide a software environment through which multiple ... intelligent agents may interact with other agents, both human and computational. In addition, CATS will contain a number of intelligent agent components that will be useful for a wide variety of applications.
Economic aspects of possible residential heating conservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopkowicz, M.; Szul, A.
1995-12-31
The paper presents methods for evaluating the energy- and economy-related effects of different conservation actions in residential buildings. It also identifies a method for selecting the most effective distribution of funds assigned to weatherization, as well as the necessary improvements to be implemented within the heating node and the internal heating system of a building. The analysis of data gathered for four 11-story residential buildings of the "Zeran" type, subjects of the Conservation Demonstrative Project, included a differentiated scope of weatherization efforts and various system-upgrading actions. Based on the discussion of the split of heat losses in a building, as well as the established energy savings for numerous upgrading options, the main problem has been defined: the optimal distribution of financial means among the discussed measures when the total amount of funds assigned for modifications is fixed. A method based on the principle of relative increments is suggested, and the economic and energy specifications of the building and its components required by this method are elaborated. Applying this method allowed the suggested optimal scope of actions to be defined within the entire fund assigned for comprehensive weatherization.
Exploiting replication in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, T. A.
1989-01-01
Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.
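The reduction to order-based consistency can be illustrated with a toy totally ordered broadcast: if every replica applies messages in one global sequence order, replicated state stays consistent regardless of arrival order. This is a minimal sketch under simplifying assumptions (the `Sequencer`/`Replica` names are hypothetical, and a real reliable broadcast protocol must also tolerate failures and lost messages):

```python
class Sequencer:
    # A central sequencer stamps each message with a global sequence number,
    # defining one total order that all replicas must follow.
    def __init__(self):
        self.next_seq = 0

    def stamp(self, msg):
        self.next_seq += 1
        return (self.next_seq, msg)

class Replica:
    # Each replica buffers out-of-order messages and applies them strictly
    # in sequence-number order, so all replicas converge on the same log.
    def __init__(self):
        self.expected = 1
        self.pending = {}
        self.log = []

    def deliver(self, stamped):
        seq, msg = stamped
        self.pending[seq] = msg
        while self.expected in self.pending:   # apply any now-contiguous prefix
            self.log.append(self.pending.pop(self.expected))
            self.expected += 1

seq = Sequencer()
msgs = [seq.stamp(m) for m in ["set x=1", "set x=2", "set y=3"]]
r1, r2 = Replica(), Replica()
for s in msgs:
    r1.deliver(s)
for s in reversed(msgs):   # a different network arrival order
    r2.deliver(s)
```

Both replicas end with identical logs even though messages arrived in opposite orders, which is the essence of the order-based consistency primitive the abstract describes.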
Data-Driven Benchmarking of Building Energy Efficiency Utilizing Statistical Frontier Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kavousian, A; Rajagopal, R
2014-01-01
Frontier methods quantify the energy efficiency of buildings by forming an efficient frontier (best-practice technology) and by comparing all buildings against that frontier. Because energy consumption fluctuates over time, the efficiency scores are stochastic random variables. Existing applications of frontier methods in energy efficiency either treat efficiency scores as deterministic values or estimate their uncertainty by resampling from one set of measurements. Availability of smart meter data (repeated measurements of energy consumption of buildings) enables using actual data to estimate the uncertainty in efficiency scores. Additionally, existing applications assume a linear form for the efficient frontier; i.e., they assume that the best-practice technology scales up and down proportionally with building characteristics. However, previous research shows that buildings are nonlinear systems. This paper proposes a statistical method called stochastic energy efficiency frontier (SEEF) to estimate a bias-corrected efficiency score and its confidence intervals from measured data. The paper proposes an algorithm to specify the functional form of the frontier, identify the probability distribution of the efficiency score of each building using measured data, and rank buildings based on their energy efficiency. To illustrate the power of SEEF, this paper presents the results from applying SEEF on a smart meter data set of 307 residential buildings in the United States. SEEF efficiency scores are used to rank individual buildings based on energy efficiency, to compare subpopulations of buildings, and to identify irregular behavior of buildings across different time-of-use periods. SEEF is an improvement on the energy-intensity method (comparing kWh/sq.ft.): whereas SEEF identifies efficient buildings across the entire spectrum of building sizes, the energy-intensity method showed bias toward smaller buildings.
The results of this research are expected to assist researchers and practitioners in comparing and ranking (i.e., benchmarking) buildings more robustly and over a wider range of building types and sizes. Eventually, doing so is expected to result in improved resource allocation in energy-efficiency programs.
Attributes of the Federal Energy Management Program's Federal Site Building Characteristics Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loper, Susan A.; Sandusky, William F.
2010-12-31
Typically, the Federal building stock is referred to as a group of about one-half million buildings throughout the United States. Additional information beyond this level is generally limited to the distribution of that total by agency, and perhaps by state. However, additional characterization of the Federal building stock is required as the Federal sector seeks ways to implement efficiency projects to reduce energy and water use intensity as mandated by legislation and Executive Order. Using a Federal facility database that was assembled for use in a geographic information system tool, additional characterization of the Federal building stock is provided, including information regarding the geographical distribution of sites, building counts and percentage of total by agency, distribution of sites and building totals by agency, distribution of building count and floor space by Federal building type classification by agency, and rank ordering of sites, buildings, and floor space by state. A case study is provided regarding how the building stock changed for the Department of Energy from 2000 through 2008.
Programmable multi-node quantum network design and simulation
NASA Astrophysics Data System (ADS)
Dasari, Venkat R.; Sadlier, Ronald J.; Prout, Ryan; Williams, Brian P.; Humble, Travis S.
2016-05-01
Software-defined networking offers a device-agnostic programmable framework to encode new network functions. Externally centralized control plane intelligence allows programmers to write network applications and to build functional network designs. OpenFlow is a key protocol widely adopted to build programmable networks because of its programmability, flexibility and ability to interconnect heterogeneous network devices. We simulate the functional topology of a multi-node quantum network that uses programmable network principles to manage quantum metadata for protocols such as teleportation, superdense coding, and quantum key distribution. We first show how the OpenFlow protocol can manage the quantum metadata needed to control the quantum channel. We then use numerical simulation to demonstrate robust programmability of a quantum switch via the OpenFlow network controller while executing an application of superdense coding. We describe the software framework implemented to carry out these simulations and we discuss near-term efforts to realize these applications.
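Superdense coding, one of the protocols named above, is easy to verify numerically: Alice applies one of four Pauli operations to her half of a shared Bell pair to encode two classical bits, and Bob recovers them with a Bell-basis measurement. This is a standalone numpy sketch of the protocol itself, not the paper's OpenFlow-driven simulator:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Shared Bell pair |Phi+> = (|00> + |11>)/sqrt(2), basis order |q1 q2>.
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Alice's local operation on her qubit encodes two classical bits.
ENCODE = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

# The four orthogonal Bell states Bob distinguishes with a joint measurement.
BELL = {
    (0, 0): np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),
    (0, 1): np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),
    (1, 0): np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),
    (1, 1): np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),
}

def superdense(bits):
    # Alice acts on qubit 1 only, then sends it to Bob.
    state = np.kron(ENCODE[bits], I) @ phi_plus
    # Bob's Bell measurement: exactly one outcome has probability 1.
    probs = {b: abs(np.vdot(v, state)) ** 2 for b, v in BELL.items()}
    return max(probs, key=probs.get)
```

Because the four encoded states are exactly the four Bell states, Bob's measurement recovers both bits deterministically, i.e. two classical bits travel on one physical qubit plus the pre-shared entanglement.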
USDA-ARS?s Scientific Manuscript database
Developing better agricultural monitoring capabilities based on Earth Observation data is critical for strengthening food production information and market transparency. The coming Sentinel-2 mission has the optimal capacity for regional to global agriculture monitoring in terms of resolution (10-20...
E-Business in Education. What You Need To Know: Building Competencies for Tomorrow's Opportunities.
ERIC Educational Resources Information Center
Norris, Donald M.; Olson, Mark A.
This guidebook is based on the belief that e-business applications will transform academia and academic support experiences, with learners participating in distributed learning environments that mix physical and virtual learning resources in many combinations, and it offers insights into the strategies and planning needed to develop a college…
DOE Office of Scientific and Technical Information (OSTI.GOV)
VOLTTRON is an agent execution platform providing services to its agents that allow them to easily communicate with physical devices and other resources. VOLTTRON delivers an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions. VOLTTRON can independently manage a wide range of applications, such as HVAC systems, electric vehicles, distributed energy or entire building loads, leading to improved operational efficiency.
Self-defrosting recuperative air-to-air heat exchanger
Drake, Richard L.
1993-01-01
A heat exchanger includes a stationary spirally or concentrically wound heat exchanger core with rotating baffles on upper and lower ends thereof. The rotating baffles include rotating inlets and outlets which are in communication with respective fixed inlets and outlets via annuli. The rotation of the baffles causes a concurrent rotation of the temperature distribution within the stationary exchanger core, thereby preventing frost build-up in some applications and preventing the formation of hot spots in other applications.
New distributed radar technology based on UAV or UGV application
NASA Astrophysics Data System (ADS)
Molchanov, Pavlo A.; Contarino, Vincent M.
2013-05-01
Regular micro and nano radars cannot provide reliable tracking of low-altitude, low-profile aerial targets in urban and mountain areas because of reflections and re-reflections from buildings and terrain. They become visible and vulnerable to guided missiles if positioned on a tower or blimp. Doppler radar cannot distinguish moving cars from small low-altitude aerial targets in an urban area. A new concept of pocket-size distributed radar technology based on the application of UAVs (Unmanned Air Vehicles) and UGVs (Unmanned Ground Vehicles) is proposed for tracking low-altitude, low-profile aerial targets at short and medium distances for the protection of stadiums, camps, and military facilities in urban or mountain areas.
Multimodal inspection in power engineering and building industries: new challenges and solutions
NASA Astrophysics Data System (ADS)
Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof
2013-09-01
Recently the demand for, and number of applications of, full-field optical measurement methods based on noncoherent light sources has increased significantly. They include traditional image processing, thermovision, digital image correlation (DIC) and structured light methods. However, there are still numerous challenges connected with implementing these methods for in-situ, long-term monitoring in industrial, civil engineering and cultural heritage applications, for multimodal measurements of a variety of object features, or simply with adapting instruments to work in harsh environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system, combined with an infrared camera system, is applied in several interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of various steel struts of buildings at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.
Optimal Solar PV Arrays Integration for Distributed Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Li, Xueping
2012-01-01
Solar photovoltaic (PV) systems hold great potential for distributed energy generation by installing PV panels on rooftops of residential and commercial buildings. Yet challenges arise along with the variability and non-dispatchability of the PV systems that affect the stability of the grid and the economics of the PV system. This paper investigates the integration of PV arrays for distributed generation applications by identifying a combination of buildings that will maximize solar energy output and minimize system variability. Particularly, we propose mean-variance optimization models to choose suitable rooftops for PV integration based on Markowitz mean-variance portfolio selection model. We further introduce quantity and cardinality constraints to result in a mixed integer quadratic programming problem. Case studies based on real data are presented. An efficient frontier is obtained for sample data that allows decision makers to choose a desired solar energy generation level with a comfortable variability tolerance level. Sensitivity analysis is conducted to show the tradeoffs between solar PV energy generation potential and variability.
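As a rough illustration of the mean-variance selection idea described above, the following sketch brute-forces a cardinality-constrained rooftop choice. The building names and output samples are hypothetical, and the paper's actual models use mixed integer quadratic programming rather than enumeration:

```python
from itertools import combinations
from statistics import mean

# Hypothetical per-period PV output samples (kWh) for four candidate rooftops;
# real data would come from irradiance measurements or simulation.
outputs = {
    "A": [5.0, 6.2, 4.8, 5.5],
    "B": [4.9, 3.1, 6.0, 5.2],
    "C": [2.0, 2.1, 1.9, 2.2],
    "D": [6.5, 2.0, 6.8, 1.9],
}

def portfolio_stats(names):
    """Mean and (population) variance of the combined output of `names`."""
    periods = len(next(iter(outputs.values())))
    combined = [sum(outputs[n][t] for n in names) for t in range(periods)]
    m = mean(combined)
    v = sum((x - m) ** 2 for x in combined) / periods
    return m, v

def best_portfolio(k, risk_aversion=1.0):
    """Cardinality-constrained selection: maximize mean - lambda * variance
    over all subsets of exactly k rooftops (brute force for illustration)."""
    def score(names):
        m, v = portfolio_stats(names)
        return m - risk_aversion * v
    return max(combinations(sorted(outputs), k), key=score)
```

Sweeping `risk_aversion` traces out the efficient frontier mentioned in the abstract: low values favor total output, high values favor rooftop combinations whose outputs cancel each other's variability.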
Beam shaping in high-power laser systems with using refractive beam shapers
NASA Astrophysics Data System (ADS)
Laskin, Alexander; Laskin, Vadim
2012-06-01
Beam shaping of the spatial (transverse) profile of laser beams is highly desirable when building optical systems for high-power lasers, as well as in various applications of these lasers. Pumping the crystals of Ti:Sapphire lasers with laser radiation of uniform (flattop) intensity profile improves the performance of these ultrashort-pulse high-power lasers in terms of achievable efficiency, peak power, stability, and output beam profile. Specifications of solid-state lasers built according to the MOPA configuration can also be improved when the radiation of the master oscillator is homogenized and then amplified by the power amplifier. Building these high-power lasers requires that a beam shaping solution be capable of working with single-mode and multimode beams, provide flattop and super-Gauss intensity distributions, conserve the consistency and divergence of a beam after the intensity re-distribution, and provide low absorption. These specific conditions are perfectly fulfilled by refractive field mapping beam shapers due to their unique features: almost lossless intensity profile transformation, low output divergence, high transmittance and flatness of the output beam profile, extended depth of field, and adaptability to real intensity profiles of TEM00 and multimode laser sources. Combining refractive field mapping beam shapers with other optical components, like beam expanders, relay imaging lenses and anamorphic optics, makes it possible to generate laser spots of the necessary shape, size and intensity distribution. There are plenty of applications of high-power lasers where beam shaping brings benefits: irradiating the photocathode of Free Electron Lasers (FEL), material ablation, micromachining, annealing in display-making techniques, cladding, heat treating and others.
This paper describes some design basics of refractive beam shapers of the field mapping type, with emphasis on the features important for building and applying high-power laser sources. Results of applying the refractive beam shapers in real installations are also presented.
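For reference, the super-Gauss intensity distributions mentioned above can be sketched as follows, assuming the common convention I(r) = I0 · exp(−2·(r/w)^(2n)); the exact profile definition used by a given beam shaper may differ:

```python
import math

def super_gauss(r, w=1.0, n=5, i0=1.0):
    # Super-Gaussian intensity profile: I(r) = I0 * exp(-2 * (r/w)**(2*n)).
    # n = 1 recovers an ordinary Gaussian; larger n flattens the top and
    # steepens the edges, approximating the flattop profiles discussed above.
    return i0 * math.exp(-2.0 * (r / w) ** (2 * n))
```

Evaluating the profile at half the beam radius shows the flattop effect: the n = 5 super-Gaussian stays near full intensity there, while the ordinary Gaussian has already fallen substantially.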
NASA Astrophysics Data System (ADS)
Yang, S. W.; Ma, J. J.; Wang, J. M.
2018-04-01
As representative vulnerable regions of a city, areas with a dense distribution of temporary color steel buildings are a major target for the control of fire risks, illegal construction, environmental supervision, urbanization quality, and enhancement of the city's image. In the domestic and foreign literature, the related research mainly focuses on fire risks and violation monitoring. However, due to the special characteristics of temporary color steel buildings, the corresponding research on their temporal and spatial distribution and their influence on urban spatial form has not been reported. Therefore, this paper first extracts information on large-scale color steel buildings from high-resolution images. Second, the color steel plate buildings are classified, and the spatial and temporal distribution and aggregation characteristics of small (temporary) and large (factory, warehouse, etc.) buildings are studied respectively. Third, the coupling relationship between the spatial distribution of color steel plate buildings and the spatial pattern of urban space is analysed. The results show that there is a good coupling relationship between color steel plate buildings and the urban spatial form. Different types of color steel plate building represent the pattern of regional differentiation of urban space and the phased pattern of urban development.
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack; Moore, Shirley; Miller, Bart; Hollingsworth, Jeffrey
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images.
The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
Mobile Infrared Thermographic Surveys Of Buildings Within A Community
NASA Astrophysics Data System (ADS)
Allen, Sharon
1988-01-01
Over the years, constant developments and improvements have been made in the portability of infrared equipment. The ability to move around and travel from job to job easily greatly enhances the effectiveness of most in-field infrared thermographic surveys. Many vehicles have been modified to offer mobile infrared thermographic services. This paper describes one approach, and the results, to mobile infrared thermography. It covers the various stages in adapting a vehicle for mobile infrared thermography (IR) and problems encountered along the way. Originally designed for scanning electrical distribution lines, the "IR Van" also serves as a mobile unit for building diagnostics. The paper addresses building diagnostic applications for mobile IR and some of the findings recorded during an initial community investigation.
Hadoop-BAM: directly manipulating next generation sequencing data in the cloud.
Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo
2012-03-15
Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps.
A uniform approach for programming distributed heterogeneous computing systems
Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas
2014-01-01
Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Kaiyu; Yan, Da; Hong, Tianzhen
2014-02-28
Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
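The two distributions described above can be combined into a minimal overtime-schedule sampler; the parameter values are illustrative, not the fitted values from the paper's measured data:

```python
import random

def sample_overtime(n_occupants, p_overtime, mean_duration_h, rng=random):
    # Number of occupants working overtime ~ Binomial(n_occupants, p_overtime),
    # sampled here as a sum of Bernoulli trials; each overtime duration in
    # hours ~ Exponential(mean = mean_duration_h). Returns one evening's
    # durations, which a schedule generator would attach to closing time.
    n_staying = sum(rng.random() < p_overtime for _ in range(n_occupants))
    return [rng.expovariate(1.0 / mean_duration_h) for _ in range(n_staying)]
```

Repeating this draw per workday yields stochastic overtime occupancy schedules of the kind the study feeds into its building energy model.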
van Thienen, P; Vreeburg, J H G; Blokker, E J M
2011-02-01
Various particle transport mechanisms play a role in the build-up of discoloration potential in drinking water distribution networks. In order to enhance our understanding of and ability to predict this build-up, it is essential to recognize and understand their role. Gravitational settling with drag has primarily been considered in this context. However, since flow in water distribution pipes is nearly always in the turbulent regime, turbulent processes should be considered also. In addition to these, single particle effects and forces may affect radial particle transport. In this work, we present an application of a previously published turbulent particle deposition theory to conditions relevant for drinking water distribution systems. We predict quantitatively under which conditions turbophoresis, including the virtual mass effect, the Saffman lift force, and the Magnus force may contribute significantly to sediment transport in radial direction and compare these results to experimental observations. The contribution of turbophoresis is mostly limited to large particles (>50 μm) in transport mains, and not expected to play a major role in distribution mains. The Saffman lift force may enhance this process to some degree. The Magnus force is not expected to play any significant role in drinking water distribution systems.
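For the gravitational settling with drag mentioned above, a minimal sketch of the Stokes terminal velocity (valid only at low particle Reynolds number) is shown below; the default property values are generic assumptions for sediment in water, not the paper's data:

```python
def stokes_settling_velocity(d, rho_p=2500.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    # Terminal settling velocity (m/s) of a small sphere under Stokes drag:
    #   v = (rho_p - rho_f) * g * d**2 / (18 * mu)
    # d: particle diameter (m); rho_p, rho_f: particle and fluid density
    # (kg/m^3); mu: dynamic viscosity (Pa*s). Quadratic in d, which is why
    # the >50 um particles discussed above behave so differently from fines.
    return (rho_p - rho_f) * g * d ** 2 / (18.0 * mu)
```

For a 50 μm particle these assumptions give roughly 2 mm/s, while a 5 μm particle settles a hundred times slower, illustrating why turbulent mechanisms matter most for the larger size classes.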
Advanced Operating System Technologies
NASA Astrophysics Data System (ADS)
Cittolin, Sergio; Riccardi, Fabio; Vascotto, Sandro
In this paper we describe an R&D effort to define an OS architecture suitable for the requirements of the Data Acquisition and Control of an LHC experiment. Large distributed computing systems are foreseen to be the core part of the DAQ and Control system of the future LHC experiments. Networks of thousands of processors, handling dataflows of several gigabytes per second, with very strict timing constraints (microseconds), will become a common experience in the following years. Problems like distributed scheduling, real-time communication protocols, failure-tolerance, distributed monitoring and debugging will have to be faced. A solid software infrastructure will be required to manage this very complicated environment, and at this moment neither does CERN have the necessary expertise to build it, nor does any similar commercial implementation exist. Fortunately these problems are not unique to particle and high energy physics experiments, and the current research work in the distributed systems field, especially in the distributed operating systems area, is trying to address many of the above mentioned issues. The world that we are going to face in the next ten years will be quite different and surely much more interconnected than the one we see now. Very ambitious projects exist, planning to link towns, nations and the world in a single "Data Highway". Teleconferencing, Video on Demand and Distributed Multimedia Applications are just a few examples of the very demanding tasks to which the computer industry is committing itself. These projects are triggering a great research effort in the distributed, real-time micro-kernel based operating systems field and in the software engineering area.
The purpose of our group is to collect the outcome of these different research efforts, and to establish a working environment where the different ideas and techniques can be tested, evaluated and possibly extended, to address the requirements of a DAQ and Control System suitable for LHC. Our work started in the second half of 1994, with a research agreement between CERN and Chorus Systemes (France), the world leader in micro-kernel OS technology. The Chorus OS is targeted at distributed real-time applications, and it can very efficiently support different "OS personalities" in the same environment, like Posix, UNIX, and a CORBA-compliant distributed object architecture. Projects are being set up to verify the suitability of our work for LHC applications: we are building a scaled-down prototype of the DAQ system foreseen for the CMS experiment at LHC, where we will directly test our protocols and where we will be able to make measurements and benchmarks, guiding our development and allowing us to build an analytical model of the system, suitable for simulation and large scale verification.
ERIC Educational Resources Information Center
Bachore, Zelalem
2012-01-01
Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…
Self-defrosting recuperative air-to-air heat exchanger
Drake, R.L.
1993-12-28
A heat exchanger is described which includes a stationary spirally or concentrically wound heat exchanger core with rotating baffles on upper and lower ends thereof. The rotating baffles include rotating inlets and outlets which are in communication with respective fixed inlets and outlets via annuli. The rotation of the baffles causes a concurrent rotation of the temperature distribution within the stationary exchanger core, thereby preventing frost build-up in some applications and preventing the formation of hot spots in other applications. 3 figures.
Advanced Information Processing System (AIPS)
NASA Technical Reports Server (NTRS)
Pitts, Felix L.
1993-01-01
Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledgebase which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description is given followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
NASA Mars rover: a testbed for evaluating applications of covariance intersection
NASA Astrophysics Data System (ADS)
Uhlmann, Jeffrey K.; Julier, Simon J.; Kamgar-Parsi, Behzad; Lanzagorta, Marco O.; Shyu, Haw-Jye S.
1999-07-01
The Naval Research Laboratory (NRL) has spearheaded the development and application of Covariance Intersection (CI) for a variety of decentralized data fusion problems. Such problems include distributed control, onboard sensor fusion, and dynamic map building and localization. In this paper we describe NRL's development of a CI-based navigation system for the NASA Mars rover that stresses almost all aspects of decentralized data fusion. We also describe how this project relates to NRL's augmented reality, advanced visualization, and REBOT projects.
NASA Astrophysics Data System (ADS)
Zaharov, A. A.; Nissenbaum, O. V.; Ponomaryov, K. Y.; Nesgovorov, E. S.
2018-01-01
In this paper we study the application of the Internet of Things (IoT) concept and devices to secure automated process control systems. We review different approaches to IoT architecture and design and propose their use in several security applications for automated process control systems. We consider attribute-based encryption in the context of implementing an access control mechanism and propose a secret key distribution scheme between attribute authorities and end devices.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used as a fundamental component in building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
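While the paper's exact detector design is not reproduced here, an accrual suspicion level for Weibull-distributed heartbeat inter-arrival times can be sketched as follows; the scale estimator assumes a fixed, known shape parameter, which is a simplifying assumption:

```python
import math
from statistics import mean

def weibull_mle_scale(samples, k):
    # With the shape k fixed, the maximum-likelihood estimate of the
    # Weibull scale parameter is lam = (mean of x_i**k) ** (1/k).
    return mean(x ** k for x in samples) ** (1.0 / k)

def suspicion(t_since_last, lam, k):
    # Accrual suspicion phi(t) = -log10 P(next heartbeat arrives later than t).
    # For Weibull inter-arrivals the survival function is exp(-(t/lam)**k),
    # so phi(t) = (t/lam)**k / ln(10); it grows without bound as silence
    # lengthens, letting each application pick its own suspicion threshold.
    return (t_since_last / lam) ** k / math.log(10)
```

Applications then compare `suspicion(...)` to thresholds of their own choosing, which is the defining property of accrual failure detectors: one monitor, many consumers with different tolerance for false positives.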
Smart Building: Decision Making Architecture for Thermal Energy Management.
Uribe, Oscar Hernández; Martin, Juan Pablo San; Garcia-Alegre, María C; Santos, Matilde; Guinea, Domingo
2015-10-30
Smart applications of the Internet of Things are improving the performance of buildings, reducing energy demand. Local and smart networks, soft computing methodologies, machine intelligence algorithms and pervasive sensors are some of the basics of energy optimization strategies developed for the benefit of environmental sustainability and user comfort. This work presents a distributed sensor-processor-communication decision-making architecture to improve the acquisition, storage and transfer of thermal energy in buildings. The developed system is implemented in a near Zero-Energy Building (nZEB) prototype equipped with a built-in thermal solar collector, where optical properties are analysed; a low enthalpy geothermal accumulation system, segmented in different temperature zones; and an envelope that includes a dynamic thermal barrier. An intelligent control of this dynamic thermal barrier is applied to reduce the thermal energy demand (heating and cooling) caused by daily and seasonal weather variations. Simulations and experimental results are presented to highlight the nZEB thermal energy reduction.
Supervised Outlier Detection in Large-Scale MVS Point Clouds for 3D City Modeling Applications
NASA Astrophysics Data System (ADS)
Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.
2018-05-01
We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results. Although most outliers can be identified correctly (high recall), many inliers are erroneously removed (low precision), too. This aggravates object 3D reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e. we train separate inlier/outlier classifiers per semantic class (building facades, roof, ground, vegetation, fields, and water). Performance of learned filtering is evaluated on several large SfM point clouds of cities. We find that results confirm our underlying assumption that discriminatively learning inlier-outlier distributions does improve precision over global heuristics by up to ≈ 12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to ≈ 10 percentage points, being able to remove very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
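A minimal sketch of the non-semantic variant, using synthetic stand-in features in place of real MVS point attributes, might look like this; for the semantic variant, separate classifiers would simply be trained on per-class subsets of the points:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for per-point features (e.g. local density, height,
# color); real MVS features and hand-labeled inlier/outlier tags would
# replace these arrays.
inliers = rng.normal(0.0, 1.0, size=(200, 3))
outliers = rng.normal(0.0, 1.0, size=(40, 3)) + 6.0  # isolated points
X = np.vstack([inliers, outliers])
y = np.array([0] * 200 + [1] * 40)  # 0 = inlier, 1 = outlier

# Binary inlier/outlier classification of every raw point.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict(X)
```

Filtering then amounts to dropping every point the model labels 1, keeping class-specific inlier structure that a single global heuristic threshold would discard.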
Higher order statistical moment application for solar PV potential analysis
NASA Astrophysics Data System (ADS)
Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan
2016-10-01
Solar photovoltaic energy could serve as an alternative to fossil fuels, which are depleting and pose a global warming problem. However, this renewable energy is too variable and intermittent to be relied on alone. Therefore, knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5MW grid-connected photovoltaic system. Given the dynamic changes in skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm, the Pearson system, in which a probability distribution is selected by matching its theoretical moments with the empirical moments of the data, is suitable for this purpose. Taking advantage of the Pearson system implementation in MATLAB, software has been developed to support data processing for distribution fitting and potential analysis, projecting future AC power and solar irradiance availability.
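The higher-order moments used for Pearson-system matching can be computed directly from samples; this sketch (in Python rather than the paper's MATLAB) uses population, i.e. biased, estimators for simplicity:

```python
from statistics import mean

def higher_moments(xs):
    # Sample mean, variance, skewness, and kurtosis: the four moments used
    # to select a member of the Pearson distribution family by matching
    # theoretical moments against these empirical values.
    m = mean(xs)
    devs = [x - m for x in xs]
    n = len(xs)
    var = sum(d * d for d in devs) / n
    sd = var ** 0.5
    skew = sum(d ** 3 for d in devs) / (n * sd ** 3)
    kurt = sum(d ** 4 for d in devs) / (n * var * var)
    return m, var, skew, kurt
```

The skewness/kurtosis pair is what tracks the "dynamic changes" the abstract mentions: as irradiance conditions shift, so does the Pearson family member that best matches the data.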
Fiber-Optic Distribution Of Pulsed Power To Multiple Sensors
NASA Technical Reports Server (NTRS)
Kirkham, Harold
1996-01-01
Optoelectronic systems designed according to a time-sharing scheme distribute optical power to multiple integrated-circuit-based sensors in fiber-optic networks. These networks combine the flexibility of electronic sensing circuits with the advantage of the electrical isolation afforded by using optical fibers instead of electrical conductors to transmit both signals and power. Fiber optics resist corrosion and are immune to electromagnetic interference. Sensor networks of this type are useful in a variety of applications; for example, in monitoring strains in aircraft, buildings, and bridges, and in monitoring and controlling the shapes of flexible structures.
Building and measuring a high performance network architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, William T.C.; Toole, Timothy; Fisher, Chuck
2001-04-20
Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel, high performance networking technologies and to accumulate measurements that will give insights into the networks of the future.
State University of New York Institute of Technology (SUNYIT) Visiting Scholars Program
2013-05-01
team members, and build the necessary backend metal interconnections. ... Baek-Young Choi ... Cooperative and Opportunistic Mobile Cloud for Energy Efficient Positioning; Department of Computer Science and Electrical Engineering, University of Missouri - Kansas City. The fast-growing popularity of smartphones and tablets enables the use of various intelligent mobile applications. As many of
Software Testbed for Developing and Evaluating Integrated Autonomous Systems
2015-03-01
EUROPA planning system for plan generation. The adaptive controller executes the new plan, using augmented, hierarchical finite state machines to ... using the Internet Communications Engine (ICE), an object-oriented toolkit for building distributed applications. ... The ANML model is translated into the New Domain Definition Language (NDDL) and sent to NASA's EUROPA planning system for plan generation. The adaptive
Intelligent Systems Technologies and Utilization of Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.
2004-01-01
The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavlak, Gregory S.; Henze, Gregor P.; Hirsch, Adam I.
This paper demonstrates an energy signal tool to assess the system-level and whole-building energy use of an office building in downtown Denver, Colorado. The energy signal tool uses a traffic light visualization to alert a building operator to energy use that is substantially different from expected. The tool selects which light to display for a given energy end-use by comparing measured energy use to expected energy use, accounting for uncertainty. A red light is only displayed when a fault is likely enough, and abnormal operation costly enough, that taking action will yield the lowest cost result. While the theoretical advances and tool development were reported previously, the tool has only been tested using a basic building model and has not, until now, been experimentally verified. Expected energy use for the field demonstration is provided by a compact reduced-order representation of the Alliance Center, generated from a detailed DOE-2.2 energy model. Actual building energy consumption data is taken from the summer of 2014 for the office building immediately after a significant renovation project. The purpose of this paper is to demonstrate a first look at the building following its major renovation compared to the design intent. The tool indicated strong under-consumption in lighting and plug loads and strong over-consumption in HVAC energy consumption, which prompted several focused actions for follow-up investigation. In addition, this paper illustrates the application of Bayesian inference to the estimation of posterior parameter probability distributions from measured data. Practical discussion of the application is provided, along with additional findings from further investigating the significant difference between expected and actual energy consumption.
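The traffic-light logic described above can be sketched in a few lines: compare measured end-use energy against a model-based expectation with uncertainty, and escalate only when the deviation is large. The sigma thresholds, function name, and numbers below are illustrative assumptions, not the tool's actual decision rules (which weigh fault likelihood against cost).

```python
# Hypothetical sketch of a traffic-light energy signal; thresholds are
# illustrative assumptions, not the paper's cost-weighted decision rule.

def energy_signal(measured, expected_mean, expected_std):
    """Return 'green', 'yellow', or 'red' for one energy end-use."""
    z = (measured - expected_mean) / expected_std  # standardized deviation
    if abs(z) < 1.0:        # within ~1 sigma: operation looks normal
        return "green"
    if abs(z) < 2.0:        # mildly abnormal: worth watching
        return "yellow"
    return "red"            # deviation large enough to warrant follow-up

# e.g. lighting strongly under-consuming vs. HVAC close to expected:
print(energy_signal(55.0, 80.0, 8.0))     # strong under-consumption
print(energy_signal(102.0, 100.0, 10.0))  # near expected
```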
ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows
NASA Technical Reports Server (NTRS)
McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush
2004-01-01
With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However what the scientists need is an environment that will allow them to specify their application runs at a high organizational level, and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.
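The core of any workflow engine of this kind is dependency-respecting execution. A minimal sketch of that idea, using Python's standard-library topological sorter; the task names and graph are invented for illustration and have nothing to do with ScyFlow's actual task model.

```python
# Minimal sketch of dependency-driven workflow execution (illustrative
# tasks; a real engine like ScyFlow would dispatch to grid resources).
from graphlib import TopologicalSorter  # Python 3.9+

# edges: task -> set of tasks it depends on
workflow = {
    "postprocess": {"solve"},
    "solve": {"mesh", "config"},
    "mesh": set(),
    "config": set(),
}

def run(workflow):
    order = list(TopologicalSorter(workflow).static_order())
    for task in order:
        print(f"running {task}")  # placeholder for actual task dispatch
    return order

order = run(workflow)
```

`static_order` guarantees every task runs only after all of its dependencies.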
SemanticOrganizer: A Customizable Semantic Repository for Distributed NASA Project Teams
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Berrios, Daniel C.; Carvalho, Robert E.; Hall, David R.; Rich, Stephen J.; Sturken, Ian B.; Swanson, Keith J.; Wolfe, Shawn R.
2004-01-01
SemanticOrganizer is a collaborative knowledge management system designed to support distributed NASA projects, including diverse teams of scientists, engineers, and accident investigators. The system provides a customizable, semantically structured information repository that stores work products relevant to multiple projects of differing types. SemanticOrganizer is one of the earliest and largest semantic web applications deployed at NASA to date, and has been used in diverse contexts ranging from the investigation of Space Shuttle Columbia's accident to the search for life on other planets. Although the underlying repository employs a single unified ontology, access control and ontology customization mechanisms make the repository contents appear different for each project team. This paper describes SemanticOrganizer, its customization facilities, and a sampling of its applications. The paper also summarizes some key lessons learned from building and fielding a successful semantic web application across a wide-ranging set of domains with diverse users.
Pauliuk, Stefan; Kondo, Yasushi; Nakamura, Shinichiro; Nakajima, Kenichi
2017-01-01
Substantial amounts of post-consumer scrap are exported to other regions or lost during recovery and remelting, and both export and losses constrain the goal of regionally closed material cycles. To quantify the challenges and trade-offs associated with closed-loop metal recycling, we looked at material cycles from the perspective of a single material unit and traced a unit of material through several product life cycles. Focusing on steel, we used current process parameters, loss rates, and trade patterns of the steel cycle to study how steel that was originally contained in high-quality applications with stringent purity requirements, such as machinery or vehicles, subsequently gets distributed across different regions and product groups with less stringent purity requirements, such as building and construction. We applied MaTrace Global, a supply-driven multiregional model of steel flows coupled to a dynamic stock model of steel use. We found that, depending on region and product group, up to 95% of the steel consumed today will leave the use phase of that region by 2100, and that up to 50% can be lost to obsolete stocks, landfills, or slag piles by 2100. The high losses resulting from business-as-usual scrap recovery and recycling can be reduced, both by diverting post-consumer scrap into long-lived applications such as buildings and by improving the recovery rates in the waste management and remelting industries. Because the lifetimes of high-quality (cold-rolled) steel applications are shorter and remelting occurs more often than for buildings and infrastructure, we found and quantified a trade-off between low losses and high-quality applications in the steel cycle. Furthermore, we found that with current trade patterns, reduced overall losses will lead to higher fractions of secondary steel being exported to other regions. Current loss rates, product lifetimes, and trade patterns impede the closure of the steel cycle.
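The "follow one unit of material" idea behind a supply-driven model like MaTrace can be illustrated with a toy loss chain: each product life cycle loses a fixed fraction to obsolete stocks, landfills, or slag, and the remainder is remelted into the next application. The 10% per-cycle loss rate and seven cycles are illustrative assumptions, not MaTrace parameters.

```python
# Toy single-unit tracing sketch; the loss rate is an illustrative
# assumption, not a value from the MaTrace Global model.

def trace_unit(loss_rate, cycles):
    """Fraction of the original unit still in use after each life cycle."""
    in_use, history = 1.0, []
    for _ in range(cycles):
        in_use *= (1.0 - loss_rate)  # survivors of recovery + remelting
        history.append(in_use)
    return history

history = trace_unit(0.10, 7)  # roughly seven short product lifetimes
print(f"lost by end of horizon: {1 - history[-1]:.1%}")
```

Even a modest per-cycle loss compounds to a large cumulative loss over several life cycles, which is the mechanism behind the large long-run losses the paper reports.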
Construction of high-rise buildings in the Far East of Russia
NASA Astrophysics Data System (ADS)
Kudryavtsev, Sergey; Bugunov, Semen; Pogulyaeva, Evgeniya; Peters, Anastasiya; Kotenko, Zhanna; Grigor'yev, Danil
2018-03-01
The construction of high-rise buildings on plate foundations in the geotechnical conditions of the Russian Far East is a complicated problem. In this respect, foundation engineering becomes rather essential. In order to set a firm foundation it is necessary to take into account the pressure distribution at the structure base and the inhomogeneity of building deformation, which calls for collaborative geotechnical calculations complicated by a number of factors: the actual layering of soils, the complex geometry of the building under construction, the spatial work of the foundation ground with consideration for physical nonlinearity, the influence of the stiffness of the superstructure (reinforced concrete framing) upon the development of foundation deformations, foundation performance (the performance of the bed plate under the building and stairwells), and the origination of internal forces in the superstructure under differential settlement. The solution of spatial problems regarding the mutual interaction between buildings and foundations, taking account of the factors mentioned above, is fully achievable via the application of numerical modeling methodology. The work reviews the results of numerical modeling of high-rise buildings on plate foundations in the geotechnical conditions of the Russian Far East, using the example of the city of Khabarovsk.
A Weibull distribution accrual failure detector for cloud computing
Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
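The accrual idea can be sketched concisely: instead of a binary alive/crashed verdict, output a suspicion level that grows with the time since the last heartbeat, here computed from a Weibull model of heartbeat inter-arrival times. This is a hedged sketch in the spirit of the paper, not its implementation; the scale/shape parameters below are illustrative assumptions (a real detector would fit them to observed inter-arrival samples).

```python
# Sketch of a Weibull-based accrual suspicion level; parameters are
# illustrative assumptions, not fitted values from the paper.
import math

def weibull_suspicion(t_since_last, scale, shape):
    """phi = -log10(P(heartbeat still to come)) under Weibull(scale, shape)."""
    survival = math.exp(-((t_since_last / scale) ** shape))
    return -math.log10(survival)  # equals (t/scale)**shape / ln(10)

# With scale=1.0 s and shape=2.0, suspicion stays low near the typical
# heartbeat interval and climbs quickly afterwards; each application then
# applies its own threshold on phi.
for t in (0.5, 1.0, 2.0, 4.0):
    print(f"t={t:.1f}s  phi={weibull_suspicion(t, 1.0, 2.0):.2f}")
```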
A strip chart recorder pattern recognition tool kit for Shuttle operations
NASA Technical Reports Server (NTRS)
Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.
1993-01-01
During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
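One of the techniques the prototype evaluates, cross-correlation, can be sketched directly: slide a known activation signature over a telemetry trace and score each window by normalized cross-correlation. The step-shaped template and synthetic trace below are invented for illustration; they are not Shuttle telemetry.

```python
# Sketch of template matching by normalized cross-correlation; the
# signal and template are synthetic, illustrative assumptions.
import math

def ncc(signal, template):
    """Normalized cross-correlation of the template at every window position."""
    n = len(template)
    tm = sum(template) / n
    t0 = [v - tm for v in template]
    tn = math.sqrt(sum(v * v for v in t0))
    scores = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n]
        wm = sum(w) / n
        w0 = [v - wm for v in w]
        wn = math.sqrt(sum(v * v for v in w0))
        scores.append(sum(a * b for a, b in zip(w0, t0)) / (wn * tn) if wn else 0.0)
    return scores

# a step-shaped "equipment activation" signature in a synthetic current trace
template = [0.0, 0.0, 1.0, 1.0]
signal = [0.0] * 7 + [1.0] * 5
scores = ncc(signal, template)
best = scores.index(max(scores))
print(f"activation signature best matches at sample {best}")
```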
Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarje, Abhinav; Jacobsen, Douglas W.; Williams, Samuel W.
The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards the exploitation of thread parallelism, in addition to distributed-memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experiences applying it to a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.
On the viability of exploiting L-shell fluorescence for X-ray polarimetry
NASA Technical Reports Server (NTRS)
Weisskopf, M. C.; Sutherland, P. G.; Elsner, R. F.; Ramsey, B. D.
1985-01-01
It has been suggested that one may build an X-ray polarimeter by exploiting the polarization dependence of the angular distribution of L-shell fluorescence photons. In this paper the sensitivity of this approach to polarimetry is examined theoretically. The calculations are applied to several detection schemes using imaging proportional counters that would have direct application in X-ray astronomy. It is found, however, that the sensitivity of this method for measuring X-ray polarization is too low to be of use for other than laboratory applications.
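The sensitivity of an X-ray polarimeter is conventionally summarized by the minimum detectable polarization (MDP), which blows up as the modulation factor falls, as for weakly modulated L-shell fluorescence. A sketch using the standard 99%-confidence formula MDP = 4.29 / (mu * Rs) * sqrt((Rs + Rb) / T); the modulation factors, rates, and exposure below are illustrative assumptions, not values from the paper.

```python
# Sketch of the standard minimum-detectable-polarization figure of merit;
# all input numbers are illustrative assumptions.
import math

def mdp99(mu, src_rate, bkg_rate, exposure_s):
    """Minimum detectable polarization at 99% confidence."""
    return 4.29 / (mu * src_rate) * math.sqrt((src_rate + bkg_rate) / exposure_s)

# MDP scales as 1/mu, so a low modulation factor (as found here for
# L-shell fluorescence) makes astronomical sources impractical targets:
print(f"mu=0.50: MDP = {mdp99(0.50, 1.0, 0.1, 1e5):.3f}")
print(f"mu=0.05: MDP = {mdp99(0.05, 1.0, 0.1, 1e5):.3f}")
```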
Flight dynamics software in a distributed network environment
NASA Technical Reports Server (NTRS)
Jeletic, J.; Weidow, D.; Boland, D.
1995-01-01
As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.
Virtually-synchronous communication based on a weak failure suspector
NASA Technical Reports Server (NTRS)
Schiper, Andre; Ricciardi, Aleta
1993-01-01
Failure detectors (or, more accurately, Failure Suspectors (FS)) appear to be a fundamental service upon which to build fault-tolerant, distributed applications. This paper shows that a FS with very weak semantics (i.e., one that delivers failure and recovery information in no specific order) suffices to implement virtually-synchronous communication (VSC) in an asynchronous system subject to process crash failures and network partitions. The VSC paradigm is particularly useful in asynchronous systems and greatly simplifies building fault-tolerant applications that mask failures by replicating processes. We suggest a three-component architecture to implement virtually-synchronous communication: (1) at the lowest level, the FS component; on top of it, (2a) a component that defines new views; and (2b) a component that reliably multicasts messages within a view. The issues covered in this paper also lead to a better understanding of the various membership service semantics proposed in recent literature.
An Empirical Study of Synchrophasor Communication Delay in a Utility TCP/IP Network
NASA Astrophysics Data System (ADS)
Zhu, Kun; Chenine, Moustafa; Nordström, Lars; Holmström, Sture; Ericsson, Göran
2013-07-01
Although there is a plethora of literature dealing with Phasor Measurement Unit (PMU) communication delay, no effort has been made to generalize empirical delay results by identifying the distribution with the best fit. The existing studies typically assume a distribution or simply build on analogies to communication network routing delay. Specifically, this study provides insight into the characterization of the communication delay of both unprocessed PMU data and synchrophasors sorted by a Phasor Data Concentrator (PDC). The results suggest that a bi-modal distribution containing two normal distributions offers the best fit for the delay of the unprocessed data, whereas the delay profile of the sorted synchrophasors resembles a normal distribution. Based on these results, the possibility of evaluating the reliability of a synchrophasor application with respect to a particular choice of PDC timeout is discussed.
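Fitting a two-component normal mixture of the kind the paper identifies can be sketched with a few EM iterations. The synthetic fast-path/slow-path delays, the initialization, and the use of EM itself are illustrative assumptions here; the paper's data and fitting procedure are its own.

```python
# Sketch: fit a two-component normal mixture to delay samples by EM.
# Synthetic data and crude initialization are illustrative assumptions.
import math, random

def em_two_normals(x, iters=50):
    """Return (weights, means, stds) of a 2-component normal mixture."""
    xs = sorted(x)
    h = len(xs) // 2
    mu = [sum(xs[:h]) / h, sum(xs[h:]) / (len(xs) - h)]  # half-split init
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of component 0 for each sample
        r0 = []
        for xi in xs:
            p = [w[k] / sd[k] * math.exp(-0.5 * ((xi - mu[k]) / sd[k]) ** 2)
                 for k in (0, 1)]
            den = p[0] + p[1]
            r0.append(p[0] / den if den > 0 else 0.5)
        # M-step: re-estimate weights, means, stds from responsibilities
        n0 = sum(r0)
        n1 = len(xs) - n0
        mu = [sum(r * xi for r, xi in zip(r0, xs)) / n0,
              sum((1 - r) * xi for r, xi in zip(r0, xs)) / n1]
        sd = [max(1e-6, math.sqrt(sum(r * (xi - mu[0]) ** 2
                                      for r, xi in zip(r0, xs)) / n0)),
              max(1e-6, math.sqrt(sum((1 - r) * (xi - mu[1]) ** 2
                                      for r, xi in zip(r0, xs)) / n1))]
        w = [n0 / len(xs), n1 / len(xs)]
    return w, mu, sd

random.seed(1)
# synthetic bi-modal delays: a fast path near 20 ms and a slow path near 60 ms
delays = [random.gauss(20, 2) for _ in range(300)] + \
         [random.gauss(60, 5) for _ in range(100)]
w, mu, sd = em_two_normals(delays)
print(f"weights {w[0]:.2f}/{w[1]:.2f}, means {mu[0]:.1f}/{mu[1]:.1f} ms")
```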
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
1991-01-01
A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.
Scaling and allometry in the building geometries of Greater London
NASA Astrophysics Data System (ADS)
Batty, M.; Carvalho, R.; Hudson-Smith, A.; Milton, R.; Smith, D.; Steadman, P.
2008-06-01
Many aggregate distributions of urban activities such as city sizes reveal scaling but hardly any work exists on the properties of spatial distributions within individual cities, notwithstanding considerable knowledge about their fractal structure. We redress this here by examining scaling relationships in a world city using data on the geometric properties of individual buildings. We first summarise how power laws can be used to approximate the size distributions of buildings, in analogy to city-size distributions which have been widely studied as rank-size and lognormal distributions following Zipf [Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, 1949)] and Gibrat [Les Inégalités Économiques (Librairie du Recueil Sirey, Paris, 1931)]. We then extend this analysis to allometric relationships between buildings in terms of their different geometric size properties. We present some preliminary analysis of building heights from the Emporis database which suggests very strong scaling in world cities. The database for Greater London is then introduced, from which we extract 3.6 million buildings whose scaling properties we explore. We examine key allometric relationships between these different properties illustrating how building shape changes according to size, and we extend this analysis to the classification of buildings according to land use types. We conclude with an analysis of two-point correlation functions of building geometries which supports our non-spatial analysis of scaling.
TRSkit: A Simple Digital Library Toolkit
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Esler, Sandra L.
1997-01-01
This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.
Reframing the Dissemination Challenge: A Marketing and Distribution Perspective
Bernhardt, Jay M.
2009-01-01
A fundamental obstacle to successful dissemination and implementation of evidence-based public health programs is the near-total absence of systems and infrastructure for marketing and distribution. We describe the functions of a marketing and distribution system, and we explain how it would help move effective public health programs from research to practice. Then we critically evaluate the 4 dominant strategies now used to promote dissemination and implementation, and we explain how each would be enhanced by marketing and distribution systems. Finally, we make 6 recommendations for building the needed system infrastructure and discuss the responsibility within the public health community for implementation of these recommendations. Without serious investment in such infrastructure, application of proven solutions in public health practice will continue to occur slowly and rarely. PMID:19833993
Reframing the dissemination challenge: a marketing and distribution perspective.
Kreuter, Matthew W; Bernhardt, Jay M
2009-12-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Westervelt, E.T.; Northup, G.R.; Lawrie, L.K.
1990-09-01
This report describes the data analysis and recommendations of a project demonstrating the energy performance of theoretically based retrofit packages on existing standard Army buildings at Fort Carson, CO. Four standard designs were investigated: a motor vehicle repair shop, the Type 64 (L-shaped) barracks, an enlisted personnel mess hall, and a two-company, rolling-pin-shaped barracks for enlisted personnel. The tested conservation measures included envelope and system modifications. Energy data were gathered and analyzed from 14 buildings. Based on measured savings and current costs of fuel and construction, none of the four original packages are life-cycle cost-effective at present, but two may become effective in the near future. Of higher priority for energy and cost savings is the improvement of building operations, in particular heat production and distribution systems, which lack efficiency and control. Follow-up work at the L-shaped barracks yielded substantial savings, with a savings-to-investment ratio of 5 to 1. Cost scenarios, energy models, and building models were developed for the original retrofits to assess applicability elsewhere and in the future.
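The life-cycle cost test implied above reduces to a savings-to-investment ratio (SIR): the present value of annual energy savings divided by the retrofit cost, with a retrofit worthwhile when SIR exceeds 1. The savings, discount rate, horizon, and cost below are illustrative assumptions, chosen only to show a roughly 5:1 case like the barracks follow-up.

```python
# Sketch of a savings-to-investment ratio; all inputs are illustrative
# assumptions, not values from the Fort Carson study.

def sir(annual_savings, discount_rate, years, investment):
    """Present value of a level annual savings stream over the investment."""
    pv = sum(annual_savings / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv / investment

# an operations fix with large savings and modest cost yields SIR ~ 5
print(f"SIR = {sir(5000, 0.04, 25, 15000):.1f}")
```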
Smart Building: Decision Making Architecture for Thermal Energy Management
Hernández Uribe, Oscar; San Martin, Juan Pablo; Garcia-Alegre, María C.; Santos, Matilde; Guinea, Domingo
2015-01-01
Smart applications of the Internet of Things are improving the performance of buildings, reducing energy demand. Local and smart networks, soft computing methodologies, machine intelligence algorithms and pervasive sensors are some of the basics of energy optimization strategies developed for the benefit of environmental sustainability and user comfort. This work presents a distributed sensor-processor-communication decision-making architecture to improve the acquisition, storage and transfer of thermal energy in buildings. The developed system is implemented in a near Zero-Energy Building (nZEB) prototype equipped with a built-in thermal solar collector, whose optical properties are analysed; a low-enthalpy geothermal accumulation system, segmented in different temperature zones; and an envelope that includes a dynamic thermal barrier. An intelligent control of this dynamic thermal barrier is applied to reduce the thermal energy demand (heating and cooling) caused by daily and seasonal weather variations. Simulations and experimental results are presented to highlight the nZEB thermal energy reduction. PMID:26528978
NASA Astrophysics Data System (ADS)
Kline, Jeffrey D.; Moses, Alissa; Burcsu, Theresa
2010-05-01
Forest policymakers, public lands managers, and scientists in the Pacific Northwest (USA) seek ways to evaluate the landscape-level effects of policies and management through the multidisciplinary development and application of spatially explicit methods and models. The Interagency Mapping and Analysis Project (IMAP) is an ongoing effort to generate landscape-wide vegetation data and models to evaluate the integrated effects of disturbances and management activities on natural resource conditions in Oregon and Washington (USA). In this initial analysis, we characterized the spatial distribution of forest and range land development in a four-county pilot study region in central Oregon. The empirical model describes the spatial distribution of buildings and new building construction as a function of population growth, existing development, topography, land-use zoning, and other factors. We used the model to create geographic information system maps of likely future development based on human population projections to inform complementary landscape analyses underway involving vegetation, habitat, and wildfire interactions. In an example application, we use the model and resulting maps to show the potential impacts of future forest and range land development on mule deer (Odocoileus hemionus) winter range. Results indicate significant development encroachment and habitat loss already in 2000, with development located along key migration routes and increasing through the projection period to 2040. The example application illustrates a simple way for policymakers and public lands managers to combine existing data and preliminary model outputs to begin to consider the potential effects of development on future landscape conditions.
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructures were to: develop access to distributed data (surface and satellite); build Web infrastructure to support data access, processing, and analysis; create tools for data processing and analysis; and foster air quality community collaboration and interoperability.
Fine-Scale Population Estimation by 3D Reconstruction of Urban Residential Buildings
Wang, Shixin; Tian, Ye; Zhou, Yi; Liu, Wenliang; Lin, Chenxi
2016-01-01
Fine-scale population estimation is essential in emergency response and epidemiological applications as well as urban planning and management. However, representing populations in heterogeneous urban regions with a finer resolution is a challenge. This study aims to obtain fine-scale population distribution based on 3D reconstruction of urban residential buildings with morphological operations using optical high-resolution (HR) images from the Chinese No. 3 Resources Satellite (ZY-3). Specifically, the research area was first divided into three categories when dasymetric mapping was taken into consideration. The results demonstrate that the morphological building index (MBI) yielded better results than built-up presence index (PanTex) in building detection, and the morphological shadow index (MSI) outperformed color invariant indices (CIIT) in shadow extraction and height retrieval. Building extraction and height retrieval were then combined to reconstruct 3D models and to estimate population. Final results show that this approach is effective in fine-scale population estimation, with a mean relative error of 16.46% and an overall Relative Total Absolute Error (RATE) of 0.158. This study gives significant insights into fine-scale population estimation in complicated urban landscapes, when detailed 3D information of buildings is unavailable. PMID:27775670
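The dasymetric step at the heart of the approach above can be sketched simply: distribute a known census total across residential buildings in proportion to their reconstructed 3D volume (footprint area times retrieved height). The district total and building dimensions below are invented for illustration; the paper's actual workflow also classifies regions and building types first.

```python
# Sketch of volume-proportional dasymetric population allocation;
# the district total and buildings are illustrative assumptions.

def allocate_population(census_total, buildings):
    """buildings: list of (footprint_m2, height_m); returns per-building counts."""
    volumes = [a * h for a, h in buildings]
    total_volume = sum(volumes)
    return [census_total * v / total_volume for v in volumes]

district_total = 1200
buildings = [(400, 30), (400, 15), (200, 15)]  # volumes 12000, 6000, 3000
pops = allocate_population(district_total, buildings)
print([round(p, 1) for p in pops])  # shares proportional to building volume
```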
Estimating air chemical emissions from research activities using stack measurement data.
Ballinger, Marcel Y; Duchsherer, Cheryl J; Woodruff, Rodger K; Larson, Timothy V
2013-03-01
Current methods of estimating air emissions from research and development (R&D) activities use a wide range of release fractions or emission factors with bases ranging from empirical to semi-empirical. Although considered conservative, the uncertainties and confidence levels of the existing methods have not been reported. Chemical emissions were estimated from sampling data taken from four research facilities over 10 years. The approach was to use a Monte Carlo technique to create distributions of annual emission estimates for target compounds detected in source test samples. Distributions were created for each year and building sampled for compounds with sufficient detection frequency to qualify for the analysis. The results using the Monte Carlo technique without applying a filter to remove negative emission values showed almost all distributions spanning zero, and 40% of the distributions having a negative mean. This indicates that emissions are so low as to be indistinguishable from building background. Application of a filter to allow only positive values in the distribution provided a more realistic value for emissions and increased the distribution mean by an average of 16%. Release fractions were calculated by dividing the emission estimates by a building chemical inventory quantity. Two variations were used for this quantity: chemical usage, and chemical usage plus one-half standing inventory. Filters were applied so that only release fraction values from zero to one were included in the resulting distributions. Release fractions had a wide range among chemicals and among data sets for different buildings and/or years for a given chemical. Regressions of release fractions to molecular weight and vapor pressure showed weak correlations. Similarly, regressions of mean emissions to chemical usage, chemical inventory, molecular weight, and vapor pressure also gave weak correlations. 
These results highlight the difficulties in estimating emissions from R&D facilities using chemical inventory data. Air emissions from research operations are difficult to estimate because of the changing nature of research processes and the small quantity and wide variety of chemicals used. Analysis of stack measurements taken over multiple facilities and a 10-year period using a Monte Carlo technique provided a method to quantify the low emissions and to estimate release fractions based on chemical inventories. The variation in release fractions did not correlate well with factors investigated, confirming the complexities in estimating R&D emissions.
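The positive-value filtering step described above can be illustrated with a toy Monte Carlo: draw an emission distribution whose mean sits barely above building background, then keep only positive draws. The measurement model, seed, and numbers are illustrative assumptions, not the study's data; the qualitative effect (the filtered mean is higher) matches the roughly 16% increase the study reports.

```python
# Toy Monte Carlo of near-background annual emissions with a positive
# filter; the synthetic measurement model is an illustrative assumption.
import random, statistics

random.seed(42)
# emissions barely above background: mean near zero, wide measurement noise
draws = [random.gauss(0.2, 1.0) for _ in range(10000)]  # kg/yr, synthetic

unfiltered_mean = statistics.mean(draws)
positive = [d for d in draws if d > 0]
filtered_mean = statistics.mean(positive)

print(f"unfiltered mean = {unfiltered_mean:.2f} kg/yr")
print(f"positive-filtered mean = {filtered_mean:.2f} kg/yr")
```

Because the unfiltered distribution spans zero, its mean is dominated by noise; restricting to the positive support gives the more physically meaningful (and necessarily larger) estimate.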
Spatial Information in local society's cultural conservation and research
NASA Astrophysics Data System (ADS)
Jang, J.-J.; Liao, H.-M.; Fan, I.-C.
2015-09-01
Center for Geographic Information Science, Research Center for Humanities and Social Sciences,Academia Sinica (GIS center), Coordinate short-, medium-, and long-term operations of multidisciplinary researches focusing on related topics in the sciences and humanities. Based on the requirements of multi-disciplinary research applications, sustain collection and construction of sustaining and unifying spatial base data and knowledge and building of spatial data infrastructure. Since the 1990s, GIS center build geographic information platform: "Time and space infrastructure of Chinese civilization" (Chinese Civilizationin Time and Space, CCTS) and "Taiwan History and Culture Map" (Taiwan History and Culture in Time and Space, THCTS) . the goal of both system is constructing an integrated GIS-based application infrastructure on the spatial extent of China and Taiwan, in the timeframe of Chinese and Taiwanese history, and with the contents of Chinese and Taiwanese civilization. Base on THCTS, we began to build Cultural Resources GIS(CRGIS, http://crgis.rchss.sinica.edu.tw) in 2006, to collect temples, historic Monuments, historic buildings, old trees, wind lions god and other cultural resource in Taiwan, and provide a platform for the volunteers to make for all types of tangible, intangible cultural resources, add, edit, organize and query data via Content Management System(CMS) . CRGIS collected aggregated 13,000 temples, 4,900 churches. On this basis, draw a variety of religious beliefs map-multiple times Temple distributions, different main god distributions, church distribution. Such as Mazu maps, Multiple times temple distributions map (before 1823, 1823-1895,1895-1949,1949-2015 years) at Taijiang inner sea areas in Tainan. In Taiwan, there is a religious ritual through folk activities for a period ranging from one day to several days, passing specific geospatial range and passes through some temples or houses. 
Such an important folk activity, somewhat similar to a Western parade, is called "raojing"; its central purpose is to bring blessings to the people within the range it passes through, and many scholars and academic experts doing folk-religion research depend on this kind of spatial information. In 2012, the GIS center applied WebGIS and GPS to gather spatial information on raojing activities in cooperation with multiple organizations, covering seven processions over 22 days that passed through 442 temples. The resulting atlas, "Atlas of the 2012 Religious Processions in the Tainan Region," was published in 2014. We also obtained national cultural resources data from the relevant government authorities and, through metadata design and data processing (geocoding), established cultural geospatial and thematic information, including 800 monuments, 1,100 historic buildings, and 4,300 old trees. In recent years, based on CRGIS technology and operational concepts, experts from different domains and local cultural-history researchers and teams have cooperated with us to establish local or thematic material and cultural resources.
For example, in collaboration with local cultural-history researchers in Kinmen County in 2012, we documented an intangible cultural asset of Kinmen, the Wind Lion God, designing metadata and building attribute data and maps for 122 wind lion gods through field survey; it is worth mentioning that this fieldwork is more complete than the official registration data from Kinmen National Park, exceeding it by more than 40 wind lion gods. In 2013, we cooperated with academic experts to establish attribute data and a map of theatres in Taiwan during the Japanese colonial era, 170 theatres in total. We also cooperated with a Japanese scholar, using his detailed field-survey data on 44 sugar refineries of the Japanese colonial era in Taiwan, to produce a refinery distribution map and extend it into a thematic website, "The Cultural Heritage Maps of Taiwan Sugar Factories in a Century" (http://map.net.tw/), following the CRGIS concept of independent cultural themes. The significance of deploying and operating CRGIS lies not only in building thematic GIS systems but also in the concepts it embodies: open data, wiki-style collaboration, and public participation, with an interactive platform combining cultural resource data and geospatial technology. Beyond providing reference material for local cultural education and local cultural identity, we further cooperate with scholars, academic experts, and local cultural-history researchers and teams, compiling their accumulated records and research results into cultural information and value-added applications through spatial database planning, data processing (e.g., geocoding), field survey, and the overlay of geospatial materials.
Un-Building Blocks: A Model of Reverse Engineering and Applicable Heuristics
2015-12-01
"The machine does not isolate man from the great problems of nature but plunges him more deeply into them." (Antoine de Saint-Exupéry, Wind ...) Reverse engineering is the problem-solving activity that ensues when one takes a...
Trusted Computing Management Server Making Trusted Computing User Friendly
NASA Astrophysics Data System (ADS)
Sothmann, Sönke; Chaudhuri, Sumanta
Personal Computers (PCs) with built-in Trusted Computing (TC) technology are already well known and widely distributed. Nearly every new business notebook now contains a Trusted Platform Module (TPM) and could be used with increased trust and security features in daily applications and use scenarios. However, in real life, the number of notebooks and PCs on which the TPM is actually activated and used is still very small.
User interface using a 3D model for video surveillance
NASA Astrophysics Data System (ADS)
Hata, Toshihiko; Boh, Satoru; Tsukada, Akihiro; Ozaki, Minoru
1998-02-01
These days, industrial surveillance and monitoring applications such as plant control and building security must be handled by fewer people, who must carry out their tasks quickly and precisely. Utilizing multimedia technology is a good approach to meeting this need, and we previously developed Media Controller, which is designed for these applications and provides real-time recording and retrieval of digital video data in a distributed environment. In this paper, we propose a user interface for such a distributed video surveillance system in which 3D models of buildings and facilities are connected to the surveillance video. A novel method of synchronizing camera field data with each frame of a video stream is considered. This method records and reads the camera field data similarly to the video data and transmits it synchronously with the video stream. This enables the user interface to offer such useful functions as comprehending the camera field immediately and providing clues when visibility is poor, for not only live video but also playback video. We have also implemented and evaluated the display function that makes the surveillance video and the 3D model work together, using Media Controller with Java and the Virtual Reality Modeling Language employed for multi-purpose and intranet use of the 3D model.
Developing and Optimizing Applications in Hadoop
NASA Astrophysics Data System (ADS)
Kothuri, P.; Garcia, D.; Hermans, J.
2017-10-01
This contribution shares our recent experience of building Hadoop-based applications. The Hadoop ecosystem now offers a myriad of tools that can overwhelm new users, yet these tools can be leveraged successfully to solve problems. We look at factors to consider when using Hadoop to model and store data, best practices for moving data in and out of the system, and common processing patterns, at each stage relating them to the real-world experience gained while developing such applications. We share many of the design choices and tools developed, and show how to profile a distributed application in ways that can be applied to other scenarios as well. In conclusion, the goal of the presentation is to provide guidance on architecting Hadoop-based applications and to share some of the reusable components developed in this process.
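The processing patterns such applications rely on follow Hadoop's classic map, shuffle, and reduce stages. As a language-neutral illustration (a toy in-process sketch, not the authors' code or Hadoop's actual API), the pattern can be simulated in Python:

```python
# Toy simulation of the Hadoop map/shuffle/reduce processing pattern,
# shown here with the classic word-count example.
from collections import defaultdict

def map_phase(records):
    """Emit (key, value) pairs -- here, (word, 1) for each word."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group values by key, as Hadoop's shuffle/sort stage does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values -- here, summing the counts."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["Hadoop stores data", "Hadoop processes data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["hadoop"])  # 2
print(counts["data"])    # 2
```

In a real deployment each phase runs distributed across the cluster, with the shuffle performed by the framework between the map and reduce tasks.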
Hadoop-BAM: directly manipulating next generation sequencing data in the cloud
Niemenmaa, Matti; Kallio, Aleksi; Schumacher, André; Klemelä, Petri; Korpelainen, Eija; Heljanko, Keijo
2012-01-01
Summary: Hadoop-BAM is a novel library for the scalable manipulation of aligned next-generation sequencing data in the Hadoop distributed computing framework. It acts as an integration layer between analysis applications and BAM files that are processed using Hadoop. Hadoop-BAM solves the issues related to BAM data access by presenting a convenient API for implementing map and reduce functions that can directly operate on BAM records. It builds on top of the Picard SAM JDK, so tools that rely on the Picard API are expected to be easily convertible to support large-scale distributed processing. In this article we demonstrate the use of Hadoop-BAM by building a coverage summarizing tool for the Chipster genome browser. Our results show that Hadoop offers good scalability, and one should avoid moving data in and out of Hadoop between analysis steps. Availability: Available under the open-source MIT license at http://sourceforge.net/projects/hadoop-bam/ Contact: matti.niemenmaa@aalto.fi Supplementary information: Supplementary material is available at Bioinformatics online. PMID:22302568
TeleMed: Wide-area, secure, collaborative object computing with Java and CORBA for healthcare
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forslund, D.W.; George, J.E.; Gavrilov, E.M.
1998-12-31
Distributed computing is becoming commonplace in a variety of industries, with healthcare being a particularly important one for society. The authors describe the development and deployment of TeleMed in a few healthcare domains. TeleMed is a 100% Java distributed application built on CORBA and OMG standards, enabling collaboration on the treatment of chronically ill patients in a secure manner over the Internet. These standards enable other systems to work interoperably with TeleMed and provide transparent access to high performance distributed computing to the healthcare domain. The goal of wide-scale integration of electronic medical records is a grand-challenge-scale problem of global proportions with far-reaching social benefits.
Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis
Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...
2008-01-01
Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.
Time-domain diffuse optics: towards next generation devices
NASA Astrophysics Data System (ADS)
Contini, Davide; Dalla Mora, Alberto; Arridge, Simon; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio
2015-07-01
Diffuse optics is a powerful tool for clinical applications ranging from oncology to neurology, but also for molecular imaging and quality assessment of food, wood, and pharmaceuticals. We show that, ideally, time-domain diffuse optics can give higher contrast and a higher penetration depth with respect to standard technology. In order to fully exploit the advantages of a time-domain system, a distribution of sources and detectors with fast gating capabilities covering the whole sample surface is needed. Here, we present the building block for constructing such a system. This basic component is a miniaturized source-detector pair embedded in the probe, based on pulsed Vertical-Cavity Surface-Emitting Lasers (VCSELs) as sources and Single-Photon Avalanche Diodes (SPADs) or Silicon Photomultipliers (SiPMs) as detectors. The possibility to miniaturize and dramatically increase the number of source-detector pairs opens the way to an advancement of diffuse optics, in terms of both improved performance and the exploration of new applications. Furthermore, the availability of compact devices of reduced size and cost can boost the application of this technique.
Zhang, Weiwei; Huang, Guoyou; Ng, Kelvin; Ji, Yuan; Gao, Bin; Huang, Liqing; Zhou, Jinxiong; Lu, Tian Jian; Xu, Feng
2018-03-26
Hydrogel particles that can be engineered to compartmentally culture cells in a three-dimensional (3D) and high-throughput manner have attracted increasing interest in the biomedical area. However, the ability to generate hydrogel particles with specially designed structures and their potential biomedical applications need to be further explored. This work introduces a method for fabricating hydrogel particles in an ellipsoidal cap-like shape (i.e., ellipsoidal cap-like hydrogel particles) by employing an open-pore anodic aluminum oxide membrane. Hydrogel particles of different sizes are fabricated. The ability to produce ellipsoidal cap-like magnetic hydrogel particles with controlled distribution of magnetic nanoparticles is demonstrated. Encapsulated cells show high viability, indicating the potential for using these hydrogel particles as structure- and remote-controllable building blocks for tissue engineering application. Moreover, the hydrogel particles are also used as sacrificial templates for fabricating ellipsoidal cap-like concave wells, which are further applied for producing size controllable cell aggregates. The results are beneficial for the development of hydrogel particles and their applications in 3D cell culture.
Performance evaluation of radiant cooling system application on a university building in Indonesia
NASA Astrophysics Data System (ADS)
Satrio, Pujo; Sholahudin, S.; Nasruddin
2017-03-01
The paper describes a study developed to estimate the energy savings potential of a radiant cooling system installed in an institutional building in Indonesia. The simulations were carried out using IESVE to evaluate thermal performance and energy consumption. The building model was calibrated using the measured data for the installed radiant system. This calibrated model was then used to simulate the energy consumption and temperature distribution to determine the proportional energy savings and occupant comfort under different systems. The results showed that radiant cooling integrated with a Dedicated Outside Air System (DOAS) could achieve 41.84% energy savings compared to the installed cooling system. The Computational Fluid Dynamics (CFD) simulation showed that a radiant system integrated with DOAS provides superior human comfort compared with a radiant system integrated with Variable Air Volume (VAV). The Percentage of People Dissatisfied was kept below 10% using the proposed system.
Li, Xiaolu; Liang, Yu
2015-05-20
Analysis of light detection and ranging (LiDAR) intensity data to extract surface features is of great interest in remote sensing research. One potential application of LiDAR intensity data is target classification. A new bidirectional reflectance distribution function (BRDF) model is derived for target characterization of rough and smooth surfaces. Based on the geometry of our coaxial full-waveform LiDAR system, the integration method is improved through coordinate transformation to establish the relationship between the BRDF model and intensity data of LiDAR. A series of experiments using typical urban building materials are implemented to validate the proposed BRDF model and integration method. The fitting results show that three parameters extracted from the proposed BRDF model can distinguish the urban building materials from perspectives of roughness, specular reflectance, and diffuse reflectance. A comprehensive analysis of these parameters will help characterize surface features in a physically rigorous manner.
WASTE HANDLING BUILDING ELECTRICAL SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.C. Khamamkar
2000-06-23
The Waste Handling Building Electrical System performs the function of receiving, distributing, transforming, monitoring, and controlling AC and DC power to all waste handling building electrical loads. The system distributes normal electrical power to support all loads that are within the Waste Handling Building (WHB). The system also generates and distributes emergency power to support designated emergency loads within the WHB within specified time limits. The system provides the capability to transfer between normal and emergency power. The system provides emergency power via independent and physically separated distribution feeds from the normal supply. The designated emergency electrical equipment will be designed to operate during and after design basis events (DBEs). The system also provides lighting, grounding, and lightning protection for the Waste Handling Building. The system is located in the Waste Handling Building System. The system consists of a diesel generator, power distribution cables, transformers, switch gear, motor controllers, power panel boards, lighting panel boards, lighting equipment, lightning protection equipment, control cabling, and grounding system. Emergency power is generated with a diesel generator located in a QL-2 structure and connected to the QL-2 bus. The Waste Handling Building Electrical System distributes and controls primary power to acceptable industry standards, and with a dependability compatible with waste handling building reliability objectives for non-safety electrical loads. It also generates and distributes emergency power to the designated emergency loads. The Waste Handling Building Electrical System receives power from the Site Electrical Power System. The primary material handling power interfaces include the Carrier/Cask Handling System, Canister Transfer System, Assembly Transfer System, Waste Package Remediation System, and Disposal Container Handling Systems.
The system interfaces with the MGR Operations Monitoring and Control System for supervisory monitoring and control signals. The system interfaces with all facility support loads such as heating, ventilation, and air conditioning, office, fire protection, monitoring and control, safeguards and security, and communications subsystems.
Application of acoustic radiosity methods to noise propagation within buildings
NASA Astrophysics Data System (ADS)
Muehleisen, Ralph T.; Beamer, C. Walter
2005-09-01
The prediction of sound pressure levels in rooms from transmitted sound is a difficult problem. The sound energy in the source room incident on the common wall must be accurately predicted. In the receiving room, the propagation of sound from the planar wall source must also be accurately predicted. The radiosity method naturally computes the spatial distribution of sound energy incident on a wall and also naturally predicts the propagation of sound from a planar area source. In this paper, the application of the radiosity method to sound transmission problems is introduced and explained.
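The energy exchange that the radiosity method computes can be stated compactly. Below is the standard radiosity integral equation from the general radiative-exchange literature (the notation is assumed here, not quoted from this abstract):

```latex
B(x) \;=\; E(x) \;+\; \rho(x) \int_{S} B(y)\,
      \frac{\cos\theta_x \,\cos\theta_y}{\pi\, r_{xy}^{2}}\; \mathrm{d}A_y
```

where $B$ is the radiosity (outgoing energy per unit area) of a surface patch, $E$ the emitted term (here, the common wall excited by transmitted sound acting as a planar source), $\rho$ the surface reflectance, and the kernel weights the diffuse exchange between patches $x$ and $y$ separated by distance $r_{xy}$.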
Content-based image exploitation for situational awareness
NASA Astrophysics Data System (ADS)
Gains, David
2008-04-01
Image exploitation is of increasing importance to the enterprise of building situational awareness from multi-source data. It involves image acquisition, identification of objects of interest in imagery, storage, search and retrieval of imagery, and the distribution of imagery over possibly bandwidth limited networks. This paper describes an image exploitation application that uses image content alone to detect objects of interest, and that automatically establishes and preserves spatial and temporal relationships between images, cameras and objects. The application features an intuitive user interface that exposes all images and information generated by the system to an operator thus facilitating the formation of situational awareness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance, and qualification for incentives. Beyond individual building applications, larger-scale applications (across the stock of buildings at various scales: national, regional, and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings is typically simulated to predict the performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.
Applications of Cosmic Muon Tracking at Shallow Depth Underground
NASA Astrophysics Data System (ADS)
Oláh, L.; Barnaföldi, G. G.; Hamar, G.; Melegh, H. G.; Surányi, G.; Varga, D.
2014-06-01
A portable cosmic muon telescope has been developed for environmental and geophysical applications, as well as cosmic background measurements for nuclear research in underground labs, by the REGARD group (a collaboration of the Wigner RCP of the HAS and Eötvös Loránd University on gaseous detector R&D). The modular, low-power (5 W) Close Cathode Chamber-based tracking system has 10 mrad angular resolution over its sensitive area of 0.1 m2. The angular distribution of cosmic muons has been measured at shallow depth underground (< 70 meter-rock-equivalent) in four different remote locations. Applications of cosmic muon detection to the reconstruction of underground caverns and building structures are demonstrated by the measurements.
Imam, Neena; Barhen, Jacob
2009-01-01
For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
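One of the building-block computations in acoustic source localization is cross-correlation for time-difference-of-arrival (TDOA) estimation, whose cost grows quickly with array size. The sketch below (plain Python, purely illustrative and not the optical-core implementation) shows the core operation:

```python
# Toy TDOA estimation: find the lag that maximizes the cross-correlation
# between a reference sensor's signal and a second sensor's delayed copy.

def cross_correlation(a, b):
    """Full discrete cross-correlation of two equal-length signals."""
    n = len(a)
    return {lag: sum(a[i] * b[i + lag]
                     for i in range(n) if 0 <= i + lag < n)
            for lag in range(-(n - 1), n)}

def estimate_delay(reference, delayed):
    """Return the lag at which the cross-correlation peaks."""
    xc = cross_correlation(reference, delayed)
    return max(xc, key=xc.get)

# A pulse, and a copy of it arriving 3 samples later at a second sensor.
pulse = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
late = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]
print(estimate_delay(pulse, late))  # 3
```

The delay, combined with the sensor geometry and sound speed, constrains the source position; doing this for every sensor pair is what drives the quadratic growth in computation the abstract describes.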
Incremental isometric embedding of high-dimensional data using connected neighborhood graphs.
Zhao, Dongfang; Yang, Li
2009-01-01
Most nonlinear data embedding methods use bottom-up approaches for capturing the underlying structure of data distributed on a manifold in high dimensional space. These methods often share the same first step, which defines the neighbor points of every data point by building a connected neighborhood graph so that all data points can be embedded into a single coordinate system. In many applications, these methods are required to work incrementally for dimensionality reduction. Because the input data stream may be under-sampled or skewed from time to time, building a connected neighborhood graph is crucial to the success of incremental data embedding using these methods. This paper presents algorithms for updating k-edge-connected and k-connected neighborhood graphs after a new data point is added or an old data point is deleted. It further utilizes a simple algorithm for updating all-pair shortest distances on the neighborhood graph. Together with incremental classical multidimensional scaling using iterative subspace approximation, this paper devises an incremental version of Isomap with enhancements to deal with under-sampled or unevenly distributed data. Experiments on both synthetic and real-world data sets show that the algorithm is efficient and maintains low dimensional configurations of high dimensional data under various data distributions.
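As a rough illustration of the geodesic-distance step such methods rely on (details assumed, not the paper's incremental algorithms, which avoid this full recomputation), one can build a k-nearest-neighbor graph and refresh shortest paths with Dijkstra after a new point arrives:

```python
# Naive sketch: maintain a k-NN neighborhood graph and recompute graph
# (geodesic) distances after a point is added. Isomap embeds using these
# distances; the paper's contribution is updating them incrementally.
import heapq
import math

def knn_edges(points, k):
    """Symmetric k-NN edges as {(i, j): euclidean distance}."""
    edges = {}
    for i, p in enumerate(points):
        dists = sorted((math.dist(p, q), j)
                       for j, q in enumerate(points) if j != i)
        for d, j in dists[:k]:
            edges[tuple(sorted((i, j)))] = d
    return edges

def dijkstra(n, edges, src):
    """Single-source shortest path distances on the neighborhood graph."""
    adj = {i: [] for i in range(n)}
    for (i, j), d in edges.items():
        adj[i].append((j, d))
        adj[j].append((i, d))
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, math.inf):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

points = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
edges = knn_edges(points, k=1)
points.append((3.0, 0.0))       # a new data point arrives
edges = knn_edges(points, k=1)  # naive rebuild; the paper updates in place
geo = dijkstra(len(points), edges, src=0)
print(round(geo[3], 1))  # 3.0 -- geodesic distance to the newest point
```

The incremental algorithms in the paper update only the affected edges and shortest distances instead of rebuilding the whole graph as this toy version does.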
Transfer Learning for Class Imbalance Problems with Inadequate Data.
Al-Stouhi, Samir; Reddy, Chandan K
2016-07-01
A fundamental problem in data mining is to effectively build robust classifiers in the presence of skewed data distributions. Class imbalance classifiers are trained specifically for skewed-distribution datasets. Existing methods assume an ample supply of training examples as a fundamental prerequisite for constructing an effective classifier. However, when sufficient data are not readily available, the development of a representative classification algorithm becomes even more difficult due to the unequal distribution between classes. We provide a unified framework that can take advantage of auxiliary data using a transfer learning mechanism while simultaneously building a robust classifier to tackle this imbalance issue in the presence of few training samples in a particular target domain of interest. Transfer learning methods use auxiliary data to augment learning when training examples are not sufficient, and in this paper we develop a method that is optimized to simultaneously augment the training data and induce balance into skewed datasets. We propose a novel boosting-based instance-transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain to improve classification. We provide theoretical and empirical validation of our method and apply it to healthcare and text classification applications.
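The domain-dependent weight update can be sketched in the general spirit of instance-transfer boosting (a TrAdaBoost-style toy rule with assumed multipliers, not the paper's exact label-dependent mechanism):

```python
# Sketch of one boosting round's weight update in instance-transfer
# boosting: misclassified target-domain samples are up-weighted (the
# learner must focus on them), while misclassified auxiliary-domain
# samples are down-weighted (they appear unlike the target distribution).

def update_weights(weights, domains, errors, beta_aux=0.5, beta_tgt=2.0):
    """weights: per-sample weights; domains: 'target' or 'aux';
    errors: True where the current weak learner misclassified the sample.
    beta_aux/beta_tgt are illustrative multipliers, not the paper's values."""
    new = []
    for w, dom, wrong in zip(weights, domains, errors):
        if not wrong:
            new.append(w)
        elif dom == 'target':
            new.append(w * beta_tgt)   # focus on hard target samples
        else:
            new.append(w * beta_aux)   # distrust misclassified auxiliary data
    total = sum(new)
    return [w / total for w in new]    # renormalize to a distribution

w = [0.25, 0.25, 0.25, 0.25]
doms = ['target', 'target', 'aux', 'aux']
errs = [True, False, True, False]
w = update_weights(w, doms, errs)
print(round(sum(w), 6))  # 1.0 -- still a probability distribution
```

After the update, the misclassified target sample carries the largest weight, which is how boosting steers subsequent weak learners toward the minority class in the target domain.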
Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju
2014-03-21
A tracking service like asset management is essential in a dynamic hospital environment consisting of numerous mobile assets (e.g., wheelchairs or infusion pumps) that are continuously relocated throughout a hospital. The tracking service is accomplished based on the key technologies of an indoor location-based service (LBS), such as locating and monitoring multiple mobile targets inside a building in real time. An indoor LBS such as a tracking service entails numerous resource lookups being requested concurrently and frequently from several locations, as well as a network infrastructure requiring support for high scalability in indoor environments. A traditional centralized architecture needs to maintain a geographic map of the entire building or complex in its central server, which can cause low scalability and traffic congestion. This paper presents a self-organizing and fully distributed indoor mobile asset management (MAM) platform, and proposes an architecture for multiple trackees (such as mobile assets) and trackers based on the proposed distributed platform in real time. In order to verify the suggested platform, scalability performance according to increases in the number of concurrent lookups was evaluated in a real test bed. Tracking latency and traffic load ratio in the proposed tracking architecture was also evaluated.
Characterizing U.S. Heat Demand Market for Potential Application of Geothermal Direct Use
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Gleason, Michael; Reber, Tim
In this paper, we assess the U.S. demand for low-temperature thermal energy at the county resolution for four major end-use sectors: residential buildings, commercial buildings, manufacturing facilities, and agricultural facilities. Existing, publicly available data on the U.S. thermal demand market are characterized by coarse spatial resolution, with assessments typically at the state-level or larger. For many uses, these data are sufficient; however, our research was motivated by an interest in assessing the potential demand for direct use (DU) of low-temperature (30 degrees to 150 degrees C) geothermal heat. The availability and quality of geothermal resources for DU applications are highly spatially heterogeneous; therefore, to assess the potential market for these resources, it is necessary to understand the spatial variation in demand for low-temperature resources at a local resolution. This paper presents the datasets and methods we used to develop county-level estimates of the thermal demand for the residential, commercial, manufacturing, and agricultural sectors. Although this analysis was motivated by an interest in geothermal energy deployment, the results are likely to have broader applications throughout the energy industry. The county-resolution thermal demand data developed in this study for four major U.S. sectors may have far-reaching implications for building technologies, industrial processes, and various distributed renewable energy thermal resources (e.g., biomass, solar).
Characterizing U.S. Heat Demand for Potential Application of Geothermal Direct Use: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Gleason, Michael; Reber, Tim
In this paper, we assess the U.S. demand for low-temperature thermal energy at the county resolution for four major end-use sectors: residential buildings, commercial buildings, manufacturing facilities, and agricultural facilities. Existing, publicly available data on the U.S. thermal demand market are characterized by coarse spatial resolution, with assessments typically at the state-level or larger. For many uses, these data are sufficient; however, our research was motivated by an interest in assessing the potential demand for direct use (DU) of low-temperature (30 degrees to 150 degrees C) geothermal heat. The availability and quality of geothermal resources for DU applications are highly spatially heterogeneous; therefore, to assess the potential market for these resources, it is necessary to understand the spatial variation in demand for low-temperature resources at a local resolution. This paper presents the datasets and methods we used to develop county-level estimates of the thermal demand for the residential, commercial, manufacturing, and agricultural sectors. Although this analysis was motivated by an interest in geothermal energy deployment, the results are likely to have broader applications throughout the energy industry. The county-resolution thermal demand data developed in this study for four major U.S. sectors may have far-reaching implications for building technologies, industrial processes, and various distributed renewable energy thermal resources (e.g., biomass, solar).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
... Buildings Service; Information Collection; GSA Form 3453, Application/Permit for Use of Space in Public Buildings and Grounds AGENCY: Public Buildings Service, GSA. ACTION: Notice of request for comments... regarding GSA Form 3453, Application/Permit for Use of Space in Public Buildings and Grounds. Public...
An overview of solar energy applications in buildings in Greece
NASA Astrophysics Data System (ADS)
Papamanolis, Nikos
2016-09-01
This work classifies and describes the main fields of solar energy exploitation in buildings in Greece, a country with high solar energy capacities. The study focuses on systems and technologies that apply to residential and commercial buildings following the prevailing design and construction practices (conventional buildings) and investigates the effects of the architectural and constructional characteristics of these buildings on the respective applications. In addition, it examines relevant applications in other building categories and in buildings with increased ecological sensitivity in their design and construction (green buildings). Through its findings, the study seeks to improve the efficiency and broaden the scope of solar energy applications in buildings in Greece to the benefit of their energy and environmental performance.
Collaborative volume visualization with applications to underwater acoustic signal processing
NASA Astrophysics Data System (ADS)
Jarvis, Susan; Shane, Richard T.
2000-08-01
Distributed collaborative visualization systems represent a technology whose time has come. Researchers at the Fraunhofer Center for Research in Computer Graphics have been working in the areas of collaborative environments and high-end visualization systems for several years. The medical application TeleInVivo is an example of a system that marries visualization and collaboration. With TeleInVivo, users can exchange and collaboratively interact with volumetric data sets in geographically distributed locations. Since examination of many physical phenomena produces data that are naturally volumetric, the visualization frameworks used by TeleInVivo have been extended for non-medical applications. The system can now be made compatible with almost any dataset that can be expressed in terms of magnitudes within a 3D grid. Coupled with advances in telecommunications, telecollaborative visualization is now possible virtually anywhere. Expert data quality assurance and analysis can occur remotely and interactively without having to send all the experts into the field. Building upon this point-to-point concept of collaborative visualization, one can envision a larger pooling of resources to form a large overview of a region of interest from contributions of numerous distributed members.
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
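The abstract above describes fusing an expert prior with survey observations in a Bayesian framework that retains uncertainty in the final occupancy estimate. A minimal sketch of that idea, using a conjugate normal-normal update (the function, names, and numbers are hypothetical illustrations, not the actual PDT system):

```python
def update_occupancy(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal-normal posterior for an occupancy density
    (people per 1000 ft^2), given survey observations `obs` each
    assumed ~ N(true_density, obs_var)."""
    post_var = 1.0 / (1.0 / prior_var + len(obs) / obs_var)
    post_mean = post_var * (prior_mean / prior_var + sum(obs) / obs_var)
    return post_mean, post_var

# Hypothetical expert prior: offices hold ~2.0 people/1000 ft^2, uncertainly.
mean, var = update_occupancy(prior_mean=2.0, prior_var=1.0,
                             obs=[2.6, 2.4, 2.9], obs_var=0.5)
print(round(mean, 2), round(var, 2))  # posterior mean shifts toward the data
```

The posterior variance stays nonzero, which is the "retains uncertainty in the final estimation" property the abstract emphasizes.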
A Verification System for Distributed Objects with Asynchronous Method Calls
NASA Astrophysics Data System (ADS)
Ahrendt, Wolfgang; Dylla, Maximilian
We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY's characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories, which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.
NASA Astrophysics Data System (ADS)
Flores, Robert Joseph
Distributed generation can provide many benefits over traditional central generation, such as increased reliability and efficiency, while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and the corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and the parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction.
As load factor increases, lower-operating-cost generators are desired because a larger portion of the building load is met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from distributed generation (DG) reduces cost only if the building's thermal demand coincides with the electrical demand. Capacity limits exist where annual savings from operation of distributed generation decrease if further generation is installed. For low-operating-cost generators, the approximate limit is the average building load. This limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if positive economic performance is desired.
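The dispatch logic described above, run the generator only when its operating cost beats the utility price, and never above the building load since there is no grid export, can be sketched as follows. The function and all prices, loads, and capacity figures are invented for illustration, not the thesis's actual models:

```python
def dispatch_savings(loads_kw, price_kwh, gen_cost_kwh, gen_cap_kw):
    """Hourly economic dispatch of one generator with no export.
    Returns the energy-cost savings versus buying everything from the grid."""
    baseline = total = 0.0
    for load in loads_kw:
        baseline += load * price_kwh
        # Run the unit only when its fuel + O&M cost beats the tariff.
        gen = min(gen_cap_kw, load) if gen_cost_kwh < price_kwh else 0.0
        total += (load - gen) * price_kwh + gen * gen_cost_kwh
    return baseline - total

savings = dispatch_savings(loads_kw=[80, 120, 200, 150],
                           price_kwh=0.15, gen_cost_kwh=0.10, gen_cap_kw=100)
print(round(savings, 2))
```

A fuller model would add demand charges, whose reduction the thesis identifies as especially valuable for low-load-factor buildings.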
Review of optimization techniques of polygeneration systems for building applications
NASA Astrophysics Data System (ADS)
Rong, A. Y.; Su, Y.; Lahdelma, R.
2016-08-01
Polygeneration means the simultaneous production of two or more energy products in a single integrated process. Polygeneration is an energy-efficient technology and plays an important role in the transition into future low-carbon energy systems. It can find wide applications in utilities, different types of industrial sectors, and building sectors. This paper mainly focuses on polygeneration applications in the building sector. The scales of polygeneration systems in the building sector range from the micro level for a single home to the large level for residential districts. The development of polygeneration microgrids is also related to building applications. The paper aims at giving a comprehensive review of optimization techniques for designing, synthesizing, and operating different types of polygeneration systems for building applications.
Evaluation of HRV Biofeedback as a Resilience Building Reserve Component
2017-08-01
Study/Product Aims: 1. Develop and test the PHIT platform for use with the BART protocol. 2. Examine... Award amount: $1,833,144. Distribution Statement: Approved for Public Release; Distribution Unlimited.
Commercial Buildings Research Group. Steve's areas of expertise are electric power distribution systems and DC techniques for maximizing the energy efficiency of electrical distribution systems in commercial buildings.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
... Buildings Service; Submission for OMB Review; Application/ Permit for Use of Space in Public Buildings and Grounds, GSA Form 3453 AGENCY: Public Buildings Service, GSA. ACTION: Notice of request for comments... regarding GSA Form 3453, Application/Permit for Use of Space in Public Buildings and Grounds. A notice was...
Monitoring Building Deformation with InSAR: Experiments and Validation.
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-12-20
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples to compare InSAR and leveling approaches for building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between the InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
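The validation described above, regressing InSAR-derived deformation against leveling truth by ordinary least squares and reporting an RMSE, can be sketched as follows. The deformation values are invented for illustration, not the paper's measurements:

```python
def ols(x, y):
    """Ordinary least squares fit y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def rmse(pred, truth):
    """Root mean square error between two series."""
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(pred)) ** 0.5

insar    = [-2.1, -3.8, -5.9, -8.2]   # mm, hypothetical InSAR estimates
leveling = [-2.0, -4.0, -6.0, -8.0]   # mm, hypothetical leveling truth
slope, intercept = ols(leveling, insar)
err = rmse(insar, leveling)
print(round(slope, 3), round(err, 3))  # slope near 1 and sub-mm RMSE
```

A slope near 1 with small intercept indicates high correlation between the two techniques, and a sub-millimeter RMSE corresponds to the accuracy level the paper reports.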
Review of Development Survey of Phase Change Material Models in Building Applications
Akeiber, Hussein J.; Wahid, Mazlan A.; Hussen, Hasanen M.; Mohammad, Abdulrahman Th.
2014-01-01
The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings span several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data. PMID:25313367
Design and Implementation of Replicated Object Layer
NASA Technical Reports Server (NTRS)
Koka, Sudhir
1996-01-01
One of the widely used techniques for the construction of fault-tolerant applications is the replication of resources, so that if one copy fails, sufficient copies may still remain operational to allow the application to continue to function. This thesis involves the design and implementation of an object-oriented framework for replicating data on multiple sites and across different platforms. Our approach, called the Replicated Object Layer (ROL), provides a mechanism for consistent replication of data over dynamic networks. ROL uses the Reliable Multicast Protocol (RMP) as a communication protocol that provides for reliable delivery, serialization, and fault tolerance. Besides providing type registration, this layer facilitates distributed atomic transactions on replicated data. A novel algorithm called the RMP Commit Protocol, which commits transactions efficiently in a reliable multicast environment, is presented. ROL provides recovery procedures to ensure that site and communication failures do not corrupt persistent data, and makes the system fault tolerant to network partitions. ROL will facilitate building distributed fault-tolerant applications by performing the burdensome details of replica consistency operations and making them completely transparent to the application. Replicated databases are a major class of applications that could be built on top of ROL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollock, E.O. Jr.
1987-10-15
The active solar Domestic Hot Water (DHW) system at the HQ Army-Air Force Exchange Service (AAFES) Building was designed and constructed as part of the Solar in Federal Buildings Program (SFBP). This retrofitted system is one of eight of the systems in the SFBP selected for quality monitoring. The purpose of this monitoring effort is to document the performance of quality state-of-the-art solar systems in large federal building applications. The six-story HQ AAFES Building houses a cafeteria, officer's mess and club, and office space for 2400 employees. The siphon-return drainback system uses 1147 ft² of Aircraftsman flat-plate collectors to collect solar energy, which is used to preheat domestic hot water. Solar energy is stored in a 1329-gallon tank and transferred to the hot water load through a heat exchanger located in the 356-gallon DHW preheat tank. Auxiliary energy is supplied by two gas-fired boilers, which boost the temperature to 130 °F before it is distributed to the load. Highlights of the performance of the HQ AAFES Building solar system during the monitoring period from August 1984 through May 1985 are presented in this report.
NASA Astrophysics Data System (ADS)
Hanson, A. G.
1987-03-01
The learning experience of a group of Federal-agency planners who face upgrading or augmenting existing on-premises communication systems and building wiring is documented. In July 1984, an interagency Fiber Optics Task Group was formed under the aegis of the Federal Telecommunication Standards Committee to study on-premises distribution systems, with emphasis on optical fiber implementation, sharing mutual problems and potential solutions for them. Chronological summary records of technical content of 11 Task Group meetings through September 1986 are summarized. Also condensed are the engineering presentations to the Task Group by industry on applicable state-of-the-art technology, including local area networks, private automatic branch exchanges, building wiring architecture, and optic fiber systems and components.
Department of National Defence's use of thermography for facilities maintenance
NASA Astrophysics Data System (ADS)
Kittson, John E.
1990-03-01
Since the late seventies, DND, through the Director General Works, has been actively encouraging the use of thermography as an efficient and effective technique for supporting preventive maintenance, quality assurance, and energy conservation programs at Canadian Forces Bases (CFBs). This paper will provide an overview of DND's experiences in the utilization of thermography for facilities maintenance applications. 1. HISTORICAL MILESTONES. The following are milestones of DND's use of thermography: a. Purchase of Infrared Equipment. In 1976/77 DND purchased five AGA 750 Infrared Thermovision Systems, which were distributed to commands. In 1980/81/82 six AGA 110s, five AGA TPT80s, two AGA 782s, and one AGA 720 were acquired. Finally, DND also purchased seven AGEMA 870 systems during 1987/88. b. First and Second Interdepartmental Building Thermography Courses. In 1978 and 1980 DND hosted two building thermography courses that were conducted by Public Works Canada. c. CE Thermographer Specialist Training Courses. DND developed a training standard in 1983 for Construction Engineering (CE) Thermographer qualification, which included all CE applications of thermography. The first annual in-house training course was conducted at CFB Borden, Ontario, in 1984. These are now being conducted at the CFB Chilliwack Detachment in Vernon, British Columbia. 2. MARKETING FACILITIES MAINTENANCE IR. Of paramount importance for successfully developing DND appreciation for thermography was providing familiarization training to CE staff at commands and bases. These three-day presentations emphasized motivational factors, conducting thermographic surveys, and utilizing infrared data on roofs, electrical/mechanical systems, heating plants, steam distribution, and building enclosures. These factors consisted mainly of the following objectives: a. preventive maintenance, by locating deficiencies to be repaired; b. quality assurance, by verification of workmanship, materials, and design; c. energy conservation, by locating heat loss areas. (SPIE Vol. 1313, Thermosense XII, 1990)
Williams, W E
1987-01-01
The maturing of technologies in computer capabilities, particularly direct digital signals, has provided an exciting variety of new communication and facility control opportunities. These include telecommunications, energy management systems, security systems, office automation systems, local area networks, and video conferencing. New applications are developing continuously. The so-called "intelligent" or "smart" building concept evolves from the development of this advanced technology in building environments. Automation has had a dramatic effect on facility planning. For decades, communications were limited to the telephone, the typewritten message, and copy machines. The office itself and its functions had been essentially unchanged for decades. Office automation systems began to surface during the energy crisis and, although their newer technology was timely, they were, for the most part, designed separately from other new building systems. For example, most mainframe computer systems were originally stand-alone, as were word processing installations. In the last five years, the advances in distributive systems, networking, and personal computer capabilities have provided opportunities to make such dramatic improvements in productivity that the Selectric typewriter has gone from being the most advanced piece of office equipment to nearly total obsolescence.
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Anthonius; Muchtar, M. A.; Hizriadi, A.; Syahputra, M. F.
2018-03-01
Universitas Sumatera Utara is one of the public universities that have over 100 buildings, with a total area of more than 133,141 square meters. Information delivery on the location of the institutional buildings becomes challenging since the university land reaches 93.4 ha. The information is usually delivered orally, in video presentations, and in two-dimensional forms such as maps, posters, and brochures. These three techniques of information delivery have their advantages and disadvantages. Virtual reality has now come into existence, touching every domain of knowledge. In this paper we study and implement virtual reality as a new approach to distributing the information and covering these deficiencies. The utilization of virtual reality technology combined with 3D modeling aims to introduce and show the location of USU institutional buildings in interactive and innovative ways. With this application, the campus introduction is expected to be more convenient, so that all USU students will be able to find the exact location of the building they are headed for.
Representing distributed cognition in complex systems: how a submarine returns to periscope depth.
Stanton, Neville A
2014-01-01
This paper presents the Event Analysis of Systemic Teamwork (EAST) method as a means of modelling distributed cognition in systems. The method comprises three network models (i.e. task, social and information) and their combination. This method was applied to the interactions between the sound room and control room in a submarine, following the activities of returning the submarine to periscope depth. This paper demonstrates three main developments in EAST. First, building the network models directly, without reference to the intervening methods. Second, the application of analysis metrics to all three networks. Third, the combination of the aforementioned networks in different ways to gain a broader understanding of the distributed cognition. Analyses have shown that EAST can be used to gain both qualitative and quantitative insights into distributed cognition. Future research should focus on the analyses of network resilience and modelling alternative versions of a system.
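The EAST analysis above applies quantitative metrics to task, social, and information networks. As a toy illustration (not the EAST method itself, and with a hypothetical crew network, not the paper's data), one such metric is network density, the fraction of possible links that are present:

```python
def density(nodes, edges):
    """Undirected network density: observed links over possible links."""
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1))

# Hypothetical social network: who communicates with whom while the
# submarine returns to periscope depth (roles are illustrative).
nodes = {"sonar_op", "sonar_ctrl", "ops_officer", "helm"}
edges = {("sonar_op", "sonar_ctrl"),
         ("sonar_ctrl", "ops_officer"),
         ("ops_officer", "helm")}
print(density(nodes, edges))
```

Comparing such metrics across the task, social, and information networks, and their combinations, is what yields the quantitative insights into distributed cognition the abstract mentions.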
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Essick, R. B.; Grass, J.; Johnston, G.; Kenny, K.; Russo, V.
1986-01-01
The EOS project is investigating the design and construction of a family of real-time distributed embedded operating systems for reliable, distributed aerospace applications. Using the real-time programming techniques developed in co-operation with NASA in earlier research, the project staff is building a kernel for a multiple processor networked system. The first six months of the grant included a study of scheduling in an object-oriented system, the design philosophy of the kernel, and the architectural overview of the operating system. In this report, the operating system and kernel concepts are described. An environment for the experiments has been built and several of the key concepts of the system have been prototyped. The kernel and operating system is intended to support future experimental studies in multiprocessing, load-balancing, routing, software fault-tolerance, distributed data base design, and real-time processing.
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Experimental microbubble generation by sudden pressure drop and fluidics
NASA Astrophysics Data System (ADS)
Franco Gutierrez, Fernando; Figueroa Espinoza, Bernardo; Aguilar Corona, Alicia; Vargas Correa, Jesus; Solorio Diaz, Gildardo
2014-11-01
Mass and heat transfer, as well as chemical species transport, in bubbly flow are of importance in environmental and industrial applications. Microbubbles are well suited to these applications due to their large interface contact area and residence time. The objective of this investigation is to build devices to produce microbubbles using two methods: pressure differences and fluidics. Some characteristics, advantages, and drawbacks of both methods are briefly discussed, as well as the characterization of the bubbly suspensions in terms of parameters such as the pressure jump and the bubble equivalent diameter distribution. The authors acknowledge the support of Consejo Nacional de Ciencia y Tecnología.
Building 65 Hydraulic Systems Handbook: Components, Systems, and Applications
2016-04-01
NASA Astrophysics Data System (ADS)
Behmanesh, Iman; Yousefianmoghadam, Seyedsina; Nozari, Amin; Moaveni, Babak; Stavridis, Andreas
2018-07-01
This paper investigates the application of Hierarchical Bayesian model updating for uncertainty quantification and response prediction of civil structures. In this updating framework, structural parameters of an initial finite element (FE) model (e.g., stiffness or mass) are calibrated by minimizing error functions between the identified modal parameters and the corresponding parameters of the model. These error functions are assumed to have Gaussian probability distributions with unknown parameters to be determined. The estimated parameters of the error functions represent the uncertainty of the calibrated model in predicting the building's response (modal parameters here). The focus of this paper is to answer whether the model uncertainties quantified using dynamic measurements at the building's reference/calibration state can be used to improve the model prediction accuracies at a different structural state, e.g., the damaged structure. The effects of prediction error bias on the uncertainty of the predicted values are also studied. The test structure considered here is a ten-story concrete building located in Utica, NY. The modal parameters of the building at its reference state are identified from ambient vibration data and used to calibrate the parameters of the initial FE model as well as the error functions. Before the building was demolished, six of its exterior walls were removed, and ambient vibration measurements were also collected from the structure after the wall removal. These data are not used to calibrate the model; they are only used to assess the predicted results. The model updating framework proposed in this paper is applied to estimate the modal parameters of the building at its reference state as well as two damaged states: moderate damage (removal of four walls) and severe damage (removal of six walls). Good agreement is observed between the model-predicted modal parameters and those identified from vibration tests.
Moreover, it is shown that including the prediction error bias in the updating process, instead of the commonly used zero-mean error function, can significantly reduce the prediction uncertainties.
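A schematic sketch of the calibration idea described above: fit a stiffness multiplier so that model frequencies match the identified modal frequencies, then estimate the Gaussian error function's standard deviation as a measure of prediction uncertainty. The single-mode toy model and all numbers are hypothetical, not the paper's FE model of the Utica building:

```python
import math

f_nominal = 1.80  # Hz, first-mode frequency of the initial (uncalibrated) model

def f_model(theta):
    # For a simple oscillator, frequency scales with sqrt(stiffness).
    return f_nominal * math.sqrt(theta)

# Hypothetical identified frequencies from repeated ambient-vibration tests.
identified = [1.62, 1.66, 1.64, 1.65]  # Hz

# Grid-search estimate of the stiffness multiplier under a flat prior,
# minimizing the squared error between model and identified frequencies.
grid = [i / 1000 for i in range(500, 1500)]
theta = min(grid, key=lambda t: sum((f - f_model(t)) ** 2 for f in identified))

# Std of the residuals estimates the Gaussian error-function parameter,
# i.e. the calibrated model's prediction uncertainty.
resid = [f - f_model(theta) for f in identified]
sigma = (sum(r * r for r in resid) / len(resid)) ** 0.5

print(round(theta, 3), round(sigma, 4))
```

In the paper's hierarchical setting this error std is itself carried forward when predicting the response at damaged states, rather than being discarded after calibration.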
Results of Outdoor to Indoor Propagation Measurements from 5-32 GHz
NASA Technical Reports Server (NTRS)
Houts, Jacquelynne R.; McDonough, Ryan S.
2016-01-01
The demand for wireless services has increased exponentially in the last few years and shows no signs of slowing in the near future. In order for the next generation of wireless networks to provide seamless access, whether the user is indoors or out, a thorough understanding and validation of models describing the impact of building entry loss (BEL) is required. This information is currently lacking and presents a challenge for most system designers. For this reason, empirical data are needed to assess the impact of BEL at frequencies being explored for future mobile broadband applications. This paper presents the results of measurements of outdoor-to-indoor propagation from 5-32 GHz in three different buildings. The first is a newer building that is similar in construction to a modern residential home. The second is an older commercial office building. The last is a very new commercial office building built using modern green building techniques. These three buildings allow for the measurement of propagation losses through both modern and older materials, such as glass windows and exterior block and siding. Initial results found that at particular spatial locations the BEL could be less than 1 dB or more than 70 dB with free-space losses discounted (this is likely influenced by multipath). Additionally, it was observed that the distributions of a majority of the measurements trended toward log-normal, with means and standard deviations ranging from 8-38 dB and 6-14 dB, respectively.
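The phrase "with free space losses discounted" describes a common measurement convention: BEL is the excess of the measured outdoor-to-indoor path loss over the free-space path loss for the same link geometry. A minimal sketch under that assumption, with entirely hypothetical link numbers:

```python
import math

def fspl_db(d_m, f_hz):
    """Free-space path loss in dB (Friis formula)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(d_m) + 20 * math.log10(f_hz) + 20 * math.log10(4 * math.pi / c)

# Building entry loss: measured outdoor-to-indoor loss with free-space
# losses discounted (all numbers hypothetical).
measured_loss_db = 105.0   # total measured path loss at an indoor point
d, f = 30.0, 28e9          # assumed 30 m link at 28 GHz
bel_db = measured_loss_db - fspl_db(d, f)
print(f"FSPL {fspl_db(d, f):.1f} dB, BEL {bel_db:.1f} dB")
```

A per-location BEL extracted this way is what would then be aggregated into the log-normal statistics the paper reports.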
Localized Electrical Heating System for Various Types of Buildings
NASA Astrophysics Data System (ADS)
Shelehov, I. Y.; Smirnov, E. I.; Inozemsev, V. P.
2017-11-01
The article presents an overview of the factors that determine the formation of zones of elevated temperature in industrial, public, and administrative buildings. The authors state the task of improving the efficiency of electric energy use and reducing the heating costs of such buildings through infrared electric heating devices. Materials and methods: the experiments were conducted in a 3 x 6 m room with a ceiling height of 3 m; the concrete floor was covered with laminate, in which holes were drilled at 250 mm increments and thermocouples installed. A patented heating element with a distributed heating layer was used. Signals from the thermocouples were recorded by ARIES TPM138 instruments with the standard software supplied with the devices (Owen Process Manager). The resulting temperature-field distributions were imported into MS Excel. Voltage, current consumption, and power were monitored with an ARIES IMS device. Results: the article defines the purpose of the study and characterizes infrared heaters with various types of heating elements. The authors detail the main parameters of the different types of infrared heaters and evaluate their applicability in other areas where zones of increased temperature need to be created. Discussion and conclusion: the work determined that heating appliances using the patented heating element with a distributed heating layer improve thermal performance and provide maximum comfort at a much greater distance than existing similar devices.
NASA Astrophysics Data System (ADS)
Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma
2009-07-01
Disasters affecting infrastructure, such as the 2001 earthquake in India, the 2005 earthquake in Pakistan, the 2008 earthquake in China, and the 2004 tsunami in Asia, highlight a common need for intelligent buildings and smart civil structures. Imagine massive reductions in the time needed to get infrastructure working again, real-time information on damage to buildings, large reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Realizing these ideas would lead to huge, quantifiable, long-term savings for government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both the public and research communities because they are expected to bring a new paradigm to the interaction between humans, the environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from nodes randomly distributed throughout the building. If the sensors are relocated, the application automatically reconfigures itself in light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmissions.
The visualization and analysis of WSN data are presented in a Windows-based user interface.
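The suppression idea in the abstract (nodes whose readings are spatially correlated with an already-reporting neighbor stay silent) can be sketched with a simple greedy rule. The correlation radius, positions, and readings below are hypothetical, and the real algorithm works on SNR-based correlation rather than plain distance:

```python
import numpy as np

def select_reporting_nodes(readings, positions, corr_radius):
    """Greedy suppression: a node transmits only if no already-selected
    node lies within corr_radius (readings there are assumed correlated)."""
    order = np.argsort(-np.abs(readings))      # prefer the strongest readings
    selected = []
    for i in order:
        if all(np.linalg.norm(positions[i] - positions[j]) > corr_radius
               for j in selected):
            selected.append(int(i))
    return sorted(selected)

# Hypothetical deployment: 6 nodes, two tight clusters plus two isolated nodes.
positions = np.array([[0, 0], [0.5, 0], [10, 10], [10.4, 10], [20, 0], [0, 20]], float)
readings = np.array([5.0, 4.9, 3.0, 3.1, 1.0, 2.0])
active = select_reporting_nodes(readings, positions, corr_radius=2.0)
print("transmitting nodes:", active)   # one node per correlated cluster
```

Only four of the six nodes transmit: one representative per cluster, plus the two isolated nodes, which is the energy saving the abstract describes.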
Thermal comfort analysis: A case study of LIG housing in Chhattisgarh
NASA Astrophysics Data System (ADS)
Netam, Nisha; Sanyal, Shubhashis; Bhowmick, Shubhankar
2016-07-01
The present work reports the evaluation of the temperature distribution inside a Low Income Group (LIG) house located in the city of Raipur, Chhattisgarh, using Agros2D, a multi-platform C++ application capable of higher-order finite element formulations with h-, p-, and hp-adaptivity for the solution of differential equations, based on the Hermes library. The variation of room air temperature along the length (x) of the house is calculated at different heights, viz. 0.5 m, 1 m, 1.5 m, 2 m, 2.5 m, and 3 m. A 2D model is generated for each of these heights, showing the temperature distribution inside the building.
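The kind of 2D temperature-field computation described above can be illustrated with a minimal finite-difference analogue: steady-state heat conduction (the Laplace equation) on a room cross-section, solved by Jacobi iteration. The boundary temperatures are assumed values for illustration, not the paper's data, and a real study would solve a convective room-air model, not pure conduction:

```python
import numpy as np

# Room cross-section grid; boundary temperatures are hypothetical.
nx, ny = 30, 20
T = np.full((ny, nx), 30.0)
T[0, :] = 42.0     # hot roof surface (assumed summer condition)
T[-1, :] = 30.0    # floor
T[:, 0] = 34.0     # west wall
T[:, -1] = 34.0    # east wall

# Jacobi iteration on the interior: each point relaxes toward the average
# of its four neighbours (the RHS is evaluated from the old field first,
# so this is a true Jacobi sweep).
for _ in range(2000):
    T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                            T[1:-1, :-2] + T[1:-1, 2:])

# Temperature along the room length (the "x" cut) at mid-height.
mid = T[ny // 2, :]
print(f"mid-height temperature range: {mid.min():.1f}-{mid.max():.1f} degC")
```

Cutting the converged field at several row indices reproduces the per-height temperature profiles the study reports.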
NASA Astrophysics Data System (ADS)
Lokko, Mae-ling Jovenes
As global quantities of waste by-products from food production, as well as the range of their applications, increase, researchers are recognizing critical opportunities to transform the burden of underutilized wastes into ecological profits. Within the tropical hot-humid region, where half the world's current and projected future population growth is concentrated, there is a dire demand for building materials to meet ambitious development schemes and rising housing deficits. However, the building sector has largely overlooked the potential of local agricultural wastes to serve as alternatives to energy-intensive, imported building technologies. Industrial ecologists have recently investigated the use of agrowaste biocomposites to replace conventional wood products that use harmful urea-formaldehyde, phenolic, and isocyanate resins. Furthermore, developments in the performance of building material systems with respect to cost, energy, air quality management, and construction innovation have evolved metrics for what constitutes material 'upcycling' within the building life cycle. While these developments have largely focused on technical and cost performance, much less attention has been paid to addressing deep-seated social and cultural barriers to adoption that have sedimented over decades of importation. This dissertation evaluates the development of coconut agricultural building material systems in four phases: (i) non-toxic, low-energy production of medium-high density boards (500-1200 kg/m3) from coconut fibers and emerging biobinders; (ii) characterization and evaluation of the hygrothermal performance of coconut agricultural building materials; (iii) scaled-up design development of coconut modular building material systems; and (iv) development of a value translation framework for the bottom-up distribution of value to stakeholders within the upcycling framework.
This integrated design methodological approach is significant to develop ecological thinking around agrowaste building materials, influence social and cultural acceptability and create value translation frameworks that sufficiently characterize the composite value proposition of upcycled building systems.
Wind tunnel tests for wind pressure distribution on gable roof buildings.
Jing, Xiao-kun; Li, Yuan-qi
2013-01-01
Gable roof buildings are widely used in industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Some characteristics of the measured wind pressure field on the surfaces of the models were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and the results of proper orthogonal decomposition of the measured wind pressure field. The results show that extremely high local suctions often occur at the leading edges of the longitudinal walls and windward roof, at the roof corners, and along the roof ridge, which are the locations most severely damaged under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and this effect depends on the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The first several eigenvectors contribute most of the overall wind pressure distribution. The investigation offers a basic understanding for estimating wind load distributions on gable roof buildings and facilitates the wind-resistant design of cladding components and their connections, considering the wind load path.
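Proper orthogonal decomposition of a measured pressure field, as used in the study above, amounts to extracting the dominant eigenvectors of the fluctuating-pressure covariance, conveniently obtained from an SVD of the mean-removed tap-by-time data matrix. The synthetic field below (two coherent modes plus noise) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pressure time histories: n_taps pressure taps, n_t samples.
# Field = two coherent spatial modes oscillating in time, plus noise.
n_taps, n_t = 40, 2000
x = np.linspace(0, np.pi, n_taps)
t = np.arange(n_t)
field = (0.8 * np.outer(np.sin(x), np.sin(0.05 * t)) +
         0.3 * np.outer(np.sin(2 * x), np.cos(0.11 * t)) +
         0.05 * rng.standard_normal((n_taps, n_t)))

# POD: remove the temporal mean at each tap, then SVD. Columns of U are the
# spatial eigenvectors; s**2 gives the energy carried by each mode.
fluct = field - field.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(fluct, full_matrices=False)
energy = s ** 2 / np.sum(s ** 2)
print(f"energy captured by first 2 modes: {energy[:2].sum():.3f}")
```

The first few modes carry nearly all the energy here, which mirrors the paper's observation that the first several eigenvectors dominate the overall wind pressure distribution.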
Urban Earthquake Shaking and Loss Assessment
NASA Astrophysics Data System (ADS)
Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.
2009-04-01
This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR, and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimating the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and shear wave velocity distributions (Shake Mapping). 4. Incorporating strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimating the losses (damage, casualty, and economic) at different levels of sophistication (0, 1, and 2) commensurate with the available inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties, and macroeconomic loss quantifiers) in urban areas.
The basic Shake Mapping is similar to the Level 0 and Level 1 analyses; however, options are available for more sophisticated treatment of site response through externally entered data and for improvement of the shake map through the incorporation of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method, and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be applied if necessary. The ELER Level 2 analysis includes the calculation of direct monetary losses resulting from building damage, allowing repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons against different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. The same software can also be used for scenario earthquake loss estimation, related Monte Carlo type simulations, and earthquake insurance applications.
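The Level 2 casualty logic described above (buildings per damage state combined with per-state casualty rates for each building type) reduces to a simple weighted sum per geo-cell. All counts and rates below are hypothetical placeholders, not ELER's calibrated values:

```python
# Minimal sketch of a Level-2-style casualty estimate for one building
# typology in one geo-cell (all numbers hypothetical):
# casualties = buildings in damage state x occupants x casualty rate.

buildings = {"slight": 420, "moderate": 180, "extensive": 60, "complete": 15}
occupants_per_building = 12
# Hypothetical casualty rates (fraction of occupants) per damage state.
casualty_rate = {"slight": 0.0, "moderate": 0.001, "extensive": 0.01, "complete": 0.1}

casualties = sum(n * occupants_per_building * casualty_rate[state]
                 for state, n in buildings.items())
print(f"expected casualties in cell: {casualties:.1f}")
```

Summing the same expression over all typologies and geo-cells, and swapping in user-modified rates where needed, yields the city-wide casualty estimate.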
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow-code architectures that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
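The asynchronous mechanism the paper builds on, where a client submits a request, immediately receives a ticket, and polls for the result later, can be sketched in a few lines. The job store, function names, and the uppercasing "geoprocess" below are all hypothetical stand-ins for a real WS-BPEL-orchestrated service:

```python
import threading
import time
import uuid

# In-memory job store standing in for the service's state (hypothetical).
_jobs = {}

def submit(workflow_input):
    """Service side: start processing in the background, return a ticket at once."""
    job_id = str(uuid.uuid4())
    _jobs[job_id] = {"status": "running", "result": None}

    def run():
        time.sleep(0.1)                      # stand-in for a long geoprocess
        _jobs[job_id] = {"status": "done", "result": workflow_input.upper()}

    threading.Thread(target=run).start()
    return job_id

def poll(job_id):
    """Client side: check job status without blocking the service."""
    return _jobs[job_id]

# Client submits, is free to do other work, then polls until completion.
ticket = submit("classify landsat scene")
while poll(ticket)["status"] != "done":
    time.sleep(0.02)
print(poll(ticket)["result"])
```

Real implementations usually replace polling with a callback message to the client, which is one of the asynchrony patterns the paper examines.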
Performance of Radiant Heating Systems of Low-Energy Buildings
NASA Astrophysics Data System (ADS)
Sarbu, Ioan; Mirza, Matei; Crasmareanu, Emanuel
2017-10-01
After the introduction of plastic piping, the application of water-based radiant heating with pipes embedded in room surfaces (i.e., floors, walls, and ceilings) has significantly increased worldwide. Additionally, interest and growth in radiant heating and cooling systems have increased in recent years because these systems have been demonstrated to be energy efficient in comparison to all-air distribution systems. This paper briefly describes heat distribution systems in buildings, focusing on radiant panels (floor, wall, ceiling, and floor-ceiling). The main objective of this study is to investigate the performance of different types of low-temperature heating systems using different methods. Additionally, a comparative analysis of the energy, environmental, and economic performances of floor, wall, ceiling, and floor-ceiling heating is performed using numerical simulation with the Transient Systems Simulation (TRNSYS) software. This study showed that the floor-ceiling heating system has the best performance in terms of the lowest energy consumption, operation cost, CO2 emissions, and nominal boiler power. The comparison of the room operative temperatures with the set-point operative temperature also indicates that all radiant panel systems provide satisfactory results without significant deviations.
Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju
2014-01-01
A tracking service like asset management is essential in a dynamic hospital environment consisting of numerous mobile assets (e.g., wheelchairs or infusion pumps) that are continuously relocated throughout a hospital. The tracking service is accomplished based on the key technologies of an indoor location-based service (LBS), such as locating and monitoring multiple mobile targets inside a building in real time. An indoor LBS such as a tracking service entails numerous resource lookups being requested concurrently and frequently from several locations, as well as a network infrastructure that supports high scalability in indoor environments. A traditional centralized architecture needs to maintain a geographic map of the entire building or complex in its central server, which can cause low scalability and traffic congestion. This paper presents a self-organizing and fully distributed indoor mobile asset management (MAM) platform, and proposes a real-time architecture for multiple trackees (such as mobile assets) and trackers based on this distributed platform. In order to verify the suggested platform, scalability with increasing numbers of concurrent lookups was evaluated in a real test bed. Tracking latency and the traffic load ratio of the proposed tracking architecture were also evaluated. PMID:24662407
NASA Astrophysics Data System (ADS)
Zhiyong, Xian
2017-09-01
In the context of green development, green materials are the future trend for Medical-Nursing Combined buildings. This paper summarizes the concept and types of green building materials. Then, on the basis of existing research, it constructs a green material system framework for Medical-Nursing Combined buildings, puts forward application modes for green building materials, and studies the policy and legal safeguards for green material application.
Monitoring Building Deformation with InSAR: Experiments and Validation
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-01-01
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between the InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved with the InSAR technique when measuring building deformation. We discuss the differences in accuracy between the OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403
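The two validation analyses named above, OLS regression of InSAR against leveling and a measurement-of-error (RMSE) comparison, are straightforward to sketch. The paired deformation values below are invented for illustration; the paper's real series come from the Bohai Building and China Theater benchmarks:

```python
import numpy as np

# Hypothetical paired deformation measurements (mm) at common points:
# leveling is treated as truth, InSAR as the estimate under test.
leveling = np.array([-3.2, -1.0, 0.4, 1.8, 3.1, 4.9, 6.2])
insar = np.array([-2.9, -1.3, 0.7, 1.5, 3.4, 4.6, 6.5])

# OLS regression insar ~ a*leveling + b: slope near 1 and intercept near 0
# indicate good agreement between the two techniques.
a, b = np.polyfit(leveling, insar, 1)

# Measurement-of-error analysis: RMSE of the point-wise differences.
rmse = np.sqrt(np.mean((insar - leveling) ** 2))
print(f"slope {a:.3f}, intercept {b:.3f} mm, RMSE {rmse:.2f} mm")
```

An RMSE near 1 mm on such pairs is what the paper reports as evidence of millimeter-level InSAR accuracy.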
Engineering the extracellular environment: Strategies for building 2D and 3D cellular structures.
Guillame-Gentil, Orane; Semenov, Oleg; Roca, Ana Sala; Groth, Thomas; Zahn, Raphael; Vörös, Janos; Zenobi-Wong, Marcy
2010-12-21
Cell fate is regulated by extracellular environmental signals. Receptor specific interaction of the cell with proteins, glycans, soluble factors as well as neighboring cells can steer cells towards proliferation, differentiation, apoptosis or migration. In this review, approaches to build cellular structures by engineering aspects of the extracellular environment are described. These methods include non-specific modifications to control the wettability and stiffness of surfaces using self-assembled monolayers (SAMs) and polyelectrolyte multilayers (PEMs) as well as methods where the temporal activation and spatial distribution of adhesion ligands is controlled. Building on these techniques, construction of two-dimensional cell sheets using temperature sensitive polymers or electrochemical dissolution is described together with current applications of these grafts in the clinical arena. Finally, methods to pattern cells in three-dimensions as well as to functionalize the 3D environment with biologic motifs take us one step closer to being able to engineer multicellular tissues and organs. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Research on digital city geographic information common services platform
NASA Astrophysics Data System (ADS)
Chen, Dequan; Wu, Qunyong; Wang, Qinmin
2008-10-01
The traditional GIS (Geographic Information System) software development model has many shortcomings that slow the progress of city informatization. There is an urgent need to build a common application infrastructure for information projects to speed the development of the digital city. The advent of service-oriented architecture (SOA) has motivated the adoption of GIS functionality portals that can be executed in distributed computing environments. Following SOA principles, we propose and design a digital city geographic information common services platform that provides application development service interfaces for domain users, which can be further extended into relevant business applications. Finally, a public-oriented Web GIS is developed on the platform to help users query geographic information in their daily lives, indicating that the platform can be conveniently integrated by other applications.
Application of BIM Technology in Building Water Supply and Drainage Design
NASA Astrophysics Data System (ADS)
Wei, Tianyun; Chen, Guiqing; Wang, Junde
2017-12-01
Through the application of BIM (Building Information Model) technology, the ideas of building water supply and drainage designers can be linked to the model, and the various factors affecting water supply and drainage design can be considered more comprehensively. BIM technology helps improve the design process of building water supply and drainage, promotes water supply and drainage planning, enriches design methods, and improves the design level of water supply and drainage systems and the quality of buildings. A fuzzy comprehensive evaluation method is used to analyze the advantages of BIM technology in building water supply and drainage design. The application prospects of BIM technology are therefore well worth promoting.
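Fuzzy comprehensive evaluation, the analysis method named in the abstract, combines a criteria weight vector with a membership matrix of graded expert ratings. The criteria, weights, and ratings below are hypothetical illustrations, not the paper's data:

```python
import numpy as np

# Criteria for evaluating BIM-aided design, with importance weights
# (hypothetical; weights must sum to 1).
criteria = ["design efficiency", "error detection", "coordination", "cost control"]
weights = np.array([0.3, 0.3, 0.25, 0.15])

# Membership matrix: rows are criteria, columns are membership degrees in
# the comment grades (excellent, good, fair, poor), from expert ratings.
membership = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.6, 0.3, 0.1, 0.0],
    [0.4, 0.4, 0.2, 0.0],
    [0.2, 0.4, 0.3, 0.1],
])

# Composite evaluation with the weighted-average operator M(.,+):
# one membership degree per grade for the overall design approach.
evaluation = weights @ membership
grades = ["excellent", "good", "fair", "poor"]
best = grades[int(np.argmax(evaluation))]
print("evaluation vector:", np.round(evaluation, 3), "-> overall:", best)
```

By the maximum-membership principle the overall grade is the column with the largest composite degree; other fuzzy operators (e.g., max-min) can replace the weighted average.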
Vukovic, Vladimir; Tabares-Velasco, Paulo Cesar; Srebric, Jelena
2010-09-01
A growing interest in security and occupant exposure to contaminants has revealed a need for fast and reliable identification of contaminant sources during incidental situations. To determine potential contaminant source positions in outdoor environments, current state-of-the-art modeling methods use computational fluid dynamics simulations on parallel processors. In indoor environments, current tools match accidental contaminant distributions with cases from precomputed databases of possible concentration distributions. These methods require intensive computations in pre- and postprocessing. On the other hand, neural networks have emerged as a tool for rapid concentration forecasting of outdoor environmental contaminants such as nitrogen oxides or sulfur dioxide. All of these modeling methods depend on the type of sensors used for real-time measurements of contaminant concentrations. A review of the existing sensor technologies revealed that no perfect sensor exists, but the intensity of work in this area promises improved sensors in the near future. The main goal of the presented research study was to extend neural network modeling from outdoor to indoor identification of source positions, making this technology applicable to building indoor environments. The developed neural network Locator of Contaminant Sources was also used to optimize the number and allocation of contaminant concentration sensors for real-time prediction of indoor contaminant source positions. Such prediction should take place within seconds after receiving real-time contaminant concentration sensor data. For the purpose of neural network training, a multizone program provided distributions of contaminant concentrations for known source positions throughout a test building. Trained networks had outputs indicating contaminant source positions based on measured concentrations in different building zones.
A validation case based on a real building layout and experimental data demonstrated the ability of this method to identify contaminant source positions. Future research is focused on integration with real sensor networks and on model improvements for more complicated contamination scenarios.
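The training scheme described above (a multizone model generates zone concentration patterns for known source positions, and a network learns the inverse mapping from concentrations to source position) can be sketched with a minimal softmax classifier. The zone patterns, noise level, and single-layer architecture are hypothetical simplifications of the actual Locator of Contaminant Sources:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy training data from a "multizone" model: for each of 3 candidate
# source positions, a characteristic concentration pattern over 4 zones.
patterns = np.array([
    [1.0, 0.4, 0.1, 0.0],   # source in zone 0
    [0.2, 1.0, 0.3, 0.1],   # source in zone 1
    [0.0, 0.3, 0.9, 1.0],   # source near zones 2-3
])
X = np.repeat(patterns, 50, axis=0) + 0.05 * rng.standard_normal((150, 4))
y = np.repeat(np.arange(3), 50)

# Single-layer softmax "network" trained by gradient descent:
# sensor concentrations in, source-position probabilities out.
W = np.zeros((4, 3))
b = np.zeros(3)
onehot = np.eye(3)[y]
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p - onehot                     # cross-entropy gradient
    W -= 0.1 * X.T @ grad / len(X)
    b -= 0.1 * grad.mean(axis=0)

pred = np.argmax(X @ W + b, axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because prediction is a single matrix multiply, classifying a fresh sensor reading takes microseconds, consistent with the paper's within-seconds requirement.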
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
Developing Global Building Exposure for Disaster Forecasting, Mitigation, and Response
NASA Astrophysics Data System (ADS)
Huyck, C. K.
2016-12-01
Nongovernmental organizations and governments are recognizing the importance of insurance penetration in developing countries to mitigate the tremendous setbacks that follow natural disasters, but to manage risk effectively, stakeholders must accurately quantify the built environment. Although there are countless datasets addressing elements of buildings, there are surprisingly few that are directly applicable to assessing vulnerability to natural disasters without skewing the spatial distribution of risk towards known assets. Working with NASA center partner the Center for International Earth Science Information Network (CIESIN) at Columbia University in New York (http://www.ciesin.org), ImageCat has developed a novel method of producing Global Exposure Data (GED) from EO sources. The method has been applied to develop exposure datasets for GFDRR and CAT modelers, and to aid in the post-earthquake allocation of resources for UNICEF.
Iris: Constructing and Analyzing Spectral Energy Distributions with the Virtual Observatory
NASA Astrophysics Data System (ADS)
Laurino, O.; Budynkiewicz, J.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.
2014-05-01
We present Iris 2.0, the latest release of the Virtual Astronomical Observatory application for building and analyzing Spectral Energy Distributions (SEDs). With Iris, users may read in and display SEDs; inspect and edit any selection of SED data; fit models to SEDs in arbitrary spectral ranges; and calculate confidence limits on best-fit parameters. SED data may be loaded into the application from VOTable and FITS files compliant with the International Virtual Observatory Alliance interoperable data models, or retrieved directly from NED or the Italian Space Agency Science Data Center; data in non-standard formats may also be converted within the application. Users may seamlessly exchange data between Iris and other Virtual Observatory tools using the Simple Application Messaging Protocol. Iris 2.0 also provides a tool for redshifting, interpolating, and measuring integrated fluxes, and allows simple aperture corrections for individual points and SED segments. Custom Python functions, template models, and template libraries may be imported into Iris for fitting SEDs. Iris may be extended through Java plugins; users can install third-party packages or develop their own plugins using the Iris Software Development Kit. Iris 2.0 is available for Linux and Mac OS X systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeMar, P.
Integrated Energy Systems (IES) combine on-site power or distributed generation technologies with thermally activated technologies to provide cooling, heating, humidity control, energy storage, and/or other process functions using thermal energy normally wasted in the production of electricity/power. IES produce electricity and byproduct thermal energy onsite, with the potential of converting 80 percent or more of the fuel into useable energy. IES have the potential to offer the nation the benefits of unprecedented energy efficiency gains, consumer choice, and energy security. They may also dramatically reduce industrial and commercial building sector carbon and air pollutant emissions and increase source energy efficiency. Applications of distributed energy and combined heat and power (CHP) in commercial and institutional buildings have, however, been historically limited due to insufficient use of byproduct thermal energy, particularly during summer months when heating demand is at a minimum. In recent years, custom engineered systems have evolved incorporating potentially high-value services from Thermally Activated Technologies (TAT), such as cooling and humidity control. Such TAT equipment can be integrated into a CHP system to use the byproduct heat output effectively, providing absorption cooling or desiccant humidity control for the building during the summer months. IES can therefore expand the potential thermal energy services and thereby extend the conventional CHP market into building sector applications that could not be economically served by CHP alone. Now more than ever, these combined cooling, heating, and humidity control systems (IES) can potentially decrease carbon and air pollutant emissions while improving source energy efficiency in the buildings sector. Even with these improvements over conventional CHP systems, IES face significant technological and economic hurdles.
Of crucial importance to the success of IES is the ability to treat the heating, ventilation, air conditioning, water heating, lighting, and power system loads as parts of an integrated system, serving the majority of these loads either directly or indirectly from the CHP output. The CHP Technology Roadmaps (Buildings and Industry) have focused research and development on a comprehensive integration approach: component integration, equipment integration, packaged and modular system development, system integration with the grid, and system integration with building and process loads. This marked change in technology research and development has led to the creation of a new acronym to better reflect the nature of development in this important area of energy efficiency: Integrated Energy Systems (IES). Throughout this report, the terms "CHP" and "IES" will sometimes be used interchangeably, with CHP generally reserved for the electricity and heat generating technology subsystem portion of an IES. The focus of this study is to examine the potential for IES in buildings when the system perspective is taken, and the IES is employed as a dynamic system, not just as conventional CHP. This effort is designed to determine market potential by analyzing IES performance on an hour-by-hour basis, examining the full range of building types, their loads and timing, and assessing how these loads can be technically and economically met by IES.
AMRZone: A Runtime AMR Data Sharing Framework For Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Wenzhao; Tang, Houjun; Harenberg, Steven
Frameworks that facilitate runtime data sharing across multiple applications are of great importance for scientific data analytics. Although existing frameworks work well over uniform mesh data, they cannot effectively handle adaptive mesh refinement (AMR) data. The challenges in constructing an AMR-capable framework include: (1) designing an architecture that facilitates online AMR data management; (2) achieving a load-balanced AMR data distribution for the data staging space at runtime; and (3) building an effective online index to support the unique spatial data retrieval requirements of AMR data. Towards addressing these challenges to support runtime AMR data sharing across scientific applications, we present the AMRZone framework. Experiments over real-world AMR datasets demonstrate AMRZone's effectiveness at achieving a balanced workload distribution, reading/writing large-scale datasets with thousands of parallel processes, and satisfying queries with spatial constraints. Moreover, AMRZone's performance and scalability are comparable with existing state-of-the-art work when tested over uniform mesh data with up to 16384 cores; in the best case, our framework achieves a 46% performance improvement.
Balancing Hydronic Systems in Multifamily Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruch, Russell; Ludwig, Peter; Maurer, Tessa
2014-07-01
In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution, and controls. The imbalance leads to tenant discomfort, higher energy use intensity, and inefficient building operation. This research, conducted by the Building America team Partnership for Advanced Residential Retrofit, explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago-area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperatures in a multifamily hydronic building can vary by as much as 61°F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. The average temperature spread at the building as a result of this retrofit decreased from 22.1°F to 15.5°F.
NASA Astrophysics Data System (ADS)
Chen, L.; Lai, C.; Marchewka, R.; Berry, R. M.; Tam, K. C.
2016-07-01
Structural colors and photoluminescence have been widely used for anti-counterfeiting and security applications. We report for the first time the use of CdS quantum dot (QD)-functionalized cellulose nanocrystals (CNCs) as building blocks to fabricate nanothin films via layer-by-layer (LBL) self-assembly for anti-counterfeiting applications. Both negatively- and positively-charged CNC/QD nanohybrids with a high colloidal stability and a narrow particle size distribution were prepared. The controllable LBL coating process was characterized by scanning electron microscopy and ellipsometry. The rigid structure of CNCs leads to nanoporous structured films on poly(ethylene terephthalate) (PET) substrates with high transmittance (above 70%) over the entire range of visible light and also resulted in increased hydrophilicity (contact angles of ~40 degrees). Nanothin films on PET substrates showed good flexibility and enhanced stability in both water and ethanol. The modified PET films with structural colors from thin-film interference and photoluminescence from QDs can be used in anti-counterfeiting applications. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr03039d
Study of fuel cell on-site, integrated energy systems in residential/commercial applications
NASA Technical Reports Server (NTRS)
Wakefield, R. A.; Karamchetty, S.; Rand, R. H.; Ku, W. S.; Tekumalla, V.
1980-01-01
Three building applications were selected for detailed study: a low-rise apartment building, a retail store, and a hospital. Building design data were then specified for each application, based on the design and construction of typical, actual buildings. Finally, a computerized building loads analysis program was used to estimate hourly end-use load profiles for each building. Conventional and fuel cell based energy systems were designed and simulated for each building in each location. Based on the results of a computer simulation of each energy system, levelized annual costs and annual energy consumption were calculated for all systems.
2013-12-10
intertidal vegetation. Comments from resource managers requested products incorporating bathymetry and sediment data. To further build on the... and availability of intertidal vegetation are other key factors in successful movement into the estuary for brown shrimp; both of these data were... distribution of intertidal vegetation. The NWI classes EEM1 and EEM2 are the two classes into which intertidal vegetation falls in Galveston. On the ground
Near Real-Time Earthquake Exposure and Damage Assessment: An Example from Turkey
NASA Astrophysics Data System (ADS)
Kamer, Yavor; Çomoǧlu, Mustafa; Erdik, Mustafa
2014-05-01
Confined by the infamous strike-slip North Anatolian Fault to the north and by the Hellenic subduction trench to the south, Turkey is one of the most seismically active countries in Europe. Due to this exposure and the fragility of its building stock, Turkey is among the countries most exposed to earthquake hazard in terms of mortality and economic losses. In this study we focus on recent and ongoing efforts to mitigate earthquake risk in near real-time. We present actual results for recent earthquakes, such as the M6 event off-shore Antalya which occurred on 28 December 2013. Starting at the moment of detection, we obtain a preliminary ground motion intensity distribution based on epicenter and magnitude. Our real-time application is further enhanced by the integration of the SeisComp3 ground motion parameter estimation tool with the Earthquake Loss Estimation Routine (ELER). SeisComp3 provides the online station parameters, which are then automatically incorporated into the ShakeMaps produced by ELER. The resulting ground motion distributions are used together with the building inventory to calculate the expected number of buildings in various damage states. All these analyses are conducted in an automated fashion and are communicated within a few minutes of a triggering event. In our efforts to disseminate earthquake information to the general public we make extensive use of social networks such as Twitter and collaborate with mobile phone operators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacommare, Kristina S H; Stadler, Michael; Aki, Hirohisa
The addition of storage technologies such as flow batteries, conventional batteries, and heat storage can improve the economic as well as environmental attractiveness of on-site generation (e.g., PV, fuel cells, reciprocating engines or microturbines operating with or without CHP) and contribute to enhanced demand response. In order to examine the impact of storage technologies on demand response and carbon emissions, a microgrid's distributed energy resources (DER) adoption problem is formulated as a mixed-integer linear program that has the minimization of annual energy costs as its objective function. By implementing this approach in the General Algebraic Modeling System (GAMS), the problem is solved for a given test year at representative customer sites, such as schools and nursing homes, to obtain not only the level of technology investment, but also the optimal hourly operating schedules. This paper focuses on analysis of storage technologies in DER optimization on a building level, with example applications for commercial buildings. Preliminary analysis indicates that storage technologies respond effectively to time-varying electricity prices, i.e., by charging batteries during periods of low electricity prices and discharging them during peak hours. The results also indicate that storage technologies significantly alter the residual load profile, which can contribute to lower carbon emissions depending on the test site, its load profile, and its adopted DER technologies.
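The charge-low/discharge-peak behavior the abstract describes can be illustrated with a toy schedule. This is a hedged sketch, not DER-CAM or its GAMS formulation: the price profile, battery size, and the simple sort-based heuristic (standing in for a true mixed-integer program) are all assumptions made for illustration.

```python
# Toy battery arbitrage schedule: charge during the cheapest hours,
# discharge during the most expensive ones. Not DER-CAM; all numbers
# are hypothetical.

def schedule_battery(prices, capacity_kwh, rate_kw):
    """Return an hourly charge(+)/discharge(-) plan in kW."""
    hours = sorted(range(len(prices)), key=lambda h: prices[h])
    n = int(capacity_kwh / rate_kw)          # hours needed for one full cycle
    cheap, dear = set(hours[:n]), set(hours[-n:])
    return [rate_kw if h in cheap else -rate_kw if h in dear else 0.0
            for h in range(len(prices))]

prices = [0.08] * 6 + [0.12] * 6 + [0.25] * 4 + [0.12] * 8   # $/kWh over 24 h
plan = schedule_battery(prices, capacity_kwh=20, rate_kw=5)
saving = -sum(p * q for p, q in zip(prices, plan))           # arbitrage gain, $
```

Even this crude heuristic reproduces the qualitative result the abstract reports: the battery flattens the residual load and earns a positive arbitrage value whenever prices vary over the day.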
Stochastic analysis of concentration field in a wake region.
Yassin, Mohamed F; Elmi, Abdirashid A
2011-02-01
Identifying the geographic locations in urban areas from which air pollutants enter the atmosphere is among the most important information needed to develop effective mitigation strategies for pollution control. Stochastic analysis is a powerful tool for estimating concentration fluctuations in plume dispersion in the wake region around buildings, yet only a few studies have evaluated its application to pollutant dispersion in urban areas. This study was designed to investigate the concentration fields in the wake region using an obstacle model, an isolated building. We measured concentration fluctuations on the centerline at various downwind distances from the source and at different heights, with a sampling frequency of 1 kHz. Concentration fields were analyzed stochastically using probability density functions (pdf). Stochastic analysis was performed on the concentration fluctuations and on the pdf of mean concentration, fluctuation intensity, and crosswind mean-plume dispersion. The pdf of the concentration fluctuation data showed significant non-Gaussian behavior. The lognormal distribution appeared to be the best fit to the shape of concentration measured in the boundary layer. We observed that the plume dispersion pdf near the source was shorter than that far from the source. Our findings suggest that the use of stochastic techniques in complex building environments can be a powerful tool to help understand the distribution and location of air pollutants.
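The lognormal fit mentioned above can be sketched in a few lines of standard-library Python: take the mean and standard deviation of the log-transformed concentrations and evaluate the lognormal pdf. The concentration samples below are synthetic; this illustrates only the fitting step, not the authors' analysis pipeline.

```python
# Fit a lognormal pdf to concentration-fluctuation samples by
# estimating the mean and std of the log-transformed data.
import math
import statistics

def fit_lognormal(samples):
    logs = [math.log(c) for c in samples]
    mu, sigma = statistics.fmean(logs), statistics.stdev(logs)
    def pdf(c):
        return (math.exp(-(math.log(c) - mu) ** 2 / (2 * sigma ** 2))
                / (c * sigma * math.sqrt(2 * math.pi)))
    return mu, sigma, pdf

conc = [0.4, 0.7, 1.1, 1.6, 2.3, 5.2, 8.1]  # ppm, synthetic samples
mu, sigma, pdf = fit_lognormal(conc)
```

Goodness of fit against a Gaussian alternative (e.g. via a chi-square or Kolmogorov-Smirnov test) is what would justify the non-Gaussian conclusion reported in the abstract.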
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
A Metallurgical Evaluation of the Powder-Bed Laser Additive Manufactured 4140 Steel Material
NASA Astrophysics Data System (ADS)
Wang, Wesley; Kelly, Shawn
2016-03-01
Using the laser powder bed fusion (PBF-L) additive manufacturing (AM) process for steel or iron powder has been attempted for decades. This work used a medium carbon steel (AISI 4140) powder to explore the feasibility of AM. The high carbon equivalent of 4140 steel (CEIIW ≈ 0.83) gives it a strong tendency toward cold cracking. As such, the process parameters must be carefully controlled to ensure AM build quality. Through an orthogonally designed experimental matrix, a laser-welding procedure was successfully developed to produce 4140 steel AM builds with no welding defects. In addition, the microstructure and micro-cleanliness of the as-welded PBF-L AM builds were also examined. The results showed an ultra-fine martensite lath structure and an ultra-clean internal quality with minimal oxide inclusion distribution. After optimizing the PBF-L AM process parameters, including the laser power and scan speed, the as-welded AM builds yielded an average tensile strength higher than 1482 MPa and an average 33 J Charpy V-notch impact toughness at -18°C. The surface quality, tensile strength, and Charpy V-notch impact toughness of the AM builds were comparable to wrought 4140 steel. The excellent mechanical properties of 4140 steel builds created by the PBF-L AM process make industrial production more feasible, showing great potential for application in the aerospace, automobile, and machinery industries.
Study on vulnerability matrices of masonry buildings of mainland China
NASA Astrophysics Data System (ADS)
Sun, Baitao; Zhang, Guixin
2018-04-01
The degree and distribution of damage to buildings subjected to earthquakes are a concern of the Chinese Government and the public. Seismic damage data indicate that the seismic capacities of different types of building structures in various regions throughout mainland China are different. Furthermore, the seismic capacities of the same type of structure in different regions may vary. The contributions of this research are summarized as follows: 1) Vulnerability matrices and earthquake damage matrices of masonry structures in mainland China were chosen as research samples. The aim was to analyze the differences in seismic capacities of sample matrices and to present general rules for categorizing seismic resistance. 2) Curves relating the percentage of damaged masonry structures with different seismic resistances subjected to seismic demand in different regions of seismic intensity (VI to X) have been developed. 3) A method has been proposed to build vulnerability matrices of masonry structures. The damage ratio for masonry structures under high-intensity events, such as the Ms 6.1 Panzhihua earthquake in Sichuan province on 30 August 2008, was calculated to verify the applicability of this method. This research offers a significant theoretical basis for predicting seismic damage and direct loss assessment of groups of buildings, as well as for earthquake disaster insurance.
Building Energy Management Open Source Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is the repository for Building Energy Management Open Source Software (BEMOSS), an open source operating system engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed cost-effective as it was built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high occupancy building can be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, become comfortable with its operation, and later expand the deployment to the entire building to make it more energy efficient.
(6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote monitoring with role-based access control. (7) Security – In addition to the built-in security features provided by VOLTTRON, BEMOSS provides enhanced security features, including a BEMOSS discovery approval process, encrypted core-to-node communication, a thermostat anti-tampering feature, and more. (8) Support from the Advisory Committee – BEMOSS was developed in consultation with an advisory committee from the beginning of the project. The BEMOSS advisory committee comprises representatives from 22 government and industry organizations.
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base comprises over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on six continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and the project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS.
The use of parallelism, caching, and code optimisation reduced software build time significantly (by several times), reduced environment setup time, increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
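The package-level build parallelism described above can be sketched as a topological walk of the dependency graph, building all currently independent packages concurrently. This is an illustrative sketch, not CMT code; the package names and dependencies are hypothetical.

```python
# Build independent packages in parallel by walking a dependency
# graph in topological order; packages and dependencies are made up.
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

deps = {"Core": set(), "Event": {"Core"}, "Reco": {"Core", "Event"},
        "Analysis": {"Reco"}, "Sim": {"Core"}}

built = []                                   # completed "builds", in order
ts = TopologicalSorter(deps)
ts.prepare()
with ThreadPoolExecutor() as pool:
    while ts.is_active():
        ready = list(ts.get_ready())         # packages with all deps built
        list(pool.map(lambda p: built.append(p), ready))  # build in parallel
        for p in ready:
            ts.done(p)
```

Packages in the same "ready" batch have no mutual dependencies, so they can safely occupy separate cores, which is the source of the multi-core utilisation gain the abstract reports.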
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
12 V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES ... 12 1. COMPONENT BUILD-UP IN DRAG... dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS Prediction methods can be classified into two main approaches: 1) Correlation methodologies... data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES 1. COMPONENT BUILD-UP IN DRAG The new correlation can be used for an engineering
NASA Astrophysics Data System (ADS)
Ishizawa, O. A.; Clouteau, D.
2007-12-01
The long duration, amplification, and spatial variability of the seismic records registered in Mexico City during the September 1985 earthquake cannot be explained by the soil velocity model alone. We try to explain these phenomena by studying the extent of the effect of the wave fields diffracted by buildings during an earthquake. The main question is whether the presence of a large number of buildings can significantly modify the seismic wave field. We are interested in the interaction between the incident wave field propagating in a stratified half-space and a large number of structures at the free surface, i.e., the coupled city-site effect. We study and characterize the seismic wave propagation regimes in a city using the theory of wave propagation in random media. In the coupled city-site system, the buildings are modeled as resonant scatterers uniformly distributed at the surface of a deterministic, horizontally layered elastic half-space representing the soil. Based on the mean-field and field correlation equations, we build a theoretical model which takes into account the multiple scattering of seismic waves and allows us to describe the behavior of the coupled city-site system in a simple and rapid way. The results obtained for the configurationally averaged field quantities are validated against 3D results for the seismic response of a deterministic model. The numerical simulations of this model are computed with the MISS3D code, based on classical Soil-Structure Interaction techniques and on a variational coupling between Boundary Integral Equations for a layered soil and a modal Finite Element approach for the buildings. This work proposes a detailed numerical and theoretical analysis of the city-site interaction (CSI) in the Mexico City area.
The principal parameters in the study of the CSI are the buildings' resonant frequency distribution, the soil characteristics of the site, the urban density and position of the buildings in the city, and the type of incident wave. The main results of the theoretical and numerical models allow us to characterize the seismic movement in urban areas.
Wind Tunnel Tests for Wind Pressure Distribution on Gable Roof Buildings
2013-01-01
Gable roof buildings are widely used as industrial buildings. Based on wind tunnel tests with rigid models, wind pressure distributions on gable roof buildings with different aspect ratios were measured simultaneously. Several characteristics of the measured wind pressure field on the surfaces of the models were analyzed, including mean wind pressure, fluctuating wind pressure, peak negative wind pressure, and the proper orthogonal decomposition of the measured wind pressure field. The results show that extremely high local suctions often occur at the leading edges of the longitudinal wall and windward roof, the roof corner, and the roof ridge, which are the locations most severely damaged under strong wind. The aspect ratio of the building has a certain effect on the mean wind pressure coefficients, and the effect depends on the wind attack angle. Compared with the experimental results, the region division of roof corner and roof ridge from AIJ2004 is more reasonable than those from CECS102:2002 and MBMA2006. The first several eigenvectors contribute most of the overall wind pressure distribution. The investigation offers some basic understanding for estimating wind load distributions on gable roof buildings and facilitates the wind-resistant design of cladding components and their connections considering the wind load path. PMID:24082851
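The basic tap statistics the study reports (mean, fluctuating, and peak negative pressure coefficients) can be sketched as follows; the pressure samples and reference dynamic pressure below are invented for illustration and are not the paper's data.

```python
# Mean, rms (fluctuating), and peak negative pressure coefficients
# from one tap's pressure time series; all values are synthetic.
import statistics

def cp_statistics(pressures_pa, q_ref_pa):
    cp = [p / q_ref_pa for p in pressures_pa]   # normalize by dynamic pressure
    cp_mean = statistics.fmean(cp)
    cp_rms = statistics.pstdev(cp)              # fluctuating component
    cp_peak_neg = min(cp)                       # strongest suction
    return cp_mean, cp_rms, cp_peak_neg

taps = [-120.0, -95.0, -210.0, -80.0, -330.0, -140.0]  # Pa, synthetic
mean_cp, rms_cp, peak_cp = cp_statistics(taps, q_ref_pa=300.0)
```

The peak negative coefficient is what governs cladding design at roof corners and ridges, which is why the abstract singles out those regions.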
Building integration of photovoltaic systems in cold climates
NASA Astrophysics Data System (ADS)
Athienitis, Andreas K.; Candanedo, José A.
2010-06-01
This paper presents some of the research activities on building-integrated photovoltaic (BIPV) systems developed by the Solar and Daylighting Laboratory at Concordia University. BIPV systems offer considerable advantages as compared to stand-alone PV installations. For example, BIPV systems can play a role as essential components of the building envelope. BIPV systems operate as distributed power generators using the most widely available renewable source. Since BIPV systems do not require additional space, they are especially appropriate for urban environments. BIPV/Thermal (BIPV/T) systems may use exterior air to extract useful heat from the PV panels, cooling them and thereby improving their electric performance. The recovered thermal energy can then be used for space heating and domestic hot water (DHW) heating, supporting the utilization of BIPV/T as an appropriate technology for cold climates. BIPV and BIPV/T systems are the subject of several ongoing research and demonstration projects (in both residential and commercial buildings) led by Concordia University. The concept of integrated building design and operation is at the centre of these efforts: BIPV and BIPV/T systems must be treated as part of a comprehensive strategy taking into account energy conservation measures, passive solar design, efficient lighting and HVAC systems, and integration of other renewable energy systems (solar thermal, heat pumps, etc.). The Concordia Solar Laboratory performs fundamental research on heat transfer and modeling of BIPV/T systems, numerical and experimental investigations on BIPV and BIPV/T in building energy systems and non-conventional applications (building-attached greenhouses), and the design and optimization of buildings and communities.
Applications of Optimal Building Energy System Selection and Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marnay, Chris; Stadler, Michael; Siddiqui, Afzal
2011-04-01
Berkeley Lab has been developing the Distributed Energy Resources Customer Adoption Model (DER-CAM) for several years. Given load curves for the energy services requirements of a building microgrid (μgrid), fuel costs and other economic inputs, and a menu of available technologies, DER-CAM finds the optimum equipment fleet and its optimum operating schedule using a mixed-integer linear programming approach. This capability is being applied using a software as a service (SaaS) model. Optimisation problems are set up on a Berkeley Lab server and clients can execute their jobs as needed, typically daily. The evolution of this approach is demonstrated by descriptions of three ongoing projects. The first is a public-access web site focused on solar photovoltaic generation and battery viability at large commercial and industrial customer sites. The second is a building CO2 emissions reduction operations problem for a University of California, Davis student dining hall, for which potential investments are also considered. The third is both a battery selection problem and a rolling operating schedule problem for a large county jail. Together these examples show that optimization of building μgrid design and operation can be effectively achieved using SaaS.
NASA Astrophysics Data System (ADS)
Van Uffelen, Marco; Berghmans, Francis; Brichard, Benoit; Borgermans, Paul; Decréton, Marc C.
2002-09-01
Optical fibers have attracted interest for many years for their potential use in various nuclear environments, both for radiation-tolerant and EMI-free data communication and for distributed sensing. Besides monitoring temperature and stress, measuring ionizing doses with optical fibers is particularly essential in applications such as long-term nuclear waste disposal monitoring and real-time aging monitoring of power and signal cables installed inside a reactor containment building. Two distinct options exist for optical fiber dosimetry: first, find an accurate model for a restricted application field that accounts for all the parameters that influence the radiation response of a standard fiber; or second, develop a dedicated fiber with a response that depends solely on the deposited energy. Using various models presented in the literature, we evaluate both standard commercially available and custom-made optical fibers under gamma radiation, particularly for distributed dosimetry applications with an optical time domain reflectometer (OTDR). We present the radiation-induced attenuation at near-infrared telecom wavelengths up to MGy total dose levels, with dose rates ranging from about 1 Gy/h up to 1 kGy/h, while the temperature was raised step-wise from 25 °C to 85 °C. Our results allow us to determine and compare the practical limitations of distributed dose measurements with both fiber types in terms of temperature sensitivity, dose estimation accuracy and spatial resolution.
Frequency distributions and correlations of solar X-ray flare parameters
NASA Technical Reports Server (NTRS)
Crosby, Norma B.; Aschwanden, Markus J.; Dennis, Brian R.
1993-01-01
Frequency distributions of flare parameters are determined from over 12,000 solar flares. The flare duration, the peak counting rate, the peak hard X-ray flux, the total energy in electrons, and the peak energy flux in electrons are among the parameters studied. Linear regression fits, as well as the slopes of the frequency distributions, are used to determine the correlations between these parameters. The relationship between the variations of the frequency distributions and the solar activity cycle is also investigated. Theoretical models for the frequency distribution of flare parameters are dependent on the probability of flaring and the temporal evolution of the flare energy build-up. The results of this study are consistent with stochastic flaring and exponential energy build-up. The average build-up time constant is found to be 0.5 times the mean time between flares.
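Slopes of flare frequency distributions like these are typically obtained by a linear regression of log(count density) versus log(parameter) over a logarithmically binned histogram. The sketch below generates synthetic "peak fluxes" from an assumed power law (index 1.8, an arbitrary choice, not the paper's fitted value) and recovers the slope the same way.

```python
import math, random

random.seed(1)
GAMMA = 1.8   # assumed true index of the differential distribution dN/dP ~ P**(-GAMMA)
# Inverse-transform sampling from P(>x) = x**(1 - GAMMA), x >= 1
fluxes = [(1.0 - random.random()) ** (-1.0 / (GAMMA - 1.0)) for _ in range(20000)]

# Histogram in logarithmic bins
edges = [10 ** (0.2 * k) for k in range(26)]
counts = [0] * (len(edges) - 1)
for f in fluxes:
    for i in range(len(counts)):
        if edges[i] <= f < edges[i + 1]:
            counts[i] += 1
            break

# Least-squares slope of log10(dN/dP) versus log10(P)
xs, ys = [], []
for i, c in enumerate(counts):
    if c >= 10:   # skip sparsely populated, noisy bins
        center = math.sqrt(edges[i] * edges[i + 1])
        xs.append(math.log10(center))
        ys.append(math.log10(c / (edges[i + 1] - edges[i])))
n, sx, sy = len(xs), sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print("fitted slope: %.2f (true index: -%.1f)" % (slope, GAMMA))
```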
Development and application of induced-strain actuators for building structures
NASA Astrophysics Data System (ADS)
Morita, Koichi; Fujita, Takafumi; Ise, Shiro; Kawaguchi, Ken-ichi; Kamada, Takayoshi; Fujitani, Hideo
2001-07-01
Induced-strain actuators (ISAs) can change their own shape in response to external electric/magnetic fields, and vice versa. Recently these materials have been widely used in small/precision devices. The objectives of this study are to develop smart members for buildings and to realize smart, comfortable and safe structures. The research items are: 1) semi-active isolation of structures using piezoelectric actuators, 2) use of ISA materials as sensors, and 3) improvement of the acoustic environment. A semi-active base isolation system with a controllable friction damper using piezoelectric actuators is proposed. A simulation study was carried out; with semi-active isolation, the response displacement of the structure could be reduced to 50% of the value obtained with passive isolation. ISA materials can act as sensors because they produce changes in electric or magnetic fields under deformation. PVDF sensors are suitable for membrane structures, and we evaluate their performance for membrane structures by experiment. Polymer-based ISA films or distributed ISA devices can control the vibration modes of plane members; applications to music halls or partition walls in dwellings are expected. Results of experimental studies of noise control are also discussed.
24 CFR 92.251 - Property standards.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., as applicable, one of three model codes: Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI); or the Council of American Building Officials (CABO) one or two...) Housing that is constructed or rehabilitated with HOME funds must meet all applicable local codes...
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Foudriat, E. C.
1991-01-01
A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems, and the effects of static locking and deadlocks on the performance predictions of distributed transactions, are also discussed. While the probability of deadlock is quite small, its effects on performance can be significant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-09-01
In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution and controls. The effects of imbalance include tenant discomfort, higher energy use intensity and inefficient building operation. This paper explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The research was conducted by The Partnership for Advanced Residential Retrofit (PARR) in conjunction with Elevate Energy. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperature in a multifamily hydronic building can vary by as much as 61 degrees F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. The average temperature spread at the building as a result of this retrofit decreased from 22.1 degrees F to 15.5 degrees F.
Möller, Thorsten; Schuldt, Heiko; Gerber, Andreas; Klusch, Matthias
2006-06-01
Healthcare digital libraries (DLs) increasingly make use of dedicated services to access functionality and/or data. Semantic (web) services enhance single services and facilitate compound services, thereby supporting advanced applications on top of a DL. The traditional process management approach tends to focus on process definition at build time rather than on actual service events at run time, and to anticipate failures in order to define appropriate strategies. This paper presents a novel approach in which service coordination is distributed among a set of agents. A dedicated component plans compound semantic services on demand for a particular application. In case of failure, the planner is reinvoked to define contingency strategies. Finally, matchmaking is performed at runtime by choosing the appropriate service provider. These combined technologies will provide key support for highly flexible next-generation DL applications. Such technologies are under development within CASCOM.
Le Floch, Jean-Michel; Fan, Y; Humbert, Georges; Shan, Qingxiao; Férachou, Denis; Bara-Maillet, Romain; Aubourg, Michel; Hartnett, John G; Madrangeas, Valerie; Cros, Dominique; Blondy, Jean-Marc; Krupka, Jerzy; Tobar, Michael E
2014-03-01
Dielectric resonators are key elements in many applications in micro to millimeter wave circuits, including ultra-narrow band filters and frequency-determining components for precision frequency synthesis. Distributed-layered and bulk low-loss crystalline and polycrystalline dielectric structures have become very important for building these devices. Proper design requires careful electromagnetic characterization of low-loss material properties. This includes exact simulation with precision numerical software and precise measurements of resonant modes. For example, we have developed the Whispering Gallery mode technique for microwave applications, which has now become the standard for characterizing low-loss structures. This paper presents some of the most common characterization techniques used in the micro to millimeter wave regime at room and cryogenic temperatures for designing high-Q dielectric loaded cavities.
Biology-Inspired Explorers for Space Systems
NASA Astrophysics Data System (ADS)
Ramohalli, Kumar; Lozano, Peter; Furfaro, Roberto
2002-01-01
Building upon three innovative technologies, each of which received an NTR award from NASA, a specific explorer is described. This "robot" does away with conventional gears, levers, and pulleys, using "muscle materials" instead; these shape-memory materials, formerly in the nickel-titanium family but now in the much wider class of ElectroActive Polymers (EAP), have the ability to respond precisely to pre-"programmed" shape changes upon application of an electrical input. Of course, the pre-"programs" are at the molecular level, much as in biological systems. Another important feature is distributed power: the power used in the "limbs" is distributed, so that if one "limb" should fail, the others can still function. The robot has been built and demonstrated to the media (newspapers and television). The fundamental control aspects are currently being worked on, and we expect to have a more complete mathematical description of its operation. Future plans and specific applications for reliable planetary exploration will be outlined.
Hierarchy Software Development Framework (h-dp-fwk) project
NASA Astrophysics Data System (ADS)
Zaytsev, A.
2010-04-01
The Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, and from the very beginning it targeted the case of building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude- and XML-schema-enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.
MEMS temperature scanner: principles, advances, and applications
NASA Astrophysics Data System (ADS)
Otto, Thomas; Saupe, Ray; Stock, Volker; Gessner, Thomas
2010-02-01
Contactless measurement of temperature has gained enormous significance in many application fields, ranging from climate protection and quality control to recognition of objects in public places or military objects. Measurement of linear or spatial temperature distributions is often necessary. For this purpose, mostly thermographic cameras or motor-driven temperature scanners are used today. Both are relatively expensive, and the motor-driven devices are additionally limited in scanning rate. An economical alternative is temperature scanner devices based on micromirrors. The micromirror, mounted in a simple optical setup, reflects the radiation emitted by the observed heat source onto an adapted detector. A line scan of the target object is obtained by periodic deflection of the micro scanner; a planar temperature distribution is achieved by perpendicularly moving the target object or the scanner device. Using Planck's radiation law, the temperature of the object is calculated. The device can be adapted to different temperature ranges and resolutions by using different detectors - cooled or uncooled - and configurable scanner parameters. With the basic configuration, 40 spatially distributed measuring points can be determined at temperatures in the range from 350 °C to 1000 °C. The achieved miniaturization of such scanners permits their use in complex plants with high building density or in direct proximity to the measuring point. The price advantage enables many applications, especially new ones in the low-price market segment. This paper shows the principle, setup, and application of a temperature measurement system based on micro scanners working in the near-infrared range. Packaging issues and measurement results are discussed as well.
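The Planck-law step mentioned above can be made concrete: at a single known wavelength, the blackbody spectral radiance formula can be inverted in closed form to recover temperature from the detector's radiance reading. The 1.5 µm band and the 900 °C source below are assumptions for illustration; a real scanner would also fold in emissivity and detector calibration.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(T, wavelength):
    """Blackbody spectral radiance (W * m^-3 * sr^-1) at temperature T (K)."""
    x = H * C / (wavelength * KB * T)
    return 2.0 * H * C**2 / wavelength**5 / (math.exp(x) - 1.0)

def temperature_from_radiance(L, wavelength):
    """Closed-form inversion of Planck's law for a single wavelength."""
    a = 2.0 * H * C**2 / (wavelength**5 * L)
    return H * C / (wavelength * KB * math.log(a + 1.0))

T_true = 900.0 + 273.15   # a source within the scanner's 350-1000 degC range
lam = 1.5e-6              # assumed near-infrared operating wavelength, 1.5 um
L = planck_radiance(T_true, lam)
print("recovered T = %.2f K" % temperature_from_radiance(L, lam))
```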
Simulation of probabilistic wind loads and building analysis
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.
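The core idea of simulating a wind pressure cumulative distribution function can be sketched with plain Monte Carlo: draw wind speeds from an assumed site climatology, convert each to a face pressure, and read probabilities off the empirical CDF. The Weibull parameters and pressure coefficient below are illustrative assumptions, not values from the MFIE/CLS model.

```python
import bisect, math, random

random.seed(42)
RHO = 1.225   # air density, kg/m^3
CP = 0.8      # assumed pressure coefficient for the windward face

def sample_pressure():
    """Draw a wind speed from a Weibull(k=2, c=8 m/s) climatology (assumed)
    by inverse-transform sampling, and convert it to a face pressure in Pa."""
    u = random.random()
    v = 8.0 * (-math.log(1.0 - u)) ** 0.5   # inverse Weibull CDF for k = 2
    return CP * 0.5 * RHO * v * v

samples = sorted(sample_pressure() for _ in range(100000))

def cdf(q):
    """Empirical CDF: probability that the face pressure stays below q."""
    return bisect.bisect_right(samples, q) / len(samples)

print("P(q < 100 Pa) = %.3f" % cdf(100.0))
```

A design load for a target non-exceedance probability is then just the corresponding quantile of `samples`.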
Application of BIM technology in green building material management system
NASA Astrophysics Data System (ADS)
Zhineng, Tong
2018-06-01
The current green building material management system in China's construction industry is imperfect, and many shortcomings remain. Active construction of a green building material management system based on BIM technology, combined with the characteristics of green building materials and their relationship to BIM technology applications, is urgently needed to better realize the scientific management of green building materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-09-01
In multifamily building hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution and controls. The effects of imbalance include tenant discomfort, higher energy use intensity and inefficient building operation. In this case study, the Partnership for Advanced Residential Retrofit and Elevate Energy explore cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs.
Chow, C W; Kuo, F M; Shi, J W; Yeh, C H; Wu, Y F; Wang, C H; Li, Y T; Pan, C L
2010-01-18
Fiber-to-the-antenna (FTTA) systems can be a cost-effective technique for distributing high-frequency signals from the head-end office to a number of remote antenna units via a passive optical splitter, propagating through low-loss and low-cost optical fibers. Here, we experimentally demonstrate an optical ultra-wideband (UWB) impulse radio (IR) FTTA system for in-building and in-home applications. The optical UWB-IR wireless link is operated in the W-band (75 GHz - 110 GHz) using our near-ballistic uni-traveling-carrier photodiode based photonic transmitter (PT) and a 10 GHz mode-locked laser. A 2.5 Gb/s UWB-IR FTTA system with a high split ratio of 1,024 and transmission over 300 m of optical fiber is demonstrated using direct PT modulation.
Applications of thermal infrared imagery for energy conservation and environmental surveys
NASA Technical Reports Server (NTRS)
Carney, J. R.; Vogel, T. C.; Howard, G. E., Jr.; Love, E. R.
1977-01-01
The survey procedures, developed during the winter and summer of 1976, employ color and color-infrared aerial photography, thermal infrared imagery, and a handheld infrared imaging device. The resulting imagery was used to detect building heat losses, deteriorated insulation in built-up building roofs, and defective underground steam lines. The handheld thermal infrared device, used in conjunction with the aerial thermal infrared imagery, provided a method for detecting and locating roof areas underlain with wet insulation. In addition, the handheld infrared device was employed to conduct a survey of a U.S. Army installation's electrical distribution system under full operating loads. This survey proved to be a cost-effective procedure for detecting faulty electrical insulators and connections that, if allowed to persist, could have resulted in both safety hazards and losses in production.
NASA Astrophysics Data System (ADS)
Ziegler, Hannes Moritz
Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits of integrating image classification and dasymetric mapping at the household level to provide detailed small-area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on NOAA mapping methods is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered by this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
A salient region detection model combining background distribution measure for indoor robots.
Li, Na; Xu, Hui; Wang, Zhenhua; Sun, Lining; Chen, Guodong
2017-01-01
Vision systems play an important role in the field of indoor robots. Saliency detection methods, which capture regions perceived as important, are used to improve the performance of visual perception systems. Most state-of-the-art methods for saliency detection perform outstandingly on natural images but cannot work in complicated indoor environments. Therefore, we propose a new method comprising graph-based RGB-D segmentation, a primary saliency measure, a background distribution measure, and their combination. In addition, region roundness is proposed to describe the compactness of a region in order to measure background distribution more robustly. To validate the proposed approach, eleven influential methods are compared on the DSD and ECSSD datasets. Moreover, we build a mobile robot platform for application in an actual environment and design three different kinds of experimental conditions: different viewpoints, illumination variations, and partial occlusions. Experimental results demonstrate that our model outperforms existing methods and is useful for indoor mobile robots.
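A standard way to score the compactness of a segmented region is the isoperimetric quotient, which is 1 for a circle and shrinks for elongated or ragged shapes; the paper's "region roundness" measure is presumably in this spirit, though its exact formula is not given in the abstract. A minimal sketch:

```python
import math

def compactness(area, perimeter):
    """Isoperimetric quotient 4*pi*A / P^2: 1.0 for a circle, smaller for
    elongated or ragged regions. (The paper's exact 'region roundness'
    definition may differ; this is the textbook compactness measure.)"""
    return 4.0 * math.pi * area / (perimeter * perimeter)

r = 10.0
circle = compactness(math.pi * r * r, 2.0 * math.pi * r)
square = compactness(100.0, 40.0)   # unit: a 10x10 square
print("circle: %.3f, square: %.3f" % (circle, square))
```

Background regions in cluttered indoor scenes tend to be large and ragged, so a low compactness score is evidence for background membership.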
Toropova, Alla P; Schultz, Terry W; Toropov, Andrey A
2016-03-01
Data on toxicity toward Tetrahymena pyriformis are an indicator of a substance's applicability in ecological and pharmaceutical contexts. Quantitative structure-activity relationships (QSARs) between the molecular structure of benzene derivatives and toxicity toward T. pyriformis (expressed as the negative logarithm of the population growth inhibition dose, mmol/L) are established. The available data were randomly distributed three times into visible training and calibration sets and invisible validation sets. The statistical characteristics for the validation set are the following: r(2)=0.8179 and s=0.338 (first distribution); r(2)=0.8682 and s=0.341 (second distribution); r(2)=0.8435 and s=0.323 (third distribution). These models are built up using only information on the molecular structure: no data on physicochemical parameters, 3D features of the molecular structure, or quantum mechanics descriptors are involved in the modeling process. Copyright © 2016 Elsevier B.V. All rights reserved.
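The two validation statistics quoted above, r² (squared correlation between observed and predicted values) and s (standard error of estimate), are straightforward to compute. A sketch with invented toy data, not the paper's benzene-derivative set:

```python
import math

def r2_and_s(observed, predicted):
    """Squared correlation coefficient and root-mean-square error,
    the two statistics the abstract reports per validation set."""
    n = len(observed)
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    vo = sum((o - mo) ** 2 for o in observed)
    vp = sum((p - mp) ** 2 for p in predicted)
    r2 = cov * cov / (vo * vp)
    s = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return r2, s

# Toy pIGC50-style values (hypothetical, for illustration only)
obs = [0.2, 0.8, 1.1, 1.9, 2.4]
pred = [0.3, 0.7, 1.3, 1.8, 2.5]
r2, s = r2_and_s(obs, pred)
print("r2 = %.4f, s = %.3f" % (r2, s))
```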
24 CFR 200.72 - Zoning, deed and building restrictions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Zoning, deed and building... Zoning, deed and building restrictions. The project when completed shall not violate any material zoning or deed restrictions applicable to the project site, and shall comply with all applicable building...
26 CFR 1.263(a)-0 - Outline of regulations under section 263(a).
Code of Federal Regulations, 2014 CFR
2014-04-01
... amounts paid for improvements. (e) Determining the unit of property. (1) In general. (2) Building. (i) In general. (ii) Application of improvement rules to a building. (A) Building structure. (B) Building system. (iii) Condominium. (A) In general. (B) Application of improvement rules to a condominium. (iv...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Nan; Marnay, Chris; Firestone, Ryan
The August 2003 blackout of the northeastern U.S. and Canada caused great economic losses and inconvenience to New York City and other affected areas. The blackout was a warning to the rest of the world that the ability of conventional power systems to meet growing electricity demand is questionable. Failure of large power systems can lead to serious emergencies. Introduction of on-site generation, renewable energy such as solar and wind power, and effective utilization of exhaust heat is needed to meet the growing energy demands of the residential and commercial sectors. Additional benefit can be achieved by integrating these distributed technologies into distributed energy resource (DER) systems. This work demonstrates a method for choosing and designing economically optimal DER systems. An additional purpose of this research is to establish a database of energy tariffs, DER technology cost and performance characteristics, and building energy consumption for Japan. This research builds on prior DER studies at the Ernest Orlando Lawrence Berkeley National Laboratory (LBNL) and with their associates in the Consortium for Electric Reliability Technology Solutions (CERTS), including the development of the microgrid concept and the DER selection optimization program, the Distributed Energy Resources Customer Adoption Model (DER-CAM). DER-CAM is a tool designed to find the optimal combination of installed equipment and an idealized operating schedule to minimize a site's energy bills, given performance and cost data on available DER technologies, utility tariffs, and site electrical and thermal loads over a test period, usually an historic year. Since hourly electric and thermal energy data are rarely available, they are typically developed by building simulation for each of six end-use loads used to model the building: electric-only loads, space heating, space cooling, refrigeration, water heating, and natural-gas-only loads.
DER-CAM provides a global optimization, albeit idealized, that shows how the necessary useful energy loads can be provided at minimum cost by selection and operation of on-site generation, heat recovery, cooling, and efficiency improvements. This study examines five prototype commercial buildings and uses DER-CAM to select the economically optimal DER system for each. The five building types are office, hospital, hotel, retail, and sports facility. Each building type was considered at both 5,000 and 10,000 square meter floor sizes. The energy consumption of these building types is based on building energy simulation and published literature. Based on the optimization results, energy conservation and emissions reduction were also evaluated. Furthermore, a comparison study between Japan and the U.S. has been conducted covering the effects of policy, technology, and utility tariffs on DER system installations. This study begins with an examination of existing DER research. Building energy loads were then generated through simulation (DOE-2) and scaled to match available load data in the literature. Energy tariffs in Japan and the U.S. were then compared: electricity prices did not differ significantly, while commercial gas prices in Japan are much higher than in the U.S. For smaller DER systems, the installation costs in Japan are more than twice those in the U.S., but this difference becomes smaller with larger systems. In Japan, DER systems are eligible for a 1/3 rebate of installation costs, while subsidies in the U.S. vary significantly by region and application. For 10,000 m² buildings, significant decreases in fuel consumption, carbon emissions, and energy costs were seen in the economically optimal results. This was most noticeable for the sports facility, followed by the hospital and hotel. This research demonstrates that office buildings can benefit from CHP, in contrast to popular opinion.
For hospitals and sports facilities, the use of waste heat is particularly effective for water and space heating. For the other building types, waste heat is most effectively used for both heating and cooling. The same examination was done for the 5,000 m² buildings. Although CHP installation capacity is smaller and the payback periods are longer, economic, fuel efficiency, and environmental benefits are still seen. While these benefits remain even when subsidies are removed, the increased installation costs lead to lower levels of installation capacity and thus lower benefits.
High-Throughput and Low-Latency Network Communication with NetIO
NASA Astrophysics Data System (ADS)
Schumacher, Jörn; Plessl, Christian; Vandelli, Wainer
2017-10-01
HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS exclusively target the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They make it possible to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath, this approach may not be very efficient compared to a direct use of native APIs. NetIO is a simple, novel asynchronous message service that can operate on Ethernet, Infiniband and similar network fabrics. In this paper the design and implementation of NetIO is presented and described, and its use is evaluated in comparison to other approaches. NetIO supports different high-level programming models and typical workloads of HEP applications. The ATLAS FELIX project [1] successfully uses NetIO as its central communication platform. The architecture of NetIO is described in this paper, including the user-level API and the internal data-flow design. The paper includes a performance evaluation of NetIO including throughput and latency measurements. The performance is compared against the state-of-the-art ZeroMQ message service. Performance measurements are performed in a lab environment with Ethernet and FDR Infiniband networks.
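The basic service a message layer like NetIO or ZeroMQ adds over a raw byte stream is message framing: the receiver gets whole messages, not arbitrary TCP chunks. A minimal sketch of that idea with length-prefixed frames over a local socket (this illustrates the concept only; NetIO's actual API and wire format differ):

```python
import socket, struct, threading

def send_msg(sock, payload: bytes):
    # Length-prefixed framing: 4-byte big-endian size, then the payload
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    (size,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, size)

# Demo: a one-shot echo server and client on localhost
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()
    send_msg(conn, recv_msg(conn))   # echo one message back
    conn.close()

t = threading.Thread(target=serve)
t.start()
client = socket.create_connection(("127.0.0.1", port))
send_msg(client, b"event fragment 42")
reply = recv_msg(client)
t.join(); client.close(); server.close()
print(reply)
```

A production message service layers asynchronous I/O, buffering, and fabric-specific transports (Verbs, OmniPath) under the same message-oriented interface.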
Intelligent Systems Technologies to Assist in Utilization of Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.; McConaughy, Gail; Lynnes, Christopher; McDonald, Kenneth; Kempler, Steven
2003-01-01
With the launch of several Earth observing satellites over the last decade, we are now in a data rich environment. From NASA's Earth Observing System (EOS) satellites alone, we are accumulating more than 3 TB per day of raw data and derived geophysical parameters. The data products are being distributed to a large user community comprising scientific researchers, educators and operational government agencies. Notable progress has been made in the last decade in facilitating access to data. However, to realize the full potential of the growing archives of valuable scientific data, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system. Potential Intelligent Archive concepts include: 1) Mining archived data holdings using Intelligent Data Understanding algorithms to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services involved in a scientific enterprise; 3) Recognizing the value of results, indexing and formatting them for easy access, and delivering them to concerned individuals; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building (i.e., the transformations from data to information to knowledge) instead of just data pipelining; and 5) Being aware of other nodes in the knowledge building system, participating in open systems interfaces and protocols for virtualization, and collaborative interoperability.
This paper presents some of these concepts and identifies issues to be addressed by research in future intelligent systems technology.
Distributed Optimization of Multi-Agent Systems: Framework, Local Optimizer, and Applications
NASA Astrophysics Data System (ADS)
Zu, Yue
Convex optimization problem can be solved in a centralized or distributed manner. Compared with centralized methods based on single-agent system, distributed algorithms rely on multi-agent systems with information exchanging among connected neighbors, which leads to great improvement on the system fault tolerance. Thus, a task within multi-agent system can be completed with presence of partial agent failures. By problem decomposition, a large-scale problem can be divided into a set of small-scale sub-problems that can be solved in sequence/parallel. Hence, the computational complexity is greatly reduced by distributed algorithm in multi-agent system. Moreover, distributed algorithm allows data collected and stored in a distributed fashion, which successfully overcomes the drawbacks of using multicast due to the bandwidth limitation. Distributed algorithm has been applied in solving a variety of real-world problems. Our research focuses on the framework and local optimizer design in practical engineering applications. In the first one, we propose a multi-sensor and multi-agent scheme for spatial motion estimation of a rigid body. Estimation performance is improved in terms of accuracy and convergence speed. Second, we develop a cyber-physical system and implement distributed computation devices to optimize the in-building evacuation path when hazard occurs. The proposed Bellman-Ford Dual-Subgradient path planning method relieves the congestion in corridor and the exit areas. At last, highway traffic flow is managed by adjusting speed limits to minimize the fuel consumption and travel time in the third project. Optimal control strategy is designed through both centralized and distributed algorithm based on convex problem formulation. Moreover, a hybrid control scheme is presented for highway network travel time minimization. 
Compared with the uncontrolled case and a conventional highway traffic control strategy, the proposed hybrid control strategy greatly reduces total travel time on the test highway network.
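The consensus-plus-gradient idea in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the quadratic local objectives, the ring topology, and the fixed step size are all assumptions chosen for simplicity.

```python
# Minimal sketch of distributed convex optimization: each agent mixes its
# estimate with its neighbors' (consensus), then takes a local gradient step.
# Quadratic objectives, ring topology, and fixed step are illustrative.
import random

def distributed_minimize(targets, rounds=2000, step=0.05):
    """Agent i holds f_i(x) = (x - targets[i])^2 and talks only to its two
    ring neighbors. The network jointly minimizes sum_i f_i, whose optimum
    is the mean of targets."""
    n = len(targets)
    x = [random.uniform(-10, 10) for _ in range(n)]  # local estimates
    for _ in range(rounds):
        # consensus step: average with the two ring neighbors
        mixed = [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3.0
                 for i in range(n)]
        # local gradient step on f_i: grad f_i(x) = 2 * (x - targets[i])
        x = [mixed[i] - step * 2.0 * (mixed[i] - targets[i])
             for i in range(n)]
    return x

estimates = distributed_minimize([1.0, 2.0, 3.0, 6.0])
# With a fixed step the agents settle in a small neighborhood of the
# global optimum mean(targets) = 3.0; no agent ever sees all the data.
```

A diminishing step size would drive all agents exactly to consensus at the optimum; the fixed step keeps the sketch short at the cost of a small residual spread.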
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Transformations, Inc., has extensive experience building high-performance homes - production and custom - in a variety of Massachusetts locations and uses mini-split heat pumps (MSHPs) for space conditioning in most of its homes. The use of MSHPs for simplified space-conditioning distribution provides significant first-cost savings, which offsets the increased investment in the building enclosure. In this project, the U.S. Department of Energy Building America team Building Science Corporation evaluated the long-term performance of MSHPs in 8 homes during a period of 3 years. The work examined electrical use of MSHPs, distributions of interior temperatures and humidity when using simplified (two-point) heating systems in high-performance housing, and the impact of open-door/closed-door status on temperature distributions.
Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta
2006-01-01
Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019
NASA Technical Reports Server (NTRS)
Huang, Norden E.
1999-01-01
A new method for analyzing nonlinear and nonstationary data has been developed. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same numbers of zero crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. The IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the Intrinsic Mode Functions yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the results is an energy-frequency-time distribution, designated as the Hilbert Spectrum. An example of the application of this method to earthquake and building response will be given. The results indicate that low-frequency components, totally missed by Fourier analysis, are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show that the new method offers much better temporal and frequency resolution.
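The Hilbert-spectral step described above - turning a signal into instantaneous frequency as a function of time - can be sketched with the FFT-based analytic signal. This is a minimal illustration of that step only (not the EMD sifting itself), and the 50 Hz test tone is an assumption chosen so the expected answer is known.

```python
# Sketch of the Hilbert step: build the analytic signal (one-sided
# spectrum via FFT) and read instantaneous frequency off the phase.
import numpy as np

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) of real signal x sampled at fs,
    via the discrete analytic signal x + i * Hilbert(x)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)                    # one-sided spectrum weights
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(X * h)
    phase = np.unwrap(np.angle(analytic))   # continuous phase
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.0 * t)       # pure 50 Hz tone
f_inst = instantaneous_frequency(x, fs)
# f_inst stays at 50 Hz at every instant - the per-sample time-frequency
# sharpness that the Hilbert Spectrum exploits for each IMF.
```

In the full method this is applied to each IMF produced by the sifting, and the resulting (time, frequency, energy) triples are binned into the Hilbert Spectrum.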
Cryogenic Controls for Fermilab's SRF Cavities and Test Facility
NASA Astrophysics Data System (ADS)
Norris, B.; Bossert, R.; Klebaner, A.; Lackey, S.; Martinez, A.; Pei, L.; Soyars, W.; Sirotenko, V.
2008-03-01
A new superconducting radio frequency (SRF) cavities test facility is now operational at Fermilab's Meson Detector Building (MDB). The Cryogenic Test Facility (CTF), located in a separate building 500 m away, supplies the facility with cryogens. The design incorporates ambient temperature pumping for superfluid helium production, as well as three 0.6 kW at 4.5 K refrigerators, five screw compressors, a helium purifier, helium and nitrogen inventory, a cryogenic distribution system, and a variety of test cryostats. To control and monitor the widely distributed cryogenic system, a flexible scheme has been developed. Both commercial and experimental physics tools are used. APACS+™, a process automation control system from Siemens-Moore, is at the heart of the design. APACS+™ allows engineers to configure an ever-evolving test facility while maintaining control over the plant and distribution system. APACS+™ nodes at CTF and MDB are coupled by a fiber optic network. DirectLogic205 PLCs by KOYO® are used as the field-level interface to most I/O. The top layer of this system uses EPICS (Experimental Physics and Industrial Control System) as a SCADA/HMI. Utilities for graphical display, control loop setting, real-time/historical plotting, and alarming have been implemented using the worldwide library of applications for EPICS. OPC client/server technology is used to bridge across each different platform. This paper presents this design and its successful implementation.
Epitrochoid Power-Law Nozzle Rapid Prototype Build/Test Project (Briefing Charts)
2015-02-01
Approved for public release; distribution is unlimited (PA clearance #15122). Briefing charts on the Epitrochoid Power-Law Nozzle rapid-prototype build/test effort, which builds on the SpaceX multi-engine approach (Merlin 1D engines on the Falcon 9 v1.1) to utilize advances in high-performance engines and the economies of scale of a multi-engine configuration through rapid prototyping.
NASA Astrophysics Data System (ADS)
Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.
2014-12-01
The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone-relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm report photographs to NWS weather forecast office social media pages via a mobile phone application has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, a processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) a processing and distribution software and hardware system, and iii) AWIPS-II data, exchange, and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time-stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network bandwidth-manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times, and directions by hour, similar to surface observations.
Hovering on individual PSRs would reveal photo thumbnails, and clicking on them would display the full-resolution photograph. Here, we present initial NWS forecaster feedback received from social media posted PSRs, motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.
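The PSR record described above can be sketched as a small serializable structure. All field names and the JSON wire format here are illustrative assumptions, not the actual AWIPS-II data format.

```python
# Hypothetical sketch of a geo-located, time-stamped Photo Storm Report
# payload; field names and JSON encoding are assumptions for illustration.
import json
from dataclasses import dataclass, asdict

@dataclass
class PhotoStormReport:
    latitude: float            # degrees north
    longitude: float           # degrees east
    view_direction_deg: float  # compass bearing the camera faced
    timestamp_utc: str         # ISO-8601 time the photo was taken
    weather_type: str          # e.g. "hail", "tornado", "flooding"
    comment: str
    photo_url: str             # server-side location of the full image

    def to_wire(self) -> str:
        """Serialize for transmission to the processing/distribution server."""
        return json.dumps(asdict(self), sort_keys=True)

report = PhotoStormReport(40.59, -105.08, 225.0, "2014-06-03T21:40:00Z",
                          "hail", "quarter-size hail, ongoing",
                          "https://example.org/psr/123.jpg")
wire = report.to_wire()
# The record round-trips cleanly, so a display plug-in can rebuild it.
restored = PhotoStormReport(**json.loads(wire))
```

The geo-location, view direction, and timestamp fields are exactly what the proposed AWIPS-II visualization plug-in would need to plot PSR locations and directions by hour.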
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Charles; Green, Andrew S.; Dahle, Douglas
2013-08-01
The findings of this study indicate that potential exists in non-building applications to save energy and costs. This potential could save billions of federal dollars, reduce reliance on fossil fuels, increase energy independence and security, and reduce greenhouse gas emissions. The Federal Government has nearly twenty years of experience with achieving similar energy cost reductions, letting the energy cost savings pay for themselves, by applying energy savings performance contracts (ESPCs) in its buildings. Currently, the application of ESPCs is limited by statute to federal buildings. This study indicates that ESPCs can be a compatible and effective contracting tool for achieving savings in non-building applications.
Rodgers, Kirsten C; Akintobi, Tabia; Thompson, Winifred Wilkins; Evans, Donoria; Escoffery, Cam; Kegler, Michelle C
2014-06-01
Community-engaged research is effective in addressing health disparities but may present challenges for both academic institutions and community partners. Therefore, the need to build capacity for conducting collaborative research exists. The purpose of this study is to present a model for building research capacity in academic-community partnerships. The Building Collaborative Research Capacity Model was developed as part of the Community Engagement Research Program (CERP) of the Atlanta Clinical and Translational Science Institute (ACTSI). Six domains of collaborative research capacity were identified and used to develop a model. Inputs, activities, outputs, and outcomes of building collaborative research capacity are described. To test this model, a competitive request for applications was widely distributed and four community-based organizations were funded to participate in a 2-year program with the aim of conducting a pilot study and submitting a research proposal for funding to National Institutes of Health or another major funding agency. During the first year, the community-based organization partners were trained on conducting collaborative research and matched with an academic partner from an ACTSI institution. Three of the academic-community partnerships submitted pilot study results and two submitted a grant proposal to a national agency. The Building Collaborative Research Capacity Model is an innovative approach to strengthening academic-community partnerships. This model will help build needed research capacity, serve as a framework for academicians and community partners, and lead to sustainable partnerships that improve community health. © 2013 Society for Public Health Education.
Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopher R. Johnson, Charles D. Hansen
2001-10-29
The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids", and CAVE technology and to add these to the teams that had developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.
The role of order in distributed programs
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Marzullo, Keith
1989-01-01
The role of order in building distributed systems is discussed. The central thesis is that a principle of event ordering underlies the wide range of operating-system mechanisms that have been put forward for building robust distributed software. Stated concisely, this principle achieves correct distributed behavior by ordering classes of distributed events that conflict with one another. By focusing on order, simplified descriptions and convincingly correct solutions can be obtained for problems that might otherwise have looked extremely complex. Moreover, it is observed that there are a limited number of ways to obtain order, and that the choice made has a great impact on performance.
Balancing Hydronic Systems in Multifamily Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruch, R.; Ludwig, P.; Maurer, T.
2014-07-01
In multifamily hydronic systems, temperature imbalance may be caused by undersized piping, improperly adjusted balancing valves, inefficient water temperature and flow levels, and owner/occupant interaction with the boilers, distribution, and controls. The effects of imbalance include tenant discomfort, higher energy use intensity, and inefficient building operation. This paper explores cost-effective distribution upgrades and balancing measures in multifamily hydronic systems, providing a resource to contractors, auditors, and building owners on best practices to improve tenant comfort and lower operating costs. The research was conducted by The Partnership for Advanced Residential Retrofit (PARR) in conjunction with Elevate Energy. The team surveyed existing knowledge on cost-effective retrofits for optimizing distribution in typical multifamily hydronic systems, with the aim of identifying common situations and solutions, and then conducted case studies on two Chicago area buildings with known balancing issues in order to quantify the extent of temperature imbalance. At one of these buildings a booster pump was installed on a loop to an underheated wing of the building. This study found that unit temperature in a multifamily hydronic building can vary by as much as 61 degrees F, particularly if windows are opened or tenants use intermittent supplemental heating sources like oven ranges. The average temperature spread at the building as a result of this retrofit decreased from 22.1 degrees F to 15.5 degrees F.
Matsuoka, Rei; Shimada, Atsushi; Komuro, Yasuaki; Sugita, Yuji; Kohda, Daisuke
2016-03-01
Contacts with neighboring molecules in protein crystals inevitably restrict the internal motions of intrinsically flexible proteins. The resultant clear electron densities permit model building, as crystallographic snapshot structures. Although these still images are informative, they could provide biased pictures of the protein motions. If the mobile parts are located at a site lacking direct contacts in rationally designed crystals, then the amplitude of the movements can be experimentally analyzed. We propose a fusion protein method, to create crystal contact-free space (CCFS) in protein crystals and to place the mobile parts in the CCFS. Conventional model building fails when large amplitude motions exist. In this study, the mobile parts appear as smeared electron densities in the CCFS, by suitable processing of the X-ray diffraction data. We applied the CCFS method to a highly mobile presequence peptide bound to the mitochondrial import receptor, Tom20, and a catalytically relevant flexible segment in the oligosaccharyltransferase, AglB. These two examples demonstrated the general applicability of the CCFS method to the analysis of the spatial distribution of motions within protein molecules. © 2016 The Protein Society.
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
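The modular idea described above - interchangeable process options for each stage of the hydrological cycle, assembled into one model - can be sketched as composable functions. The degree-day snow module and single-bucket soil store below are standard textbook simplifications used here as illustrative assumptions, not the package's actual components.

```python
# Minimal sketch of a modular hydrologic model: each process stage is a
# swappable module; run_model chains whichever modules the user selects.
# Module formulations and all parameter values are illustrative.

def degree_day_snow(state, precip, temp, melt_rate=3.0, t_thresh=0.0):
    """Accumulate snow below t_thresh; melt at melt_rate mm/degC/day."""
    if temp <= t_thresh:
        state["snowpack"] += precip
        return 0.0                        # no liquid water input today
    melt = min(state["snowpack"], melt_rate * (temp - t_thresh))
    state["snowpack"] -= melt
    return precip + melt                  # rain plus snowmelt

def bucket_soil_moisture(state, water_in, capacity=150.0, k=0.1):
    """Single storage: spill above capacity, drain a fraction k each day."""
    state["soil"] += water_in
    spill = max(0.0, state["soil"] - capacity)
    state["soil"] -= spill
    baseflow = k * state["soil"]
    state["soil"] -= baseflow
    return spill + baseflow               # simulated daily streamflow

def run_model(forcing, modules):
    """Drive the selected modules with a daily (precip mm, temp degC) series."""
    state = {"snowpack": 0.0, "soil": 50.0}
    flows = []
    for precip, temp in forcing:
        water = modules["snow"](state, precip, temp)
        flows.append(modules["soil"](state, water))
    return flows

forcing = [(10.0, -5.0), (0.0, -2.0), (5.0, 8.0), (0.0, 12.0)]
flows = run_model(forcing, {"snow": degree_day_snow,
                            "soil": bucket_soil_moisture})
# The warm third day releases the stored snowpack, producing a melt pulse.
```

Swapping in a different snow or soil formulation only requires passing a different callable, which is the practical payoff of organizing model options by hydrological-cycle stage.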
Singh, R; Joshi, P K; Kumar, M; Dash, P P; Joshi, B D
2009-08-01
Geospatial tools supported by ancillary geo-database and extensive fieldwork regarding the distribution of tiger and its prey in Anchankmar Wildlife Sanctuary (AMWLS) were used to build a tiger habitat suitability model. This consists of a quantitative geographical information system (GIS) based approach using field parameters and spatial thematic information. The estimates of tiger sightings, its prey sighting and predicted distribution with the assistance of contextual environmental data including terrain, road network, settlement and drainage surfaces were used to develop the model. Eight variables in the dataset viz., forest cover type, forest cover density, slope, aspect, altitude, and distance from road, settlement and drainage were seen as suitable proxies and were used as independent variables in the analysis. Principal component analysis and binomial multiple logistic regression were used for statistical treatments of collected habitat parameters from field and independent variables respectively. The assessment showed a strong expert agreement between the predicted and observed suitable areas. A combination of the generated information and published literature was also used while building a habitat suitability map for the tiger. The modeling approach has taken the habitat preference parameters of the tiger and potential distribution of prey species into account. For assessing the potential distribution of prey species, independent suitability models were developed and validated with the ground truth. It is envisaged that inclusion of the prey distribution probability strengthens the model when a key species is under question. The results of the analysis indicate that tiger occur throughout the sanctuary. The results have been found to be an important input as baseline information for population modeling and natural resource management in the wildlife sanctuary. 
The development and application of similar models can help in better management of the protected areas of national interest.
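The binomial logistic regression step used in the habitat-suitability model above can be sketched on synthetic data. The two stand-in predictors, the synthetic presence/absence data, and the plain gradient-ascent fit are illustrative assumptions, not the study's variables or method of estimation.

```python
# Sketch of binomial logistic regression for habitat suitability:
# P(presence) = sigmoid(w0 + w1 * density + w2 * distance). Synthetic
# data and the simple batch gradient-ascent fit are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 400
dens = rng.uniform(0, 1, n)        # stand-in for forest cover density
dist = rng.uniform(0, 5, n)        # stand-in for distance to road (km)
logit_true = -1.0 + 4.0 * dens + 0.8 * dist
presence = rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit_true))

X = np.column_stack([np.ones(n), dens, dist])
w = np.zeros(3)
for _ in range(5000):              # batch gradient ascent on log-likelihood
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.05 * X.T @ (presence - p) / n

def suitability(density, distance_km):
    """Predicted probability of presence for one map cell."""
    z = w[0] + w[1] * density + w[2] * distance_km
    return 1 / (1 + np.exp(-z))

# Dense forest far from roads scores as more suitable habitat than
# sparse forest near roads, matching the sign of the true coefficients.
```

Applied cell by cell over raster layers (forest cover, slope, distance surfaces), this produces the kind of continuous suitability map the abstract describes.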
Integrating distributed multimedia systems and interactive television networks
NASA Astrophysics Data System (ADS)
Shvartsman, Alex A.
1996-01-01
Recent advances in networks, storage, and video delivery systems are about to make commercial deployment of interactive multimedia services over digital television networks a reality. The emerging components individually have the potential to satisfy the technical requirements in the near future. However, no single vendor is offering a complete end-to-end, commercially deployable, and scalable interactive multimedia application system over digital/analog television systems. Integrating a large set of maturing sub-assemblies and interactive multimedia applications is a major task in deploying such systems. Here we deal with integration issues, requirements, and trade-offs in building delivery platforms and applications for interactive television services. Such integration efforts must overcome a lack of standards and deal with unpredictable development cycles and quality problems of leading-edge technology. There are also the conflicting goals of optimizing systems for video delivery while enabling highly interactive distributed applications. It is becoming possible to deliver continuous video streams from specific sources, but it is difficult and expensive to provide the ability to rapidly switch among multiple sources of video and data. Finally, there is the ever-present challenge of integrating and deploying expensive systems whose scalability and extensibility are limited, while ensuring some resiliency in the face of inevitable changes. This proceedings version of the paper is an extended abstract.
Spatiotemporal patterns of population distribution as crucial element for risk management
NASA Astrophysics Data System (ADS)
Gokesch, Karin; Promper, Catrin; van Westen, Cees J.; Glade, Thomas
2014-05-01
The spatiotemporal distribution and presence of the population in a certain area is a crucial element within natural hazard risk management, especially in the case of rapid onset hazard events and emergency management. When fast onset hazards such as earthquakes, flash floods, or industrial accidents occur, people may not have adequate time for evacuation, and emergency management requires a fast response and reaction. Therefore, information on the detailed distribution of people affected by a certain hazard is important for a fast assessment of the situation, including the number and type of people affected (distinguishing between elderly or handicapped people, children, working population, etc.). This study thus aims at analyzing population distribution on an hourly basis for different days, e.g. workday or holiday. The applied method combines a basic assessment of population distribution in a given area with specific location-related patterns of distribution changes over time. The calculations are based on detailed information regarding the expected presence of certain groups of people, e.g. school children, working, or elderly people, which all show different patterns of movement over certain time periods. The study area is the city of Waidhofen/Ybbs, located in the Alpine foreland in the southwest of Lower Austria. This city serves as a regional center providing basic infrastructure, shops, and schools for the surrounding countryside; as a result, many small and medium businesses are located in this area, showing a rather high variation in the population present at different times of the day. The available building footprint information was classified with respect to building type and occupancy type, which was used to estimate the expected residents within the buildings, based on the floorspace of the buildings and the average floorspace per person.
Additional information on the distribution and the average duration of stay of the people in these buildings was assessed using general population statistics and specific information about selected buildings, such as schools, hospitals or homes for the elderly, to calculate the distribution patterns for each group of people over time.
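The two estimation steps described above - residents from floorspace, then hourly redistribution by group-specific presence profiles - reduce to simple arithmetic. All numbers below (35 m² per person, the hourly presence fractions, the population shares) are illustrative assumptions, not the study's values.

```python
# Sketch of hourly building occupancy estimation: baseline residents from
# floorspace, scaled by group-specific presence profiles. All parameter
# values are illustrative assumptions.

AVG_FLOORSPACE_PER_PERSON = 35.0   # m^2 of residential floorspace

# Fraction of each group present at home at a given hour (workday).
PRESENCE = {
    "working":  {3: 0.95, 10: 0.15, 14: 0.20, 20: 0.85},
    "children": {3: 0.98, 10: 0.05, 14: 0.30, 20: 0.90},
    "elderly":  {3: 0.98, 10: 0.80, 14: 0.75, 20: 0.95},
}

def residents(floorspace_m2):
    """Baseline resident estimate for one residential building."""
    return floorspace_m2 / AVG_FLOORSPACE_PER_PERSON

def present_at(floorspace_m2, shares, hour):
    """Expected occupants at `hour`, given each group's population share."""
    base = residents(floorspace_m2)
    return sum(base * share * PRESENCE[group][hour]
               for group, share in shares.items())

shares = {"working": 0.55, "children": 0.20, "elderly": 0.25}
night = present_at(700.0, shares, 3)    # near-full occupancy at 03:00
midday = present_at(700.0, shares, 10)  # commuters and pupils are away
```

Special-use buildings (schools, hospitals, homes for the elderly) would get their own profiles rather than the residential one, which is exactly the refinement the study describes.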
Accessing and distributing EMBL data using CORBA (common object request broker architecture).
Wang, L; Rodriguez-Tomé, P; Redaschi, N; McNeil, P; Robinson, A; Lijnzaad, P
2000-01-01
The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by Persistence™, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems.
Accessing and distributing EMBL data using CORBA (common object request broker architecture)
Wang, Lichun; Rodriguez-Tomé, Patricia; Redaschi, Nicole; McNeil, Phil; Robinson, Alan; Lijnzaad, Philip
2000-01-01
Background: The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. Results: A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by Persistence™, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. Conclusions: The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems. PMID:11178259
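The "live object caching" with an evictor pattern described in the abstract - objects created on demand from backing storage and reclaimed when the cache grows too large - can be sketched as follows. The dict-backed "database", the LRU eviction policy, and all names are illustrative assumptions standing in for the Persistence/Oracle layer.

```python
# Sketch of an on-demand live object cache with an evictor: objects are
# loaded from backing storage only when requested, and least-recently-used
# entries are evicted past a capacity bound. Policy and names are assumed.
from collections import OrderedDict

class LiveObjectCache:
    def __init__(self, loader, capacity=3):
        self._loader = loader          # builds a live object from its id
        self._capacity = capacity
        self._live = OrderedDict()     # id -> live object, in LRU order

    def get(self, obj_id):
        if obj_id in self._live:
            self._live.move_to_end(obj_id)   # mark as recently used
            return self._live[obj_id]
        obj = self._loader(obj_id)           # create on demand
        self._live[obj_id] = obj
        self._evict()
        return obj

    def _evict(self):
        # Evictor pattern: discard least-recently-used entries past capacity.
        while len(self._live) > self._capacity:
            self._live.popitem(last=False)

database = {f"EMBL{i}": {"id": f"EMBL{i}", "seq": "ACGT" * i}
            for i in range(1, 6)}
loads = []
def loader(obj_id):
    loads.append(obj_id)               # count trips to backing storage
    return database[obj_id]

cache = LiveObjectCache(loader, capacity=2)
cache.get("EMBL1"); cache.get("EMBL2")
cache.get("EMBL1")                     # cache hit: no second load
cache.get("EMBL3")                     # evicts EMBL2, the LRU entry
```

A CORBA server fronting such a cache serves repeated client requests for the same entry from memory, touching the relational store only on misses.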
Application of remote thermal scanning to the NASA energy conservation program
NASA Technical Reports Server (NTRS)
Bowman, R. L.; Jack, J. R.
1977-01-01
Airborne thermal scans of all NASA centers were made during 1975 and 1976. The remotely sensed data were used to identify a variety of heat losses, including those from building roofs and central heating system distribution lines. Thermal imagery from several NASA centers is presented to demonstrate the capability of detecting these heat losses remotely. Many heat loss areas located by the scan data were verified by ground surveys. At this point, at least for such energy-intensive areas, thermal scanning is an excellent means of detecting many possible energy losses.
Building Cognition: The Construction of Computational Representations for Scientific Discovery.
Chandrasekharan, Sanjay; Nersessian, Nancy J
2015-11-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.
Mountain torrents: Quantifying vulnerability and assessing uncertainties
Totschnig, Reinhold; Fuchs, Sven
2013-01-01
Vulnerability assessment for elements at risk is an important component in the framework of risk assessment. The vulnerability of buildings affected by torrent processes can be quantified by vulnerability functions that express a mathematical relationship between the degree of loss of individual elements at risk and the intensity of the impacting process. Based on data from the Austrian Alps, we extended a vulnerability curve for residential buildings affected by fluvial sediment transport processes to other torrent processes and other building types. With respect to this goal to merge different data based on different processes and building types, several statistical tests were conducted. The calculation of vulnerability functions was based on a nonlinear regression approach applying cumulative distribution functions. The results suggest that there is no need to distinguish between different sediment-laden torrent processes when assessing vulnerability of residential buildings towards torrent processes. The final vulnerability functions were further validated with data from the Italian Alps and different vulnerability functions presented in the literature. This comparison showed the wider applicability of the derived vulnerability functions. The uncertainty inherent to regression functions was quantified by the calculation of confidence bands. The derived vulnerability functions may be applied within the framework of risk management for mountain hazards within the European Alps. The method is transferable to other mountain regions if the input data needed are available. PMID:27087696
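The abstract's core step - fitting a cumulative-distribution-type vulnerability function relating degree of loss to process intensity by nonlinear regression - can be sketched as follows. The Weibull CDF form, the synthetic observations, and the coarse grid-search fit are illustrative assumptions, not the paper's data or estimator.

```python
# Sketch of a vulnerability-function fit: degree of loss in [0, 1] as a
# cumulative distribution function of process intensity, fitted by least
# squares. Functional form, data, and fit method are assumptions.
import math

def weibull_cdf(intensity, scale, shape):
    """Degree of loss in [0, 1] as a function of process intensity."""
    return 1.0 - math.exp(-((intensity / scale) ** shape))

# (deposit height in m, observed degree of loss), synthetic examples
observations = [(0.2, 0.03), (0.5, 0.12), (1.0, 0.35),
                (1.5, 0.55), (2.0, 0.75), (3.0, 0.93)]

def fit(observations):
    """Coarse grid search over (scale, shape) minimizing squared error."""
    best = (None, None, float("inf"))
    for scale in [s / 10.0 for s in range(5, 40)]:
        for shape in [k / 10.0 for k in range(5, 40)]:
            err = sum((weibull_cdf(i, scale, shape) - loss) ** 2
                      for i, loss in observations)
            if err < best[2]:
                best = (scale, shape, err)
    return best

scale, shape, err = fit(observations)
# The fitted curve rises monotonically from 0 toward 1, as a vulnerability
# function must, and can be evaluated at any process intensity.
```

The paper's confidence bands would come from the uncertainty of the regression itself; a bootstrap over the observation pairs is one simple way to approximate them under these assumptions.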
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
This report defines a model view definition for exchanging electrical distribution system data from building information models (BIM) at the coordinated design stage of building construction. It builds on an open standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, and on the Construction Operations Building information exchange (COBie) specifications.
Alesi, Patrick
2008-04-01
This paper follows the development of the business continuity planning (BCP) programme at Lehman Brothers after the events of September 11th. Previous attempts to implement a 'traditional' form of BCP had been ineffective, but following the events, the firm began to look at BCP in a new light. This paper deals with three main themes: creating a culture of resiliency, leveraging technology, and building flexible plans. Distributing accountability for BCP to business line managers, integrating BCP change management into the normal course of business, and providing every employee with personalised BCP information breeds a culture of resiliency in which people are empowered to react to events without burdensome, hierarchical response and recovery procedures. Building a strong relationship with one's application development community can result in novel, customised BCP solutions; existing systems and data structures can be used to enhance an existing BCP. Even the best plans are often challenged by events; understanding that flexibility is essential to effective incident response is a critical element in the development of a proper business continuity plan.
2013-03-18
Stability and degradation mechanisms of metal–organic frameworks containing the Zr6O4(OH)4 secondary building unit. Approved for public release; distribution is unlimited.
The Arbo‑zoonet Information System.
Di Lorenzo, Alessio; Di Sabatino, Daria; Blanda, Valeria; Cioci, Daniela; Conte, Annamaria; Bruno, Rossana; Sauro, Francesca; Calistri, Paolo; Savini, Lara
2016-06-30
The Arbo‑zoonet Information System has been developed as part of the 'International Network for Capacity Building for the Control of Emerging Viral Vector Borne Zoonotic Diseases (Arbo‑zoonet)' project. The project aims to create common knowledge by sharing data, expertise, experiences, and scientific information on West Nile Disease (WND), Crimean‑Congo haemorrhagic fever (CCHF), and Rift Valley fever (RVF). These arthropod‑borne diseases of domestic and wild animals can affect humans, posing a great threat to public health. Since November 2011, when Schmallenberg virus (SBV) was first detected in Northern Europe, the Arbo‑zoonet Information System has been used to collect information on the newly discovered disease and to manage the epidemic emergency. The system monitors the geographical distribution and epidemiological evolution of CCHF, RVF, and WND since 1946. More recently, it has also been deployed to monitor SBV data. The Arbo‑zoonet Information System includes a web application for the management of the database in which data are stored and a WebGIS application to explore spatial disease distributions, facilitating epidemiological analysis. The WebGIS application is an effective tool to display and share the information and to facilitate the exchange and dissemination of relevant data among the project's participants.
Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments
NASA Astrophysics Data System (ADS)
Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin
The developments in computer technology in the last decade change the ways of computer utilization. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments and especially its user interfaces a challenging and time-consuming task. We propose a model-based approach, which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language, defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as running example to illustrate the approach.
Mapping Applications Center, National Mapping Division, U.S. Geological Survey
1996-01-01
The Mapping Applications Center (MAC), National Mapping Division (NMD), is the eastern regional center for coordinating the production, distribution, and sale of maps and digital products of the U.S. Geological Survey (USGS). It is located in the John Wesley Powell Federal Building in Reston, Va. The MAC's major functions are to (1) establish and manage cooperative mapping programs with State and Federal agencies; (2) perform new research in preparing and applying geospatial information; (3) prepare digital cartographic data, special purpose maps, and standard maps from traditional and classified source materials; (4) maintain the domestic names program of the United States; (5) manage the National Aerial Photography Program (NAPP); (6) coordinate the NMD's publications and outreach programs; and (7) direct the USGS map printing operations.
Desktop supercomputer: what can it do?
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Degtyarev, A.; Korkhov, V.
2017-12-01
The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
Nanostructures and functional materials fabricated by interferometric lithography.
Xia, Deying; Ku, Zahyun; Lee, S C; Brueck, S R J
2011-01-11
Interferometric lithography (IL) is a powerful technique for the definition of large-area, nanometer-scale, periodically patterned structures. Patterns are recorded in a light-sensitive medium, such as a photoresist, that responds nonlinearly to the intensity distribution associated with the interference of two or more coherent beams of light. The photoresist patterns produced with IL are a platform for further fabrication of nanostructures and growth of functional materials and are building blocks for devices. This article provides a brief review of IL technologies and focuses on various applications for nanostructures and functional materials based on IL including directed self-assembly of colloidal nanoparticles, nanophotonics, semiconductor materials growth, and nanofluidic devices. Perspectives on future directions for IL and emerging applications in other fields are presented.
75 FR 60423 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-30
... November 1, 2010. Title and OMB Number: DoD Building Pass Application; DD Form 2249; OMB Number 0704-0328... applicants for DoD Building Passes. The information collected from the DD Form 2249, ``DoD Building Pass...
NASA Astrophysics Data System (ADS)
Kar, Rony; Dalui, Sujit Kumar
2016-03-01
The variation of pressure on the faces of an octagonal plan shaped tall building due to interference from three square plan shaped tall buildings of the same height is analysed with a computational fluid dynamics module, namely ANSYS CFX, for a 0° wind incidence angle only. All the buildings are closely spaced (the distance between two buildings varies from 0.4 h to 2 h, where h is the height of the building). Different cases, depending upon the various positions of the square plan shaped buildings, are analysed and compared with the octagonal plan shaped building in isolated condition. The comparison is presented in the form of interference factors (IF) and IF contours. Abnormal pressure distribution is observed in some cases. Shielding and channelling effects on the octagonal plan shaped building due to the presence of the interfering buildings are also noted. In the interfering condition the pressure distribution on the faces of the octagonal plan shaped building is not predictable. As the distance between the principal octagonal plan shaped building and the third square plan shaped interfering building increases, the behaviour of the faces becomes more systematic. The pressure coefficient (C p) for each face of the octagonal plan shaped building in each interfering case can be found by multiplying the IF by the C p of the isolated case.
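The closing statement amounts to a simple per-face relation, Cp(interfered) = IF × Cp(isolated). A minimal sketch, in which the face labels, pressure coefficients, and interference factors are all hypothetical values, not results from the study:

```python
# Hypothetical isolated-case pressure coefficients for faces of the octagonal
# building, and hypothetical interference factors (IF) for one configuration.
cp_isolated = {"A": 0.78, "B": 0.42, "C": -0.55, "D": -0.63}
interference_factor = {"A": 0.91, "B": 1.12, "C": 0.85, "D": 1.05}

# Interfered-case coefficient per face: Cp_interfered = IF * Cp_isolated.
cp_interfered = {face: interference_factor[face] * cp_isolated[face]
                 for face in cp_isolated}
```

Because IF is a multiplicative factor, suction faces (negative Cp) remain suction faces; only the magnitude changes.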
Simulation of concentration distribution of urban particles under wind
NASA Astrophysics Data System (ADS)
Chen, Yanghou; Yang, Hangsheng
2018-02-01
Excessive concentrations of particulate matter in the air seriously affect people's health, and particle concentrations in densely populated towns are especially high. Understanding the distribution of particles in the air helps people avoid them passively. The concentration distribution of particles in urban streets is simulated using the FLUENT software, with the analysis based on the Discrete Phase Modelling (DPM) capability of FLUENT. Simulation results show that the distribution of the particles is shaped by the different layouts of buildings, and that the windward area of a building and the leeward sides of a high-rise building are the areas with high particle concentrations. Understanding the particle concentrations in different areas also helps people to avoid high-concentration areas and reduce their exposure.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... and Environmental Design Green Building Rating System of the U.S. Green Building Council. Alternatives... International Falls, MN AGENCY: Public Buildings Service, General Services Administration (GSA). ACTION: Notice...: October 14, 2011. FOR FURTHER INFORMATION CONTACT: Donald R. Melcher, Jr., GSA Public Buildings Service...
ASSESSMENT OF VENTILATION RATES IN 100 BASE OFFICE BUILDINGS
The U.S. EPA's Office of Radiation and Indoor Air studied 100 public and private office buildings across the U.S. from 1994-1998. The purpose of the study, entitled The Building Assessment Survey and Evaluation Study (BASE), was to: a) provide a distribution of IAQ, building, and...
Building Security and Personal Safety. SPEC Kit 150.
ERIC Educational Resources Information Center
Bingham, Karen Havill
This report on a survey of Association of Research Libraries (ARL) member libraries on building security and personal safety policies examines three areas in detail: (1) general building security (access to the building, key distribution, patrols or monitors, intrusion prevention, lighting, work environment after dark); (2) problem behavior…
Center for the Built Environment: Research on Building HVAC Systems
Research topics include lessons learned from field studies; the Underfloor Air Distribution (UFAD) Cooling Airflow Design Tool, which develops simplified design tools for the optimization of underfloor systems; UFAD cost studies; near-ZNE buildings; a Setpoint Energy Savings Calculator; and UFAD case studies.
A support architecture for reliable distributed computing systems
NASA Technical Reports Server (NTRS)
Dasgupta, Partha; Leblanc, Richard J., Jr.
1988-01-01
The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept for structuring software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, R.
1993-01-01
The key elements in the 1992-93 period of the project are the following: (1) extensive use of the simulator to implement and test concurrency control algorithms, an interactive user interface, and replica control algorithms; and (2) investigations into the applicability of data and process replication in real-time systems. In the 1993-94 period of the project, we intend to accomplish the following: (1) concentrate on efforts to investigate the effects of data and process replication on hard and soft real-time systems; in particular, we will concentrate on the impact of semantic-based consistency control schemes on a distributed real-time system in terms of improved reliability, improved availability, better resource utilization, and reduced missed task deadlines; and (2) use the prototype to verify the theoretically predicted performance of locking protocols, etc.
Research into a distributed fault diagnosis system and its application
NASA Astrophysics Data System (ADS)
Qian, Suxiang; Jiao, Weidong; Lou, Yongjian; Shen, Xiaomei
2005-12-01
CORBA (Common Object Request Broker Architecture) is a solution for distributed computing over heterogeneous systems that establishes a communication protocol between distributed objects, with great emphasis on interoperation between them. However, only after developing suitable application approaches and practical technology for monitoring and diagnosis can customers share monitoring and diagnosis information, so that remote online multi-expert cooperative diagnosis can be achieved. This paper aims at building an open fault monitoring and diagnosis platform combining CORBA, the Web, and agents. Heterogeneous diagnosis objects interoperate in independent threads through the CORBA soft-bus, enabling resource sharing and online multi-expert cooperative diagnosis, and overcoming disadvantages such as a lack of diagnosis knowledge, reliance on a single diagnosis technique, and incomplete analysis functions, so that more complicated and deeper diagnosis can be carried out. Taking a high-speed centrifugal air compressor set as an example, we demonstrate distributed diagnosis based on CORBA. This shows that, by integrating CORBA, Web technology, and an agent frame model, more efficient approaches can be found to problems such as real-time monitoring and diagnosis over the network and the decomposition of complicated tasks. In this system, a multi-diagnosis intelligent agent helps improve diagnosis efficiency. In addition, the system offers an open environment, making it easy for diagnosis objects to be upgraded and for new diagnosis server objects to join.
Zuo, Wangda; Wetter, Michael; Tian, Wei; ...
2015-07-13
Here, this paper describes a coupled dynamic simulation of an indoor environment with heating, ventilation, and air conditioning (HVAC) systems, controls and building envelope heat transfer. The coupled simulation can be used for the design and control of ventilation systems with stratified air distributions. Those systems are commonly used to reduce building energy consumption while improving the indoor environment quality. The indoor environment was simulated using the fast fluid dynamics (FFD) simulation programme. The building fabric heat transfer, HVAC and control system were modelled using the Modelica Buildings library. After presenting the concept, the mathematical algorithm and the implementation of the coupled simulation were introduced. The coupled FFD–Modelica simulation was then evaluated using three examples of room ventilation with complex flow distributions with and without feedback control. Lastly, further research and development needs were also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; Wetter, Michael; Tian, Wei
Here, this paper describes a coupled dynamic simulation of an indoor environment with heating, ventilation, and air conditioning (HVAC) systems, controls and building envelope heat transfer. The coupled simulation can be used for the design and control of ventilation systems with stratified air distributions. Those systems are commonly used to reduce building energy consumption while improving the indoor environment quality. The indoor environment was simulated using the fast fluid dynamics (FFD) simulation programme. The building fabric heat transfer, HVAC and control system were modelled using the Modelica Buildings library. After presenting the concept, the mathematical algorithm and the implementation of the coupled simulation were introduced. The coupled FFD–Modelica simulation was then evaluated using three examples of room ventilation with complex flow distributions with and without feedback control. Lastly, further research and development needs were also discussed.
Laloš, Jernej; Babnik, Aleš; Možina, Janez; Požar, Tomaž
2016-03-01
The near-field, surface-displacement waveforms in plates are modeled using interwoven concepts of Green's function formalism and streamlined Huygens' principle. Green's functions resemble the building blocks of the sought displacement waveform, superimposed and weighted according to the simplified distribution. The approach incorporates an arbitrary circular spatial source distribution and an arbitrary circular spatial sensitivity in the area probed by the sensor. The displacement histories for uniform, Gaussian and annular normal-force source distributions and the uniform spatial sensor sensitivity are calculated, and the corresponding weight distributions are compared. To demonstrate the applicability of the developed scheme, measurements of laser ultrasound induced solely by the radiation pressure are compared with the calculated waveforms. The ultrasound is induced by laser pulse reflection from the mirror-surface of a glass plate. The measurements show excellent agreement not only with respect to various wave-arrivals but also in the shape of each arrival. Their shape depends on the beam profile of the excitation laser pulse and its corresponding spatial normal-force distribution. Copyright © 2015 Elsevier B.V. All rights reserved.
M13 Bacteriophage-Based Self-Assembly Structures and Their Functional Capabilities.
Moon, Jong-Sik; Kim, Won-Geun; Kim, Chuntae; Park, Geun-Tae; Heo, Jeong; Yoo, So Y; Oh, Jin-Woo
2015-06-01
Controlling the assembly of basic structural building blocks in a systematic and orderly fashion is an emerging issue in various areas of science and engineering such as physics, chemistry, material science, biological engineering, and electrical engineering. The self-assembly technique, among many other kinds of ordering techniques, has several unique advantages and the M13 bacteriophage can be utilized as part of this technique. The M13 bacteriophage (Phage) can easily be modified genetically and chemically to demonstrate specific functions. This allows for its use as a template to determine the homogeneous distribution and percolated network structures of inorganic nanostructures under ambient conditions. Inexpensive and environmentally friendly synthesis can be achieved by using the M13 bacteriophage as a novel functional building block. Here, we discuss recent advances in the application of M13 bacteriophage self-assembly structures and the future of this technology.
M13 Bacteriophage-Based Self-Assembly Structures and Their Functional Capabilities
Moon, Jong-Sik; Kim, Won-Geun; Kim, Chuntae; Park, Geun-Tae; Heo, Jeong; Yoo, So Y; Oh, Jin-Woo
2015-01-01
Controlling the assembly of basic structural building blocks in a systematic and orderly fashion is an emerging issue in various areas of science and engineering such as physics, chemistry, material science, biological engineering, and electrical engineering. The self-assembly technique, among many other kinds of ordering techniques, has several unique advantages and the M13 bacteriophage can be utilized as part of this technique. The M13 bacteriophage (Phage) can easily be modified genetically and chemically to demonstrate specific functions. This allows for its use as a template to determine the homogeneous distribution and percolated network structures of inorganic nanostructures under ambient conditions. Inexpensive and environmentally friendly synthesis can be achieved by using the M13 bacteriophage as a novel functional building block. Here, we discuss recent advances in the application of M13 bacteriophage self-assembly structures and the future of this technology. PMID:26146494
[Carbon footprint of buildings in the urban agglomeration of central Liaoning, China].
Shi, Yu; Yun, Ying Xia; Liu, Chong; Chu, Ya Qi
2017-06-18
With the development of urbanization in China, buildings consume large amounts of material and energy, and estimating the carbon emissions of buildings is an important scientific problem. The carbon footprint of the central Liaoning urban agglomeration was studied with the carbon footprint approach, geographic information system (GIS) and high-resolution remote sensing (HRRS) technology. The results showed that the construction carbon footprint coefficient of the central Liaoning urban agglomeration was 269.16 kg·m⁻². The approach of interpreting total building area and spatial distribution with HRRS was effective, with an accuracy of 89%. The extraction approach was critical for estimating the total carbon footprint and its spatial distribution. The building area and total carbon footprint of the central Liaoning urban agglomeration, in descending order, were Shenyang, Anshan, Fushun, Liaoyang, Yingkou, Tieling and Benxi. The annual average increment of the footprint from 2011 to 2013, in descending order, was Shenyang, Benxi, Fushun, Anshan, Tieling, Yingkou and Liaoyang. Accurate estimation of the construction carbon footprint and its spatial distribution is significant for the planning and optimization of carbon emission reduction.
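The footprint estimate described above is essentially the product of the HRRS-interpreted building area and the reported coefficient of 269.16 kg·m⁻². A minimal sketch, in which the city names and building areas are hypothetical placeholders:

```python
# Construction carbon footprint coefficient reported for the central Liaoning
# urban agglomeration, in kg CO2 per m^2 of building area.
COEFFICIENT_KG_PER_M2 = 269.16

# Hypothetical interpreted building areas per city, in m^2 (not study values).
building_area_m2 = {"CityA": 1.2e8, "CityB": 0.7e8}

# Footprint per city in tonnes: area * coefficient, converted from kg.
footprint_t = {city: area * COEFFICIENT_KG_PER_M2 / 1000.0
               for city, area in building_area_m2.items()}
```

Because the coefficient is a single agglomeration-wide constant, the ranking of cities by footprint follows directly from the ranking by interpreted building area, as the abstract's two ordered lists reflect.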
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
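The core of steps (3)-(5) can be sketched as a Monte Carlo loop: assign distributions to uncertain inputs, sample them, and propagate each sample through the deterministic model. The dose model and the distributions below are hypothetical stand-ins for illustration, not the RESRAD equations or its default parameter distributions.

```python
import random

random.seed(42)  # reproducible sampling

def dose_model(thickness_m, conc_bq_per_kg):
    # Hypothetical deterministic dose calculation (illustrative only):
    # dose grows with contamination and with source thickness up to a cap.
    return 0.004 * conc_bq_per_kg * min(thickness_m, 0.5)

samples = []
for _ in range(10_000):
    thickness = random.triangular(0.05, 0.6, 0.15)  # assumed input distribution
    conc = random.lognormvariate(5.0, 0.4)          # assumed input distribution
    samples.append(dose_model(thickness, conc))

samples.sort()
mean_dose = sum(samples) / len(samples)
p95_dose = samples[int(0.95 * len(samples))]
```

The output is a distribution of doses rather than a single number, from which summary statistics such as the mean and the 95th percentile can be reported, which is the kind of uncertainty information the probabilistic modules add to the deterministic codes.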
An Improved Simulation of the Diurnally Varying Street Canyon Flow
NASA Astrophysics Data System (ADS)
Yaghoobian, Neda; Kleissl, Jan; Paw U, Kyaw Tha
2012-11-01
The impact of diurnal variation of temperature distribution over building and ground surfaces on the wind flow and scalar transport in street canyons is numerically investigated using the PArallelized LES Model (PALM). The Temperature of Urban Facets Indoor-Outdoor Building Energy Simulator (TUF-IOBES) is used for predicting urban surface heat fluxes as boundary conditions for a modified version of PALM. TUF-IOBES dynamically simulates indoor and outdoor building surface temperatures and heat fluxes in an urban area taking into account weather conditions, indoor heat sources, building and urban material properties, composition of the building envelope (e.g. windows, insulation), and HVAC equipment. Temperature (and heat flux) distribution over urban surfaces of the 3-D raster-type geometry of TUF-IOBES makes it possible to provide realistic, high resolution boundary conditions for the numerical simulation of flow and scalar transport in an urban canopy. Compared to some previous analyses using uniformly distributed thermal forcing associated with urban surfaces, the present analysis shows that resolving non-uniform thermal forcings can provide more detailed and realistic patterns of the local air flow and pollutant dispersion in urban canyons.
NASA Astrophysics Data System (ADS)
Li, Xiaodan; Ni, Jiaqiang; Zhu, Qingfeng; Su, Hang; Cui, Jianzhong; Zhang, Yifei; Li, Jianzhong
2017-11-01
AlSi10Mg alloy samples with dimensions of 14 × 14 × 91 mm were produced by the selective laser melting (SLM) method in different building directions. The structures, and the properties at -70 °C, of the samples in the different directions were investigated. The results show that the structure differs with building direction: fish-scale structures appear on the side along the building direction, while oval structures appear on the side perpendicular to the building direction. Some pores with a maximum size of 100 μm exist in the structure. The build orientation has no major influence on the tensile properties: the tensile strength and elongation of the sample in the building direction are 340 MPa and 11.2%, respectively, while those of the sample perpendicular to the building direction are 350 MPa and 13.4%, respectively.
NASA Astrophysics Data System (ADS)
Foulser-Piggott, R.; Saito, K.; Spence, R.
2012-04-01
Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. 
The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.
CompatPM: enabling energy efficient multimedia workloads for distributed mobile platforms
NASA Astrophysics Data System (ADS)
Nathuji, Ripal; O'Hara, Keith J.; Schwan, Karsten; Balch, Tucker
2007-01-01
The computation and communication abilities of modern platforms are enabling increasingly capable cooperative distributed mobile systems. An example is distributed multimedia processing of sensor data in robots deployed for search and rescue, where a system manager can exploit the application's cooperative nature to optimize the distribution of roles and tasks in order to successfully accomplish the mission. Because of limited battery capacities, a critical task a manager must perform is online energy management. While support for power management has become common for the components that populate mobile platforms, what is lacking is integration and explicit coordination across the different management actions performed in a variety of system layers. This paper develops an integration approach for distributed multimedia applications, where a global manager specifies both a power operating point and a workload for a node to execute. Surprisingly, when jointly considering power and QoS, experimental evaluations show that using a simple deadline-driven approach to assigning frequencies can be non-optimal. These trends are further affected by certain characteristics of underlying power management mechanisms, which in our research, are identified as groupings that classify component power management as "compatible" (VFC) or "incompatible" (VFI) with voltage and frequency scaling. We build on these findings to develop CompatPM, a vertically integrated control strategy for power management in distributed mobile systems. Experimental evaluations of CompatPM indicate average energy improvements of 8% when platform resources are managed jointly rather than independently, demonstrating that previous attempts to maximize battery life by simply minimizing frequency are inappropriate from a platform-level perspective.
Autonomic Management in a Distributed Storage System
NASA Astrophysics Data System (ADS)
Tauber, Markus
2010-07-01
This thesis investigates the application of autonomic management to a distributed storage system. Effects on performance and resource consumption were measured in experiments, which were carried out in a local area test-bed. The experiments were conducted with components of one specific distributed storage system, but seek to be applicable to a wide range of such systems, in particular those exposed to varying conditions. The perceived characteristics of distributed storage systems depend on their configuration parameters and on various dynamic conditions. For a given set of conditions, one specific configuration may be better than another with respect to measures such as resource consumption and performance. Here, configuration parameter values were set dynamically and the results compared with a static configuration. It was hypothesised that under non-changing conditions this would allow the system to converge on a configuration that was more suitable than any that could be set a priori. Furthermore, the system could react to a change in conditions by adopting a more appropriate configuration. Autonomic management was applied to the peer-to-peer (P2P) and data retrieval components of ASA, a distributed storage system. The effects were measured experimentally for various workload and churn patterns. The management policies and mechanisms were implemented using a generic autonomic management framework developed during this work. The experimental evaluations of autonomic management show promising results, and suggest several future research topics. The findings of this thesis could be exploited in building other distributed storage systems that focus on harnessing storage on user workstations, since these are particularly likely to be exposed to varying, unpredictable conditions.
Chilled water study EEAP program for Walter Reed Army Medical Center: Book 2. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-02-01
The Energy Engineering Analysis Program (EEAP) Study for Walter Reed Army Medical Center (WRAMC) was to provide a thorough examination of the central chilled water plants on site. WRAMC is comprised of seventy-one (71) buildings located on a 113-acre site in Washington, D.C. There are two (2) central chilled water plants (Buildings 48 and 49), each with a primary chilled water distribution system. In addition to the two (2) central plants, three (3) buildings utilize their own independent chillers. Two (2) of the independent chillers (Buildings 7 and T-2), one of which is inoperative (T-2), are smaller air-cooled units, while the third (Building 54) has a 1,900-ton chilled water plant comprised of three (3) centrifugal chillers. Of the two (2) central chilled water plants, Building 48 houses six (6) chillers totalling 7,080 tons of cooling and Building 49 houses one (1) chiller with 660 tons of cooling. The total chiller cooling capacity available on site is 9,840 tons. The chilled water systems were reviewed for alternative ways of conserving energy on site and reducing the peak-cooling load. Distribution systems were reviewed to determine which buildings were served by each of the chilled water plants and to determine chilled water usage on site. Evaluations were made of building exterior and interior composition in order to estimate cooling loads. Interviews with site personnel helped Entech better understand the chilled water plants, the distribution systems, and how each system was utilized.
NASA Astrophysics Data System (ADS)
Ren, Jianlin; Cao, Xiaodong; Liu, Junjie
2018-04-01
Passengers usually spend hours in the airport terminal buildings waiting for their departure. During the long waiting period, ambient fine particles (PM2.5) and ultrafine particles (UFP) generated by airliners may penetrate into terminal buildings through open doors and the HVAC system. However, limited data are available on passenger exposure to particulate pollutants in terminal buildings. We conducted on-site measurements of PM2.5 and UFP concentration and the particle size distribution in the terminal building of Tianjin Airport, China during three different seasons. The results showed that the PM2.5 concentrations in the terminal building were considerably higher than the guideline values of the Chinese standard and the WHO in all of the tested seasons, and the conditions were significantly affected by the outdoor air (Spearman test, p < 0.01). The indoor/outdoor PM2.5 ratios (I/O) ranged from 0.67 to 0.84 in the arrival hall and 0.79 to 0.96 in the departure hall. The particle number concentration in the terminal building presented a bi-modal size distribution, with one mode at 30 nm and another at 100 nm. These results differ markedly from the size distribution measured in a normal urban environment. The total UFP exposure during the whole waiting period (including in the terminal building and airliner cabin) of a passenger is approximately equivalent to 11 h of exposure to normal urban environments. This study is expected to contribute to the improvement of indoor air quality and health of passengers in airport terminal buildings.
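The I/O ratio reported here is the statistic relating indoor to outdoor concentrations. A minimal sketch of how it is computed from paired samples (the concentration values below are made up for illustration, not the study's data):

```python
# Sketch: indoor/outdoor (I/O) PM2.5 ratio from paired measurements,
# the statistic reported for the arrival and departure halls.
# Concentration values are illustrative only.

def io_ratio(indoor, outdoor):
    """Mean ratio of paired indoor/outdoor concentrations (ug/m^3)."""
    pairs = [(i, o) for i, o in zip(indoor, outdoor) if o > 0]
    return sum(i / o for i, o in pairs) / len(pairs)

indoor_pm25 = [48.0, 60.0, 35.0, 52.0]    # hypothetical hall samples
outdoor_pm25 = [64.0, 75.0, 50.0, 61.0]   # hypothetical ambient samples
print(f"I/O ratio: {io_ratio(indoor_pm25, outdoor_pm25):.2f}")
```

A ratio below 1 indicates partial attenuation by the building envelope and HVAC filtration; values near 1, as in the departure hall, indicate that outdoor particles penetrate almost unimpeded.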
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
NASA Technical Reports Server (NTRS)
Dhaliwal, Swarn S.
1997-01-01
An investigation was undertaken to build the software foundation for the WHERE (Web-based Hyper-text Environment for Requirements Engineering) project. The TCM (Toolkit for Conceptual Modeling) was chosen as the foundation software for the WHERE project, which aims to provide an environment for facilitating collaboration among geographically distributed people involved in the Requirements Engineering process. The TCM is a collection of diagram and table editors and has been implemented in the C++ programming language. The C++ implementation of the TCM was translated into Java in order to allow the editors to be used for building various functionality of the WHERE project; the WHERE project intends to use the Web as its communication backbone. One of the limitations of the translated software (TcmJava), which militated against its use in the WHERE project, was its persistent data management mechanism, inherited from the original TCM, which was designed for standalone applications. Before TcmJava editors could be used as part of the multi-user, geographically distributed applications of the WHERE project, a persistent storage mechanism had to be built which would allow data communication over the Internet, using the capabilities of the Web. An approach involving features of Java, CORBA (Common Object Request Broker), the Web, a middle-ware (Java Relational Binding (JRB)), and a database server was used to build the persistent data management infrastructure for the WHERE project. The developed infrastructure allows a TcmJava editor to be downloaded and run from a network host by using a JDK 1.1 (Java Developer's Kit) compatible Web-browser. The aforementioned editor establishes a connection with a server by using the ORB (Object Request Broker) software and stores/retrieves data in/from the server. The server consists of a CORBA object or objects depending upon whether the data is to be made persistent on a single server or multiple servers.
The CORBA object providing the persistent data server is implemented using the Java programming language. It uses the JRB to store/retrieve data in/from a relational database server. The persistent data management system provides transaction and user management facilities which allow multi-user, distributed access to the stored data in a secure manner.
Very fast road database verification using textured 3D city models obtained from airborne imagery
NASA Astrophysics Data System (ADS)
Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie
2014-10-01
Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV-images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of roads. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer Theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. 
Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are: pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
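The two-distribution combination described above can be sketched in a few lines. One plausible reading, consistent with the abstract but assumed here, is Dempster-Shafer discounting: the mass the road model cannot claim (not applicable) is transferred to the "unknown" outcome.

```python
# Sketch of mapping a road-state distribution (correct / incorrect)
# and a model-state distribution (applicable / not applicable) to the
# three outcomes {correct, incorrect, unknown}. This reads the
# combination as Dempster-Shafer discounting -- an assumed, simplified
# reading of the paper's method, not its exact formulation.

def combine(p_correct, p_applicable):
    """Mass the model cannot assign (not applicable) becomes 'unknown'."""
    return {
        "correct":   p_applicable * p_correct,
        "incorrect": p_applicable * (1.0 - p_correct),
        "unknown":   1.0 - p_applicable,
    }

m = combine(p_correct=0.9, p_applicable=0.7)
print(m)
assert abs(sum(m.values()) - 1.0) < 1e-9   # masses form a distribution
```

Under this reading, a confident verdict from an inapplicable model still lands largely in "unknown", which is what routes those objects to interactive verification.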
Parker, J L; Larson, R R; Eskelson, E; Wood, E M; Veranth, J M
2008-10-01
Particle count-based size distribution and PM(2.5) mass were monitored inside and outside an elementary school in Salt Lake City (UT, USA) during the winter atmospheric inversion season. The site is influenced by urban traffic and the airshed is subject to periods of high PM(2.5) concentration that is mainly submicron ammonium and nitrate. The school building has mechanical ventilation with filtration and variable-volume makeup air. Comparison of the indoor and outdoor particle size distribution on the five cleanest and five most polluted school days during the study showed that the ambient submicron particulate matter (PM) penetrated the building, but indoor concentrations were about one-eighth of outdoor levels. The indoor:outdoor PM(2.5) mass ratio averaged 0.12 and the particle number ratio for sizes smaller than 1 µm averaged 0.13. The indoor submicron particle count and indoor PM(2.5) mass increased slightly during pollution episodes but remained well below outdoor levels. When the building was occupied, the indoor coarse particle count was much higher than ambient levels. These results contribute to understanding the relationship between ambient monitoring station data and the actual human exposure inside institutional buildings. The study confirms that staying inside a mechanically ventilated building reduces exposure to outdoor submicron particles. This study supports the premise that remaining inside buildings during PM pollution episodes reduces exposure to submicron PM. New data on a mechanically ventilated institutional building supplements similar studies made in residences.
Building heating and cooling applications thermal energy storage program overview
NASA Technical Reports Server (NTRS)
Eissenberg, D. M.
1980-01-01
Thermal energy storage technology development for building heating and cooling applications in the residential and commercial sectors is outlined. Three program elements are identified: applications assessment, technology development, and demonstration. Emphasis is given to utility load-management thermal energy system applications, with the stress on the 'customer side of the meter'. Thermal storage subsystems for space conditioning, along with conservation by means of increased thermal mass within the building envelope and low-grade waste heat recovery, are also covered.
R&D Opportunities for Membranes and Separation Technologies in Building Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goetzler, William; Guernsey, Matt; Bargach, Youssef
2016-06-16
This report recommends innovative membrane and separation technologies that can assist the Building Technologies Office in achieving its 2030 goal. It identifies research and development (R&D) initiatives across several building applications where further investigation could result in impactful savings.
Turbulence over a 2D building array using high-resolution CFD and a distributed drag force approach
Department of Mechanical Engineering, University …
…procedure. The predictive capabilities of the high-resolution computational fluid dynamics (CFD) simulations of urban flow are validated against a very…
Marketing and Distributive Education Curriculum Guide: Hardware-Building Materials, Farm and Garden.
ERIC Educational Resources Information Center
Cluck, Janice Bora
Designed to be used with the General Marketing Curriculum Planning Guide (ED 156 860), this guide is intended to provide the curriculum coordinator with a basis for planning a comprehensive program in the field of marketing for farm and garden hardware building materials; it is designed also to allow marketing and distributive education…
Preparation of fine powdered composite for latent heat storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fořt, Jan, E-mail: jan.fort.1@fsv.cvut.cz; Trník, Anton, E-mail: anton.trnik@fsv.cvut.cz; Pavlíková, Milena, E-mail: milena.pavlikova@fsv.cvut.cz
Application of latent heat storage building envelope systems using phase-change materials represents an attractive method of storing thermal energy and has the advantages of high energy storage density and the isothermal nature of the storage process. This study deals with the preparation of a new type of powdered phase change composite material for thermal energy storage. The idea of the composite is based upon the impregnation of a natural silicate material by a reasonably priced, commercially produced pure phase change material, forming a homogeneous powdered composite structure. For the preparation of the composite, the vacuum impregnation method is used. The particle size distribution assessed by the laser diffraction apparatus shows that incorporation of the organic phase change material into the structure of inorganic siliceous pozzolana does not lead to clustering of the particles. The compatibility of the prepared composite is characterized by Fourier transform infrared analysis (FTIR). DSC analysis shows the potential of the developed composite for thermal energy storage that can be easily incorporated into the cement-based matrix of building materials. Based on the obtained results, application of the developed phase change composite can be considered with great promise.
First Operational Experience With a High-Energy Physics Run Control System Based on Web Technologies
NASA Astrophysics Data System (ADS)
Bauer, Gerry; Beccati, Barbara; Behrens, Ulf; Biery, Kurt; Branson, James; Bukowiec, Sebastian; Cano, Eric; Cheung, Harry; Ciganek, Marek; Cittolin, Sergio; Coarasa Perez, Jose Antonio; Deldicque, Christian; Erhan, Samim; Gigi, Dominique; Glege, Frank; Gomez-Reino, Robert; Gulmini, Michele; Hatton, Derek; Hwong, Yi Ling; Loizides, Constantin; Ma, Frank; Masetti, Lorenzo; Meijers, Frans; Meschi, Emilio; Meyer, Andreas; Mommsen, Remigius K.; Moser, Roland; O'Dell, Vivian; Oh, Alexander; Orsini, Luciano; Paus, Christoph; Petrucci, Andrea; Pieri, Marco; Racz, Attila; Raginel, Olivier; Sakulin, Hannes; Sani, Matteo; Schieferdecker, Philipp; Schwick, Christoph; Shpakov, Dennis; Simon, Michal; Sumorok, Konstanty; Yoon, Andre Sungho
2012-08-01
Run control systems of modern high-energy particle physics experiments have requirements similar to those of today's Internet applications. The Compact Muon Solenoid (CMS) collaboration at CERN's Large Hadron Collider (LHC) therefore decided to build the run control system for its detector based on web technologies. The system is composed of Java Web Applications distributed over a set of Apache Tomcat servlet containers that connect to a database back-end. Users interact with the system through a web browser. The present paper reports on the successful scaling of the system from a small test setup to the production data acquisition system that comprises around 10,000 applications running on a cluster of about 1600 hosts. We report on operational aspects during the first phase of operation with colliding beams including performance, stability, integration with the CMS Detector Control System and tools to guide the operator.
Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.
Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat
2007-12-01
In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
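The response-shaping step can be sketched as follows: query results are wrapped in XML, then reshaped into a client-specific view. Element names and the transform below are hypothetical; Pangea-LE applies XSL stylesheets, approximated here by a plain Python function for brevity.

```python
# Sketch of an integration engine's XML response shaping. All element
# and field names are invented for illustration; they are not
# Pangea-LE's actual message schema.

import xml.etree.ElementTree as ET

def build_response(rows):
    """Wrap query result rows in a generic XML response message."""
    root = ET.Element("response")
    for row in rows:
        rec = ET.SubElement(root, "record")
        for field, value in row.items():
            ET.SubElement(rec, field).text = str(value)
    return root

def to_summary(root):
    """Client-specific view: keep only patient id and report date
    (stands in for an XSL transformation selected on demand)."""
    out = ET.Element("summary")
    for rec in root.findall("record"):
        ET.SubElement(out, "item",
                      id=rec.findtext("patient_id"),
                      date=rec.findtext("date"))
    return out

rows = [{"patient_id": "P001", "date": "2007-03-14", "text": "..."}]
summary = to_summary(build_response(rows))
print(ET.tostring(summary, encoding="unicode"))
```

Keeping the source-to-XML extraction separate from the per-client transform is what lets one engine serve heterogeneous viewer applications without touching the underlying systems.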
Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.
Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat
2006-01-01
Pangea-LE is a message-oriented light-weight integration engine, allowing concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and serves it to the requesting client applications in a flexible XML format. This XML response message can be formatted on demand by the appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case where Pangea-LE collects and generates "on the fly" a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real environment has been a notable success due to the non-invasive approach, which fully respects the existing information systems.
PhamDB: a web-based application for building Phamerator databases.
Lamine, James G; DeJong, Randall J; Nelesen, Serita M
2016-07-01
PhamDB is a web application which creates databases of bacteriophage genes, grouped by gene similarity. It is backwards compatible with the existing Phamerator desktop software while providing an improved database creation workflow. Key features include a graphical user interface, validation of uploaded GenBank files, and the ability to import phages from existing databases, modify existing databases and queue multiple jobs. Source code and installation instructions for Linux, Windows and Mac OSX are freely available at https://github.com/jglamine/phage. PhamDB is also distributed as a docker image which can be managed via Kitematic. This docker image contains the application and all third-party software dependencies as a pre-configured system, and is freely available via the installation instructions provided. Contact: snelesen@calvin.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Application of BIM Technology in Prefabricated Buildings
NASA Astrophysics Data System (ADS)
Zhanglin, Guo; Si, Gao; Jun-e, Liu
2017-08-01
The development of prefabricated buildings has become the main trend in the development of China's modern construction industry. As the main tool of building informatization, BIM (building information modeling) has greatly promoted the development of the construction industry. Based on a review of recent papers on prefabricated buildings and BIM technology, this paper analyzes the advantages of prefabricated buildings and BIM technology, then explores the application of BIM technology in prefabricated buildings. It aims to realize rational, science-based project lifecycle management in prefabricated construction projects and, finally, to form a coherent information platform for the prefabricated building.
Deist, Timo M; Jochems, A; van Soest, Johan; Nalbantov, Georgi; Oberije, Cary; Walsh, Seán; Eble, Michael; Bulens, Paul; Coucke, Philippe; Dries, Wim; Dekker, Andre; Lambin, Philippe
2017-06-01
Machine learning applications for personalized medicine are highly dependent on access to sufficient data. For personalized radiation oncology, datasets representing the variation in the entire cancer patient population need to be acquired and used to learn prediction models. Ethical and legal boundaries to ensure data privacy hamper collaboration between research institutes. We hypothesize that data sharing is possible without identifiable patient data leaving the radiation clinics and that building machine learning applications on distributed datasets is feasible. We developed and implemented an IT infrastructure in five radiation clinics across three countries (Belgium, Germany, and The Netherlands). We present here a proof-of-principle for future 'big data' infrastructures and distributed learning studies. Lung cancer patient data was collected in all five locations and stored in local databases. Exemplary support vector machine (SVM) models were learned using the Alternating Direction Method of Multipliers (ADMM) from the distributed databases to predict post-radiotherapy dyspnea grade [Formula: see text]. The discriminative performance was assessed by the area under the curve (AUC) in a five-fold cross-validation (learning on four sites and validating on the fifth). The performance of the distributed learning algorithm was compared to centralized learning where datasets of all institutes are jointly analyzed. The euroCAT infrastructure has been successfully implemented in five radiation clinics across three countries. SVM models can be learned on data distributed over all five clinics. Furthermore, the infrastructure provides a general framework to execute learning algorithms on distributed data. The ongoing expansion of the euroCAT network will facilitate machine learning in radiation oncology. The resulting access to larger datasets with sufficient variation will pave the way for generalizable prediction models and personalized medicine.
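The core idea above is that model updates, not patient records, cross site boundaries. A minimal sketch: each clinic computes a local gradient on its own data, and only the aggregated gradient leaves the site. (The paper trains SVMs with ADMM; this plain gradient exchange on a one-parameter least-squares model is a simplified stand-in, and the site data below are invented.)

```python
# Sketch of distributed learning without centralizing data: gradients
# computed locally at each site are aggregated into one global update.
# Simplified stand-in for the paper's ADMM-trained SVMs; all data are
# hypothetical.

def local_gradient(w, data):
    """Gradient of 0.5*(w*x - y)^2 summed over one site's records."""
    gx = sum((w * x - y) * x for x, y in data)
    return gx, len(data)

def distributed_step(w, sites, lr=0.05):
    grads = [local_gradient(w, d) for d in sites]   # runs at each site
    total_g = sum(g for g, _ in grads)              # only gradients move
    total_n = sum(n for _, n in grads)
    return w - lr * total_g / total_n

# Three hypothetical sites, all consistent with y = 2*x (no noise)
sites = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)], [(0.5, 1.0), (4.0, 8.0)]]
w = 0.0
for _ in range(200):
    w = distributed_step(w, sites)
print(f"learned weight: {w:.3f}")   # converges toward 2.0
```

Because the pooled gradient equals the sum of the local gradients, this update is mathematically identical to learning on the centralized dataset, which is the property the centralized-vs-distributed comparison in the paper verifies for its models.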
Hardware-in-the-Loop Modeling and Simulation Methods for Daylight Systems in Buildings
NASA Astrophysics Data System (ADS)
Mead, Alex Robert
This dissertation introduces hardware-in-the-loop modeling and simulation techniques to the daylighting community, with specific application to complex fenestration systems. No such application of this class of techniques, optimally combining mathematical-modeling and physical-modeling experimentation, is known to the author previously in the literature. Daylighting systems in buildings have a large impact on both the energy usage of a building as well as the occupant experience within a space. As such, a renewed interest has been placed on designing and constructing buildings with an emphasis on daylighting in recent times as part of the "green movement." Within daylighting systems, a specific subclass of building envelope is receiving much attention: complex fenestration systems (CFSs). CFSs are unique as compared to regular fenestration systems (e.g. glazing) in the regard that they allow for non-specular transmission of daylight into a space. This non-specular nature can be leveraged by designers to "optimize" the times of the day and the days of the year that daylight enters a space. Examples of CFSs include: Venetian blinds, woven fabric shades, and prismatic window coatings. In order to leverage the non-specular transmission properties of CFSs, however, engineering analysis techniques capable of faithfully representing the physics of these systems are needed. Traditionally, the analysis techniques available to the daylighting community fall broadly into three classes: simplified techniques, mathematical-modeling and simulation, and physical-modeling and experimentation. Simplified techniques use "rules-of-thumb" heuristics to provide insights for simple daylighting systems. Mathematical-modeling and simulation use complex numerical models to provide more detailed insights into system performance. Finally, physical-models can be instrumented and excited using artificial and natural light sources to provide performance insight into a daylighting system.
Each class of techniques, broadly speaking, has advantages and disadvantages with respect to the cost of execution (e.g. money, time, expertise) and the fidelity of the insight it provides into the performance of the daylighting system. This varying tradeoff between cost and insight determines which techniques are employed for which projects. Daylighting systems with CFS components, however, defy high-fidelity analysis with these traditional technique classes. Simplified techniques are clearly not applicable. Mathematical models must be very complex in order to capture the non-specular transmission accurately, which greatly limits their applicability. This leaves physical modeling, the most costly class, as the preferred method for CFSs. While mathematical modeling and simulation methods do exist, they are in general costly and still only approximate the underlying CFS behavior; in practice, measurement is currently the only method able to capture the behavior of CFSs. Traditional measurements of CFS transmission and reflection properties are conducted using an instrument called a goniophotometer and produce a measurement in the form of a Bidirectional Scatter Distribution Function (BSDF) on the Klems basis. This measurement must be executed for each possible state of the CFS, so only a subset of the possible behaviors can be captured for CFSs with continuously varying configurations. In the current era of rapid prototyping (e.g. 3D printing) and automated control of buildings, including daylighting systems, a new analysis technique is needed which can faithfully represent the CFSs that are being designed and constructed at an increasing rate. Hardware-in-the-loop modeling and simulation is a perfect fit for the current need of analyzing daylighting systems with CFSs.
In the proposed hardware-in-the-loop modeling and simulation approach of this dissertation, physical-models of real CFSs are excited using either natural or artificial light. The exiting luminance distribution from these CFSs is measured and used as inputs to a Radiance mathematical-model of the interior of the space, which is proposed to be lit by the CFS containing daylighting system. Hence, the components of the total daylighting and building system which are not mathematically-modeled well, the CFS, are physically excited and measured, while the components which are modeled properly, namely the interior building space, are mathematically-modeled. In order to excite and measure CFSs behavior, a novel parallel goniophotometer, referred to as the CUBE 2.0, is developed in this dissertation. The CUBE 2.0 measures the input illuminance distribution and the output luminance distribution with respect to a CFS under test. Further, the process is fully automated allowing for deployable experiments on proposed building sites, as well as in laboratory based experiments. In this dissertation, three CFSs, two commercially available and one novel--Twitchell's Textilene 80 Black, Twitchell's Shade View Ebony, and Translucent Concrete Panels (TCP)--are simulated on the CUBE 2.0 system for daylong deployments at one minute time steps. These CFSs are assumed to be placed in the glazing space within the Reference Office Radiance model, for which horizontal illuminance on a work plane of 0.8 m height is calculated for each time step. While Shade View Ebony and TCPs are unmeasured CFSs with respect to BSDF, Textilene 80 Black has been previously measured. As such a validation of the CUBE 2.0 using the goniophotometer measured BSDF is presented, with measurement errors of the horizontal illuminance between +3% and -10%. These error levels are considered to be valid within experimental daylighting investigations. 
Non-validated results are also presented in full for both Shade View Ebony as well as TCP. Concluding remarks and future directions for HWiL simulation close the dissertation.
Retrofit photovoltaic systems for intermediate sized applications - A design and market study
NASA Astrophysics Data System (ADS)
Noel, G. T.; Hagely, J. R.
An assessment of the technical and economic feasibility of retrofitting a significant portion of the existing intermediate sector building/application inventory with photovoltaic systems is presented. The assessment includes the development of detailed engineering and architectural designs as well as cost estimates for 12 representative installations. Promising applications include retail stores, warehouses, office buildings, religious buildings, shopping centers, education buildings, hospitals, and industrial sites. A market study indicates that there is a national inventory of 1.5 to 2.0 million feasible intermediate sector applications, with the majority being in the 20 to 400 kW size range. The present cost of the major system components and the cost of necessary building modifications are the primary current barriers to the realization of a large retrofit photovoltaic system market. The development of standardized modular system designs and installation techniques is a feasible way to minimize costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franta, G.; Baylin, F.; Crowther, R.
1981-06-01
This Solar Design Workbook presents solar building design applications for commercial buildings. The book is divided into four sections. The first section describes the variety of solar applications in buildings, including conservation aspects, solar fundamentals, passive systems, active systems, daylighting, and other solar options. Solar system design evaluation techniques, including considerations for building energy requirements, passive systems, active systems, and economics, are presented in Section II. The third section attempts to assist the designer in the building design process for energy conservation and solar applications, including options and considerations for the pre-design, design, and post-design phases. The information required for the solar design process has not been fully developed at this time; therefore, Section III is incomplete, but an overview of the considerations with some of the design process elements is presented. Section IV illustrates case studies that utilize solar applications in building design.
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
Diamond-based single-photon emitters
NASA Astrophysics Data System (ADS)
Aharonovich, I.; Castelletto, S.; Simpson, D. A.; Su, C.-H.; Greentree, A. D.; Prawer, S.
2011-07-01
The exploitation of emerging quantum technologies requires efficient fabrication of key building blocks. Sources of single photons are extremely important across many applications as they can serve as vectors for quantum information—thereby allowing long-range (perhaps even global-scale) quantum states to be made and manipulated for tasks such as quantum communication or distributed quantum computation. At the single-emitter level, quantum sources also afford new possibilities in terms of nanoscopy and bio-marking. Color centers in diamond are prominent candidates to generate and manipulate quantum states of light, as they are a photostable solid-state source of single photons at room temperature. In this review, we discuss the state of the art of diamond-based single-photon emitters and highlight their fabrication methodologies. We present the experimental techniques used to characterize the quantum emitters and discuss their photophysical properties. We outline a number of applications including quantum key distribution, bio-marking and sub-diffraction imaging, where diamond-based single emitters are playing a crucial role. We conclude with a discussion of the main challenges and perspectives for employing diamond emitters in quantum information processing.
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis
Simonyan, Vahan; Mazumder, Raja
2014-01-01
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953
22 CFR 125.4 - Exemptions of general applicability.
Code of Federal Regulations, 2012 CFR
2012-04-01
... boundaries of a “Build-to-Print” data package; (2) Build/Design-to-Specification. “Build/Design-to... THE EXPORT OF TECHNICAL DATA AND CLASSIFIED DEFENSE ARTICLES § 125.4 Exemptions of general applicability. (a) The following exemptions apply to exports of technical data for which approval is not needed...
22 CFR 125.4 - Exemptions of general applicability.
Code of Federal Regulations, 2013 CFR
2013-04-01
... boundaries of a “Build-to-Print” data package; (2) Build/Design-to-Specification. “Build/Design-to... THE EXPORT OF TECHNICAL DATA AND CLASSIFIED DEFENSE ARTICLES § 125.4 Exemptions of general applicability. (a) The following exemptions apply to exports of technical data for which approval is not needed...
22 CFR 125.4 - Exemptions of general applicability.
Code of Federal Regulations, 2011 CFR
2011-04-01
... boundaries of a “Build-to-Print” data package; (2) Build/Design-to-Specification. “Build/Design-to... THE EXPORT OF TECHNICAL DATA AND CLASSIFIED DEFENSE ARTICLES § 125.4 Exemptions of general applicability. (a) The following exemptions apply to exports of technical data for which approval is not needed...
22 CFR 125.4 - Exemptions of general applicability.
Code of Federal Regulations, 2014 CFR
2014-04-01
... boundaries of a “Build-to-Print” data package; (2) Build/Design-to-Specification. “Build/Design-to... THE EXPORT OF TECHNICAL DATA AND CLASSIFIED DEFENSE ARTICLES § 125.4 Exemptions of general applicability. (a) The following exemptions apply to exports of technical data for which approval is not needed...
Chen, L; Lai, C; Marchewka, R; Berry, R M; Tam, K C
2016-07-21
Structural colors and photoluminescence have been widely used for anti-counterfeiting and security applications. We report for the first time the use of CdS quantum dot (QD)-functionalized cellulose nanocrystals (CNCs) as building blocks to fabricate nanothin films via layer-by-layer (LBL) self-assembly for anti-counterfeiting applications. Both negatively- and positively-charged CNC/QD nanohybrids with a high colloidal stability and a narrow particle size distribution were prepared. The controllable LBL coating process was characterized by scanning electron microscopy and ellipsometry. The rigid structure of CNCs leads to nanoporous structured films on poly(ethylene terephthalate) (PET) substrates with high transmittance (above 70%) over the entire range of visible light and also resulted in increased hydrophilicity (contact angles of ∼40 degrees). Nanothin films on PET substrates showed good flexibility and enhanced stability in both water and ethanol. The modified PET films with structural colors from thin-film interference and photoluminescence from QDs can be used in anti-counterfeiting applications.
NASA Technical Reports Server (NTRS)
Richards, Stephen F.
1991-01-01
Although computerization has realized significant gains in many areas, one area, scheduling, has enjoyed few benefits from automation. The traditional methods of industrial engineering and operations research have not proven robust enough to handle the complexities associated with the scheduling of realistic problems. To address this need, NASA has developed the computer-aided scheduling system (COMPASS), a sophisticated, interactive scheduling tool that is in widespread use within NASA and the contractor community. However, COMPASS provides no explicit support for the large class of problems in which several people, perhaps at various locations, build separate schedules that share a common pool of resources. This research examines the issue of distributed scheduling, as applied to application domains characterized by the partial ordering of tasks, limited resources, and time restrictions. The focus of this research is on identifying issues related to distributed scheduling, locating applicable problem domains within NASA, and suggesting areas for ongoing research. The issues that this research identifies are goals, rescheduling requirements, database support, the need for communication and coordination among individual schedulers, the potential for expert system support for scheduling, and the possibility of integrating artificially intelligent schedulers into a network of human schedulers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter
In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
Rouet, François-Henry; Li, Xiaoye S.; Ghysels, Pieter; ...
2016-06-30
In this paper, we present a distributed-memory library for computations with dense structured matrices. A matrix is considered structured if its off-diagonal blocks can be approximated by a rank-deficient matrix with low numerical rank. Here, we use Hierarchically Semi-Separable (HSS) representations. Such matrices appear in many applications, for example, finite-element methods, boundary element methods, and so on. Exploiting this structure allows for fast solution of linear systems and/or fast computation of matrix-vector products, which are the two main building blocks of matrix computations. The compression algorithm that we use, which computes the HSS form of an input dense matrix, relies on randomized sampling with a novel adaptive sampling mechanism. We discuss the parallelization of this algorithm and also present the parallelization of structured matrix-vector product, structured factorization, and solution routines. The efficiency of the approach is demonstrated on large problems from different academic and industrial applications, on up to 8,000 cores. Finally, this work is part of a more global effort, the STRUctured Matrices PACKage (STRUMPACK) software package for computations with sparse and dense structured matrices. Hence, although useful in their own right, the routines also represent a step in the direction of a distributed-memory sparse solver.
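The core primitive the abstract leans on, compressing a numerically low-rank off-diagonal block by randomized sampling, can be sketched in a few lines of NumPy. This is not STRUMPACK's implementation (which builds a full HSS hierarchy adaptively and in parallel); it is a minimal single-block illustration of the sampling idea:

```python
import numpy as np

def randomized_lowrank(A, rank, oversample=10, seed=0):
    """Approximate A ~= U @ V by sampling the range of A with random vectors."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Multiply A by a Gaussian test matrix to capture its column space.
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega
    Q, _ = np.linalg.qr(Y)   # orthonormal basis for the sampled range
    B = Q.T @ A              # small projected matrix; A ~= Q @ B
    return Q, B

# An off-diagonal block of a "structured" matrix: exactly rank 5 here.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
U, V = randomized_lowrank(A, rank=5)
err = np.linalg.norm(A - U @ V) / np.linalg.norm(A)
print(err < 1e-10)
```

Because the sampled subspace almost surely contains the true range, the reconstruction error is at machine precision; the adaptive mechanism in the paper essentially grows `rank + oversample` until a tolerance is met.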
7 CFR 51.56 - Buildings and structures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... its distribution throughout the plant and washing machinery; (c) There shall also be an efficient... STANDARDS) Regulations 1 Requirements for Plants Operating Under Continuous Inspection on A Contract Basis § 51.56 Buildings and structures. The packing plant buildings shall be properly constructed and...
New rain shed (Building No. 241), overhead pipeline and raw ...
New rain shed (Building No. 241), overhead pipeline and raw water tank T4. Distribution pump house can be seen at the center of building. - Hawaii Volcanoes National Park Water Collection System, Hawaii Volcanoes National Park, Volcano, Hawaii County, HI
Advanced algorithms for distributed fusion
NASA Astrophysics Data System (ADS)
Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.
2008-03-01
The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge of fusing data from across the battlespace into an operational picture for real-time Situational Awareness emerges. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture that utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.
NASA Astrophysics Data System (ADS)
Trombley, N.; Weber, E.; Moehl, J.
2017-12-01
Many studies invoke dasymetric mapping to make more accurate depictions of population distribution by spatially restricting populations to inhabited/inhabitable portions of observational units (e.g., census blocks) and/or by varying population density among different land classes. LandScan USA uses this approach by restricting particular population components (such as residents or workers) to building area detected from remotely sensed imagery, but also goes a step further by classifying each cell of building area in accordance with ancillary land use information from national parcel data (CoreLogic, Inc.'s ParcelPoint database). Modeling population density according to land use is critical. For instance, office buildings would have a higher density of workers than warehouses even though the latter would likely have more cells of detection. This paper presents a modeling approach by which different land uses are assigned different densities to more accurately distribute populations within them. For parts of the country where the parcel data is insufficient, an alternate methodology is developed that uses National Land Cover Database (NLCD) data to define the land use type of building detection. Furthermore, LiDAR data is incorporated for many of the largest cities across the US, allowing the independent variables to be updated from two-dimensional building detection area to total building floor space. In the end, four different regression models are created to explain the effect of different land uses on worker distribution: (1) a two-dimensional model using land use types from the parcel data; (2) a three-dimensional model using land use types from the parcel data; (3) a two-dimensional model using land use types from the NLCD data; and (4) a three-dimensional model using land use types from the NLCD data. By and large, the resultant coefficients followed intuition, but importantly allow the relationships between different land uses to be quantified. For instance, in the model using two-dimensional building area, commercial building area had a density 2.5 times greater than public building area and 4 times greater than industrial building area. These coefficients can be applied to define the ratios at which population is distributed to building cells. Finally, possible avenues for refining the methodology are presented.
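The density-regression idea can be illustrated with a toy no-intercept least-squares fit, where the recovered coefficients are workers per unit building area by land-use class. All numbers below are invented for the sketch, not taken from the LandScan models:

```python
import numpy as np

# Hypothetical data: building area (1000 m^2) per land-use class in each zone,
# columns = [commercial, public, industrial], plus observed workers per zone.
X = np.array([
    [4.0, 1.0, 0.0],
    [1.0, 3.0, 2.0],
    [0.0, 2.0, 5.0],
    [3.0, 0.0, 1.0],
    [2.0, 2.0, 2.0],
])
workers = X @ np.array([10.0, 4.0, 2.5])   # synthetic "truth" for the demo

# No-intercept least squares: coefficients are per-class worker densities.
density, *_ = np.linalg.lstsq(X, workers, rcond=None)
print(np.round(density, 2))   # recovers the synthetic densities 10, 4, 2.5
```

With real observations the fit is not exact, but the ratios between coefficients play exactly the role described above: they set the proportions in which a zone's population is spread over its building cells.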
Modeling mobile source emissions during traffic jams in a micro urban environment.
Kondrashov, Valery V; Reshetin, Vladimir P; Regens, James L; Gunter, James T
2002-01-01
Urbanization typically involves a continuous increase in motor vehicle use, resulting in congestion known as traffic jams. Idling emissions due to traffic jams combine with the complex terrain created by buildings to concentrate atmospheric pollutants in localized areas. This research simulates emissions concentrations and distributions for a congested street in Minsk, Belarus. Ground-level (up to 50-meters above the street's surface) pollutant concentrations were calculated using STAR (version 3.10) with emission factors obtained from the U.S. Environmental Protection Agency, wind speed and direction, and building location and size. Relative emissions concentrations and distributions were simulated at 1-meter and 10-meters above street level. The findings demonstrate the importance of wind speed and direction, and building size and location on emissions concentrations and distributions, with the leeward sides of buildings retaining up to 99 percent of the emitted pollutants within 1-meter of street level, and up to 77 percent 10-meters above the street.
Architecture Knowledge for Evaluating Scalable Databases
2015-01-16
problems, arising from the proliferation of new data models and distributed technologies for building scalable, available data stores. Architects must ... longer are relational databases the de facto standard for building data repositories. Highly distributed, scalable "NoSQL" databases [11] have emerged ... This is especially challenging at the data storage layer. The multitude of competing NoSQL database technologies creates a complex and rapidly
ChemCalc: a building block for tomorrow's chemical infrastructure.
Patiny, Luc; Borel, Alain
2013-05-24
Web services, as an aspect of cloud computing, are becoming an important part of the general IT infrastructure, and scientific computing is no exception to this trend. We propose a simple approach to develop chemical Web services, through which servers could expose the essential data manipulation functionality that students and researchers need for chemical calculations. These services return their results as JSON (JavaScript Object Notation) objects, which facilitates their use for Web applications. The ChemCalc project http://www.chemcalc.org demonstrates this approach: we present three Web services related with mass spectrometry, namely isotopic distribution simulation, peptide fragmentation simulation, and molecular formula determination. We also developed a complete Web application based on these three Web services, taking advantage of modern HTML5 and JavaScript libraries (ChemDoodle and jQuery).
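Of the three ChemCalc services, isotopic-distribution simulation is the most self-contained to sketch. The following is a minimal convolution-based version, not ChemCalc's own code; the isotope masses and abundances are standard values, rounded for brevity:

```python
# Isotopic distribution by repeated convolution: each atom added convolves
# the running mass/probability distribution with that element's isotopes.
ISOTOPES = {
    "C": [(12.000, 0.9893), (13.003, 0.0107)],
    "H": [(1.008, 0.99988), (2.014, 0.00012)],
}

def convolve(dist, isotopes):
    out = {}
    for m1, p1 in dist.items():
        for m2, p2 in isotopes:
            key = round(m1 + m2, 3)            # merge numerically equal masses
            out[key] = out.get(key, 0.0) + p1 * p2
    return out

def isotopic_distribution(formula):
    """formula: dict element -> count, e.g. {'C': 2, 'H': 6} for ethane."""
    dist = {0.0: 1.0}
    for elem, count in formula.items():
        for _ in range(count):
            dist = convolve(dist, ISOTOPES[elem])
    return dist

dist = isotopic_distribution({"C": 2, "H": 6})
peak = max(dist, key=dist.get)   # most abundant peak; monoisotopic for ethane
print(round(peak, 2))            # -> 30.05
```

A service like ChemCalc would wrap such a computation behind an HTTP endpoint and serialize `dist` as a JSON object, which is what makes it directly consumable from a Web application.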
Generalised solutions for fully nonlinear PDE systems and existence-uniqueness theorems
NASA Astrophysics Data System (ADS)
Katzourakis, Nikos
2017-07-01
We introduce a new theory of generalised solutions which applies to fully nonlinear PDE systems of any order and allows for merely measurable maps as solutions. This approach bypasses the standard problems arising by the application of Distributions to PDEs and is not based on either integration by parts or on the maximum principle. Instead, our starting point builds on the probabilistic representation of derivatives via limits of difference quotients in the Young measures over a toric compactification of the space of jets. After developing some basic theory, as a first application we consider the Dirichlet problem and we prove existence-uniqueness-partial regularity of solutions to fully nonlinear degenerate elliptic 2nd order systems and also existence of solutions to the ∞-Laplace system of vectorial Calculus of Variations in L∞.
Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources
NASA Astrophysics Data System (ADS)
Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato
2017-04-01
Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban-scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source-level dynamics, local measurements, and urban-scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is an open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided drawing and GIS tools, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy-layer temperature gradients and convection on sensor footprints.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amick, D.S.
The purpose was to develop a model for understanding prehistoric chipped-stone distribution in the central Duck River Basin region located in Middle Tennessee. The model design consists of theoretical and empirical observations which are integrated to form an interpretive framework. Seven lithic assemblages ranging from middle to late Archaic periods (ca. 7000 to 3000 years B.P.) in age were analyzed to assess the model. The first stage of model-building was the development of a theoretical foundation for approaching the problem. A cultural adaptation paradigm is the foundation cornerstone. Lithic technology is viewed as an integral part of prehistoric cultural adaptation and lithic remains are regarded as critical indicators of behavioral variability. Environmental variables are stressed in interpreting lithic debris patterning across the landscape. The second stage of model-building involved the construction of a set of observations on lithic resource types and their distributions in the study area. A regional lithic resource survey was implemented. Application of the model to archaeological data was the last stage of model-building. Seven Middle-Late Archaic assemblages are compared. Results of the interassemblage analysis suggest that Late Archaic settlement systems were more logistically organized and technological systems were less expediently organized than those of Middle Archaic groups. Differences between Middle and Late Archaic adaptations are viewed as a response to regional environmental and demographic trends. These trends are documented and their implications for subsistence, settlement, and social organization are considered.
Building a highly available and intrusion tolerant Database Security and Protection System (DSPS).
Cai, Liang; Yang, Xiao-Hu; Dong, Jin-Xiang
2003-01-01
Database Security and Protection System (DSPS) is a security platform for fighting malicious DBMS. The security and performance are critical to DSPS. The authors suggested a key management scheme that combines the server-group structure, to improve availability, with the key distribution structure needed by proactive security. This paper details the implementation of proactive security in DSPS. After thorough performance analysis, the authors concluded that the performance difference between the replicated mechanism and the proactive mechanism becomes smaller and smaller with an increasing number of concurrent connections, and that proactive security is very useful and practical for large, critical applications.
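The abstract does not spell out the key-distribution details, but "proactive security" conventionally means periodically re-randomizing the shares of a secret so that shares captured before a refresh become useless. A minimal sketch under the assumption that the scheme is built on Shamir secret sharing over a prime field:

```python
import random

P = 2**61 - 1  # a Mersenne prime; the field for this sketch

def poly_eval(coeffs, x):
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def share(secret, n, k, rng):
    """Split secret into n shares; any k reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    return {x: poly_eval(coeffs, x) for x in range(1, n + 1)}

def refresh(shares, k, rng):
    """Proactive step: add shares of a fresh zero-secret polynomial.
    Old shares become useless to an attacker; the secret is unchanged."""
    zero = share(0, len(shares), k, rng)
    return {x: (s + zero[x]) % P for x, s in shares.items()}

def reconstruct(shares, k):
    """Lagrange interpolation at x = 0 from any k shares."""
    pts = list(shares.items())[:k]
    secret = 0
    for i, (xi, yi) in enumerate(pts):
        num = den = 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

rng = random.Random(42)
shares = share(123456789, n=5, k=3, rng=rng)
shares = refresh(shares, k=3, rng=rng)   # re-randomize without a dealer secret
print(reconstruct(shares, k=3))          # -> 123456789
```

The performance comparison in the paper then amounts to measuring the cost of the periodic `refresh` round against the replicated-server alternative.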
DITDOS: A set of design specifications for distributed data inventories
NASA Technical Reports Server (NTRS)
King, T. A.; Walker, R. J.; Joy, S. P.
1995-01-01
The analysis of space science data often requires researchers to work with many different types of data. For instance, correlative analysis can require data from multiple instruments on a single spacecraft, multiple spacecraft, and ground-based data. Typically, data from each source are available in a different format and have been written on a different type of computer, and so much effort must be spent to read the data and convert it to the computer and format that the researchers use in their analysis. The large and ever-growing amount of data and the large investment by the scientific community in software that requires a specific data format make using standard data formats impractical. A format-independent approach to accessing and analyzing disparate data is key to being able to deliver data to a diverse community in a timely fashion. The system in use at the Planetary Plasma Interactions (PPI) node of the NASA Planetary Data System (PDS) is based on the object-oriented Distributed Inventory Tracking and Data Ordering Specification (DITDOS), which describes data inventories in a storage-independent way. The specifications have been designed to make it possible to build DITDOS compliant inventories that can exist on portable media such as CD-ROMs. The portable media can be moved within a system, or from system to system, and still be used without modification. Several applications have been developed to work with DITDOS compliant data holdings. One is a windows-based client/server application, which helps guide the user in the selection of data. A user can select a database, then a data set, then a specific data file, and then either order the data and receive it immediately if it is online or request that it be brought online if it is not. A user can also view data by any of the supported methods. DITDOS makes it possible to use already existing applications for data-specific actions, and this is done whenever possible. Another application is a stand-alone tool to assist in the extraction of data from portable media, such as CD-ROMs. In addition to the applications, there is a set of libraries that can facilitate building new DITDOS compliant applications.
Modelling Root Systems Using Oriented Density Distributions
NASA Astrophysics Data System (ADS)
Dupuy, Lionel X.
2011-09-01
Root architectural models are essential tools to understand how plants access and utilize soil resources during their development. However, root architectural models use complex geometrical descriptions of the root system, and this limits their ability to model interactions with the soil. This paper presents the development of continuous models based on the concept of an oriented density distribution function. The growth of the root system is built as a hierarchical system of partial differential equations (PDEs) that incorporate single-root growth parameters such as elongation rate, gravitropism and branching rate, which appear explicitly as coefficients of the PDE. Acquisition and transport of nutrients are then modelled by extending Darcy's law to oriented density distribution functions. This framework was applied to build a model of the growth and water uptake of a barley root system. This study shows that simplified and computationally efficient continuous models of root system development can be constructed. Such models will allow application of root growth models at field scale.
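The paper's model is a hierarchical system of PDEs over oriented densities; a drastically simplified one-dimensional caricature, with advection at the elongation rate and a branching source term solved by first-order upwind differencing, conveys the flavor. All parameter values are assumed for illustration:

```python
import numpy as np

# 1D caricature of a density-based root model: root-tip density rho(z, t)
# advected downward at elongation rate e, with branching creating new tips
# at rate b. (The paper tracks oriented densities; this collapses the
# orientation to a single vertical direction.)
e, b = 1.0, 0.3           # cm/day elongation, 1/day branching (assumed)
dz, dt, depth = 0.1, 0.05, 20.0
z = np.arange(0.0, depth, dz)
rho = np.zeros_like(z)
rho[0] = 1.0 / dz         # all tips start at the soil surface

for _ in range(200):      # 10 simulated days; CFL = e*dt/dz = 0.5 (stable)
    flux = e * rho
    rho[1:] -= dt / dz * (flux[1:] - flux[:-1])   # upwind advection
    rho[0] -= dt / dz * flux[0]
    rho *= 1.0 + b * dt                            # exponential branching

total = rho.sum() * dz            # total tip "mass", grown by branching
front = z[np.argmax(rho)]         # density peak, advected ~e*t = 10 cm down
print(total > 1.0, 5.0 < front < 15.0)
```

In the full model, elongation rate, gravitropism and branching rate appear exactly as such coefficients, but over a distribution of growth directions rather than a single axis.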
Pacific Northwest GridWise™ Testbed Demonstration Projects; Part I. Olympic Peninsula Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammerstrom, Donald J.; Ambrosio, Ron; Carlon, Teresa A.
2008-01-09
This report describes the implementation and results of a field demonstration wherein residential electric water heaters and thermostats, commercial building space conditioning, municipal water pump loads, and several distributed generators were coordinated to manage constrained feeder electrical distribution through the two-way communication of load status and electric price signals. The field demonstration took place in Washington and Oregon and was paid for by the U.S. Department of Energy and several northwest utilities. Price is found to be an effective control signal for managing transmission or distribution congestion. Real-time signals at 5-minute intervals are shown to shift controlled load in time. The behaviors of customers and their responses under fixed, time-of-use, and real-time price contracts are compared. Peak loads are effectively reduced on the experimental feeder. A novel application of portfolio theory is applied to the selection of an optimal mix of customer contract types.
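The price-to-load coupling can be caricatured in a few lines: a device setpoint slides down as the 5-minute real-time price rises, shifting load out of expensive intervals. The control law and all constants below are illustrative, not the project's actual contract terms:

```python
# Toy price-responsive thermostat in the spirit of the demonstration.
def setpoint(price, base=55.0, k=2.0, mean_price=0.10):
    """Water-heater setpoint in deg C; drops k degrees per 10 cents
    that the real-time price sits above its mean."""
    return round(base - k * (price - mean_price) / 0.10, 2)

prices = [0.05, 0.10, 0.30, 0.10]   # $/kWh over four 5-minute intervals
temps = [setpoint(p) for p in prices]
print(temps)   # -> [56.0, 55.0, 51.0, 55.0]
```

The spike interval gets the lowest setpoint, so the heater defers energy use until the price falls; aggregated over a feeder, this is the load-shifting effect the report measures.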
Distributed Cooperation Solution Method of Complex System Based on MAS
NASA Astrophysics Data System (ADS)
Weijin, Jiang; Yuhui, Xu
To adapt reconfigurable fault-diagnosis models to dynamic environments and to fully meet the needs of solving the tasks of a complex system, this paper introduces multi-agent and related technology into complicated fault diagnosis, and an integrated intelligent control system is studied. Based on the idea of structuring diagnostic decisions hierarchically in modeling, and on a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge-representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agent, and decision agent are analyzed; the organization and evolution of agents in the system are proposed; and the corresponding conflict-resolution algorithm is given. A layered structure of abstract agents with public attributes is built. The system architecture is realized on a MAS-based distributed layered blackboard. A real-world application shows that the proposed control structure successfully solves the fault-diagnosis problem of a complex plant and has particular advantages in the distributed domain.
Low-cost real-time 3D PC distributed-interactive-simulation (DIS) application for C4I
NASA Astrophysics Data System (ADS)
Gonthier, David L.; Veron, Harry
1998-04-01
A 3D Distributed Interactive Simulation (DIS) application was developed and demonstrated in a PC environment. The application is capable of running in the stealth mode or as a player which includes battlefield simulations, such as ModSAF. PCs can be clustered together, but not necessarily collocated, to run a simulation or training exercise on their own. A 3D perspective view of the battlefield is displayed that includes terrain, trees, buildings and other objects supported by the DIS application. Screen update rates of 15 to 20 frames per second have been achieved with fully lit and textured scenes, thus providing high-quality and fast graphics. A complete PC system can be configured for under $2,500. The software runs under Windows95 and WindowsNT. It is written in C++ and uses a commercial API called RenderWare for 3D rendering. The software uses Microsoft Foundation classes and Microsoft DirectPlay for joystick input. The RenderWare libraries enhance the performance through optimization for MMX and the Pentium Pro processor. RenderWare was paired with the Righteous 3D graphics board from Orchid Technologies, which has an advertised rendering rate of up to 2 million texture-mapped triangles per second. A low-cost PC DIS simulator that can partake in a real-time collaborative simulation with other platforms is thus achieved.
Application of Laser Imaging for Bio/geophysical Studies
NASA Technical Reports Server (NTRS)
Hummel, J. R.; Goltz, S. M.; Depiero, N. L.; Degloria, D. P.; Pagliughi, F. M.
1992-01-01
SPARTA, Inc. has developed a low-cost, portable laser imager that, among other applications, can be used in bio/geophysical applications. In the application to be discussed here, the system was utilized as an imaging system for background features in a forested locale. The SPARTA mini-ladar system was used at the International Paper Northern Experimental Forest near Howland, Maine to assist in a project designed to study the thermal and radiometric phenomenology at forest edges. The imager was used to obtain data from three complex sites, a 'seed' orchard, a forest edge, and a building. The goal of the study was to demonstrate the usefulness of the laser imager as a tool to obtain geometric and internal structure data about complex 3-D objects in a natural background. The data from these images have been analyzed to obtain information about the distributions of the objects in a scene. A range detection algorithm has been used to identify individual objects in a laser image and an edge detection algorithm then applied to highlight the outlines of discrete objects. An example of an image processed in such a manner is shown. Described here are the results from the study. In addition, results are presented outlining how the laser imaging system could be used to obtain other important information about bio/geophysical systems, such as the distribution of woody material in forests.
Joint Real-Time Energy and Demand-Response Management using a Hybrid Coalitional-Noncooperative Game
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Fulin; Gu, Yi; Hao, Jun
In order to model the interactions among utility companies, building demands and renewable energy generators (REGs), a hybrid coalitional-noncooperative game framework has been proposed. We formulate a dynamic non-cooperative game to study the energy dispatch among multiple utility companies, while we take a coalitional perspective on REGs and building demands through a hedonic coalition formation game approach. In this case, building demands request different power supplies from REGs, and the building demands can then be organized into an ultimate coalition structure through a distributed hedonic shift algorithm. At the same time, utility companies can also obtain a stable power generation profile. In addition, the interaction between the utility companies and the building demands that cannot be supplied by REGs is implemented by distributed game-theoretic algorithms. Numerical results illustrate that the proposed hybrid coalitional-noncooperative game scheme reduces the costs of both building demands and utility companies compared with the initial scene.
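The hedonic shift step can be pictured as a greedy best-response loop: each building repeatedly moves to the coalition it prefers until no building benefits from switching (a Nash-stable partition). The utility function and names below are illustrative assumptions, not the paper's formulation.

```python
# Hedonic-shift sketch (assumption: each building prefers the REG coalition
# that covers the largest share of its demand). Illustrative only.

def hedonic_shift(demands, capacities, max_rounds=100):
    """demands: {building: load}; capacities: {reg: capacity}.
    Returns a Nash-stable assignment {building: reg or None}."""
    assignment = {b: None for b in demands}

    def utility(b, reg):
        if reg is None:
            return 0.0
        members = [x for x, r in assignment.items() if r == reg and x != b]
        served = sum(demands[x] for x in members) + demands[b]
        # Demand actually covered when capacity is shared pro rata.
        return min(demands[b], capacities[reg] * demands[b] / served)

    for _ in range(max_rounds):
        moved = False
        for b in demands:
            best = max([None] + list(capacities), key=lambda r: utility(b, r))
            if utility(b, best) > utility(b, assignment[b]) + 1e-12:
                assignment[b] = best
                moved = True
        if not moved:  # no building wants to switch: stable partition
            break
    return assignment
```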
Distributed temperature sensing using a SPIRAL configuration ultrasonic waveguide
NASA Astrophysics Data System (ADS)
Periyannan, Suresh; Balasubramaniam, Krishnan
2017-02-01
Distributed temperature sensing has important applications in the long-term monitoring of critical enclosures such as containment vessels, flue gas stacks, furnaces, underground storage tanks and buildings at risk of fire. This paper presents novel techniques for such measurements, using a wire in a spiral configuration with special embodiments, such as notches, for obtaining wave reflections from desired locations. Transduction is performed using a commercially available piezoelectric crystal bonded to one end of the waveguide. Lower-order axisymmetric guided ultrasonic modes were employed. Time of flight (TOF) differences between predefined reflectors located on the waveguides are used to infer the temperature profile in a chamber with different temperatures. The L(0,1) wave mode (pulse-echo approach) was generated/received in a spiral waveguide at different temperatures for this work. The ultrasonic measurements were compared with commercially available thermocouples.
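The TOF-to-temperature inversion described above can be sketched as follows, assuming a linear velocity-temperature calibration for the guided mode; the calibration constants v0 and k are placeholder values, not the paper's data.

```python
# Sketch: infer per-segment temperatures from round-trip TOF differences
# between successive waveguide notches. Assumes v(T) = v0 + k*(T - t_ref);
# v0, k, t_ref are illustrative placeholders.

def segment_temperatures(tof_s, gauge_m, v0=3200.0, k=-0.6, t_ref=25.0):
    """tof_s: round-trip TOF differences (s) between successive notches;
    gauge_m: corresponding notch spacings (m). Returns temperatures (C)."""
    temps = []
    for dt, length in zip(tof_s, gauge_m):
        v = 2.0 * length / dt          # pulse-echo: wave travels there and back
        temps.append(t_ref + (v - v0) / k)
    return temps
```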
The integration of FPGA TDC inside White Rabbit node
NASA Astrophysics Data System (ADS)
Li, H.; Xue, T.; Gong, G.; Li, J.
2017-04-01
White Rabbit technology is capable of delivering sub-nanosecond accuracy and picosecond precision of synchronization, as well as normal data packets, over a fiber network. The carry chain structure in FPGAs is a popular way to build TDCs, and RMS resolutions of tens of picoseconds have been achieved. The integration of WR technology with an FPGA TDC can enhance and simplify the TDC in many aspects: it provides a low-jitter clock for the TDC, a synchronized absolute UTC/TAI timestamp for the coarse counter, a convenient way to calibrate the carry chain DNL, and an easy-to-use Ethernet link for data and control information transfer. This paper presents an FPGA TDC implemented inside a normal White Rabbit node with sub-nanosecond measurement precision. The measured standard deviation reaches 50 ps between two distributed TDCs. Possible applications of this distributed TDC are also discussed.
Measurement and application of bidirectional reflectance distribution function
NASA Astrophysics Data System (ADS)
Liao, Fei; Li, Lin; Lu, Chengwen
2016-10-01
When a beam of light with a certain intensity and distribution reaches the surface of a material, the distribution of the diffused light is related to the incident angle, the receiving angle, the wavelength of the light and the type of material. The Bidirectional Reflectance Distribution Function (BRDF) is a method to describe this distribution. For an optical system, the BRDF of the optical and mechanical materials is unique, and to calculate stray light in the system we must know the correct BRDF data for all materials. BRDF is of fundamental significance in the area of space remote sensing, where it is needed for precise radiation calibration. It is also important in the military field, where BRDF can be used in object identification, target tracking, etc. In this paper, the BRDF of 11 kinds of aerospace materials is measured, more than 310,000 groups of BRDF data are obtained, and a BRDF database is established in China for the first time. With the BRDF data in the database, we can create the detector model and build the stray-light radiation surface model in stray-light analysis software. In this way, the stray radiation on the detector can be calculated correctly.
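A database like the one described is typically queried by interpolating measured values between grid angles. The sketch below assumes a regular (incidence angle x viewing angle) grid with placeholder values; real BRDF tables also index wavelength and azimuth.

```python
import numpy as np

# Bilinear interpolation of a measured BRDF table on a regular angle grid.
# The grids and values are illustrative stand-ins for database entries.

def brdf_interp(thetas_i, thetas_r, values, ti, tr):
    """thetas_i, thetas_r: 1-D sorted angle grids (deg);
    values[i, j]: measured BRDF at (thetas_i[i], thetas_r[j])."""
    i = np.clip(np.searchsorted(thetas_i, ti) - 1, 0, len(thetas_i) - 2)
    j = np.clip(np.searchsorted(thetas_r, tr) - 1, 0, len(thetas_r) - 2)
    u = (ti - thetas_i[i]) / (thetas_i[i + 1] - thetas_i[i])
    v = (tr - thetas_r[j]) / (thetas_r[j + 1] - thetas_r[j])
    return ((1 - u) * (1 - v) * values[i, j] + u * (1 - v) * values[i + 1, j]
            + (1 - u) * v * values[i, j + 1] + u * v * values[i + 1, j + 1])
```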
Remote Sensing of Atlanta's Urban Sprawl and the Distribution of Land Cover and Surface Temperatures
NASA Technical Reports Server (NTRS)
Laymon, Charles A.; Estes, Maurice G., Jr.; Quattrochi, Dale A.; Arnold, James E. (Technical Monitor)
2001-01-01
Between 1973 and 1992, an average of 20 ha of forest was lost each day to the urban expansion of Atlanta, Georgia. Urban surfaces have very different thermal properties than natural surfaces, storing solar energy throughout the day and continuing to release it as sensible heat well after sunset. The resulting heat island effect serves as a catalyst for chemical reactions involving vehicular exhaust and industrial emissions, leading to a deterioration in air quality. In this study, high-spatial-resolution multispectral remote sensing data have been used to characterize the type, thermal properties, and distribution of land surface materials throughout the Atlanta metropolitan area. Ten-meter data were acquired with the Advanced Thermal and Land Applications Sensor (ATLAS) on May 11 and 12, 1997. ATLAS is a 15-channel multispectral scanner that incorporates the Landsat TM bands with additional bands in the middle reflective infrared and thermal infrared range. The high spatial resolution permitted discrimination of discrete surface types (e.g., concrete, asphalt), individual structures (e.g., buildings, houses) and their associated thermal characteristics. There is a strong temperature contrast between vegetation and anthropogenic features. Vegetation has a modal temperature of about 20 C, whereas asphalt shingles, pavement, and buildings have a modal temperature of about 39 C. Broad-leaf vegetation classes are indistinguishable on a thermal basis alone. There is slightly more variability (±5 C) among the urban surfaces. Grasses, mixed vegetation and mixed urban surfaces are intermediate in temperature and are characterized by broader temperature distributions with modes of about 29 C. Thermal maps serve as a basis for understanding the distribution of "hotspots", i.e., how landscape features and urban fabric contribute the most heat to the lower atmosphere.
Remote Sensing of Atlanta's Urban Sprawl and the Distribution of Land Cover and Surface Temperature
NASA Technical Reports Server (NTRS)
Laymon, Charles A.; Estes, Maurice G., Jr.; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)
2001-01-01
Between 1973 and 1992, an average of 20 ha of forest was lost each day to the urban expansion of Atlanta, Georgia. Urban surfaces have very different thermal properties than natural surfaces, storing solar energy throughout the day and continuing to release it as sensible heat well after sunset. The resulting heat island effect serves as a catalyst for chemical reactions involving vehicular exhaust and industrial emissions, leading to a deterioration in air quality. In this study, high-spatial-resolution multispectral remote sensing data have been used to characterize the type, thermal properties, and distribution of land surface materials throughout the Atlanta metropolitan area. Ten-meter data were acquired with the Advanced Thermal and Land Applications Sensor (ATLAS) on May 11 and 12, 1997. ATLAS is a 15-channel multispectral scanner that incorporates the Landsat TM bands with additional bands in the middle reflective infrared and thermal infrared range. The high spatial resolution permitted discrimination of discrete surface types (e.g., concrete, asphalt), individual structures (e.g., buildings, houses) and their associated thermal characteristics. There is a strong temperature contrast between vegetation and anthropogenic features. Vegetation has a modal temperature of about 20 C, whereas asphalt shingles, pavement, and buildings have a modal temperature of about 39 C. Broad-leaf vegetation classes are indistinguishable on a thermal basis alone. There is slightly more variability (±5 C) among the urban surfaces. Grasses, mixed vegetation and mixed urban surfaces are intermediate in temperature and are characterized by broader temperature distributions with modes of about 29 C. Thermal maps serve as a basis for understanding the distribution of "hotspots", i.e., how landscape features and urban fabric contribute the most heat to the lower atmosphere.
Assessing population exposure for landslide risk analysis using dasymetric cartography
NASA Astrophysics Data System (ADS)
Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.
2015-04-01
Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or on inhabitant density over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units dominated by rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting.
Preliminary results show that in sparsely populated administrative units, population density can differ by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by the FCT - Portuguese Foundation for Science and Technology.
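Step iii), building areal weighting, reduces to distributing each administrative unit's census population over its buildings in proportion to footprint area. A minimal sketch (field names are illustrative):

```python
# Dasymetric building areal weighting: a unit's census population is
# redistributed to its buildings proportionally to footprint area.
# Dictionary keys/fields are illustrative placeholders.

def dasymetric_population(unit_pop, buildings):
    """unit_pop: total population of the administrative unit;
    buildings: {building_id: footprint_area}. Returns {building_id: pop}."""
    total_area = sum(buildings.values())
    return {b: unit_pop * area / total_area for b, area in buildings.items()}
```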
Local-aggregate modeling for big data via distributed optimization: Applications to neuroimaging.
Hu, Yue; Allen, Genevera I
2015-12-01
Technological advances have led to a proliferation of structured big data that have matrix-valued covariates. We are specifically motivated to build predictive models for multi-subject neuroimaging data based on each subject's brain imaging scans. This is an ultra-high-dimensional problem that consists of a matrix of covariates (brain locations by time points) for each subject; few methods currently exist to fit supervised models directly to this tensor data. We propose a novel modeling and algorithmic strategy to apply generalized linear models (GLMs) to this massive tensor data in which one set of variables is associated with locations. Our method begins by fitting GLMs to each location separately, and then builds an ensemble by blending information across locations through regularization with what we term an aggregating penalty. Our so-called Local-Aggregate Model can be fit in a completely distributed manner over the locations using an Alternating Direction Method of Multipliers (ADMM) strategy, and thus greatly reduces the computational burden. Furthermore, we propose to select the appropriate model through a novel sequence of faster algorithmic solutions that is similar to regularization paths. We demonstrate both the computational and predictive modeling advantages of our methods via simulations and an EEG classification problem. © 2015, The International Biometric Society.
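The local-then-aggregate idea can be caricatured in a few lines: fit a regularized model per location, then shrink each location's coefficients toward the cross-location mean as a crude stand-in for the aggregating penalty. This is a simplified sketch, not the paper's ADMM algorithm, and uses ridge-penalized linear models rather than general GLMs.

```python
import numpy as np

# Simplified stand-in for the Local-Aggregate idea (illustrative, not the
# paper's ADMM): per-location ridge fits, then shrinkage toward consensus.

def local_aggregate_fit(Xs, ys, ridge=1.0, blend=0.5):
    """Xs[l]: (n, p) design matrix for location l; ys[l]: (n,) responses.
    Returns a (locations, p) array of blended coefficient vectors."""
    betas = []
    for X, y in zip(Xs, ys):
        p = X.shape[1]
        # Ridge-regularized normal equations for this location alone.
        betas.append(np.linalg.solve(X.T @ X + ridge * np.eye(p), X.T @ y))
    betas = np.stack(betas)                     # (locations, p)
    mean = betas.mean(axis=0)                   # cross-location consensus
    return (1 - blend) * betas + blend * mean   # shrink toward the mean
```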
NASA Astrophysics Data System (ADS)
Despa, D.; Nama, G. F.; Muhammad, M. A.; Anwar, K.
2018-04-01
Electrical quantities such as voltage, current, power, power factor, energy, and frequency in an electrical power system tend to fluctuate as a result of load changes, disturbances, or other abnormal states. Changes in these quantities should be identified immediately, otherwise they can lead to serious problems for the whole system. It is therefore necessary to determine the condition of such changes quickly and appropriately in order to make effective decisions. Online monitoring of the power distribution system based on Internet of Things (IoT) technology was deployed and implemented at the Department of Mechanical Engineering, University of Lampung (Unila), in particular at the three-phase main distribution panel of the H-building. The measurement system involves multiple sensors, such as current and voltage sensors, while data processing is carried out by an Arduino; the measurement data are stored in a database server and displayed in real time through a web-based application. The system offers several important features, notably real-time monitoring, robust data acquisition and logging, and system reporting, so it produces information that can be used for various future power-analysis purposes such as estimation and planning. The results of this research show that the electrical power system at the H-building carried an unbalanced load, which often leads to voltage drops.
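One check such a monitoring back-end could run on the logged three-phase currents is an unbalance percentage in the NEMA style (maximum deviation from the phase average, as a percentage of that average). This is a generic sketch, not the system's actual code.

```python
# NEMA-style current unbalance: max deviation from the phase mean,
# expressed as a percentage of the mean. Generic illustration.

def current_unbalance_pct(i_a, i_b, i_c):
    """Phase currents in amperes; returns percent unbalance."""
    phases = (i_a, i_b, i_c)
    mean = sum(phases) / 3.0
    return 100.0 * max(abs(i - mean) for i in phases) / mean
```

A balanced panel returns 0 %; the larger the value, the more one phase is loaded relative to the others.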
Advanced Optical Burst Switched Network Concepts
NASA Astrophysics Data System (ADS)
Nejabati, Reza; Aracil, Javier; Castoldi, Piero; de Leenheer, Marc; Simeonidou, Dimitra; Valcarenghi, Luca; Zervas, Georgios; Wu, Jian
In recent years, as the bandwidth and the speed of networks have increased significantly, a new generation of network-based applications using the concepts of distributed computing and collaborative services is emerging (e.g., Grid computing applications). The use of the available fiber and DWDM infrastructure for these applications is a logical choice, offering huge amounts of cheap bandwidth and ensuring global reach of computing resources [230]. Currently, there is a great deal of interest in deploying optical circuit (wavelength) switched network infrastructure for distributed computing applications that require long-lived wavelength paths and address the specific needs of a small number of well-known users. Typical users are particle physicists who, due to their international collaborations and experiments, generate enormous amounts of data (petabytes per year). These users require a network infrastructure that can support processing and analysis of large datasets through globally distributed computing resources [230]. However, providing bandwidth services at wavelength granularity is not an efficient and scalable solution for applications and services that address a wider base of user communities with different traffic profiles and connectivity requirements. Examples of such applications are: scientific collaboration on a smaller scale (e.g., bioinformatics, environmental research), distributed virtual laboratories (e.g., remote instrumentation), e-health, national security and defense, personalized learning environments and digital libraries, and evolving broadband user services (e.g., high-resolution home video editing, real-time rendering, high-definition interactive TV). As a specific example, e-health services, and in particular remote mammography, impose stringent network requirements due to the size and quantity of the images produced.
Initial calculations have shown that for 100 patients to be screened remotely, the network would have to securely transport 1.2 GB of data every 30 s [230]. It is clear, then, that these types of applications need a new network infrastructure and transport technology that makes large amounts of bandwidth at subwavelength granularity, together with storage, computation, and visualization resources, potentially available to a wide user base for specified time durations. As these collaborative and network-based applications evolve to address a wide range and large number of users, it is infeasible to build dedicated networks for each application type or category. Consequently, there should be an adaptive network infrastructure able to support all application types, each with their own access, network, and resource usage patterns. This infrastructure should offer flexible and intelligent network elements and control mechanisms able to deploy new applications quickly and efficiently.
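The cited requirement can be sanity-checked with a one-line rate conversion; the 1.2 GB per 30 s figure is from the text, and the rest is arithmetic.

```python
# Convert a data volume per time window into a sustained line rate (Gbit/s).

def required_rate_gbps(bytes_per_window, window_s):
    return bytes_per_window * 8 / window_s / 1e9

rate = required_rate_gbps(1.2e9, 30)  # → 0.32 Gbit/s
```

At roughly 0.32 Gbit/s sustained, the mammography stream occupies only a small fraction of a modern wavelength, which is the motivation for subwavelength granularity.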
Zeng, Yuanyuan; Sreenan, Cormac J; Sitanayah, Lanny; Xiong, Naixue; Park, Jong Hyuk; Zheng, Guilin
2011-01-01
Fire hazard monitoring and evacuation for building environments is a novel application area for the deployment of wireless sensor networks. In this context, adaptive routing is essential in order to ensure safe and timely data delivery in building evacuation and fire fighting resource applications. Existing routing mechanisms for wireless sensor networks are not well suited for building fires, especially as they do not consider critical and dynamic network scenarios. In this paper, an emergency-adaptive, real-time and robust routing protocol is presented for emergency situations such as building fire hazard applications. The protocol adapts to handle dynamic emergency scenarios and works well with the routing hole problem. Theoretical analysis and simulation results indicate that our protocol provides a real-time routing mechanism that is well suited for dynamic emergency scenarios in building fires when compared with other related work.
Zeng, Yuanyuan; Sreenan, Cormac J.; Sitanayah, Lanny; Xiong, Naixue; Park, Jong Hyuk; Zheng, Guilin
2011-01-01
Fire hazard monitoring and evacuation for building environments is a novel application area for the deployment of wireless sensor networks. In this context, adaptive routing is essential in order to ensure safe and timely data delivery in building evacuation and fire fighting resource applications. Existing routing mechanisms for wireless sensor networks are not well suited for building fires, especially as they do not consider critical and dynamic network scenarios. In this paper, an emergency-adaptive, real-time and robust routing protocol is presented for emergency situations such as building fire hazard applications. The protocol adapts to handle dynamic emergency scenarios and works well with the routing hole problem. Theoretical analysis and simulation results indicate that our protocol provides a real-time routing mechanism that is well suited for dynamic emergency scenarios in building fires when compared with other related work. PMID:22163774
Assessing hail risk for a building portfolio by generating stochastic events
NASA Astrophysics Data System (ADS)
Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie
2015-04-01
Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events have been reported in recent years, among which the July 2011 event cost the Aargauer public insurance company (north-western Switzerland) around 125 million EUR. This study presents new developments in a stochastic model which aims at evaluating the risk for a building portfolio. Thanks to insurance and meteorological radar data of the 2011 Aargauer event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with 6 random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, …) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps, namely that of the aforementioned event and that of an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data.
The simulation parameters are then adjusted by trial and error in order to best reproduce the expected distributions. The mean annual risk obtained with the model is also compared to the mean annual risk calculated directly from the hazard maps. According to the first results, the return period of an event inducing a total damage cost equal to or greater than 125 million EUR for the Aargauer insurance company would be around 10 to 40 years.
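The event generator described above can be sketched as a sum of rotated elliptical Gaussians with the six random parameters named in the abstract. All parameter ranges below are illustrative placeholders, not the calibrated values, and the event-shape constraint on kernel placement is omitted.

```python
import math
import random

# Stochastic hail-footprint sketch: sum of rotated elliptical Gaussian
# kernels with 6 random parameters each. Ranges are illustrative only.

def random_kernels(n, extent=50.0, rng=random):
    """Draw n kernels with the six random parameters named in the text."""
    return [{
        "x": rng.uniform(0.0, extent), "y": rng.uniform(0.0, extent),
        "size": rng.uniform(1.0, 6.0),      # peak hailstone size (cm), assumed
        "sigma": rng.uniform(2.0, 8.0),     # standard deviation (km), assumed
        "ecc": rng.uniform(1.0, 3.0),       # eccentricity (axis ratio), assumed
        "theta": rng.uniform(0.0, math.pi)  # orientation (rad)
    } for _ in range(n)]

def hail_size(x, y, kernels):
    """Footprint intensity at (x, y): sum of the Gaussian contributions."""
    total = 0.0
    for k in kernels:
        dx, dy = x - k["x"], y - k["y"]
        c, s = math.cos(k["theta"]), math.sin(k["theta"])
        u, v = c * dx + s * dy, -s * dx + c * dy   # rotate into kernel axes
        sx, sy = k["sigma"] * k["ecc"], k["sigma"]
        total += k["size"] * math.exp(-0.5 * ((u / sx) ** 2 + (v / sy) ** 2))
    return total
```

Evaluating `hail_size` over each building's location for many generated events yields the event-cost distribution once combined with the two-step vulnerability model.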
A Platform Architecture for Sensor Data Processing and Verification in Buildings
ERIC Educational Resources Information Center
Ortiz, Jorge Jose
2013-01-01
This thesis examines the state of the art of building information systems and evaluates their architecture in the context of emerging technologies and applications for deep analysis of the built environment. We observe that modern building information systems are difficult to extend, do not provide general services for application development, do…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
..., occupational skills training, and employment services to disadvantaged youth in their communities while... Solicitation for Grant Applications for YouthBuild Grants AGENCY: Employment and Training Administration, Labor... Workforce Investment Act [29 U.S.C. 2918a]. YouthBuild grants will be awarded through a competitive process...
An experimental investigation for external RC shear wall applications
NASA Astrophysics Data System (ADS)
Kaltakci, M. Y.; Ozturk, M.; Arslan, M. H.
2010-09-01
The strength and rigidity of most reinforced concrete (RC) buildings in Turkey, which is frequently hit by destructive earthquakes, are not at a sufficient level, and earthquakes therefore cause significant loss of life and property. The strengthening method most commonly preferred for these types of RC buildings is the application of RC infilled walls (shear walls) in the frame openings of the building. However, since the whole building has to be emptied and additional heavy costs arise during this type of strengthening, users prefer not to strengthen their buildings despite the heavy risk they are exposed to. It is therefore necessary to develop easier-to-apply and more effective methods for the rapid strengthening of housing and of heavily used public buildings which cannot be emptied during the strengthening process (such as hospitals and schools). This study empirically analyses different variants of a new system which can meet this need. In this new system, named "external shear wall application", RC shear walls are applied on the external surface of the building, along the frame plane, rather than inside the building. To this end, 7 test samples at 1/2 and 1/3 geometrical scale were designed to analyse the efficiency of the strengthening technique where the shear wall leans on the frame from outside the building (external shear wall application) and of the technique where a specific space is left between the frame and the external shear wall by using a coupling beam to connect the elements (application of an external shear wall with coupling beam). Test results showed that the maximum lateral load capacity, initial rigidity and energy dissipation behaviour of the samples strengthened with external shear walls were much better than those of the bare frames.
Novel single photon sources for new generation of quantum communications
2017-06-13
…be used as building blocks for quantum cryptography and quantum key distribution. There were numerous important achievements for the projects in the … single photon sources … and enable absolutely secure information transfer between distant nodes, a key prerequisite for quantum cryptography. Experiment: the experimental
3D WindScanner lidar measurements of wind and turbulence around wind turbines, buildings and bridges
NASA Astrophysics Data System (ADS)
Mikkelsen, T.; Sjöholm, M.; Angelou, N.; Mann, J.
2017-12-01
WindScanner is a distributed research infrastructure developed at DTU with the participation of a number of European countries. The research infrastructure consists of a mobile, technically advanced facility for remote measurement of wind and turbulence in 3D. The WindScanners provide coordinated measurements of entire wind and turbulence fields, with all three wind components scanned in 3D space. Although primarily developed for research related to on- and offshore wind turbines and wind farms, the facility is also well suited for scanning turbulent wind fields around buildings, bridges and aviation structures, and flow in urban environments. The mobile WindScanner facility enables 3D scanning of wind and turbulence fields in full scale within the atmospheric boundary layer at ranges from 10 meters to 5 (10) kilometers. Measurements of turbulent coherent structures are applied to investigate flow patterns and dynamic loads on turbines, building structures and bridges, and in relation to optimizing the location of, for example, wind farms and suspension bridges. This paper presents our achievements to date and briefly reviews the state of the art of the WindScanner measurement technology, with examples of uses for wind engineering applications.
Rivas, T; Pozo, S; Paz, M
2014-06-01
We describe the results of sulphur and oxygen isotope analyses used to identify sources of the gypsum present in black crusts that grow on the granite of historical buildings. The crusts were sampled at various locations in and near the city of Vigo (NW Spain) and were analysed for their sulphur content and δ(34)S and δ(18)O isotope ratios. Sampled crusts had δ(34)S values of 7.3‰ to 12.9‰ and δ(18)O values of 6.56‰ to 12.51‰. Sampled as potential sulphur sources were bulk depositions, seawater, foundation, ashlar and construction materials and combustion residues. The results indicated marine and, to a lesser extent, anthropogenic, origins for the sulphur and ruled out the contribution of sub-soil sulphates by capillary rise from building foundations. Isotope analyses would indicate that cement and mortar were enriched in sulphur after their application in buildings. The fact that facade orientation (towards the sea or fossil fuel pollution sources) was correlated with sulphur isotope distribution pointed to various contributions to black crust formation. Copyright © 2014 Elsevier B.V. All rights reserved.
Liu, An; Wijesiri, Buddhi; Hong, Nian; Zhu, Panfeng; Egodawatta, Prasanna; Goonetilleke, Ashantha
2018-05-08
Road-deposited pollutants (build-up) are continuously re-distributed by external factors such as traffic and wind turbulence, influencing stormwater runoff quality. However, current stormwater quality modelling approaches do not account for the re-distribution of pollutants. This undermines the accuracy of stormwater quality predictions, constraining the design of effective stormwater treatment measures. This study, using over 1000 data points, developed a Bayesian Network modelling approach to investigate the re-distribution of pollutant build-up on urban road surfaces. BTEX, a group of highly toxic pollutants, served as the case study pollutants. Build-up sampling was undertaken in Shenzhen, China, using a dry and wet vacuuming method. The research outcomes confirmed that vehicle type and particle size significantly influence the re-distribution of particle-bound BTEX. Compared to heavy-duty traffic in commercial areas, light-duty traffic dominates the re-distribution of particles of all size ranges. In industrial areas, heavy-duty traffic re-distributes particles >75 μm, and light-duty traffic re-distributes particles <75 μm. In residential areas, light-duty traffic re-distributes particles >300 μm and <75 μm, and heavy-duty traffic re-distributes particles in the 300-150 μm range. The study results provide important insights to improve stormwater quality modelling and the interpretation of modelling outcomes, contributing to safeguarding the urban water environment. Copyright © 2018 Elsevier B.V. All rights reserved.
Technologies for network-centric C4ISR
NASA Astrophysics Data System (ADS)
Dunkelberger, Kirk A.
2003-07-01
Three technologies form the heart of any network-centric command, control, communication, intelligence, surveillance, and reconnaissance (C4ISR) system: distributed processing, reconfigurable networking, and distributed resource management. Distributed processing, enabled by automated federation, mobile code, intelligent process allocation, dynamic multiprocessing groups, checkpointing, and other capabilities, creates a virtual peer-to-peer computing network across the force. Reconfigurable networking, consisting of content-based information exchange, dynamic ad-hoc routing, information operations (perception management) and other component technologies, forms the interconnect fabric for fault-tolerant interprocessor and node communication. Distributed resource management, which provides the means for distributed cooperative sensor management, foe sensor utilization, opportunistic collection, symbiotic inductive/deductive reasoning and other applications, provides the canonical algorithms for network-centric enterprises and warfare. This paper introduces these three core technologies and briefly discusses a sampling of their component technologies and their individual contributions to network-centric enterprises and warfare. Based on the implied requirements, two new algorithms are defined and characterized which provide critical building blocks for network centricity: distributed asynchronous auctioning and predictive dynamic source routing. The first provides a reliable, efficient, effective approach for near-optimal assignment problems; the algorithm has been demonstrated to be a viable implementation for ad-hoc command and control, object/sensor pairing, and weapon/target assignment. The second is founded on traditional dynamic source routing (from mobile ad-hoc networking), but leverages the results of ad-hoc command and control (from the contributed auctioning algorithm) to achieve significant increases in connection reliability through forward prediction.
Emphasis is placed on the advantages gained from the closed-loop interaction of the multiple technologies in the network-centric application environment.
NASA Astrophysics Data System (ADS)
Kinoshita, Shunichi; Eder, Wolfgang; Wöger, Julia; Hohenegger, Johann; Briguglio, Antonino
2017-04-01
Investigations on Palaeonummulites venosus using the natural laboratory approach for determining chamber building rate, test diameter increase rate, reproduction time and longevity are based on the decomposition of monthly obtained frequency distributions, based on chamber number and test diameter, into normally distributed components. The shift of the component parameters 'mean' and 'standard deviation' during the investigation period of 15 months was used to calculate Michaelis-Menten functions applied to estimate the averaged chamber building rate and diameter increase rate under natural conditions. The individual dates of birth were estimated using the inverse averaged chamber building rate and the inverse diameter increase rate, fitted by the individual chamber number or the individual test diameter at the sampling date. Distributions of frequencies and densities (i.e. frequency divided by sediment weight) based on chamber building rate and diameter increase rate both indicated continuous reproduction through the year with two peaks: the stronger in May/June, determined as the beginning of the summer generation (generation 1), and the weaker in November, determined as the beginning of the winter generation (generation 2). This reproduction scheme explains the existence of small and large specimens in the same sample. Longevity, calculated as the maximum difference in days between the individual's birth date and the sampling date, appears to be about one year, as obtained by both estimations based on the chamber building rate and the diameter increase rate.
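The age estimation described above inverts a Michaelis-Menten function: given an averaged growth curve, an individual's chamber count at sampling yields its estimated age and hence its birth date. A minimal sketch, with hypothetical parameter values not taken from the paper:

```python
# Hedged sketch of the Michaelis-Menten growth form and its inverse,
# as used to back-calculate individual birth dates from chamber counts.
# N_MAX (asymptotic chamber number) and K (half-saturation time in days)
# are invented illustration values, not the study's fitted parameters.

def mm_chambers(t_days, n_max, k_days):
    """Expected chamber number after t_days of growth (Michaelis-Menten)."""
    return n_max * t_days / (k_days + t_days)

def mm_age(n_chambers, n_max, k_days):
    """Inverse: estimated age in days for an observed chamber number."""
    return k_days * n_chambers / (n_max - n_chambers)

N_MAX, K = 60.0, 120.0          # hypothetical asymptote and half-saturation time
age = mm_age(30.0, N_MAX, K)    # chamber count at half the asymptote
print(age)                      # -> 120.0, i.e. t = K when n = n_max / 2
```

Subtracting the estimated age from the sampling date then gives the individual's birth date, which is how the reproduction peaks are located in the year.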
Developing building-damage scales for lahars: application to Merapi volcano, Indonesia
NASA Astrophysics Data System (ADS)
Jenkins, Susanna F.; Phillips, Jeremy C.; Price, Rebecca; Feloy, Kate; Baxter, Peter J.; Hadmoko, Danang Sri; de Bélizal, Edouard
2015-09-01
Lahar damage to buildings can include burial by sediment and/or failure of walls, infiltration into the building and subsequent damage to contents. The extent to which a building is damaged will be dictated by the dynamic characteristics of the lahar, i.e. the velocity, depth, sediment concentration and grain size, as well as the structural characteristics and setting of the building in question. The focus of this paper is on quantifying how buildings may respond to impact by lahar. We consider the potential for lahar damage to buildings on Merapi volcano, Indonesia, as a result of the voluminous deposits produced during the large (VEI 4) eruption in 2010. A building-damage scale has been developed that categorises likely lahar damage levels and, through theoretical calculations of expected building resistance to impact, approximate ranges of impact pressures. We found that most weak masonry buildings on Merapi would be destroyed by dilute lahars with relatively low velocities (ca. 3 m/s) and pressures (ca. 5 kPa); however, the majority of stronger rubble stone buildings may be expected to withstand higher velocities (to 6 m/s) and pressures (to 20 kPa). We applied this preliminary damage scale to a large lahar in the Putih River on 9 January 2011, which inundated and caused extensive building damage in the village of Gempol, 16 km southwest of Merapi. The scale was applied remotely through the use of public satellite images and through field studies to categorise damage and estimate impact pressures and velocities within the village. Results were compared with those calculated independently from Manning's calculations for flow velocity and depth within Gempol village using an estimate of flow velocity at one upstream site as input. The results of this calculation showed reasonable agreement with an average channel velocity derived from travel time observations. 
The calculated distribution of flow velocities across the area of damaged buildings was consistent with building damage as classified by the new damage scale. The complementary results, even given the basic nature of the tools and data, suggest that the damage scale provides a valid representation of the failure mode that is consistent with estimates of the flow conditions. The use of open-source simplified tools and data in producing these consistent findings is very promising.
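The flow-velocity estimates above rely on Manning's equation, which relates mean channel velocity to roughness, hydraulic radius and slope. A minimal sketch; the roughness coefficient, radius and slope values are illustrative, not Merapi field data:

```python
# Hedged sketch of Manning's equation for open-channel flow velocity.
# All input values below are invented for illustration.

def manning_velocity(n, R, S):
    """Mean velocity (m/s): n = Manning roughness, R = hydraulic radius (m),
    S = channel slope (dimensionless)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

v = manning_velocity(n=0.04, R=1.0, S=0.01)
print(round(v, 2))   # -> 2.5 (m/s)
```

Combined with an impact-pressure relation, such a velocity estimate can be compared against the damage-scale thresholds (ca. 3 m/s for weak masonry, up to 6 m/s for rubble stone) quoted in the abstract.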
Suitability of point kernel dose calculation techniques in brachytherapy treatment planning
Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.
2010-01-01
A brachytherapy treatment planning system (TPS) is necessary to estimate the dose to the target volume and organs at risk (OAR). A TPS is always recommended to account for the effects of the tissue, applicator and shielding material heterogeneities that exist in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point, taking care of only the contributions of individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. There are thus some degrees of uncertainty in dose rate estimations under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, to suit clinical conditions. BrachyTPS is an interactive point-kernel code package developed to perform independent dose rate calculations taking into account the effect of these heterogeneities, using the two-region build-up factors proposed by Kalos. The primary aim of this study is to validate the developed point-kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation Isotope and Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator and (iii) the Fletcher-Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the relevant results of the MC simulations. Further, attempts are also made to study the dose rate distribution around a commercially available shielded vaginal applicator set (Nucletron).
The percentage deviations of BrachyTPS-computed dose rate values from the MC results are observed to be within ±5.5% for the BRIT LDR applicator, vary from 2.6 to 5.1% for the Fletcher Green type LDR applicator, and are up to −4.7% for the Fletcher-Williamson HDR applicator. The isodose distribution plots also show good agreement with the results of the previous literature. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code show better agreement (less than two per cent deviation) with MC results in the unshielded region than in the shielded region, where deviations of up to five per cent are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point-kernel code package. PMID:20589118
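The core of any point-kernel calculation is an exponentially attenuated point source corrected by a build-up factor. A minimal sketch using a simple linear (Taylor-type) build-up form; this is an illustration of the general technique, not the two-region Kalos formulation or BrachyTPS values:

```python
# Hedged sketch of a point-kernel dose-rate calculation: isotropic point
# source, exponential attenuation, and a simple linear build-up factor
# B(mu*r) = 1 + a*mu*r. The coefficient a and all numbers are illustrative.
import math

def point_kernel(S, mu, r, a=1.0):
    """Flux at distance r (cm) from a point source of strength S, with
    attenuation coefficient mu (1/cm) and linear build-up factor."""
    buildup = 1.0 + a * mu * r
    return S * buildup * math.exp(-mu * r) / (4.0 * math.pi * r ** 2)

# Build-up raises the estimate above the purely attenuated (B = 1) value
print(point_kernel(1.0, 0.1, 2.0) > point_kernel(1.0, 0.1, 2.0, a=0.0))  # -> True
```

Replacing the single build-up factor with region-dependent factors is what lets such a code approximate applicator and shielding heterogeneities.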
A Modelica-based Model Library for Building Energy and Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael
2009-04-07
This paper describes an open-source library with component models for building energy and control systems that is based on Modelica, an equation-based object-oriented language that is well positioned to become the standard for modeling of dynamic systems in various industrial sectors. The library is currently developed to support computational science and engineering for innovative building energy and control systems. Early applications will include controls design and analysis, rapid prototyping to support innovation of new building systems and the use of models during operation for controls, fault detection and diagnostics. This paper discusses the motivation for selecting an equation-based object-oriented language. It presents the architecture of the library and explains how base models can be used to rapidly implement new models. To demonstrate the capability of analyzing novel energy and control systems, the paper closes with an example where we compare the dynamic performance of a conventional hydronic heating system with thermostatic radiator valves to an innovative heating system. In the new system, instead of a centralized circulation pump, each of the 18 radiators has a pump whose speed is controlled using a room temperature feedback loop, and the temperature of the boiler is controlled based on the speed of the radiator pump. All flows are computed by solving for the pressure distribution in the piping network, and the controls include continuous and discrete time controls.
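The room-temperature feedback loop described in the closing example can be pictured with a toy proportional controller. This is a hedged sketch, not the paper's Modelica code: the gain, setpoint and one-line room dynamics are all invented:

```python
# Hedged sketch of a room-temperature feedback loop driving a radiator
# pump, in the spirit of the example above. The proportional gain,
# setpoint and toy thermal dynamics are assumptions for illustration.

def pump_speed(t_room, t_set=21.0, k_p=0.5):
    """Normalized pump speed in [0, 1] from the room temperature error (K)."""
    return min(1.0, max(0.0, k_p * (t_set - t_room)))

t = 17.0                                  # initial room temperature, deg C
for _ in range(50):                       # crude fixed-step simulation
    u = pump_speed(t)                     # controller output
    t += 0.1 * u - 0.02 * (t - 15.0)      # heating minus losses (toy model)

print(round(t, 1))                        # settles between setpoint and start
```

In the real library such a loop is expressed declaratively as equations, and the pressure distribution of the piping network is solved simultaneously rather than step by step.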
Lattice Boltzmann Study on Seawall-Break Flows under the Influence of Breach and Buildings
NASA Astrophysics Data System (ADS)
Mei, Qiu-Ying; Zhang, Wen-Huan; Wang, Yi-Hang; Chen, Wen-Wen
2017-10-01
In the process of a storm surge, the seawater often overflows and may even destroy the seawall, and the buildings near the shore are then inundated by seawater through the breach. However, at present, few studies have focused on the effects of buildings and the breach on seawall-break flows. In this paper, the lattice Boltzmann (LB) model with nine velocities in two dimensions (D2Q9) for the shallow water equations is adopted to simulate seawall-break flows. The flow patterns and water depth distributions for seawall-break flows under various densities, layouts and shapes of buildings and different breach discharges, sizes and locations are investigated. It is found that when buildings with a sufficiently high density are perpendicular to the main flow direction, an obvious backwater phenomenon appears near the buildings, while this phenomenon does not occur when buildings with the same density are parallel to the main flow direction. Moreover, it is observed that the occurrence of the backwater phenomenon is independent of the building shape. As to the effects of the breach on seawall-break flows, it is found that the effects of an asymmetric distribution of buildings become important only when the breach discharge is large enough or the breach size is small enough. The breach location only changes the flow pattern in the upstream area of the first building that the seawater meets, but has little impact on the global water depth distribution. Supported by the National Natural Science Foundation of China under Grant No. 11502124, the Natural Science Foundation of Zhejiang Province under Grant No. LQ16A020001, the Scientific Research Fund of Zhejiang Provincial Education Department under Grant No. Y201533808, the Natural Science Foundation of Ningbo under Grant No. 2016A610075, and sponsored by the K.C. Wong Magna Fund in Ningbo University.
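The D2Q9 scheme named above discretizes velocity space into nine directions with fixed weights. A minimal sketch of that lattice and the isotropy identities any valid D2Q9 set must satisfy; the shallow-water equilibrium and collision step are omitted, and this is not the authors' code:

```python
# Hedged sketch of the D2Q9 lattice: rest velocity, four axis directions
# and four diagonals, with the standard weights 4/9, 1/9 and 1/36.
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]          # discrete lattice velocities
W = [4 / 9] + [1 / 9] * 4 + [1 / 36] * 4          # matching weights

# Isotropy checks: weights sum to one, and the weighted first moment vanishes
print(abs(sum(W) - 1.0) < 1e-12)                           # -> True
print(sum(w * cx for w, (cx, _) in zip(W, C)) == 0)        # -> True
print(sum(w * cy for w, (_, cy) in zip(W, C)) == 0)        # -> True
```

For the shallow water equations, the water depth plays the role that density plays in standard LB fluid models, with the equilibrium distribution adapted accordingly.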
NASA Astrophysics Data System (ADS)
Verrucci, Enrica; Bevington, John; Vicini, Alessandro
2014-05-01
A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely sensed imagery and statistically sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment.
When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be inferred. Lastly, this work draws attention to the use of the IDCT suite as an education resource for inspiring and training new students and engineers in the field of disaster risk reduction.
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems that are controlled by Virtual Reality based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which facilitate the user's job of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate sensor information from sensors of different levels of abstraction in real time helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization, built on an open-source real-time operating system, is presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications are explained. Its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is one example that is described.
Exploiting semantic linkages among multiple sources for semantic information retrieval
NASA Astrophysics Data System (ADS)
Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang
2014-07-01
The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As the first step to make this vision come true, the initiative of linked open data has fostered many novel applications aimed at improving data accessibility in the public Web. Comparably, the enterprise environment is so different from the public Web that most potentially usable business information originates in an unstructured form (typically in free text), which poses a challenge for the adoption of semantic technologies in the enterprise environment. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study to migrate the concept of linked data into the development of a domain-specific application, i.e. the vehicle repair support system. The set of commonly used concepts, including the part names of a car and the phenomenon terms for car repair, are employed to build linkages between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. Then, we describe approaches of semantic information retrieval that consume these linkages for value creation for companies. The experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-5 and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.
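The linking idea described above can be pictured with a toy example: shared domain concepts (part names, symptom terms) act as keys that join structured records and free-text documents from different sources. All data and concept names below are invented for illustration; the actual system's matching is far richer:

```python
# Hedged sketch of concept-based linkage between structured records and
# free-text documents. Concepts, records and documents are invented.
concepts = {"brake pad", "engine noise"}

records = [{"id": "r1", "part": "brake pad", "stock": 4}]       # structured data
documents = [("d1", "customer reports engine noise after cold start"),
             ("d2", "replaced worn brake pad on front axle")]   # free text

# Build a concept -> documents linkage by simple term matching
links = {c: [doc_id for doc_id, text in documents if c in text]
         for c in concepts}

print(links["brake pad"])     # -> ['d2']
# A record and a document are now joined through the shared concept:
print(records[0]["part"] in links)   # -> True
```

Retrieval can then follow these linkages across source boundaries, which is the basis of the semantic information retrieval approaches evaluated in the paper.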
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
The application, benefits and challenges of retrofitting the existing buildings
NASA Astrophysics Data System (ADS)
Khairi, Muhammad; Jaapar, Aini; Yahya, Zaharah
2017-11-01
Sustainable development has been a main topic of debate for years in countries such as the United Kingdom, the United States of America and Malaysia. Depletion of natural resources, global warming, economic uncertainty and health issues are some of the reasons behind sustainable development movements; it is not just a political debate in parliament but rather collective work among sectors to minimize the negative impact of development on the environment and other living organisms. Retrofitting an existing building is one solution to reduce the dependency on constructing new buildings. There is a huge stock of existing buildings that are suitable to be retrofitted, such as historical buildings, offices, residential buildings, warehouses, factories and vacant buildings. Therefore, the aim of this research is to provide information on the application, benefits and challenges of retrofitting an existing building. Two buildings were chosen as case studies, followed by site visits and observation of the buildings. The data were then compared in table form. Primary and secondary sources were also used for this research. The application of retrofit should be promoted across the construction and conservation industries since it has significant tangible and intangible benefits. It is one of the most environmentally friendly and efficient solutions to optimize energy performance and could also help to extend the life of existing or historical buildings while ensuring optimum thermal comfort for the occupants, which leads to higher productivity.
Software packager user's guide
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
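The packager's core transformation, a modular system description turned into a makefile encoding the build rules, can be sketched in miniature. The spec format and rule template below are invented for illustration and are not the actual package specification language:

```python
# Hedged sketch of spec-to-makefile generation in the spirit of the software
# packager. The dictionary spec format and the C build rule are assumptions.
spec = {"app": {"sources": ["main.c", "net.c"], "cc": "cc"}}

def to_makefile(spec):
    """Emit makefile rules linking each target from its object files."""
    lines = []
    for target, info in spec.items():
        objs = " ".join(s.replace(".c", ".o") for s in info["sources"])
        lines.append(f"{target}: {objs}")                 # dependency line
        lines.append(f"\t{info['cc']} -o {target} {objs}")  # link command
    return "\n".join(lines)

print(to_makefile(spec))
```

The real packager goes further: when components need integration tools such as remote procedure call stubs, it also emits the stub-generation commands into the makefile.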
Network-Based Real-time Integrated Fire Detection and Alarm (FDA) System with Building Automation
NASA Astrophysics Data System (ADS)
Anwar, F.; Boby, R. I.; Rashid, M. M.; Alam, M. M.; Shaikh, Z.
2017-11-01
Fire alarm systems have become an increasingly important lifesaving technology in many aspects, such as applications to detect, monitor and control any fire hazard. A large sum of money is spent annually to install and maintain the fire alarm systems in buildings to protect property and lives from the unexpected spread of fire. Several methods have already been developed and are improving on a daily basis to reduce cost as well as increase quality. An integrated Fire Detection and Alarm (FDA) system with building automation was studied to reduce cost and improve reliability by preventing false alarms. This work proposes an improved framework for an FDA system to ensure a robust intelligent network of FDA control panels in real time. A shortest-path algorithm was chosen for a series of buildings connected by a fiber-optic network. The framework shares information and communicates with each fire alarm panel connected in a peer-to-peer configuration and declares the network state using network address declaration from any building connected in the network. The fiber-optic connection was proposed to reduce signal noise, thus enabling large-area coverage, real-time communication and long-term safety. Based on this proposed method, an experimental setup was designed and a prototype system was developed to validate the performance in practice. Also, the distributed network system was proposed to connect with an optional remote monitoring terminal panel to validate the proposed network performance and ensure fire survivability where the information is sequentially transmitted. The proposed FDA system differs from traditional fire detection and alarm systems in terms of topology as it manages a group of buildings in an optimal and efficient manner.
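The abstract names a shortest-path algorithm for routing between networked panels without specifying which; Dijkstra's algorithm is one common choice and serves as a sketch here. The building graph and link costs are invented for illustration:

```python
# Hedged sketch: Dijkstra's algorithm over a graph of fire alarm panels,
# one plausible shortest-path choice for routing alarm traffic between
# buildings. The panel graph and costs are assumptions, not the paper's data.
import heapq

def shortest_path_cost(graph, src, dst):
    """Minimum total link cost from src to dst; graph: node -> [(nbr, cost)]."""
    dist, heap = {src: 0}, [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

panels = {"A": [("B", 2), ("C", 5)], "B": [("C", 1)], "C": []}
print(shortest_path_cost(panels, "A", "C"))   # -> 3 (route A -> B -> C)
```

In a peer-to-peer panel network, such a computation lets any panel pick the lowest-cost route for forwarding alarm state to the rest of the network.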
2017-03-21
Approved for public release; distribution is unlimited. ESTCP project EW-201409 aimed at demonstrating the benefits of innovative software technology for building HVAC systems. These benefits included reduced system energy use and cost as well as improved … Control Approach, March 2017. This document has been cleared for public release; Distribution Statement A.
Assessment of Muscle Fatigue from TF Distributions of SEMG Signals
2008-06-01
The Wigner-Ville distribution (WVD) holds the … techniques used to build a TF distribution of SEMG signals, namely the spectrogram, Wigner-Ville, Choi-Williams and smoothed pseudo Wigner-Ville distributions. SEMG signals … spectrogram but also other Cohen's-class TF distributions, such as the Choi-Williams distribution (CWD) and the smoothed pseudo Wigner-Ville distribution.
Application of BIM technology in green scientific research office building
NASA Astrophysics Data System (ADS)
Ni, Xin; Sun, Jianhua; Wang, Bo
2017-05-01
As a form of information technology, BIM has gradually been applied in the domestic building industry along with the advancement of building industrialization. Based on a reasonably constructed BIM model, and using a BIM technology platform, collaborative design tools can effectively improve design efficiency and design quality. In combination with the practical situation of the scientific research office building project of Vanda Northwest Engineering Design and Research Institute Co., Ltd., the BIM model was combined with related information to form a building energy model (BEM), the application of BIM technology in the construction management stage was explored, and the direct experience and achievements gained in the architectural design part are summarized.
A Geo-Distributed System Architecture for Different Domains
NASA Astrophysics Data System (ADS)
Moßgraber, Jürgen; Middleton, Stuart; Tao, Ran
2013-04-01
The presentation will describe work on the system-of-systems (SoS) architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". In this project we deal with two use-cases: Natural Crisis Management (e.g. Tsunami Early Warning) and Industrial Subsurface Development (e.g. drilling for oil). These use-cases seem quite different at first sight but share many similarities, like managing and looking up available sensors, extracting data from them and annotating it semantically, intelligently managing the data (a big-data problem), running mathematical analysis algorithms on the data and finally providing decision support on this basis. The main challenge was to create a generic architecture which fits both use-cases. The requirements on the architecture are manifold, and the whole spectrum of a modern, geo-distributed and collaborative system comes into play. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. The most important architectural challenges we needed to address are: (1) building a scalable communication layer for a system-of-systems; (2) building a resilient communication layer for a system-of-systems; (3) efficiently publishing large volumes of semantically rich sensor data; (4) scalable and high-performance storage of large distributed datasets; (5) handling federated multi-domain heterogeneous data; (6) discovery of resources in a geo-distributed SoS; (7) coordination of work between geo-distributed systems. The design decisions made for each of them will be presented.
These developed concepts are also applicable to the requirements of the Future Internet (FI) and Internet of Things (IoT) which will provide services like smart grids, smart metering, logistics and environmental monitoring.
Code of Federal Regulations, 2014 CFR
2014-04-01
... parental support by tribal or state law (if any) applicable to the person at his or her residence, except... construction must conform to applicable tribal, county, State, or national codes and to appropriate building... of severe climate, the size of the house may be reduced to meet the region's applicable building...
Code of Federal Regulations, 2012 CFR
2012-04-01
... parental support by tribal or state law (if any) applicable to the person at his or her residence, except... construction must conform to applicable tribal, county, State, or national codes and to appropriate building... of severe climate, the size of the house may be reduced to meet the region's applicable building...
Code of Federal Regulations, 2011 CFR
2011-04-01
... parental support by tribal or state law (if any) applicable to the person at his or her residence, except... construction must conform to applicable tribal, county, State, or national codes and to appropriate building... of severe climate, the size of the house may be reduced to meet the region's applicable building...
Humidity Distributions in Multilayered Walls of High-rise Buildings
NASA Astrophysics Data System (ADS)
Gamayunova, Olga; Musorina, Tatiana; Ishkov, Alexander
2018-03-01
The limitation of free territories in large cities is the main reason for the active development of high-rise construction. Given the large-scale high-rise building projects of recent years in Russia and abroad and their huge energy consumption, one of the fundamental principles in design and reconstruction is the use of energy-efficient technologies. The main heat loss in buildings occurs through the enclosing structures. However, a heat-resistant wall will not always be energy-efficient and dry at the same time (waterlogging is possible). Temperature and humidity distributions in multilayer walls were studied in this paper, and the interrelation of other thermophysical characteristics was analyzed.
Concrete compositions and methods
Chen, Irvin; Lee, Patricia Tung; Patterson, Joshua
2015-06-23
Provided herein are compositions, methods, and systems for cementitious compositions containing calcium carbonate compositions and aggregate. The compositions find use in a variety of applications, including use in a variety of building materials and building applications.
Analyses of Public Utility Building - Students Designs, Aimed at their Energy Efficiency Improvement
NASA Astrophysics Data System (ADS)
Wołoszyn, Marek Adam
2017-10-01
Public utility buildings are formally, structurally and functionally complex entities. Frequently, the process of their design involves the retroactive reconsideration of energy engineering issues, once a building concept has already been completed. At that stage, minor formal corrections are made along with the design of the external layer of the building in order to satisfy applicable standards. Architecture students do the same when designing assigned public utility buildings. To demonstrate the energy-related defects of building designs developed by students, analyses were conducted. The completed designs of public utility buildings were examined with regard to the energy efficiency of their solutions through the application of the following programs: Ecotect and Vasari; for simpler analyses, ArchiCad program extensions were sufficient.
Multiple Dimensions to the Application for the Effectiveness of Team Building in ROTC
ERIC Educational Resources Information Center
Chen, Yin-Che; Chen, Yun-Chi; Tsao, Ya-Lun
2009-01-01
Team building has been given increasing attention and applied in diverse disciplines in recent years. The purpose of this study was to determine the multiple dimensions to the application for the effectiveness of team building in military units such as ROTC, since little existing literature has investigated this in such an expectedly high…
Distribution Management System Volt/VAR Evaluation | Grid Modernization | NREL
This project involves building a prototype distribution management system testbed that links a GE Grid Solutions distribution management system to power hardware-in-the-loop testing.
24 CFR 941.203 - Design and construction standards.
Code of Federal Regulations, 2013 CFR
2013-04-01
... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...
24 CFR 941.203 - Design and construction standards.
Code of Federal Regulations, 2012 CFR
2012-04-01
... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...
The implementation and use of Ada on distributed systems with high reliability requirements
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
The general inadequacy of Ada for programming systems that must survive processor loss was shown. A solution to the problem was proposed in which there are no syntactic changes to Ada. The approach was evaluated using a full-scale, realistic application. The application used was the Advanced Transport Operating System (ATOPS), an experimental computer control system developed for a modified Boeing 737 aircraft. The ATOPS system is a full authority, real-time avionics system providing a large variety of advanced features. Methods of building fault tolerance into concurrent systems were explored. A set of criteria by which the proposed method will be judged was examined. Extensive interaction with personnel from Computer Sciences Corporation and NASA Langley occurred to determine the requirements of the ATOPS software. Backward error recovery in concurrent systems was assessed.
Application and study of land-reclaim based on Arc/Info
NASA Astrophysics Data System (ADS)
Zhao, Jun; Zhang, Ruiju; Wang, Zhian; Li, Shiyong
2005-10-01
This paper first puts forward evaluation models of land-reclaim, derived from the theory of fuzzy associative memory neural networks and corresponding supplemental CASE tools; based on these models, the mode of land-reclaim can be determined, and the elements of land-reclaim are then displayed and synthesized visually and virtually by virtue of Arc/Info software. In the process of land reclaim, it is particularly important to build the model of land-reclaim and to map the distribution of soil elements, so that rational and feasible schemes are adopted to guide the land-reclaim project. This thesis mainly takes the fourth mining area of East Beach as an example and puts this model into practice. Based on Arc/Info software, the application of land-reclaim is studied and good results are achieved.
Distribution trend of high-rise buildings worldwide and factor exploration
NASA Astrophysics Data System (ADS)
Yu, Shao-Qiao
2017-08-01
This paper elaborates the present-day development of high-rise buildings. The worldwide development trend of super high-rise buildings is analyzed based on data from the Council on Tall Buildings and Urban Habitat, taking as objects the top 100 high-rise buildings on different continents, their development over time, and their building types. The analysis shows flourishing super high-rise construction in the UAE and stable development of European and American high-rise buildings. The reasons for the regions' different degrees of development are demonstrated from the aspects of social development, economy, culture and consciousness. The paper also presents unavoidable issues of super high-rise buildings and calls for their rational treatment.
2004-11-01
variation in ventilation rates over time and the distribution of ventilation air within a building, and to estimate the impact of envelope air ... tightening efforts on infiltration rates. • It may be used to determine the indoor air quality performance of a building before construction, and to
ERIC Educational Resources Information Center
Milbank, N. O.
Two similarly large buildings and air conditioning systems are comparatively analyzed as to energy consumption, costs, and inefficiency during certain measured periods of time. Building design and velocity systems are compared to heating, cooling, lighting and distribution capabilities. Energy requirements for pumps, fans and lighting are found to…
Common Object Library Description
2012-08-01
For Building Information Modeling (BIM) technology to be successful, it must be consistently applied across many projects, by many teams. The National Building Information ... BIM standards and for future research projects. Subject terms: building information modeling (BIM), object
Open Clients for Distributed Databases
NASA Astrophysics Data System (ADS)
Chayes, D. N.; Arko, R. A.
2001-12-01
We are actively developing a collection of open source example clients that demonstrate the use of our "back end" data management infrastructure. The data management system is reported elsewhere at this meeting (Arko and Chayes: A Scaleable Database Infrastructure). In addition to their primary goal of being examples for others to build upon, some of these clients may have limited utility in themselves. More information about the clients and the data infrastructure is available online at http://data.ldeo.columbia.edu. The examples to be demonstrated include several web-based clients, including those developed for the Community Review System of the Digital Library for Earth System Education, a real-time watch stander's log book, an offline interface to log book entries, and a simple client to search multibeam metadata; these are Internet-enabled, generally web-based front ends that support searches against one or more relational databases using industry-standard SQL queries. In addition to the web-based clients, simple SQL searches from within Excel and similar applications will be demonstrated. By defining, documenting and publishing a clear interface to the fully searchable databases, it becomes relatively easy to construct client interfaces optimized for specific applications, in comparison to building a monolithic data and user interface system.
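As a concrete illustration of the design point in this abstract — thin clients issuing standard SQL against a published database interface — the following self-contained sketch uses an in-memory SQLite table. The table name, columns and rows are invented for illustration, not the actual LDEO schema:

```python
import sqlite3

# Hypothetical miniature of a searchable multibeam-metadata table; the real
# databases, schemas, and column names are not specified in the abstract.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE multibeam (cruise TEXT, year INTEGER, region TEXT)")
conn.executemany("INSERT INTO multibeam VALUES (?, ?, ?)",
                 [("EW9602", 1996, "East Pacific Rise"),
                  ("KN145", 1998, "Mid-Atlantic Ridge"),
                  ("AT0703", 2002, "East Pacific Rise")])

# A thin client only needs to issue a standard SQL query against the
# published interface; all search logic stays in the database.
rows = conn.execute(
    "SELECT cruise, year FROM multibeam WHERE region = ? ORDER BY year",
    ("East Pacific Rise",)).fetchall()
print(rows)
```

Because the interface is just SQL, the same query works unchanged from a web front end, a spreadsheet ODBC connection, or a script, which is the scalability argument the abstract makes against a monolithic data-and-UI system.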
Korocsec, D; Holobar, A; Divjak, M; Zazula, D
2005-12-01
Medicine is a difficult thing to learn. Experimenting with real patients should not be the only option; simulation deserves special attention here. Virtual Reality Modelling Language (VRML), as a tool for building virtual objects and scenes, has a good record of educational applications in medicine, especially for static and animated visualisations of body parts and organs. However, to create computer simulations resembling situations in real environments, the required level of interactivity and dynamics is difficult to achieve. In the present paper we describe some approaches and techniques which we used to push the limits of the current VRML technology further toward dynamic 3D representation of virtual environments (VEs). Our demonstration is based on the implementation of a virtual baby model, whose vital signs can be controlled from an external Java application. The main contributions of this work are: (a) an outline and evaluation of the three-level VRML/Java implementation of the dynamic virtual environment, (b) a proposal for a modified VRML TimeSensor node, which greatly improves the overall control of system performance, and (c) the architecture of a prototype distributed virtual environment for training in neonatal resuscitation, comprising the interactive virtual newborn, an active bedside monitor for vital signs and a full 3D representation of the surgery room.
Building resilience of the Global Positioning System to space weather
NASA Astrophysics Data System (ADS)
Fisher, Genene; Kunches, Joseph
2011-12-01
Almost every aspect of the global economy now depends on GPS. Worldwide, nations are working to create a robust Global Navigation Satellite System (GNSS), which will provide global positioning, navigation, and timing (PNT) services for applications such as aviation, electric power distribution, financial exchange, maritime navigation, and emergency management. The U.S. government is examining the vulnerabilities of GPS, and it is well known that space weather events, such as geomagnetic storms, contribute to errors in single-frequency GPS and are a significant factor for differential GPS. The GPS industry has lately begun to recognize that total electron content (TEC) signal delays, ionospheric scintillation, and solar radio bursts can also interfere with daily operations and that these threats grow with the approach of the next solar maximum, expected to occur in 2013. The key challenges raised by these circumstances are, first, to better understand the vulnerability of GPS technologies and services to space weather and, second, to develop policies that will build resilience and mitigate risk.
Capolongo, Stefano; Gola, Marco; di Noia, Michela; Nickolova, Maria; Nachiero, Dario; Rebecchi, Andrea; Settimo, Gaetano; Vittori, Gail; Buffoli, Maddalena
2016-01-01
Nowadays several rating systems exist for the evaluation of the sustainability of buildings, but often their focus is limited to environmental and efficiency aspects. Hospitals are complex constructions in which many variables affect hospital processes. Therefore, a research group has developed a tool for the evaluation of sustainability in healthcare facilities. The paper analyses social sustainability issues through a tool which evaluates users' perception from a quality and well-being perspective. It presents a hierarchical structure composed of a criteria and indicators system, organised through a weighing system calculated by using the Analytic Network Process. The output is the definition of a tool which evaluates how Humanisation, Comfort and Distribution criteria can affect the social sustainability of a building. Starting from its application, it is evident that the instrument enables the improvement of healthcare facilities through several design and organisational suggestions for achieving healing and sustainable architectures.
Characterization of an aerosol sample from the auxiliary building of the Three Mile Island reactor.
Kanapilly, G M; Stanley, J A; Newton, G J; Wong, B A; DeNee, P B
1983-11-01
Analyses for radioisotopic composition and dissolution characteristics were performed on an aerosol filter sample collected for a week by an air sampler located in the auxiliary building of the Three Mile Island nuclear reactor. The major radioisotopes found on the filter were 89Sr, 90Sr, 134Cs and 137Cs. Greater than 90% of both 89-90Sr and 134-137Cs dissolved within 48 hr in an in vitro test system. Scanning electron microscopic analyses showed the presence of respirable size particles as well as larger particles ranging up to 10 micron in diameter. The major matrix components were Fe, Ca, S, Mg, Al and Si. Although the radionuclides were present in a heterogeneous matrix, they were in a soluble form. This information enables a better evaluation of bioassay data and predictions of dose distribution resulting from an inhalation exposure to this aerosol. Further, the combination of techniques used in this study may be applicable to the characterization of other aerosols of unknown composition.
Wind Turbines in the Built Environment: Summary of a Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinnesand, Heidi; Baring-Gould, Ian; Fields, Jason
2016-09-28
Built-environment wind turbine (BEWT) projects are wind energy projects that are constructed on, in, or near buildings. These projects present an opportunity for distributed, low-carbon generation combined with highly visible statements on sustainability, but the BEWT niche of the wind industry is still developing and is relatively less mature than the utility-scale wind or conventional ground-based distributed wind sectors. The findings presented in this presentation cannot be extended to wind energy deployments in general because of the large difference in application and technology maturity. This presentation summarizes the results of a report investigating the current state of the BEWT industry by reviewing available literature on BEWT projects as well as interviewing project owners on their experiences deploying and operating the technology. The authors generated a series of case studies that outlines the pertinent project details, project outcomes, and lessons learned.
An optimization model for energy generation and distribution in a dynamic facility
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1981-01-01
An analytical model is described using linear programming for the optimum generation and distribution of energy demands among competing energy resources and different economic criteria. The model, which will be used as a general engineering tool in the analysis of the Deep Space Network ground facility, considers several essential decisions for better design and operation. The decisions sought for the particular energy application include: the optimum time to build an assembly of elements, inclusion of a storage medium of some type, and the size or capacity of the elements that will minimize the total life-cycle cost over a given number of years. The model, which is structured in multiple time divisions, employs the decomposition principle for large-size matrices, the branch-and-bound method in mixed-integer programming, and the revised simplex technique for efficient and economic computer use.
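The build/size decisions this abstract describes — whether to include a storage medium and which capacities minimize life-cycle cost — are naturally expressed as a mixed-integer linear program. A minimal sketch with invented toy numbers (two demand periods, one binary storage-build decision; not the DSN model or its data), using SciPy's `milp`:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Decision variables: [g, c, s, b]
#   g = generation capacity, c = energy shifted period 1 -> 2 via storage,
#   s = storage size, b = 1 if storage is built (binary).
# Toy life-cycle costs: 10/unit capacity, 1/unit storage, 20 fixed build cost.
cost = np.array([10.0, 0.0, 1.0, 20.0])

A = np.array([
    [-1.0,  1.0,  0.0,   0.0],   # g - c >= 5   (serve period-1 demand + charge)
    [-1.0, -1.0,  0.0,   0.0],   # g + c >= 15  (serve period-2 demand)
    [ 0.0,  1.0, -1.0,   0.0],   # c <= s       (cannot shift more than stored)
    [ 0.0,  0.0,  1.0, -20.0],   # s <= 20*b    (no storage unless built)
])
ub = np.array([-5.0, -15.0, 0.0, 0.0])

res = milp(c=cost,
           constraints=LinearConstraint(A, -np.inf, ub),
           integrality=np.array([0, 0, 0, 1]),   # only b is integer
           bounds=Bounds(0, [np.inf, np.inf, np.inf, 1]))
print(res.x, res.fun)   # optimal [g, c, s, b] and minimum life-cycle cost
```

The binary build decision is resolved by branch-and-bound while the continuous sizing variables are handled by the underlying LP solver, mirroring the mixed-integer machinery named in the abstract (with these numbers, building storage to flatten the demand peak beats sizing the generator for the peak alone).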
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mosher, J. C.; Baillet, S.; Jerbi, K.
2001-01-01
We describe the use of truncated multipolar expansions for producing dynamic images of cortical neural activation from measurements of the magnetoencephalogram. We use a signal-subspace method to find the locations of a set of multipolar sources, each of which represents a region of activity in the cerebral cortex. Our method builds up an estimate of the sources in a recursive manner, i.e. we first search for point current dipoles, then magnetic dipoles, and finally first order multipoles. The dynamic behavior of these sources is then computed using a linear fit to the spatiotemporal data. The final step in the procedure is to map each of the multipolar sources into an equivalent distributed source on the cortical surface. The method is illustrated through an application to epileptic interictal MEG data.
NASA Astrophysics Data System (ADS)
Tian, Jingjing
Low-rise woodframe buildings with disproportionately flexible ground stories represent a significant percentage of the building stock in seismically vulnerable communities in the Western United States. These structures have a readily identifiable structural weakness at the ground level due to an asymmetric distribution of large openings in the perimeter wall lines and to a lack of interior partition walls, resulting in a soft story condition that makes the structure highly susceptible to severe damage or collapse under design-level earthquakes. The conventional approach to retrofitting such structures is to increase the ground story stiffness. An alternate approach is to increase the energy dissipation capacity of the structure via the incorporation of supplemental energy dissipation devices (dampers), thereby relieving the energy dissipation demands on the framing system. Such a retrofit approach is consistent with a Performance-Based Seismic Retrofit (PBSR) philosophy through which multiple performance levels may be targeted. The effectiveness of such a retrofit is presented via examination of the seismic response of a full-scale four-story building that was tested on the outdoor shake table at NEES-UCSD and a full-scale three-story building that was tested using slow pseudo-dynamic hybrid testing at NEES-UB. In addition, a Direct Displacement Design (DDD) methodology was developed as an improvement over current DDD methods by considering torsion, with or without the implementation of damping devices, in an attempt to avoid the computational expense of nonlinear time-history analysis (NLTHA) and thus facilitating widespread application of PBSR in engineering practice.
Oduola, Adedayo O; Obembe, Abiodun; Adelaja, Olukayode J; Adeneye, Adeniyi K; Akilah, Joel; Awolola, Taiwo S
2018-05-15
Despite the availability of effective malaria vector control intervention tools, implementation of control programmes in Nigeria is challenged by inadequate entomological surveillance data. This study was designed to assess and build the existing capacity for malaria vector surveillance, control and research (MVSC&R) in Nigerian institutions. An application call to select qualified candidates for the capacity building (CB) intervention training programme was advertised in a widely read newspaper and on online platforms of national and international professional bodies. Two trainings were organized to train selected applicants on field activities, laboratory tools and techniques relevant to malaria vector surveillance and control research. A semi-structured questionnaire was administered to collect data on socio-demographic characteristics of participants and participants' knowledge of and access to field and laboratory techniques in MVSC&R. Similarly, pre- and post-intervention tests were conducted to assess the performance and improvement in knowledge of the participants. Mentoring activities to sustain CB activities after the training were also carried out. A total of 23 suitable applicants were shortlisted out of the 89 applications received. The South West, South East and North Central geopolitical zones of the country had the highest numbers of applications and of selected qualified applicants compared to the South South and North East geopolitical zones. The gender distribution indicated that males (72.7%) outnumbered females (27.3%). The mean score of participants' knowledge of field techniques was 27.8 (± 10.8) before training and 67.7 (± 9.8) after the training. Similarly, participants' knowledge of laboratory techniques also improved, from 37.4 (± 5.6) to 77.2 (± 10.8). The difference in the mean scores at pre- and post-test was statistically significant (p < 0.05).
Access of participants to laboratory and field tools used in MVSC&R was generally low, with insecticide susceptibility bioassays and pyrethrum spray collection methods being the most significant (p < 0.05). The capacity available for vector control research and surveillance at the institutional level in Nigeria is weak and requires further strengthening. Increased training and access of personnel to relevant tools for MVSC&R are required in higher institutions in the six geopolitical zones of the country.
Learning and recognition of on-premise signs from weakly labeled street view images.
Tsai, Tsung-Hung; Cheng, Wen-Huang; You, Chuang-Wen; Hu, Min-Chun; Tsui, Arvin Wen; Chi, Heng-Yu
2014-03-01
Camera-enabled mobile devices are commonly used as interaction platforms for linking the user's virtual and physical worlds in numerous research and commercial applications, such as serving an augmented reality interface for mobile information retrieval. The various application scenarios give rise to a key technique of daily life visual object recognition. On-premise signs (OPSs), a popular form of commercial advertising, are widely used in daily life. OPSs often exhibit great visual diversity (e.g., appearing in arbitrary sizes), accompanied by complex environmental conditions (e.g., foreground and background clutter). Observing that such real-world characteristics are lacking in most existing image data sets, in this paper we first propose an OPS data set, namely OPS-62, in which a total of 4649 OPS images of 62 different businesses were collected from Google's Street View. Further, for addressing the problem of real-world OPS learning and recognition, we developed a probabilistic framework based on distributional clustering, in which we propose to exploit the distributional information of each visual feature (the distribution of its associated OPS labels) as a reliable selection criterion for building discriminative OPS models. Experiments on the OPS-62 data set demonstrated that our approach outperforms state-of-the-art probabilistic latent semantic analysis models, with more accurate recognition and fewer false alarms, and a significant 151.28% relative improvement in the average recognition rate. Meanwhile, our approach is simple, linear, and can be executed in a parallel fashion, making it practical and scalable for large-scale multimedia applications.
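The selection criterion the abstract sketches — keeping visual features whose associated label distributions are concentrated on few businesses — can be illustrated with a simple entropy score. This is an illustrative reading, not the paper's exact distributional-clustering algorithm, and the co-occurrence counts below are invented:

```python
import numpy as np

def label_entropy(counts):
    """Shannon entropy of a feature's label distribution (lower = more
    concentrated on few labels, hence more discriminative)."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical co-occurrence counts: rows = visual features, cols = OPS labels.
counts = np.array([
    [50,  1,  1],   # concentrated on one business -> discriminative
    [20, 18, 19],   # spread evenly -> likely background clutter
    [ 2, 45,  3],   # concentrated -> discriminative
])
scores = np.array([label_entropy(row) for row in counts])
keep = scores < 1.0   # retain features whose label distribution is peaked
print(keep)
```

Features dominated by foreground/background clutter co-occur with many businesses and get a flat, high-entropy label distribution, so a threshold on this score filters them out before model building.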
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2012 CFR
2012-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2014 CFR
2014-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2011 CFR
2011-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
An indoor augmented reality mobile application for simulation of building evacuation
NASA Astrophysics Data System (ADS)
Sharma, Sharad; Jerripothula, Shanmukha
2015-03-01
Augmented Reality enables people to remain connected with the physical environment they are in, and invites them to look at the world from new and alternative perspectives. There has been increasing interest in emergency evacuation applications for mobile devices. Nearly all smart phones these days are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency evacuation system that will help people safely evacuate a building in case of an emergency situation. It will further enhance knowledge and understanding of where the exits are in the building and of safety evacuation procedures. We have applied mobile augmented reality (mobile AR) to create an application with the Unity 3D gaming engine. We show how the mobile AR application is able to display a 3D model of the building and an animation of people evacuating, using markers and a web camera. The system gives a visual representation of a building in 3D space, allowing people to see where the exits are through the use of a smart phone or tablet. Pilot studies conducted with the system showed partial success and demonstrated the effectiveness of the application in emergency evacuation. Our computer vision methods give good results when the markers are close to the camera, but accuracy decreases when the markers are far away from the camera.
Analysis of building deformation in landslide area using multisensor PSInSAR™ technique.
Ciampalini, Andrea; Bardi, Federica; Bianchini, Silvia; Frodella, William; Del Ventisette, Chiara; Moretti, Sandro; Casagli, Nicola
2014-12-01
Buildings are sensitive to movements caused by ground deformation. Mapping both the spatial and temporal distribution and the degree of building damage represents a useful tool for understanding landslide evolution, magnitude and stress distribution. The high spatial resolution of space-borne SAR interferometry can be used to monitor displacements related to building deformations. In particular, the PSInSAR technique is used to map and monitor ground deformation with millimeter accuracy. The usefulness of the above-mentioned methods was evaluated in the San Fratello municipality (Sicily, Italy), which has historically been affected by landslides, the most recent of which occurred on 14th February 2010. PSInSAR data collected by ERS 1/2, ENVISAT and RADARSAT-1 were used to study the building deformation velocities before the 2010 landslide. The X-band sensors COSMO-SkyMed and TerraSAR-X were used to monitor the building deformation after this event. During 2013, after accurate field inspection of buildings and structures, a damage assessment map of San Fratello was created and then compared to the building deformation velocity maps. The most interesting results were obtained from the comparison between the building deformation velocity map obtained through COSMO-SkyMed and the damage assessment map. This approach can be profitably used by local and Civil Protection Authorities to manage the post-event phase and evaluate the residual risks.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
Application of GIS in exploring spatial dimensions of Efficiency in Competitiveness of Regions
NASA Astrophysics Data System (ADS)
Rahmat, Shahid; Sen, Joy
2017-04-01
Infrastructure is an important component in building the competitiveness of a region. The present global economic slowdown is driven by a slump in demand for goods and services and by the decreasing capacity of government institutions to invest in public infrastructure. A strategy for augmenting the competitiveness of a region can therefore be built around improving the efficient distribution of public infrastructure within it. This efficiency reduces the burden on government institutions and improves the relative output of the region for comparatively less investment. A rigorous literature study followed by an expert opinion survey (RIDIT scores) reveals that railway, road, ICT and electricity infrastructure are crucial for the competitiveness of a region. Discussions with experts in the ICT, railway and electricity sectors were conducted to identify the issues, hurdles and possible solutions for the development of these sectors. In an underdeveloped country like India, financial resources for investment in the infrastructure sector are severely constrained, so judicious planning of resource allocation for infrastructure provision becomes very important for efficient and sustainable development. Data Envelopment Analysis (DEA) is a mathematical programming optimization tool that measures technical efficiency in the multiple-input and/or multiple-output case by constructing a relative technical efficiency score. This paper uses DEA to identify the efficiency with which the present level of selected infrastructure components (railway, road, ICT and electricity) is utilized to build the competitiveness of the region, and identifies the spatial pattern of infrastructure efficiency with the help of spatial autocorrelation and hot-spot analysis in ArcGIS.
This analysis leads to policy implications for the efficient allocation of financial resources for the provision of infrastructure in the region, building a prerequisite for boosting efficient regional competitiveness.
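DEA, as described in this abstract, constructs each region's relative technical efficiency score by solving one small linear program per decision-making unit. A minimal sketch of the input-oriented CCR envelopment form with invented illustrative data (the paper's actual inputs, outputs and regions are not reproduced here), using SciPy:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA: for each unit o, minimize theta such that a
    nonnegative combination of peers uses at most theta * o's inputs while
    producing at least o's outputs."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                    # vars: [theta, lambdas]
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])   # sum λ_j x_ij <= θ x_io
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # sum λ_j y_rj >= y_ro
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)

# Illustrative data (not from the paper): 4 regions,
# inputs = (rail index, road index), output = regional product.
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 6.0]])
Y = np.array([[10.0], [12.0], [11.0], [12.0]])
scores = dea_ccr_input(X, Y)
print(scores)   # efficient regions score 1.0; others < 1.0
```

A score of 1.0 marks a region on the efficient frontier; a score of, say, 0.5 means the same outputs could in principle be produced with half the inputs of an efficient peer mix, which is the kind of relative signal the paper then maps spatially.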
On Feature Extraction from Large Scale Linear LiDAR Data
NASA Astrophysics Data System (ADS)
Acharjee, Partha Pratim
Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrain. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two feature extraction algorithms and demonstrated uses of the extracted features in practical applications. One of the developed algorithms can map still and flowing waterbody features, and the other can extract building features and estimate solar potential on rooftops and facades. Remote sensing capabilities, the distinguishing characteristics of laser returns from water surfaces, and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation/upgradation and hydro-flattening of LiDAR data for many other applications are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms using multiple semi-automated tools and human intervention. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features such as the flatness of the water surface and higher elevation change at the water-land interface, and optical properties such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated by automated and intelligent windowing, by resolving boundary issues, and by integrating all results into a single output. The whole algorithm is developed as an ArcGIS toolbox using Python libraries.
Testing and validation are performed on large datasets to determine the effectiveness of the toolbox, and results are presented. Significant power demand is located in urban areas, where, theoretically, a large amount of building surface area is also available for solar panel installation. Therefore, property owners and power generation companies can benefit from a citywide solar potential map, which can provide the estimated available annual solar energy at a given location. An efficient solar potential measurement is a prerequisite for an effective solar energy system in an urban area. In addition, calculating solar potential from rooftops and building facades could open up a wide variety of options for solar panel installations. However, complex urban scenes make it hard to estimate the solar potential, partly because of shadows cast by the buildings. LiDAR-based 3D city models could well be the right technology for solar potential mapping. However, most current LiDAR-based local solar potential assessment algorithms mainly address rooftop potential calculation, whereas building facades can contribute a significant amount of viable surface area for solar panel installation. In this paper, we introduce a new algorithm to calculate the solar potential of both rooftops and building facades. The solar potential received by the rooftops and facades over the year is also investigated in the test area.
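The core quantity behind such a map can be sketched as a simple energy balance. The sketch below is illustrative only: the function name and all values are assumptions, not from the dissertation, whose algorithm derives surface geometry and cast shadows directly from the LiDAR point cloud.

```python
# Simplified annual solar-potential estimate for one surface.
# Illustrative only; the paper's LiDAR-based algorithm computes
# orientation and shadowing from 3D city geometry.

def annual_solar_potential_kwh(area_m2, irradiance_kwh_m2_yr,
                               tilt_factor=1.0, shading_factor=1.0):
    """Estimate annual solar energy received by a surface.

    area_m2: usable surface area (rooftop or facade)
    irradiance_kwh_m2_yr: annual global horizontal irradiance
    tilt_factor: correction for surface orientation/tilt (1.0 = horizontal)
    shading_factor: fraction of incident energy not lost to shadows (0..1)
    """
    return area_m2 * irradiance_kwh_m2_yr * tilt_factor * shading_factor

# Hypothetical example: a 100 m^2 rooftop at 1500 kWh/m^2/yr with 20%
# shading loss, versus a smaller, more heavily shaded facade.
roof = annual_solar_potential_kwh(100, 1500, tilt_factor=1.1, shading_factor=0.8)
facade = annual_solar_potential_kwh(60, 1500, tilt_factor=0.7, shading_factor=0.5)
```

The facade term is the motivation for the paper's extension beyond rooftops: even with lower tilt and shading factors, facades add viable collection area that rooftop-only methods ignore.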
Self-Organisation and Capacity Building: Sustaining the Change
ERIC Educational Resources Information Center
Bain, Alan; Walker, Allan; Chan, Anissa
2011-01-01
Purpose: The paper aims to describe the application of theoretical principles derived from a study of self-organisation and complex systems theory and their application to school-based capacity building to support planned change. Design/methodology/approach: The paper employs a case example in a Hong Kong School to illustrate the application of…
NASA Technical Reports Server (NTRS)
Chandler, William S.; Stackhouse, Paul W., Jr.; Barnett, Audy J.; Hoell, James M.; Westberg, David J.; Ross, Amanda I.
2015-01-01
Renewable energy technologies are changing the face of the world's energy market. Currently, these technologies are being incorporated within existing structures to increase energy efficiency. Crucial to the success of the emerging renewable market is the availability of accurate, global solar radiation, and meteorology data. This poster traces the history of the development of an effort to distribute data parameters from NASA's research for use in the energy sector applications spanning from renewable energy to energy efficiency. These data may be useful to several renewable energy sectors: solar and wind power generation, agricultural crop modeling, and sustainable buildings.
Nearest neighbor density ratio estimation for large-scale applications in astronomy
NASA Astrophysics Data System (ADS)
Kremer, J.; Gieseke, F.; Steenstrup Pedersen, K.; Igel, C.
2015-09-01
In astronomical applications of machine learning, the distribution of objects used for building a model is often different from the distribution of the objects the model is later applied to. This is known as sample selection bias, which is a major challenge for statistical inference as one can no longer assume that the labeled training data are representative. To address this issue, one can re-weight the labeled training patterns to match the distribution of unlabeled data that are available already in the training phase. There are many examples in practice where this strategy yielded good results, but estimating the weights reliably from a finite sample is challenging. We consider an efficient nearest neighbor density ratio estimator that can exploit large samples to increase the accuracy of the weight estimates. To solve the problem of choosing the right neighborhood size, we propose to use cross-validation on a model selection criterion that is unbiased under covariate shift. The resulting algorithm is our method of choice for density ratio estimation when the feature space dimensionality is small and sample sizes are large. The approach is simple and, because of the model selection, robust. We empirically find that it is on a par with established kernel-based methods on relatively small regression benchmark datasets. However, when applied to large-scale photometric redshift estimation, our approach outperforms the state-of-the-art.
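The nearest-neighbor density ratio idea can be sketched as follows. This brute-force version (all names are illustrative) compares local density estimates of the labeled and unlabeled samples at each training point; the paper's estimator is engineered for large samples and chooses the neighborhood size by cross-validation on a criterion that is unbiased under covariate shift.

```python
import math

def knn_density_ratio(train, test, k=3):
    """Nearest-neighbor estimate of w(x) ~ p_test(x) / p_train(x).

    For each training point, take the distance r to its k-th nearest
    neighbor within the training set, count test points inside r, and
    form the ratio of the two local density estimates.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    n_tr, n_te = len(train), len(test)
    weights = []
    for x in train:
        # distance to the k-th nearest *other* training point
        r = sorted(dist(x, t) for t in train if t is not x)[k - 1]
        # number of test points within the same radius
        m = sum(1 for u in test if dist(x, u) <= r)
        # (m / n_te) / (k / n_tr): ratio of local density estimates
        weights.append((m * n_tr) / (k * n_te))
    return weights
```

Training points lying where unlabeled data are dense receive large weights, which is exactly the re-weighting needed to correct sample selection bias before model fitting.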
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lutes, Robert G.; Neubauer, Casey C.; Haack, Jereme N.
2015-03-31
The Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of an open-source software tool for analyzing building energy and operational data: OpenEIS (open energy information system). This tool addresses the problems of both owners of building data and developers of tools to analyze this data. Building owners and managers have data but lack the tools to analyze it, while tool developers lack data in a common format to ease development of reusable data analysis tools. This document is intended for developers of applications and explains the mechanisms for building analysis applications, accessing data, and displaying data using a visualization from the included library. A brief introduction to the visualizations can be used as a jumping-off point for developers familiar with JavaScript to produce their own. Several example applications are included, which can be used along with this document to implement algorithms for performing energy data analysis.
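The kind of reusable analysis application the document targets can be sketched generically. Note that this does not reproduce OpenEIS's actual application interface; the function name and data layout below are assumptions chosen only to show the pattern of consuming timestamped building data and emitting a derived series.

```python
def daily_load_profile(readings):
    """Average power by hour of day from (hour, kW) pairs.

    A generic stand-in for a reusable energy-analysis application:
    it takes raw interval data in a common format and returns a
    derived series suitable for visualization.
    """
    totals, counts = {}, {}
    for hour, kw in readings:
        totals[hour] = totals.get(hour, 0.0) + kw
        counts[hour] = counts.get(hour, 0) + 1
    # average per observed hour of day
    return {h: totals[h] / counts[h] for h in totals}
```

A profile like this is one of the simplest diagnostics a building operator can run: hours with unexpectedly high average load flag scheduling or equipment faults.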
Research on the Application of GRC Material in Exhibition Decoration Engineering
NASA Astrophysics Data System (ADS)
Cai, Yan
2018-03-01
Glass fiber reinforced cement (GRC) is a kind of new building material that is based on cement and takes alkali-resistant glass fiber as the reinforcing material. It is mainly used in building decoration projects and has many advantages, such as environmental friendliness, economy, and practical molding. This paper mainly studies the concrete application of GRC material in exhibition building decoration projects.
Managed traffic evacuation using distributed sensor processing
NASA Astrophysics Data System (ADS)
Ramuhalli, Pradeep; Biswas, Subir
2005-05-01
This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.
URBANopt Advanced Analytics Platform | Buildings | NREL
For districts, neighborhoods, and campuses with different building types and mixed-use buildings, different buildings may peak in energy consumption at different times. The platform addresses questions such as: How does energy consumption vary depending on different building efficiency scenarios?
Developing Distributed Collaboration Systems at NASA: A Report from the Field
NASA Technical Reports Server (NTRS)
Becerra-Fernandez, Irma; Stewart, Helen; Knight, Chris; Norvig, Peter (Technical Monitor)
2001-01-01
Web-based collaborative systems have assumed a pivotal role in the information systems development arena. While business-to-customer (B-to-C) and business-to-business (B-to-B) electronic commerce systems, search engines, and chat sites are the focus of attention, web-based systems span the gamut of information systems that were traditionally confined to internal organizational client-server networks. For example, the Domino Application Server allows Lotus Notes (trademarked) users to build collaborative intranet applications, and mySAP.com (trademarked) enables web portals and e-commerce applications for SAP users. This paper presents the experiences in the development of one such system: Postdoc, a government off-the-shelf web-based collaborative environment. Issues related to the design of web-based collaborative information systems, including lessons learned from the development and deployment of the system as well as measured performance, are presented. Finally, the limitations of the implementation approach and future plans are discussed.
Measure Guideline. Combination Forced-Air Space and Tankless Domestic Hot Water Heating Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudd, Armin
2012-08-01
This document describes design and application guidance for combination space and tankless domestic hot water heating systems (combination systems) used in residential buildings, based on field evaluation, testing, and industry meetings conducted by Building Science Corporation. As residential building enclosure improvements continue to drive heating loads down, using the same water heating equipment for both space heating and domestic water heating becomes attractive from an initial cost and space-saving perspective. This topic is applicable to single- and multi-family residential buildings, both new and retrofitted.
Simplified Distributed Computing
NASA Astrophysics Data System (ADS)
Li, G. G.
2006-05-01
Distributed computing ranges from high-performance parallel computing and GRID computing to environments where the idle CPU cycles and storage space of numerous networked systems are harnessed to work together through the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications, based on existing technology and hardware resources. The system consists of a series of controllers. When a job request is detected by a monitor or initiated by an end user, the job manager launches the specific job handler for that job. The job handler pre-processes the job, partitions it into relatively independent tasks, and distributes the tasks into the processing queue. The task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request when all tasks are finished for each job. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage the software download and report status. The processing queue is the key to the success of this distributed system. We use BEA's WebLogic JMS queue in our implementation. It guarantees message delivery and has message-priority and retry features, so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adaptors are provided to connect existing applications to the system, so that applications and algorithms running on Unix, Linux, and Windows can all work together. The system is easy and fast to develop, based on the industry's well-adopted technology. It is highly scalable and heterogeneous.
It is an open system, and any number and type of machines can join to provide computational power. This asynchronous message-based system can achieve response times on the order of seconds. For efficiency, communication between distributed tasks is usually done at the start and end of the tasks, but intermediate task status can also be provided.
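The partition / distribute / assemble cycle described above can be sketched with Python's standard-library queue standing in for the JMS queue. All names are illustrative, and summation stands in for the real per-task processing; this is a pattern sketch, not the described J2EE system.

```python
import queue
import threading

def run_job(data, n_workers=4):
    """Partition a job into tasks, process them in parallel workers,
    and assemble the results, mimicking the job-handler / task-handler
    roles described above."""
    tasks, results = queue.Queue(), queue.Queue()

    # Job handler: partition the job into relatively independent tasks
    for chunk in data:
        tasks.put(chunk)

    def task_handler():
        # Task handler: pick up tasks and put results back on a queue
        while True:
            try:
                chunk = tasks.get_nowait()
            except queue.Empty:
                return
            results.put(sum(chunk))  # stand-in for real processing

    workers = [threading.Thread(target=task_handler) for _ in range(n_workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

    # Job handler: assemble task results into the overall solution
    total = 0
    while not results.empty():
        total += results.get()
    return total
```

Because the queue, not the workers, holds the pending work, machines can join or leave without changing the job logic, which is the property the abstract attributes to the message-queue design.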
Optimal service distribution in WSN service system subject to data security constraints.
Wu, Zhao; Xiong, Naixue; Huang, Yannong; Gu, Qiong
2014-08-04
Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service-oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service-oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from the user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of the WSA, because these SNs can be of different and independent specifications. By optimally partitioning the service into ASs and distributing them among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm for optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. An experimental analysis is presented to demonstrate the feasibility of the suggested algorithm.
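On a toy instance the optimization can be illustrated by exhaustive search. The sketch below (function name, node data, and constraint encoding are all hypothetical) maximizes the product of node reliabilities subject to a data-security constraint; the paper's UGF plus GA approach exists precisely because real assignment spaces are too large for this enumeration.

```python
from itertools import product

def best_distribution(n_services, nodes, secure_required):
    """Exhaustively assign atom-services to sensor nodes.

    nodes: list of (reliability, is_secure) tuples.
    secure_required: set of atom-service indices that must run on
    secure nodes (the data-security constraint).
    Returns the assignment maximizing the product of node reliabilities.
    """
    best, best_rel = None, -1.0
    for assign in product(range(len(nodes)), repeat=n_services):
        # skip assignments that place a protected service on an insecure node
        if any(not nodes[n][1] for s, n in enumerate(assign) if s in secure_required):
            continue
        rel = 1.0
        for n in assign:
            rel *= nodes[n][0]
        if rel > best_rel:
            best, best_rel = assign, rel
    return best, best_rel
```

In the test case, the most reliable node is insecure, so the protected atom-service is forced onto the secure node and only the unconstrained service benefits from the higher reliability, which is the trade-off the paper optimizes.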
Optimal Service Distribution in WSN Service System Subject to Data Security Constraints
Wu, Zhao; Xiong, Naixue; Huang, Yannong; Gu, Qiong
2014-01-01
Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service-oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service-oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from the user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of the WSA, because these SNs can be of different and independent specifications. By optimally partitioning the service into ASs and distributing them among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm for optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. An experimental analysis is presented to demonstrate the feasibility of the suggested algorithm. PMID:25093346
A Local Scalable Distributed Expectation Maximization Algorithm for Large Peer-to-Peer Networks
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Srivastava, Ashok N.
2009-01-01
This paper offers a local distributed algorithm for expectation maximization in large peer-to-peer environments. The algorithm can be used for a variety of well-known data mining tasks in a distributed environment, such as clustering, anomaly detection, and target tracking, to name a few. This technology is crucial for many emerging peer-to-peer applications in bioinformatics, astronomy, social networking, sensor networks, and web mining. Centralizing all or some of the data for building global models is impractical in such peer-to-peer environments because of the large number of data sources, the asynchronous nature of the peer-to-peer networks, and the dynamic nature of the data and network. The distributed algorithm we have developed in this paper is provably correct, i.e., it converges to the same result as a similar centralized algorithm, and it can automatically adapt to changes in the data and the network. We show that the communication overhead of the algorithm is very low due to its local nature. This monitoring algorithm is then used as a feedback loop to sample data from the network and rebuild the model when it is outdated. We present thorough experimental results to verify our theoretical claims.
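The paper's algorithm itself is not reproduced here, but the core peer-to-peer primitive such algorithms rely on, combining local statistics without centralizing the data, can be illustrated with a toy gossip-averaging loop (all names are illustrative). Each peer holds a local statistic, and repeated pairwise averaging along network edges drives every peer toward the global mean.

```python
def gossip_average(values, pairs, rounds=50):
    """Toy gossip averaging over a fixed peer-to-peer topology.

    values: initial local statistic held by each peer.
    pairs: list of (i, j) edges; each round, connected peers
           replace their values with the pairwise mean.
    The sum (and hence the global mean) is preserved every round,
    while the spread between peers contracts toward zero.
    """
    v = list(values)
    for _ in range(rounds):
        for i, j in pairs:
            m = (v[i] + v[j]) / 2.0
            v[i] = v[j] = m
    return v
```

Distributed EM builds on exactly this kind of primitive: the expectation-step sufficient statistics are what get averaged, so each peer ends up with the global model parameters without any peer ever seeing the raw data of another.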
Distributed Learning Metadata Standards
ERIC Educational Resources Information Center
McClelland, Marilyn
2004-01-01
Significant economies can be achieved in distributed learning systems architected with a focus on interoperability and reuse. The key building blocks of an efficient distributed learning architecture are the use of standards and XML technologies. The goal of plug and play capability among various components of a distributed learning system…
NASA Astrophysics Data System (ADS)
Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
The SDN Next Generation Integrated Architecture (SDN-NGeNIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields whose potential discoveries depend on their ability to distribute, process, and analyze globally distributed petascale to exascale datasets. The SDN-NGeNIA system under development by Caltech and partner HEP and network teams is focused on the coordinated use of network, computing, and storage infrastructures. It builds on the experience gained in previous and recently completed projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goals of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.
7. WEYMOUTH FILTRATION PLANT, BUILDING 1 INTERIOR: LA VERNE CONTROL ...
7. WEYMOUTH FILTRATION PLANT, BUILDING 1 INTERIOR: LA VERNE CONTROL ROOM, REGULATES DISTRIBUTION OF WATER, CONTROLS POWER HOUSES. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA
Modeling and Optimization of Commercial Buildings and Stationary Fuel Cell Systems (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ainscough, C.; McLarty, D.; Sullivan, R.
2013-10-01
This presentation describes the Distributed Generation Building Energy Assessment Tool (DG-BEAT) developed by the National Renewable Energy Laboratory and the University of California Irvine. DG-BEAT is designed to allow stakeholders to assess the economics of installing stationary fuel cell systems in a variety of building types in the United States.
NASA Astrophysics Data System (ADS)
Baish, A. S.; Vivoni, E. R.; Payan, J. G.; Robles-Morua, A.; Basile, G. M.
2011-12-01
A distributed hydrologic model can help bring consensus among diverse stakeholders in regional flood planning by producing quantifiable sets of alternative futures. This value is acute in areas with high uncertainties in hydrologic conditions and sparse observations. In this study, we conduct an application of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS) in the Santa Catarina basin of Nuevo Leon, Mexico, where Hurricane Alex in July 2010 led to catastrophic flooding of the capital city of Monterrey. Distributed model simulations utilize best-available information on the regional topography, land cover, and soils obtained from Mexican government agencies or analysis of remotely-sensed imagery from MODIS and ASTER. Furthermore, we developed meteorological forcing for the flood event based on multiple data sources, including three local gauge networks, satellite-based estimates from TRMM and PERSIANN, and the North American Land Data Assimilation System (NLDAS). Remotely-sensed data allowed us to quantify rainfall distributions in the upland, rural portions of the Santa Catarina that are sparsely populated and ungauged. Rural areas had significant contributions to the flood event and as a result were considered by stakeholders for flood control measures, including new reservoirs and upland vegetation management. Participatory modeling workshops with the stakeholders revealed a disconnect between urban and rural populations in regard to understanding the hydrologic conditions of the flood event and the effectiveness of existing and potential flood control measures. Despite these challenges, the use of the distributed flood forecasts developed within this participatory framework facilitated building consensus among diverse stakeholders and exploring alternative futures in the basin.
A capacity-building conceptual framework for public health nutrition practice.
Baillie, Elizabeth; Bjarnholt, Christel; Gruber, Marlies; Hughes, Roger
2009-08-01
To describe a conceptual framework to assist in the application of capacity-building principles to public health nutrition practice. A review of the literature and consideration of the determinants of effective public health nutrition practice has been used to inform the development of a conceptual framework for capacity building in the context of public health nutrition practice. The limited literature supports a greater integration and application of capacity-building strategies and principles in public health nutrition practice, and that this application should be overt and strategic. A framework is proposed that identifies a number of determinants of capacity for effective public health nutrition action. The framework represents the key foundations for building capacity including leadership, resourcing and intelligence. Five key strategic domains supported by these foundation elements, including partnerships, organisational development, project management quality, workforce development and community development, are proposed. This framework can be used to assist the systematic assessment, development and evaluation of capacity-building activity within public health nutrition practice. Capacity building is a strategy within public health nutrition practice that needs to be central to public health nutrition intervention management. The present paper defines, contextualises and outlines a framework for integrating and making explicit the importance of capacity building within public health nutrition practice at many levels.
Model-Driven Energy Intelligence
2015-03-01
This report describes the use of a building information model (BIM) for operations and estimates the potential impact on energy performance at Fort Jackson. Subject terms: Building Information Modeling (BIM), energy, ECMs, monitoring. Abbreviations include AHU (air handling unit), API (application programming interface), BIM (building information model), and BLCC (building life cycle cost).
25 CFR 256.10 - When do I qualify for Category C assistance?
Code of Federal Regulations, 2011 CFR
2011-04-01
... living The dwelling cannot be brought up to applicable building code standards and to standard housing... applicable building code standards and to standard housing condition for $35,000 or less. You do not own a...
25 CFR 256.10 - When do I qualify for Category C assistance?
Code of Federal Regulations, 2014 CFR
2014-04-01
... The dwelling cannot be brought up to applicable building code standards and to standard housing... applicable building code standards and to standard housing condition for $35,000 or less. You do not own a...
25 CFR 256.10 - When do I qualify for Category C assistance?
Code of Federal Regulations, 2013 CFR
2013-04-01
... living The dwelling cannot be brought up to applicable building code standards and to standard housing... applicable building code standards and to standard housing condition for $35,000 or less. You do not own a...
25 CFR 256.10 - When do I qualify for Category C assistance?
Code of Federal Regulations, 2012 CFR
2012-04-01
... living The dwelling cannot be brought up to applicable building code standards and to standard housing... applicable building code standards and to standard housing condition for $35,000 or less. You do not own a...
25 CFR 256.10 - When do I qualify for Category C assistance?
Code of Federal Regulations, 2010 CFR
2010-04-01
... living The dwelling cannot be brought up to applicable building code standards and to standard housing... applicable building code standards and to standard housing condition for $35,000 or less. You do not own a...
Modelling of Rail Vehicles and Track for Calculation of Ground-Vibration Transmission Into Buildings
NASA Astrophysics Data System (ADS)
Hunt, H. E. M.
1996-05-01
A methodology for the calculation of vibration transmission from railways into buildings is presented. The method permits existing models of railway vehicles and track to be incorporated, and it applies to any model of vibration transmission through the ground. Special attention is paid to the relative phasing between adjacent axle-force inputs to the rail, so that vibration transmission may be calculated as a random process. The vehicle-track model is used in conjunction with a building model of infinite length. The track and building are infinite and parallel to each other, and the applied forces are statistically stationary in space, so that vibration levels at any two points along the building are the same. The methodology is two-dimensional for the purpose of applying random process theory, but fully three-dimensional for calculating vibration transmission from the track and through the ground into the foundations of the building. The computational efficiency of the method will interest engineers faced with the task of reducing vibration levels in buildings. It is possible to assess the relative merits of using rail pads, under-sleeper pads, ballast mats, floating-slab track, or base isolation for particular applications.
NASA Astrophysics Data System (ADS)
Ullah, Irshad; Baharom, MNR; Ahmed, H.; Luqman, HM.; Zainal, Zainab
2017-11-01
Protection against lightning is always a challenging task for researchers. The consequences of lightning on different building shapes require comprehensive knowledge in order to convey the information to the general public. This paper is mainly concerned with the lightning pattern when lightning strikes buildings of different shapes. The work is based on practical experiments in a high-voltage laboratory. Different shapes of scaled structures were selected in order to investigate the equal distribution of lightning voltage, which indicates the maximum probability of a lightning strike on the air terminal of the selected shapes. Building shape plays a very important role in lightning protection. Rooftops have different geometries, and the installation of Franklin rods also varies with the shape of the rooftop. In accordance with the ambient weather conditions of Malaysia, a high-voltage impulse was applied to lightning rods installed on different geometrical shapes. Equal distribution of the high-voltage impulse was obtained when the geometry of the scaled structures was identical and the air gap for all tested objects was kept the same. This equal distribution of lightning voltage also shows that lightning is most likely to strike the corners and edges of a building structure.
Monitor Network Traffic with Packet Capture (pcap) on an Android Device
2015-09-01
Under current Android development requirements, an Android Graphical User Interface (GUI) application cannot directly run with administrative privileges. This report describes how to build an Android application to monitor network traffic using open-source packet capture (pcap) libraries. Subject terms: ELIDe, Android, pcap. Report sections include Building the Application with Native Codes, Calling Native Codes Using JNI, Calling Native Codes from an Android Application, and Retrieve Live
An indoor air quality evaluation in a residential retrofit project using spray polyurethane foam.
Tian, Shen; Ecoff, Scott; Sebroski, John; Miller, Jason; Rickenbacker, Harold; Bilec, Melissa
2018-05-01
Understanding of indoor air quality (IAQ) during and after spray polyurethane foam (SPF) application is essential to protect the health of both workers and building occupants. Previous efforts such as field monitoring, micro-chamber/spray-booth emission studies, and fate/transport modeling have been conducted to understand chemical exposure from SPF and to guide risk mitigation strategies. However, each type of research has its limitations and can only reveal partial information on the relationship between SPF and IAQ. A comprehensive study is needed to integrate the experimental design and analytical testing methods of the field/chamber studies with the mathematical tools employed in the modeling studies. This study aims to bridge this gap and provide a more comprehensive understanding of the impact of SPF on IAQ. The field sampling plan of this research evaluates the airborne concentrations of methylene diphenyl diisocyanate (MDI), formaldehyde, acetaldehyde, propionaldehyde, tris(1-chloro-2-propyl) phosphate (TCPP), trans-1-chloro-3,3,3-trifluoropropene (Solstice™), and airborne particles. Modifications to existing MDI sampling and analytical methods were made to improve the level of quantification. In addition, key fate and transport modeling input parameters, such as air changes per hour and the airborne particle size distribution, were measured. More importantly, TCPP accumulation onto materials was evaluated, which is important for studying the fate and transport of semi-volatile organic compounds. The IAQ results showed that after spray application was completed in the entire building, airborne concentrations decreased for all chemicals monitored. However, we recommend that during SPF application, no one return to the application site without proper personal protective equipment as long as there are active spray activities in the building.
The comparison between this field study and a recent chamber study showed that surface sorption and particle deposition are important factors in determining the fate of airborne TCPP. The study also suggests the need for further evaluation using mathematical models, making the data generated in this work informative to industry and the broader scientific community.
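For reference, the air changes per hour (ACH) parameter the study measured can be estimated from tracer-gas concentration decay. The sketch below uses the standard first-order decay relation C(t) = C0 · exp(−ACH · t); the function name and numeric values are illustrative, not the study's measurements.

```python
import math

def ach_from_decay(c0, ct, hours):
    """Air changes per hour from tracer-gas decay.

    c0: tracer concentration at the start of the decay period
    ct: concentration after `hours` hours
    Rearranging C(t) = C0 * exp(-ACH * t) gives ACH = ln(C0/Ct) / t.
    """
    return math.log(c0 / ct) / hours

# Hypothetical example: 1000 ppm decaying to ~368 ppm in one hour
# corresponds to roughly one air change per hour.
rate = ach_from_decay(1000.0, 368.0, 1.0)
```

ACH is a key input to the fate/transport models the study feeds, since it sets how quickly airborne MDI, TCPP, and aldehydes are diluted by ventilation.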
Calculating Least Risk Paths in 3d Indoor Space
NASA Astrophysics Data System (ADS)
Vanclooster, A.; De Maeyer, Ph.; Fack, V.; Van de Weghe, N.
2013-08-01
Over the last couple of years, research on indoor environments has gained fresh impetus; more specifically, applications that support navigation and wayfinding have become a booming industry. Indoor navigation research currently covers the technological aspects of indoor positioning and the modelling of indoor space. Algorithmic development to support navigation has so far been left mostly untouched, as most applications mainly rely on adapting Dijkstra's shortest path algorithm to an indoor network. However, alternative algorithms for outdoor navigation have been proposed that add a more cognitive notion to the calculated paths and as such adhere to natural wayfinding behaviour (e.g. simplest paths, least risk paths). These algorithms are currently restricted to outdoor applications. The need for indoor cognitive algorithms is highlighted by the greater navigation and orientation challenges posed by the specific indoor structure (e.g. fragmentation, less visibility, confined areas…). As such, the clarity and ease of route instructions are of paramount importance when distributing indoor routes. A shortest or fastest path indoors does not necessarily align with the cognitive mapping of the building. Therefore, the aim of this research is to extend those richer cognitive algorithms to three-dimensional indoor environments. More specifically, in this paper we focus on the application of the least risk path algorithm of Grum (2005) to an indoor space. The algorithm as proposed by Grum (2005) is replicated and tested in a complex multi-storey building. The results of several least risk path calculations are compared to the shortest paths in indoor environments in terms of total length, improvement in route description complexity, and number of turns. Several scenarios are tested in this comparison: paths covering a single floor, and paths crossing several building wings and/or floors.
Adjustments to the algorithm are proposed to be more aligned to the specific structure of indoor environments (e.g. no turn restrictions, restricted usage of rooms, vertical movement) and common wayfinding strategies indoors. In a later stage, other cognitive algorithms will be implemented and tested in both an indoor and combined indoor-outdoor setting, in an effort to improve the overall user experience during navigation in indoor environments.
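The Dijkstra baseline that these indoor applications adapt can be sketched in a few lines; the room graph and edge costs below are hypothetical, purely to illustrate the algorithm that the cognitive alternatives are compared against:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a weighted graph given as {node: {neighbor: cost}}."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path by walking predecessors back from the goal.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Hypothetical two-floor indoor network: rooms A..D plus a stairwell S.
indoor = {
    "A": {"B": 5, "S": 2},
    "B": {"A": 5, "C": 4},
    "S": {"A": 2, "D": 6},
    "C": {"B": 4, "D": 1},
    "D": {"C": 1, "S": 6},
}
print(dijkstra(indoor, "A", "D"))  # → (['A', 'S', 'D'], 8.0)
```

A least risk variant would replace the pure length weight with a cost that also penalizes risky decision points, which is exactly where the abstract's comparison begins.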
SOLAR EFFECTS ON BUILDING DESIGN.
ERIC Educational Resources Information Center
Building Research Inst., Inc., Washington, DC.
A report of a program held as part of the Building Research Institute 1962 Spring Conference on the solar effects on building design. Topics discussed are: (1) solar energy data applicable to building design, (2) thermal effects of solar radiation on man, (3) solar effects on architecture, (4) solar effects on building costs, (5) selection of…
SUNREL Applications | Buildings | NREL
Case studies used by building design teams for energy analysis include: Zion National Park Visitor's Center; Grand Canyon National Park Bookstore; NREL Thermal Test Facility; NREL Wind Site Entrance Building; DPD Office Building; Performance Analysis of a High-Mass Residential Building (Van Geet residence); The Van Geet Off-Grid Home: An…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
The module outlines the regulatory history and purpose of containment buildings. It discusses the relationship between LDR and containment buildings, summarizes the design and operating standards applicable to containment buildings, and describes the relationship between generator accumulation standards and containment buildings.
NASA Technical Reports Server (NTRS)
Benson, H. E.; Monford, L. G., Jr.
1976-01-01
The results of a study of the application of a modular integrated utility system to six typical building types are compared with the application of a conventional utility system to the same facilities. The effects of varying the size and climatic location of the buildings and the size of the powerplants are presented. Construction details of the six building types (garden apartments, a high rise office building, high rise apartments, a shopping center, a high school, and a hospital) and typical site and floor plans are provided. The environmental effects, the unit size determination, and the market potential are discussed. The cost effectiveness of the various design options is not considered.
Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bawej, Tomasz; et al.
2014-01-01
TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running on a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability unless the software is adequately aware of the IRQ (Interrupt Request), CPU, and memory affinities. During the first long shutdown of the LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards, and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software wraps the low-level socket library to ease higher-level programming with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved, and the performance measurements of the software in the context of the CMS distributed event building.
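The asynchronous event-driven socket model the abstract describes can be illustrated with Python's standard `selectors` module. The `EventLoop` class and the `socketpair` stand-in below are hypothetical simplifications for illustration, not the actual CMS online framework API:

```python
import selectors
import socket

class EventLoop:
    """Minimal event-driven socket dispatcher (illustrative, not the CMS API)."""
    def __init__(self):
        self.sel = selectors.DefaultSelector()

    def watch(self, sock, callback):
        # Non-blocking sockets plus readiness callbacks: the event-driven model.
        sock.setblocking(False)
        self.sel.register(sock, selectors.EVENT_READ, callback)

    def run_once(self, timeout=1.0):
        for key, _ in self.sel.select(timeout):
            key.data(key.fileobj)  # invoke the registered callback

received = []
a, b = socket.socketpair()          # in-process stand-in for a network link
loop = EventLoop()
loop.watch(b, lambda s: received.append(s.recv(4096)))
a.sendall(b"event fragment")
loop.run_once()
print(received)  # → [b'event fragment']
a.close()
b.close()
```

A production version would additionally pin threads and buffers to the NUMA node that owns the NIC's interrupts, which is the affinity awareness the paper emphasizes.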
NASA Astrophysics Data System (ADS)
Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro
2016-08-01
An increase in the efficiency of sampling from Boltzmann distributions would have a significant impact on deep learning and other machine-learning applications. Recently, quantum annealers have been proposed as a potential candidate to speed up this task, but several limitations still bar these state-of-the-art technologies from being used effectively. One of the main limitations is that, while the device may indeed sample from a Boltzmann-like distribution, quantum dynamical arguments suggest it will do so with an instance-dependent effective temperature, different from its physical temperature. Unless this unknown temperature can be unveiled, it might not be possible to effectively use a quantum annealer for Boltzmann sampling. In this work, we propose a strategy to overcome this challenge with a simple effective-temperature estimation algorithm. We provide a systematic study assessing the impact of the effective temperatures in the learning of a special class of restricted Boltzmann machine embedded on quantum hardware, which can serve as a building block for deep-learning architectures. We also provide a comparison to k-step contrastive divergence (CD-k) with k up to 100. Although assuming a suitable fixed effective temperature also allows us to outperform one-step contrastive divergence (CD-1), only when using an instance-dependent effective temperature do we find a performance close to that of CD-100 for the case studied here.
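The idea of recovering an unknown effective temperature can be illustrated with a toy estimator: given samples from a Boltzmann distribution over known energy levels, log p_i = -βE_i - log Z, so β follows from the slope of log empirical frequency versus energy. This is a simplified sketch under assumed energy levels, not the paper's algorithm:

```python
import math
import random

random.seed(0)

def boltzmann_sample(energies, beta, n):
    """Draw n indices from a Boltzmann distribution over the given energies."""
    weights = [math.exp(-beta * e) for e in energies]
    total = sum(weights)
    return random.choices(range(len(energies)), [w / total for w in weights], k=n)

def estimate_beta(energies, samples):
    """Least-squares fit of log empirical frequency vs energy; slope = -beta."""
    counts = [samples.count(i) for i in range(len(energies))]
    pts = [(e, math.log(c / len(samples))) for e, c in zip(energies, counts) if c > 0]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    slope = sum((x - mx) * (y - my) for x, y in pts) / sum((x - mx) ** 2 for x, _ in pts)
    return -slope

energies = [0.0, 0.5, 1.0, 1.5, 2.0]   # hypothetical energy levels
samples = boltzmann_sample(energies, beta=1.0, n=200000)
print(estimate_beta(energies, samples))  # close to the true beta = 1.0
```

The paper's setting is harder because the energies come from a quantum device rather than being known exactly, but the estimation principle is the same.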
A generalized strategy for building resident database interfaces
NASA Technical Reports Server (NTRS)
Moroh, Marsha; Wanderman, Ken
1990-01-01
A strategy for building resident interfaces to host heterogeneous distributed data base management systems is developed. The strategy is used to construct several interfaces. A set of guidelines is developed for users to construct their own interfaces.
Ventilation System Effectiveness and Tested Indoor Air Quality Impacts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudd, Armin; Bergey, Daniel
In this project, Building America research team Building Science Corporation tested the effectiveness of ventilation systems at two unoccupied, single-family, detached lab homes at the University of Texas - Tyler. Five ventilation system tests were conducted with various whole-building ventilation systems. Multizone fan pressurization testing characterized building and zone enclosure leakage. PFT testing showed multizone air change rates and interzonal airflow. Cumulative particle counts for six particle sizes, and formaldehyde and other Top 20 VOC concentrations, were measured in multiple zones. The testing showed that single-point exhaust ventilation was inferior as a whole-house ventilation strategy. This was because the source of outside air was not direct from outside, the ventilation air was not distributed, and no provision existed for air filtration. Indoor air recirculation by a central air distribution system can help improve the exhaust ventilation system by way of air mixing and filtration. In contrast, the supply and balanced ventilation systems showed that there is a significant benefit to drawing outside air from a known outside location, and filtering and distributing that air. Compared to the exhaust systems, the CFIS and ERV systems showed better ventilation air distribution and lower concentrations of particulates, formaldehyde, and other VOCs. System improvement percentages were estimated based on four system factor categories: balance, distribution, outside air source, and recirculation filtration. Recommended system factors could be applied to reduce ventilation fan airflow rates relative to ASHRAE Standard 62.2 to save energy and reduce moisture control risk in humid climates. HVAC energy savings were predicted to be 8-10%, or $50-$75/year.
Lighting system combining daylight concentrators and an artificial source
Bornstein, Jonathan G.; Friedman, Peter S.
1985-01-01
A combined lighting system for a building interior includes a stack of luminescent solar concentrators (LSC), an optical conduit made preferably of optical fibers for transmitting daylight from the LSC stack, a collimating lens set at an angle, a fixture for receiving the daylight at one end and for distributing the daylight as illumination inside the building, an artificial light source at the other end of the fixture for directing artificial light into the fixture for distribution as illumination inside the building, an automatic dimmer/brightener for the artificial light source, and a daylight sensor positioned near the LSC stack for controlling the automatic dimmer/brightener in response to the daylight sensed. The system also has a reflector positioned behind the artificial light source and a fan for exhausting heated air out of the fixture during summer and for forcing heated air into the fixture for passage into the building interior during winter.
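The dimmer/brightener logic can be sketched as a complementary control: artificial output tops up sensed daylight toward a target illuminance. The target value and the linear response below are assumptions for illustration, not taken from the patent text:

```python
def artificial_level(daylight_lux, target_lux=500.0):
    """Complementary dimming: artificial light tops up measured daylight
    to a target illuminance, and switches off once daylight suffices."""
    return max(0.0, target_lux - daylight_lux)

# Sweep the daylight sensor reading from dark to bright.
for lux in (0, 200, 500, 800):
    print(lux, artificial_level(lux))
# → 0 500.0 / 200 300.0 / 500 0.0 / 800 0.0
```

Real fixtures would add hysteresis and ramping to avoid flicker near the threshold; the sketch keeps only the proportional idea.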
75 FR 68806 - Statement of Organization, Functions and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-09
... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasari, Venkat; Sadlier, Ronald J; Geerhart, Mr. Billy
Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.
NASA Astrophysics Data System (ADS)
Baker, N. R.; Donakowski, T. D.; Foster, R. B.; Sala, D. L.; Tison, R. R.; Whaley, T. P.; Yudow, B. D.; Swenson, P. F.
1980-01-01
The heat-actuated heat pump centered integrated community energy system (HAHP-ICES) is described. The system, which utilizes a gas-fired, engine-driven heat pump serving residential and commercial buildings, offers several advantages over the more conventional equipment it is intended to supplant. The general, non-site-specific application assumes a hypothetical community of one 59,000 cu ft office building and five 24-unit, low-rise apartment buildings located in a region with a climate similar to Chicago's. Various sensitivity analyses are performed through which the performance characteristics of the HAHP are explored. The results provided the selection criteria for the site-specific application of the HAHP-ICES concept to a real-world community. The site-specific community consists of: 42 town houses; five 120-unit, low-rise apartment buildings; five 104-unit high-rise apartment buildings; one 124,000 cu ft office building; and a single 135,000 cu ft retail building.
Behavior of sandwich panels in a fire
NASA Astrophysics Data System (ADS)
Chelekova, Eugenia
2018-03-01
Over the last decades a vast number of buildings and structures have been erected using sandwich panels. The field of application for this construction material is manifold, especially in the construction of fire- and explosion-hazardous buildings. In advanced evacuation time calculation methods the coefficient of heat losses is defined with due regard to fire load features, but without accounting for the thermal and physical characteristics of building envelopes; to be exact, it is defined for brick and concrete walls with gross heat capacity. That is why applying the heat loss coefficient expression obtained for such buildings to buildings made of sandwich panels is impossible, because the heat capacity of these panels differs from that of brick and concrete building envelopes. The article presents an analysis and calculation of the heat loss coefficient for buildings and structures with three-layer sandwich panels as building envelopes.
Robust inference in the negative binomial regression model with an application to falls data.
Aeberhard, William H; Cantoni, Eva; Heritier, Stephane
2014-12-01
A popular way to model overdispersed count data, such as the number of falls reported during intervention studies, is by means of the negative binomial (NB) distribution. Classical estimating methods are well known to be sensitive to model misspecifications, which take the form of patients falling much more than expected in intervention studies where the NB regression model is used. We extend in this article two approaches for building robust M-estimators of the regression parameters in the class of generalized linear models to the NB distribution. The first approach achieves robustness in the response by applying a bounded function to the Pearson residuals arising in the maximum likelihood estimating equations, while the second achieves robustness by bounding the unscaled deviance components. For both approaches, we explore different choices for the bounding functions. Through a unified notation, we show how close these approaches actually are when the bounding functions are chosen and tuned appropriately, and we provide the asymptotic distributions of the resulting estimators. Moreover, we introduce a robust weighted maximum likelihood estimator for the overdispersion parameter, specific to the NB distribution. Simulations under various settings show that, for both approaches, redescending bounding functions yield estimates with smaller biases under contamination while keeping high efficiency at the assumed model. We present an application to a recent randomized controlled trial measuring the effectiveness of an exercise program at reducing the number of falls among people suffering from Parkinson's disease, illustrating the diagnostic use of such robust procedures and the need for reliable inference. © 2014, The International Biometric Society.
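The first approach, bounding the Pearson residuals that enter the estimating equations, can be sketched as follows. The Huber-type bound, the tuning constant, and the toy numbers are illustrative choices, not the authors' exact specification:

```python
import math

def huber_psi(r, c=1.345):
    """Bounded (Huber) psi function: passes small residuals through,
    caps large ones at +/- c so outliers have limited influence."""
    return max(-c, min(c, r))

def nb_pearson_residual(y, mu, alpha):
    """Pearson residual for NB with variance Var(Y) = mu + alpha * mu**2."""
    return (y - mu) / math.sqrt(mu + alpha * mu ** 2)

# A patient falling far more often than predicted: the bounded residual
# caps that observation's influence, unlike the unbounded ML score.
mu, alpha = 2.0, 0.5
for y in (2, 4, 30):
    r = nb_pearson_residual(y, mu, alpha)
    print(y, r, huber_psi(r))
```

In a full robust fit, `huber_psi(r)` would replace the raw residual inside weighted estimating equations solved iteratively for the regression coefficients.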
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeLaski, A.; Gauthier, J.; Shugars, J.
Distribution transformers offer a largely untapped opportunity for efficiency improvements in buildings. Application of energy-efficient equipment can reduce transformer losses by about 20%, substantially cutting a facility's total electricity bill and offering typical paybacks of less than three years. Since nearly all of the electricity powering the commercial and industrial sectors is stepped down in voltage by facility-owned distribution transformers, broad application of energy-efficient equipment will lead to huge economy-wide energy and dollar savings as well as associated environmental benefits. This opportunity has led to a multi-party coordinated effort that offers a new model for national partnerships to pursue market transformation. The model, called the Informal Collaborative Model for the purposes of this paper, is characterized by voluntary commitments of multiple stakeholders to carry out key market interventions in a coordinated fashion, but without pooling resources or control. Collaborative participants are joined by a common interest in establishing and expanding the market for a new product, service, or practice that will yield substantial energy savings. This paper summarizes the technical efficiency opportunity available in distribution transformers; discusses the market barriers to widespread adoption of energy-efficient transformers; and details an overall market transformation strategy to address the identified market barriers. The respective roles of each of the diverse players--manufacturers, government agencies, and utility and regional energy efficiency programs--are given particular attention. Each of the organizations involved brings a particular set of tools and capabilities for addressing the market barriers to more efficient transformers.
Hybrid Modeling Based on Scsg-Br and Orthophoto
NASA Astrophysics Data System (ADS)
Zhou, G.; Huang, Y.; Yue, T.; Li, X.; Huang, W.; He, C.; Wu, Z.
2018-05-01
With the development of the digital city, digital applications are increasingly widespread while urban buildings grow more complex; establishing an effective data model is therefore key to representing urban building models accurately. In addition, combining 3D building models with remote sensing data has become a trend in building digital cities, but it produces large amounts of data and consequent redundancy. In order to overcome the limitations of single modelling with constructive solid geometry (CSG), this paper presents a mixed modelling method based on SCSG-BR for representing urban buildings. On one hand, the improved CSG method, called the "Spatial CSG (SCSG)" representation method, is used to represent the exterior shape of urban buildings. On the other hand, the boundary representation (BR) method represents the topological relationships between the geometric elements of an urban building, in which textures are treated as attribute data of the walls and roof. Moreover, a method combining a file database and a relational database is used to manage the three-dimensional building model data, which simplifies texture mapping. During data processing, a constrained least-squares algorithm is used to orthogonalize the building polygons and adjust their topology to ensure the accuracy of the modelling data. Finally, the urban building model is matched with the corresponding orthophoto. Data from Denver, Colorado, USA are used to establish a realistic urban building model. The results show that the SCSG-BR method can represent the topological relations of buildings more precisely; the organization and management of urban building model data reduce redundancy and improve modelling speed; and the combination of orthophoto and urban building model further strengthens applications such as view analysis and spatial query, extending the scope of digital city applications.
2011-01-01
ERDC TR-06-10, Supplement 2. Building Information Modeling (BIM) Roadmap, Supplement 2 – BIM Implementation Plan for Military… …release; distribution is unlimited. ERDC TR-06-10, Supplement 2 (January 2011). Abstract: Building Information Modeling (BIM) technology provides the communities of practice in…
Indoor environment program. 1994 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daisey, J.M.
1995-04-01
Buildings use approximately one-third of the energy consumed in the United States. The potential energy savings derived from reduced infiltration and ventilation in buildings are substantial, since energy use associated with conditioning and distributing ventilation air is about 5.5 EJ per year. However, since ventilation is the dominant mechanism for removing pollutants from indoor sources, reduction of ventilation can have adverse effects on indoor air quality, and on the health, comfort, and productivity of building occupants. The Indoor Environment Program in LBL's Energy and Environment Division was established in 1977 to conduct integrated research on ventilation, indoor air quality, and energy use and efficiency in buildings for the purpose of reducing energy liabilities associated with airflows into, within, and out of buildings while maintaining or improving occupant health and comfort. The Program is part of LBL's Center for Building Science. Research is conducted on building energy use and efficiency, ventilation and infiltration, and thermal distribution systems; on the nature, sources, transport, transformation, and deposition of indoor air pollutants; and on exposure and health risks associated with indoor air pollutants. Pollutants of particular interest include radon; volatile, semivolatile, and particulate organic compounds; and combustion emissions, including environmental tobacco smoke, CO, and NOx.
Low-Flow Liquid Desiccant Air-Conditioning: Demonstrated Performance and Cost Implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozubal, E.; Herrmann, L.; Deru, M.
2014-09-01
Cooling loads must be dramatically reduced when designing net-zero energy buildings or other highly efficient facilities. Advances in this area have focused primarily on reducing a building's sensible cooling loads by improving the envelope, integrating properly sized daylighting systems, adding exterior solar shading devices, and reducing internal heat gains. As sensible loads decrease, however, latent loads remain relatively constant, and thus become a greater fraction of the overall cooling requirement in highly efficient building designs, particularly in humid climates. This shift toward latent cooling is a challenge for heating, ventilation, and air-conditioning (HVAC) systems. Traditional systems typically dehumidify by first overcooling air below the dew-point temperature and then reheating it to an appropriate supply temperature, which requires an excessive amount of energy. Another dehumidification strategy incorporates solid desiccant rotors that remove water from air more efficiently; however, these systems are large and increase fan energy consumption due to the increased airside pressure drop of solid desiccant rotors. A third dehumidification strategy involves high-flow liquid desiccant systems. These systems require a high-maintenance separator to protect the air distribution system from corrosive desiccant droplet carryover, and so are more commonly used in industrial applications and rarely in commercial buildings. Both solid desiccant systems and most high-flow liquid desiccant systems (if not internally cooled) add sensible energy to the air stream during dehumidification, through the release of sensible heat during the sorption process, which must later be removed.
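The shift toward latent cooling can be made concrete with the sensible heat ratio, SHR = sensible / (sensible + latent); the load values below are hypothetical, chosen only to show how envelope improvements drop the SHR when latent load stays fixed:

```python
def shr(sensible_kw, latent_kw):
    """Sensible heat ratio: fraction of total cooling load that is sensible."""
    return sensible_kw / (sensible_kw + latent_kw)

# Hypothetical: envelope upgrades halve the sensible load; latent is unchanged.
print(round(shr(10.0, 4.0), 2))  # → 0.71  (baseline building)
print(round(shr(5.0, 4.0), 2))   # → 0.56  (efficient envelope)
```

The lower SHR is what pushes conventional overcool-and-reheat equipment into inefficient operating territory and motivates desiccant-based dehumidification.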
NASA Technical Reports Server (NTRS)
Hinke, Thomas H.
2004-01-01
Grid technology consists of middleware that permits distributed computations, data, and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose, and if necessary substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities.
In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid services discovered using semantic grid technology. As required, high-end computational resources could be drawn from available grid resource pools. Using grid technology, this confluence of data, services, and computational resources could easily be harnessed to transform data from many different sources into a desired product that is delivered to a user's workstation or to a web portal through which it can be accessed by its intended audience.
NASA Astrophysics Data System (ADS)
Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.
2016-12-01
We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for continuous integration (automated image builds) and for image storing and sharing. In this model, all software required for running scientific applications (workflow systems and execution engines) is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters, respectively.
Although both phases could be executed within the same environment, this setup demonstrates the flexibility of DIaaS to run applications across e-Infrastructures. In summary, DIaaS delivers specialized software to execute data-intensive applications in a scalable, efficient, and robust manner reducing the engineering time and computational cost.
16. Perimeter acquisition radar building room #102, electrical equipment room; ...
16. Perimeter acquisition radar building room #102, electrical equipment room; the prime power distribution system. Excellent example of pendulum-type shock isolation. The grey cabinet and barrel assembly is part of the polychlorinated biphenyl (PCB) retrofill project - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND
In campus location finder using mobile application services
NASA Astrophysics Data System (ADS)
Fai, Low Weng; Audah, Lukman
2017-09-01
Navigation services have become very common in this era; applications include Google Maps, Waze, and others. Although navigation applications provide routing services in open areas, not all buildings are recorded in their databases. In this project, an application was made for indoor and outdoor navigation in Universiti Tun Hussein Onn Malaysia (UTHM). It helps outsiders and new incoming students by navigating them from their current location to their destination using a mobile application named "U Finder". The Thunkable website was used to build the application for outdoor and indoor navigation. Outdoor navigation is linked to Google Maps, and indoor navigation uses QR codes for positioning and routing pictures for guidance. Outdoor navigation can route users to the main faculties in UTHM, while indoor navigation is implemented only for the G1 building in UTHM.
On the influence of latency estimation on dynamic group communication using overlays
NASA Astrophysics Data System (ADS)
Vik, Knut-Helge; Griwodz, Carsten; Halvorsen, Pål
2009-01-01
Distributed interactive applications tend to have stringent latency requirements and some may have high bandwidth demands. Many of them also have very dynamic user groups for which all-to-all communication is needed. In online multiplayer games, for example, such groups are determined through region-of-interest management in the application. We have investigated a variety of group management approaches for overlay networks in earlier work and shown that several useful tree heuristics exist. However, these heuristics require full knowledge of all overlay link latencies. Since this is not scalable, we investigate the effects that latency estimation techniques have on the quality of overlay tree constructions. We do this by evaluating one example of our group management approaches in PlanetLab and examining how latency estimation techniques influence their quality. Specifically, we investigate how two well-known latency estimation techniques, Vivaldi and Netvigator, affect the quality of tree building.
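Vivaldi, one of the two latency estimation techniques examined, assigns each node synthetic coordinates and relaxes them so that coordinate distances approximate measured RTTs. A simplified 2-D sketch follows; the step size, round count, and RTT values are illustrative, and real Vivaldi adds adaptive timesteps and error weighting:

```python
import math
import random

random.seed(1)

def vivaldi(nodes, rtt, rounds=2000, delta=0.05):
    """Simplified Vivaldi: relax 2-D coordinates so pairwise distances
    approach the measured RTTs (spring relaxation, one node per update)."""
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(rounds):
        a, b = random.sample(nodes, 2)
        dx = [pos[a][i] - pos[b][i] for i in (0, 1)]
        dist = math.hypot(*dx) or 1e-9
        err = dist - rtt[frozenset((a, b))]
        # Move node a along the unit vector to/from b by delta * error.
        for i in (0, 1):
            pos[a][i] -= delta * err * dx[i] / dist
    return pos

nodes = ["u", "v", "w"]
rtt = {frozenset(p): d for p, d in [(("u", "v"), 3.0), (("v", "w"), 4.0), (("u", "w"), 5.0)]}
pos = vivaldi(nodes, rtt)
est = math.hypot(pos["u"][0] - pos["v"][0], pos["u"][1] - pos["v"][1])
print(est)  # should approach the measured u-v RTT of 3.0
```

Tree heuristics can then query estimated latencies between any pair of nodes from the coordinates alone, avoiding all-to-all measurement.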
Intra-building telecommunications cabling standards for Sandia National Laboratories, New Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, R.L.
1993-08-01
This document establishes a working standard for all telecommunications cable installations at Sandia National Laboratories, New Mexico. It is based on recent national commercial cabling standards. The topics addressed are Secure and Open/Restricted Access telecommunications environments and both twisted-pair and optical-fiber components of communications media. Some of the state-of-the-art technologies that will be supported by the intrabuilding cable infrastructure are Circuit and Packet Switched Networks (PBX/5ESS Voice and Low-Speed Data), Local Area Networks (Ethernet, Token Ring, Fiber and Copper Distributed Data Interface), and Wide Area Networks (Asynchronous Transfer Mode). These technologies can be delivered to every desk and can transport data at rates sufficient to support all existing applications (such as Voice, Text and Graphics, Still Images, Full-motion Video), as well as applications to be defined in the future.
Optimizing CMS build infrastructure via Apache Mesos
NASA Astrophysics Data System (ADS)
Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad
2015-12-01
The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.
10 CFR 455.131 - State ranking of grant applications.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) For technical assistance programs, buildings shall be ranked in descending priority based upon the... all buildings covered by eligible applications for: (1) Technical assistance programs for units of local government and public care institutions and (2) Technical assistance programs for schools and...
Engineering artificial machines from designable DNA materials for biomedical applications.
Qi, Hao; Huang, Guoyou; Han, Yulong; Zhang, Xiaohui; Li, Yuhui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng; Wang, Lin
2015-06-01
Deoxyribonucleic acid (DNA) has emerged as a building brick for the fabrication of nanostructures with completely artificial architecture and geometry. The remarkable ability of DNA to build two- and three-dimensional structures raises the possibility of developing smart nanomachines with versatile controllability for various applications. Here, we review recent progress in engineering DNA machines for specific bioengineering and biomedical applications.
Engineering Artificial Machines from Designable DNA Materials for Biomedical Applications
Huang, Guoyou; Han, Yulong; Zhang, Xiaohui; Li, Yuhui; Pingguan-Murphy, Belinda; Lu, Tian Jian; Xu, Feng
2015-01-01
Deoxyribonucleic acid (DNA) has emerged as a building brick for the fabrication of nanostructures with completely artificial architecture and geometry. The remarkable ability of DNA to build two- and three-dimensional structures raises the possibility of developing smart nanomachines with versatile controllability for various applications. Here, we review recent progress in engineering DNA machines for specific bioengineering and biomedical applications. PMID:25547514
ERIC Educational Resources Information Center
Wu, Wei
2010-01-01
Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…
24 CFR 242.78 - Zoning, deed, and building restrictions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Zoning, deed, and building... AUTHORITIES MORTGAGE INSURANCE FOR HOSPITALS Miscellaneous Requirements § 242.78 Zoning, deed, and building... to the project site, and shall comply with all applicable building and other governmental codes...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, R.T.; Coe, B.A.; Dick, J.D.
1981-01-01
Four state-owned building complexes have been evaluated within the city of Durango: the State Fish Hatchery, Fort Lewis College, the new State Highway Department Building near the Bodo Industrial Park, and the National Guard Building. Three of the state facilities in Durango are evaluated for geothermal systems on the assumption of taking geothermal water from a trunk line originating in the area north of Durango: the State Fish Hatchery, Fort Lewis College, and the new State Highway Department Building. The National Guard Building is evaluated on the basis of a water-to-air heat pump, with warm water derived from a hypothetical shallow aquifer immediately below the building site. Two geothermal options were separately evaluated for Fort Lewis College: a central heat exchanger system for delivery of 145°F heating water to the campus buildings and a central heat pump system for boosting the heating water to 200°F prior to delivery to the buildings; both systems require the installation of a distribution piping network for the entire campus area. Retrofit engineering for the State Fish Hatchery provides for the installation of a small-scale central distribution piping system to the several buildings, a central heat exchanger coupled to the geothermal trunk line, and the use of various fan coil and unit heaters for space heating. An option is provided for discharge-mixing the geothermal water into the fish ponds and runs in order to raise the hatchery water temperature a couple of degrees to increase fish production and yield. The heating system for the new State Highway Department Building is redesigned to replace the natural-gas-fired forced-air furnaces with a heat exchanger, hot water fan coils, and unit heaters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chao; Singh, Vijay P.; Mishra, Ashok K.
2013-02-06
This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among the several applications of the improved distribution, presented here in particular is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three other alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of available observed data.
Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized ‘overdispersion’ problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in hydrologic planning situations where rare rainfall events are of great importance.
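The conventional two-state Markov chain generator used above as a comparison baseline can be sketched in a few lines: wet/dry occurrence follows a first-order Markov chain, and amounts on wet days are drawn from a gamma distribution. The transition probabilities, gamma parameters, and function names below are illustrative assumptions, not values fitted in the study:

```python
import random

def simulate_rainfall(n_days, p_wet_after_dry=0.3, p_wet_after_wet=0.6,
                      shape=0.7, scale=8.0, seed=0):
    """Two-state (wet/dry) first-order Markov occurrence model with
    gamma-distributed amounts (mm) on wet days; parameters are illustrative."""
    rng = random.Random(seed)
    series, wet = [], False
    for _ in range(n_days):
        # transition probability depends only on yesterday's state
        wet = rng.random() < (p_wet_after_wet if wet else p_wet_after_dry)
        series.append(rng.gammavariate(shape, scale) if wet else 0.0)
    return series

series = simulate_rainfall(365)
wet_days = sum(1 for day in series if day > 0)
```

The improved generator in the paper replaces the independent occurrence/amount steps with a single Markov process whose day-to-day dependence is described by the bivariate mixed distribution, which is what lets it carry persistence and extremes through the simulation.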
Article and method of forming an article
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacy, Benjamin Paul; Kottilingam, Srikanth Chandrudu; Dutta, Sandip
Provided are an article and a method of forming an article. The method includes providing a metallic powder, heating the metallic powder to a temperature sufficient to join at least a portion of the metallic powder to form an initial layer, sequentially forming additional layers in a build direction by providing a distributed layer of the metallic powder over the initial layer and heating the distributed layer of the metallic powder, repeating the steps of sequentially forming the additional layers in the build direction to form a portion of the article having a hollow space formed in the build direction, and forming an overhang feature extending into the hollow space. The article includes an article formed by the method described herein.
Regional Earthquake Shaking and Loss Estimation
NASA Astrophysics Data System (ADS)
Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.
2009-04-01
This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR, and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as subcontractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty, and economic) at different levels of sophistication (0, 1, and 2) commensurate with the availability of the inventory of the human-built environment (Loss Mapping). Both Level 0 (similar to the PAGER system of the USGS) and Level 1 analyses of the ELER routine are based on obtaining intensity distributions analytically and estimating the total number of casualties and their geographic distribution either using regionally adjusted intensity-casualty or magnitude-casualty correlations (Level 0) or using regional building inventory data bases (Level 1).
For given basic source parameters the intensity distributions can be computed using: a) regional intensity attenuation relationships, b) intensity correlations with attenuation-relationship-based PGV, PGA, and spectral amplitudes, and c) intensity correlations with a synthetic Fourier amplitude spectrum. In Level 1 analysis, EMS98-based building vulnerability relationships are used for regional estimates of building damage and the casualty distributions. Results obtained from pilot applications of the Level 0 and Level 1 analysis modes of the ELER software to the 1999 M 7.4 Kocaeli, 1995 M 6.1 Dinar, and 2007 M 5.4 Bingol earthquakes in terms of ground shaking and losses are presented, and comparisons with the observed losses are made. The regional earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of the post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation and related Monte-Carlo type simulations.
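A Level 0 style intensity estimate from basic source parameters reduces, in its simplest form, to evaluating an intensity prediction equation over a grid of sites. The functional form and coefficients below are a generic illustration of such a relationship, not the regional attenuation relationships calibrated in ELER:

```python
import math

def predicted_intensity(magnitude, epicentral_km, depth_km=10.0,
                        c0=1.0, c1=1.5, c2=3.0):
    """Generic intensity attenuation of the common form
    I = c0 + c1*M - c2*log10(R_hyp); coefficients are illustrative only."""
    r_hyp = math.sqrt(epicentral_km ** 2 + depth_km ** 2)  # hypocentral distance
    return c0 + c1 * magnitude - c2 * math.log10(r_hyp)

# e.g., a magnitude 7.4 event observed at 20 km vs 150 km epicentral distance
near = predicted_intensity(7.4, 20.0)
far = predicted_intensity(7.4, 150.0)
```

In a full Level 0 run, such a relationship (regionally adjusted) is evaluated on a grid to produce the intensity map, which then feeds the intensity-casualty correlations.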
Data management in an object-oriented distributed aircraft conceptual design environment
NASA Astrophysics Data System (ADS)
Lu, Zhijie
In the competitive global marketplace, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides a major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, we are in need of the next-generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is first formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled.
To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the distributed object-oriented framework. By overcoming the shortcomings of the traditional approach to modeling aircraft conceptual design data, this data model makes it possible to capture specific detailed information of aircraft conceptual design without sacrificing generality, which is one of the most desired features of a data model for aircraft conceptual design. Based upon this data model, a prototype of the data management system, which is one of the fundamental building blocks of the NextADE, is implemented utilizing state-of-the-art information technologies. To demonstrate the efficacy of the proposed framework and the data management system, the NextADE is initially implemented using a general-purpose integration software package to integrate the prototype of the data management system with other building blocks of the design environment, such as disciplinary analysis programs and mission analysis programs. As experiments, two case studies are conducted in the integrated design environments. One is based upon a simplified conceptual design of a notional conventional aircraft; the other is a simplified conceptual design of an unconventional aircraft. As a result of the experiments, the proposed framework and the data management approach are shown to be feasible solutions to the research problems.
Markes, Alexander R.; Okundaye, Amenawon O.; Qu, Zhilin; Mende, Ulrike; Choi, Bum-Rak
2018-01-01
Multicellular spheroids generated through cellular self-assembly provide cytoarchitectural complexities of native tissue including three-dimensionality, extensive cell-cell contacts, and appropriate cell-extracellular matrix interactions. They are increasingly suggested as building blocks for larger engineered tissues to achieve shapes, organization, heterogeneity, and other biomimetic complexities. Application of these tissue culture platforms is of particular importance in cardiac research as the myocardium is comprised of distinct but intermingled cell types. Here, we generated scaffold-free 3D cardiac microtissue spheroids comprised of cardiac myocytes (CMs) and/or cardiac fibroblasts (CFs) and used them as building blocks to form larger microtissues with different spatial distributions of CMs and CFs. Characterization of fusing homotypic and heterotypic spheroid pairs revealed an important influence of CFs on fusion kinetics, but most strikingly showed rapid fusion kinetics between heterotypic pairs consisting of one CF and one CM spheroid, indicating that CMs and CFs self-sort in vitro into the intermixed morphology found in the healthy myocardium. We then examined electrophysiological integration of fused homotypic and heterotypic microtissues by mapping action potential propagation. Heterocellular elongated microtissues which recapitulate the disproportionate CF spatial distribution seen in the infarcted myocardium showed that action potentials propagate through CF volumes albeit with significant delay. Complementary computational modeling revealed an important role of CF sodium currents and the spatial distribution of the CM-CF boundary in action potential conduction through CF volumes. Taken together, this study provides useful insights for the development of complex, heterocellular engineered 3D tissue constructs and their engraftment via tissue fusion and has implications for arrhythmogenesis in cardiac disease and repair. PMID:29715271
Autonomous perception and decision making in cyber-physical systems
NASA Astrophysics Data System (ADS)
Sarkar, Soumik
2011-07-01
The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves the aspects of physical quantities such as time, energy and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision & control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involve distributed sensing, computation, control as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space. 
It is envisioned that the general contributions, made in this dissertation, will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures and future ground and air transportation systems.
A New Distributed Optimization for Community Microgrids Scheduling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starke, Michael R; Tomsovic, Kevin
This paper proposes a distributed optimization model for community microgrids considering building thermal dynamics and customer comfort preferences. The microgrid central controller (MCC) minimizes the total cost of operating the community microgrid, including fuel cost, purchasing cost, battery degradation cost, and voluntary load shedding cost based on the customers' consumption, while the building energy management systems (BEMSs) minimize their electricity bills as well as the cost associated with customer discomfort due to room temperature deviation from the set point. The BEMSs and the MCC exchange information on energy consumption and prices. When the optimization converges, the distributed generation scheduling, energy storage charging/discharging, and customers' consumption as well as the energy prices are determined. In particular, we integrate the detailed thermal dynamic characteristics of buildings into the proposed model. The heating, ventilation and air-conditioning (HVAC) systems can be scheduled intelligently to reduce the electricity cost while maintaining the indoor temperature in the comfort range set by customers. Numerical simulation results show the effectiveness of the proposed model.
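The price-and-consumption exchange between the MCC and the BEMSs is, at its core, a dual-decomposition loop: the coordinator posts a price, each building responds with its cost-minimizing consumption, and the price is updated until supply and demand balance. The quadratic generation cost and comfort penalty below are simplifying assumptions for illustration (the paper's model additionally includes battery degradation, load shedding, and HVAC thermal dynamics):

```python
def bems_response(price, setpoint, comfort_weight):
    # Each BEMS minimizes price*x + w*(x - setpoint)^2  =>  x = setpoint - price/(2w)
    return max(0.0, setpoint - price / (2.0 * comfort_weight))

def coordinate(setpoints, weights, gen_cost=0.5, step=0.05, iters=500):
    """Toy MCC/BEMS price coordination via subgradient updates."""
    price = 1.0
    for _ in range(iters):
        demand = sum(bems_response(price, s, w) for s, w in zip(setpoints, weights))
        supply = price / gen_cost            # generator with marginal cost a*g = price
        price += step * (demand - supply)    # raise price when demand exceeds supply
    return price, demand, supply

# three buildings with hypothetical setpoints (kW) and comfort weights
price, demand, supply = coordinate([10.0, 8.0, 12.0], [0.8, 1.0, 0.6])
```

With convex costs on both sides, this iteration converges to the market-clearing price, which is the sense in which "the optimization converges" and the schedules and prices are jointly determined.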
Skyshine study for next generation of fusion devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gohar, Y.; Yang, S.
1987-02-01
A shielding analysis for the next generation of fusion devices (ETR/INTOR) was performed to study the dose equivalent outside the reactor building during operation, including the contribution from neutrons and photons scattered back by collisions with air nuclei (the skyshine component). Two different three-dimensional geometrical models for a tokamak fusion reactor based on INTOR design parameters were developed for this study. In the first geometrical model, the reactor geometry and the spatial distribution of the deuterium-tritium neutron source were simplified for a parametric survey. The second geometrical model employed an explicit representation of the toroidal geometry of the reactor chamber and the spatial distribution of the neutron source. The MCNP general Monte Carlo code for neutron and photon transport was used to perform all the calculations. The energy distribution of the neutron source was used explicitly in the calculations with ENDF/B-V data. The dose equivalent results were analyzed as a function of the concrete roof thickness of the reactor building and the location outside the reactor building.
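The parametric dependence on roof thickness can be illustrated with a simple exponential attenuation model for the direct (roof-penetrating) component. The attenuation coefficient below is an assumed round number for concrete, not a value from the MCNP calculations, and the skyshine component scattered back by air is precisely the part this simple model does not capture, which is why a transport code is needed:

```python
import math

def transmitted_dose(unshielded_dose, roof_cm, mu_per_cm=0.08):
    """Direct-beam attenuation through a concrete roof (illustrative mu);
    the air-scattered skyshine contribution is not reduced this way."""
    return unshielded_dose * math.exp(-mu_per_cm * roof_cm)

# survey of relative dose vs. roof thickness (arbitrary unshielded dose of 100)
doses = [transmitted_dose(100.0, t) for t in (0, 50, 100, 150)]
```

Because the direct component falls off exponentially while skyshine does not, the skyshine term dominates the dose outside the building once the roof is thick enough, which is the motivation for treating it explicitly in the study.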
Building 736, second floor, view to southwest showing original window ...
Building 736, second floor, view to southwest showing original window on the left, and electrical controls and entry ladder on the right - Mare Island Naval Shipyard, Electrical Distribution Centers, Railroad Avenue near Eighteenth Street, Vallejo, Solano County, CA
Advanced Electric Distribution, Switching, and Conversion Technology for Power Control
NASA Technical Reports Server (NTRS)
Soltis, James V.
1998-01-01
The Electrical Power Control Unit currently under development by Sundstrand Aerospace for use on the Fluids Combustion Facility of the International Space Station is the precursor of modular power distribution and conversion concepts for future spacecraft and aircraft applications. This unit combines modular current-limiting flexible remote power controllers and paralleled power converters into one package. Each unit includes three 1-kW, current-limiting power converter modules designed for a variable-ratio load sharing capability. The flexible remote power controllers can be used in parallel to match load requirements and can be programmed for an initial ON or OFF state on powerup. The unit contains an integral cold plate. The modularity and hybridization of the Electrical Power Control Unit set the course for future spacecraft electrical power systems, both large and small. In such systems, the basic hybridized converter and flexible remote power controller building blocks could be configured to match power distribution and conversion capabilities to load requirements. In addition, the flexible remote power controllers could be configured in assemblies to feed multiple individual loads and could be used in parallel to meet the specific current requirements of each of those loads. Ultimately, the Electrical Power Control Unit design concept could evolve to a common switch module hybrid, or family of hybrids, for both converter and switchgear applications. By assembling hybrids of a common current rating and voltage class in parallel, researchers could readily adapt these units for multiple applications. The Electrical Power Control Unit concept has the potential to be scaled to larger and smaller ratings for both small and large spacecraft and for aircraft where high-power-density remote power controllers or power converters are required and a common replacement part is desired for multiples of a base current rating.
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing, and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster, and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains involving spatial properties.
We tested the performance of the platform based on taxi trajectory analysis. Results suggested that GISpark achieves excellent run time performance in spatiotemporal big data applications.
DOT National Transportation Integrated Search
1988-09-01
THIS GUIDE FOR DEVELOPERS, BUILDING OWNERS AND BUILDING MANAGERS IS ONE IN A SERIES OF SAMPLES OF TDM PLANS THAT ILLUSTRATE THE DESIGN AND PROPOSED APPLICATION OF TDM STRATEGIES. THIS SAMPLE PLAN WAS PREPARED FOR A FICTITIOUS BUILDING MANAGER NEAR DO...
DOT National Transportation Integrated Search
2010-06-10
Two primary contracting methods are used by most state transportation agencies to design and build infrastructure: design-bid-build and design-build. The objective of this study is to conduct a literature review to identify how ConnDOT's use of...
Agbenin, John O
2002-12-02
Growing concern about heavy metal contamination of agricultural lands under long-term application of inorganic fertilizers and organic wastes makes periodic risk assessment of heavy metal accumulation in arable lands imperative. As part of a much larger study to systematically document the status of heavy metals in savanna soils, this study investigated the distribution and dynamics of Cr and Ni in a savanna soil after 50 years of continuous cultivation and application of inorganic fertilizers and organic manures. The cultivated fields were fertilized with inorganic fertilizers (NPK), farmyard manure (FYM), or FYM + NPK for 50 years, and a control plot was under continuous cultivation for 50 years but received neither FYM nor NPK. Two uncultivated or natural sites were sampled as reference conditions for assessing the dynamics of Cr and Ni induced by cultivation and management practices. The distribution of Cr and Ni in the soil profiles exhibited eluvial-illuvial patterns. Sand and clay fractions explained between 62 and 90% of the variance in Cr and Ni concentration and distribution in the soil profiles. Mean Cr concentrations ranged from 17 to 59 mg kg(-1), while Ni varied from <1 mg kg(-1) in the topsoil to 16 mg kg(-1) in the subsoil. Mass balance calculations showed a loss of 10% Cr and 17% Ni in the FYM field, and approximately 4% Cr and 11% Ni in the NPK field, compared to the natural site after 50 years of cultivation. The control and FYM + NPK fields had, however, a positive balance of Cr and Ni. In general, it was concluded that existing soil management practices in this region are unlikely to lead to Cr and Ni build-up, probably because of the low rates of application of inorganic fertilizers, farmyard manure, and other organic wastes to the soils.
Modeling the Delivery Physiology of Distributed Learning Systems.
ERIC Educational Resources Information Center
Paquette, Gilbert; Rosca, Ioan
2003-01-01
Discusses instructional delivery models and their physiology in distributed learning systems. Highlights include building delivery models; types of delivery models, including distributed classroom, self-training on the Web, online training, communities of practice, and performance support systems; and actors (users) involved, including experts,…
46 CFR 386.15 - Distribution of handbills.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 8 2010-10-01 2010-10-01 false Distribution of handbills. 386.15 Section 386.15 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION MISCELLANEOUS REGULATIONS GOVERNING PUBLIC BUILDINGS AND GROUNDS AT THE UNITED STATES MERCHANT MARINE ACADEMY § 386.15 Distribution of handbills. The...
NASA Astrophysics Data System (ADS)
Dasari, Venkat R.; Sadlier, Ronald J.; Geerhart, Billy E.; Snow, Nikolai A.; Williams, Brian P.; Humble, Travis S.
2017-05-01
Well-defined and stable quantum networks are essential to realize functional quantum communication applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. In this paper, we describe new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.
Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library
ERIC Educational Resources Information Center
Fagan, Jody Condit; Keach, Jennifer A.
2010-01-01
When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…
Multi-Storey Air-Supported Building Construction
ERIC Educational Resources Information Center
Pohl, J. G.; Cowan, H. J.
1972-01-01
Multistory buildings, supported by internal air pressure and surrounded by a thin, flexible or rigid membrane acting both as structural container and external cladding, are feasible and highly economical for a number of building applications. (Author)
Review of the Application of Green Building and Energy Saving Technology
NASA Astrophysics Data System (ADS)
Tong, Zhineng
2017-12-01
The use of energy-saving technologies in green buildings should run through the entire process of building design, construction, and use, enabling green energy-saving technologies to achieve their maximum effectiveness in construction. The aim is to realize the sustainable development of green building, reduce energy consumption, minimize people's interference with the natural environment, and make “green” buildings suitable for people to live in.
NASA Astrophysics Data System (ADS)
Allison, M.; Gundersen, L. C.; Richard, S. M.; Dickinson, T. L.
2008-12-01
A coalition of the state geological surveys (AASG), the U.S. Geological Survey (USGS), and partners will receive NSF funding over 3 years under the INTEROP solicitation to start building the Geoscience Information Network (www.geoinformatics.info/gin) a distributed, interoperable data network. The GIN project will develop standardized services to link existing and in-progress components using a few standards and protocols, and work with data providers to implement these services. The key components of this network are 1) catalog system(s) for data discovery; 2) service definitions for interfaces for searching catalogs and accessing resources; 3) shared interchange formats to encode information for transmission (e.g. various XML markup languages); 4) data providers that publish information using standardized services defined by the network; and 5) client applications adapted to use information resources provided by the network. The GIN will integrate and use catalog resources that currently exist or are in development. We are working with the USGS National Geologic Map Database's existing map catalog, with the USGS National Geological and Geophysical Data Preservation Program, which is developing a metadata catalog (National Digital Catalog) for geoscience information resource discovery, and with the GEON catalog. Existing interchange formats will be used, such as GeoSciML, ChemML, and Open Geospatial Consortium sensor, observation and measurement MLs. Client application development will be fostered by collaboration with industry and academic partners. The GIN project will focus on the remaining aspects of the system -- service definitions and assistance to data providers to implement the services and bring content online - and on system integration of the modules. Initial formal collaborators include the OneGeology-Europe consortium of 27 nations that is building a comparable network under the EU INSPIRE initiative, GEON, Earthchem, and GIS software company ESRI. 
OneGeology-Europe and GIN have agreed to integrate their networks, effectively adopting global standards among geological surveys that are available across the entire field. ESRI is creating a Geology Data Model for ArcGIS software to be compatible with GIN, and other companies are expressing interest in adapting their services, applications, and clients to take advantage of the large data resources planned to become available through GIN.
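Component 3 of the GIN architecture above is a shared XML interchange format such as GeoSciML. The fragment below sketches, with Python's standard library, how a client application might consume such a record. The element names and namespace are simplified stand-ins for illustration, not the actual GeoSciML schema.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch only: parsing a tiny GeoSciML-like record of the kind a
# GIN data provider might publish. The tag names and the namespace URI below
# are invented placeholders, not the real GeoSciML vocabulary.

record_xml = """
<gsml:GeologicUnit xmlns:gsml="urn:example:gsml">
  <gsml:name>Morrison Formation</gsml:name>
  <gsml:age>Late Jurassic</gsml:age>
  <gsml:lithology>sandstone</gsml:lithology>
</gsml:GeologicUnit>
"""

ns = {"gsml": "urn:example:gsml"}
root = ET.fromstring(record_xml)
unit = {
    "name": root.findtext("gsml:name", namespaces=ns),
    "age": root.findtext("gsml:age", namespaces=ns),
    "lithology": root.findtext("gsml:lithology", namespaces=ns),
}
print(unit)
```

Because every provider encodes results in the same agreed markup, a single client such as this can consume data from any node in the network, which is the interoperability argument the GIN project makes.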
Conceptual Model Scenarios for the Vapor Intrusion Pathway
This report provides simplified simulation examples to illustrate graphically how subsurface conditions and building-specific characteristics determine the chemical distribution and the indoor air concentration relative to a source concentration.
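The "indoor air concentration relative to a source concentration" in this abstract is conventionally expressed as an attenuation factor. A minimal sketch of that relationship follows; the alpha values are invented for illustration and are not taken from the report.

```python
# Minimal sketch of the vapor-intrusion attenuation relationship:
#   C_indoor = alpha * C_source
# where alpha (the attenuation factor) bundles the subsurface and
# building-specific effects. The alpha values below are hypothetical.

def indoor_concentration(c_source_ugm3, alpha):
    """Return the indoor air concentration for a given source concentration."""
    return alpha * c_source_ugm3

# Building and subsurface conditions can shift alpha by orders of magnitude:
for label, alpha in [("tight slab, deep source", 1e-5),
                     ("cracked slab, shallow source", 1e-3)]:
    print(label, indoor_concentration(1000.0, alpha))
```

Graphing such scenarios side by side is essentially what the report's simplified simulation examples do: same source concentration, very different indoor exposure.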
Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...
Assessment of Distributed Generation Potential in Japanese Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Nan; Marnay, Chris; Firestone, Ryan
2005-05-25
To meet growing energy demands, energy efficiency, renewable energy, and on-site generation coupled with effective utilization of exhaust heat will all be required. Additional benefit can be achieved by integrating these distributed technologies into distributed energy resource (DER) systems (or microgrids). This research investigates a method of choosing economically optimal DER, expanding on prior studies at the Berkeley Lab using the DER design optimization program, the Distributed Energy Resources Customer Adoption Model (DER-CAM). DER-CAM finds the optimal combination of installed equipment from available DER technologies, given prevailing utility tariffs, site electrical and thermal loads, and a menu of available equipment. It provides a global optimization, albeit idealized, that shows how the site energy loads can be served at minimum cost by selection and operation of on-site generation, heat recovery, and cooling. Five prototype Japanese commercial buildings are examined and DER-CAM applied to select the economically optimal DER system for each. The five building types are office, hospital, hotel, retail, and sports facility. Based on the optimization results, energy and emission reductions are evaluated. Furthermore, a Japan-U.S. comparison study of policy, technology, and utility tariffs relevant to DER installation is presented. Significant decreases in fuel consumption, carbon emissions, and energy costs were seen in the DER-CAM results. Savings were most noticeable in the sports facility (a very favourable CHP site), followed by the hospital, hotel, and office building.