Sample records for model building methods

  1. A Comparison of Two Balance Calibration Model Building Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Ulbrich, Norbert

    2007-01-01

    Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
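
    As a concrete illustration of the second building method named above, the following is a minimal sketch of forward stepwise regression for selecting calibration math-model terms. The candidate term set, the p-value entry threshold, and the simulated loads are illustrative assumptions, not the paper's actual regression setup.

```python
# Forward stepwise regression: greedily add the candidate regressor whose
# coefficient is most significant, stop when no term passes the threshold.
import numpy as np
import statsmodels.api as sm

def forward_stepwise(X, y, p_enter=0.05):
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        pvals = {}
        for j in remaining:
            model = sm.OLS(y, sm.add_constant(X[:, selected + [j]])).fit()
            pvals[j] = model.pvalues[-1]          # p-value of the new term
        best = min(pvals, key=pvals.get)
        if pvals[best] > p_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return selected, sm.OLS(y, sm.add_constant(X[:, selected])).fit()

# Example: fit one simulated balance response from load combinations.
rng = np.random.default_rng(0)
loads = rng.uniform(-1, 1, (200, 3))              # three load components
candidates = np.column_stack([loads,              # linear terms
                              loads**2,           # pure quadratic terms
                              loads[:, [0]] * loads[:, [1]]])  # interaction
response = 2*loads[:, 0] + 0.3*loads[:, 0]*loads[:, 1] + rng.normal(0, 0.01, 200)
terms, fit = forward_stepwise(candidates, response)
print(terms, fit.rsquared)
```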

  2. Hybrid Modeling Based on Scsg-Br and Orthophoto

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, Y.; Yue, T.; Li, X.; Huang, W.; He, C.; Wu, Z.

    2018-05-01

    With the development of the digital city, digital applications are becoming more widespread while urban buildings grow more complex, so establishing an effective data model is key to representing urban building models accurately. In addition, combining 3D building models with remote sensing data has become a trend in building digital cities, but it involves large amounts of data and leads to data redundancy. To overcome the limitations of modelling with constructive solid geometry (CSG) alone, this paper presents a hybrid modelling method based on SCSG-BR for representing urban buildings. On one hand, an improved CSG method, called "Spatial CSG" (SCSG), is used to represent the exterior shape of urban buildings. On the other hand, the boundary representation (BR) method represents the topological relationships between the geometric elements of a building, with textures treated as attribute data of its walls and roof. Furthermore, a method combining a file database and a relational database is used to manage the 3D building model data, which simplifies the texture mapping process. During data processing, a constrained least-squares algorithm is used to orthogonalize the building polygons and adjust their topology to ensure the accuracy of the modelling data. Finally, the urban building model is matched with the corresponding orthophoto. Data from Denver, Colorado, USA are used to establish a realistic urban building model. The results show that the SCSG-BR method can represent the topological relations of buildings more precisely. The organization and management of the model data reduce redundancy and improve modelling speed, and the combination of orthophoto and building model further strengthens applications such as view analysis and spatial query, enhancing the scope of digital city applications.

  3. Impacts of building geometry modeling methods on the simulation results of urban building energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen

    Urban-scale building energy modeling (UBEM), which uses building modeling to understand how a group of buildings will perform together, is attracting increasing attention in the energy modeling field. Unlike modeling a single building, which uses detailed information, UBEM generally uses existing building stock data consisting of high-level building information. This study evaluated the impacts of three zoning methods and the use of floor multipliers on the simulated energy use of 940 office and retail buildings in three climate zones using City Building Energy Saver. The first zoning method, OneZone, creates one thermal zone per floor using the target building's footprint. The second zoning method, AutoZone, splits the building's footprint into perimeter and core zones; a novel, pixel-based automatic zoning algorithm is developed for this method. The third zoning method, Prototype, uses the U.S. Department of Energy's reference building prototype shapes. Results show that the simulated source energy use of buildings with the floor multiplier is marginally higher, by up to 2.6%, than that of models with each floor modeled explicitly, which take two to three times longer to run. Compared with the AutoZone method, the OneZone method results in decreased thermal loads and smaller equipment capacities: 15.2% smaller fan capacity, 11.1% smaller cooling capacity, 11.0% smaller heating capacity, 16.9% lower heating loads, and 7.5% lower cooling loads. Source energy use differences range from -7.6% to 5.1%. When comparing the Prototype method with the AutoZone method, source energy use differences range from -12.1% to 19.0%, and larger ranges of differences are found for the thermal loads and equipment capacities. This study demonstrated that zoning methods have a significant impact on the simulated energy use of UBEM. Finally, one recommendation resulting from this study is to use the AutoZone method with the floor multiplier to obtain accurate results while balancing simulation run time for UBEM.

  4. Impacts of building geometry modeling methods on the simulation results of urban building energy models

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen

    2018-02-20

    Urban-scale building energy modeling (UBEM), which uses building modeling to understand how a group of buildings will perform together, is attracting increasing attention in the energy modeling field. Unlike modeling a single building, which uses detailed information, UBEM generally uses existing building stock data consisting of high-level building information. This study evaluated the impacts of three zoning methods and the use of floor multipliers on the simulated energy use of 940 office and retail buildings in three climate zones using City Building Energy Saver. The first zoning method, OneZone, creates one thermal zone per floor using the target building's footprint. The second zoning method, AutoZone, splits the building's footprint into perimeter and core zones; a novel, pixel-based automatic zoning algorithm is developed for this method. The third zoning method, Prototype, uses the U.S. Department of Energy's reference building prototype shapes. Results show that the simulated source energy use of buildings with the floor multiplier is marginally higher, by up to 2.6%, than that of models with each floor modeled explicitly, which take two to three times longer to run. Compared with the AutoZone method, the OneZone method results in decreased thermal loads and smaller equipment capacities: 15.2% smaller fan capacity, 11.1% smaller cooling capacity, 11.0% smaller heating capacity, 16.9% lower heating loads, and 7.5% lower cooling loads. Source energy use differences range from -7.6% to 5.1%. When comparing the Prototype method with the AutoZone method, source energy use differences range from -12.1% to 19.0%, and larger ranges of differences are found for the thermal loads and equipment capacities. This study demonstrated that zoning methods have a significant impact on the simulated energy use of UBEM. Finally, one recommendation resulting from this study is to use the AutoZone method with the floor multiplier to obtain accurate results while balancing simulation run time for UBEM.

  5. Review of Methods for Buildings Energy Performance Modelling

    NASA Astrophysics Data System (ADS)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    The research presented in this paper gives a brief review of methods used for modelling the energy performance of buildings. It also gives a comprehensive review of the advantages and disadvantages of the available methods, as well as of the input parameters used for modelling building energy performance. The European EPBD directive mandates an energy certification procedure, which provides insight into building energy performance via existing energy certificate databases. Some of the modelling methods mentioned in this paper were developed using data sets of buildings that have already undergone the energy certification procedure. Such a database is used in this paper; the majority of buildings in it have already undergone some form of partial retrofitting - replacement of windows or installation of thermal insulation - but still have poor energy performance. The case study utilizes an energy certificate database of residential units in Croatia (over 400 buildings) to determine the dependence between building energy performance and the database variables using statistical dependence tests, as sketched below. Energy performance is expressed as a building energy efficiency rating (from A+ to G) based on the specific annual energy need for heating under referential climatic data [kWh/(m2a)]. The independent variables in the database are the surfaces and volume of the conditioned part of the building, the building shape factor, energy used for heating, CO2 emission, building age, and year of reconstruction. The results give insight into the possibilities of the reviewed modelling methods, together with an analysis of the dependencies between building energy performance as the dependent variable and the independent variables from the database. The presented results could be used to develop a new predictive model of building energy performance.
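
    As a concrete illustration of the statistical dependence tests mentioned above, the following is a minimal sketch relating an ordinal efficiency rating (A+ to G) to continuous building variables via Spearman rank correlation. The column names, data, and choice of test are illustrative assumptions, not the paper's actual procedure.

```python
# Spearman rank correlation between an ordinal energy class and continuous
# building variables; suitable when the dependent variable is a ranked grade.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "rate":         ["C", "D", "B", "G", "E", "C", "F", "D"],   # energy class
    "shape_factor": [0.45, 0.62, 0.38, 0.95, 0.71, 0.50, 0.88, 0.66],
    "building_age": [35, 48, 12, 75, 55, 30, 68, 44],
})
order = ["A+", "A", "B", "C", "D", "E", "F", "G"]
df["rate_code"] = df["rate"].map(order.index)     # encode classes as ranks

for var in ["shape_factor", "building_age"]:
    rho, p = stats.spearmanr(df["rate_code"], df[var])
    print(f"{var}: Spearman rho={rho:.2f}, p={p:.3f}")
```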

  6. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction.

    PubMed

    Yan, Yiming; Gao, Fengjiao; Deng, Shupei; Su, Nan

    2017-01-24

    In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods rely heavily on the completeness of offline-constructed building models, which is not easily guaranteed since buildings in modern cities come in a wide variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings was introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and a recently proposed 'occlusions of random textures' model is then used to refine the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved with airborne-like images and satellite images are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily performed within our framework, and that better visualization results are obtained with airborne-like images, which can further be replaced by UAV images.

  7. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    NASA Astrophysics Data System (ADS)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  8. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data

    PubMed Central

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-01-01

    With rapid urbanization, highly accurate and semantically rich 3D virtualization of building assets is becoming more critical for supporting various applications, including urban planning, emergency response, and location-based services. Many research efforts have sought to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully automated photogrammetric computer vision system that enables the massive generation of highly accurate building models remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is regularizing the noise introduced into the boundary of a building object retrieved from raw data when its true shape is unknown. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from noisy building-boundary information in a progressive manner. This study covers the full chain of 3D building modeling, from low-level processing to realistic 3D rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues, including outer boundaries, intersection lines, and step lines, are extracted. Topological relations among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are estimated automatically using Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested on the International Society for Photogrammetry and Remote Sensing (ISPRS) benchmark datasets. The results show that the proposed method can robustly produce accurate, regularized 3D building rooftop models. PMID:28335486

  9. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data.

    PubMed

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-03-19

    With rapid urbanization, highly accurate and semantically rich 3D virtualization of building assets is becoming more critical for supporting various applications, including urban planning, emergency response, and location-based services. Many research efforts have sought to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully automated photogrammetric computer vision system that enables the massive generation of highly accurate building models remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is regularizing the noise introduced into the boundary of a building object retrieved from raw data when its true shape is unknown. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from noisy building-boundary information in a progressive manner. This study covers the full chain of 3D building modeling, from low-level processing to realistic 3D rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues, including outer boundaries, intersection lines, and step lines, are extracted. Topological relations among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are estimated automatically using Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested on the International Society for Photogrammetry and Remote Sensing (ISPRS) benchmark datasets. The results show that the proposed method can robustly produce accurate, regularized 3D building rooftop models.

  10. Semi-Automatic Building Models and FAÇADE Texture Mapping from Mobile Phone Images

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Kim, T.

    2016-06-01

    Research on 3D urban modelling has been actively carried out for a long time, and the need for it has recently grown rapidly due to improved geo-web services and the popularity of smart devices. 3D urban models such as those in Google Earth are currently built from aerial photos, but this approach has limitations: immediate updates when buildings change are difficult, many buildings lack a 3D model and texture, and large resources are needed for maintenance and updating. To resolve these limitations, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images, and analyze the modelling results against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated by this method were compared with measured values of real buildings by comparing the ratios of model edge lengths to measured edge lengths; the average error of the length ratio was 5.8%. With this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.

  11. A Hierarchical Building Segmentation in Digital Surface Models for 3D Reconstruction

    PubMed Central

    Yan, Yiming; Gao, Fengjiao; Deng, Shupei; Su, Nan

    2017-01-01

    In this study, a hierarchical method for segmenting buildings in a digital surface model (DSM), used in a novel framework for 3D reconstruction, is proposed. Most 3D reconstructions of buildings are model-based. However, these methods rely heavily on the completeness of offline-constructed building models, which is not easily guaranteed since buildings in modern cities come in a wide variety of types. Therefore, a model-free framework using a high-precision DSM and texture images of buildings was introduced. There are two key problems with this framework. The first is how to accurately extract the buildings from the DSM. Most segmentation methods are limited either by terrain factors or by the difficult choice of parameter settings. A level-set method is employed to roughly find the building regions in the DSM, and a recently proposed ‘occlusions of random textures’ model is then used to refine the local segmentation of the buildings. The second problem is how to generate the facades of buildings. Synergizing with the corresponding texture images, we propose a roof-contour-guided interpolation of building facades. The 3D reconstruction results achieved with airborne-like images and satellite images are compared. Experiments show that the segmentation method performs well, that 3D reconstruction is easily performed within our framework, and that better visualization results are obtained with airborne-like images, which can further be replaced by UAV images. PMID:28125018

  12. Automatic Building Damage Detection Method Using High-Resolution Remote Sensing Images and 3D GIS Model

    NASA Astrophysics Data System (ADS)

    Tu, Jihui; Sui, Haigang; Feng, Wenqing; Song, Zhina

    2016-06-01

    In this paper, a novel approach to building damage detection is proposed using high-resolution remote sensing images and 3D GIS model data. Traditional methods detect buildings damaged by an earthquake, but little attention has been paid to distinguishing the various damage types (e.g., slightly damaged, severely damaged, and totally collapsed). We therefore detect the different damage types using both 2D and 3D features of the scene, since the real world we live in is three-dimensional. The proposed method first geometrically corrects the post-disaster remote sensing image using the 3D GIS model or RPC parameters, and then detects the different damage types using the change in height and area between the pre- and post-disaster data together with the texture features of the post-disaster image. The results, evaluated on a selected study site in the Beichuan earthquake ruins, Sichuan, show that this method is feasible and effective for building damage detection. They also show that the proposed method is easily applicable and well suited for rapid damage assessment after natural disasters.

  13. An integrated environmental and health performance quantification model for pre-occupancy phase of buildings in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaodong, E-mail: eastdawn@tsinghua.edu.cn; Su, Shu, E-mail: sushuqh@163.com; Zhang, Zhihui, E-mail: zhzhg@tsinghua.edu.cn

    To comprehensively pre-evaluate the damage to both the environment and human health caused by construction activities in China, this paper presents an integrated building environmental and health performance (EHP) assessment model based on the Building Environmental Performance Analysis System (BEPAS) and the Building Health Impact Analysis System (BHIAS), and offers a new inventory data estimation method. The new model follows the life cycle assessment (LCA) framework; the inventory analysis step involves bill of quantity (BOQ) data collection, consumption data formation, and environmental profile transformation. The consumption data are derived from engineering drawings and quotas so that the assessment can be conducted before construction for pre-evaluation. The new model classifies building impacts into three safeguard areas: ecosystems, natural resources, and human health. Thus, it considers environmental impacts as well as damage to human wellbeing. The monetization approach, the distance-to-target method, and the panel method are offered as optional weighting approaches. Finally, nine residential buildings of different structural types are taken as case studies to test the operability of the integrated model. The results indicate that the new model can effectively pre-evaluate building EHP and that the structure type significantly affects the performance of residential buildings.

  14. Modal mass estimation from ambient vibrations measurement: A method for civil buildings

    NASA Astrophysics Data System (ADS)

    Acunzo, G.; Fiorini, N.; Mori, F.; Spina, D.

    2018-01-01

    A new method for estimating the modal mass ratios of buildings from unscaled mode shapes identified from ambient vibrations is presented. The method is based on the Multi Rigid Polygons (MRP) model, in which each floor of the building is ideally divided into several non-deformable polygons that move independently of each other. The whole mass of the building is concentrated at the centroids of the polygons, and the experimental mode shapes are expressed in terms of rigid translations and rotations. In this way, the mass matrix of the building can easily be computed from simple information about the geometry and materials of the structure. The modal mass ratios can then be obtained through the classical equation of structural dynamics (stated below). Ambient vibration measurements must be performed in accordance with the MRP model, using at least two biaxial accelerometers per polygon. After a brief illustration of the theoretical background of the method, numerical validations are presented, analysing the method's sensitivity to different possible sources of error. Quality indexes are defined for evaluating the approximation of the modal mass ratios obtained from a given MRP model. The applicability of the proposed model to real buildings is illustrated through two experimental applications. The first considers a geometrically irregular reinforced concrete building, using a calibrated finite element model to validate the results of the method. The second refers to a historical monumental masonry building with a more complex geometry and less available information. In both cases, MRP models with different numbers of rigid polygons per floor are compared.
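
    For reference, the classical equation invoked above is, in its standard textbook form, the effective-modal-mass expression below; the notation is ours, not the paper's.

```latex
% Standard effective-modal-mass relations (our notation, not the paper's):
% M = mass matrix assembled from the rigid-polygon geometry and materials,
% phi_i = i-th (unscaled) experimental mode shape, r = influence vector.
\[
  m_i^{*} = \frac{\left(\boldsymbol{\phi}_i^{T}\mathbf{M}\,\mathbf{r}\right)^{2}}
                 {\boldsymbol{\phi}_i^{T}\mathbf{M}\,\boldsymbol{\phi}_i},
  \qquad
  \Gamma_i = \frac{m_i^{*}}{\mathbf{r}^{T}\mathbf{M}\,\mathbf{r}} .
\]
% Both numerator and denominator of m_i* scale with the square of the mode
% shape, so Gamma_i is invariant to the arbitrary scaling of ambient-vibration
% mode shapes -- which is what makes unscaled shapes usable here.
```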

  15. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    NASA Astrophysics Data System (ADS)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase of resolution, remote sensing images carry a heavier information load, more noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a building extraction method for high-resolution remote sensing images based on a Markov model. The method introduces Contourlet-domain map clustering together with a Markov model to capture and enhance the contour and texture information of image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the built-up area. Through multi-scale segmentation and extraction of image features, fine extraction from the built-up area down to individual buildings is realized. Experiments show that, compared with traditional pixel-level information extraction, this method can suppress the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture, remove shadows, vegetation, and other pseudo-building information, and achieve better precision, accuracy, and completeness in building extraction.

  16. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    NASA Astrophysics Data System (ADS)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not only limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with an optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions of existing buildings directly translate into energy cost saving in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and localization issues of 2D thermal image-based inspection, a new computer vision-based method is presented for automated 3D spatio-thermal modeling of building environments from images and localizing the thermal images into the 3D reconstructed scenes, which helps better characterize the as-is condition of existing buildings in 3D. By using these models, auditors can conduct virtual walk-through in buildings and explore the as-is condition of building geometry and the associated thermal conditions in 3D. Second, to address the challenges in qualitative and subjective interpretation of visual data, a new model-based method is presented to convert the 3D thermal profiles of building environments into their associated energy performance metrics. More specifically, the Energy Performance Augmented Reality (EPAR) models are formed which integrate the actual 3D spatio-thermal models ('as-is') with energy performance benchmarks ('as-designed') in 3D. In the EPAR models, the presence and location of potential energy problems in building environments are inferred based on performance deviations. The as-is thermal resistances of the building assemblies are also calculated at the level of mesh vertex in 3D. Then, based on the historical weather data reflecting energy load for space conditioning, the amount of heat transfer that can be saved by improving the as-is thermal resistances of the defective areas to the recommended level is calculated, and the equivalent energy cost for this saving is estimated. 
The outcome provides building practitioners with unique information that can facilitate energy-efficient retrofit decision-making. This is a major departure from offhand calculations that are based on historical cost data of industry best practices. Finally, to improve the reliability of BIM-based energy performance modeling and analysis for existing buildings, a new model-based automated method is presented to map actual thermal resistance measurements at the level of 3D vertexes to the associated BIM elements and update their corresponding thermal properties in the gbXML schema. By reflecting the as-is building condition in the BIM-based energy modeling process, this method bridges the gap between the architectural information in the as-designed BIM and the as-is building condition for accurate energy performance analysis. The performance of each method was validated on ten case studies from interiors and exteriors of existing residential and instructional buildings in IL and VA. The extensive experimental results show the promise of the proposed methods in addressing the fundamental challenges of (1) visual sensing: scaling 2D visual assessments to real-world building environments and localizing energy problems; (2) analytics: subjective and qualitative assessments; and (3) BIM-based building energy analysis: a lack of procedures for reflecting the as-is building condition in the energy modeling process. Beyond the technical contributions, the domain expert surveys conducted in this dissertation show that the proposed methods have the potential to improve the quality of thermographic inspection processes and complement current building energy analysis tools.

  17. Stochastic and Geometric Reasoning for Indoor Building Models with Electric Installations - Bridging the Gap Between GIS and BIM

    NASA Astrophysics Data System (ADS)

    Dehbi, Y.; Haunert, J.-H.; Plümer, L.

    2017-10-01

    3D city and building models according to CityGML encode the geometry, represent the structure, and model semantically relevant building parts such as doors, windows, and balconies. Building information models support building design, construction, and facility management; in contrast to CityGML, they also include objects which cannot be observed from the outside. Three-dimensional indoor models constitute the missing link between both worlds, but their derivation is expensive: the semantic automatic interpretation of 3D point clouds of indoor environments is a methodically demanding task, data acquisition is costly and difficult, and laser scanners and image-based methods require access to every room. Based on an approach which does not require an additional geometry acquisition of building interiors, we propose an attempt to fill the gap between 3D building models and building information models. From sparse observations such as the building footprint and room areas, 3D indoor models are generated using combinatorial and stochastic reasoning. The derived models are extended with structures that are not observable a priori, such as electrical installations. Gaussian mixtures and linear and bilinear constraints are used to represent the background knowledge and structural regularities. The derivation of hypothesised models is performed by stochastic reasoning using graphical models, Gauss-Markov models, and MAP estimators.

  18. Indoor 3D Route Modeling Based On Estate Spatial Data

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Wen, Y.; Jiang, J.; Huang, W.

    2014-04-01

    An indoor three-dimensional route model is essential for intelligent indoor navigation and emergency evacuation. This paper is motivated by the need to construct indoor route models automatically and, as far as possible, from existing data. By comparing existing building data sources, the paper first explains why estate spatial management data was chosen as the data source. An applicable method for constructing a three-dimensional route model of a building is then introduced by establishing the mapping relationship between geographic entities and their topological expression. The data model is a weighted graph consisting of nodes and paths that expresses the spatial relationships and topological structure of the building components. Modelling the internal space of a building proceeds in two key steps: (1) a route model is constructed for each floor, including path extraction for corridors using a constrained-edge Delaunay triangulation algorithm and fusion of room nodes into the path; (2) the single-floor route models are connected through stairs and elevators, and the multi-floor route model is generated. To validate the method, a shopping mall called "Longjiang New City Plaza" in Nanjing was chosen as a case study, and the whole building space was modelled according to the method above. By integrating an existing path-finding algorithm, the usability of the modelling method was verified, showing that the indoor three-dimensional route modelling method based on estate spatial data can support indoor route planning and evacuation route design very well.

  19. Application of BIM Technology in Building Water Supply and Drainage Design

    NASA Astrophysics Data System (ADS)

    Wei, Tianyun; Chen, Guiqing; Wang, Junde

    2017-12-01

    Through the application of BIM technology, the ideas of building water supply and drainage designers can be tied to the model, and the various factors that affect water supply and drainage design can be considered more comprehensively. BIM (Building Information Modeling) technology assists in improving the design process for building water supply and drainage, promotes water supply and drainage planning, enriches the design methods, and improves the design level of water supply and drainage systems and overall building quality. A fuzzy comprehensive evaluation method is used to analyze the advantages of BIM technology in building water supply and drainage design. The application prospects of BIM technology are therefore well worth promoting.

  20. Building Energy Modeling and Control Methods for Optimization and Renewables Integration

    NASA Astrophysics Data System (ADS)

    Burger, Eric M.

    This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. This model is capable of approximating unmodeled dynamics within a building by learning from sensor data. Control techniques encompass the application of optimal control theory, model predictive control, and convex distributed optimization to TCLs. First, we present the alternative control trajectory (ACT) representation, a novel method for the approximate optimization of non-convex discrete systems. This approach enables the optimal control of a population of non-convex agents using distributed convex optimization techniques. Second, we present a distributed convex optimization algorithm for the control of a TCL population. Experimental results demonstrate the application of this algorithm to the problem of renewable energy generation following. This dissertation contributes to the development of intelligent energy management systems for buildings by presenting a suite of novel and adaptable modeling and control techniques. Applications focus on optimizing the performance of building operations and on facilitating the integration of renewable energy resources.
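
    As background to the TCL modeling mentioned above, the following is a minimal sketch of the standard first-order "equivalent thermal parameter" model commonly used for thermostatically controlled loads. The parameter values and the hysteretic switching rule are illustrative assumptions, not the dissertation's calibrated model.

```python
# First-order discrete-time thermal model of a cooling TCL with a
# hysteretic thermostat: T[k+1] = a*T[k] + (1-a)*(T_out + m*R*P).
import numpy as np

R, C, P = 2.0, 10.0, -14.0       # thermal resistance, capacitance, cooling power
dt, T_set, db = 1/60, 22.0, 1.0  # time step [h], setpoint and deadband [C]
a = np.exp(-dt / (R * C))        # discrete-time decay factor

T, m = 24.0, 1                   # indoor temperature, compressor on/off state
for k in range(120):             # simulate two hours at one-minute resolution
    T_out = 30.0                 # constant ambient temperature for the sketch
    T = a * T + (1 - a) * (T_out + m * R * P)   # thermal state update
    if T <= T_set - db / 2:      # hysteresis: switch off at lower band edge
        m = 0
    elif T >= T_set + db / 2:    # switch on at upper band edge
        m = 1
print(f"final T = {T:.2f} C, state = {m}")
```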

  1. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  2. LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings

    NASA Astrophysics Data System (ADS)

    Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan

    2018-01-01

    This paper applies Shepard's method to the original LiDAR point cloud data to generate a regular-grid DSM, and filters the ground and non-ground point clouds through a double least-squares method to obtain a regularized DSM. A region-growing method is used to segment the regularized DSM and remove non-building points, yielding the building point cloud information. The Canny operator is then used to extract the edges of the segmented buildings, and Hough-transform line detection is applied to obtain regular, smooth, and uniform building edges. Finally, the E3De3 software is used to establish the 3D models of the buildings.
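
    As a concrete illustration of the first step, the following is a minimal sketch of Shepard's inverse-distance-weighted interpolation used to resample scattered LiDAR returns onto a regular DSM grid. The grid cell size, neighbour count, and power parameter are illustrative assumptions.

```python
# Shepard's method (inverse-distance weighting) over the k nearest points.
import numpy as np
from scipy.spatial import cKDTree

def shepard_dsm(xyz, cell=1.0, k=8, power=2.0):
    """IDW-interpolate point heights onto a regular grid."""
    xy, z = xyz[:, :2], xyz[:, 2]
    xmin, ymin = xy.min(axis=0)
    xmax, ymax = xy.max(axis=0)
    gx, gy = np.meshgrid(np.arange(xmin, xmax, cell),
                         np.arange(ymin, ymax, cell))
    tree = cKDTree(xy)
    dist, idx = tree.query(np.column_stack([gx.ravel(), gy.ravel()]), k=k)
    w = 1.0 / np.maximum(dist, 1e-12) ** power   # inverse-distance weights
    dsm = (w * z[idx]).sum(axis=1) / w.sum(axis=1)
    return dsm.reshape(gx.shape)

points = np.random.default_rng(1).uniform(0, 50, (5000, 3))  # fake point cloud
print(shepard_dsm(points).shape)
```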

  3. Method development of damage detection in asymmetric buildings

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Thambiratnam, David P.; Chan, Tommy H. T.; Nguyen, Andy

    2018-01-01

    Aesthetics and functionality requirements have caused most recent buildings to be asymmetric. Such buildings exhibit complex vibration characteristics under dynamic loads, as the lateral and torsional components of vibration are coupled; they are referred to as torsionally coupled buildings and require three-dimensional modelling and analysis. In spite of much recent research and some successful applications of vibration-based damage detection methods to civil structures, application to asymmetric buildings remains a challenging task for structural engineers, and there has been relatively little research on detecting and locating damage specific to torsionally coupled asymmetric buildings. This paper compares the vibration behaviour of symmetric and asymmetric buildings and then uses the vibration characteristics to predict damage in them, making evident the need for a method specific to asymmetric buildings. Towards this end, the paper modifies the traditional modal-strain-energy damage index (stated below) by decomposing the mode shapes into their lateral and vertical components to form component-specific damage indices. The improved approach combines the modified strain-energy damage indices with the modal flexibility method, adapted to three-dimensional structures, to form a new damage indicator. The procedure is illustrated through numerical studies of three-dimensional five-story symmetric and asymmetric frame structures with the same layout, after validating the modelling techniques through experimental testing of a laboratory-scale asymmetric building model. Vibration parameters obtained from finite element analysis of the intact and damaged building models are then fed into the proposed algorithms to detect and locate single and multiple damage in these buildings. The results obtained for a number of damage scenarios confirm the feasibility of the proposed vibration-based damage detection method for three-dimensional asymmetric buildings.
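
    For reference, the traditional modal-strain-energy damage index that the paper modifies is, in its standard beam formulation (the Stubbs index), the expression below; this is the textbook form, not the paper's component-decomposed variant.

```latex
% Stubbs damage index for element j: phi_i'' is the curvature of mode
% shape i over a member of length L; starred quantities are damaged-state.
\[
  \beta_j =
  \frac{\displaystyle\sum_i
        \left[\int_j \big(\phi_i^{*\prime\prime}\big)^{2}\,dx
            + \int_0^L \big(\phi_i^{*\prime\prime}\big)^{2}\,dx\right]
        \Big/ \int_0^L \big(\phi_i^{*\prime\prime}\big)^{2}\,dx}
       {\displaystyle\sum_i
        \left[\int_j \big(\phi_i^{\prime\prime}\big)^{2}\,dx
            + \int_0^L \big(\phi_i^{\prime\prime}\big)^{2}\,dx\right]
        \Big/ \int_0^L \big(\phi_i^{\prime\prime}\big)^{2}\,dx} ,
\]
% with beta_j noticeably above 1 flagging probable damage in element j; the
% paper forms such indices separately for the lateral and vertical components
% of the decomposed mode shapes.
```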

  4. A Hybrid 3D Indoor Space Model

    NASA Astrophysics Data System (ADS)

    Jamali, Ali; Rahman, Alias Abdul; Boguslawski, Pawel

    2016-10-01

    GIS integrates spatial information and spatial analysis. An important example of such integration is emergency response, which requires route planning inside and outside of a building. Route planning requires detailed information on the indoor and outdoor environment. Existing indoor navigation network models, including the Geometric Network Model (GNM), the Navigable Space Model, the sub-division model, and the regular-grid model, lack indoor data sources and abstraction methods. In this paper, a hybrid indoor space model is proposed in which 3D modeling of the indoor navigation network is based on surveying control points and is less dependent on a 3D geometrical building model. This research proposes a method of indoor space modeling for buildings which do not have proper 2D/3D geometrical models or which lack semantic or topological information. The proposed hybrid model consists of topological, geometrical, and semantic spaces.

  5. Technological aspects of lift-slab method in high-rise-building construction.

    NASA Astrophysics Data System (ADS)

    Gaidukov, Pavel V.; Pugach, Evgeny M.

    2018-03-01

    The present article considers the utilization efficiency of slab-lifting technology in high-rise-building construction. Its main concern is identifying the capabilities of the construction technology, which demonstrates where the method can be applied. Lifting technology is compared with sequential concrete-frame extension: for the first, the parameters are defined, and for the second, an organizational model is executed that also defines the boundaries of each method's applicability. A mathematical model is created that describes the boundary conditions for using these technologies and allows construction efficiency to be predicted for buildings with different numbers of stories.

  6. LiDAR Change Detection Using Building Models

    NASA Astrophysics Data System (ADS)

    Kim, Angela M.; Runyon, Scott C.; Jalobeanu, Andre; Esterline, Chelsea H.; Kruse, Fred A.

    2014-06-01

    Terrestrial LiDAR scans of building models collected with a FARO Focus3D and a RIEGL VZ-400 were used to investigate point-to-point and model-to-model LiDAR change detection. LiDAR data were scaled, decimated, and georegistered to mimic real world airborne collects. Two physical building models were used to explore various aspects of the change detection process. The first model was a 1:250-scale representation of the Naval Postgraduate School campus in Monterey, CA, constructed from Lego blocks and scanned in a laboratory setting using both the FARO and RIEGL. The second model at 1:8-scale consisted of large cardboard boxes placed outdoors and scanned from rooftops of adjacent buildings using the RIEGL. A point-to-point change detection scheme was applied directly to the point-cloud datasets. In the model-to-model change detection scheme, changes were detected by comparing Digital Surface Models (DSMs). The use of physical models allowed analysis of effects of changes in scanner and scanning geometry, and performance of the change detection methods on different types of changes, including building collapse or subsistence, construction, and shifts in location. Results indicate that at low false-alarm rates, the point-to-point method slightly outperforms the model-to-model method. The point-to-point method is less sensitive to misregistration errors in the data. Best results are obtained when the baseline and change datasets are collected using the same LiDAR system and collection geometry.
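
    The following is a minimal sketch of the model-to-model scheme described above: difference two gridded DSMs and threshold the height change, discarding small components that are likely registration noise. The threshold and minimum component size are illustrative assumptions.

```python
# DSM-differencing change detection with connected-component cleanup.
import numpy as np
from scipy import ndimage

def dsm_change(dsm_before, dsm_after, dz=0.5, min_pixels=20):
    """Flag cells whose elevation changed by more than dz, then drop
    small connected components that are likely registration noise."""
    diff = dsm_after - dsm_before
    mask = np.abs(diff) > dz
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))
    return np.where(keep, diff, 0.0)

rng = np.random.default_rng(2)
before = rng.normal(0, 0.05, (100, 100))          # flat terrain + noise
after = before.copy()
after[40:60, 40:60] += 3.0                        # a new "building"
print((dsm_change(before, after) != 0).sum())     # changed-cell count
```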

  7. The Application of Typology Method in Historical Building Information Modelling (HBIM): Taking the Information Surveying and Mapping of Jiayuguan Fortress Town as an Example

    NASA Astrophysics Data System (ADS)

    Li, D. Y.; Li, K.; Wu, C.

    2017-08-01

    With the increasingly fine-grained surveying and mapping of heritage buildings, building information modelling (BIM) technology has begun to be used in the surveying, mapping, renovation, recording, and research of heritage buildings, an application called historical building information modelling (HBIM). The hierarchical framework of BIM's parametric component library, in which components of the same type share the same parameters, has the same internal logic as the archaeological typology increasingly used for dating ancient buildings. Compared with common materials such as 2D drawings and photos, typology with HBIM has two advantages: (1) comprehensive building information in both collection and representation, and (2) uniform and reasonable classification criteria. Taking the information surveying and mapping of the Jiayuguan Fortress Town as an example, this paper introduces a fieldwork method for information surveying and mapping based on HBIM technology and the construction of a Revit family library. Then, to demonstrate the feasibility and advantages of HBIM in typological work, the paper dates the Guanghua gate tower, the Rouyuan gate tower, the Wenchang pavilion, and the theater building of the Jiayuguan Fortress Town using HBIM technology and the typology method.

  8. Coordination between Understanding Historic Buildings and BIM Modelling: A 3D-Output Oriented and Typological Data Capture Method

    NASA Astrophysics Data System (ADS)

    Li, K.; Li, S. J.; Liu, Y.; Wang, W.; Wu, C.

    2015-08-01

    In the current shift from the old 2D-output-oriented survey to a new 3D-output-oriented survey based on BIM technology, the corresponding working methods and workflows for data capture, processing, representation, etc. have to change. Based on case studies of two buildings in the Summer Palace in Beijing and of Jiayuguan Pass at the west end of the Great Wall (both World Heritage sites), this paper puts forward a "structure-and-type method" that combines the typological method used in archaeology, the Revit family system, and the tectonic logic of buildings to achieve good coordination between the understanding of historic buildings and BIM modelling.

  9. First Steps to Automated Interior Reconstruction from Semantically Enriched Point Clouds and Imagery

    NASA Astrophysics Data System (ADS)

    Obrock, L. S.; Gülch, E.

    2018-05-01

    The automated generation of a BIM model from sensor data is a huge challenge for the modeling of existing buildings. Currently, the measurements and analyses are time consuming, allow little automation, and require expensive equipment, and there is no automated acquisition of the semantic information of objects in a building. We present first results of our approach, based on imagery and derived products, aiming at more automated modeling of interiors for a BIM building model. We examine the building parts and objects visible in the collected images using deep learning methods based on convolutional neural networks. For localization and classification of building parts we apply the FCN8s model for pixel-wise semantic segmentation, so far reaching a pixel accuracy of 77.2 % and a mean intersection over union of 44.2 % (computed as sketched below). We then use the network for further reasoning on the images of the interior room: we combine the segmented images with the original images and use photogrammetric methods to produce a three-dimensional point cloud, coding the extracted object types as colours of the 3D points. We are thus able to uniquely classify the points in three-dimensional space. We also preliminarily investigate a simple extraction method for the colour and material of building parts. It is shown that the combined images are very well suited to extracting further semantic information for the BIM model. With the presented methods we see a sound basis for further automation of the acquisition and modeling of semantic and geometric information of interior rooms for a BIM model.
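
    For reference, the two reported metrics can be computed from a confusion matrix as in the sketch below; the class count and example label maps are illustrative assumptions.

```python
# Pixel accuracy and mean intersection-over-union for semantic segmentation.
import numpy as np

def segmentation_metrics(pred, truth, n_classes):
    """Compute pixel accuracy and mean IoU from label maps."""
    cm = np.bincount(truth.ravel() * n_classes + pred.ravel(),
                     minlength=n_classes**2).reshape(n_classes, n_classes)
    pixel_acc = np.diag(cm).sum() / cm.sum()
    union = cm.sum(0) + cm.sum(1) - np.diag(cm)
    iou = np.diag(cm) / np.maximum(union, 1)     # avoid division by zero
    return pixel_acc, iou.mean()

rng = np.random.default_rng(3)
truth = rng.integers(0, 5, (64, 64))             # 5 hypothetical classes
pred = np.where(rng.random((64, 64)) < 0.8, truth, rng.integers(0, 5, (64, 64)))
acc, miou = segmentation_metrics(pred, truth, 5)
print(f"pixel accuracy {acc:.3f}, mean IoU {miou:.3f}")
```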

  10. Analysis of the impact of simulation model simplifications on the quality of low-energy buildings simulation results

    NASA Astrophysics Data System (ADS)

    Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr

    2017-11-01

    The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing traditional calculation methods, which follow from a static heat exchange model, are frequently not sufficient for a sound heating design of a building. Both in Poland and elsewhere, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of HVAC systems within it. However, these tools are usually complex and difficult to use, and developing a simulation model that is sufficiently faithful to the real building requires considerable designer involvement and is time-consuming and laborious. Simplifying the simulation model of a building makes it possible to reduce the cost of computer simulations. This paper analyses in detail the effect of a number of different simplification variants of a simulation model developed in DesignBuilder on the quality of the final results. The objective of this analysis is to find simplifications whose results deviate from the detailed model by an acceptable amount, thus facilitating a quick energy performance analysis of a given building.

  11. Change detection on LOD 2 building models with very high resolution spaceborne stereo imagery

    NASA Astrophysics Data System (ADS)

    Qin, Rongjun

    2014-10-01

    Due to the fast development of the urban environment, the need for efficient maintenance and updating of 3D building models is ever increasing. Change detection is an essential step to spot the changed areas for data (map/3D model) updating and urban monitoring. Traditional methods based on 2D images are no longer suitable for change detection at building scale, owing to the increased spectral variability of building roofs and the larger perspective distortion of very high resolution (VHR) imagery. Change detection in 3D is increasingly being investigated using airborne laser scanning data or matched Digital Surface Models (DSM), but few studies have addressed change detection on 3D city models with VHR images, which is more informative but also more complicated. This is due to the fact that the 3D models are abstracted geometric representations of the urban reality, while the VHR images record everything. In this paper, a novel method is proposed to detect changes directly on LOD (Level of Detail) 2 building models with VHR spaceborne stereo images from a different date, with particular focus on addressing the special characteristics of the 3D models. In the first step, the 3D building models are projected onto a raster grid, encoded with building objects, terrain objects, and planar faces; the DSM is extracted from the stereo imagery by hierarchical semi-global matching (SGM). In the second step, a multi-channel change indicator is extracted between the 3D models and the stereo images, considering the inherent geometric consistency (IGC), height difference, and texture similarity for each planar face. Each channel of the indicator is then clustered with a Self-Organizing Map (SOM), with "change", "non-change" and "uncertain change" status labeled through a voting strategy. The "uncertain changes" are then resolved with a Markov Random Field (MRF) analysis considering the geometric relationships between faces. In the third step, buildings are extracted by combining the multispectral images and the DSM using morphological operators, and new buildings are determined by excluding the buildings verified as unchanged in the second step. Both a synthetic experiment with WorldView-2 stereo imagery and a real experiment with IKONOS stereo imagery are carried out to demonstrate the effectiveness of the proposed method. It is shown that the proposed method can be applied as an effective way of monitoring building changes, as well as of updating 3D models from one epoch to the next.

  12. Modeling and Measurement Constraints in Fault Diagnostics for HVAC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Najafi, Massieh; Auslander, David M.; Bartlett, Peter L.

    2010-05-30

    Many studies have shown that energy savings of five to fifteen percent are achievable in commercial buildings by detecting and correcting building faults, and optimizing building control systems. However, in spite of good progress in developing tools for determining HVAC diagnostics, methods to detect faults in HVAC systems are still generally undeveloped. Most approaches use numerical filtering or parameter estimation methods to compare data from energy meters and building sensors to predictions from mathematical or statistical models. They are effective when models are relatively accurate and data contain few errors. In this paper, we address the case where models are imperfect and data are variable, uncertain, and can contain error. We apply a Bayesian updating approach that is systematic in managing and accounting for most forms of model and data errors. The proposed method uses both knowledge of first principle modeling and empirical results to analyze the system performance within the boundaries defined by practical constraints. We demonstrate the approach by detecting faults in commercial building air handling units. We find that the limitations that exist in air handling unit diagnostics due to practical constraints can generally be effectively addressed through the proposed approach.
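
    A Bayesian updating step of the kind referenced above can be sketched compactly. The example below updates the probability of a fault in an air handling unit from a stream of temperature residuals; the Gaussian residual likelihoods and their parameters are illustrative assumptions, not values from the paper.

      import math

      def gaussian(x, mu, sigma):
          """Gaussian density, used here as the residual likelihood."""
          return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

      def update_fault_probability(prior, residuals,
                                   mu_ok=0.0, sigma_ok=0.5,
                                   mu_fault=2.0, sigma_fault=1.0):
          """Sequential Bayes update of P(fault) from model residuals
          (measured minus predicted mixed-air temperature, in K)."""
          p = prior
          for r in residuals:
              like_fault = gaussian(r, mu_fault, sigma_fault)
              like_ok = gaussian(r, mu_ok, sigma_ok)
              p = p * like_fault / (p * like_fault + (1 - p) * like_ok)
          return p

      # Residuals drifting away from zero push P(fault) upward:
      print(update_fault_probability(0.05, [0.2, 1.5, 2.1, 2.4]))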

  13. Development and Testing of Building Energy Model Using Non-Linear Auto Regression Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Arida, Maya Ahmad

    The concept of sustainable development dates back to 1972 and has since become one of the most important approaches to conserving natural resources and energy. With rising energy costs and increasing awareness of the effects of global warming, the development of building energy-saving methods and models has become ever more necessary for a sustainable future. According to the U.S. Energy Information Administration (EIA), buildings in the U.S. today consume 72 percent of the electricity produced and use 55 percent of U.S. natural gas. Buildings account for about 40 percent of the energy consumed in the United States, more than industry or transportation, and of this energy, heating and cooling systems use about 55 percent. If energy-use trends continue, buildings will become the largest consumer of global energy by 2025. This thesis proposes procedures and analysis techniques for building energy system modeling and optimization using time-series auto-regression artificial neural networks. The model predicts whole-building energy consumption as a function of four input variables: dry-bulb and wet-bulb outdoor air temperatures, hour of day, and type of day. The proposed model and the optimization process are tested using data collected from an existing building located in Greensboro, NC. The testing results show that the model captures the system performance very well. An optimization method was also developed to automate the search for the model structure that produces the most accurate prediction against the actual data. The results show that the developed model can provide results sufficiently accurate for use in various energy-efficiency and savings-estimation applications.
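
    A non-linear autoregressive model with exogenous inputs of the kind described can be approximated with lagged load values and a small neural network. The sketch below is a minimal, hypothetical setup using scikit-learn on synthetic data; the lag depth, network size, and data generation are assumptions, not the thesis's actual model structure.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def make_narx_features(load, exog, lags=3):
          """Stack lagged load values with exogenous inputs (dry-bulb,
          wet-bulb, hour of day, day type) into a regression matrix."""
          X, y = [], []
          for t in range(lags, len(load)):
              X.append(np.concatenate([load[t - lags:t], exog[t]]))
              y.append(load[t])
          return np.array(X), np.array(y)

      rng = np.random.default_rng(0)
      hours = np.arange(24 * 60)   # hourly records for 60 days
      load = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
      exog = np.column_stack([20 + 5 * np.sin(2 * np.pi * hours / 24),  # dry-bulb
                              15 + 4 * np.sin(2 * np.pi * hours / 24),  # wet-bulb
                              hours % 24,                               # hour of day
                              (hours // 24) % 7 < 5])                   # weekday flag

      X, y = make_narx_features(load, exog, lags=3)
      model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      model.fit(X[:-24], y[:-24])
      print("held-out R^2:", model.score(X[-24:], y[-24:]))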

  14. Building Facade Modeling Under Line Feature Constraint Based on Close-Range Images

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Sheng, Y. H.

    2018-04-01

    To solve the problems of modeling building facades merely with point features from close-range images, a new method for modeling building facades under a line feature constraint is proposed in this paper. Firstly, camera parameters and a sparse spatial point cloud were recovered using SfM, and dense 3D point clouds were generated with MVS. Secondly, line features were detected based on the gradient direction, fitted considering their directions and lengths, and then matched under multiple types of constraints and extracted from the multi-image sequence. Finally, the facade mesh of a building was triangulated from the point cloud and the line features. The experiment shows that this method can effectively reconstruct the geometric facade of buildings by combining the point and line features of a close-range image sequence, and that it is especially effective at recovering the contour information of building facades.
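
    Gradient-based line detection of the kind used in the second step can be sketched with standard tools. The snippet below uses OpenCV's probabilistic Hough transform as a stand-in detector and filters segments by length and direction; the thresholds and the choice of detector are illustrative assumptions, not the authors' exact pipeline.

      import cv2
      import numpy as np

      def detect_facade_lines(image_path, min_len=40, angle_tol_deg=10):
          """Detect near-horizontal and near-vertical line segments,
          which dominate building facades, dropping short or oblique
          ones before any matching step."""
          gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
          edges = cv2.Canny(gray, 50, 150)
          segs = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                                 minLineLength=min_len, maxLineGap=5)
          kept = []
          for x1, y1, x2, y2 in (segs.reshape(-1, 4) if segs is not None else []):
              angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 180
              near_h = angle < angle_tol_deg or angle > 180 - angle_tol_deg
              near_v = abs(angle - 90) < angle_tol_deg
              if near_h or near_v:
                  kept.append((x1, y1, x2, y2))
          return kept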

  15. Physical and JIT Model Based Hybrid Modeling Approach for Building Thermal Load Prediction

    NASA Astrophysics Data System (ADS)

    Iino, Yutaka; Murai, Masahiko; Murayama, Dai; Motoyama, Ichiro

    Energy conservation in the building sector is one of the key environmental issues, as it is in the industrial, transportation, and residential sectors. HVAC (heating, ventilating and air conditioning) systems account for about half of a building's total energy consumption. To realize energy conservation in HVAC systems, a thermal load prediction model for the building is required. This paper proposes a hybrid modeling approach combining a physical model with a Just-in-Time (JIT) model for building thermal load prediction. The proposed method has the following features and benefits: (1) it is applicable even when the past operation data available for training the load prediction model are poor; (2) it has a self-checking function that continuously supervises whether the data-driven load prediction and the physics-based one are consistent, so it can detect when something is wrong in the load prediction procedure; (3) it can adjust the load prediction in real time against sudden changes in model parameters and environmental conditions. The proposed method is evaluated with real operation data from an existing building, and the improvement in load prediction performance is illustrated.
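
    The self-checking idea in feature (2) reduces to comparing the two predictors and flagging disagreement. The following minimal sketch assumes hypothetical physical_model and jit_model callables and an illustrative tolerance; it is not the authors' implementation.

      def check_consistency(physical_model, jit_model, inputs, rel_tol=0.15):
          """Compare physics-based and data-driven (JIT) load predictions;
          flag the time step when they disagree beyond rel_tol."""
          phys = physical_model(inputs)   # predicted thermal load, kW
          jit = jit_model(inputs)
          rel_diff = abs(phys - jit) / max(abs(phys), 1e-9)
          return {"physical": phys, "jit": jit,
                  "consistent": rel_diff <= rel_tol}

      # Example with toy stand-in models:
      result = check_consistency(lambda x: 100.0 + 2.0 * x["outdoor_temp"],
                                 lambda x: 98.0 + 2.1 * x["outdoor_temp"],
                                 {"outdoor_temp": 30.0})
      print(result)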

  16. Building Regression Models: The Importance of Graphics.

    ERIC Educational Resources Information Center

    Dunn, Richard

    1989-01-01

    Points out reasons for using graphical methods to teach simple and multiple regression analysis. Argues that a graphically oriented approach has considerable pedagogic advantages in the exposition of simple and multiple regression. Shows that graphical methods may play a central role in the process of building regression models. (Author/LS)

  17. Modelling of Rail Vehicles and Track for Calculation of Ground-Vibration Transmission Into Buildings

    NASA Astrophysics Data System (ADS)

    Hunt, H. E. M.

    1996-05-01

    A methodology for the calculation of vibration transmission from railways into buildings is presented. The method permits existing models of railway vehicles and track to be incorporated, and it applies to any model of vibration transmission through the ground. Special attention is paid to the relative phasing between adjacent axle-force inputs to the rail, so that vibration transmission may be calculated as a random process. The vehicle-track model is used in conjunction with a building model of infinite length. The track and building are infinite and parallel to each other, and the applied forces are statistically stationary in space, so that vibration levels at any two points along the building are the same. The methodology is two-dimensional for the purpose of applying random process theory, but fully three-dimensional for the calculation of vibration transmission from the track and through the ground into the foundations of the building. The computational efficiency of the method will interest engineers faced with the task of reducing vibration levels in buildings. It is possible to assess the relative merits of using rail pads, under-sleeper pads, ballast mats, floating-slab track or base isolation for particular applications.

  18. Combining the 3D model generated from point clouds and thermography to identify the defects presented on the facades of a building

    NASA Astrophysics Data System (ADS)

    Huang, Yishuo; Chiang, Chih-Hung; Hsu, Keng-Tsang

    2018-03-01

    Defects on the facades of a building have a profound impact on the building's life cycle, and identifying them is a crucial issue. Destructive and non-destructive methods are usually employed to identify such defects. Destructive methods cause permanent damage to the examined objects; non-destructive testing (NDT) methods, on the other hand, have been widely applied to detect defects in the exterior layers of a building. However, NDT methods alone cannot provide efficient and reliable information for identifying defects over the large areas that must be examined. Infrared thermography is often applied to quantitative energy performance measurements of building envelopes. Defects in the exterior layer of buildings may be caused by several factors: ventilation losses, conduction losses, thermal bridging, defective services, moisture condensation, moisture ingress, and structural defects. Analyzing the collected thermal images can be quite difficult when the spatial variations of surface temperature are small. In this paper the authors employ image segmentation to cluster pixels with similar surface temperatures, so that the processed thermal images are composed of a limited number of groups, each with a homogeneous surface temperature distribution. In doing so, the boundaries of the segmented regions can be identified and extracted. A terrestrial laser scanner (TLS) is used to collect point clouds of a building, and those point clouds are used to reconstruct the building's 3D model. A mapping model is constructed so that the segmented thermal images can be projected onto the 2D image of the specified 3D building. In this paper, the administrative building on the Chaoyang University campus is used as an example. The experimental results provide not only the defect information but also the corresponding spatial locations in the 3D model.
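
    Clustering pixels by surface temperature, as described, is a straightforward application of image segmentation. The sketch below uses k-means as an illustrative choice of clustering algorithm; the number of clusters and the synthetic thermal image are assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      def segment_thermal_image(temps, n_groups=4, seed=0):
          """Cluster a 2-D array of surface temperatures (deg C) into
          groups of similar temperature; returns a label image."""
          km = KMeans(n_clusters=n_groups, n_init=10, random_state=seed)
          labels = km.fit_predict(temps.reshape(-1, 1))
          return labels.reshape(temps.shape), km.cluster_centers_.ravel()

      # Synthetic facade: a warm band (thermal bridge) over a cooler wall.
      temps = np.full((80, 120), 12.0)
      temps[30:40, :] = 18.0
      temps += np.random.default_rng(0).normal(0, 0.3, temps.shape)
      label_img, centers = segment_thermal_image(temps)
      print("group mean temperatures:", np.round(np.sort(centers), 1))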

  19. 3D Reconstruction of Irregular Buildings and Buddha Statues

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Li, M.-j.

    2014-04-01

    Three-dimensional laser scanning can acquire an object's surface data quickly and accurately; the post-processing of point clouds, however, is not perfect and can be improved. Based on a study of 3D laser scanning technology, this paper describes solutions for modelling the irregular ancient buildings and Buddha statues in Jinshan Temple, covering data acquisition, modelling, texture mapping, etc. To model the irregular ancient buildings effectively, the structure of each building is extracted manually from the point cloud and the textures are mapped in 3ds Max. The approach combines 3D laser scanning technology with traditional modelling methods and greatly improves the efficiency and accuracy of restoring the ancient buildings. The statues, on the other hand, are modelled as objects in reverse engineering. The digital models of the statues obtained are not just vivid but also accurate by surveying and mapping standards. On this basis, a 3D scene of Jinshan Temple is reconstructed, which demonstrates the validity of the solutions.

  20. A New Method of Building Scale-Model Houses

    Treesearch

    Richard N. Malcolm

    1978-01-01

    Scale-model houses are used to display new architectural and construction designs. Some scale-model houses will not withstand the abuse of shipping and handling. This report describes how to build a solid-core model house which is rigid, lightweight, and sturdy.

  1. Learning Oriented Region-based Convolutional Neural Networks for Building Detection in Satellite Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Chen, C.; Gong, W.; Hu, Y.; Chen, Y.; Ding, Y.

    2017-05-01

    Automated building detection in aerial images is a fundamental problem in aerial and satellite image analysis. Recently, thanks to advances in feature description, the Region-based CNN (R-CNN) model for object detection has been receiving increasing attention. Despite its excellent performance in object detection, it is problematic to directly apply the R-CNN model to building detection in a single aerial image. An aerial image is a vertical view, and buildings exhibit a significant directional feature; in the R-CNN model, however, building direction is ignored and detection results are represented by horizontal rectangles, which cannot describe buildings precisely. To address this problem, we propose a novel model with a key orientation-related feature, the Oriented R-CNN (OR-CNN). Our contributions are mainly in the following two aspects: 1) introducing a new oriented layer network, on the basis of the successful VGG-net R-CNN model, for detecting the rotation angle of a building; 2) proposing the oriented rectangle to leverage the powerful R-CNN for remote-sensing building detection. In the experiments, we establish a complete, brand-new data set for training our oriented R-CNN model and comprehensively evaluate the proposed method on a publicly available building detection data set. We demonstrate state-of-the-art results compared with the previous baseline methods.
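
    The oriented rectangle representation is easy to make concrete: a detection is a center, size, and rotation angle, from which the four corners follow by a rotation. The helper below is an illustrative sketch of that geometry, not code from the paper.

      import numpy as np

      def oriented_rect_corners(cx, cy, w, h, angle_deg):
          """Corners of a rotated rectangle (cx, cy, w, h, angle), the
          representation an oriented detector regresses instead of an
          axis-aligned box."""
          a = np.radians(angle_deg)
          rot = np.array([[np.cos(a), -np.sin(a)],
                          [np.sin(a),  np.cos(a)]])
          half = np.array([[-w, -h], [w, -h], [w, h], [-w, h]]) / 2.0
          return half @ rot.T + np.array([cx, cy])

      # A 40 x 20 building footprint rotated 30 degrees about (100, 100):
      print(np.round(oriented_rect_corners(100, 100, 40, 20, 30), 1))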

  2. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Kane, Staci R.

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building's protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  3. The impact of solar radiation on the heating and cooling of buildings

    NASA Astrophysics Data System (ADS)

    Witmer, Lucas

    This work focuses on the impact of solar energy on the heating and cooling of buildings. The sun can be the primary driver of building cooling loads as well as a significant source of heat in the winter. Methods are presented for calculating the solar energy incident on tilted surfaces and for the available irradiance data source options. A key deficiency in current building energy modeling software is reviewed with a demonstration of the impact of accounting for shade on opaque surfaces: several tools include methods for calculating shade incident on windows, while none do so automatically for opaque surfaces. The resulting calculations for fully irradiated wall surfaces underestimate building energy consumption in the winter and overestimate it in the summer by significant margins. A method has been developed for processing and filtering solar irradiance data based on local shading. This method is used to compare situations where a model predictive control (MPC) system can make poor decisions for building comfort control; an MPC system informed by poor-quality solar data will negatively impact comfort in perimeter building zones during the cooling season. The direct component of irradiance is necessary for calculating irradiance on a tilted surface. Using graphical analysis and conditional probability distributions, this work demonstrates a proof of concept for estimating direct normal irradiance from a multi-pyranometer array by leveraging inter-surface relationships without directly inverting a sky model.
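
    The transposition of direct irradiance onto a tilted surface mentioned above follows standard solar geometry: the beam component is DNI times the cosine of the angle of incidence. The sketch below evaluates that textbook relation, with isotropic diffuse and ground-reflected terms as simplifying assumptions; it is offered for illustration, not as the thesis's calculation.

      import numpy as np

      def beam_on_tilt(dni, zenith_deg, sun_az_deg, tilt_deg, surf_az_deg):
          """Beam irradiance on a tilted surface: DNI * cos(AOI), where
          the angle of incidence follows from solar/surface geometry."""
          z, gs = np.radians(zenith_deg), np.radians(sun_az_deg)
          t, gp = np.radians(tilt_deg), np.radians(surf_az_deg)
          cos_aoi = (np.cos(z) * np.cos(t)
                     + np.sin(z) * np.sin(t) * np.cos(gs - gp))
          return dni * max(cos_aoi, 0.0)  # no beam past 90 deg incidence

      def poa_irradiance(dni, dhi, ghi, zenith_deg, sun_az_deg,
                         tilt_deg, surf_az_deg, albedo=0.2):
          """Plane-of-array total with isotropic diffuse/ground terms."""
          t = np.radians(tilt_deg)
          beam = beam_on_tilt(dni, zenith_deg, sun_az_deg, tilt_deg, surf_az_deg)
          diffuse = dhi * (1 + np.cos(t)) / 2
          reflected = ghi * albedo * (1 - np.cos(t)) / 2
          return beam + diffuse + reflected

      # South-facing wall (tilt 90 deg) at mid-morning:
      print(round(poa_irradiance(700, 100, 550, 55, 140, 90, 180), 1), "W/m^2")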

  4. BESTEST-EX | Buildings | NREL

    Science.gov Websites

    BESTEST-EX is a method for testing home energy audit software and associated calibration methods, and is part of NREL's work on Energy Analysis Model Calibration Methods. When completed, the ANSI/RESNET SMOT will specify test procedures for evaluating calibration methods used in conjunction with predicting building energy use.

  5. Modelling Complex Fenestration Systems using physical and virtual models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanachareonkit, Anothai; Scartezzini, Jean-Louis

    2010-04-15

    Physical or virtual models are commonly used to visualize the conceptual ideas of architects, lighting designers and researchers; they are also employed to assess the daylighting performance of buildings, particularly in cases where Complex Fenestration Systems (CFS) are considered. Recent studies have however revealed a general tendency of physical models to over-estimate this performance, compared to those of real buildings; these discrepancies can be attributed to several reasons. In order to identify the main error sources, a series of comparisons in-between a real building (a single office room within a test module) and the corresponding physical and virtual models was undertaken. The physical model was placed in outdoor conditions, which were strictly identical to those of the real building, as well as underneath a scanning sky simulator. The virtual model simulations were carried out by way of the Radiance program using the GenSky function; an alternative evaluation method, named Partial Daylight Factor method (PDF method), was also employed with the physical model together with sky luminance distributions acquired by a digital sky scanner during the monitoring of the real building. The overall daylighting performance of physical and virtual models were assessed and compared. The causes of discrepancies between the daylighting performance of the real building and the models were analysed. The main identified sources of errors are the reproduction of building details, the CFS modelling and the mocking-up of the geometrical and photometrical properties. To study the impact of these errors on daylighting performance assessment, computer simulation models created using the Radiance program were also used to carry out a sensitivity analysis of modelling errors. The study of the models showed that large discrepancies can occur in daylighting performance assessment. In case of improper mocking-up of the glazing for instance, relative divergences of 25-40% can be found in different room locations, suggesting that more light is entering than actually monitored in the real building. All these discrepancies can however be reduced by making an effort to carefully mock up the geometry and photometry of the real building. A synthesis is presented in this article which can be used as guidelines for daylighting designers to avoid or estimate errors during CFS daylighting performance assessment. (author)

  6. Voluminator 2.0 - Speeding up the Approximation of the Volume of Defective 3d Building Models

    NASA Astrophysics Data System (ADS)

    Sindram, M.; Machl, T.; Steuer, H.; Pültz, M.; Kolbe, T. H.

    2016-06-01

    Semantic 3D city models are increasingly used as a data source in planning and analysis processes of cities. They represent a virtual copy of reality and serve as a common information base for examining urban questions. A significant advantage of virtual city models is that important indicators such as the volume of buildings, topological relationships between objects, and other geometric as well as thematic information can be derived. Knowledge of the exact building volume is an essential basis for estimating building energy demand. In order to determine the volume of buildings with conventional algorithms and tools, the buildings must not contain any topological or geometrical errors. The reality, however, shows that city models very often contain errors such as missing surfaces, duplicated faces and misclosures. To overcome these errors, Steuer et al. (2015) presented a robust method for approximating the volume of building models: a bounding box of the building is divided into a regular grid of voxels and it is determined which voxels are inside the building. The regular arrangement of the voxels, however, leads to a high number of topological tests and prevents the application of this method at very high resolutions. In this paper we present an extension of the algorithm using an octree approach, limiting the subdivision of space to regions around surfaces of the building models and to regions where, in the case of defective models, the topological tests are inconclusive. We show that the computation time can be significantly reduced, while preserving the robustness against geometrical and topological errors.
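
    The octree refinement described above can be sketched briefly: cells whose corners all agree are decided immediately, and only ambiguous cells near surfaces are subdivided. The code below is a minimal illustration for an arbitrary inside(x, y, z) predicate; the corner-based classification heuristic, recursion depth, and sphere test case are assumptions, not the paper's implementation.

      def octree_volume(inside, lo, hi, depth, max_depth):
          """Approximate the volume of the region where inside(x, y, z)
          is True within the box [lo, hi], subdividing only ambiguous
          cells instead of a full regular voxel grid."""
          corners = [(x, y, z) for x in (lo[0], hi[0])
                               for y in (lo[1], hi[1])
                               for z in (lo[2], hi[2])]
          flags = [inside(*c) for c in corners]
          cell = (hi[0] - lo[0]) * (hi[1] - lo[1]) * (hi[2] - lo[2])
          if all(flags):
              return cell                      # fully inside: whole cell
          if not any(flags) and depth > 0:
              return 0.0                       # heuristically outside
          if depth >= max_depth:
              return cell * sum(flags) / 8.0   # leaf: fractional estimate
          mid = [(a + b) / 2 for a, b in zip(lo, hi)]
          vol = 0.0
          for ox in ((lo[0], mid[0]), (mid[0], hi[0])):
              for oy in ((lo[1], mid[1]), (mid[1], hi[1])):
                  for oz in ((lo[2], mid[2]), (mid[2], hi[2])):
                      vol += octree_volume(inside,
                                           (ox[0], oy[0], oz[0]),
                                           (ox[1], oy[1], oz[1]),
                                           depth + 1, max_depth)
          return vol

      # Unit sphere: converges toward 4/3 * pi ~ 4.19 as max_depth grows.
      sphere = lambda x, y, z: x * x + y * y + z * z <= 1.0
      print(round(octree_volume(sphere, (-1, -1, -1), (1, 1, 1), 0, 6), 2))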

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torcellini, Paul A.; Bonnema, Eric; Goldwasser, David

    Building energy consumption can only be measured at the site or at the point of utility interconnection with a building. Often, to evaluate the total energy impact, this site-based energy consumption is translated into source energy, that is, the energy at the point of fuel extraction. Consistent with this approach, the U.S. Department of Energy's (DOE) definition of zero energy buildings uses source energy as the metric to account for energy losses from the extraction, transformation, and delivery of energy. Other organizations, as well, use source energy to characterize the energy impacts. Four methods of making the conversion from site energy to source energy were investigated in the context of the DOE definition of zero energy buildings. These methods were evaluated based on three guiding principles--improve energy efficiency, reduce and stabilize power demand, and use power from nonrenewable energy sources as efficiently as possible. This study examines relative trends between strategies as they are implemented on very low-energy buildings to achieve zero energy. A typical office building was modeled and variations to this model performed. The photovoltaic output that was required to create a zero energy building was calculated. Trends were examined with these variations to study the impacts of the calculation method on the building's ability to achieve zero energy status. The paper will highlight the different methods and give conclusions on the advantages and disadvantages of the methods studied.
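
    The site-to-source conversion itself is simple arithmetic once conversion factors are chosen. The sketch below applies per-fuel multipliers to metered site energy; the factor values are typical U.S. national-average assumptions chosen for illustration, not the ones evaluated in the paper.

      # Assumed national-average source-energy factors (illustrative only):
      SOURCE_FACTORS = {"electricity": 3.15, "natural_gas": 1.09}

      def source_energy(site_use_kwh):
          """Convert metered site energy per fuel into total source energy."""
          return sum(site_use_kwh[fuel] * SOURCE_FACTORS[fuel]
                     for fuel in site_use_kwh)

      # An all-electric office must offset about 3.15 source units per
      # site unit of electricity; gas use converts at nearly 1:1.
      site = {"electricity": 120_000, "natural_gas": 80_000}
      print(f"{source_energy(site):,.0f} kWh source")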

  8. Village Building Identification Based on Ensemble Convolutional Neural Networks

    PubMed Central

    Guo, Zhiling; Chen, Qi; Xu, Yongwei; Shibasaki, Ryosuke; Shao, Xiaowei

    2017-01-01

    In this study, we present the Ensemble Convolutional Neural Network (ECNN), an elaborate CNN framework formulated by ensembling state-of-the-art CNN models, to identify village buildings from open high-resolution remote sensing (HRRS) images. First, to optimize and mine the capability of CNNs for village mapping and to ensure compatibility with our classification targets, a few state-of-the-art models were carefully optimized and enhanced based on a series of rigorous analyses and evaluations. Second, rather than directly implementing building identification with these models, we exploited most of their advantages by ensembling their feature extractor parts into a stronger model, the ECNN, based on a multiscale feature learning method. Finally, the generated ECNN was applied in a pixel-level classification framework to implement object identification. The proposed method can serve as a viable tool for village building identification with high accuracy and efficiency. The experimental results obtained from the test area in Savannakhet province, Laos, prove that the proposed ECNN model significantly outperforms existing methods, improving overall accuracy from 96.64% to 99.26% and kappa from 0.57 to 0.86. PMID:29084154

  9. Modeling Poroelastic Wave Propagation in a Real 2-D Complex Geological Structure Obtained via Self-Organizing Maps

    NASA Astrophysics Data System (ADS)

    Itzá Balam, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.

    2018-03-01

    Two main stages of seismic modeling are geological model building and numerical computation of seismic response for the model. The quality of the computed seismic response is partly related to the type of model that is built. Therefore, the model building approaches become as important as seismic forward numerical methods. For this purpose, three petrophysical facies (sands, shales and limestones) are extracted from reflection seismic data and some seismic attributes via the clustering method called Self-Organizing Maps (SOM), which, in this context, serves as a geological model building tool. This model with all its properties is the input to the Optimal Implicit Staggered Finite Difference (OISFD) algorithm to create synthetic seismograms for poroelastic, poroacoustic and elastic media. The results show a good agreement between observed and 2-D synthetic seismograms. This demonstrates that the SOM classification method enables us to extract facies from seismic data and allows us to integrate the lithology at the borehole scale with the 2-D seismic data.

  10. Research on conflict detection algorithm in 3D visualization environment of urban rail transit line

    NASA Astrophysics Data System (ADS)

    Wang, Li; Xiong, Jing; You, Kuokuo

    2017-03-01

    In this paper, a method of collision detection is introduced and applied to the three-dimensional modeling of underground buildings and urban rail lines, enabling rapid extraction of the buildings that conflict with the track area in a 3D visualization environment. According to the characteristics of the buildings, CSG and B-rep are used to model them. Building on these modeling characteristics, this paper proposes using the AABB bounding-volume hierarchy for a first, coarse conflict detection to improve efficiency, followed by a fast triangle-triangle intersection algorithm for exact detection, to finally determine whether a building collides with the track area. With this algorithm, buildings colliding with the influence area of the track line can be quickly extracted, helping designers choose the best route and calculate land acquisition costs in the three-dimensional visualization environment.
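
    The broad-phase test referenced above is the standard axis-aligned bounding-box (AABB) overlap check: two boxes intersect only if their extents overlap on all three axes. The helper below is a generic illustration of that test, not the paper's code.

      def aabb_overlap(a_min, a_max, b_min, b_max):
          """True if two axis-aligned boxes (min/max corner tuples)
          intersect; a cheap broad-phase filter applied before the
          exact triangle-triangle intersection tests."""
          return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i]
                     for i in range(3))

      # A building box versus a track-corridor box:
      print(aabb_overlap((0, 0, 0), (10, 10, 30), (8, -5, 0), (20, 5, 25)))  # True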

  11. Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS

    NASA Astrophysics Data System (ADS)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Floods are among the most damaging natural disasters in the world, and assessing flood disaster risk is necessary to reduce flood losses. Practical disaster management requires dynamic building-level risk results. A rainstorm-flood disaster system is a typical complex system: from the viewpoint of complex system theory, flood disaster risk is the interaction result of hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. This paper proposes a rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM, and designs the internal structures and procedures of the different agents in the proposed method. On the NetLogo platform, the proposed method was implemented to assess building risk changes during the rainstorm flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicated that the proposed method can dynamically assess building risk over the whole process of a rainstorm flood disaster, providing a new approach for dynamic building risk assessment and flood disaster management.

  12. Slicing Method for curved façade and window extraction from point clouds

    NASA Astrophysics Data System (ADS)

    Iman Zolanvari, S. M.; Laefer, Debra F.

    2016-09-01

    Laser scanning technology is a fast and reliable method to survey structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of buildings are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally efficient method for extracting overall façade and window boundary points and reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through the building. This is done along a façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening's boundaries, as well as the overall boundary of the façade, in part by using a one-dimensional projection to accelerate processing. Slice densities were optimised at 14.3 slices per vertical metre and 25 slices per horizontal metre of building, irrespective of building configuration or complexity. The proposed procedure was validated by application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings, and nearly 100% on simple ones. Furthermore, computational times were less than 3 sec for data sets of up to 2.6 million points, while similar existing approaches required more than 16 hr for such datasets.
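
    The core of the Slicing Method, cutting the façade into thin slices and using a one-dimensional projection to find gaps corresponding to openings, can be illustrated compactly. The sketch below bins the points of one slice along a façade axis and reports empty runs as candidate openings; the bin width, gap threshold, and synthetic data are assumptions, not the authors' parameters.

      import numpy as np

      def openings_in_slice(coords, bin_width=0.1, min_gap_bins=3):
          """Project a slice's points onto one façade axis and return
          spans of empty bins, i.e. candidate window/door openings."""
          lo, hi = coords.min(), coords.max()
          counts, edges = np.histogram(coords,
                                       bins=np.arange(lo, hi + bin_width,
                                                      bin_width))
          gaps, start = [], None
          for i, c in enumerate(counts):
              if c == 0 and start is None:
                  start = i
              elif c > 0 and start is not None:
                  if i - start >= min_gap_bins:
                      gaps.append((edges[start], edges[i]))
                  start = None
          return gaps

      # Synthetic slice: wall points with a 1 m opening between x = 2 and 3.
      rng = np.random.default_rng(1)
      x = np.concatenate([rng.uniform(0, 2, 400), rng.uniform(3, 5, 400)])
      print(openings_in_slice(x))   # roughly [(2.0, 3.0)]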

  13. A comprehensive study on urban true orthorectification

    USGS Publications Warehouse

    Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao

    2005-01-01

    To provide advanced technical bases (algorithms and procedures) and the experience needed for national large-scale digital orthophoto generation and revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study of the theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation, and their merging for true orthoimage generation, are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, the optimization of seamlines for automatic mosaicking, and the radiometric balancing of neighboring images are discussed. Street visibility analysis, including the relationship between flight height, building height, street width, and the relative location of the street to the imaging center, is analyzed for complete true orthoimage generation. The experimental results demonstrate that the method can effectively and correctly orthorectify the displacements caused by terrain and buildings in urban large-scale aerial images. © 2005 IEEE.

  14. Identification Approach to Alleviate Effects of Unmeasured Heat Gains for MIMO Building Thermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jie; Kim, Donghun; Braun, James E.

    It is important to have practical methods for constructing a good mathematical model of a building's thermal system for energy audits, retrofit analysis, and advanced building controls, e.g. model predictive control. Identification approaches based on semi-physical model structures are popular in building science for those purposes. However, conventional gray-box identification approaches applied to thermal networks fail when significant unmeasured heat gains are present in the estimation data. Although this situation is very common in practice, there has been little research tackling this issue in building science. This paper presents an overall identification approach to alleviate the influence of unmeasured disturbances, and hence to obtain improved gray-box building models. The approach was applied to an existing open-space building and its performance is demonstrated.

  15. A seismic optimization procedure for reinforced concrete framed buildings based on eigenfrequency optimization

    NASA Astrophysics Data System (ADS)

    Arroyo, Orlando; Gutiérrez, Sergio

    2017-07-01

    Several seismic optimization methods have been proposed to improve the performance of reinforced concrete framed (RCF) buildings; however, they have not been widely adopted among practising engineers because they require complex nonlinear models and are computationally expensive. This article presents a procedure to improve the seismic performance of RCF buildings based on eigenfrequency optimization, which is effective, simple to implement and efficient. The method is used to optimize a 10-storey regular building, and its effectiveness is demonstrated by nonlinear time history analyses, which show important reductions in storey drifts and lateral displacements compared to a non-optimized building. A second example for an irregular six-storey building demonstrates that the method provides benefits to a wide range of RCF structures and supports the applicability of the proposed method.

  16. A model for the sustainable selection of building envelope assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huedo, Patricia, E-mail: huedo@uji.es; Mulet, Elena, E-mail: emulet@uji.es; López-Mesa, Belinda, E-mail: belinda@unizar.es

    2016-02-15

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  17. Four Methods for Completing the Conceptual Development Phase of Applied Theory Building Research in HRD

    ERIC Educational Resources Information Center

    Storberg-Walker, Julia; Chermack, Thomas J.

    2007-01-01

    The purpose of this article is to describe four methods for completing the conceptual development phase of theory building research for single or multiparadigm research. The four methods selected for this review are (1) Weick's method of "theorizing as disciplined imagination" (1989); (2) Whetten's method of "modeling as theorizing" (2002); (3)…

  18. Building damage assessment from PolSAR data using texture parameters of statistical model

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Liu, Xiuguo; Chen, Qihao; Yang, Shuai

    2018-04-01

    Accurate building damage assessment is essential in providing decision support for disaster relief and reconstruction. Polarimetric synthetic aperture radar (PolSAR) has become one of the most effective means of building damage assessment, due to its all-day/all-weather capability and the rich backscatter information of targets. However, intact buildings that are not parallel to the SAR flight pass (termed oriented buildings) and collapsed buildings share similar scattering mechanisms, both dominated by volume scattering. This characteristic leads to confusion between collapsed and oriented buildings in assessments from PolSAR data. Because collapsed buildings and intact buildings (whether oriented or parallel) have different textures, a novel building damage assessment method is proposed in this study to address this problem by introducing texture parameters of statistical models. First, the logarithms of the estimated texture parameters of different statistical models are taken as a new texture feature to describe the collapse of buildings. Second, collapsed and intact buildings are distinguished using an appropriate threshold. Then, building blocks are classified into three levels based on the building block collapse rate. The paper also discusses the capability of performing damage assessment using texture parameters from different statistical models or using different estimators. RADARSAT-2 and ALOS-1 PolSAR images are used to present and analyze the performance of the proposed method. The results show that using the texture parameters avoids confusing collapsed with oriented buildings and improves assessment accuracy. Assessments using the texture parameters of the K/G0 distributions estimated from the second moment obtain the highest extraction accuracies: for the RADARSAT-2 and ALOS-1 data, the overall accuracy (OA) for the three types of buildings is 73.39% and 68.45%, respectively.

  19. Use of noncrystallographic symmetry for automated model building at medium to low resolution.

    PubMed

    Wiegels, Tim; Lamzin, Victor S

    2012-04-01

    A novel method is presented for the automatic detection of noncrystallographic symmetry (NCS) in macromolecular crystal structure determination which does not require the derivation of molecular masks or the segmentation of density. It was found that throughout structure determination the NCS-related parts may be differently pronounced in the electron density. This often results in the modelling of molecular fragments of variable length and accuracy, especially during automated model-building procedures. These fragments were used to identify NCS relations in order to aid automated model building and refinement. In a number of test cases higher completeness and greater accuracy of the obtained structures were achieved, specifically at a crystallographic resolution of 2.3 Å or poorer. In the best case, the method allowed the building of up to 15% more residues automatically and a tripling of the average length of the built fragments.

  20. Knowledge-based model building of proteins: concepts and examples.

    PubMed Central

    Bajorath, J.; Stenkamp, R.; Aruffo, A.

    1993-01-01

    We describe how to build protein models from structural templates. Methods to identify structural similarities between proteins in cases of significant, moderate to low, or virtually absent sequence similarity are discussed. The detection and evaluation of structural relationships is emphasized as a central aspect of protein modeling, distinct from the more technical aspects of model building. Computational techniques to generate and complement comparative protein models are also reviewed. Two examples, P-selectin and gp39, are presented to illustrate the derivation of protein model structures and their use in experimental studies. PMID:7505680

  1. Russian Apartment Building Thermal Response Models for Retrofit Selection and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Peter R.; Dirks, James A.; Reilly, Raymond W.

    2000-08-21

    The Enterprise Housing Divestiture Project (EHDP) aims to identify cost-effective energy efficiency and conservation measures for Russian apartment buildings and to implement these measures in the entire stock of buildings undergoing divestiture in six cities. Short-term measurements of infiltration and exterior wall heat-loss coefficient were made in the cities of Cheropovets, Orenburg, Petrozavodsk, Ryazan, and Vladimir. Long-term monitoring equipment was installed in six or more buildings in the aforementioned cities and in the city of Volxhov. The results of these measurements will be used to calibrate models used to select optimal retrofit packages and to verify energy savings. The retrofit categories representing the largest technical potential in these buildings are envelope, heat recovery, and heating/hot water system improvements. This paper describes efforts to establish a useful thermal model calibration process. The model structures and analytical methods for obtaining building parameters from time series weather, energy use, and thermal response data are developed. Our experience applying these methods to two nominally identical 5-story apartment buildings in the city of Ryazan is presented. Building envelope UA's inferred from measured whole-building thermal response data are compared with UA's based on window and wall U-values (the latter obtained by ASTM in-situ measurements of 20 wall sections in various Ryazan panel buildings) as well. The UA's obtained by these completely independent measurements differ by less than 10%.
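
    Inferring a whole-building UA from time-series data, as described, amounts in the simplest steady-state case to regressing heating power on the indoor-outdoor temperature difference. The sketch below is that minimal version on synthetic data; the real calibration handles dynamics and solar gains, so treat this as an illustration of the idea only.

      import numpy as np

      def estimate_ua(heat_power_kw, t_in, t_out):
          """Least-squares slope of heating power vs. temperature
          difference: Q = UA * (Tin - Tout) + b, so the fitted slope
          is the whole-building envelope UA."""
          dT = np.asarray(t_in) - np.asarray(t_out)
          ua, base = np.polyfit(dT, np.asarray(heat_power_kw), 1)
          return ua, base   # kW/K and a constant (internal gains, DHW, ...)

      rng = np.random.default_rng(2)
      t_out = rng.uniform(-20, 5, 200)
      t_in = np.full(200, 20.0)
      q = 1.2 * (t_in - t_out) + 5 + rng.normal(0, 1.5, 200)   # true UA = 1.2
      print("UA ~ %.2f kW/K" % estimate_ua(q, t_in, t_out)[0])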

  2. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer from high computation costs when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
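
    Sequential Monte Carlo data assimilation of the kind used in the framework can be sketched as a standard particle filter: propagate particles through the simulation model, weight them by the sensor likelihood, and resample. The toy below estimates the occupant count of a single zone; the drift dynamics and Gaussian sensor noise are illustrative assumptions, not the paper's model.

      import numpy as np

      rng = np.random.default_rng(3)

      def step_model(counts):
          """Toy dynamics: each particle's occupant count drifts by -1/0/+1."""
          return np.clip(counts + rng.integers(-1, 2, counts.size), 0, None)

      def assimilate(particles, sensor_count, sigma=2.0):
          """Weight particles by a Gaussian sensor likelihood, then resample."""
          w = np.exp(-0.5 * ((particles - sensor_count) / sigma) ** 2)
          w /= w.sum()
          idx = rng.choice(particles.size, size=particles.size, p=w)
          return particles[idx]

      particles = rng.integers(0, 50, 1000)          # prior over occupancy
      for sensor in [12, 14, 13, 15]:                # incoming sensor readings
          particles = step_model(particles)
          particles = assimilate(particles, sensor)
      print("estimated occupancy ~", particles.mean())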

  3. New approach to analyzing soil-building systems

    USGS Publications Warehouse

    Safak, E.

    1998-01-01

    A new method of analyzing seismic response of soil-building systems is introduced. The method is based on the discrete-time formulation of wave propagation in layered media for vertically propagating plane shear waves. Buildings are modeled as an extension of the layered soil media by assuming that each story in the building is another layer. The seismic response is expressed in terms of wave travel times between the layers, and the wave reflection and transmission coefficients at layer interfaces. The calculation of the response is reduced to a pair of simple finite-difference equations for each layer, which are solved recursively starting from the bedrock. Compared with commonly used vibration formulation, the wave propagation formulation provides several advantages, including the ability to incorporate soil layers, simplicity of the calculations, improved accuracy in modeling the mass and damping, and better tools for system identification and damage detection.
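
    The layer-interface computation at the heart of this formulation uses the standard reflection and transmission coefficients for vertically propagating shear waves, which depend only on the impedance contrast between adjacent layers. The helper below evaluates these textbook relations; the soil and bedrock numbers are illustrative, and treating a building storey as another "layer" follows the paper's analogy.

      def interface_coefficients(rho1, vs1, rho2, vs2):
          """Displacement reflection/transmission coefficients for a
          shear wave passing from layer 1 into layer 2 at normal
          incidence, from the impedances Z = rho * Vs."""
          z1, z2 = rho1 * vs1, rho2 * vs2
          r = (z1 - z2) / (z1 + z2)   # reflected amplitude ratio
          t = 2 * z1 / (z1 + z2)      # transmitted amplitude ratio
          return r, t

      # Soft soil (rho 1800 kg/m^3, Vs 200 m/s) over bedrock (2400, 1000):
      print(interface_coefficients(1800, 200, 2400, 1000))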

  4. Field measurement of moisture-buffering model inputs for residential buildings

    DOE PAGES

    Woods, Jason; Winkler, Jon

    2016-02-05

    Moisture adsorption and desorption in building materials impact indoor humidity. This effect should be included in building-energy simulations, particularly when humidity is being investigated or controlled. Several models can calculate this moisture-buffering effect, but accurate ones require model inputs that are not always known to the user of the building-energy simulation. This research developed an empirical method to extract whole-house model inputs for the effective moisture penetration depth (EMPD) model. The experimental approach was to subject the materials in the house to a square-wave relative-humidity profile, measure all of the moisture-transfer terms (e.g., infiltration, air-conditioner condensate), and calculate the only unmeasured term: the moisture sorption into the materials. We validated this method with laboratory measurements, which we used to measure the EMPD model inputs of two houses. After deriving these inputs, we measured the humidity of the same houses during tests with realistic latent and sensible loads and demonstrated the accuracy of this approach. Furthermore, these results show that the EMPD model, when given reasonable inputs, is an accurate moisture-buffering model.

  5. Automatic computation for optimum height planning of apartment buildings to improve solar access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong, Yoon-Bok; Kim, Yong-Yee; Seok, Ho-Tae

    2011-01-15

    The objective of this study is to suggest a mathematical model and an optimal algorithm for determining the height of apartment buildings to satisfy the solar rights of survey buildings or survey housing units, and to develop an automatic computation model for the optimum height of apartment buildings and clarify its performance and expected effects. To accomplish this objective, the following procedures were followed: (1) The necessity of height planning of obstruction buildings to satisfy the solar rights of survey buildings or survey housing units is demonstrated by analyzing, through a literature review, the recent trend of disputes related to solar rights and by examining the social requirements in terms of solar rights. In addition, the necessity of an automatic computation system for height planning of apartment buildings is demonstrated, and a suitable analysis method for this system is chosen by investigating the characteristics of analysis methods for solar rights assessment. (2) A case study on the process of height planning of apartment buildings is briefly described and the problems occurring in this process are examined. (3) To develop an automatic computation model for height planning of apartment buildings, the geometrical elements forming apartment buildings are defined by analyzing their geometrical characteristics, and the design factors and regulations required in height planning of apartment buildings are investigated. Based on this knowledge, the methodology and mathematical algorithm to adjust the height of apartment buildings by automatic computation are suggested, probable problems and ways to resolve them are discussed, and the methodology and algorithm for the optimization are presented. (4) Based on the suggested methodology and mathematical algorithm, the automatic computation model for the optimum height of apartment buildings is developed and verified through application to several cases; the effects of the suggested model are then demonstrated quantitatively and qualitatively. (author)

  6. Recognition of building group patterns in topographic maps based on graph partitioning and random forest

    NASA Astrophysics Data System (ADS)

    He, Xianjin; Zhang, Xinchang; Xin, Qinchuan

    2018-02-01

    Recognition of building group patterns (i.e., the arrangement and form exhibited by a collection of buildings at a given mapping scale) is important to the understanding and modeling of geographic space and is hence essential to a wide range of downstream applications such as map generalization. Most of the existing methods develop rigid rules based on the topographic relationships between building pairs to identify building group patterns and thus their applications are often limited. This study proposes a method to identify a variety of building group patterns that allow for map generalization. The method first identifies building group patterns from potential building clusters based on a machine-learning algorithm and further partitions the building clusters with no recognized patterns based on the graph partitioning method. The proposed method is applied to the datasets of three cities that are representative of the complex urban environment in Southern China. Assessment of the results based on the reference data suggests that the proposed method is able to recognize both regular (e.g., the collinear, curvilinear, and rectangular patterns) and irregular (e.g., the L-shaped, H-shaped, and high-density patterns) building group patterns well, given that the correctness values are consistently nearly 90% and the completeness values are all above 91% for three study areas. The proposed method shows promises in automated recognition of building group patterns that allows for map generalization.

  7. The methodology of choice Cam-Clay model parameters for loess subsoil

    NASA Astrophysics Data System (ADS)

    Nepelski, Krzysztof; Błazik-Borowa, Ewa

    2018-01-01

    The paper deals with the calibration method of an FEM subsoil model described by the constitutive Cam-Clay model. A four-storey residential building and the solid substrate are modelled. Identification of the substrate is made using research drilling, CPT static tests, the DMT Marchetti dilatometer, and laboratory tests. The latter are performed on intact soil specimens taken from a wide planning trench at the foundation depth. The real building settlements were measured as the vertical displacement of benchmarks; these measurements were carried out periodically during the erection of the building and its operation. Initially, the Cam-Clay model parameters were determined on the basis of the laboratory tests; later, they were corrected by taking into consideration the results of numerical analyses (of the whole building and its parts) and the real building settlements.

  8. Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data

    NASA Astrophysics Data System (ADS)

    Parida, G.; Rajan, K. S.

    2017-05-01

    Current methods of object segmentation, extraction, and classification from aerial LiDAR data are manual and tedious. This work proposes a technique for object segmentation from LiDAR data. A bottom-up, geometric rule based approach was used initially to devise a way to segment buildings out of LiDAR datasets; for curved wall surfaces, comparison of localized surface normals was used to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of building objects from a given scene in the case of the synthetic datasets, and promising results for the real-world data. An advantage of the proposed work is that it depends on no form of data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, as needed by supervised techniques. It focuses on extracting the walls of buildings to construct the footprint, rather than focusing on the roof; this focus on extracting the walls to reconstruct buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to obtain 2D footprints of buildings, with further scope to generate 3D models. Thus, the proposed method can serve as a tool to obtain footprints of buildings in urban landscapes, helping urban planning and the smart cities endeavour.

  9. Research Capacity Building: A Historically Black College/University-Based Case Study of a Peer-to-Peer Mentor Research Team Model

    ERIC Educational Resources Information Center

    Moore, Corey L.; Manyibe, Edward O.; Aref, Fariborz; Washington, Andre L.

    2017-01-01

    Purpose: To evaluate a peer-to-peer mentor research team model (PPMRTM) in building investigators' research skills (i.e., research methods and grant writing) at a historically Black college/university (HBCU) in the United States. Method: Three different theories (i.e., planned change, critical mass, and self-efficacy), contemporary study findings,…

  10. What do we gain from simplicity versus complexity in species distribution models?

    USGS Publications Warehouse

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.

  11. Global optimization framework for solar building design

    NASA Astrophysics Data System (ADS)

    Silva, N.; Alves, N.; Pascoal-Faria, P.

    2017-07-01

    The generative modeling paradigm is a shift from static models to flexible models. It describes a modeling process using functions, methods and operators. The result is an algorithmic description of the construction process. Each evaluation of such an algorithm creates a model instance, which depends on its input parameters (width, height, volume, roof angle, orientation, location). These values are normally chosen according to aesthetic aspects and style. In this study, the model's parameters are generated automatically according to an objective function. A generative model can be optimized with respect to its parameters; in this way, the best solution for a constrained problem is determined. Besides the establishment of an overall framework design, this work consists of the identification of different building shapes and their main parameters, the creation of an algorithmic description for these main shapes, and the formulation of an objective function reflecting a building's energy consumption (solar energy, heating and insulation). Additionally, the conception of an optimization pipeline combining an energy calculation tool with a geometric scripting engine is presented. The methods developed lead to an automated and optimized 3D shape generation for the projected building (based on the desired conditions and according to specific constraints). The proposed approach will help in the construction of real buildings that consume less energy, contributing to a more sustainable world.

  12. Data-driven forecasting algorithms for building energy consumption

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram

    2013-04-01

    This paper introduces two forecasting methods for building energy consumption data recorded by smart meters at high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods forecast individual loads on the basis of their measurement history and weather data, without using complicated models of the building system. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process is applied to forecast day-ahead load profiles together with their uncertainty bounds. These methods are computationally simple and adaptive and thus suitable for analyzing large sets of data whose patterns change over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict the energy consumption data with high accuracy.
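
    As a hedged illustration of the longer-horizon approach, the sketch below fits a nonparametric Gaussian process to synthetic load data and produces a day-ahead forecast with uncertainty bounds; the features (hour of day, outdoor temperature), the kernel, and the data are assumptions, not the paper's exact configuration:

```python
# Gaussian process load forecast with posterior uncertainty bounds.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 7)                     # one week of hourly data
temps = 15 + 10 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 1, hours.size)
load = 50 + 2 * temps + 10 * ((8 <= hours) & (hours <= 18)) + rng.normal(0, 3, hours.size)
X = np.column_stack([hours, temps])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=[5.0, 5.0]) + WhiteKernel(),
                              normalize_y=True).fit(X, load)

# Day-ahead forecast given tomorrow's (assumed) temperature forecast
X_next = np.column_stack([np.arange(24), np.full(24, 18.0)])
mean, std = gp.predict(X_next, return_std=True)
lower, upper = mean - 2 * std, mean + 2 * std         # ~95% uncertainty band
```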

  13. Prediction of microstructure, residual stress, and deformation in laser powder bed fusion process

    NASA Astrophysics Data System (ADS)

    Yang, Y. P.; Jamshidinia, M.; Boulware, P.; Kelly, S. M.

    2018-05-01

    The laser powder bed fusion (L-PBF) process has been investigated extensively for building production parts with complex shapes. Modeling tools that can be used at the part level are essential to allow engineers to fine-tune the shape design and process parameters for additive manufacturing. This study focuses on developing modeling methods to predict microstructure, hardness, residual stress, and deformation in large L-PBF built parts. A transient, sequentially coupled thermal and metallurgical analysis method was developed to predict microstructure and hardness in L-PBF built high-strength, low-alloy steel parts. A moving heat-source model was used in this analysis to accurately predict the temperature history. A kinetics-based model, originally developed to predict microstructure in the heat-affected zone of a welded joint, was extended to predict the microstructure and hardness in an L-PBF build by inputting the predicted temperature history. The tempering effect of subsequently built layers on the current layer's microstructural phases was modeled, which is the key to correctly predicting the final hardness. It was also found that the top layers of a built part have higher hardness because they lack this tempering effect. A sequentially coupled thermal and mechanical analysis method was developed to predict residual stress and deformation of an L-PBF built part. It was found that a line-heating model is not suitable for analyzing a large L-PBF built part, whereas the layer-heating method is a potential approach. Experiments were conducted to validate the model predictions.

  14. Prediction of microstructure, residual stress, and deformation in laser powder bed fusion process

    NASA Astrophysics Data System (ADS)

    Yang, Y. P.; Jamshidinia, M.; Boulware, P.; Kelly, S. M.

    2017-12-01

    The laser powder bed fusion (L-PBF) process has been investigated extensively for building production parts with complex shapes. Modeling tools that can be used at the part level are essential to allow engineers to fine-tune the shape design and process parameters for additive manufacturing. This study focuses on developing modeling methods to predict microstructure, hardness, residual stress, and deformation in large L-PBF built parts. A transient, sequentially coupled thermal and metallurgical analysis method was developed to predict microstructure and hardness in L-PBF built high-strength, low-alloy steel parts. A moving heat-source model was used in this analysis to accurately predict the temperature history. A kinetics-based model, originally developed to predict microstructure in the heat-affected zone of a welded joint, was extended to predict the microstructure and hardness in an L-PBF build by inputting the predicted temperature history. The tempering effect of subsequently built layers on the current layer's microstructural phases was modeled, which is the key to correctly predicting the final hardness. It was also found that the top layers of a built part have higher hardness because they lack this tempering effect. A sequentially coupled thermal and mechanical analysis method was developed to predict residual stress and deformation of an L-PBF built part. It was found that a line-heating model is not suitable for analyzing a large L-PBF built part, whereas the layer-heating method is a potential approach. Experiments were conducted to validate the model predictions.

  15. Image-Based Airborne LiDAR Point Cloud Encoding for 3D Building Model Retrieval

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chen; Lin, Chao-Hung

    2016-06-01

    With the development of Web 2.0 and cyber city modeling, an increasing number of 3D models have become available on web-based model-sharing platforms, with many applications such as navigation, urban planning, and virtual reality. Based on the concept of data reuse, a 3D model retrieval system is proposed to retrieve building models similar to a user-specified query. The basic idea behind this system is to reuse existing 3D building models instead of reconstructing them from point clouds. To enable efficient retrieval, the models in the database are generally encoded compactly using a shape descriptor; however, most geometric descriptors in related works apply to polygonal models. In this study, the input query of the model retrieval system is a point cloud acquired by a Light Detection and Ranging (LiDAR) system, chosen for its efficient scene scanning and spatial information collection. Using point clouds, with their sparse, noisy, and incomplete sampling, as input queries is more difficult than using 3D models. Because the building roof is more informative than other parts of an airborne LiDAR point cloud, an image-based approach is proposed to encode both the point clouds from input queries and the 3D models in the database. The main goal of the data encoding is that the models in the database and the input point clouds are encoded consistently. Firstly, top-view depth images of buildings are generated to represent the geometry of a building roof. Secondly, geometric features are extracted from the depth images based on the heights, edges, and planes of the building. Finally, descriptors are derived via spatial histograms and used in the 3D model retrieval system. For retrieval, models are matched by comparing the encoding coefficients of point clouds and building models. In the experiments, a database of about 900,000 3D models collected from the Internet is used to evaluate retrieval. The results show that the proposed method clearly outperforms related methods.
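
    The encoding step can be illustrated with a small sketch that rasterizes a point cloud into a top-view depth image (maximum height per grid cell), from which height, edge, and plane features would then be extracted; the cell size and image dimensions are assumed values, not those of the paper:

```python
# Rasterize an airborne point cloud into a top-view depth image.
import numpy as np

def topview_depth_image(points, cell=0.5, size=64):
    """Return a size x size image holding the maximum z per grid cell."""
    xy = points[:, :2] - points[:, :2].min(axis=0)
    cols = np.clip((xy[:, 0] / cell).astype(int), 0, size - 1)
    rows = np.clip((xy[:, 1] / cell).astype(int), 0, size - 1)
    img = np.zeros((size, size))
    np.maximum.at(img, (rows, cols), points[:, 2])    # keep highest return
    return img
```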

  16. An economic model for passive solar designs in commercial environments

    NASA Astrophysics Data System (ADS)

    Powell, J. W.

    1980-06-01

    The model incorporates a life-cycle costing approach that focuses on the costs of purchase, installation, maintenance, repairs, replacement, and energy. It includes a detailed analysis of tax laws affecting the use of solar energy in commercial buildings. Possible methods of treating difficult-to-measure benefits and costs, such as the effects of the passive solar design on the building's resale value, lighting costs, rental income, and the use of commercial space, are presented. The model is illustrated in two case examples of prototypical solar designs for low-rise commercial buildings in an urban setting.

  17. Early Design Energy Analysis Using Building Information Modeling Technology

    DTIC Science & Technology

    2011-11-01

    [Figure captions from the report: building floor plan and 3D image; comparison of different energy estimates.] Traditionally, most building energy analyses have been conducted late in design, after the decisions with the biggest impact on building life-cycle costs have already been made. This method enables project teams to make energy-conscious decisions early in design, when they most affect building life-cycle costs.

  18. Analysis of 3D Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.

    2018-05-01

    Creating 3D building models at large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" covers several types of products: the well-known CityGML solid models (available at a few Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper a methodology for inspecting datasets of 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysis of statistical parameters of the normal heights between the reference point cloud and the tested planes, combined with segmentation of the point cloud, provides a tool that can indicate which buildings and which roof planes do not fulfill the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.

  19. Multicriteria decision model for retrofitting existing buildings

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, B.

    2003-04-01

    In this paper a model to decide which buildings in an urban area should be retrofitted is presented. The model has been positioned among existing ones by choosing the decision rule, criterion weighting, and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multi-attributive and multi-objective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively is shown by detailing the two phases, judgemental and computational: in this case, site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, using the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.

  20. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell’s Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907

  1. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell's Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+.
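
    Both solver families discussed here are exposed by SciPy's root finder, which makes the comparison easy to sketch on a toy system of nonlinear algebraic equations; the residual function below is illustrative, not an HVACSIM+ component model:

```python
# MINPACK's Powell hybrid method ('hybr') vs. a Levenberg-Marquardt
# variant ('lm') on the same nonlinear system.
import numpy as np
from scipy.optimize import root

def residuals(x):
    # Two coupled nonlinear equations standing in for an energy balance
    return [x[0] ** 2 + x[1] ** 2 - 4.0,
            np.exp(x[0]) + x[1] - 1.0]

sol_hybr = root(residuals, x0=[1.0, -1.0], method='hybr')
sol_lm = root(residuals, x0=[1.0, -1.0], method='lm')
print(sol_hybr.x, sol_hybr.success)
print(sol_lm.x, sol_lm.success)
```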

  2. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  3. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  4. A Method for the Seamlines Network Automatic Selection Based on Building Vector

    NASA Astrophysics Data System (ADS)

    Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.

    2018-04-01

    In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for the automatic selection of the seamline network in large-scale orthophotos based on building vectors. Firstly, a simple model of each building is built by combining the building's vector outline, its height, and the DEM, and the imaging area of the building on a single DOM is obtained. Then, the initial Voronoi network of the measurement area is automatically generated based on the bottom positions of all the images. Finally, the final seamline network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaging areas of the buildings. The experimental results show that the proposed method can not only route the seamline network around buildings quickly, but also retain the Voronoi network's minimum-projection-distortion property, effectively solving the problem of automatic seamline network selection for orthophoto mosaicking.
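
    The initialization step described above can be sketched with SciPy; the image bottom positions are placeholders, and the paper's building-aware optimization of nodes and seamlines is not reproduced here:

```python
# Initial Voronoi seamline network from per-image bottom positions.
import numpy as np
from scipy.spatial import Voronoi

image_bottoms = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0],
                          [0.0, 160.0], [100.0, 160.0]])
vor = Voronoi(image_bottoms)

print(vor.vertices)        # candidate seamline nodes
print(vor.ridge_vertices)  # index pairs describing initial seamline segments
```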

  5. Comprehensive Evaluation of Fast-Response, Reynolds-Averaged Navier–Stokes, and Large-Eddy Simulation Methods Against High-Spatial-Resolution Wind-Tunnel Data in Step-Down Street Canyons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayati, Arash Nemati; Stoll, Rob; Kim, J. J.

    Three computational fluid dynamics (CFD) methods with different levels of flow-physics modelling are comprehensively evaluated against high-spatial-resolution wind-tunnel velocity data from step-down street canyons (i.e., a short building downwind of a tall building). The first method is a semi-empirical fast-response approach using the Quick Urban Industrial Complex (QUIC-URB) model. The second method solves the Reynolds-averaged Navier–Stokes (RANS) equations, and the third one utilizes a fully-coupled fluid-structure interaction large-eddy simulation (LES) model with a grid-turbulence inflow generator. Unlike typical point-by-point evaluation comparisons, here the entire two-dimensional wind-tunnel dataset is used to evaluate the dynamics of dominant flow topological features in the street canyon. Each CFD method is scrutinized for several geometric configurations by varying the downwind-to-upwind building-height ratio (H_d/H_u) and street canyon-width to building-width aspect ratio (S/W) for inflow winds perpendicular to the upwind building front face. Disparities between the numerical results and experimental data are quantified in terms of their ability to capture flow topological features for different geometric configurations. Ultimately, all three methods qualitatively predict the primary flow topological features, including a saddle point and a primary vortex. But the secondary flow topological features, namely an in-canyon separation point and secondary vortices, are only well represented by the LES method despite its failure for taller downwind building cases. Misrepresentation of flow-regime transitions, exaggeration of the coherence of recirculation zones and wake fields, and overestimation of downwards vertical velocity into the canyon are the main defects in QUIC-URB, RANS and LES results, respectively. All three methods underestimate the updrafts and, surprisingly, QUIC-URB outperforms RANS for the streamwise velocity component, while RANS is superior to QUIC-URB for the vertical velocity component in the street canyon.

  6. Comprehensive Evaluation of Fast-Response, Reynolds-Averaged Navier–Stokes, and Large-Eddy Simulation Methods Against High-Spatial-Resolution Wind-Tunnel Data in Step-Down Street Canyons

    DOE PAGES

    Hayati, Arash Nemati; Stoll, Rob; Kim, J. J.; ...

    2017-05-18

    Three computational fluid dynamics (CFD) methods with different levels of flow-physics modelling are comprehensively evaluated against high-spatial-resolution wind-tunnel velocity data from step-down street canyons (i.e., a short building downwind of a tall building). The first method is a semi-empirical fast-response approach using the Quick Urban Industrial Complex (QUIC-URB) model. The second method solves the Reynolds-averaged Navier–Stokes (RANS) equations, and the third one utilizes a fully-coupled fluid-structure interaction large-eddy simulation (LES) model with a grid-turbulence inflow generator. Unlike typical point-by-point evaluation comparisons, here the entire two-dimensional wind-tunnel dataset is used to evaluate the dynamics of dominant flow topological features in the street canyon. Each CFD method is scrutinized for several geometric configurations by varying the downwind-to-upwind building-height ratio (H_d/H_u) and street canyon-width to building-width aspect ratio (S/W) for inflow winds perpendicular to the upwind building front face. Disparities between the numerical results and experimental data are quantified in terms of their ability to capture flow topological features for different geometric configurations. Ultimately, all three methods qualitatively predict the primary flow topological features, including a saddle point and a primary vortex. But the secondary flow topological features, namely an in-canyon separation point and secondary vortices, are only well represented by the LES method despite its failure for taller downwind building cases. Misrepresentation of flow-regime transitions, exaggeration of the coherence of recirculation zones and wake fields, and overestimation of downwards vertical velocity into the canyon are the main defects in QUIC-URB, RANS and LES results, respectively. All three methods underestimate the updrafts and, surprisingly, QUIC-URB outperforms RANS for the streamwise velocity component, while RANS is superior to QUIC-URB for the vertical velocity component in the street canyon.

  7. Comprehensive Evaluation of Fast-Response, Reynolds-Averaged Navier-Stokes, and Large-Eddy Simulation Methods Against High-Spatial-Resolution Wind-Tunnel Data in Step-Down Street Canyons

    NASA Astrophysics Data System (ADS)

    Hayati, Arash Nemati; Stoll, Rob; Kim, J. J.; Harman, Todd; Nelson, Matthew A.; Brown, Michael J.; Pardyjak, Eric R.

    2017-08-01

    Three computational fluid dynamics (CFD) methods with different levels of flow-physics modelling are comprehensively evaluated against high-spatial-resolution wind-tunnel velocity data from step-down street canyons (i.e., a short building downwind of a tall building). The first method is a semi-empirical fast-response approach using the Quick Urban Industrial Complex (QUIC-URB) model. The second method solves the Reynolds-averaged Navier-Stokes (RANS) equations, and the third one utilizes a fully-coupled fluid-structure interaction large-eddy simulation (LES) model with a grid-turbulence inflow generator. Unlike typical point-by-point evaluation comparisons, here the entire two-dimensional wind-tunnel dataset is used to evaluate the dynamics of dominant flow topological features in the street canyon. Each CFD method is scrutinized for several geometric configurations by varying the downwind-to-upwind building-height ratio (H_d/H_u) and street canyon-width to building-width aspect ratio ( S / W) for inflow winds perpendicular to the upwind building front face. Disparities between the numerical results and experimental data are quantified in terms of their ability to capture flow topological features for different geometric configurations. Overall, all three methods qualitatively predict the primary flow topological features, including a saddle point and a primary vortex. However, the secondary flow topological features, namely an in-canyon separation point and secondary vortices, are only well represented by the LES method despite its failure for taller downwind building cases. Misrepresentation of flow-regime transitions, exaggeration of the coherence of recirculation zones and wake fields, and overestimation of downwards vertical velocity into the canyon are the main defects in QUIC-URB, RANS and LES results, respectively. All three methods underestimate the updrafts and, surprisingly, QUIC-URB outperforms RANS for the streamwise velocity component, while RANS is superior to QUIC-URB for the vertical velocity component in the street canyon.

  8. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  9. Accuracy of automated measurement and verification (M&V) techniques for energy savings in commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine

    Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming manual data acquisition and often do not deliver results until years after the program period has ended. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. These meter- and software-based approaches, increasingly referred to as “M&V 2.0”, are the subject of surging industry interest, particularly in the context of utility energy efficiency programs. Program administrators, evaluators, and regulators are asking how M&V 2.0 compares with more traditional methods, how proprietary software can be transparently performance tested, and how these techniques can be integrated into the next generation of whole-building focused efficiency programs. This paper expands recent analyses of public-domain whole-building M&V methods, focusing on more novel M&V 2.0 modeling approaches that are used in commercial technologies, as well as approaches that are documented in the literature and/or developed by the academic building research community. We present a testing procedure and metrics to assess the performance of whole-building M&V methods. We then illustrate the test procedure by evaluating the accuracy of ten baseline energy use models against measured data from a large dataset of 537 buildings. The results of this study show that the already available advanced interval data baseline models hold great promise for scaling the adoption of building measured savings calculations using Advanced Metering Infrastructure (AMI) data. The median coefficient of variation of the root mean squared error (CV(RMSE)) was less than 25% for every model tested when twelve months of training data were used. With even six months of training data, the median CV(RMSE) for daily energy totals was under 25% for all models tested. Finally, these findings can be used to build confidence in model robustness and the readiness of these approaches for industry uptake and adoption.

  10. Accuracy of automated measurement and verification (M&V) techniques for energy savings in commercial buildings

    DOE PAGES

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; ...

    2016-04-16

    Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming manual data acquisition and often do not deliver results until years after the program period has ended. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. These meter- and software-based approaches, increasingly referred to as “M&V 2.0”, are the subject of surging industry interest, particularly in the context of utility energy efficiency programs. Program administrators, evaluators, and regulators are asking how M&V 2.0 compares with more traditional methods, how proprietary software can be transparently performance tested, and how these techniques can be integrated into the next generation of whole-building focused efficiency programs. This paper expands recent analyses of public-domain whole-building M&V methods, focusing on more novel M&V 2.0 modeling approaches that are used in commercial technologies, as well as approaches that are documented in the literature and/or developed by the academic building research community. We present a testing procedure and metrics to assess the performance of whole-building M&V methods. We then illustrate the test procedure by evaluating the accuracy of ten baseline energy use models against measured data from a large dataset of 537 buildings. The results of this study show that the already available advanced interval data baseline models hold great promise for scaling the adoption of building measured savings calculations using Advanced Metering Infrastructure (AMI) data. The median coefficient of variation of the root mean squared error (CV(RMSE)) was less than 25% for every model tested when twelve months of training data were used. With even six months of training data, the median CV(RMSE) for daily energy totals was under 25% for all models tested. Finally, these findings can be used to build confidence in model robustness and the readiness of these approaches for industry uptake and adoption.
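
    A minimal implementation of the headline metric, CV(RMSE), follows; note that some M&V protocols divide by n - p (degrees of freedom) rather than n, a refinement omitted in this sketch:

```python
# CV(RMSE): RMSE of the baseline model's predictions normalized by the
# mean measured load, expressed in percent.
import numpy as np

def cv_rmse(measured, predicted):
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((measured - predicted) ** 2))
    return 100.0 * rmse / measured.mean()

# A model scoring below 25 on daily energy totals would meet the
# threshold discussed above.
```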

  11. Post Occupancy Evaluation of Educational Buildings and Equipment.

    ERIC Educational Resources Information Center

    Watson, Chris

    1997-01-01

    Details the post occupancy evaluation (POE) process for public buildings. POEs are used to improve design and optimize educational building and equipment use. The evaluation participants, the method used, the results and recommendations, model schools, and classroom alterations using POE are described. (9 references.) (RE)

  12. Three-Dimensional Reconstruction and Solar Energy Potential Estimation of Buildings

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Li, M.; Cheng, L.; Xu, H.; Li, S.; Liu, X.

    2017-12-01

    In the context of the construction of low-carbon cities, green cities and eco-cities, the ability of airborne and mobile LiDAR should be explored in urban renewable energy research. As the main landscape elements in the urban environment, buildings have large, regular envelopes that can receive a huge amount of solar radiation. In this study, a relatively complete scheme for calculating the solar utilization potential of building roofs and façades is proposed, using three-dimensional geometric feature information of buildings. To measure city-level building solar irradiance, precise three-dimensional models of building roofs and façades must first be reconstructed from airborne and mobile LiDAR, respectively. In order to obtain the precise geometric structure of building façades from mobile LiDAR data, a new method for structure detection and three-dimensional reconstruction of building façades is proposed. The method consists of three steps: preprocessing of façade points, detection of the façade structure, and restoration and reconstruction of the building façade. As a result, the reconstruction method can effectively deal with missing areas caused by occlusion, viewpoint limitation, and uneven point density, realizing a highly complete 3D reconstruction of a building façade. Furthermore, window areas can be excluded for a more accurate estimation of solar utilization potential. The solar energy utilization potential of all building roofs and façades is then estimated using a solar irradiance model that combines analysis of building shading and sky diffuse radiation, based on analysis of the geometrical structure of the buildings.

  13. Data-driven modeling, control and tools for cyber-physical energy systems

    NASA Astrophysics Data System (ADS)

    Behl, Madhur

    Energy systems are experiencing a gradual but substantial change in moving away from being non-interactive and manually-controlled systems to utilizing tight integration of both cyber (computation, communications, and control) and physical representations guided by first-principles-based models, at all scales and levels. Furthermore, peak power reduction programs like demand response (DR) are becoming increasingly important as the volatility on the grid continues to increase due to regulation, integration of renewables and extreme weather conditions. In order to shield themselves from the risk of price volatility, end-user electricity consumers must monitor electricity prices and be flexible in the ways they choose to use electricity. This requires the use of control-oriented predictive models of an energy system's dynamics and energy consumption. Such models are needed for understanding and improving the overall energy efficiency and operating costs. However, learning dynamical models using grey/white box approaches is cost- and time-prohibitive, since it often requires significant financial investments in retrofitting the system with several sensors and hiring domain experts for building the model. We present the use of data-driven methods for making model capture easy and efficient for cyber-physical energy systems. We develop Model-IQ, a methodology for analysis of uncertainty propagation for building inverse modeling and controls. Given a grey-box model structure and real input data from a temporary set of sensors, Model-IQ evaluates the effect of the uncertainty propagation from sensor data to model accuracy and to closed-loop control performance. We also developed a statistical method to quantify the bias in the sensor measurement and to determine near-optimal sensor placement and density for accurate data collection for model training and control. Using a real building test-bed, we show how performing an uncertainty analysis can reveal trends about inverse model accuracy and control performance, which can be used to make informed decisions about sensor requirements and data accuracy. We also present DR-Advisor, a data-driven demand response recommender system for the building's facilities manager which provides suitable control actions to meet the desired load curtailment while maintaining operations and maximizing the economic reward. We develop a model-based control with regression trees algorithm (mbCRT), which allows us to perform closed-loop control for DR strategy synthesis for large commercial buildings. Our data-driven control synthesis algorithm outperforms rule-based demand response methods for a large DoE commercial reference building and leads to a significant amount of load curtailment (of 380 kW) and over $45,000 in savings, which is 37.9% of the summer energy bill for the building. The performance of DR-Advisor is also evaluated for 8 buildings on Penn's campus, where it achieves 92.8% to 98.9% prediction accuracy. We also compare DR-Advisor with other data-driven methods, ranking 2nd on ASHRAE's benchmarking dataset for energy prediction.
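
    A hedged sketch of the regression-tree idea behind mbCRT: fit a tree that predicts load from disturbances (weather, hour of day) plus a controllable input (zone setpoint), then sweep the setpoint and pick the value with the lowest predicted demand. The data and feature names below are synthetic placeholders, not the dissertation's models:

```python
# Regression-tree load model used for a simple setpoint-selection step.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 2000
temp = rng.uniform(20, 35, n)                  # outdoor temperature (C)
hour = rng.integers(0, 24, n)                  # hour of day
setpoint = rng.uniform(22, 26, n)              # zone cooling setpoint (C)
load = (200 + 8 * (temp - setpoint)
        + 30 * ((8 <= hour) & (hour <= 18))    # occupied-hours loads
        + rng.normal(0, 5, n))                 # synthetic building load (kW)

X = np.column_stack([temp, hour, setpoint])
tree = DecisionTreeRegressor(max_depth=8).fit(X, load)

# Control step: at temp=33 C, hour=14, choose the admissible setpoint
# with the lowest predicted load.
candidates = np.linspace(22, 26, 9)
features = np.column_stack([np.full(9, 33.0), np.full(9, 14.0), candidates])
best_setpoint = candidates[np.argmin(tree.predict(features))]
```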

  14. Importance of Laser Scanning Resolution in the Process of Recreating the Architectural Details of Historical Buildings

    NASA Astrophysics Data System (ADS)

    Pawłowicz, Joanna A.

    2017-10-01

    The TLS (Terrestrial Laser Scanning) method may replace traditional building survey methods, e.g. those requiring the use of measuring tapes or range finders. This technology allows for collecting digital data in the form of a point cloud, which can be used to create a 3D model of a building. In addition, it allows for collecting data with incredible precision, which translates into the possibility of reproducing all architectural features of a building. This data is applied in reverse engineering to create a 3D model of an object existing in physical space. This study presents the results of research carried out using a point cloud to recreate the architectural features of a historical building with the application of reverse engineering. The research was conducted on a two-storey residential building with a basement and an attic. A veranda featuring a complicated wooden structure protrudes from the building’s façade. The measurements were taken at medium and the highest resolution using a ScanStation C10 laser scanner by Leica. The data obtained was processed using specialist software, which allowed for the application of reverse engineering, especially for reproducing the sculpted details of the veranda. Following digitization, all redundant data was removed from the point cloud and the cloud was subjected to modelling. For testing purposes, a selected part of the veranda was modelled by means of two methods: surface matching and Triangulated Irregular Network (TIN). Both modelling methods were applied to data collected at medium and the highest resolution. Creating a model based on data obtained at medium resolution, whether by surface matching or the TIN method, does not allow for a precise recreation of architectural details. The study presents certain sculpted elements recreated from the highest-resolution data with a superimposed TIN, juxtaposed against a digital image; the resulting model is very precise. Creating good models requires highly accurate field data. It is important to properly choose the distance between the measuring station and the measured object in order to ensure that the angles of incidence (horizontal and vertical) of the laser beam are as close to straight as possible. The model created from medium-resolution data offers very poor quality of details, i.e. only the bigger, basic elements of each detail are clearly visible, while the smaller ones are blurred. This is why, in order to obtain data sufficient to reproduce architectural details, laser scanning should be performed at the highest resolution. In addition, modelling by means of the surface matching method should be avoided - a better idea is to use the TIN method. In addition to providing a realistic-looking visualization, the TIN method has one more important advantage - it is 4 times faster than the surface matching method.

  15. Research and implementation on 3D modeling of geological body

    NASA Astrophysics Data System (ADS)

    Niu, Lijuan; Li, Ligong; Zhu, Renyi; Huang, Man

    2017-10-01

    This study, based on GIS thinking, explores the combination of a mixed spatial data model and a GIS model to build three-dimensional (3D) models of geological bodies on the Arc Engine platform. It describes in detail the interfaces and methods used in the construction of 3D geological bodies on the Arc Engine component platform, and puts forward an indirect method which constructs a set of geological grid layers by Kriging interpolation of the borehole data and then converts them into TIN geological layers, which avoids the defects of building the TIN geological layers directly and better completes the simulation of the real geological layers. This study makes a useful attempt to build 3D models of geological bodies based on GIS, and provides a reference for simulating geological bodies in 3D and constructing digital systems for underground space.
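
    A stand-in for the indirect workflow described above: interpolate scattered borehole horizon depths to a regular grid (an RBF interpolator is used here as a simple proxy for Kriging), then triangulate the grid into a TIN; all coordinates are synthetic:

```python
# Borehole-to-grid interpolation followed by TIN triangulation.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.spatial import Delaunay

rng = np.random.default_rng(7)
boreholes = rng.uniform(0, 100, size=(30, 2))                # borehole x, y
depths = 50 + 0.1 * boreholes[:, 0] + rng.normal(0, 1, 30)   # horizon depth

gx, gy = np.meshgrid(np.linspace(0, 100, 20), np.linspace(0, 100, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])
grid_depth = RBFInterpolator(boreholes, depths)(grid)        # gridded horizon

tin = Delaunay(grid)          # 2.5D TIN: triangles over the grid
# tin.simplices indexes triangles; grid_depth supplies vertex elevations
```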

  16. Surface characteristics modeling and performance evaluation of urban building materials using LiDAR data.

    PubMed

    Li, Xiaolu; Liang, Yu

    2015-05-20

    Analysis of light detection and ranging (LiDAR) intensity data to extract surface features is of great interest in remote sensing research. One potential application of LiDAR intensity data is target classification. A new bidirectional reflectance distribution function (BRDF) model is derived for target characterization of rough and smooth surfaces. Based on the geometry of our coaxial full-waveform LiDAR system, the integration method is improved through coordinate transformation to establish the relationship between the BRDF model and intensity data of LiDAR. A series of experiments using typical urban building materials are implemented to validate the proposed BRDF model and integration method. The fitting results show that three parameters extracted from the proposed BRDF model can distinguish the urban building materials from perspectives of roughness, specular reflectance, and diffuse reflectance. A comprehensive analysis of these parameters will help characterize surface features in a physically rigorous manner.

  17. Probabilistic safety assessment of the design of a tall buildings under the extreme load

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk

    2016-06-08

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of tall building structures. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS and RSM probabilistic methods are compared with the deterministic results. The example of the probabilistic safety analysis of tall buildings demonstrates the effectiveness of the probabilistic design of structures using Finite Element Methods.

  18. Probabilistic safety assessment of the design of a tall buildings under the extreme load

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2016-06-01

    The paper describes some experiences from the deterministic and probabilistic analysis of the safety of tall building structures. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS and RSM probabilistic methods are compared with the deterministic results. The example of the probabilistic safety analysis of tall buildings demonstrates the effectiveness of the probabilistic design of structures using Finite Element Methods.
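
    A minimal Monte Carlo sketch of such a probabilistic safety check: sample an uncertain resistance R and load effect L and estimate P(R - L < 0); the distributions and parameters are illustrative, not calibrated to EN 1990 or ISO 2394:

```python
# Crude Monte Carlo estimate of a failure probability.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
resistance = rng.lognormal(mean=np.log(500.0), sigma=0.10, size=n)  # kN
load = rng.gumbel(loc=300.0, scale=30.0, size=n)                    # kN
p_failure = np.mean(resistance - load < 0.0)
print(f"Estimated failure probability: {p_failure:.2e}")
```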

  19. Mobile Laser Scanning for Indoor Modelling

    NASA Astrophysics Data System (ADS)

    Thomson, C.; Apostolopoulos, G.; Backes, D.; Boehm, J.

    2013-10-01

    The process of capturing and modelling buildings has gained increased focus in recent years with the rise of Building Information Modelling (BIM). At the heart of BIM is a process change for the construction and facilities management industries, whereby a BIM aids more collaborative working through better information exchange; as part of this process, Geomatic/Land Surveyors are not immune from the changes. Terrestrial laser scanning has been prescribed as the preferred method for rapidly capturing buildings for BIM geometry. This is a process change from a traditional measured building survey with just a total station, and is aided by the increasing acceptance of point cloud data being integrated with parametric building models in BIM tools such as Autodesk Revit or Bentley Architecture. Pilot projects carried out previously by the authors to investigate the geometry capture and modelling of BIM confirmed the view of others that the process of data capture with static laser scan setups is slow and very involved, requiring at least two people for efficiency. Indoor Mobile Mapping Systems (IMMS) present a possible solution to these issues, especially in time saved. This paper therefore investigates their application as a capture device for BIM geometry creation, compared with traditional static methods, through a fit-for-purpose test.

  20. IEA EBC Annex 66: Definition and simulation of occupant behavior in buildings

    DOE PAGES

    Yan, Da; Hong, Tianzhen; Dong, Bing; ...

    2017-09-28

    Here, more than 30% of the total primary energy in the world is consumed in buildings. It is crucial to reduce building energy consumption in order to preserve energy resources and mitigate global climate change. Building performance simulations have been widely used for the estimation and optimization of building performance, providing reference values for the assessment of building energy consumption and the effects of energy-saving technologies. Among the various factors influencing building energy consumption, occupant behavior has drawn increasing attention. Occupant behavior includes occupant presence, movement, and interaction with building energy devices and systems. However, there are gaps in occupant behavior modeling as different energy modelers have employed varied data and tools to simulate occupant behavior, therefore producing different and incomparable results. Aiming to address these gaps, the International Energy Agency (IEA) Energy in Buildings and Community (EBC) Programme Annex 66 has established a scientific methodological framework for occupant behavior research, including data collection, behavior model representation, modeling and evaluation approaches, and the integration of behavior modeling tools with building performance simulation programs. Annex 66 also includes case studies and application guidelines to assist in building design, operation, and policymaking, using interdisciplinary approaches to reduce energy use in buildings and improve occupant comfort and productivity. This paper highlights the key research issues, methods, and outcomes pertaining to Annex 66, and offers perspectives on future research needs to integrate occupant behavior with the building life cycle.

  1. IEA EBC Annex 66: Definition and simulation of occupant behavior in buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Da; Hong, Tianzhen; Dong, Bing

    Here, more than 30% of the total primary energy in the world is consumed in buildings. It is crucial to reduce building energy consumption in order to preserve energy resources and mitigate global climate change. Building performance simulations have been widely used for the estimation and optimization of building performance, providing reference values for the assessment of building energy consumption and the effects of energy-saving technologies. Among the various factors influencing building energy consumption, occupant behavior has drawn increasing attention. Occupant behavior includes occupant presence, movement, and interaction with building energy devices and systems. However, there are gaps in occupant behavior modeling as different energy modelers have employed varied data and tools to simulate occupant behavior, therefore producing different and incomparable results. Aiming to address these gaps, the International Energy Agency (IEA) Energy in Buildings and Community (EBC) Programme Annex 66 has established a scientific methodological framework for occupant behavior research, including data collection, behavior model representation, modeling and evaluation approaches, and the integration of behavior modeling tools with building performance simulation programs. Annex 66 also includes case studies and application guidelines to assist in building design, operation, and policymaking, using interdisciplinary approaches to reduce energy use in buildings and improve occupant comfort and productivity. This paper highlights the key research issues, methods, and outcomes pertaining to Annex 66, and offers perspectives on future research needs to integrate occupant behavior with the building life cycle.

  2. A 3D model retrieval approach based on Bayesian networks lightfield descriptor

    NASA Astrophysics Data System (ADS)

    Xiao, Qinhan; Li, Yanjun

    2009-12-01

    A new 3D model retrieval methodology is proposed, exploiting a novel Bayesian networks lightfield descriptor (BNLD). There are two key novelties in our approach: (1) a BN-based method for building the lightfield descriptor; and (2) a 3D model retrieval scheme based on the proposed BNLD. To overcome the disadvantages of existing 3D model retrieval methods, we explore BNs for building a new lightfield descriptor. Firstly, a 3D model is placed into a lightfield, and about 300 binary views can be obtained around a sphere; Fourier descriptors and Zernike moment descriptors are then calculated from the binary views, and the shape feature sequences are learned into a BN model using a BN learning algorithm. Secondly, we propose a new 3D model retrieval method that calculates the Kullback-Leibler Divergence (KLD) between BNLDs. Benefiting from statistical learning, our BNLD is robust to noise compared with existing methods. A comparison between our method and the lightfield descriptor-based approach is conducted to demonstrate the effectiveness of the proposed methodology.
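
    The retrieval step, ranking database models by Kullback-Leibler divergence from the query descriptor, can be sketched as follows; plain normalized histograms stand in for the paper's BN-based descriptors:

```python
# KL-divergence ranking of database descriptors against a query.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, dtype=float) + eps   # smooth to avoid log(0)
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

query = np.array([0.2, 0.5, 0.3])
database = {"model_a": np.array([0.25, 0.45, 0.30]),
            "model_b": np.array([0.70, 0.20, 0.10])}
ranked = sorted(database, key=lambda name: kl_divergence(query, database[name]))
print(ranked)  # most similar model first
```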

  3. Building the 3D Geological Model of Wall Rock of Salt Caverns Based on Integration Method of Multi-source data

    NASA Astrophysics Data System (ADS)

    Yongzhi, WANG; hui, WANG; Lixia, LIAO; Dongsen, LI

    2017-02-01

    In order to analyse the geological characteristics of salt rock and the stability of salt caverns, rough three-dimensional (3D) models of the salt rock strata and 3D models of the salt caverns in the study areas are built using 3D GIS spatial modeling techniques. During implementation, multi-source data such as basic geographic data, DEMs, geological plane maps, geological section maps, engineering geological data, and sonar data are used. In this study, 3D spatial analysis and calculation methods, such as 3D GIS intersection detection, Boolean operations between 3D entities, and 3D grid discretization, are used to build 3D models of the wall rock of salt caverns. Our methods can provide effective calculation models for the numerical simulation and analysis of the creep characteristics of the wall rock of salt caverns.

  4. A compressed sensing method with analytical results for lidar feature classification

    NASA Astrophysics Data System (ADS)

    Allen, Josef D.; Yuan, Jiangbo; Liu, Xiuwen; Rahmes, Mark

    2011-04-01

    We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using compressed sensing methods via the Orthogonal Matching Pursuit algorithm, utilizing a generalized K-means clustering algorithm to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost effective for large-scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristic (ROC) curves to show the probability of detection and false alarm for building vs. vegetation classification. Histograms are shown with sample-size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDE) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and the edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits, such as terrain inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts.
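
    A hedged sketch of the classification idea follows: per-class dictionaries built with K-means, OMP sparse coding via scikit-learn's SparseCoder, and assignment by smallest reconstruction residual. The feature extraction from the DSM and all parameters (atom counts, sparsity, feature dimension) are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch: sparse-coding classification with per-class K-means
# dictionaries and Orthogonal Matching Pursuit (OMP).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import SparseCoder

def class_dictionary(features, n_atoms=32):
    """Learn a per-class dictionary from training feature vectors via K-means."""
    km = KMeans(n_clusters=n_atoms, n_init=10, random_state=0).fit(features)
    d = km.cluster_centers_
    return d / np.linalg.norm(d, axis=1, keepdims=True)    # unit-norm atoms

def classify(points, dictionaries, k=5):
    """Assign each feature vector the label whose dictionary reconstructs it best."""
    residuals = []
    for d in dictionaries:                                  # one dictionary per class
        coder = SparseCoder(dictionary=d, transform_algorithm="omp",
                            transform_n_nonzero_coefs=k)
        recon = coder.transform(points) @ d                 # sparse code -> reconstruction
        residuals.append(np.linalg.norm(points - recon, axis=1))
    return np.argmin(np.vstack(residuals), axis=0)

# Hypothetical demo with two classes of 8-dimensional point features:
rng = np.random.default_rng(0)
dicts = [class_dictionary(rng.normal(loc=m, size=(200, 8))) for m in (0.0, 3.0)]
print(classify(rng.normal(loc=3.0, size=(10, 8)), dicts))   # predicted labels (0 or 1)
```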

  5. Optimizing Energy Consumption in Building Designs Using Building Information Model (BIM)

    NASA Astrophysics Data System (ADS)

    Egwunatum, Samuel; Joseph-Akwara, Esther; Akaigwe, Richard

    2016-09-01

    Given the ability of a Building Information Model (BIM) to serve as a multi-disciplinary data repository, this paper seeks to explore and exploit the sustainability value of Building Information Modelling in delivering buildings that require less energy for their operation, emit less CO2, and provide a more comfortable living environment for their occupants. This objective was achieved by a critical and extensive review of the literature covering: (1) building energy consumption, (2) building energy performance and analysis, and (3) building information modeling and energy assessment. The literature cited in this paper shows that linking an energy analysis tool with a BIM model helps project design teams to predict and optimize energy consumption. To validate this finding, an in-depth analysis was carried out on a completed BIM-integrated construction project, the Arboleda Project in the Dominican Republic. The findings showed that the BIM-based energy analysis helped the design team achieve the world's first 103% positive-energy building. From these findings, the paper concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provides more detailed and accurate results, and delivers energy-efficient buildings. The study further recommends that the adoption of level 2 BIM and the integration of BIM in energy optimization analyses be made compulsory for all projects, irrespective of the method of procurement (government-funded or otherwise) or size.

  6. Single-image-based Modelling Architecture from a Historical Photograph

    NASA Astrophysics Data System (ADS)

    Dzwierzynska, Jolanta

    2017-10-01

    Historical photographs have proved very useful for the dimensional and geometrical analysis of buildings as well as for generating 3D reconstructions of whole structures. The paper addresses the problem of analysing a single historical photograph and modelling an architectural object from it. In particular, it focuses on reconstructing the original appearance of the New-Town synagogue from a single historic photograph for which the camera calibration is completely unknown. Because the photograph faithfully followed the geometric rules of perspective, it was possible to develop and apply a method to obtain a correct 3D reconstruction of the building. The modelling process consisted of a series of familiar steps: feature extraction, determination of the base elements of perspective, dimensional analysis, and 3D reconstruction. Simple formulas were proposed to estimate the locations of characteristic points of the building in a 3D Cartesian system of axes from their locations in a 2D Cartesian system of axes. The reconstruction process proceeded well, although slight corrections were necessary. It was possible to reconstruct the shape of the building in general, and two of its facades in detail. The reconstruction of the other two facades requires additional information or an additional picture. The success of the presented reconstruction method depends on the geometrical content of the photograph as well as on the quality of the picture, which determines the legibility of building edges. The presented method of reconstruction combines a descriptive method of reconstruction with computer aid; therefore, it seems to be universal. It can prove useful for single-image-based modelling of architecture.

  7. Applying Critical Race Theory to Group Model Building Methods to Address Community Violence.

    PubMed

    Frerichs, Leah; Lich, Kristen Hassmiller; Funchess, Melanie; Burrell, Marcus; Cerulli, Catherine; Bedell, Precious; White, Ann Marie

    2016-01-01

    Group model building (GMB) is an approach to building qualitative and quantitative models with stakeholders to learn about the interrelationships among multilevel factors causing complex public health problems over time. Scant literature exists on adapting this method to address public health issues that involve racial dynamics. This study's objectives are to (1) introduce GMB methods, (2) present a framework for adapting GMB to enhance cultural responsiveness, and (3) describe outcomes of adapting GMB to incorporate differences in racial socialization during a community project seeking to understand key determinants of community violence transmission. An academic-community partnership planned a 1-day session with diverse stakeholders to explore the issue of violence using GMB. We documented key questions inspired by critical race theory (CRT) and adaptations to established GMB "scripts" (i.e., published facilitation instructions). The theory's emphasis on experiential knowledge led to a narrative-based facilitation guide from which participants created causal loop diagrams. These early diagrams depict how violence is transmitted and how communities respond, based on participants' lived experiences and mental models of causation that grew to include factors associated with race. Participants found these methods useful for advancing difficult discussion. The resulting diagrams can be tested and expanded in future research, and will form the foundation for collaborative identification of solutions to build community resilience. GMB is a promising strategy that community partnerships should consider when addressing complex health issues; our experience adapting methods based on CRT is promising in its acceptability and early system insights.

  8. Methods of mathematical modeling using polynomials of algebra of sets

    NASA Astrophysics Data System (ADS)

    Kazanskiy, Alexandr; Kochetkov, Ivan

    2018-03-01

    The article deals with the construction of discrete mathematical models for solving applied problems arising in the operation of building structures. Security issues in modern high-rise buildings are extremely serious and relevant, and interest in them will only increase. The territory of a building is divided into zones that must be kept under observation. Zones can overlap and have different priorities. Such situations can be described using formulas of the algebra of sets. The formulas can be programmed, which makes it possible to work with them in computer models.
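
    A minimal sketch of the idea, with hypothetical zones on a discretized floor plate: once zones are sets of cells, overlap, coverage and blind spots are plain set-algebra formulas that a program can evaluate.

```python
# Minimal sketch: observation zones as sets of discrete floor cells, so overlap,
# coverage gaps and priorities reduce to Boolean formulas over sets.
floor = {(x, y) for x in range(10) for y in range(10)}      # discretized territory

zone_a = {(x, y) for (x, y) in floor if x < 6}              # camera A, priority 1 (assumed)
zone_b = {(x, y) for (x, y) in floor if x > 3 and y > 2}    # camera B, priority 2 (assumed)

overlap = zone_a & zone_b             # cells watched twice (intersection)
covered = zone_a | zone_b             # cells watched at all (union)
blind = floor - covered               # cells no zone observes (complement)
only_high_priority = zone_b - zone_a  # cells only the priority-2 zone sees

print(len(overlap), len(blind), len(only_high_priority))
```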

  9. Comparison of integrated clustering methods for accurate and stable prediction of building energy consumption data

    DOE PAGES

    Hsu, David

    2015-09-27

    Clustering methods are often used to model energy consumption for two reasons. First, clustering is often used to process data and to improve the predictive accuracy of subsequent energy models. Second, stable clusters that are reproducible with respect to non-essential changes can be used to group, target, and interpret observed subjects. However, it is well known that clustering methods are highly sensitive to the choice of algorithms and variables. This can lead to misleading assessments of predictive accuracy and misinterpretation of clusters in policymaking. This paper therefore introduces two methods to the modeling of energy consumption in buildings: clusterwise regression, also known as latent class regression, which integrates clustering and regression simultaneously; and cluster validation methods to measure stability. Using a large dataset of multifamily buildings in New York City, clusterwise regression is compared to common two-stage algorithms that use K-means and model-based clustering with linear regression. Predictive accuracy is evaluated using 20-fold cross validation, and the stability of the perturbed clusters is measured using the Jaccard coefficient. These results show that there seems to be an inherent tradeoff between prediction accuracy and cluster stability. The paper concludes by discussing which clustering methods may be appropriate for different analytical purposes.
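
    For orientation, the sketch below implements the two-stage baseline (K-means, then a linear regression per cluster) on synthetic data; clusterwise regression itself estimates memberships and regression coefficients jointly and is not shown. All data and settings are illustrative.

```python
# Minimal sketch of the two-stage baseline: cluster buildings on their
# attributes, then fit a separate linear regression within each cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))                 # building attributes (hypothetical)
y = X @ [2.0, -1.0, 0.5, 0.0] + rng.normal(scale=0.5, size=500)  # energy use

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
models = {c: LinearRegression().fit(X[labels == c], y[labels == c])
          for c in np.unique(labels)}

# Predict each building's energy use with its own cluster's regression:
y_hat = np.array([models[c].predict(x.reshape(1, -1))[0] for c, x in zip(labels, X)])
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

    Stability could then be checked in the spirit of the paper by re-clustering perturbed (e.g., bootstrap-resampled) data and computing the Jaccard coefficient between original and perturbed cluster memberships.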

  10. Experiments in concept modeling for radiographic image reports.

    PubMed Central

    Bell, D S; Pattison-Gordon, E; Greenes, R A

    1994-01-01

    OBJECTIVE: Development of methods for building concept models to support structured data entry and image retrieval in chest radiography. DESIGN: An organizing model for chest-radiographic reporting was built by analyzing manually a set of natural-language chest-radiograph reports. During model building, clinician-informaticians judged alternative conceptual structures according to four criteria: content of clinically relevant detail, provision for semantic constraints, provision for canonical forms, and simplicity. The organizing model was applied in representing three sample reports in their entirety. To explore the potential for automatic model discovery, the representation of one sample report was compared with the noun phrases derived from the same report by the CLARIT natural-language processing system. RESULTS: The organizing model for chest-radiographic reporting consists of 62 concept types and 17 relations, arranged in an inheritance network. The broadest types in the model include finding, anatomic locus, procedure, attribute, and status. Diagnoses are modeled as a subtype of finding. Representing three sample reports in their entirety added 79 narrower concept types. Some CLARIT noun phrases suggested valid associations among subtypes of finding, status, and anatomic locus. CONCLUSIONS: A manual modeling process utilizing explicitly stated criteria for making modeling decisions produced an organizing model that showed consistency in early testing. A combination of top-down and bottom-up modeling was required. Natural-language processing may inform model building, but algorithms that would replace manual modeling were not discovered. Further progress in modeling will require methods for objective model evaluation and tools for formalizing the model-building process. PMID:7719807

  11. Fine reservoir structure modeling based upon 3D visualized stratigraphic correlation between horizontal wells: methodology and its application

    NASA Astrophysics Data System (ADS)

    Chenghua, Ou; Chaochun, Li; Siyuan, Huang; Sheng, James J.; Yuan, Xu

    2017-12-01

    As the platform-based horizontal well production mode has been widely applied in the petroleum industry, building a reliable fine reservoir structure model using horizontal well stratigraphic correlation has become very important. Horizontal wells usually extend between the upper and bottom boundaries of the target formation, with limited penetration points. Using these limited penetration points for well deviation correction means the formation depth information obtained is not accurate, which makes it hard to build a fine structure model. To solve this problem, a method of fine reservoir structure modeling based on 3D visualized stratigraphic correlation among horizontal wells is proposed. This method increases the accuracy of the estimated depths of the penetration points and can also effectively predict the top and bottom interfaces in the horizontally penetrated section. Moreover, the method greatly increases not only the number of depth data points available but also their accuracy, which achieves the goal of building a reliable fine reservoir structure model from the stratigraphic correlation among horizontal wells. Using this method, four 3D fine structure layer models have been successfully built for a shale gas field with a platform-based horizontal well production mode. The shale gas field is located east of the Sichuan Basin, China; the successful application of the method has proven its feasibility and reliability.

  12. Software-engineering challenges of building and deploying reusable problem solvers.

    PubMed

    O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A

    2009-11-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.

  13. Software-engineering challenges of building and deploying reusable problem solvers

    PubMed Central

    O’CONNOR, MARTIN J.; NYULAS, CSONGOR; TU, SAMSON; BUCKERIDGE, DAVID L.; OKHMATOVSKAIA, ANNA; MUSEN, MARK A.

    2012-01-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task–method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach. PMID:23565031

  14. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.

  15. Developing an OD-Intervention Metric System with the Use of Applied Theory-Building Methodology: A Work/Life-Intervention Example

    ERIC Educational Resources Information Center

    Morris, Michael Lane; Storberg-Walker, Julia; McMillan, Heather S.

    2009-01-01

    This article presents a new model, generated through applied theory-building research methods, that helps human resource development (HRD) practitioners evaluate the return on investment (ROI) of organization development (OD) interventions. This model, called organization development human-capital accounting system (ODHCAS), identifies…

  16. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.
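
    The grammar-driven step can be sketched as label-triggered production rules that emit parametrized shapes. The rules, labels and parameters below are illustrative placeholders, not the paper's grammar.

```python
# Minimal sketch: each segmented element label triggers a production rule that
# emits a parametrized 3D shape for the reconstructed model.
def make_box(origin, size):
    return {"type": "box", "origin": origin, "size": size}

def make_gable_roof(origin, size, pitch):
    return {"type": "gable_roof", "origin": origin, "size": size, "pitch": pitch}

RULES = {
    "wall": lambda seg: make_box(seg["origin"], seg["size"]),
    "roof": lambda seg: make_gable_roof(seg["origin"], seg["size"], pitch=35.0),
}

def reconstruct(segments):
    """Apply the production rule matching each segment's label."""
    return [RULES[seg["label"]](seg) for seg in segments if seg["label"] in RULES]

# Hypothetical segments from a laser-scanned building:
segments = [{"label": "wall", "origin": (0, 0, 0), "size": (10, 8, 3)},
            {"label": "roof", "origin": (0, 0, 3), "size": (10, 8, 2)}]
print(reconstruct(segments))
```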

  17. NLOS Correction/Exclusion for GNSS Measurement Using RAIM and City Building Models.

    PubMed

    Hsu, Li-Ta; Gu, Yanlei; Kamijo, Shunsuke

    2015-07-17

    Currently, global navigation satellite system (GNSS) receivers can provide accurate and reliable positioning service in open-field areas. However, their performance in the downtown areas of cities is still affected by multipath and non-line-of-sight (NLOS) receptions. This paper proposes a new positioning method that uses 3D building models and a receiver autonomous integrity monitoring (RAIM) satellite selection method to achieve satisfactory positioning performance in urban areas. The 3D building model uses a ray-tracing technique to simulate the line-of-sight (LOS) and NLOS signal travel distance, known as the pseudorange, between the satellite and receiver. The proposed RAIM fault detection and exclusion (FDE) compares the similarity between the raw pseudorange measurement and the simulated pseudorange. The measurement of a satellite is excluded if the simulated and raw pseudoranges are inconsistent. Because the ray-tracing technique assumes a single reflection, an inconsistent case indicates a double or multiply reflected NLOS signal. According to the experimental results, the RAIM satellite selection technique reduces the positioning solutions with large errors (solutions estimated on the wrong side of the road) by about 8.4% and 36.2% for the 3D building model method in the middle and deep urban canyon environments, respectively.
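
    The consistency test at the heart of the FDE step can be sketched as follows; the 15 m threshold and the pseudorange values are hypothetical, and a real implementation would work with measured and ray-traced pseudoranges per epoch.

```python
# Minimal sketch: exclude a satellite when its raw pseudorange disagrees with
# the ray-traced simulation, which under the single-reflection assumption flags
# multiply reflected NLOS signals.
def fault_detection_exclusion(raw, simulated, threshold_m=15.0):
    """Keep satellites whose raw and simulated pseudoranges are consistent."""
    kept, excluded = [], []
    for sat_id in raw:
        if abs(raw[sat_id] - simulated[sat_id]) <= threshold_m:
            kept.append(sat_id)
        else:
            excluded.append(sat_id)   # likely double/multiple-reflection NLOS
    return kept, excluded

# Hypothetical pseudoranges in meters:
raw = {"G05": 21_512_340.2, "G12": 22_104_887.9, "G25": 20_988_120.5}
sim = {"G05": 21_512_332.8, "G12": 22_104_960.3, "G25": 20_988_118.1}
print(fault_detection_exclusion(raw, sim))   # G12 is excluded
```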

  18. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  19. Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction

    NASA Astrophysics Data System (ADS)

    Yu, Qian; Helmholz, Petra; Belton, David

    2016-06-01

    In recent years, 3D city models have been in high demand by many public and private organisations, and their steadily growing quality and quantity are further increasing that demand. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation is performed in all three dimensions in terms of the completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models in order to describe the quality of the automatic reconstruction, and their validity is also assessed from the evaluation point of view.

  20. The Advantages of Parametric Modeling for the Reconstruction of Historic Buildings: The Example of the War-Destroyed Church of St. Catherine (Katharinenkirche) in Nuremberg

    NASA Astrophysics Data System (ADS)

    Ludwig, M.; Herbst, G.; Rieke-Zapp, D.; Rosenbauer, R.; Rutishauser, S.; Zellweger, A.

    2013-02-01

    Consecrated in 1297 as the church of St. Catherine's monastery, which had been founded four years earlier, the Gothic Church of St. Catherine was largely destroyed in a devastating bombing raid on January 2nd, 1945. To counteract the process of disintegration, the departments of geo-information and the lower monument protection authority of the City of Nuremberg decided to commission a three-dimensional building model of the Church of St. Catherine. A heterogeneous set of data was used to prepare a parametric architectural model. In effect, the modeling of historic buildings can profit from the so-called BIM method (Building Information Modeling), as the necessary structuring of the basic data renders it into very sustainable information. The resulting model is perfectly suited to deliver a vivid impression of the interior and exterior of this former mendicant order's church to present-day observers.

  1. Building energy modeling for green architecture and intelligent dashboard applications

    NASA Astrophysics Data System (ADS)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied a passive technique, the roof solar chimney, for architecturally reducing the cooling load in homes. Three models of the chimney were created: a zonal building energy model, a computational fluid dynamics model, and a numerical analytic model. The study estimated the error introduced into the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough for it to be used reliably in building simulation. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high- and low-efficiency constructions and three ventilation strategies. The results showed that in four US climates, the roof solar chimney yields significant cooling load energy savings of up to 90%. After developing this new method for the small-scale representation of a passive architecture technique in BEM, the study expanded its scope to address a fundamental issue in modeling: representing the uncertainty introduced by occupant behavior and improving that behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy-efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Algorithms were then developed for integration into the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the representation of unpredictable occupancy patterns on model results. Combined, these studies inform modelers and researchers on frameworks for simulating holistically designed architecture and improving the interaction between models and building occupants, in residential and commercial settings.

  2. Market-oriented Programming Using Small-world Networks for Controlling Building Environments

    NASA Astrophysics Data System (ADS)

    Shigei, Noritaka; Miyajima, Hiromi; Osako, Tsukasa

    The market model, one of the models of economic activity, can be formulated as an agent system, and its application to resource allocation problems has been studied. For the air-conditioning control of buildings, one such resource allocation problem, an effective agent-based method using auctions has been proposed as an alternative to traditional PID control. This method has been regarded as a form of decentralized control; however, its decentralization is not complete, and its performance is insufficient. In this paper, we first propose a perfectly decentralized agent model and show its performance. Second, to improve the model, we propose an agent model based on the small-world model. The effectiveness of the proposed model is shown by simulation.

  3. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction of construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. This paper therefore addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that investigates the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. First, design waste causes were analyzed. Second, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out-waste strategies in building projects. The knowledge provided by the model could help project stakeholders better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  4. The implementation of assessment model based on character building to improve students’ discipline and achievement

    NASA Astrophysics Data System (ADS)

    Rusijono; Khotimah, K.

    2018-01-01

    The purpose of this research was to investigate the effect of implementing an assessment model based on character building on students' discipline and achievement. The assessment model based on character building includes three components: student behaviour, effort, and achievement. The model was implemented in the philosophy of science and educational assessment courses in the Graduate Program of the Educational Technology Department, Faculty of Education, Universitas Negeri Surabaya. The research used a control-group pre-test and post-test design. The data collection methods were observation and testing: observation was used to collect data on student discipline in the instructional process, while tests were used to collect data on student achievement. A t-test was applied in the data analysis. The results showed that the assessment model based on character building improved students' discipline and achievement.

  5. A new method of building footprints detection using airborne laser scanning data and multispectral image

    NASA Astrophysics Data System (ADS)

    Luo, Yiping; Jiang, Ting; Gao, Shengli; Wang, Xin

    2010-10-01

    This paper presents a new approach for detecting building footprints by combining a registered aerial image with multispectral bands and airborne laser scanning data obtained synchronously by a Leica Geosystems ALS40 and an Applanix DACS-301 on the same platform. A two-step method for building detection is presented, consisting of selecting 'building' candidate points and then classifying those candidate points. A digital surface model (DSM) derived from last-pulse laser scanning data was first filtered, and the laser points were classified into the classes 'ground' and 'building or tree' using a mathematical morphological filter. The 'ground' points were then resampled into a digital elevation model (DEM), and a normalized DSM (nDSM) was generated from the DEM and DSM. Candidate points were selected from the 'building or tree' points by height and area thresholds in the nDSM. The candidate points were further classified into building points and tree points using the support vector machine (SVM) classification method. Two classification tests were carried out, using features from the laser scanning data alone and associated features from both input data sources. The features included height, height finite difference, RGB band values, and so on. The RGB values of the points were acquired by matching the laser scanning data and the image using collinearity equations. The features of the training points served as input data for the SVM classifier, and cross validation was used to select the best classification parameters. The decision function was constructed from the classification parameters, and the class of each candidate point was determined by this function. The results showed that the associated features from both input data sources were superior to features from the laser scanning data alone; an accuracy of more than 90% was achieved for buildings.
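
    A hedged sketch of the pipeline's two core operations, nDSM generation and cross-validated SVM classification, is given below with synthetic stand-ins for the DSM, DEM and training features; the 2.5 m height threshold and parameter grid are assumptions.

```python
# Minimal sketch: derive an nDSM by differencing surface and ground models,
# threshold it to select candidates, then classify candidates with an SVM whose
# parameters are chosen by cross validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
dsm = rng.uniform(100, 130, size=(50, 50))    # synthetic surface model
dem = rng.uniform(100, 102, size=(50, 50))    # synthetic ground model
ndsm = dsm - dem
candidate_mask = ndsm > 2.5                   # 'building or tree' height threshold (assumed)
print(int(candidate_mask.sum()), "candidate cells")

# Features per candidate point: [height, height finite difference, R, G, B].
X_train = rng.normal(size=(200, 5))
y_train = (X_train[:, 1] < 0).astype(int)     # placeholder labels: 1=building, 0=tree

grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": ["scale", 0.1]}, cv=5)
grid.fit(X_train, y_train)                    # cross validation selects parameters
print(grid.best_params_)
```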

  6. Improving Semantic Updating Method on 3d City Models Using Hybrid Semantic-Geometric 3d Segmentation Technique

    NASA Astrophysics Data System (ADS)

    Sharkawi, K.-H.; Abdul-Rahman, A.

    2013-09-01

    Entities in cities and urban areas, such as building structures, are becoming more complex as modern human civilization continues to evolve. The ability to plan and manage every territory, especially urban areas, is very important to every government in the world. Planning and managing cities and urban areas based on printed maps and 2D data is becoming insufficient and inefficient for coping with the complexity of new developments in big cities. The emergence of 3D city models has boosted the efficiency of analysing and managing urban areas, as 3D data are proven to represent real-world objects more accurately. They have since been adopted as the new trend in building and urban management and planning applications. Nowadays, many countries around the world have been generating virtual 3D representations of their major cities. The growing interest in improving the usability of 3D city models has resulted in the development of various analysis tools based on them. Today, 3D city models are generated for various purposes, such as tourism, location-based services, disaster management and urban planning. Meanwhile, modelling 3D objects is becoming easier with the emergence of user-friendly 3D modelling tools on the market. Generating 3D buildings with high accuracy has also become easier with the availability of airborne Lidar and terrestrial laser scanning equipment. The availability of and accessibility to this technology make it more sensible to analyse buildings in urban areas using 3D data, as they accurately represent real-world objects. The Open Geospatial Consortium (OGC) has accepted the CityGML specification as one of the international standards for representing and exchanging spatial data, making it easier to visualize, store and manage 3D city model data efficiently. CityGML is able to represent the semantics, geometry, topology and appearance of 3D city models in five well-defined Levels of Detail (LoD), namely LoD0 to LoD4. The accuracy and structural complexity of the 3D objects increase with the LoD level, where LoD0 is the simplest (2.5D; Digital Terrain Model (DTM) + building or roof print) while LoD4 is the most complex (architectural details with interior structures). Semantic information is one of the main components of CityGML and 3D city models, and provides important information for any analysis. However, more often than not, semantic information is not available for a 3D city model due to unstandardized modelling processes. One example is where a building is generated as a single object (without specific feature layers such as roof, ground floor, level 1, level 2, block A, block B, etc.). This research attempts to develop a method to improve the semantic data updating process by segmenting the 3D building into simpler parts, which makes it easier for users to select and update the semantic information. The methodology is implemented for 3D buildings in LoD2, where the buildings are generated without architectural details but with distinct roof structures. This paper also introduces a hybrid semantic-geometric 3D segmentation method that performs hierarchical segmentation of a 3D building based on its semantic value and surface characteristics, fitted by one of the predefined primitives.
For future work, the segmentation method will be implemented as part of a change detection module that can detect changes to the 3D buildings, store and retrieve semantic information of the changed structures, automatically update the 3D models and visualize the results in a user-friendly graphical user interface (GUI).

  7. Parametric Modelling (bim) for the Documentation of Vernacular Construction Methods: a Bim Model for the Commissariat Building, Ottawa, Canada

    NASA Astrophysics Data System (ADS)

    Fai, S.; Filippi, M.; Paliaga, S.

    2013-07-01

    Whether a house of worship or a simple farmhouse, the fabrication of a building reveals both the unspoken cultural aspirations of the builder and the inevitable exigencies of the construction process. In other words, why buildings are made is intimately and inevitably associated with how buildings are made. Nowhere is this more evident than in vernacular architecture. At the Carleton Immersive Media Studio (CIMS), we are concerned that the de-population of Canada's rural areas, the paucity of specialized tradespersons, and the increasing complexity of building codes threaten the sustainability of this invaluable cultural resource. For current and future generations, the quantitative and qualitative values of traditional methods of construction are essential for an inclusive cultural memory. More practically, and equally pressing, an operational knowledge of these technologies is essential for the conservation of our built heritage. To address these concerns, CIMS has launched a number of research initiatives over the past five years that explore novel protocols for the documentation and dissemination of knowledge related to traditional methods of construction. Our current project, Cultural Diversity and Material Imagination in Canadian Architecture (CDMICA), made possible through funding from Canada's Social Sciences and Humanities Research Council (SSHRC), explores the potential of building information modelling (BIM) within the context of a web-based environment. In this paper, we discuss our work to date on the development of a web-based library of BIM details referenced to "typical" assemblies culled from 19C and early 20C construction manuals. The parametric potential of these "typical" details is further refined by evidence from the documentation of "specific" details studied during comprehensive surveys of extant heritage buildings. Here, we consider a BIM of the roof truss assembly of one of the oldest buildings in Canada's national capital, the Commissariat Building, current home of the Bytown Museum, as a case study within the CDMICA project.

  8. Relative significance of heat transfer processes to quantify tradeoffs between complexity and accuracy of energy simulations with a building energy use patterns classification

    NASA Astrophysics Data System (ADS)

    Heidarinejad, Mohammad

    This dissertation develops rapid and accurate building energy simulations based on a building classification that identifies and focuses modeling efforts on the most significant heat transfer processes. The building classification identifies energy use patterns and their contributing parameters for a portfolio of buildings. The dissertation hypothesis is "Building classification can provide minimal required inputs for rapid and accurate energy simulations for a large number of buildings". The critical literature review indicated a lack of studies that (1) consider a synoptic point of view rather than the case-study approach, (2) analyze the influence of different granularities of energy use, (3) identify key variables based on the heat transfer processes, and (4) automate the procedure to quantify model complexity against accuracy. Therefore, three dissertation objectives are designed to test the dissertation hypothesis: (1) develop different classes of buildings based on their energy use patterns, (2) develop different building energy simulation approaches for the identified classes of buildings to quantify tradeoffs between model accuracy and complexity, and (3) demonstrate the building simulation approaches for case studies. Penn State's and Harvard's campus buildings as well as high-performance LEED NC office buildings are the test beds for this study. The campus buildings include detailed chilled water, electricity, and steam data, enabling buildings to be classified as externally-load, internally-load, or mixed-load dominated. The energy use of internally-load dominated buildings is primarily a function of the internal loads and their schedules. Externally-load dominated buildings tend to have an energy use pattern that is a function of building construction materials and outdoor weather conditions. However, most of the commercial medium-sized office buildings have a mixed-load pattern, meaning the HVAC system and operation schedule dictate the indoor condition regardless of the contribution of internal and external loads. To deploy the methodology to another portfolio of buildings, simulated LEED NC office buildings are selected. The advantage of this approach is to isolate energy performance due to inherent building characteristics and location, rather than operational and maintenance factors that can contribute to significant variation in building energy use. A framework for detailed building energy databases with annual energy end-uses is developed to select variables and omit outliers. The results show that the high-performance office buildings are internally-load dominated, with three distinct clusters of low-intensity, medium-intensity, and high-intensity energy use patterns among the reviewed office buildings. Low-intensity cluster buildings benefit from small building areas, while the medium- and high-intensity clusters have a similar range of floor areas but different energy use intensities. Half of the energy use in the low-intensity buildings is associated with internal loads, such as lighting and plug loads, indicating that there are opportunities to save energy by using lighting or plug load management systems. A comparison between the frameworks developed for the campus buildings and the LEED NC office buildings indicates that the two frameworks are complementary.
The availability of information yielded two different procedures, suggesting that future studies of building portfolios, such as city benchmarking and disclosure ordinances, should collect and disclose the minimal required inputs suggested by this study, with at least monthly energy consumption granularity. This dissertation developed automated methods using the OpenStudio API (Application Programming Interface) to create energy models based on the building class. ASHRAE Guideline 14 defines well-accepted criteria to measure the accuracy of energy simulations; however, there is no well-accepted methodology to quantify model complexity without the influence of the energy modeler's judgment. This study developed a novel method using two weighting factors, based on (1) computational time and (2) ease of on-site data collection, to measure the complexity of the energy models. This dissertation therefore enables measurement of both model complexity and accuracy, as well as assessment of the inherent tradeoffs between energy simulation model complexity and accuracy. The results of this methodology suggest that for most of the internal load contributors, such as operation schedules, on-site data collection adds more complexity to the model than computational time does. Overall, this study provided specific data on the tradeoffs between accuracy and model complexity that point to critical inputs for different building classes, rather than an increase in the volume and detail of model inputs as current research and consulting practice indicates. (Abstract shortened by UMI.)

  9. Two-Graph Building Interior Representation for Emergency Response Applications

    NASA Astrophysics Data System (ADS)

    Boguslawski, P.; Mahdjoubi, L.; Zverovich, V.; Fadli, F.

    2016-06-01

    Nowadays, in a rapidly developing urban environment with bigger and higher public buildings, disasters causing emergency situations and casualties are unavoidable. Preparedness and quick response are crucial to saving human lives. Available information about an emergency scene, such as the building structure, helps in decision making and in organizing rescue operations. Models supporting decision-making should be available in real, or near-real, time. Thus, good-quality models that allow the implementation of automated methods are highly desirable. This paper presents details of a recently developed method for the automated generation of variable-density navigable networks in a 3D indoor environment, including a full 3D topological model, which may be used not only for standard navigation but also for finding safe routes and simulating hazards and phenomena associated with disasters, such as fire spread and heat transfer.
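
    As a minimal illustration of routing on such a navigable network, the sketch below builds a small room/corridor graph with networkx and penalizes hazard-affected links so that the weighted shortest path becomes the safest route. The topology, lengths and penalty are hypothetical.

```python
# Minimal sketch: rooms and corridors as graph nodes, connections as weighted
# edges; hazard-affected links receive a large penalty before routing.
import networkx as nx

g = nx.Graph()
g.add_edge("room_101", "corridor_1", length=5.0)
g.add_edge("room_102", "corridor_1", length=4.0)
g.add_edge("corridor_1", "stairs_A", length=12.0)
g.add_edge("corridor_1", "stairs_B", length=20.0)
g.add_edge("stairs_A", "exit", length=8.0)
g.add_edge("stairs_B", "exit", length=8.0)

hazard = {("corridor_1", "stairs_A")}          # e.g. fire spread blocks this link
for u, v, data in g.edges(data=True):
    penalty = 1_000.0 if (u, v) in hazard or (v, u) in hazard else 0.0
    data["cost"] = data["length"] + penalty

print(nx.shortest_path(g, "room_101", "exit", weight="cost"))
# -> ['room_101', 'corridor_1', 'stairs_B', 'exit']
```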

  10. Analysis Methods for Post Occupancy Evaluation of Energy-Use in High Performance Buildings Using Short-Term Monitoring

    NASA Astrophysics Data System (ADS)

    Singh, Vipul

    2011-12-01

    The green building movement has been an effective catalyst in reducing the energy demands of buildings, and a large number of 'green' certified buildings have been in operation for several years. Determining whether these buildings actually perform as intended and, if not, identifying the specific causes of the discrepancy falls into the general realm of post-occupancy evaluation (POE). POE involves evaluating building performance in terms of energy use, indoor environmental quality, acoustics and water use; the first aspect, energy use, is addressed in this thesis. Normally, a full year or more of energy-use and weather data is required to determine the actual post-occupancy energy use of a building. In many cases, either measured building performance data are not available, or the time and cost implications make it infeasible to monitor the building for a whole year. Knowledge of the minimum amount of measured data needed to accurately capture the behavior of the building over the entire year can therefore be immensely beneficial. This research identifies simple modeling techniques to determine the best time of year to begin in-situ monitoring of building energy use and the least amount of data required for generating acceptable long-term predictions. Four analysis procedures are studied. The short-term monitoring for long-term prediction (SMLP) approach and the dry-bulb temperature analysis (DBTA) approach determine the best time and duration of the year for in-situ monitoring based only on the ambient temperature data of the location. Multivariate change-point (MCP) modeling uses simulated or monitored data to determine the best monitoring period of the year; it is also used to validate the SMLP and DBTA approaches. Hybrid inverse modeling method 1 predicts energy use by combining a short dataset of monitored internal loads with a year of utility bills, and hybrid inverse method 2 predicts long-term building performance using utility bills only. The results show that often less than three to four months of monitored data is adequate for estimating annual building energy use, provided that the monitoring is initiated at the right time and that seasonal as well as daily variations are adequately captured by the short dataset. The predictive accuracy of the short datasets is found to be strongly influenced by how close the dataset's mean temperature is to the annual average temperature. The analysis methods studied should be very useful for energy professionals involved in POE.

  11. Jeddah Historical Building Information Modelling "JHBIM" - Object Library

    NASA Astrophysics Data System (ADS)

    Baik, A.; Alitany, A.; Boehm, J.; Robson, S.

    2014-05-01

    Building Information Modelling (BIM) has been used at several heritage sites worldwide for conserving, documenting and managing buildings and for creating full engineering drawings and information. However, one of the most serious issues facing experts who wish to use Historical Building Information Modelling (HBIM) is creating the complicated architectural elements of these historical buildings. In fact, many of these outstanding architectural elements were designed and created on site to fit their exact locations. Experts in Old Jeddah face the same issue in applying the BIM method to Old Jeddah's historical buildings. The Saudi Arabian city has a long history and contains a large number of historic houses and buildings built since the 16th century. Furthermore, BIM models of historical buildings in Old Jeddah have always taken a long time to produce, owing to the uniqueness of Hijazi architectural elements and the absence of a library of such elements. This paper focuses on building a Hijazi architectural element library based on laser scanner and image survey data. This solution reduces the time needed to complete an HBIM model and offers an in-depth and rich digital architectural element library for use in any heritage project in the Al-Balad district of Jeddah City.

  12. Developing Performance Cost Index Targets for ASHRAE Standard 90.1 Appendix G – Performance Rating Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.

    2016-02-16

    Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October 2016. First, it allows Appendix G to be used as a third path for compliance with the standard, in addition to rating beyond-code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond-code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than the targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance, set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond-code program can use this methodology and merely set the appropriate PCI target for its needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond-code programs.
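
    For reference, the PCI compared against those targets is, in essence, a ratio of proposed to baseline building performance expressed in energy cost terms (stated here as a sketch of the Appendix G formulation, not its normative text):

```latex
% Sketch (assumption, not the standard's normative text): the PCI is the
% cost-weighted ratio of proposed to baseline building performance.
\[
  \mathrm{PCI} \;=\; \frac{\text{Proposed Building Performance (energy cost)}}
                          {\text{Baseline Building Performance (energy cost)}},
  \qquad \mathrm{PCI} \le \mathrm{PCI}_{\text{target}}(\text{building type, climate zone})
\]
```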

  13. Developing Performance Cost Index Targets for ASHRAE Standard 90.1 Appendix G – Performance Rating Method - Rev.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.

    2016-03-01

    Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October 2016. First, it allows Appendix G to be used as a third path for compliance with the standard, in addition to rating beyond-code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond-code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than the targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance, set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond-code program can use this methodology and merely set the appropriate PCI target for its needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond-code programs.

  14. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    PubMed Central

    Jung, Jaewook; Sohn, Gunho; Bang, Kiin; Wichmann, Andreas; Armenakis, Costas; Kada, Martin

    2016-01-01

    A city is a dynamic entity whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response and autonomous navigation. A concept of continuous city modeling is to progressively reconstruct city models by accommodating changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using context-based geometric hashing (CGH) to align a single image with existing 3D building models. The model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measurement and matching; and (3) estimation of the exterior orientation parameters (EOPs) of the single image. For feature extraction, we propose two types of matching cues: edged corner features, representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then determined by maximizing a matching cost encoding the contextual similarity between matching candidates. The final matched corners are used to adjust the EOPs of the single airborne image by the least-squares method based on collinearity equations. The results show that acceptable accuracy of the EOPs of a single image is achievable using the proposed registration approach, as an alternative to a labor-intensive manual registration process. PMID:27338410
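
    The sketch below shows plain 2D geometric hashing, the mechanism underlying the paper's context-based variant (which additionally encodes edge and roof-context cues): model points are stored relative to every ordered basis pair, and scene points vote for basis hypotheses. The quantization step and data are illustrative.

```python
# Minimal sketch of 2D geometric hashing: store model corner points relative to
# every ordered basis pair, then let scene points vote for the best basis.
from collections import defaultdict
import numpy as np

def to_basis(points, p0, p1):
    """Express points in the frame defined by the ordered basis pair (p0, p1)."""
    u = p1 - p0
    v = np.array([-u[1], u[0]])                       # perpendicular axis
    m = np.column_stack([u, v])                       # p - p0 = m @ coords
    return (points - p0) @ np.linalg.inv(m).T

def build_table(model_points, step=0.25):
    table = defaultdict(list)
    pts = np.asarray(model_points, float)
    for i in range(len(pts)):
        for j in range(len(pts)):
            if i == j:
                continue
            for q in to_basis(pts, pts[i], pts[j]):
                table[tuple(np.round(q / step))].append((i, j))
    return table

def vote(table, scene_points, step=0.25):
    pts = np.asarray(scene_points, float)
    votes = defaultdict(int)
    for q in to_basis(pts, pts[0], pts[1]):           # one scene basis hypothesis
        for basis in table.get(tuple(np.round(q / step)), []):
            votes[basis] += 1
    return max(votes.items(), key=lambda kv: kv[1]) if votes else None

model = [np.array(p, float) for p in [(0, 0), (4, 0), (4, 3), (0, 3)]]
print(vote(build_table(model), model))                # self-match sanity check
```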

  15. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design.

    PubMed

    Hiyama, Kyosuke

    2015-01-01

    Applying data mining techniques to a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has previously proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered to determine the ideal default values.
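
    A minimal sketch of the underlying idea, assuming a hypothetical table of past BIM projects with columns such as space_depth_m, lighting_control and ideal_wwr (none of these names come from the paper): group projects by design condition and take a robust central value as the default.

    ```python
    import pandas as pd

    # Hypothetical records mined from past BIM projects; columns are invented.
    projects = pd.DataFrame({
        "space_depth_m":    [4, 4, 8, 8, 12, 12],
        "lighting_control": [True, False, True, False, True, False],
        "ideal_wwr":        [0.30, 0.35, 0.45, 0.55, 0.50, 0.65],
    })

    # Default value per design condition: the median is robust to outliers.
    defaults = (projects
                .groupby(["space_depth_m", "lighting_control"])["ideal_wwr"]
                .median())
    print(defaults)

    # Spread within each condition indicates how stable the default would be.
    print(projects.groupby("space_depth_m")["ideal_wwr"].std())
    ```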

  16. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design

    PubMed Central

    2015-01-01

    Applying data mining techniques to a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has previously proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered to determine the ideal default values. PMID:26090512

  17. Algae façade as green building method: application of algae as a method to meet the green building regulation

    NASA Astrophysics Data System (ADS)

    Poerbo, Heru W.; Martokusumo, Widjaja; Donny Koerniawan, M.; Aulia Ardiani, Nissa; Krisanti, Susan

    2017-12-01

    The Local Government of Bandung city has stipulated a Green Building regulation through Peraturan Walikota Number 1023/2016. Signed by the mayor in October 2016, it made Bandung the first city in Indonesia to make green building a mandatory requirement in the building permit (IMB) process. The Green Building regulation is intended to achieve more efficient consumption of energy and water, improved indoor air quality, management of liquid and solid waste, etc. This objective is attained through various design methods in the building envelope, ventilation and air-conditioning system, lighting, indoor transportation system, and electrical system. To minimize the energy consumption of buildings that have large openings, sun-shading devices are often utilized together with low-E glass panes. For buildings in a hot, humid tropical climate, this method reduces indoor air temperature and thus requires less energy for air conditioning. Indoor air quality is often assessed by monitoring carbon dioxide levels. The application of algae as part of the building façade system has recently been introduced as a replacement for large glass surfaces in the building façade. Algae are not yet included in the green building regulation because the approach is relatively new. The research investigates, with the help of a modelling process and extensive literature, how effective the implementation of algae in a building façade is at reducing energy consumption and improving indoor air quality. This paper is based on the design of ITB Innovation Park, an ongoing architectural design-based research project on how an algae-integrated building façade affects energy consumption.

  18. Using Whole-House Field Tests to Empirically Derive Moisture Buffering Model Inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, J.; Winkler, J.; Christensen, D.

    2014-08-01

    Building energy simulations can be used to predict a building's interior conditions, along with the energy use associated with keeping these conditions comfortable. These models simulate the loads on the building (e.g., internal gains, envelope heat transfer), determine the operation of the space conditioning equipment, and then calculate the building's temperature and humidity throughout the year. The indoor temperature and humidity are affected not only by the loads and the space conditioning equipment, but also by the capacitance of the building materials, which buffer changes in temperature and humidity. This research developed an empirical method to extract whole-house model inputs for use with a more accurate moisture capacitance model (the effective moisture penetration depth, or EMPD, model). The experimental approach was to subject the materials in the house to a square-wave relative humidity profile, measure all of the moisture transfer terms (e.g., infiltration, air conditioner condensate) and calculate the only unmeasured term: the moisture absorption into the materials. After validating the method with laboratory measurements, we performed the tests in a field house. A least-squares fit of an analytical solution to the measured moisture absorption curves was used to determine the three independent model parameters representing the moisture buffering potential of this house and its furnishings. Follow-on tests with realistic latent and sensible loads showed good agreement with the derived parameters, especially compared to the commonly used effective capacitance approach. These results show that the EMPD model, once the inputs are known, is an accurate moisture buffering model.
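
    A sketch of the parameter-extraction step, assuming for illustration that the measured absorption curve is fit with a simple first-order step response; the actual EMPD analytical solution and the three-parameter set used in the paper are not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative stand-in for the analytical solution: first-order response
    # of cumulative moisture absorption after a step change in humidity.
    def absorption(t, m_inf, tau):
        return m_inf * (1.0 - np.exp(-t / tau))

    t = np.linspace(0, 48, 25)                       # hours since the RH step
    measured = absorption(t, 1.8, 9.0)               # kg of moisture (synthetic)
    measured += np.random.default_rng(0).normal(0, 0.03, t.size)

    (m_inf, tau), _ = curve_fit(absorption, t, measured, p0=(1.0, 5.0))
    print(f"capacity = {m_inf:.2f} kg, time constant = {tau:.1f} h")
    ```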

  19. Inverse modeling methods for indoor airborne pollutant tracking: literature review and fundamentals.

    PubMed

    Liu, X; Zhai, Z

    2007-12-01

    Reduction in indoor environment quality calls for effective control and improvement measures. Accurate and prompt identification of contaminant sources ensures that they can be quickly removed and contaminated spaces isolated and cleaned. This paper discusses the use of inverse modeling to identify potential indoor pollutant sources with limited pollutant sensor data. The study reviews various inverse modeling methods for advection-dispersion problems and groups them into three major categories: forward, backward, and probability inverse modeling methods. The adjoint probability inverse modeling method is identified as an appropriate model for indoor air pollutant tracking because it can quickly find source location, strength and release time without prior information. The paper introduces the principles of the adjoint probability method and establishes the corresponding adjoint equations for both multi-zone airflow models and computational fluid dynamics (CFD) models. The study proposes a two-stage inverse modeling approach integrating both multi-zone and CFD models, which can provide a rapid estimate of indoor pollution status and history for a whole building. Preliminary case study results indicate that the adjoint probability method is feasible for indoor pollutant inverse modeling. The proposed method can help identify contaminant source characteristics (location and release time) with limited sensor outputs. This will ensure effective and prompt execution of building management strategies and thus achieve a healthy and safe indoor environment. The method can also help design optimal sensor networks.
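
    For intuition, the sketch below identifies a source zone and release time in a toy two-zone model by brute-force forward simulation against one sensor trace. This is the naive baseline the adjoint probability method improves on: adjoint methods avoid enumerating candidates by solving backward equations instead. All numbers are invented.

    ```python
    import numpy as np

    # Toy two-zone well-mixed model: dC/dt = A @ C + s(t), with a unit pulse
    # released in one zone at one time step.
    A = np.array([[-1.0, 0.5],
                  [0.5, -1.0]])     # interzonal airflow / ventilation (1/h)
    dt, steps = 0.1, 100

    def simulate(zone, release_step):
        C, trace = np.zeros(2), []
        for k in range(steps):
            s = np.zeros(2)
            if k == release_step:
                s[zone] = 1.0 / dt  # unit mass released as a short pulse
            C = C + dt * (A @ C + s)
            trace.append(C.copy())
        return np.array(trace)

    observed = simulate(zone=1, release_step=20)[:, 0]   # sensor in zone 0 only

    # Brute force: try every candidate (zone, release time), keep the best fit.
    best = min(((z, k) for z in range(2) for k in range(steps)),
               key=lambda zk: np.sum((simulate(*zk)[:, 0] - observed) ** 2))
    print("identified source:", best)    # -> (1, 20)
    ```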

  20. Multi-type sensor placement and response reconstruction for building structures: Experimental investigations

    NASA Astrophysics Data System (ADS)

    Hu, Rong-Pan; Xu, You-Lin; Zhan, Sheng

    2018-01-01

    Estimation of lateral displacement and acceleration responses is essential to assess the safety and serviceability of high-rise buildings under dynamic loadings, including earthquake excitations. However, the measurement information from the limited number of sensors installed in a building structure is often insufficient for a complete structural performance assessment. An integrated multi-type sensor placement and response reconstruction method has thus been proposed by the authors to tackle this problem. To validate the feasibility and effectiveness of the proposed method, an experimental investigation using a cantilever beam with multi-type sensors is performed and reported in this paper. The experimental setup is first introduced. The finite element (FE) modelling and model updating of the cantilever beam are then performed. The optimal sensor placement for the best response reconstruction is determined by the proposed method based on the updated FE model of the beam. After the sensors are installed on the physical cantilever beam, a number of experiments are carried out. The responses at key locations are reconstructed and compared with the measured ones. The reconstructed responses achieve a good match with the measured ones, demonstrating the feasibility and effectiveness of the proposed method. The proposed method is also examined for the cases of different excitations and an unknown excitation, and the results prove it to be robust and effective. The superiority of the optimized sensor placement scheme is finally demonstrated through comparison with two other sensor placement schemes: an accelerometer-only scheme and a non-optimal sensor placement scheme. The proposed method can be applied to high-rise buildings for seismic performance assessment.
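
    A common way to reconstruct responses at unmeasured locations is sketched here under the assumption of a truncated mode-shape expansion; this is a generic illustration, not the authors' full multi-type formulation.

    ```python
    import numpy as np

    # Mode shapes of a toy 5-DOF structure (columns = retained modes).
    rng = np.random.default_rng(1)
    Phi = rng.normal(size=(5, 2))          # 5 locations, 2 modes

    measured_dofs = [0, 2]                 # sensors installed here
    unmeasured_dofs = [1, 3, 4]            # responses to reconstruct

    q_true = np.array([1.0, -0.5])         # modal coordinates at one instant
    y = Phi @ q_true                       # full (unknown) response
    y_meas = y[measured_dofs]

    # Least-squares estimate of modal coordinates from the measured subset,
    # then expansion back to the unmeasured locations.
    q_hat, *_ = np.linalg.lstsq(Phi[measured_dofs], y_meas, rcond=None)
    y_rec = Phi[unmeasured_dofs] @ q_hat

    print(np.allclose(y_rec, y[unmeasured_dofs]))   # True for this noise-free toy
    ```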

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Jason; Winkler, Jon

    Moisture adsorption and desorption in building materials impact indoor humidity. This effect should be included in building-energy simulations, particularly when humidity is being investigated or controlled. Several models can calculate this moisture-buffering effect, but accurate ones require model inputs that are not always known to the user of the building-energy simulation. This research developed an empirical method to extract whole-house model inputs for the effective moisture penetration depth (EMPD) model. The experimental approach was to subject the materials in the house to a square-wave relative-humidity profile, measure all of the moisture-transfer terms (e.g., infiltration, air-conditioner condensate), and calculate the only unmeasured term: the moisture sorption into the materials. We validated this method with laboratory measurements and then used it to measure the EMPD model inputs of two houses. After deriving these inputs, we measured the humidity of the same houses during tests with realistic latent and sensible loads and demonstrated the accuracy of this approach. Furthermore, these results show that the EMPD model, when given reasonable inputs, is an accurate moisture-buffering model.

  2. Data-Driven Benchmarking of Building Energy Efficiency Utilizing Statistical Frontier Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavousian, A; Rajagopal, R

    2014-01-01

    Frontier methods quantify the energy efficiency of buildings by forming an efficient frontier (best-practice technology) and by comparing all buildings against that frontier. Because energy consumption fluctuates over time, the efficiency scores are stochastic random variables. Existing applications of frontier methods in energy efficiency either treat efficiency scores as deterministic values or estimate their uncertainty by resampling from one set of measurements. The availability of smart meter data (repeated measurements of the energy consumption of buildings) enables using actual data to estimate the uncertainty in efficiency scores. Additionally, existing applications assume a linear form for the efficient frontier; i.e., they assume that the best-practice technology scales up and down proportionally with building characteristics. However, previous research shows that buildings are nonlinear systems. This paper proposes a statistical method called the stochastic energy efficiency frontier (SEEF) to estimate a bias-corrected efficiency score and its confidence intervals from measured data. The paper proposes an algorithm to specify the functional form of the frontier, identify the probability distribution of the efficiency score of each building using measured data, and rank buildings based on their energy efficiency. To illustrate the power of SEEF, this paper presents the results from applying SEEF to a smart meter data set of 307 residential buildings in the United States. SEEF efficiency scores are used to rank individual buildings based on energy efficiency, to compare subpopulations of buildings, and to identify irregular behavior of buildings across different time-of-use periods. SEEF is an improvement over the energy-intensity method (comparing kWh/sq.ft.): whereas SEEF identifies efficient buildings across the entire spectrum of building sizes, the energy-intensity method showed bias toward smaller buildings. The results of this research are expected to help researchers and practitioners compare and rank (i.e., benchmark) buildings more robustly and over a wider range of building types and sizes. Eventually, doing so is expected to result in improved resource allocation in energy-efficiency programs.
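
    A minimal sketch of the frontier idea, assuming a power-law frontier estimated by quantile regression in log space; the SEEF bias correction, distribution identification and bootstrap confidence intervals from the paper are not reproduced, and all data are synthetic.

    ```python
    import numpy as np
    from sklearn.linear_model import QuantileRegressor

    rng = np.random.default_rng(2)
    floor_area = rng.uniform(80, 400, size=200)                  # m^2 per building
    kwh = 20 * floor_area**0.8 * rng.lognormal(0.3, 0.25, 200)   # annual use

    # Frontier = best-practice consumption for a given size, estimated as a
    # low conditional quantile in log space (a nonlinear power-law frontier).
    X = np.log(floor_area).reshape(-1, 1)
    frontier = QuantileRegressor(quantile=0.05, alpha=0.0).fit(X, np.log(kwh))

    # Efficiency score: frontier consumption / actual consumption (higher = better).
    score = np.exp(frontier.predict(X)) / kwh

    # Resampling repeated meter readings would give score confidence intervals;
    # here we only rank buildings by the point estimate.
    print("most efficient buildings:", np.argsort(score)[-3:])
    ```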

  3. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  4. The application of the statistical classifying models for signal evaluation of the gas sensors analyzing mold contamination of the building materials

    NASA Astrophysics Data System (ADS)

    Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk

    2017-07-01

    Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods, which are time-consuming and require a lot of manual labor. Fungi emit volatile organic compounds (VOCs) that can be detected in the indoor air using several detection techniques, e.g., chromatography. VOCs can also be detected using gas sensor arrays. All array sensors generate voltage signals that ought to be analyzed using properly selected statistical methods of interpretation. This work focuses on applying statistical classification models to evaluate signals from a gas sensor array analyzing air sampled from the headspace of various types of building materials at different levels of contamination, as well as from clean reference materials.

  5. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    PubMed

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice, and their secondary use has become increasingly important. Secondary use relies on the ability to retrieve complete information about desired patient populations, and retrieving relevant medical records effectively and accurately from large-scale medical big data is becoming a major challenge. We therefore propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. First, we propose a parallel index building method and build a distributed search cluster; the former improves the performance of index building, and the latter provides highly concurrent online TCMRs retrieval. Second, a real-time multi-indexing model is proposed to ensure that the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method that displays medical reports via a friendly web interface, enhancing availability and universality. In conclusion, compared with current medical record retrieval systems, our system provides advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is also more easily integrated with existing clinical systems and can be used in various scenarios.
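
    A toy illustration of parallel inverted-index building with Python's standard library; the paper's system runs on a distributed cloud search cluster rather than one process pool, and all record data here are invented.

    ```python
    from collections import defaultdict
    from multiprocessing import Pool

    # Toy corpus of medical-record texts keyed by record id (invented data).
    RECORDS = {
        1: "cough ginger decoction",
        2: "fever ginseng decoction",
        3: "cough fever herbal tea",
    }

    def index_shard(items):
        """Build a partial inverted index for one shard of records."""
        shard = defaultdict(set)
        for rec_id, text in items:
            for term in text.split():
                shard[term].add(rec_id)
        return shard

    def merge(shards):
        """Combine per-shard partial indexes into one inverted index."""
        full = defaultdict(set)
        for shard in shards:
            for term, ids in shard.items():
                full[term] |= ids
        return full

    if __name__ == "__main__":
        items = list(RECORDS.items())
        chunks = [items[0::2], items[1::2]]       # split work across workers
        with Pool(2) as pool:
            index = merge(pool.map(index_shard, chunks))
        print(sorted(index["cough"]))             # -> [1, 3]
    ```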

  6. Comparison of the Calculations Results of Heat Exchange Between a Single-Family Building and the Ground Obtained with the Quasi-Stationary and 3-D Transient Models. Part 2: Intermittent and Reduced Heating Mode

    NASA Astrophysics Data System (ADS)

    Staszczuk, Anna

    2017-03-01

    The paper provides comparative results of calculations of heat exchange between the ground and typical residential buildings using simplified (quasi-stationary) and more accurate (transient, three-dimensional) methods. Characteristics such as the building's geometry, basement hollow and the construction of ground-contacting assemblies were considered, including intermittent and reduced heating modes. The calculations with simplified methods were conducted in accordance with the currently valid standard PN-EN ISO 13370:2008 (Thermal performance of buildings. Heat transfer via the ground. Calculation methods). Comparative estimates concerning transient, 3-D heat flow were performed with the computer software WUFI®plus. The differences in heat exchange obtained using the more exact and the simplified methods were quantified as a result of the analysis.

  7. Simulating Building Fires for Movies

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ricardo C.; Johnson, Randall P.

    1987-01-01

    Fire scenes for cinematography staged at relatively low cost in method that combines several existing techniques. Nearly realistic scenes, suitable for firefighter training, produced with little specialized equipment. Sequences of scenes set up quickly and easily, without compromising safety because model not burned. Images of fire, steam, and smoke superimposed on image of building to simulate burning of building.

  8. Method for automated building of spindle thermal model with use of CAE system

    NASA Astrophysics Data System (ADS)

    Kamenev, S. V.

    2018-03-01

    The spindle is one of the most important units of a metal-cutting machine tool. Its performance is critical to minimizing machining error, especially thermal error. Various methods are applied to improve the thermal behaviour of spindle units. One of the most important is mathematical modelling based on finite element analysis, most commonly realized with CAE systems. This approach, however, is not capable of addressing a number of important effects that need to be taken into consideration for a proper simulation. In the present article, the authors propose a solution that overcomes these disadvantages: automated building of the thermal model of the spindle unit utilizing the CAE system ANSYS.

  9. Predicting the activity of drugs for a group of imidazopyridine anticoccidial compounds.

    PubMed

    Si, Hongzong; Lian, Ning; Yuan, Shuping; Fu, Aiping; Duan, Yun-Bo; Zhang, Kejun; Yao, Xiaojun

    2009-10-01

    Gene expression programming (GEP) is a novel machine learning technique. GEP is used to build a nonlinear quantitative structure-activity relationship model for predicting the IC(50) of imidazopyridine anticoccidial compounds. This model is based on descriptors calculated from the molecular structure. Four descriptors are selected from the descriptor pool by the heuristic method (HM) to build a multivariable linear model. The GEP method produced a nonlinear quantitative model with a correlation coefficient and a mean error of 0.96 and 0.24 for the training set, and 0.91 and 0.52 for the test set, respectively. The GEP-predicted results are shown to be in good agreement with the experimental ones.

  10. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    Most state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper and explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.
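
    The contrast can be illustrated in miniature with a computer-algebra system: instead of prescribing an order of assignments, one declares a relation and lets the tool solve for whichever variable is unknown. This is only an analogy in Python/SymPy; the paper concerns equation-based modeling languages such as Modelica.

    ```python
    import sympy as sp

    # Declarative: state a heat-balance relation once; the solver can produce
    # either a temperature given the heat flow, or the flow given temperatures.
    Q, UA, T_in, T_out = sp.symbols("Q UA T_in T_out")
    heat_balance = sp.Eq(Q, UA * (T_in - T_out))

    # Solve the same relation "in both directions" without rewriting it:
    print(sp.solve(heat_balance, T_in))   # [T_out + Q/UA]
    print(sp.solve(heat_balance, Q))      # [UA*(T_in - T_out)]
    ```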

  11. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE PAGES

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    2016-04-01

    Most state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper and explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.

  12. Using the Scientific Method to Engage Mathematical Modeling: An Investigation of pi

    ERIC Educational Resources Information Center

    Archer, Lester A. C.; Ng, Karen E.

    2016-01-01

    The purpose of this paper is to explain how to use the scientific method as the framework for introducing mathematical modeling. Two interdisciplinary activities, targeted for students in grade 6 or grade 7, are explained to show the application of the scientific method while building a mathematical model to investigate the relationship between the…

  13. Scan-To Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    NASA Astrophysics Data System (ADS)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, the available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to depend on (1) the mathematical model used, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On the macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects, respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On the micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
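
    A minimal sketch of the micro-scale step, assuming each BIM surface is a plane and using point-to-plane distances with a maximum-deviation cutoff for occlusion and clutter; the accuracy-class bounds below are placeholders, not the USIBD LOA table.

    ```python
    import numpy as np

    def plane_deviations(points, origin, normal):
        """Signed distances from scan points to a planar BIM surface."""
        n = normal / np.linalg.norm(normal)
        return (points - origin) @ n

    rng = np.random.default_rng(3)
    scan = np.column_stack([rng.uniform(0, 5, 500),
                            rng.uniform(0, 3, 500),
                            rng.normal(0.0, 0.004, 500)])   # wall at z = 0

    d = plane_deviations(scan, origin=np.zeros(3), normal=np.array([0., 0., 1.]))

    MAX_DEV = 0.05                 # beyond this, treat points as occlusion/clutter
    valid = np.abs(d) <= MAX_DEV

    # Placeholder accuracy classes (meters); not the actual LOA specification.
    loa_upper_bounds = {"LOA40": 0.005, "LOA30": 0.015, "LOA20": 0.05}
    rmse = np.sqrt(np.mean(d[valid] ** 2))
    grade = next(name for name, ub in loa_upper_bounds.items() if rmse <= ub)
    print(f"RMSE = {rmse * 1000:.1f} mm -> {grade}")
    ```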

  14. NLOS Correction/Exclusion for GNSS Measurement Using RAIM and City Building Models

    PubMed Central

    Hsu, Li-Ta; Gu, Yanlei; Kamijo, Shunsuke

    2015-01-01

    Currently, global navigation satellite system (GNSS) receivers can provide accurate and reliable positioning service in open-field areas. However, their performance in downtown areas is still affected by multipath and non-line-of-sight (NLOS) receptions. This paper proposes a new positioning method using 3D building models and a receiver autonomous integrity monitoring (RAIM) satellite selection method to achieve satisfactory positioning performance in urban areas. The 3D building model uses a ray-tracing technique to simulate the line-of-sight (LOS) and NLOS signal travel distance, known as the pseudorange, between the satellite and receiver. The proposed RAIM fault detection and exclusion (FDE) compares the similarity between the raw pseudorange measurement and the simulated pseudorange. The measurement of a satellite is excluded if the simulated and raw pseudoranges are inconsistent. Because of the assumption of a single reflection in the ray-tracing technique, an inconsistent case indicates a double or multiply reflected NLOS signal. According to the experimental results, the RAIM satellite selection technique reduces the positioning solutions with large errors (solutions estimated on the wrong side of the road) for the 3D building model method by about 8.4% and 36.2% in the middle and deep urban canyon environments, respectively. PMID:26193278
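
    A schematic of the FDE consistency test, assuming ray-traced pseudoranges are already available per satellite; the threshold and all values are invented for illustration.

    ```python
    import numpy as np

    # Per-satellite raw vs. ray-traced (simulated) pseudoranges in meters.
    # Values are invented; in practice the simulation comes from the 3D model.
    sats      = ["G05", "G12", "G17", "G24"]
    raw       = np.array([21_457_830.2, 22_103_554.7, 20_988_412.9, 23_431_007.5])
    simulated = np.array([21_457_831.0, 22_103_553.9, 20_988_476.3, 23_431_006.8])

    THRESHOLD = 15.0   # meters; inconsistency suggests a multiply reflected NLOS

    residual = np.abs(raw - simulated)
    keep = residual < THRESHOLD
    excluded = [s for s, k in zip(sats, keep) if not k]
    print("excluded satellites:", excluded)   # -> ['G17']
    ```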

  15. Net-zero Building Cluster Simulations and On-line Energy Forecasting for Adaptive and Real-Time Control and Decisions

    NASA Astrophysics Data System (ADS)

    Li, Xiwang

    Buildings consume about 41.1% of primary energy and 74% of the electricity in the U.S. Moreover, it is estimated by the National Energy Technology Laboratory that more than 1/4 of the 713 GW of U.S. electricity demand in 2010 could be dispatchable if only buildings could respond to that dispatch through advanced building energy control and operation strategies and smart grid infrastructure. In this study, it is envisioned that neighboring buildings will have the tendency to form a cluster, an open cyber-physical system, to exploit the economic opportunities provided by a smart grid, distributed power generation, and storage devices. Through optimized demand management, these building clusters will then reduce overall primary energy consumption and peak-time electricity consumption, and be more resilient to power disruptions. Therefore, this project seeks to develop a net-zero building cluster simulation testbed and high-fidelity energy forecasting models for adaptive and real-time control and decision-making strategy development that can be used in a net-zero building cluster. The following research activities are summarized in this thesis: 1) development of a building cluster emulator for building cluster control and operation strategy assessment; 2) development of a novel building energy forecasting methodology using active system identification and data fusion techniques, which includes a systematic approach for building energy system characteristic evaluation, system excitation and model adaptation, and which is compared with other literature-reported building energy forecasting methods; 3) development of high-fidelity on-line building cluster energy forecasting models, including energy forecasting models for buildings, PV panels, batteries and ice tank thermal storage systems; and 4) a small-scale real-building validation study to verify the performance of the developed building energy forecasting methodology. The outcomes of this thesis can be used for building cluster energy forecasting model development and model-based control and operation optimization. The thesis concludes with a summary of the key outcomes of this research, as well as a list of recommendations for future work.

  16. Predicting indoor pollutant concentrations, and applications to air quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenzetti, David M.

    Because most people spend more than 90% of their time indoors, predicting exposure to airborne pollutants requires models that incorporate the effect of buildings. Buildings affect the exposure of their occupants in a number of ways, both by design (for example, filters in ventilation systems remove particles) and incidentally (for example, sorption on walls can reduce peak concentrations, but prolong exposure to semivolatile organic compounds). Furthermore, building materials and occupant activities can generate pollutants. Indoor air quality depends not only on outdoor air quality, but also on the design, maintenance, and use of the building. For example, ''sick building'' symptoms such as respiratory problems and headaches have been related to the presence of air-conditioning systems, to carpeting, to low ventilation rates, and to high occupant density (1). The physical processes of interest apply even in simple structures such as homes. Indoor air quality models simulate the processes, such as ventilation and filtration, that control pollutant concentrations in a building. Section 2 describes the modeling approach and the important transport processes in buildings. Because advection usually dominates among the transport processes, Sections 3 and 4 describe methods for predicting airflows. The concluding section summarizes the application of these models.
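
    The core bookkeeping such models perform can be written down for a single well-mixed zone. The sketch below integrates dC/dt = (Q/V)(C_out - C) + S/V - kC with illustrative numbers for ventilation, an indoor source, and first-order deposition; this is a generic textbook balance, not any specific tool's formulation.

    ```python
    V = 250.0     # zone volume, m^3
    Q = 125.0     # ventilation airflow, m^3/h (0.5 air changes per hour)
    S = 2000.0    # indoor emission source, ug/h
    k = 0.2       # first-order deposition/sorption loss, 1/h
    C_out = 10.0  # outdoor concentration, ug/m^3

    dt, hours = 0.01, 24.0
    C = 0.0
    for _ in range(int(hours / dt)):       # explicit Euler integration
        dCdt = (Q / V) * (C_out - C) + S / V - k * C
        C += dt * dCdt

    # Analytical steady state for comparison: ((Q/V)*C_out + S/V) / (Q/V + k)
    steady = ((Q / V) * C_out + S / V) / (Q / V + k)
    print(f"C(24 h) = {C:.1f} ug/m^3, steady state = {steady:.1f} ug/m^3")
    ```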

  17. Mathematical modeling of the heat transfer for determining the depth of thawing basin buildings with long service life

    NASA Astrophysics Data System (ADS)

    Sirditov, Ivan; Stepanov, Sergei

    2017-11-01

    In this paper, a numerical study of the problem of determining the thawing basin in permafrost soil beneath buildings with a long service life is carried out using two methods: the formulas of the set of rules 25.13330.2012 "Soil bases and foundations on permafrost soils", and a mathematical model of the heat transfer.

  18. Electricity Markets, Smart Grids and Smart Buildings

    NASA Astrophysics Data System (ADS)

    Falcey, Jonathan M.

    A smart grid is an electricity network that accommodates two-way power flows and utilizes two-way communications and increased measurement in order to provide more information to customers and aid in the development of a more efficient electricity market. The current electrical network is outdated and has many shortcomings relating to power flows, inefficient electricity markets, generation/supply balance, a lack of information for the consumer and insufficient consumer interaction with electricity markets. Many of these challenges can be addressed with a smart grid, but significant barriers to its implementation remain. This paper proposes a novel method for the development of a smart grid utilizing a bottom-up approach (starting with smart buildings/campuses), with the goal of providing the framework and infrastructure necessary for a smart grid, instead of the more traditional approach (installing many smart meters and hoping a smart grid emerges). This novel approach combines deterministic and statistical methods in order to accurately estimate building electricity use down to the device level. It provides model users with a cheaper alternative to energy audits and extensive sensor networks (the current methods of quantifying electrical use at this level), which increases their ability to modify energy consumption and respond to price signals. The results of this method are promising, but they are still preliminary, and there is still room for improvement. On days when there were no missing or inaccurate data, this approach had an R2 of about 0.84 compared to measured results, sometimes as high as 0.94. However, there were many days where missing data brought overall accuracy down significantly. In addition, the development and implementation of the calibration process is still underway, and some functional additions must be made in order to maximize accuracy; the calibration process must be completed before a reliable accuracy can be determined. While this work shows that a combination of deterministic and statistical methods can accurately forecast building energy usage, the ability to produce accurate results is heavily dependent upon software availability, accurate data and proper calibration of the model. Creating the software required for a smart building model is time consuming and expensive. Bad or missing data have significant negative impacts on the accuracy of the results and can be caused by a hodgepodge of equipment and communication protocols. Proper calibration of the model is essential to ensure that the device-level estimations are sufficiently accurate. Any building model that is to succeed at creating a smart building must be able to overcome these challenges.

  19. A risk evaluation model and its application in online retailing trustfulness

    NASA Astrophysics Data System (ADS)

    Ye, Ruyi; Xu, Yingcheng

    2017-08-01

    Building a general model for risk evaluation in advance can improve the convenience, consistency and comparability of the results of repeated risk evaluations, in cases where the repeated evaluations are in the same area and serve a similar purpose. One of the most convenient and common risk evaluation models is an index system consisting of several indices, corresponding weights and a crediting method. This article proposes a method to build a risk evaluation index system that guarantees a proportional relationship between the resulting credit and the expected risk loss, and provides an application example in online retailing.
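
    A minimal sketch of such an index system, with invented indices and weights for online-retailing trustfulness: the overall score is a weighted sum, so it scales proportionally with each index.

    ```python
    # Hypothetical risk indices for an online retailer, each normalized to [0, 1]
    # (higher = riskier). Index names and weights are invented for illustration.
    indices = {"complaint_rate": 0.20, "refund_delay": 0.35, "fake_review_rate": 0.10}
    weights = {"complaint_rate": 0.5,  "refund_delay": 0.3,  "fake_review_rate": 0.2}

    # Weighted-sum crediting: linear in each index, so the resulting score stays
    # proportional to the expected risk loss the weights encode.
    risk_score = sum(weights[k] * indices[k] for k in indices)
    trust_credit = 1.0 - risk_score       # higher credit = more trustworthy
    print(f"risk = {risk_score:.3f}, credit = {trust_credit:.3f}")
    ```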

  20. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.
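
    Response surface methods typically fit a low-order polynomial to a small designed set of runs of the expensive tool. A generic sketch with scikit-learn follows; the inputs and responses are invented stand-ins for vehicle size, technology support level and RMAT's reliability output.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Invented designed experiment: columns = (vehicle dry mass [Mg],
    # technology support level [1-5]); response = mission completion reliability.
    X = np.array([[50, 1], [50, 5], [100, 1], [100, 5], [75, 3],
                  [60, 2], [90, 4], [75, 1], [75, 5]])
    y = np.array([0.95, 0.99, 0.90, 0.97, 0.95,
                  0.94, 0.97, 0.92, 0.98])   # stand-ins for RMAT outputs

    # Quadratic response surface: a fast surrogate for the expensive R&M tool.
    surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surrogate.fit(X, y)
    print(surrogate.predict([[80, 3]]))      # instant estimate for a new concept
    ```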

  1. A Petascale Non-Hydrostatic Atmospheric Dynamical Core in the HOMME Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tufo, Henry

    The High-Order Method Modeling Environment (HOMME) is a framework for building scalable, conservative atmospheric models for climate simulation and general atmospheric-modeling applications. Its spatial discretizations are based on Spectral-Element (SE) and Discontinuous Galerkin (DG) methods. These are local methods employing high-order accurate spectral basis functions that have been shown to perform well on massively parallel supercomputers at any resolution, and to scale particularly well at high resolutions. HOMME provides the framework upon which the CAM-SE community atmosphere model dynamical core is constructed. In its current incarnation, CAM-SE employs the hydrostatic primitive equations (PE) of motion, which limits its resolution to simulations coarser than 0.1° per grid cell. The primary objective of this project is to remove this resolution limitation by providing HOMME with the capabilities needed to build nonhydrostatic models that solve the compressible Euler/Navier-Stokes equations.

  2. Automated building of organometallic complexes from 3D fragments.

    PubMed

    Foscato, Marco; Venkatraman, Vishwesh; Occhipinti, Giovanni; Alsberg, Bjørn K; Jensen, Vidar R

    2014-07-28

    A method for the automated construction of three-dimensional (3D) molecular models of organometallic species in design studies is described. Molecular structure fragments derived from crystallographic structures and accurate molecular-level calculations are used as 3D building blocks in the construction of multiple molecular models of analogous compounds. The method allows for precise control of stereochemistry and geometrical features that may otherwise be very challenging, or even impossible, to achieve with commonly available generators of 3D chemical structures. The new method was tested in the construction of three sets of active or metastable organometallic species from homogeneous-phase catalytic reactions. The performance of the method was compared with that of commonly available methods for automated generation of 3D models, demonstrating higher accuracy of the prepared 3D models in general and, in particular, a much wider range of chemical structures that can be built automatically, with capabilities far beyond standard organic and main-group chemistry.

  3. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer-assisted multiple regression analysis was compared with forward, backward, and best-subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…
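
    For a modern analogue of this comparison, scikit-learn exposes both greedy directions directly; exhaustive best-subsets search is not built in and is noted only in a comment. The data are synthetic.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LinearRegression

    X, y = make_regression(n_samples=200, n_features=8,
                           n_informative=3, noise=5.0, random_state=0)

    for direction in ("forward", "backward"):
        sel = SequentialFeatureSelector(LinearRegression(),
                                        n_features_to_select=3,
                                        direction=direction).fit(X, y)
        print(direction, np.flatnonzero(sel.get_support()))
    # Best-subsets would instead score all C(8,3) = 56 subsets exhaustively.
    ```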

  4. Style grammars for interactive visualization of architecture.

    PubMed

    Aliaga, Daniel G; Rosen, Paul A; Bekins, Daniel R

    2007-01-01

    Interactive visualization of architecture provides a way to quickly visualize existing or novel buildings and structures. Such applications require both fast rendering and an effortless input regimen for creating and changing architecture using high-level editing operations that automatically fill in the necessary details. Procedural modeling and synthesis is a powerful paradigm that yields high data amplification and can be coupled with fast-rendering techniques to quickly generate plausible details of a scene without much or any user interaction. Previously, forward generating procedural methods have been proposed where a procedure is explicitly created to generate particular content. In this paper, we present our work in inverse procedural modeling of buildings and describe how to use an extracted repertoire of building grammars to facilitate the visualization and quick modification of architectural structures and buildings. We demonstrate an interactive application where the user draws simple building blocks and, using our system, can automatically complete the building "in the style of" other buildings using view-dependent texture mapping or nonphotorealistic rendering techniques. Our system supports an arbitrary number of building grammars created from user subdivided building models and captured photographs. Using only edit, copy, and paste metaphors, the entire building styles can be altered and transferred from one building to another in a few operations, enhancing the ability to modify an existing architectural structure or to visualize a novel building in the style of the others.

  5. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours; thus overtime impacts total building energy use. The current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupant and by time. To address this gap, this study develops a new stochastic model based on statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building, and the measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours with a proposed KS test for calibration during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results and to better understand the characteristics of overtime in office buildings.
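
    A sketch of schedule generation under the paper's two distributions, with invented parameter values; in practice both distributions would be fit to measured occupancy data.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    N_OCCUPANTS = 60     # building population (invented)
    P_OVERTIME  = 0.15   # per-person probability of working overtime on a day
    MEAN_HOURS  = 1.8    # mean overtime duration, hours (invented)

    def overtime_day():
        """One day's overtime: binomial headcount, exponential durations."""
        n_staying = rng.binomial(N_OCCUPANTS, P_OVERTIME)
        durations = rng.exponential(MEAN_HOURS, size=n_staying)
        return durations     # hours past normal closing, one entry per person

    year = [overtime_day() for _ in range(250)]          # working days
    person_hours = sum(d.sum() for d in year)
    print(f"simulated overtime: {person_hours:.0f} person-hours/year")
    ```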

  6. Applications of system dynamics modelling to support health policy.

    PubMed

    Atkinson, Jo-An M; Wells, Robert; Page, Andrew; Dominello, Amanda; Haines, Mary; Wilson, Andrew

    2015-07-09

    The value of systems science modelling methods in the health sector is increasingly being recognised. Of particular promise is the potential of these methods to improve operational aspects of healthcare capacity and delivery, analyse policy options for health system reform and guide investments to address complex public health problems. Because it lends itself to a participatory approach, system dynamics modelling has been a particularly appealing method that aims to align stakeholder understanding of the underlying causes of a problem and achieve consensus for action. The aim of this review is to determine the effectiveness of system dynamics modelling for health policy, and explore the range and nature of its application. A systematic search was conducted to identify articles published up to April 2015 from the PubMed, Web of Knowledge, Embase, ScienceDirect and Google Scholar databases. The grey literature was also searched. Papers eligible for inclusion were those that described applications of system dynamics modelling to support health policy at any level of government. Six papers were identified, comprising eight case studies of the application of system dynamics modelling to support health policy. No analytic studies were found that examined the effectiveness of this type of modelling. Only three examples engaged multidisciplinary stakeholders in collective model building. Stakeholder participation in model building reportedly facilitated development of a common 'mental map' of the health problem, resulting in consensus about optimal policy strategy and garnering support for collaborative action. The paucity of relevant papers indicates that, although the volume of descriptive literature advocating the value of system dynamics modelling is considerable, its practical application to inform health policy making is yet to be routinely applied and rigorously evaluated. Advances in software are allowing the participatory model building approach to be extended to more sophisticated multimethod modelling that provides policy makers with more powerful tools to support the design of targeted, effective and equitable policy responses for complex health problems. Building capacity and investing in communication to promote these modelling methods, as well as documenting and evaluating their applications, will be vital to supporting uptake by policy makers.

  7. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and the dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend toward enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through the use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
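
    A bare-bones MPC step for intuition, assuming a first-order zone-temperature model and one rule-derived requirement (a comfort band during occupied hours, enforced as a soft constraint); none of the numbers or the model come from the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    a, b = 0.9, 0.5        # x[k+1] = a*x[k] + b*u[k]; toy zone thermal model
    H = 6                  # prediction horizon (hours)
    T0 = 18.0              # current room temperature, deg C
    T_MIN = 20.0           # rule-derived constraint: comfort band when occupied

    def rollout(u):
        """Predicted temperatures over the horizon for a heating plan u."""
        x, traj = T0, []
        for uk in u:
            x = a * x + b * uk
            traj.append(x)
        return np.array(traj)

    def cost(u):
        traj = rollout(u)
        energy = np.sum(u ** 2)
        discomfort = np.sum(np.maximum(T_MIN - traj, 0.0) ** 2)
        return energy + 50.0 * discomfort      # soft comfort constraint

    res = minimize(cost, x0=np.ones(H), bounds=[(0.0, 10.0)] * H)
    print("heating plan:", np.round(res.x, 2))
    print("temperatures:", np.round(rollout(res.x), 2))
    ```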

  8. BIM and IoT: A Synopsis from GIS Perspective

    NASA Astrophysics Data System (ADS)

    Isikdag, U.

    2015-10-01

    The Internet of Things (IoT) focuses on enabling communication between all devices and things, whether existent in real life or virtual. Building Information Models (BIMs) and Building Information Modelling have been buzzwords of the construction industry for the last 15 years. BIMs emerged as a result of a push by software companies to tackle the problems of inefficient information exchange between different software packages and to enable true interoperability. In the BIM approach, the most up-to-date and accurate models of a building are stored in shared central databases during the design and construction of a project and at post-construction stages. GIS-based city monitoring and city management applications require the fusion of information acquired from multiple resources: BIMs, city models and sensors. This paper focuses on providing a method for facilitating the GIS-based fusion of information residing in digital building "Models" and information acquired from the city objects, i.e. "Things". Once this information fusion is accomplished, many fields ranging from emergency response, urban surveillance and urban monitoring to smart buildings will have potential benefits.

  9. Critical review of the building downwash algorithms in AERMOD.

    PubMed

    Petersen, Ron L; Guerra, Sergio A; Bova, Anthony S

    2017-08-01

    The only documentation of the building downwash algorithm in AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model), referred to as PRIME (Plume Rise Model Enhancements), is found in the 2000 A&WMA journal article by Schulman, Strimaitis and Scire. Recent field and wind tunnel studies have shown that AERMOD can overpredict concentrations by factors of 2 to 8 for certain building configurations. While a wind tunnel equivalent building dimension (EBD) study can be conducted to approximately correct the overprediction bias, past field and wind tunnel studies indicate that there are notable flaws in the PRIME building downwash theory. A detailed review of the theory, supported by CFD (Computational Fluid Dynamics) and wind tunnel simulations of flow over simple rectangular buildings, revealed the following serious theoretical flaws: enhanced turbulence in the building wake starting at the wrong longitudinal location; constant enhanced turbulence extending up to the wake height; constant initial enhanced turbulence in the building wake (it does not vary with roughness or stability); discontinuities in the streamline calculations; and no method to account for streamlined or porous structures. This paper documents theoretical and other problems in PRIME, along with CFD simulations and wind tunnel observations that support these findings. Although AERMOD/PRIME may provide accurate and unbiased estimates (within a factor of 2) for some building configurations, a major review and update is needed so that accurate estimates can be obtained for other building configurations where significant overpredictions or underpredictions are common due to downwash effects. This will ensure that regulatory evaluations subject to dispersion modeling requirements can be based on an accurate model. Thus, it is imperative that the downwash theory in PRIME be corrected to improve model performance and ensure that the model better represents reality.

  10. Statistical considerations on prognostic models for glioma

    PubMed Central

    Molinaro, Annette M.; Wrensch, Margaret R.; Jenkins, Robert B.; Eckel-Passow, Jeanette E.

    2016-01-01

    Given the lack of beneficial treatments in glioma, there is a need for prognostic models for therapeutic decision making and life planning. Recently, several studies defining subtypes of glioma have been published. Here, we review the statistical considerations of how to build and validate prognostic models, explain the models presented in the current glioma literature, and discuss the advantages and disadvantages of each model. The three statistical considerations in establishing clinically useful prognostic models are study design, model building, and validation. Careful study design helps to ensure that the model is unbiased and generalizable to the population of interest. During model building, a discovery cohort of patients can be used to choose variables, construct models, and estimate prediction performance via internal validation. Via external validation, an independent dataset can assess how well the model performs. It is imperative that published models properly detail the study design and the methods for both model building and validation. This provides readers the information necessary to assess the bias in a study, compare other published models, and determine the model's clinical usefulness. As editors, reviewers, and readers of the relevant literature, we should be cognizant of the needed statistical considerations and insist on their use. PMID:26657835
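
    The workflow described here translates directly into code. A generic sketch with scikit-learn follows, using one synthetic dataset split into a discovery cohort and a stand-in "external" set; in practice the external cohort would come from an independent institution.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic data; the last 150 rows stand in for an external cohort.
    X, y = make_classification(n_samples=450, n_features=10, random_state=0)
    X_disc, X_ext = X[:300], X[300:]
    y_disc, y_ext = y[:300], y[300:]

    model = LogisticRegression(max_iter=1000)

    # Internal validation: cross-validated performance within the discovery cohort.
    internal_auc = cross_val_score(model, X_disc, y_disc,
                                   cv=5, scoring="roc_auc").mean()

    # External validation: fit once on discovery data, score on independent data.
    model.fit(X_disc, y_disc)
    external_acc = model.score(X_ext, y_ext)   # accuracy on the external set

    print(f"internal AUC = {internal_auc:.2f}, external accuracy = {external_acc:.2f}")
    ```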

  11. Hybrid Automatic Building Interpretation System

    NASA Astrophysics Data System (ADS)

    Pakzad, K.; Klink, A.; Müterthies, A.; Gröger, G.; Stroh, V.; Plümer, L.

    2011-09-01

    HABIS (Hybrid Automatic Building Interpretation System) is a system for the automatic reconstruction of building roofs used in virtual 3D building models. Unlike most commercially available systems, HABIS is able to work to a high degree automatically. The hybrid method uses different sources, intending to exploit the advantages of each: 3D point clouds usually provide good height and surface data, whereas high-spatial-resolution aerial images provide important information on edges and details of roof objects like dormers or chimneys, and cadastral data provide important base information about the building ground plans. The approach used in HABIS is a multi-stage process, which starts with a coarse roof classification based on 3D point clouds, continues with an image-based verification of the predicted roofs, and performs a final classification and adjustment of the roofs in a further step. In addition, some roof objects such as dormers and chimneys are extracted based on aerial images and added to the models. In this paper, the methods used are described and some results are presented.

  12. Geometric Model of Induction Heating Process of Iron-Based Sintered Materials

    NASA Astrophysics Data System (ADS)

    Semagina, Yu V.; Egorova, M. A.

    2018-03-01

    The article studies the issue of building multivariable dependences from experimental data. A constructive method for solving the issue is presented in the form of equations of (n-1)-surface compartments of the extended Euclidean space E+n. The dimension of the space is taken to be equal to the sum of the number of parameters and factors of the model of the system being studied. The basis for building the multivariable dependencies is a generalization to n-space of the approach used for surface compartments of 3D space. The surface is designed on the basis of the kinematic method, moving one geometric object along a certain trajectory. The proposed approach simplifies the process of building the multifactorial empirical dependencies which describe the process being investigated.

  13. Comparison of different approaches of modelling in a masonry building

    NASA Astrophysics Data System (ADS)

    Saba, M.; Meloni, D.

    2017-12-01

    The present work has the objective of modelling a simple masonry building using two different modelling methods, in order to assess their validity in terms of the evaluation of static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri of S.T.A. Data S.r.l. and Sismicad12 of Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements (FME) method, which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results at the cost of a greater computational burden. Remarkable differences in the static stresses between the two approaches have been found for such a simple structure, and a comparison and analysis of the reasons is proposed.

  14. A Method to Test Model Calibration Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique: 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
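
    A minimal sketch of how the first and third figures of merit could be computed from surrogate utility-bill data, using CV(RMSE) and NMBE goodness-of-fit forms in the style of ASHRAE Guideline 14; the function names, the (n-1) denominators, and the savings-error definition are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def goodness_of_fit(measured, predicted):
    """Goodness-of-fit metrics in the style of BPI-2400 / ASHRAE
    Guideline 14: CV(RMSE) and NMBE, both in percent."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    resid = measured - predicted
    n, mean = len(measured), measured.mean()
    cv_rmse = 100.0 * np.sqrt((resid ** 2).sum() / (n - 1)) / mean
    nmbe = 100.0 * resid.sum() / ((n - 1) * mean)
    return cv_rmse, nmbe

def savings_error(true_savings, predicted_savings):
    """Figure of merit 1: error of the post-retrofit savings prediction
    relative to the surrogate 'true' savings from the simulation."""
    return 100.0 * (predicted_savings - true_savings) / true_savings
```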

  15. A Method to Test Model Calibration Techniques: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique: 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.

  16. Simulation of earthquake caused building damages for the development of fast reconnaissance techniques

    NASA Astrophysics Data System (ADS)

    Schweier, C.; Markus, M.; Steinle, E.

    2004-04-01

    Catastrophic events like strong earthquakes can cause great losses of life and economic value. More efficient reconnaissance techniques could help to reduce the loss of life, as many victims die after, and not during, the event. A basic prerequisite for improving the rescue teams' work is better planning of the measures, which can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, as most losses occur there. In this paper an approach for the damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laser scanning data, i.e. digital surface models (DSMs) acquired by scanning an area with pulsed laser light. To date, no laser-scanning-derived DSMs of earthquake-damaged areas were available to the authors, so such data had to be simulated to develop the damage detection methodology. Two simulation methodologies are presented. The first is to create CAD models of undamaged buildings from their construction plans and alter them artificially as if they had suffered serious damage; a laser scanning data set is then simulated from these models and can be compared with real laser scanning data acquired of the buildings in their intact state. The second is to use measurements of actually damaged buildings and simulate their intact state: the geometrical structure of the damaged buildings is modelled from digital photographs taken after the event, evaluated with photogrammetric methods; the intact state is simulated based on on-site investigations; and finally laser scanning data are simulated for both states.

  17. Transport of Bacillus thuringiensis var. kurstaki from an outdoor release into buildings: pathways of infiltration and a rapid method to identify contaminated buildings.

    PubMed

    Van Cuyk, Sheila; Deshpande, Alina; Hollander, Attelia; Franco, David O; Teclemariam, Nerayo P; Layshock, Julie A; Ticknor, Lawrence O; Brown, Michael J; Omberg, Kristin M

    2012-06-01

    Understanding the fate and transport of biological agents into buildings will be critical to recovery and restoration efforts after a biological attack in an urban area. As part of the Interagency Biological Restoration Demonstration (IBRD), experiments were conducted in Fairfax County, VA, to study whether a biological agent can be expected to infiltrate into buildings following a wide-area release. Bacillus thuringiensis var. kurstaki is a common organic pesticide that has been sprayed in Fairfax County for a number of years to control the gypsy moth. Because the bacterium shares many physical and biological properties with Bacillus anthracis, the results from these studies can be extrapolated to a bioterrorist release. In 2009, samples were collected from inside buildings located immediately adjacent to a spray block. A combined probabilistic and targeted sampling strategy and modeling were conducted to provide insight into likely methods of infiltration. Both the simulations and the experimental results indicate sampling entryways and heating, ventilation, and air conditioning (HVAC) filters are reasonable methods for "ruling in" a building as contaminated. Following a biological attack, this method is likely to provide significant savings in time and labor compared to more rigorous, statistically based characterization. However, this method should never be used to "rule out," or clear, a building.

  18. Enhanced air dispersion modelling at a typical Chinese nuclear power plant site: Coupling RIMPUFF with two advanced diagnostic wind models.

    PubMed

    Liu, Yun; Li, Hong; Sun, Sida; Fang, Sheng

    2017-09-01

    An enhanced air dispersion modelling scheme is proposed to cope with the building layout and complex terrain of a typical Chinese nuclear power plant (NPP) site. In this modelling, the California Meteorological Model (CALMET) and the Stationary Wind Fit and Turbulence (SWIFT) are coupled with the Risø Mesoscale PUFF model (RIMPUFF) for refined wind field calculation. The near-field diffusion coefficient correction scheme of the Atmospheric Relative Concentrations in the Building Wakes Computer Code (ARCON96) is adopted to characterize dispersion in building arrays. The proposed method is evaluated by a wind tunnel experiment that replicates the typical Chinese NPP site. For both wind speed/direction and air concentration, the enhanced modelling predictions agree well with the observations. The fraction of the predictions within a factor of 2 and 5 of observations exceeds 55% and 82% respectively in the building area and the complex terrain area. This demonstrates the feasibility of the new enhanced modelling for typical Chinese NPP sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
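
    The "fraction of predictions within a factor of 2 and 5 of observations" quoted above is the standard FAC-N dispersion metric; a minimal sketch with illustrative names and data:

```python
import numpy as np

def fac_n(observed, predicted, n=2.0):
    """Fraction of predictions within a factor of n of the observations,
    i.e. the share of pairs with 1/n <= predicted/observed <= n."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ratio = predicted / observed
    return np.mean((ratio >= 1.0 / n) & (ratio <= n))

# Example: FAC2 and FAC5 for a small set of concentration predictions
obs = np.array([1.0, 2.5, 0.8, 4.0])
pred = np.array([1.6, 2.0, 0.3, 9.0])
print(fac_n(obs, pred, 2), fac_n(obs, pred, 5))
```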

  19. Learning Petri net models of non-linear gene interactions.

    PubMed

    Mayo, Michael

    2005-10-01

    Understanding how an individual's genetic make-up influences their risk of disease is a problem of paramount importance. Although machine-learning techniques are able to uncover the relationships between genotype and disease, the problem of automatically building the best biochemical model or "explanation" of the relationship has received less attention. In this paper, I describe a method based on random hill climbing that automatically builds Petri net models of non-linear (or multi-factorial) disease-causing gene-gene interactions. Petri nets are a suitable formalism for this problem, because they are used to model concurrent, dynamic processes analogous to biochemical reaction networks. I show that this method is routinely able to identify perfect Petri net models for three disease-causing gene-gene interactions recently reported in the literature.
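
    The search strategy named in the abstract, random hill climbing, can be sketched generically; the scoring and perturbation callables stand in for the paper's Petri net evaluation and model edits, and all names are hypothetical:

```python
def hill_climb(initial_model, perturb, score, iterations=10_000):
    """Random hill climbing: repeatedly apply a random perturbation
    (supplied by `perturb`) and keep the result only if its score
    does not get worse."""
    best, best_score = initial_model, score(initial_model)
    for _ in range(iterations):
        candidate = perturb(best)
        candidate_score = score(candidate)
        if candidate_score >= best_score:   # accept improvements and ties
            best, best_score = candidate, candidate_score
    return best, best_score
```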

  20. 3D Model Generation from UAV: Historical Mosque (Masjid Lama Nilai)

    NASA Astrophysics Data System (ADS)

    Nasir, N. H. Mohd; Tahar, K. N.

    2017-08-01

    Preserving cultural heritage and historic sites is an important issue. These sites are subject to erosion and vandalism and, as long-lived artifacts, have gone through many phases of construction, damage and repair. It is important to keep an accurate record of these sites as they currently are, using 3D model-building technology, so that preservationists can track changes, foresee structural problems, and allow a wider audience to "virtually" see and tour them. Due to the complexity of these sites, building 3D models is time consuming and difficult, usually involving much manual effort. This study discusses new methods that can reduce modelling time by using an Unmanned Aerial Vehicle (UAV), and aims to develop a 3D model of a historical mosque using UAV photogrammetry. To achieve this, the data acquisition set of Masjid Lama Nilai, Negeri Sembilan was captured using a UAV. In addition, an accuracy assessment between the actual and measured values was made, and a comparison between the rendered 3D model and the textured 3D model is also carried out in this study.

  1. Method for detecting moment connection fracture using high-frequency transients in recorded accelerations

    USGS Publications Warehouse

    Rodgers, J.E.; Çelebi, M.

    2011-01-01

    The 1994 Northridge earthquake caused brittle fractures in steel moment frame building connections, despite causing little visible building damage in most cases. Future strong earthquakes are likely to cause similar damage to the many un-retrofitted pre-Northridge buildings in the western US and elsewhere. Without obvious permanent building deformation, costly intrusive inspections are currently the only way to determine if major fracture damage that compromises building safety has occurred. Building instrumentation has the potential to provide engineers and owners with timely information on fracture occurrence. Structural dynamics theory predicts and scale model experiments have demonstrated that sudden, large changes in structure properties caused by moment connection fractures will cause transient dynamic response. A method is proposed for detecting the building-wide level of connection fracture damage, based on observing high-frequency, fracture-induced transient dynamic responses in strong motion accelerograms. High-frequency transients are short (<1 s), sudden-onset waveforms with frequency content above 25 Hz that are visually apparent in recorded accelerations. Strong motion data and damage information from intrusive inspections collected from 24 sparsely instrumented buildings following the 1994 Northridge earthquake are used to evaluate the proposed method. The method's overall success rate for this data set is 67%, but this rate varies significantly with damage level. The method performs reasonably well in detecting significant fracture damage and in identifying cases with no damage, but fails in cases with few fractures. Combining the method with other damage indicators and removing records with excessive noise improves the ability to detect the level of damage. © 2010 Elsevier B.V. All rights reserved.
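
    A minimal sketch of the kind of screening the method implies: isolating the >25 Hz content of a recorded acceleration and flagging short, sudden-onset excursions. This is an illustrative reconstruction, not the authors' detection algorithm; the filter order, the median-based threshold, and the factor k are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def find_hf_transients(accel, fs, f_cut=25.0, k=5.0):
    """Flag samples whose >25 Hz content exceeds k times the median
    absolute level of the high-passed record (candidate transients)."""
    sos = butter(4, f_cut, btype="highpass", fs=fs, output="sos")
    hp = sosfiltfilt(sos, accel)          # zero-phase high-pass filter
    threshold = k * np.median(np.abs(hp))
    return np.nonzero(np.abs(hp) > threshold)[0]
```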

  2. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  3. Models of resource planning during formation of calendar construction plans for erection of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Pocebneva, Irina; Belousov, Vadim; Fateeva, Irina

    2018-03-01

    This article provides a methodical description of resource-time analysis for a wide range of requirements imposed on resource consumption processes in scheduling tasks during the construction of high-rise buildings and facilities. The core of the proposed approach is the resource models being determined. Generalized network models are the elements of those models, and their number can be too large to allow the analysis of each element. Therefore, the problem is to approximate the original resource model by simpler time models whose number is not very large.

  4. Accuracy Assessment of a Complex Building 3D Model Reconstructed from Images Acquired with a Low-Cost UAS

    NASA Astrophysics Data System (ADS)

    Oniga, E.; Chirilă, C.; Stătescu, F.

    2017-02-01

    Nowadays, Unmanned Aerial Systems (UASs) are a widely used acquisition technique for creating building 3D models, providing a high number of very-high-resolution images or video sequences in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created using these platforms must be evaluated. The dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen: a complex-shaped building whose roof is formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three used as ground control points (GCPs) and the remaining four as check points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing characteristic points of the building were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud automatically generated from the digital images acquired with the low-cost UAS, using image matching algorithms in different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds generated by the above-mentioned software and on the GNSS data, the parameters of the east-side hyperbolic paraboloid were calculated using the least-squares method and statistical blunder detection. Then, to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof against reference data considered to have minimal errors: a TLS mesh for the facades and a GNSS mesh for the roof. Finally, the front facade of the building was created in 3D from its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results show the high potential of low-cost UASs for building 3D model creation, and that the accuracy improves significantly when the building 3D model is created from its characteristic points.
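
    The paraboloid fit described above is a linear least-squares problem if the roof is expressed as a quadratic height field; a minimal sketch, in which the parameterization and the blunder rule are assumptions:

```python
import numpy as np

def fit_quadric_surface(x, y, z):
    """Least-squares fit of z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f
    to a point cloud (x, y, z as 1-D arrays). A saddle such as a
    hyperbolic paraboloid shows up as an indefinite quadratic part,
    i.e. c**2 - 4*a*b > 0."""
    A = np.column_stack([x**2, y**2, x * y, x, y, np.ones_like(x)])
    coeffs, residuals, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs, residuals

# Blunder detection could then drop points whose fit residual exceeds,
# say, three standard deviations, and refit on the remaining points.
```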

  5. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties, few current studies consider occupant movement within the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The occupant evacuation simulation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. Occupant casualties in the building under earthquakes are evaluated by coupling a finite element simulation of the building collapse process, the occupant evacuation simulation, and casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake demonstrates the effect of occupant movement on the predicted casualties.
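
    A minimal sketch of a cellular automaton evacuation update of the kind the abstract describes: occupants on a grid step to the free neighbouring cell nearest the exit. The update rule and all names are illustrative, not the paper's refined model:

```python
import numpy as np

def step(occupied, dist_to_exit):
    """One update of a minimal evacuation cellular automaton: each
    occupant moves to the free 4-neighbour cell closest to the exit.
    `occupied` is a boolean grid, `dist_to_exit` a distance field."""
    rows, cols = occupied.shape
    new = occupied.copy()
    for r, c in zip(*np.nonzero(occupied)):
        best = None
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and not new[nr, nc]
                    and dist_to_exit[nr, nc] < dist_to_exit[r, c]):
                if best is None or dist_to_exit[nr, nc] < dist_to_exit[best]:
                    best = (nr, nc)
        if best is not None:            # move if a better free cell exists
            new[r, c] = False
            new[best] = True
    return new
```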

  6. Advanced building energy management system demonstration for Department of Defense buildings.

    PubMed

    O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong

    2013-08-01

    This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success. © 2013 New York Academy of Sciences.

  7. Multi-Hierarchical Gray Correlation Analysis Applied in the Selection of Green Building Design Scheme

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Chuanghong

    2018-02-01

    As a sustainable form of ecological construction, green building is nowadays increasingly advocated and attracting widespread attention in society. In the survey and design phase of a construction project, evaluating and selecting the green building design scheme according to a scientific and reasonable evaluation index system can largely and effectively improve the ecological benefits of green building projects. Based on the new Green Building Evaluation Standard, which came into effect on January 1, 2015, an evaluation index system for green building design schemes is constructed, taking into account the evaluation contents related to such schemes. Experts experienced in construction scheme optimization scored the indices, and the weight of each evaluation index was determined through the AHP method. The correlation degree between each evaluation scheme and the ideal scheme was calculated using a multilevel grey relational analysis model, and the optimal scheme was then determined. The feasibility and practicability of the evaluation method are verified through examples.
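
    A minimal sketch of the grey relational step: computing a weighted grey relational degree of each design scheme to the ideal scheme. The data, weights, and distinguishing coefficient value below are illustrative:

```python
import numpy as np

def grey_relational_degree(schemes, ideal, weights, rho=0.5):
    """Weighted grey relational degree of each scheme to the ideal
    scheme. `schemes` is (n_schemes, n_criteria) with values normalised
    to [0, 1]; `rho` is the distinguishing coefficient."""
    diff = np.abs(schemes - ideal)              # deviation sequences
    d_min, d_max = diff.min(), diff.max()
    coeff = (d_min + rho * d_max) / (diff + rho * d_max)
    return coeff @ weights                      # one degree per scheme

schemes = np.array([[0.8, 0.6, 0.9],
                    [0.7, 0.9, 0.5]])
ideal = np.array([1.0, 1.0, 1.0])
weights = np.array([0.5, 0.3, 0.2])             # e.g. from the AHP step
print(grey_relational_degree(schemes, ideal, weights))
```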

  8. Gradient boosting machine for modeling the energy consumption of commercial buildings

    DOE PAGES

    Touzani, Samir; Granderson, Jessica; Fernandes, Samuel

    2017-11-26

    Accurate savings estimations are important to promote energy efficiency projects and demonstrate their cost-effectiveness. The increasing presence of advanced metering infrastructure (AMI) in commercial buildings has resulted in a rising availability of high frequency interval data. These data can be used for a variety of energy efficiency applications such as demand response, fault detection and diagnosis, and heating, ventilation, and air conditioning (HVAC) optimization. This large amount of data has also opened the door to the use of advanced statistical learning models, which hold promise for providing accurate building baseline energy consumption predictions, and thus accurate saving estimations. The gradient boosting machine is a powerful machine learning algorithm that is gaining considerable traction in a wide range of data driven applications, such as ecology, computer vision, and biology. In the present work an energy consumption baseline modeling method based on a gradient boosting machine was proposed. To assess the performance of this method, a recently published testing procedure was used on a large dataset of 410 commercial buildings. The model training periods were varied and several prediction accuracy metrics were used to evaluate the model's performance. The results show that using the gradient boosting machine model improved the R-squared prediction accuracy and the CV(RMSE) in more than 80 percent of the cases, when compared to an industry best practice model that is based on piecewise linear regression, and to a random forest algorithm.
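
    A minimal sketch of a gradient-boosting baseline model in the spirit of the paper, using scikit-learn; the feature choice (time of week, outdoor temperature), hyperparameters, and names are assumptions rather than the published specification:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_baseline(time_of_week, temperature, load):
    """Fit a boosted-tree baseline predicting interval consumption from
    time-of-week and outdoor air temperature features."""
    X = np.column_stack([time_of_week, temperature])
    model = GradientBoostingRegressor(n_estimators=500,
                                      learning_rate=0.05,
                                      max_depth=5)
    return model.fit(X, load)

def cv_rmse(y_true, y_pred):
    """CV(RMSE): RMSE of the prediction normalised by the mean load."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)
```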

  9. Gradient boosting machine for modeling the energy consumption of commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Touzani, Samir; Granderson, Jessica; Fernandes, Samuel

    Accurate savings estimations are important to promote energy efficiency projects and demonstrate their cost-effectiveness. The increasing presence of advanced metering infrastructure (AMI) in commercial buildings has resulted in a rising availability of high frequency interval data. These data can be used for a variety of energy efficiency applications such as demand response, fault detection and diagnosis, and heating, ventilation, and air conditioning (HVAC) optimization. This large amount of data has also opened the door to the use of advanced statistical learning models, which hold promise for providing accurate building baseline energy consumption predictions, and thus accurate saving estimations. The gradient boosting machine is a powerful machine learning algorithm that is gaining considerable traction in a wide range of data driven applications, such as ecology, computer vision, and biology. In the present work an energy consumption baseline modeling method based on a gradient boosting machine was proposed. To assess the performance of this method, a recently published testing procedure was used on a large dataset of 410 commercial buildings. The model training periods were varied and several prediction accuracy metrics were used to evaluate the model's performance. The results show that using the gradient boosting machine model improved the R-squared prediction accuracy and the CV(RMSE) in more than 80 percent of the cases, when compared to an industry best practice model that is based on piecewise linear regression, and to a random forest algorithm.

  10. A Framework for Text Mining in Scientometric Study: A Case Study in Biomedicine Publications

    NASA Astrophysics Data System (ADS)

    Silalahi, V. M. M.; Hardiyati, R.; Nadhiroh, I. M.; Handayani, T.; Rahmaida, R.; Amelia, M.

    2018-04-01

    Data on Indonesian research publications in the domain of biomedicine were collected and text mined for the purpose of a scientometric study. The goal is to build a predictive model that classifies research publications by their potential for downstreaming. The model is based on the drug development processes adapted from the literature. An effort is described to build the conceptual model and to develop a corpus of research publications in the domain of Indonesian biomedicine. An investigation is then conducted into the problems associated with building the corpus and validating the model. Based on our experience, a framework is proposed to manage scientometric studies based on text mining. Our method shows the effectiveness of conducting a text-mining-based scientometric study to obtain a valid classification model. This valid model is mainly supported by iterative and close interaction with the domain experts, from identifying the issues and building a conceptual model to labelling, validation and interpretation of the results.

  11. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    PubMed

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.

  12. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922

  13. Comparative study on diagonal equivalent methods of masonry infill panel

    NASA Astrophysics Data System (ADS)

    Amalia, Aniendhita Rizki; Iranata, Data

    2017-06-01

    Infrastructure construction in earthquake-prone areas needs a good design process, including modeling the structure correctly so as to reduce the damage caused by an earthquake. Earthquakes cause much dangerous damage, e.g. collapsed buildings. An incorrect model in the design process affects the structure's ability to respond to loads such as earthquake loads, and this must be attended to in order to reduce damage and fatalities. A correct model considers every aspect that affects the strength of a building, including the stiffness resisting the lateral loads caused by an earthquake. Most structural analyses still use the open frame method, which does not consider the effect of the stiffness of the masonry panel on the stiffness and strength of the whole structure. The effect of the masonry panel is usually not included in the design process, but the presence of this panel greatly affects the behavior of the building in responding to an earthquake; in the worst case it can even cause the building to collapse, as has been reported after great earthquakes worldwide. A structure with masonry panels can be modeled by designing the panel as a compression brace or as a shell element. For the compression brace idealization, fourteen methods are popular among structural designers, formulated by Saneinejad-Hobbs, Holmes, Stafford-Smith, Mainstone, Mainstone-Weeks, Bazan-Meli, Liauw-Kwan, Paulay and Priestley, FEMA 356, Durrani-Luo, Hendry, Al-Chaar, Papia and Chen-Iranata. Every method has its own equation and parameters, so the model from every method was compared with the results of an experimental test to see which gives closer values; the methods also need to be compared with the open frame to check whether the values stay within limits. The experimental reference was taken from Mehrabi's research (Fig. 1): a prototype of a frame in a structure at 0.5 scale, with a height-to-width ratio of 1 to 1.5, loaded according to the Uniform Building Code (UBC) 1991. For every method the equivalent diagonal strut width was first calculated; each method was then modelled in structural analysis software as a frame with a diagonal in linear mode, the linear mode being chosen to match the structural analysis commonly used by designers. The frame was loaded, and the load-deformation values of every model were identified and compared with those of Mehrabi's experimental test specimen and the open frame. From the comparative study performed, Holmes' and Bazan-Meli's equations gave results closest to the experimental test specimen. Other equations that gave close values within the limit (by comparison with the open frame) are Saneinejad-Hobbs, Stafford-Smith, Bazan-Meli, Liauw-Kwan, Paulay and Priestley, FEMA 356, Durrani-Luo, Hendry, Papia and Chen-Iranata.
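
    For reference, one widely cited strut-width equation from this family is Mainstone's, here in FEMA 306-style notation, where a is the equivalent strut width, r_inf the infill diagonal length, t_inf its thickness, h_inf its height, θ the diagonal angle, E_me and E_fe the masonry and frame elastic moduli, and I_col and h_col the column moment of inertia and height; the coefficients should be checked against the original source:

```latex
\lambda_1 = \left[\frac{E_{me}\, t_{inf}\, \sin 2\theta}{4\, E_{fe}\, I_{col}\, h_{inf}}\right]^{1/4},
\qquad
a = 0.175\,\left(\lambda_1\, h_{col}\right)^{-0.4} r_{inf}
```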

  14. Damage estimation of subterranean building constructions due to groundwater inundation - the GIS-based model approach GRUWAD

    NASA Astrophysics Data System (ADS)

    Schinke, R.; Neubert, M.; Hennersdorf, J.; Stodolny, U.; Sommer, T.; Naumann, T.

    2012-09-01

    The analysis and management of flood risk commonly focus on surface water floods, because these are often associated with high economic losses due to damage to buildings and settlements. Rising groundwater, as a secondary effect of these floods, induces additional damage, particularly in the basements of buildings. These losses mostly remain underestimated, because they are difficult to assess, especially for the entire building stock of flood-prone urban areas. For this purpose an appropriate methodology has been developed, leading to a groundwater damage simulation model named GRUWAD. The overall methodology combines various engineering and geoinformatic methods to calculate the major damage processes caused by high groundwater levels. It considers a classification of buildings by building type, synthetic depth-damage functions for groundwater inundation, and the results of a groundwater-flow model. The modular structure of the procedure can be adapted in its level of detail, so the model allows damage calculations from the local to the regional scale. Among other uses, it can prepare risk maps, support ex-ante analyses of future risks, and simulate the effects of mitigation measures. The model is therefore a versatile tool for determining urban resilience with respect to high groundwater levels.
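
    A minimal sketch of how synthetic depth-damage functions enter such a calculation; the curves and values below are invented placeholders, not GRUWAD's functions:

```python
import numpy as np

# Hypothetical synthetic depth-damage curves per building type:
# damage ratio as a function of groundwater depth above the basement floor (m).
DEPTH = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
DAMAGE_RATIO = {
    "residential_with_basement": np.array([0.0, 0.05, 0.12, 0.20, 0.30]),
    "commercial_with_basement":  np.array([0.0, 0.08, 0.18, 0.30, 0.45]),
}

def basement_damage(building_type, inundation_depth, replacement_value):
    """Interpolate the damage ratio for the given depth and scale it
    by the building's replacement value."""
    ratio = np.interp(inundation_depth, DEPTH, DAMAGE_RATIO[building_type])
    return ratio * replacement_value
```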

  15. Modular Exposure Disaggregation Methodologies for Catastrophe Modelling using GIS and Remotely-Sensed Data

    NASA Astrophysics Data System (ADS)

    Foulser-Piggott, R.; Saito, K.; Spence, R.

    2012-04-01

    Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.

  16. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    The development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperatures and for common residential houses, together with new European and national directives and technical standards, stimulate the development of computational tools that are complex and robust, yet sufficiently simple and inexpensive, to support design and the optimization of energy consumption. This paper demonstrates how these seemingly contradictory requirements can be reconciled, using a simplified non-stationary thermal model of a building motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of the solutions come from the method of lines.
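
    A minimal sketch of the electric-circuit analogy: a single-zone 1R1C model, dT/dt = (T_out - T)/(R C) + Q/C, integrated with the explicit Euler method. The R and C values and all names are illustrative:

```python
import numpy as np

def simulate_zone(t_out, q_gain, R=0.005, C=2.0e7, t0=20.0, dt=600.0):
    """Interior temperature of a 1R1C zone model.
    t_out, q_gain: arrays per time step [degC, W]; R in K/W, C in J/K."""
    temps = np.empty(len(t_out))
    temp = t0
    for i, (to, q) in enumerate(zip(t_out, q_gain)):
        temp += dt * ((to - temp) / (R * C) + q / C)  # explicit Euler step
        temps[i] = temp
    return temps
```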

  17. Building surgical capacity in low-resource countries: a qualitative analysis of task shifting from surgeon volunteers' perspectives.

    PubMed

    Aliu, Oluseyi; Corlew, Scott D; Heisler, Michele E; Pannucci, Christopher J; Chung, Kevin C

    2014-01-01

    Surgical volunteer organizations (SVOs) focus considerable resources on addressing the backlog of cases in low-resource countries. This model of service may perpetuate dependency. Efforts should focus on models that establish independence in providing surgical care. Independence could be achieved through surgical capacity building. However, there has been scant discussion in literature on SVO involvement in surgical capacity building. Using qualitative methods, we evaluated the perspectives of surgeons with extensive volunteer experience in low-resource countries. We collected data through in-depth interviews that centered on SVOs using task shifting as a tool for surgical capacity building. Some of the key themes from our analysis include the ethical ramifications of task shifting, the challenges of addressing technical and clinical education in capacity building for low-resource settings, and the allocation of limited volunteer resources toward surgical capacity building. These themes will be the foundation of subsequent studies that will focus on other stakeholders in surgical capacity building including host communities and SVO administrators.

  18. Toward a virtual building laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klems, J.H.; Finlayson, E.U.; Olsen, T.H.

    1999-03-01

    In order to achieve in a timely manner the large energy and dollar savings technically possible through improvements in building energy efficiency, it will be necessary to solve the problem of design failure risk. The most economical method of doing this would be to learn to calculate building performance with sufficient detail, accuracy and reliability to avoid design failure. Existing building simulation models (BSM) are a large step in this direction, but are still not capable of this level of modeling. Developments in computational fluid dynamics (CFD) techniques now allow one to construct a road map from present BSMs to a complete building physical model. The most useful first step is a building interior model (BIM) that would allow prediction of local conditions affecting occupant health and comfort. To provide reliable prediction a BIM must incorporate the correct physical boundary conditions on a building interior. Doing so raises a number of specific technical problems and research questions. The solution of these within a context useful for building research and design is not likely to result from other research on CFD, which is directed toward the solution of different types of problems. A six-step plan for incorporating the correct boundary conditions within the context of the model problem of a large atrium has been outlined. A promising strategy for constructing a BIM is the overset grid technique for representing a building space in a CFD calculation. This technique promises to adapt well to building design and allows a step-by-step approach. A state-of-the-art CFD computer code using this technique has been adapted to the problem and can form the departure point for this research.

  19. An Interactive GIS Procedure for Building and Basement Corrections in Urban Microgravity Surveys

    NASA Astrophysics Data System (ADS)

    Chasseriau, P.; Olivier, R.

    2007-12-01

    Construction of a new underground railway in Lausanne, a highly urbanized city in Switzerland, was an opportunity to test the feasibility and reliability of microgravity surveys in urban environments. The goal of our microgravity survey was to determine the depth-to-bedrock along the project corridor. Available drilling information allowed us to verify the density model obtained. The geophysical results also provided spatially exhaustive subsurface information that could not be obtained with drilling methods alone. Gravimetry is one of the rare geophysical methods that can be used in noisy urban environments. An inevitable constraint of this method is terrain correction. It is not easy to obtain a simple and accurate digital elevation model (DEM) of an urban environment, considering that buildings and basements are not included; however, these structures significantly influence gravity measurements. We calculate, with software that we have developed, the influence of buildings and basements in order to correct our gravity data. Our procedure permits the integration of gravity measurements, cadastral information (building typology and geometry) and basement geometry in an Access database that allows interactive determination of the Bouguer anomaly. A geographic information system (GIS) is used to extract building geometries based on cadastral information and to correct the influence of each building using a simplified architectural style. Basement voids are then introduced into the final DEM using building outlines given by cadastral maps. The depths and altitudes of the basements are measured by visiting them and then linking the results to a regional topographic map. All of these corrections can be calculated before the gravity acquisition has begun in order to optimize the design of the survey. The surveys are executed late at night so as to minimize the effects of traffic noise. In total, 160 gravity measurements were carried out before and after the digging of the underground tunnel. The difference between the gravimetric values of the two surveys permitted validation of our modelling code.
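
    The order of magnitude of a basement's effect on gravity can be illustrated with the infinite Bouguer slab approximation, Δg = 2πGρt; real building and basement corrections use rectangular-prism formulas, so the sketch below is only a first-order check, and the density contrast and depth are assumptions:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_effect(thickness_m, density_contrast=-2000.0):
    """First-order gravity effect of a basement void treated as an
    infinite slab of negative density contrast (air minus soil,
    roughly -2000 kg/m^3), returned in microGal."""
    return 2.0 * np.pi * G * density_contrast * thickness_m * 1e8  # m/s^2 -> uGal

print(bouguer_slab_effect(2.5))   # a 2.5 m deep basement: about -210 uGal
```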

  20. Analysis on the restriction factors of the green building scale promotion based on DEMATEL

    NASA Astrophysics Data System (ADS)

    Wenxia, Hong; Zhenyao, Jiang; Zhao, Yang

    2017-03-01

    In order to promote the large-scale development of green building in our country, the DEMATEL method was used to classify the factors restricting green building development into three parts: the green building market, green technology and the macro economy. Through the DEMATEL model, the interaction mechanism of each part was analyzed, the degree of mutual influence of each barrier factor affecting green building promotion was quantitatively analysed, and the key factors for the development of green building in China were determined. In addition, implementation strategies for promoting the large-scale development of green building in our country are put forward. This research provides an important reference, of both policy and practical value, for making green building promotion policies.
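
    A minimal sketch of the core DEMATEL computation: normalising the direct-influence matrix and forming the total-relation matrix T = N(I - N)^{-1}, from which prominence (D+R) and relation (D-R) are read. The 3x3 scores are hypothetical:

```python
import numpy as np

def dematel(direct_influence):
    """DEMATEL total-relation matrix T = N (I - N)^-1, where N is the
    direct-influence matrix normalised by its largest row sum.
    Returns T plus the prominence (D+R) and relation (D-R) vectors."""
    A = np.asarray(direct_influence, dtype=float)
    N = A / A.sum(axis=1).max()                 # normalisation
    T = N @ np.linalg.inv(np.eye(len(A)) - N)   # total-relation matrix
    D, R = T.sum(axis=1), T.sum(axis=0)         # influence given / received
    return T, D + R, D - R

# Hypothetical direct-influence scores among market, technology, economy
A = [[0, 3, 2],
     [2, 0, 1],
     [3, 2, 0]]
T, prominence, relation = dematel(A)
```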

  1. Improved tsunami impact assessments: validation, comparison and the integration of hydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Tarbotton, C.; Walters, R. A.; Goff, J. R.; Dominey-Howes, D.; Turner, I. L.

    2012-12-01

    As communities become increasingly aware of the risks posed by tsunamis, it is important to develop methods for predicting the damage they can cause to the built environment. This will provide the information needed to make informed decisions regarding land use, building codes, and evacuation. At present, a number of tsunami-building vulnerability assessment models are available; however, the relative infrequency and destructive nature of tsunamis has long made it difficult to obtain the data necessary to adequately validate and compare them. Further complicating matters, the inundation of a tsunami in the built environment is very difficult to model, as is the response of a building to the hydraulic forces that a tsunami generates. Variations in building design and condition significantly affect a building's susceptibility to damage; likewise, factors affecting the flow conditions at a building (i.e. surrounding structures and topography) greatly affect its exposure. This presents significant challenges for practitioners, who are often left in the dark on how to use hazard modeling and vulnerability assessment techniques together to conduct the community-scale impact studies required for tsunami planning. This paper presents the results of an in-depth case study of Yuriage, Miyagi Prefecture, a coastal city in Japan that was badly damaged by the 2011 Tohoku tsunami. The aim of the study was twofold: 1) to test and compare existing tsunami vulnerability assessment models and 2) to utilize hydrodynamic models more effectively in the context of tsunami impact studies. Following the 2011 Tohoku event, an unprecedented quantity of field data, imagery and video emerged. Yuriage in particular features a comprehensive set of street-level Google Street View imagery, available both before and after the event. This has enabled the collection of a large dataset describing the characteristics of the buildings existing before the event as well as the damage that they subsequently sustained. These data, together with detailed results from hydrodynamic models, have been used to provide the building, damage and hazard data necessary to rigorously test and compare existing vulnerability assessment techniques. The result is a much-improved understanding of the capabilities of existing vulnerability assessment techniques, as well as important improvements to their assessment framework. This provides much-needed guidance to practitioners on how to conduct tsunami impact assessments in the future. Furthermore, the study introduces new methods of integrating hydrodynamic models into vulnerability assessment models, offering guidance on how to model tsunami inundation in the built environment more effectively.

  2. Metric Survey and BIM Technologies to Record Decay Conditions

    NASA Astrophysics Data System (ADS)

    Lo Turco, M.; Mattone, M.; Rinaudo, F.

    2017-05-01

    The paper proposes a method able to describe, classify and organize information assets concerned with Architectural Heritage, through the use of integrated survey procedures mainly based on the Terrestrial Laser Scanner (TLS). The point clouds are then imported into Building Information Modelling (BIM) software to start the modelling phase. In recent years, Building Information Modelling has emerged as the most reliable method to manage architectural design and building processes, and the literature supplies both theoretical approaches and several practical applications. However, very little research is devoted to BIM applied to historical architecture, even if some initial results indicate HBIM (Historic/Heritage BIM) as a possible instrument for designing interventions aimed at the conservation of Cultural Heritage. The focus of the research is the creation of parametric objects representing the preservation status of materials and building components: 3D modelling of decay in the BIM platform makes it possible to enrich the related database with graphic, geometric and alphanumeric data that can be effectively used to design and manage future interventions. The added value lies in the capability to associate new parameters that describe both the state of conservation of the materials and a detailed description of the interventions needed to restore the building. The analysed case study belongs to Ferrovie dello Stato (the main Italian railway company) and is part of a maintenance area originally constituted by a roundhouse containing 51 sheltered railroad tracks and two big sheds.

  3. A case study on the historical peninsula of Istanbul based on three-dimensional modeling by using photogrammetry and terrestrial laser scanning.

    PubMed

    Ergun, Bahadir; Sahin, Cumhur; Baz, Ibrahim; Ustuntas, Taner

    2010-06-01

    Terrestrial laser scanning is a popular methodology frequently used in documenting historical buildings and cultural heritage. The historical peninsula sprawls over an area of approximately 1,500 ha and is one of the main areas in Istanbul where historical buildings are concentrated. In this study, terrestrial laser scanning and close-range photogrammetry techniques are integrated to create a 3D city model of this part of Istanbul, including some of the buildings that represent the most brilliant eras of the Byzantine and Ottoman Empires. Several terrestrial laser scanners with different specifications were used to solve various geometric scanning problems in distinct areas of the city. The photogrammetric method was used to document the façades of these historical buildings for architectural purposes. This study differentiates itself from similar ones by an application process that focuses on the geometry, the building texture, and the density of the study area. Nowadays, the largest-scale 3D modeling studies, in terms of measurement methodology, are urban modeling studies, and because of this large scale, 3D urban modeling studies are executed in a gradual way. In this study, a modeling method based on the façades of the streets was used, and the complementary elements for the modeling process were combined in several ways. A street model is presented as a sample of the applied study. In our application of 3D modeling, modeling based on close-range photogrammetry and combined calibration data were used compatibly with the terrestrial laser scanner data. The final work was formed with the pedestal data for 3D visualization.

  4. Neuronize: a tool for building realistic neuronal cell morphologies

    PubMed Central

    Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis; DeFelipe, Javier; Benavides-Piccione, Ruth

    2013-01-01

    This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments. PMID:23761740

  5. Wind tunnel measurements of three-dimensional wakes of buildings. [for aircraft safety applications]

    NASA Technical Reports Server (NTRS)

    Logan, E., Jr.; Lin, S. H.

    1982-01-01

    Measurements relevant to the effect of buildings on the low level atmospheric boundary layer are presented. A wind tunnel experiment was undertaken to determine the nature of the flow downstream from a gap between two transversely aligned, equal sized models of rectangular cross section. These building models were immersed in an equilibrium turbulent boundary layer which was developed on a smooth floor in a zero longitudinal pressure gradient. Measurements with an inclined (45 degree) hot-wire were made at key positions downstream of models arranged with a large, small, and no gap between them. Hot-wire theory is presented which enables computation of the three mean velocity components, U, V and W, as well as Reynolds stresses. These measurements permit understanding of the character of the wake downstream of laterally spaced buildings. Surface streamline patterns obtained by the oil film method were used to delineate the separation region to the rear of the buildings for a variety of spacings.

  6. Neuronize: a tool for building realistic neuronal cell morphologies.

    PubMed

    Brito, Juan P; Mata, Susana; Bayona, Sofia; Pastor, Luis; Defelipe, Javier; Benavides-Piccione, Ruth

    2013-01-01

    This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Joyce Jihyun; Yin, Rongxin; Kiliccote, Sila

    Open Automated Demand Response (OpenADR), an XML-based information exchange model, is used to facilitate continuous price-responsive operation and demand response participation for large commercial buildings in New York that are subject to the default day-ahead hourly pricing. We summarize the existing demand response programs in New York and discuss OpenADR communication, prioritization of demand response signals, and control methods. Building energy simulation models are developed and field tests are conducted to evaluate the continuous energy management and demand response capabilities of two commercial buildings in New York City. Preliminary results reveal that providing machine-readable prices to commercial buildings can facilitate both demand response participation and continuous energy cost savings. Hence, efforts should be made to develop more sophisticated algorithms for building control systems to minimize the customer's utility bill based on price and reliability information from the electricity grid.
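
    A minimal sketch of the kind of price-responsive logic such machine-readable prices enable: widening the cooling setpoint as the day-ahead hourly price rises. The thresholds and adjustments below are assumptions, not the control sequences used in the field tests:

```python
def cooling_setpoint(price_per_mwh, base_setpoint_c=22.0):
    """Widen the cooling setpoint as the hourly price rises."""
    if price_per_mwh >= 300:
        return base_setpoint_c + 2.0   # deep shed at very high prices
    if price_per_mwh >= 100:
        return base_setpoint_c + 1.0   # moderate shed
    return base_setpoint_c             # normal operation

hourly_prices = [35, 42, 120, 310, 95]          # $/MWh, day-ahead
setpoints = [cooling_setpoint(p) for p in hourly_prices]
```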

  8. IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa

    One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants’ activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the next three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-savings measures, policies, and techniques.

  9. IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods

    DOE PAGES

    Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa

    2017-07-18

    One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants’ activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the next three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-savings measures, policies, and techniques.

  10. Building a new predictor for multiple linear regression technique-based corrective maintenance turnaround time.

    PubMed

    Cruz, Antonio M; Barr, Cameron; Puñales-Pozo, Elsa

    2008-01-01

    This research's main goals were to build a predictor for a turnaround time (TAT) indicator and to use a numerical clustering technique to find possible causes of undesirable TAT values. The following stages were used: domain understanding, data characterisation and sample reduction, and insight characterisation. The TAT predictor and the clustering techniques were used to improve corrective maintenance task efficiency in a clinical engineering department (CED). Multiple linear regression was used to build the predictive TAT model. The variables contributing to the model were clinical engineering department response time (CE(rt), 0.415 positive coefficient), stock service response time (Stock(rt), 0.734 positive coefficient), priority level (0.21 positive coefficient) and service time (0.06 positive coefficient). The regression showed heavy reliance on Stock(rt), CE(rt) and priority, in that order. Clustering revealed the main causes of high TAT values. This examination provides a means for analysing current technical service quality and effectiveness; in doing so, it demonstrates a process for identifying areas and methods of improvement and a model against which to analyse these methods' effectiveness.
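
    Read literally, the fitted model above is a linear combination of the four reported coefficients. A minimal sketch in Python, assuming a zero intercept (the abstract does not report one) and consistent time units across inputs:

      # Sketch of the reported TAT multiple linear regression predictor.
      # Coefficients come from the abstract; the intercept is a hypothetical
      # placeholder, since it is not reported.
      def predict_tat(ce_rt, stock_rt, priority, service_time, intercept=0.0):
          return (intercept
                  + 0.415 * ce_rt         # CE(rt): clinical engineering response time
                  + 0.734 * stock_rt      # Stock(rt): stock service response time
                  + 0.21 * priority       # priority level
                  + 0.06 * service_time)  # service time

      # A large Stock(rt) dominates the prediction, mirroring the reported
      # reliance on Stock(rt), CE(rt) and priority, in that order.
      print(predict_tat(ce_rt=2.0, stock_rt=5.0, priority=3, service_time=1.5))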

  11. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1996-01-01

    A method for automatically building qualitative and semi-quantitative models of dynamic systems, and using them for monitoring and fault diagnosis, is developed and demonstrated. The qualitative approach and semi-quantitative method are applied to monitoring observation streams, and to design of non-linear control systems.

  12. Multi-Level Building Reconstruction for Automatic Enhancement of High Resolution Dsms

    NASA Astrophysics Data System (ADS)

    Arefi, H.; Reinartz, P.

    2012-07-01

    In this article a multi-level approach is proposed for reconstruction-based improvement of high resolution Digital Surface Models (DSMs). The concept of Levels of Detail (LOD) defined by the CityGML standard has been considered as the basis for abstraction levels of building roof structures. Here, LOD1 and LOD2, which are related to prismatic and parametric roof shapes, are reconstructed. Besides proposing a new approach for automatic LOD1 and LOD2 generation from high resolution DSMs, the algorithm contains two generalization levels, namely horizontal and vertical. Both generalization levels are applied to the prismatic model of buildings. The horizontal generalization allows controlling the approximation level of building footprints, which is similar to the cartographic generalization concept of urban maps. In vertical generalization, the prismatic model is formed using an individual building height and continues to include all flat structures located at different height levels. The concept of LOD1 generation is based on approximation of the building footprints by rectangular or non-rectangular polygons. For a rectangular building containing one main orientation, a method based on the Minimum Bounding Rectangle (MBR) is employed. In contrast, a Combined Minimum Bounding Rectangle (CMBR) approach is proposed for regularization of non-rectilinear polygons, i.e. buildings without perpendicular edge directions. Both MBR- and CMBR-based approaches are iteratively employed on building segments to reduce the original building footprints to a minimum number of nodes with maximum similarity to the original shapes. A model-driven approach based on the analysis of the 3D points of DSMs in a 2D projection plane is proposed for LOD2 generation. Accordingly, a building block is divided into smaller parts according to the direction and number of existing ridge lines. The 3D model is derived for each building part and finally, a complete parametric model is formed by merging all the 3D models of the individual parts and adjusting the nodes after the merging step. In order to provide an enhanced DSM, a surface model is provided for each building by interpolation of the internal points of the generated models. All interpolated models are situated on a Digital Terrain Model (DTM) of the corresponding area to shape the enhanced DSM. The proposed DSM enhancement approach has been tested on a dataset from the Munich central area. The original DSM was created using robust stereo matching of WorldView-2 stereo images. A quantitative assessment of the new DSM by comparing the heights of the ridges and eaves shows a standard deviation of better than 50 cm.
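
    The MBR step above reduces a noisy footprint to a rectangle aligned with the building's main orientation. A minimal sketch using Shapely's minimum rotated rectangle as the bounding-rectangle primitive (the iterative CMBR refinement for non-rectilinear buildings is not reproduced, and the footprint coordinates are illustrative):

      from shapely.geometry import Polygon

      # Noisy L-shaped building footprint (hypothetical coordinates).
      footprint = Polygon([(0, 0), (10, 0.3), (10.2, 6), (4, 6.2),
                           (3.8, 3.1), (0.2, 3.0)])

      # LOD1-style approximation: replace the detailed outline with its
      # minimum bounding rectangle (one main orientation assumed).
      mbr = footprint.minimum_rotated_rectangle
      print(mbr.wkt)
      print("area ratio:", footprint.area / mbr.area)  # similarity check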

  13. Implementation of Kriging Methods in Mobile GIS to Estimate Damage to Buildings in Crisis Scenarios

    NASA Astrophysics Data System (ADS)

    Laun, S.; Rösch, N.; Breunig, M.; Doori, M. Al

    2016-06-01

    In the paper an example of the application of kriging methods to estimate damage to buildings in crisis scenarios is introduced. Furthermore, the Java implementations of Ordinary and Universal Kriging on mobile GIS are presented. As variogram models, an exponential, a Gaussian and a spherical variogram are tested in detail. Different test constellations are introduced with various information densities. As test data set, public data from the satellite-image analysis of the 2010 Haiti earthquake are pre-processed and visualized in a Geographic Information System. As buildings, topography and other external influences cannot be regarded as constant over the whole area under investigation, semivariograms are calculated by consulting neighbouring classified buildings using the so-called moving-window method. The evaluation of the methods shows that the underlying variogram model is the determining factor for the quality of the interpolation, rather than the choice of the kriging method or increasing the information density of a random sample. The implementation is completely realized in the Java programming language. Thereafter, the implemented software component is integrated into GeoTech Mobile, a mobile GIS Android application based on the processing of standardized spatial data representations defined by the Open Geospatial Consortium (OGC). As a result the implemented methods can be used on mobile devices, i.e. they may be transferred to other application fields. That is why we finally point out further research with new applications in the Dubai region.
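
    As a rough illustration of the interpolation machinery, the sketch below implements the three variogram models named above (in one common "practical range" parameterization) and solves an ordinary kriging system for a single prediction point. The nugget/sill/range values and sample data are placeholders, not fitted Haiti parameters:

      import numpy as np

      def exponential(h, nugget, sill, rng):
          return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

      def gaussian(h, nugget, sill, rng):
          return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / rng) ** 2))

      def spherical(h, nugget, sill, rng):
          h = np.minimum(h, rng)  # flattens at the sill beyond the range
          return nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

      # Ordinary kriging for one target point: solve [G 1; 1' 0][w; mu] = [g0; 1].
      pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.5]])
      vals = np.array([0.2, 0.6, 0.4])            # e.g. classified damage grades
      target = np.array([0.5, 0.5])

      def gamma(h):
          return exponential(h, nugget=0.05, sill=1.0, rng=2.0)

      d = np.linalg.norm(pts[:, None] - pts[None], axis=-1)
      G = gamma(d)
      np.fill_diagonal(G, 0.0)                    # gamma(0) = 0 by definition
      A = np.block([[G, np.ones((3, 1))], [np.ones((1, 3)), np.zeros((1, 1))]])
      b = np.append(gamma(np.linalg.norm(pts - target, axis=1)), 1.0)
      w = np.linalg.solve(A, b)[:3]               # kriging weights
      print("prediction:", w @ vals)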

  14. A hybrid approach to survival model building using integration of clinical and molecular information in censored data.

    PubMed

    Choi, Ickwon; Kattan, Michael W; Wells, Brian J; Yu, Changhong

    2012-01-01

    In the medical community, prognostic models, which use clinicopathologic features to predict prognosis after a certain treatment, have been externally validated and used in practice. In recent years, most research has focused on high dimensional genomic data and small sample sizes. Since clinically similar but molecularly heterogeneous tumors may produce different clinical outcomes, the combination of clinical and genomic information, which may be complementary, is crucial to improve the quality of prognostic predictions. However, there is a lack of an integrating scheme for clinico-genomic models due to the P ≥ N problem, in particular for a parsimonious model. We propose a methodology to build a reduced yet accurate integrative model using a hybrid approach based on the Cox regression model, which uses several dimension reduction techniques, L₂ penalized maximum likelihood estimation (PMLE), and resampling methods to tackle the problem. The predictive accuracy of the modeling approach is assessed by several metrics via an independent and thorough scheme to compare competing methods. In breast cancer data studies on metastasis and death events, we show that the proposed methodology can improve prediction accuracy and build a final model with a hybrid signature that is parsimonious when integrating both types of variables.
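
    The L2-penalized Cox step can be approximated off the shelf. A hedged sketch using the lifelines library as a stand-in for the authors' PMLE implementation; the tiny data frame and its clinical/reduced-genomic columns are hypothetical:

      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.DataFrame({
          "time":     [5.1, 3.2, 8.7, 2.4, 6.6, 4.3],    # follow-up time
          "event":    [1, 0, 1, 1, 0, 1],                # event observed?
          "age":      [54, 61, 47, 70, 58, 66],          # clinical covariate
          "gene_pc1": [0.8, -0.2, 1.1, -0.9, 0.3, 0.5],  # reduced genomic feature
      })

      # l1_ratio=0 makes the penalizer a pure L2 (ridge) penalty.
      cph = CoxPHFitter(penalizer=0.5, l1_ratio=0.0)
      cph.fit(df, duration_col="time", event_col="event")
      cph.print_summary()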

  15. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  16. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
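
    Of the four methods, the output ratio calibration is simple enough to state in a few lines: scale the simulated energy use by the ratio of measured to simulated totals. A sketch with synthetic monthly data (not BESTEST-EX values):

      import numpy as np

      measured  = np.array([920, 850, 700, 610, 830, 1190,
                            1420, 1380, 1100, 760, 680, 870])  # kWh/month
      simulated = np.array([800, 760, 640, 590, 790, 1050,
                            1240, 1210, 980, 700, 640, 780])

      ratio = measured.sum() / simulated.sum()   # single output scale factor
      calibrated = simulated * ratio
      print(f"scale factor: {ratio:.3f}")
      print(f"annual error after calibration: "
            f"{abs(calibrated.sum() - measured.sum()):.1f} kWh")  # ~0 by design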

  17. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Joseph; Polly, Ben; Collis, Jon

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  18. Construction of a model demonstrating neural pathways and reflex arcs.

    PubMed

    Chan, V; Pisegna, J M; Rosian, R L; DiCarlo, S E

    1996-12-01

    Employment opportunities in the future will require higher skills and an understanding of mathematics and science. As a result of the growing number of careers that require solid science and mathematics training, the methods of science education are undergoing major reform. To adequately equip students for technologically advanced positions, new teaching methods must be developed that prepare tomorrow's workforce for the challenges of the 21st century. One such method is the use of models. By actively building and manipulating concrete models that represent scientific concepts, students are involved in the most basic level of Piaget's learning scheme: the sensorimotor stage. Models are useful for reaching all students at the foundational levels of learning, and subsequent learning experiences move rapidly through higher learning levels. This success ensures greater comprehension and understanding compared with the traditional methods of rote memorization. We developed an exercise for the construction of an inexpensive, easy-to-build model demonstrating neural pathways and reflex arcs. Our exercise also includes many supplemental teaching tools. The exercise is designed to fulfill the need for sound physiological teaching materials for high school students.

  19. Modeling carbon dioxide emissions reductions for three commercial reference buildings in Salt Lake City

    NASA Astrophysics Data System (ADS)

    Lucich, Stephen M.

    In the United States, the buildings sector is responsible for approximately 40% of the national carbon dioxide (CO2) emissions. CO2 is created during the generation of heat and electricity, and has been linked to climate change, acid rain, a variety of health threats, surface water depletion, and the destruction of natural habitats. Building energy modeling is a powerful educational tool that building owners, architects, engineers, city planners, and policy makers can use to make informed decisions. The aim of this thesis is to simulate the reduction in CO2 emissions that may be achieved for three commercial buildings located in Salt Lake City, UT. The following two questions were used to guide this process: 1. How much can a building's annual CO2 emissions be reduced through a specific energy efficiency upgrade or policy? 2. How much can a building's annual CO2 emissions be reduced through the addition of a photovoltaic (PV) array? How large should the array be? Building energy simulations were performed with the Department of Energy's EnergyPlus software, commercial reference building models, and TMY3 weather data. The chosen models were a medium office building, a primary school, and a supermarket. Baseline energy consumption data were simulated for each model in order to identify changes that would have a meaningful impact. Modifications to the buildings' construction and operation were considered before a PV array was incorporated. These modifications include (1) an improved building envelope, (2) reduced lighting intensity, and (3) modified HVAC temperature set points. The PV array sizing was optimized using a demand matching approach based on the method of least squares. The array's tilt angle was optimized using the golden section search algorithm. Combined, energy efficiency upgrades and the PV array reduced building CO2 emissions by 58.6, 54.0, and 52.2% for the medium office, primary school, and supermarket, respectively. However, for these models, it was determined that the addition of a PV array is not feasible from a purely economic viewpoint. Several avenues for expansion of this research are presented in Chapter 5.
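
    The two optimizations named above are both one-dimensional. A sketch with toy stand-ins for the EnergyPlus/TMY3 outputs: demand matching by least squares has the closed-form array size shown below, and the tilt angle is found with a classic golden-section search:

      import numpy as np

      demand  = np.array([30., 28, 27, 35, 50, 65, 70, 68, 60, 48, 38, 32])  # kW
      pv_unit = np.array([0., 0, 1, 3, 6, 8, 9, 8, 6, 3, 1, 0])  # kW per kW rated

      # Least-squares demand matching: minimize ||demand - size * pv_unit||^2.
      size = pv_unit.dot(demand) / pv_unit.dot(pv_unit)
      print(f"array size: {size:.1f} kW")

      def neg_yield(tilt_deg, latitude=40.7):       # toy model, minimized below
          return -np.cos(np.radians(tilt_deg - latitude))

      def golden_section(f, a, b, tol=1e-3):
          """Golden-section minimization of a unimodal f on [a, b]."""
          phi = (np.sqrt(5) - 1) / 2
          c, d = b - phi * (b - a), a + phi * (b - a)
          while abs(b - a) > tol:
              if f(c) < f(d):
                  b, d = d, c
                  c = b - phi * (b - a)
              else:
                  a, c = c, d
                  d = a + phi * (b - a)
          return (a + b) / 2

      print(f"optimal tilt: {golden_section(neg_yield, 0, 90):.1f} deg")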

  20. 40 CFR Appendix C to Subpart E of... - Asbestos Model Accreditation Plan

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of friable ACBM. 6. “Public and commercial building” means the interior space of any building which..., warehouses and factories. Interior space includes exterior hallways connecting buildings, porticos, and mechanical systems used to condition interior space. 7. “Response action” means a method, including removal...

  1. 40 CFR Appendix C to Subpart E of... - Asbestos Model Accreditation Plan

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of friable ACBM. 6. “Public and commercial building” means the interior space of any building which..., warehouses and factories. Interior space includes exterior hallways connecting buildings, porticos, and mechanical systems used to condition interior space. 7. “Response action” means a method, including removal...

  2. 40 CFR Appendix C to Subpart E of... - Asbestos Model Accreditation Plan

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of friable ACBM. 6. “Public and commercial building” means the interior space of any building which..., warehouses and factories. Interior space includes exterior hallways connecting buildings, porticos, and mechanical systems used to condition interior space. 7. “Response action” means a method, including removal...

  3. 40 CFR Appendix C to Subpart E of... - Asbestos Model Accreditation Plan

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of friable ACBM. 6. “Public and commercial building” means the interior space of any building which..., warehouses and factories. Interior space includes exterior hallways connecting buildings, porticos, and mechanical systems used to condition interior space. 7. “Response action” means a method, including removal...

  4. 40 CFR Appendix C to Subpart E of... - Asbestos Model Accreditation Plan

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of friable ACBM. 6. “Public and commercial building” means the interior space of any building which..., warehouses and factories. Interior space includes exterior hallways connecting buildings, porticos, and mechanical systems used to condition interior space. 7. “Response action” means a method, including removal...

  5. DEVELOPMENT AND APPLICATIONS OF CFD SIMULATIONS IN SUPPORT OF AIR QUALITY STUDIES INVOLVING BUILDINGS

    EPA Science Inventory

    There is a need to properly develop the application of Computational Fluid Dynamics (CFD) methods in support of air quality studies involving pollution sources near buildings at industrial sites. CFD models are emerging as a promising technology for such assessments, in part due ...

  6. Education for Liberation: A Precursor to Youth Activism for Social Justice

    ERIC Educational Resources Information Center

    Atkinson, Kristen N.

    2012-01-01

    This paper presents a participatory research approach to the study of youth activism within a community development and movement-building program. It employs participatory ethnography theory and methods to explore an innovative model of social change for social justice. Building on community youth development and transformative social work…

  7. Integration of Point Clouds Dataset from Different Sensors

    NASA Astrophysics Data System (ADS)

    Abdullah, C. K. A. F. Che Ku; Baharuddin, N. Z. S.; Ariff, M. F. M.; Majid, Z.; Lau, C. L.; Yusoff, A. R.; Idris, K. M.; Aspuri, A.

    2017-02-01

    Laser scanner technology has become an option in the process of collecting data nowadays. It comprises Airborne Laser Scanning (ALS) and Terrestrial Laser Scanning (TLS). An ALS such as the Phoenix AL3-32 can provide accurate information from the viewpoint of the rooftop, while a TLS such as the Leica C10 can provide complete data for the building facade. However, if both are integrated, more accurate data can be produced. The focus of this study is to integrate both types of data acquisition, ALS and TLS, and to determine the accuracy of the data obtained. The final results are used to generate three-dimensional (3D) building models. The scope of this study focuses on data acquisition of the UTM Eco-home through laser scanning methods: ALS scanning the roof and TLS scanning the building facade. Both devices are used to ensure that no part of the building is left unscanned. In the data integration process, both data sets are registered by points selected among man-made features which are clearly visible, using the Cyclone 7.3 software. The accuracy of the integrated data is determined based on an accuracy assessment carried out using man-made registration methods. The result of the integration process achieves an accuracy below 0.04 m. The integrated data are then used to generate a 3D model of the UTM Eco-home building using the SketchUp software. In conclusion, the combination of ALS and TLS data acquisition produces accurate integrated data that can be used to generate a 3D model of the UTM Eco-home. For visualization purposes, the generated 3D building model is prepared in Level of Detail 3 (LOD3), as recommended by the City Geography Markup Language (CityGML).

  8. Automatic building detection based on Purposive FastICA (PFICA) algorithm using monocular high resolution Google Earth images

    NASA Astrophysics Data System (ADS)

    Ghaffarian, Saman; Ghaffarian, Salar

    2014-11-01

    This paper proposes an improved FastICA model, named Purposive FastICA (PFICA), initialized by a simple color space transformation and a novel masking approach, to automatically detect buildings from high resolution Google Earth imagery. The ICA and FastICA algorithms are Blind Source Separation (BSS) techniques for unmixing source signals using reference data sets. In order to overcome the limitations of the ICA and FastICA algorithms and make them purposeful, we developed a novel method involving three main steps: (1) improving the FastICA algorithm using the Moore-Penrose pseudo-inverse matrix model; (2) automated seeding of the PFICA algorithm based on the LUV color space and proposed simple rules to split the image into three regions: shadow + vegetation, bare soil + roads, and buildings, respectively; (3) masking out the final building detection results from the PFICA outputs utilizing the K-means clustering algorithm with two clusters and conducting simple morphological operations to remove noise. Evaluation of the results illustrates that buildings detected from dense and suburban districts with diverse characteristics and color combinations using our proposed method achieve 88.6% and 85.5% overall pixel-based and object-based precision, respectively.
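
    Steps 1 and 3 above map onto standard library pieces. A hedged sketch with scikit-learn (FastICA unmixing followed by two-cluster K-means masking); the 64 x 64 RGB array is a random placeholder for a Google Earth tile, and the PFICA seeding rules are not reproduced:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      rgb = rng.random((64, 64, 3))           # placeholder image tile
      pixels = rgb.reshape(-1, 3)

      ica = FastICA(n_components=3, random_state=0)
      sources = ica.fit_transform(pixels)     # unmixed "source" components

      # Mask the building-related component into building / non-building.
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
          sources[:, [0]])
      building_mask = labels.reshape(64, 64)
      print(building_mask.sum(), "candidate building pixels")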

  9. Reconstruction of 3d Objects of Assets and Facilities by Using Benchmark Points

    NASA Astrophysics Data System (ADS)

    Baig, S. U.; Rahman, A. A.

    2013-08-01

    Acquiring and modeling 3D geo-data of building assets and facility objects is one of the challenges in this field. A number of methods and technologies are utilized for this purpose; total station, GPS, photogrammetric and terrestrial laser scanning are a few of these technologies. In this paper, points commonly shared by potential facades of assets and facilities modeled from point clouds are identified. These points are useful in the modeling process to reconstruct 3D models of assets and facilities, stored for use in management. These models are segmented through different planes to produce accurate 2D plans. This novel method improves the efficiency and quality of constructing models of assets and facilities, with the aim of utilizing them in 3D management projects such as the maintenance of buildings or of groups of items that need to be replaced or renovated for new services.

  10. Assessment of Pansharpening Methods Applied to WorldView-2 Imagery Fusion.

    PubMed

    Li, Hui; Jing, Linhai; Tang, Yunwei

    2017-01-05

    Since WorldView-2 (WV-2) images are widely used in various fields, there is a high demand for high-quality pansharpened WV-2 images for different application purposes. Given the novelty of the WV-2 multispectral (MS) and panchromatic (PAN) bands, the performances of eight state-of-the-art pan-sharpening methods for WV-2 imagery were assessed in this study on six datasets from three WV-2 scenes, using both quality indices and information indices, along with visual inspection. The normalized difference vegetation index, normalized difference water index, and morphological building index, which are widely used in applications related to land cover classification and the extraction of vegetation areas, buildings, and water bodies, were employed in this work to evaluate the performance of different pansharpening methods in terms of information presentation ability. The experimental results show that the Haze- and Ratio-based method, the adaptive Gram-Schmidt method, and the Generalized Laplacian Pyramid (GLP) methods using the enhanced spectral distortion minimal model and the enhanced context-based decision model are good choices for producing fused WV-2 images used for image interpretation and the extraction of urban buildings. The two GLP-based methods are better choices than the other methods if the fused images will be used for applications related to vegetation and water bodies.
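
    The information indices above are simple band ratios, so the evaluation reduces to computing an index on the fused image and comparing it against the reference. A sketch with synthetic stand-ins for the WV-2 red, green and NIR bands:

      import numpy as np

      rng = np.random.default_rng(1)
      red, green, nir = (rng.random((100, 100)) for _ in range(3))

      def ndvi(nir, red):
          return (nir - red) / (nir + red + 1e-9)

      def ndwi(green, nir):
          return (green - nir) / (green + nir + 1e-9)

      ref = ndvi(nir, red)
      fused = ndvi(1.02 * nir, 0.98 * red)   # stand-in for a pansharpened result
      print("NDVI RMSE:", np.sqrt(np.mean((fused - ref) ** 2)))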

  11. Assessment of Pansharpening Methods Applied to WorldView-2 Imagery Fusion

    PubMed Central

    Li, Hui; Jing, Linhai; Tang, Yunwei

    2017-01-01

    Since WorldView-2 (WV-2) images are widely used in various fields, there is a high demand for high-quality pansharpened WV-2 images for different application purposes. Given the novelty of the WV-2 multispectral (MS) and panchromatic (PAN) bands, the performances of eight state-of-the-art pan-sharpening methods for WV-2 imagery were assessed in this study on six datasets from three WV-2 scenes, using both quality indices and information indices, along with visual inspection. The normalized difference vegetation index, normalized difference water index, and morphological building index, which are widely used in applications related to land cover classification and the extraction of vegetation areas, buildings, and water bodies, were employed in this work to evaluate the performance of different pansharpening methods in terms of information presentation ability. The experimental results show that the Haze- and Ratio-based method, the adaptive Gram-Schmidt method, and the Generalized Laplacian Pyramid (GLP) methods using the enhanced spectral distortion minimal model and the enhanced context-based decision model are good choices for producing fused WV-2 images used for image interpretation and the extraction of urban buildings. The two GLP-based methods are better choices than the other methods if the fused images will be used for applications related to vegetation and water bodies. PMID:28067770

  12. Study on the wind field and pollutant dispersion in street canyons using a stable numerical method.

    PubMed

    Xia, Ji-Yang; Leung, Dennis Y C

    2005-01-01

    A stable finite element method for the time dependent Navier-Stokes equations was used for studying the wind flow and pollutant dispersion within street canyons. A three-step fractional method was used to solve the velocity field and the pressure field separately from the governing equations. The Streamline Upwind Petrov-Galerkin (SUPG) method was used to obtain stable numerical results. Numerical oscillation was minimized and satisfactory results can be obtained for flows at high Reynolds numbers. The wind field model was validated by simulating the flow over a square cylinder over a wide range of Reynolds numbers. The Strouhal numbers obtained from the numerical simulation showed good agreement with those obtained from experiments. The wind field model developed in the present study is applied to simulate more complex flow phenomena in street canyons with two different building configurations. The results indicated that the flow at the rooftop of buildings should not be assumed to be parallel to the ground, as some numerical modelers have done. A counter-clockwise rotating vortex may be found in street canyons with an inflow from left to right. In addition, increasing building height can increase velocity fluctuations in the street canyon under certain circumstances, which facilitates pollutant dispersion. At high Reynolds numbers, the flow regimes in street canyons do not change with inflow velocity.

  13. Mass and stiffness estimation using mobile devices for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Le, Viet; Yu, Tzuyang

    2015-04-01

    In the structural health monitoring (SHM) of civil infrastructure, dynamic methods using mass, damping, and stiffness to characterize structural health have been a traditional and widely used approach. Changes in these system parameters over time indicate the progress of structural degradation or deterioration. In these methods, the capability of predicting system parameters is essential to success. In this paper, research work on the development of a dynamic SHM method based on perturbation analysis is reported. The concept is to use an externally applied mass to perturb an unknown system and measure the natural frequency of the system. Derived theoretical expressions for mass and stiffness prediction are experimentally verified on a building model. Dynamic responses of the building model perturbed by various masses in free vibration were experimentally measured by a mobile device (cell phone) to extract the natural frequency of the building model. A single-degree-of-freedom (SDOF) modeling approach was adopted for the sake of using a cell phone. From the experimental results, it is shown that the percentage error of the predicted mass increases when the mass ratio increases, while the percentage error of the predicted stiffness decreases when the mass ratio increases. This work also demonstrates the potential use of mobile devices in the health monitoring of civil infrastructure.
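
    For an SDOF system, the perturbation idea reduces to two equations: w1^2 = k/M before and w2^2 = k/(M + dm) after adding a known mass dm, which solve to M = dm * w2^2 / (w1^2 - w2^2) and k = M * w1^2. A worked sketch with illustrative frequencies (not the paper's measurements):

      import math

      def identify_sdof(f1_hz, f2_hz, dm_kg):
          """Recover mass and stiffness from frequencies before/after adding dm."""
          w1, w2 = 2 * math.pi * f1_hz, 2 * math.pi * f2_hz
          m = dm_kg * w2 ** 2 / (w1 ** 2 - w2 ** 2)  # unperturbed mass
          k = m * w1 ** 2                            # stiffness
          return m, k

      m, k = identify_sdof(f1_hz=4.80, f2_hz=4.35, dm_kg=0.5)
      print(f"mass ~ {m:.2f} kg, stiffness ~ {k:.0f} N/m")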

  14. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between the energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses a data mining technique and was found to perform slightly better than the Portfolio Manager. The broader impact of the proposed benchmarking methodology is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in the practical implementation of benchmarking tools which rely on query-based building and HVAC variable filters specified by the user.
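
    The variable-screening step is a standard random forest importance ranking. A hedged sketch with scikit-learn; the column names echo the CBECS-style variables mentioned above, but the data are randomly generated:

      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(2)
      X = pd.DataFrame({
          "floor_area":   rng.uniform(1e3, 5e5, 300),
          "num_workers":  rng.integers(5, 2000, 300),
          "num_pcs":      rng.integers(5, 3000, 300),
          "climate_zone": rng.integers(1, 8, 300),
      })
      eui = 0.002 * X["num_workers"] + 0.001 * X["num_pcs"] + rng.normal(0, 1, 300)

      rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, eui)
      for name, imp in sorted(zip(X.columns, rf.feature_importances_),
                              key=lambda t: -t[1]):
          print(f"{name:12s} {imp:.3f}")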

  15. Building Change Detection from Bi-Temporal Dense-Matching Point Clouds and Aerial Images.

    PubMed

    Pang, Shiyan; Hu, Xiangyun; Cai, Zhongliang; Gong, Jinqi; Zhang, Mi

    2018-03-24

    In this work, a novel building change detection method from bi-temporal dense-matching point clouds and aerial images is proposed to address two major problems, namely, the robust acquisition of the changed objects above ground and the automatic classification of changed objects into buildings or non-buildings. For the acquisition of changed objects above ground, the change detection problem is converted into a binary classification, in which the changed area above ground is regarded as the foreground and the other area as the background. For the gridded points of each period, the graph cuts algorithm is adopted to classify the points into foreground and background, followed by the region-growing algorithm to form candidate changed building objects. A novel structural feature that was extracted from aerial images is constructed to classify the candidate changed building objects into buildings and non-buildings. The changed building objects are further classified as "newly built", "taller", "demolished", and "lower" by combining the classification and the digital surface models of two periods. Finally, three typical areas from a large dataset are used to validate the proposed method. Numerous experiments demonstrate the effectiveness of the proposed algorithm.

  16. Quantitative risk assessment of landslides triggered by earthquakes and rainfall based on direct costs of urban buildings

    NASA Astrophysics Data System (ADS)

    Vega, Johnny Alexander; Hidalgo, Cesar Augusto

    2016-11-01

    This paper outlines a framework for risk assessment of landslides triggered by earthquakes and rainfall for urban buildings in the city of Medellín, Colombia, applying a model that uses a geographic information system (GIS). We applied a computer model that includes topographic, geological, geotechnical and hydrological features of the study area to assess landslide hazards using Newmark's pseudo-static method, together with a probabilistic approach based on the first-order second-moment (FOSM) method. The physical vulnerability assessment of buildings was conducted using structural fragility indexes, as well as the definition of the damage level of buildings via decision trees and Medellín's cadastral inventory data. The probability of occurrence of a landslide was calculated assuming that an earthquake produces horizontal ground acceleration (Ah) and considering the uncertainty of the geotechnical parameters and the soil saturation conditions of the ground. The probability of occurrence was multiplied by the structural fragility index values and by the replacement value of structures. The model implemented aims to quantify the risk caused by this kind of disaster in an area of the city of Medellín based on different values of Ah, with an analysis of the damage costs to buildings under different scenarios and structural conditions. Currently, 62% of “Valle de Aburrá”, where the study area is located, is under very low landslide hazard conditions and 38% under low conditions. If all buildings in the study area fulfilled the requirements of the Colombian building code, the costs of a landslide would be reduced by 63% compared with the current condition. An earthquake with a return period of 475 years was used in this analysis, according to the 2002 seismic microzonation study.
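
    At its core, the direct-cost aggregation described above multiplies, per building, the landslide probability by the structural fragility index and the replacement value. A worked sketch with illustrative numbers:

      # Expected direct cost = sum over buildings of
      #   P(landslide) x fragility index x replacement value.
      buildings = [
          # (p_failure, fragility_index, replacement_value_usd)
          (0.012, 0.65, 180_000),
          (0.004, 0.35, 240_000),
          (0.020, 0.80,  95_000),
      ]

      expected_cost = sum(p * frag * value for p, frag, value in buildings)
      print(f"expected direct cost: USD {expected_cost:,.0f}")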

  17. Experimental study of geotextile as plinth beam in a pile group-supported modeled building frame

    NASA Astrophysics Data System (ADS)

    Ravi Kumar Reddy, C.; Gunneswara Rao, T. D.

    2017-12-01

    This paper presents the experimental results of static vertical load tests on a model building frame with geotextile as the plinth beam, supported by pile groups embedded in cohesionless soil (sand). The experimental results have been compared with those obtained from nonlinear FEA and the conventional method of analysis. The results revealed that, for the frame with geotextile as the plinth beam, the conventional method of analysis gives a shear force about 53% higher, a bending moment at the top of the column about 17% higher, and a bending moment at the base of the column about 50-98% higher than those given by the nonlinear FEA.

  18. Pretest predictions for the response of a 1:8-scale steel LWR containment building model to static overpressurization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clauss, D.B.

    The analyses used to predict the response of a 1:8-scale model of a steel LWR containment building to static overpressurization are described and results are presented. Finite strain, large displacement, and nonlinear material properties were accounted for using finite element methods. Three-dimensional models were needed to analyze the penetrations, which included operable equipment hatches, personnel lock representations, and a constrained pipe. It was concluded that the scale model would fail due to leakage caused by large deformations of the equipment hatch sleeves. 13 refs., 34 figs., 1 tab.

  19. Hierarchical analytical and simulation modelling of human-machine systems with interference

    NASA Astrophysics Data System (ADS)

    Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.

    2017-01-01

    The article considers the principles of building an analytical and simulation model of the human operator and of industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows simulating complex parallel distributed processes in human-machine systems. A structural, hierarchical approach is used as the building method for the mathematical model of the human operator. The upper level of the human-operator model is represented by a logical dynamic model of decision making based on E-networks. The lower level reflects the psychophysiological characteristics of the human operator.

  20. Dem Reconstruction Using Light Field and Bidirectional Reflectance Function from Multi-View High Resolution Spatial Images

    NASA Astrophysics Data System (ADS)

    de Vieilleville, F.; Ristorcelli, T.; Delvit, J.-M.

    2016-06-01

    This paper presents a method for dense DSM reconstruction from a high resolution, mono-sensor, passive spaceborne panchromatic image sequence. The interest of our approach is four-fold. Firstly, we extend the core of light field approaches using an explicit BRDF model from the image synthesis community which is more realistic than the Lambertian model. The chosen model is the Cook-Torrance BRDF, which enables us to model rough surfaces with specular effects using specific material parameters. Secondly, we extend light field approaches to non-pinhole sensors and non-rectilinear motion by using a proper geometric transformation on the image sequence. Thirdly, we produce a 3D cost volume embodying all the tested possible heights and filter it using simple methods such as Volume Cost Filtering or variational optimal methods. We have tested our method on a Pleiades image sequence at various locations with dense urban buildings and report encouraging results with respect to classic multi-label methods such as MIC-MAC, or more recent pipelines such as S2P. Last but not least, our method also produces maps of material parameters at the estimated points, allowing us to simplify building classification or road extraction.
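
    For reference, a sketch of the Cook-Torrance specular term in its common GGX/Schlick form (the paper's exact choices of distribution, Fresnel and geometry terms may differ):

      import numpy as np

      def cook_torrance(n, v, l, roughness=0.4, f0=0.04):
          """Specular reflectance for unit normal n, view v and light l vectors."""
          h = v + l
          h = h / np.linalg.norm(h)            # half vector
          nl = max(float(n @ l), 1e-6)
          nv = max(float(n @ v), 1e-6)
          nh = max(float(n @ h), 1e-6)
          vh = max(float(v @ h), 1e-6)
          a2 = roughness ** 4                  # alpha = roughness^2, squared
          d = a2 / (np.pi * (nh * nh * (a2 - 1.0) + 1.0) ** 2)  # GGX distribution
          f = f0 + (1.0 - f0) * (1.0 - vh) ** 5                 # Schlick Fresnel
          k = roughness ** 2 / 2.0
          g = (nl / (nl * (1 - k) + k)) * (nv / (nv * (1 - k) + k))  # geometry term
          return d * f * g / (4.0 * nl * nv)

      n = np.array([0.0, 0.0, 1.0])
      v = np.array([0.0, 0.5, 1.0]); v = v / np.linalg.norm(v)
      l = np.array([0.3, -0.2, 1.0]); l = l / np.linalg.norm(l)
      print(cook_torrance(n, v, l))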

  1. Automated Measurement and Verification and Innovative Occupancy Detection Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Phillip; Bruce, Nordman; Piette, Mary Ann

    In support of DOE’s sensors and controls research, the goal of this project is to move toward integrated building-to-grid systems by building on previous work to develop and demonstrate a set of load characterization measurement and evaluation tools that are envisioned to be part of a suite of applications for transactive efficient buildings, built upon data-driven load characterization and prediction models. This will include the ability to include occupancy data in the models, plus data collection and archival methods to include different types of occupancy data with existing networks and a taxonomy for naming these data within a Volttron agent platform.

  2. Creativity of Junior High School’s Students in Designing Earthquake Resistant Buildings

    NASA Astrophysics Data System (ADS)

    Fitriani, D. N.; Kaniawati, I.; Ramalis, T. R.

    2017-09-01

    This research was stimulated by the fact that the territory of Indonesia is largely prone to earthquakes, and by the issue that human resources and the disaster response planning process are still insufficiently competent and not optimal. In addition, the construction of houses and public facilities has not been in accordance with earthquake-resistant building standards. This study aims to develop students’ creativity through earthquake resistant building model projects. The research method used is a descriptive qualitative method. The sample is one 7th-grade class consisting of 32 students in a junior high school in Indonesia. Data were collected using observation sheets and student worksheets. Results showed that students’ creativity in designing earthquake resistant building models varies greatly and yields new solutions to solve problems.

  3. Ten questions concerning occupant behavior in buildings: The big picture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Yan, Da; D'Oca, Simona

    Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.

  4. Energy Modeling for the Artisan Food Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya

    2013-05-01

    The Artisan Food Center is a 6912 sq.ft food processing plant located in Dayton, Washington. PNNL was contacted by Strecker Engineering to assist with the building’s energy analysis as a part of the project’s U.S. Green Building Council’s Leadership in Energy and Environmental Design (LEED) submittal requirements. The project is aiming for LEED Silver certification, one of the prerequisites to which is a whole building energy model to demonstrate compliance with American Society of Heating Refrigeration and Air Conditioning Engineers (ASHRAE) 90.1 2007 Appendix G, Performance Rating Method. The building incorporates a number of energy efficiency measures as part of its design and the energy analysis aimed at providing Strecker Engineering with the know-how of developing an energy model for the project as well as an estimate of energy savings of the proposed design over the baseline design, which could be used to document points in the LEED documentation. This report documents the ASHRAE 90.1 2007 baseline model design, the proposed model design, the modeling assumptions and procedures as well as the energy savings results in order to inform the Strecker Engineering team on a possible whole building energy model.

  5. A Research Synthesis of the Evaluation Capacity Building Literature

    ERIC Educational Resources Information Center

    Labin, Susan N.; Duffy, Jennifer L.; Meyers, Duncan C.; Wandersman, Abraham; Lesesne, Catherine A.

    2012-01-01

    The continuously growing demand for program results has produced an increased need for evaluation capacity building (ECB). The "Integrative ECB Model" was developed to integrate concepts from existing ECB theory literature and to structure a synthesis of the empirical ECB literature. The study used a broad-based research synthesis method with…

  6. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1994-01-01

    The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.

  7. Estimating Building Age with 3d GIS

    NASA Astrophysics Data System (ADS)

    Biljecki, F.; Sindram, M.

    2017-10-01

    Building datasets (e.g. footprints in OpenStreetMap and 3D city models) are becoming increasingly available worldwide. However, the thematic (attribute) aspect is not always given attention, as many such datasets are lacking in completeness of attributes. A prominent attribute of buildings is the year of construction, which is useful for some applications, but its availability may be scarce. This paper explores the potential of estimating the year of construction (or age) of buildings from other attributes using random forest regression. The developed method has a two-fold benefit: enriching datasets and quality control (verification of existing attributes). Experiments are carried out on a semantically rich LOD1 dataset of Rotterdam in the Netherlands using 9 attributes. The results are mixed: the accuracy of the estimated building age depends on the information available to the regression model. In the best scenario we achieved predictions with an RMSE of 11 years, but in more realistic situations with limited knowledge about buildings the error is much larger (RMSE = 26 years). Hence the main conclusion of the paper is that inferring building age with 3D city models is possible to a certain extent, because it reveals the approximate period of construction, but precise estimation remains a difficult task.
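
    The regression set-up described above translates directly into a few lines of scikit-learn. A sketch with synthetic attributes (not the Rotterdam schema) that reports the same RMSE-in-years metric:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import mean_squared_error
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n = 500
      X = np.column_stack([
          rng.uniform(50, 400, n),   # footprint area (m^2)
          rng.uniform(3, 30, n),     # building height (m)
          rng.integers(1, 10, n),    # number of storeys
      ])
      year = 1900 + 3 * X[:, 2] + 0.1 * X[:, 0] + rng.normal(0, 15, n)

      X_tr, X_te, y_tr, y_te = train_test_split(X, year, random_state=0)
      rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      rmse = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
      print(f"RMSE: {rmse:.1f} years")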

  8. Semiautomated model building for RNA crystallography using a directed rotameric approach.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2010-05-04

    Structured RNA molecules play essential roles in a variety of cellular processes; however, crystallographic studies of such RNA molecules present a large number of challenges. One notable complication arises from the low resolutions typical of RNA crystallography, which results in electron density maps that are imprecise and difficult to interpret. This problem is exacerbated by the lack of computational tools for RNA modeling, as many of the techniques commonly used in protein crystallography have no equivalents for RNA structure. This leads to difficulty and errors in the model building process, particularly in modeling of the RNA backbone, which is highly error prone due to the large number of variable torsion angles per nucleotide. To address this, we have developed a method for accurately building the RNA backbone into maps of intermediate or low resolution. This method is semiautomated, as it requires a crystallographer to first locate phosphates and bases in the electron density map. After this initial trace of the molecule, however, an accurate backbone structure can be built without further user intervention. To accomplish this, backbone conformers are first predicted using RNA pseudotorsions and the base-phosphate perpendicular distance. Detailed backbone coordinates are then calculated to conform both to the predicted conformer and to the previously located phosphates and bases. This technique is shown to produce accurate backbone structure even when starting from imprecise phosphate and base coordinates. A program implementing this methodology is currently available, and a plugin for the Coot model building program is under development.

  9. Local-aggregate modeling for big data via distributed optimization: Applications to neuroimaging.

    PubMed

    Hu, Yue; Allen, Genevera I

    2015-12-01

    Technological advances have led to a proliferation of structured big data that have matrix-valued covariates. We are specifically motivated to build predictive models for multi-subject neuroimaging data based on each subject's brain imaging scans. This is an ultra-high-dimensional problem that consists of a matrix of covariates (brain locations by time points) for each subject; few methods currently exist to fit supervised models directly to this tensor data. We propose a novel modeling and algorithmic strategy to apply generalized linear models (GLMs) to this massive tensor data in which one set of variables is associated with locations. Our method begins by fitting GLMs to each location separately, and then builds an ensemble by blending information across locations through regularization with what we term an aggregating penalty. Our so-called Local-Aggregate Model can be fit in a completely distributed manner over the locations using an Alternating Direction Method of Multipliers (ADMM) strategy, and thus greatly reduces the computational burden. Furthermore, we propose to select the appropriate model through a novel sequence of faster algorithmic solutions that is similar to regularization paths. We demonstrate both the computational and predictive modeling advantages of our methods via simulations and an EEG classification problem.

  10. A Comparison Study for DNA Motif Modeling on Protein Binding Microarray.

    PubMed

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Wong, Hau-San

    2016-01-01

    Transcription factor binding sites (TFBSs) are relatively short (5-15 bp) and degenerate. Identifying them is a computationally challenging task. In particular, protein binding microarray (PBM) is a high-throughput platform that can measure the DNA binding preference of a protein in a comprehensive and unbiased manner; for instance, a typical PBM experiment can measure binding signal intensities of a protein to all possible DNA k-mers (k = 8∼10). Since proteins can often bind to DNA with different binding intensities, one of the major challenges is to build TFBS (also known as DNA motif) models which can fully capture the quantitative binding affinity data. To learn DNA motif models from the non-convex objective function landscape, several optimization methods are compared and applied to the PBM motif model building problem. In particular, representative methods from different optimization paradigms have been chosen for modeling performance comparison on hundreds of PBM datasets. The results suggest that the multimodal optimization methods are very effective for capturing the binding preference information from PBM data. In particular, we observe a general performance improvement if choosing di-nucleotide modeling over mono-nucleotide modeling. In addition, the models learned by the best-performing method are applied to two independent applications: PBM probe rotation testing and ChIP-Seq peak sequence prediction, demonstrating its biological applicability.
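
    The mono- versus di-nucleotide distinction above is just a change in how a k-mer is scored. A minimal sketch with tiny random weight matrices (illustrative placeholders, not weights learned from PBM data):

      import numpy as np

      base = {b: i for i, b in enumerate("ACGT")}
      kmer = "ACGTACGT"                                  # an 8-mer probe

      mono_w = np.random.default_rng(5).random((len(kmer), 4))      # pos x base
      di_w = np.random.default_rng(6).random((len(kmer) - 1, 16))   # pos x pair

      mono_score = sum(mono_w[i, base[c]] for i, c in enumerate(kmer))
      di_score = sum(di_w[i, 4 * base[kmer[i]] + base[kmer[i + 1]]]
                     for i in range(len(kmer) - 1))
      print(mono_score, di_score)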

  11. Ten questions concerning occupant behavior in buildings: The big picture

    DOE PAGES

    Hong, Tianzhen; Yan, Da; D'Oca, Simona; ...

    2016-12-27

    Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.

  12. Assessment of energy and economic performance of office building models: a case study

    NASA Astrophysics Data System (ADS)

    Song, X. Y.; Ye, C. T.; Li, H. S.; Wang, X. L.; Ma, W. B.

    2016-08-01

    Energy consumption of buildings accounts for more than 37.3% of total energy consumption, while the proportion of energy-saving buildings is just 5% in China. In this paper, in order to assess the energy-saving potential, an office building in Southern China was selected as a test case for characterizing energy consumption. The base building model was developed in the TRNSYS software and validated against data recorded during six days of field work in August-September 2013. A sensitivity analysis was conducted for the energy performance of building envelope retrofitting; five envelope parameters were analyzed for assessing the thermal responses. Results indicated that the key sensitivity factors were the heat-transfer coefficient of exterior walls (U-wall), the infiltration rate, and the shading coefficient (SC), whose summed sensitivity factor was about 89.32%. In addition, the results were evaluated in terms of energy and economic analysis. The sensitivity analysis was validated against important results of previous studies. Furthermore, the cost-effective method improved the efficiency of investment management in building energy.
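
    A one-at-a-time sensitivity index of the kind ranked above can be sketched in a few lines. The energy model below is a toy linear surrogate standing in for the TRNSYS simulations, and the 10% perturbation size is an assumption.

    ```python
    # One-at-a-time sensitivity index for envelope parameters (toy surrogate).
    def sensitivity(energy_model, base_params, name, delta=0.1):
        """Relative change in annual energy per relative change in one parameter."""
        e0 = energy_model(base_params)
        perturbed = dict(base_params)
        perturbed[name] *= (1.0 + delta)
        e1 = energy_model(perturbed)
        return ((e1 - e0) / e0) / delta

    # Toy linear surrogate in place of the building simulation:
    toy = lambda p: 50.0 * p["u_wall"] + 30.0 * p["infiltration"] + 20.0 * p["sc"]
    base = {"u_wall": 1.5, "infiltration": 0.5, "sc": 0.6}
    ranking = {k: sensitivity(toy, base, k) for k in base}
    print(sorted(ranking.items(), key=lambda kv: -abs(kv[1])))
    ```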

  13. Building Development Monitoring in Multitemporal Remotely Sensed Image Pairs with Stochastic Birth-Death Dynamics.

    PubMed

    Benedek, C; Descombes, X; Zerubia, J

    2012-01-01

    In this paper, we introduce a new probabilistic method which integrates building extraction with change detection in remotely sensed image pairs. A global optimization process attempts to find the optimal configuration of buildings, considering the observed data, prior knowledge, and interactions between the neighboring building parts. We present methodological contributions in three key issues: 1) We implement a novel object-change modeling approach based on Multitemporal Marked Point Processes, which simultaneously exploits low-level change information between the time layers and object-level building description to recognize and separate changed and unaltered buildings. 2) To answer the challenges of data heterogeneity in aerial and satellite image repositories, we construct a flexible hierarchical framework which can create various building appearance models from different elementary feature-based modules. 3) To simultaneously ensure the convergence, optimality, and computation complexity constraints raised by the increased data quantity, we adopt the quick Multiple Birth and Death optimization technique for change detection purposes, and propose a novel nonuniform stochastic object birth process which generates relevant objects with higher probability based on low-level image features.

  14. First Prismatic Building Model Reconstruction from Tomosar Point Clouds

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Shahzad, M.; Zhu, X.

    2016-06-01

    This paper demonstrates for the first time the potential of explicitly modelling individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings by generating a DSM and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007), and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height- and polygon-complexity-constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. The coarse outline of each roof segment is then reconstructed and later refined using a quadtree-based regularization plus a zig-zag line simplification scheme. Finally, a height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.
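
    The DSM segmentation stage of this pipeline can be sketched with standard scipy/skimage building blocks. In the sketch below, Gaussian smoothing stands in for BM3D denoising, and the merging criterion is simplified to mean-height similarity (ignoring the paper's polygon-complexity constraint and segment adjacency).

    ```python
    # Simplified DSM-to-roof-segments sketch (Gaussian smoothing in place of BM3D).
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import sobel
    from skimage.segmentation import watershed
    from skimage.feature import peak_local_max

    def segment_dsm(dsm, min_height=2.0):
        roofs = dsm > min_height                 # cut off ground terrain
        smooth = ndi.gaussian_filter(dsm, 1.5)   # stand-in for BM3D denoising
        gradient = sobel(smooth)                 # height-jump map
        peaks = peak_local_max(smooth, min_distance=5, labels=roofs.astype(int))
        markers = np.zeros_like(dsm, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        return watershed(gradient, markers, mask=roofs)  # oversegmentation

    def merge_similar(labels, dsm, tol=0.5):
        """Greedy merge of segments whose mean heights differ by < tol."""
        ids = [l for l in np.unique(labels) if l > 0]
        means = {l: dsm[labels == l].mean() for l in ids}
        for a in ids:
            for b in ids:
                if a < b and a in means and b in means \
                        and abs(means[a] - means[b]) < tol:
                    labels[labels == b] = a      # simplistic: ignores adjacency
                    means[a] = dsm[labels == a].mean()
                    del means[b]
        return labels
    ```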

  15. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.

  16. True Concurrent Thermal Engineering Integrating CAD Model Building with Finite Element and Finite Difference Methods

    NASA Technical Reports Server (NTRS)

    Panczak, Tim; Ring, Steve; Welch, Mark

    1999-01-01

    Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from, and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, while fully integrated into a neutral, low-cost CAD system and utilizing both FEM and FD methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface-based radiation and FD-based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.

  17. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

    During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices: what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so that "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  18. Moving target detection method based on improved Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Ma, J. Y.; Jie, F. R.; Hu, Y. J.

    2017-07-01

    A Gaussian Mixture Model is often employed to build the background model in background-difference methods for moving target detection. This paper puts forward an adaptive moving target detection algorithm based on an improved Gaussian Mixture Model. According to the gray-level convergence of each pixel, the algorithm adaptively chooses the number of Gaussian distributions used to learn and update the background model. A morphological reconstruction method is adopted to eliminate shadows. Experiments proved that the proposed method not only has good robustness and detection performance, but also good adaptability; even in special cases where the grayscale changes greatly, the proposed method delivers outstanding performance.
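
    A hedged sketch of the adaptive-GMM idea using OpenCV, whose MOG2 subtractor adapts the number of Gaussian components per pixel; morphological opening stands in here for the paper's morphological reconstruction step, and the input video name is hypothetical.

    ```python
    # Adaptive-GMM background subtraction with simple shadow/noise cleanup.
    import cv2

    cap = cv2.VideoCapture("traffic.avi")        # hypothetical input video
    mog2 = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=16,
                                              detectShadows=True)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = mog2.apply(frame)                 # 255 = foreground, 127 = shadow
        mask[mask == 127] = 0                    # drop detected shadows
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # clean noise
        cv2.imshow("moving targets", mask)
        if cv2.waitKey(30) & 0xFF == 27:         # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()
    ```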

  19. Simulation-based coefficients for adjusting climate impact on energy consumption of commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Makhmalbaf, Atefe; Srivastava, Viraj

    This paper presents a new technique for, and the results of, normalizing building energy consumption to enable a fair comparison among various types of buildings located near different weather stations across the U.S. The method was developed for the U.S. Building Energy Asset Score, a whole-building energy efficiency rating system focusing on building envelope, mechanical systems, and lighting systems. The Asset Score is calculated based on simulated energy use under standard operating conditions. Existing weather normalization methods, such as those based on heating and cooling degree days, are not robust enough to adjust for all climatic factors, such as humidity and solar radiation. In this work, over 1000 sets of climate coefficients were developed to separately adjust building heating, cooling, and fan energy use at each weather station in the United States. This paper also presents a robust, standardized weather station mapping based on climate similarity rather than choosing the closest weather station. The proposed simulation-based climate adjustment was validated through testing on several hundred thousand modeled buildings. Results indicated the developed climate coefficients can isolate and adjust for the impacts of local climate for asset rating.

  20. Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.

    PubMed

    DiMaio, Frank

    2017-01-01

    Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.

  1. Polynomial Chaos decomposition applied to stochastic dosimetry: study of the influence of the magnetic field orientation on the pregnant woman exposure at 50 Hz.

    PubMed

    Liorni, I; Parazzini, M; Fiocchi, S; Guadagnin, V; Ravazzani, P

    2014-01-01

    Polynomial Chaos (PC) is a decomposition method used to build a meta-model which approximates the unknown response of a model. In this paper the PC method is applied to stochastic dosimetry to assess the variability of human exposure due to changes in the orientation of the B-field vector with respect to the human body. In detail, the analysis of pregnant woman exposure at 7 months of gestational age is carried out to build a statistical meta-model of the induced electric field for each fetal tissue and in the fetal whole body, by means of the PC expansion as a function of the B-field orientation, considering uniform exposure at 50 Hz.
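
    A minimal sketch of a regression-built PC surrogate for one uniformly distributed orientation angle follows; the dosimetric response function is an illustrative placeholder, and Legendre polynomials are used as the orthogonal basis appropriate for uniform inputs.

    ```python
    # Polynomial Chaos surrogate by least-squares regression (toy response).
    import numpy as np
    from numpy.polynomial import legendre

    def exposure_model(theta):
        """Placeholder for the induced-field dosimetry simulation."""
        return 1.0 + 0.3 * np.cos(theta) + 0.1 * np.sin(2 * theta)

    # Uniform input on [0, 2*pi) mapped to xi in [-1, 1]; Legendre polynomials
    # are the orthogonal (Wiener-Askey) basis for uniform random inputs.
    theta = np.random.default_rng(1).uniform(0, 2 * np.pi, 200)
    xi = theta / np.pi - 1.0
    y = exposure_model(theta)

    coeffs = legendre.legfit(xi, y, deg=8)       # regression-based PC coefficients
    surrogate = lambda t: legendre.legval(t / np.pi - 1.0, coeffs)

    # The meta-model now gives cheap statistics over the input distribution:
    samples = surrogate(np.random.default_rng(2).uniform(0, 2 * np.pi, 10**5))
    print("mean", samples.mean(), "std", samples.std())
    ```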

  2. Virtual reality technique to assist measurement of degree of shaking of two minarets of an ancient building

    NASA Astrophysics Data System (ADS)

    Homainejad, Amir S.; Satari, Mehran

    2000-05-01

    Virtual reality (VR) brings users toward reality by computer, and a virtual environment (VE) is a simulated world that takes users to any point and direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique able to collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and data collection methods vary based on the method of image acquisition. Indeed, VR includes real-time graphics, 3D models, and displays, and it has applications in the entertainment industry, flight simulators, and industrial design.

  3. Multimodal inspection in power engineering and building industries: new challenges and solutions

    NASA Astrophysics Data System (ADS)

    Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof

    2013-09-01

    Recently, the demand for and number of applications of full-field optical measurement methods based on noncoherent light sources have increased significantly. They include traditional image processing, thermovision, digital image correlation (DIC) and structured light methods. However, there are still numerous challenges connected with implementing these methods for in-situ, long-term monitoring in industrial, civil engineering and cultural heritage applications, for multimodal measurements of a variety of object features, or simply with adapting instruments to work in hard environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system combined with an infrared camera system is applied in several interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of different building steel struts at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.

  4. Sequence2Vec: a novel embedding approach for modeling transcription factor binding affinity landscape.

    PubMed

    Dai, Hanjun; Umarov, Ramzan; Kuwahara, Hiroyuki; Li, Yu; Song, Le; Gao, Xin

    2017-11-15

    An accurate characterization of transcription factor (TF)-DNA affinity landscape is crucial to a quantitative understanding of the molecular mechanisms underpinning endogenous gene regulation. While recent advances in biotechnology have brought the opportunity for building binding affinity prediction methods, the accurate characterization of TF-DNA binding affinity landscape still remains a challenging problem. Here we propose a novel sequence embedding approach for modeling the transcription factor binding affinity landscape. Our method represents DNA binding sequences as a hidden Markov model which captures both position specific information and long-range dependency in the sequence. A cornerstone of our method is a novel message passing-like embedding algorithm, called Sequence2Vec, which maps these hidden Markov models into a common nonlinear feature space and uses these embedded features to build a predictive model. Our method is a novel combination of the strength of probabilistic graphical models, feature space embedding and deep learning. We conducted comprehensive experiments on over 90 large-scale TF-DNA datasets which were measured by different high-throughput experimental technologies. Sequence2Vec outperforms alternative machine learning methods as well as the state-of-the-art binding affinity prediction methods. Our program is freely available at https://github.com/ramzan1990/sequence2vec. xin.gao@kaust.edu.sa or lsong@cc.gatech.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  5. Federating Cyber and Physical Models for Event-Driven Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth

    The purpose of this paper is to describe a novel method to improve the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, which is defined as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the building of interfaces to link all of the domain models.

  6. Model-based optimal design of active cool thermal energy storage for maximal life-cycle cost saving from demand management in commercial buildings

    DOE PAGES

    Cui, Borui; Gao, Dian-ce; Xiao, Fu; ...

    2016-12-23

    This article provides a method for the comprehensive evaluation of the cost-saving potential of active cool thermal energy storage (CTES) integrated with an HVAC system for demand management in non-residential buildings. The active storage is beneficial in shifting peak demand for peak load management (PLM) as well as providing a longer duration and larger capacity of demand response (DR). In this research, a model-based optimal design method using a genetic algorithm is developed to optimize the capacity of active CTES, aiming to maximize the life-cycle cost saving concerning the capital cost associated with storage capacity as well as incentives from both fast DR and PLM. In the method, the active CTES operates under a fast DR control strategy during DR events, while under the storage-priority operation mode to shift peak demand during normal days. The optimal storage capacities, maximum annual net cost saving, and corresponding power reduction set-points during DR events are obtained using the proposed optimal design method. Lastly, this research provides guidance for the comprehensive evaluation of the cost-saving potential of CTES integrated with HVAC systems for building demand management, including both fast DR and PLM.
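
    The genetic-algorithm sizing loop can be sketched as below, assuming a single scalar design variable (storage capacity) and a toy net-saving objective; the paper's objective couples capital cost with DR and PLM incentives in far more detail.

    ```python
    # Toy GA loop for sizing storage capacity against a net life-cycle saving.
    import numpy as np

    rng = np.random.default_rng(0)

    def net_saving(capacity_kwh):
        """Toy objective: incentives grow with capacity but saturate;
        capital cost grows linearly."""
        incentives = 120.0 * np.log1p(capacity_kwh)
        capital = 0.9 * capacity_kwh
        return incentives - capital

    pop = rng.uniform(10, 1000, size=40)               # initial capacities
    for gen in range(100):
        fitness = net_saving(pop)
        parents = pop[np.argsort(fitness)[-20:]]       # truncation selection
        children = rng.choice(parents, 40) + rng.normal(0, 20, 40)  # mutation
        pop = np.clip(children, 10, 1000)
    print("best capacity (kWh):", pop[np.argmax(net_saving(pop))])
    ```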

  7. Model-based optimal design of active cool thermal energy storage for maximal life-cycle cost saving from demand management in commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Borui; Gao, Dian-ce; Xiao, Fu

    This article provides a method for the comprehensive evaluation of the cost-saving potential of active cool thermal energy storage (CTES) integrated with an HVAC system for demand management in non-residential buildings. The active storage is beneficial in shifting peak demand for peak load management (PLM) as well as providing a longer duration and larger capacity of demand response (DR). In this research, a model-based optimal design method using a genetic algorithm is developed to optimize the capacity of active CTES, aiming to maximize the life-cycle cost saving concerning the capital cost associated with storage capacity as well as incentives from both fast DR and PLM. In the method, the active CTES operates under a fast DR control strategy during DR events, while under the storage-priority operation mode to shift peak demand during normal days. The optimal storage capacities, maximum annual net cost saving, and corresponding power reduction set-points during DR events are obtained using the proposed optimal design method. Lastly, this research provides guidance for the comprehensive evaluation of the cost-saving potential of CTES integrated with HVAC systems for building demand management, including both fast DR and PLM.

  8. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    NASA Astrophysics Data System (ADS)

    Cheng, Tian

    Venetian blinds are popularly used in buildings to control the amount of incoming daylight, improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems can result in significant energy savings in both lighting and cooling. However, no convenient computer tool currently allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for this purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the first two tools give unacceptable accuracy due to the unrealistic assumptions they adopt, while the last may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient, and is particularly unsuitable for the optimal design of a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulation and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to reasonably simulate the daylighting behavior of venetian blinds. Indoor illuminance at any reference point can be computed directly and efficiently. The models have been validated with both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and that their accuracy is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus. Two new methods are developed for the thermal simulation of buildings. A fast Fourier transform (FFT) method is presented to avoid the root-searching process in the inverse Laplace transform of multilayered walls. Generalized explicit FFT formulae for calculating the discrete Fourier transform (DFT) are developed for the first time; they can largely facilitate the implementation of FFT. The new method also provides a basis for generating the symbolic response factors. Validation simulations show that it can generate response factors as accurate as the analytical solutions. The second method is for direct estimation of annual or seasonal cooling loads without the need for tedious hourly energy simulations. It is validated against hourly simulation results with DOE2. A symbolic long-term cooling load can then be created by combining the two methods with thermal network analysis. The symbolic long-term cooling load can keep the design parameters of interest as symbols, which is particularly useful for optimal design and sensitivity analysis. The methodology is applied to an office building in Hong Kong for the optimal design of the building envelope. Design variables such as window-to-wall ratio, building orientation, and glazing optical and thermal properties are included in the study. Results show that the selected design values can significantly impact the energy performance of windows, and that the optimal design of side-lit buildings can greatly enhance energy savings.
The application example also demonstrates that the developed methodology significantly facilitates the optimal building design and sensitivity analysis, and leads to high computational efficiency.
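
    As a hedged illustration of the frequency-domain route to response factors, the sketch below samples a wall transfer function at the DFT harmonics and recovers periodic time-domain factors with an inverse FFT, with no root searching; the first-order transfer function is a stand-in for a real multilayered wall, not the thesis's formulation.

    ```python
    # Frequency-sampled transfer function -> periodic response factors via IFFT.
    import numpy as np

    N, dt = 24, 3600.0                     # hourly samples over a one-day period
    tau = 8 * 3600.0                       # wall time constant (illustrative)
    U = 1.2                                # steady-state transmittance, W/(m2*K)

    # Sample the transfer function H(j*omega_n) at the DFT harmonics.
    n = np.arange(N)
    omega = 2 * np.pi * np.where(n <= N // 2, n, n - N) / (N * dt)
    H = U / (1.0 + 1j * omega * tau)       # first-order wall model (stand-in)

    # Inverse DFT gives the periodic response factors (heat-flux response to a
    # unit temperature pulse train); a real wall yields (near-)real factors.
    response_factors = np.fft.ifft(H).real
    print(response_factors[:6])
    ```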

  9. Applied Distributed Model Predictive Control for Energy Efficient Buildings and Ramp Metering

    NASA Astrophysics Data System (ADS)

    Koehler, Sarah Muraoka

    Industrial large-scale control problems present an interesting algorithmic design challenge. A number of controllers must cooperate in real-time on a network of embedded hardware with limited computing power in order to maximize system efficiency while respecting constraints and despite communication delays. Model predictive control (MPC) can automatically synthesize a centralized controller which optimizes an objective function subject to a system model, constraints, and predictions of disturbance. Unfortunately, the computations required by model predictive controllers for large-scale systems often limit its industrial implementation only to medium-scale slow processes. Distributed model predictive control (DMPC) enters the picture as a way to decentralize a large-scale model predictive control problem. The main idea of DMPC is to split the computations required by the MPC problem amongst distributed processors that can compute in parallel and communicate iteratively to find a solution. Some popularly proposed solutions are distributed optimization algorithms such as dual decomposition and the alternating direction method of multipliers (ADMM). However, these algorithms ignore two practical challenges: substantial communication delays present in control systems and also problem non-convexity. This thesis presents two novel and practically effective DMPC algorithms. The first DMPC algorithm is based on a primal-dual active-set method which achieves fast convergence, making it suitable for large-scale control applications which have a large communication delay across its communication network. In particular, this algorithm is suited for MPC problems with a quadratic cost, linear dynamics, forecasted demand, and box constraints. We measure the performance of this algorithm and show that it significantly outperforms both dual decomposition and ADMM in the presence of communication delay. The second DMPC algorithm is based on an inexact interior point method which is suited for nonlinear optimization problems. The parallel computation of the algorithm exploits iterative linear algebra methods for the main linear algebra computations in the algorithm. We show that the splitting of the algorithm is flexible and can thus be applied to various distributed platform configurations. The two proposed algorithms are applied to two main energy and transportation control problems. The first application is energy efficient building control. Buildings represent 40% of energy consumption in the United States. Thus, it is significant to improve the energy efficiency of buildings. The goal is to minimize energy consumption subject to the physics of the building (e.g. heat transfer laws), the constraints of the actuators as well as the desired operating constraints (thermal comfort of the occupants), and heat load on the system. In this thesis, we describe the control systems of forced air building systems in practice. We discuss the "Trim and Respond" algorithm which is a distributed control algorithm that is used in practice, and show that it performs similarly to a one-step explicit DMPC algorithm. Then, we apply the novel distributed primal-dual active-set method and provide extensive numerical results for the building MPC problem. The second main application is the control of ramp metering signals to optimize traffic flow through a freeway system. This application is particularly important since urban congestion has more than doubled in the past few decades. 
The ramp metering problem is to maximize freeway throughput subject to freeway dynamics (derived from mass conservation), actuation constraints, freeway capacity constraints, and predicted traffic demand. In this thesis, we develop a hybrid model predictive controller for ramp metering that is guaranteed to be persistently feasible and stable. This contrasts with previous work on MPC for ramp metering, where such guarantees are absent. We apply a smoothing method to the hybrid model predictive controller and apply the inexact interior point method to this nonlinear, non-convex ramp metering problem.
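
    The flavor of dual decomposition used in DMPC can be shown on a toy shared-resource problem: agents minimize local quadratic costs, coupled only by a total-capacity constraint, and a price (dual variable) coordinates them. This is a simplified sketch, not the thesis's primal-dual active-set or inexact interior point algorithm.

    ```python
    # Dual decomposition on a toy coupled allocation problem.
    import numpy as np

    targets = np.array([4.0, 6.0, 3.0, 7.0])   # each agent's preferred usage
    budget = 15.0                               # shared capacity: sum(u) <= budget
    lam, step = 0.0, 0.2                        # dual price and ascent step

    for _ in range(200):
        # Local subproblems solved in parallel: min (u - target)^2 + lam*u, u >= 0
        u = np.maximum(targets - lam / 2.0, 0.0)
        lam = max(0.0, lam + step * (u.sum() - budget))  # price update (dual ascent)

    print("allocations:", u, "price:", lam, "total:", u.sum())
    ```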

  10. Building Interoperable FHIR-Based Vocabulary Mapping Services: A Case Study of OHDSI Vocabularies and Mappings.

    PubMed

    Jiang, Guoqian; Kiefer, Richard; Prud'hommeaux, Eric; Solbrig, Harold R

    2017-01-01

    The OHDSI Common Data Model (CDM) is a deep information model, in which the vocabulary component plays a critical role in enabling consistent coding and querying of clinical data. The objective of this study is to create methods and tools to expose the OHDSI vocabularies and mappings as vocabulary mapping services using two HL7 FHIR core terminology resources, ConceptMap and ValueSet. We discuss the benefits and challenges in building the FHIR-based terminology services.

  11. Construction of high-rise building with underground parking in Moscow

    NASA Astrophysics Data System (ADS)

    Ilyichev, Vyacheslav; Nikiforova, Nadezhda; Konnov, Artem

    2018-03-01

    The paper presents results of scientific support to the construction of a unique residential building 108 m high, with a one-storey underground part under the high-rise section and a 3-storey underground parking connected by an underground passage. On-site soils included anthropogenic soil, soft to stiff clayey soils, and saturated sands of varied grain coarseness. A design of the retaining structure and support system for the high-rise part excavation was developed, which called for the installation of steel pipes and struts. Construction of the adjacent 3-storey underground parking by the "Moscow method" is described in the paper. This method involves the implementation of a retaining wall consisting of prefabricated panels, truss structures (used as struts), and reinforced concrete slabs. The design and construction technology is also provided for foundations consisting of bored piles 800 mm in diameter, joined by a slab, with bases widened to a diameter of 1500 mm. Results of static and dynamic load testing (ELDY method) are considered. Geotechnical monitoring data on the settlement of adjacent buildings and utility systems caused by the construction of the presented high-rise building were compared to numerical modelling results and to predicted and permissible values.

  12. Using Evaluation To Build Organizational Performance and Learning Capability: A Strategy and a Method.

    ERIC Educational Resources Information Center

    Brinkerhoff, Robert O.; Dressler, Dennis

    2002-01-01

    Discusses the causes of variability of training impact and problems with previous models for evaluation of training. Presents the Success Case Evaluation approach as a way to measure the impact of training and build learning capability to increase the business value of training by focusing on a small number of trainees. (Author/LRW)

  13. Business Models for Training and Performance Improvement Departments

    ERIC Educational Resources Information Center

    Carliner, Saul

    2004-01-01

    Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer… value." Business models affect the types of projects, services offered, skills required, business processes, and type of…

  14. Building Simple Hidden Markov Models. Classroom Notes

    ERIC Educational Resources Information Center

    Ching, Wai-Ki; Ng, Michael K.

    2004-01-01

    Hidden Markov models (HMMs) are widely used in bioinformatics, speech recognition and many other areas. This note presents HMMs via the framework of classical Markov chain models. A simple example is given to illustrate the model. An estimation method for the transition probabilities of the hidden states is also discussed.
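
    In the spirit of the classroom note, the maximum-likelihood estimate of a transition matrix from an observed state sequence is just row-normalized counts; the weather sequence below is illustrative.

    ```python
    # Counting estimator for Markov-chain transition probabilities.
    import numpy as np

    states = ["Sunny", "Rainy"]
    seq = ["Sunny", "Sunny", "Rainy", "Rainy", "Sunny", "Rainy", "Sunny", "Sunny"]

    counts = np.zeros((2, 2))
    for a, b in zip(seq, seq[1:]):
        counts[states.index(a), states.index(b)] += 1

    # Row-normalize counts to get maximum-likelihood transition probabilities.
    P = counts / counts.sum(axis=1, keepdims=True)
    print(P)   # P[i, j] ~ probability of moving from states[i] to states[j]
    ```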

  15. Novel Methods to Explore Building Energy Sensitivity to Climate and Heat Waves Using PNNL's BEND Model

    NASA Astrophysics Data System (ADS)

    Burleyson, C. D.; Voisin, N.; Taylor, T.; Xie, Y.; Kraucunas, I.

    2017-12-01

    The DOE's Pacific Northwest National Laboratory (PNNL) has been developing the Building ENergy Demand (BEND) model to simulate energy usage in residential and commercial buildings responding to changes in weather, climate, population, and building technologies. At its core, BEND is a mechanism to aggregate EnergyPlus simulations of a large number of individual buildings with a diversity of characteristics over large spatial scales. We have completed a series of experiments to explore methods to calibrate the BEND model, measure its ability to capture interannual variability in energy demand due to weather using simulations of two distinct weather years, and understand the sensitivity to the number and location of weather stations used to force the model. The use of weather from "representative cities" reduces computational costs, but often fails to capture spatial heterogeneity that may be important for simulations aimed at understanding how building stocks respond to a changing climate (Fig. 1). We quantify the potential reduction in temperature and load biases from using an increasing number of weather stations across the western U.S., ranging from 8 to roughly 150. Using 8 stations results in an average absolute summertime temperature bias of 4.0°C. The mean absolute bias drops to 1.5°C using all available stations. Temperature biases of this magnitude translate to absolute summertime mean simulated load biases as high as 13.8%. Additionally, using only 8 representative weather stations can lead to a 20-40% bias of peak building loads under heat wave or cold snap conditions, a significant error for capacity expansion planners who may rely on these types of simulations. This analysis suggests that using 4 stations per climate zone may be sufficient for most purposes. Our novel approach, which requires no new EnergyPlus simulations, could be useful to other researchers designing or calibrating aggregate building model simulations - particularly those looking at the impact of future climate scenarios. Fig. 1. An example of temperature bias that results from using 8 representative weather stations: (a) surface temperature from NLDAS on 5-July 2008 at 2000 UTC; (b) temperature from 8 representative stations at the same time mapped to all counties within a given IECC climate zone; (c) the difference between (a) and (b).

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Sun, Yannan; Carroll, Thomas E.

    We propose a coordination algorithm for cooperative power allocation among a collection of commercial buildings within a campus. We introduce thermal and power models of a typical commercial building heating, ventilation, and air conditioning (HVAC) system, and utilize model predictive control to characterize its power flexibility. The power allocation problem is formulated as a cooperative game using the Nash Bargaining Solution (NBS) concept, in which buildings collectively maximize the product of their utilities subject to their local flexibility constraints and a total power limit set by the campus coordinator. To solve the optimal allocation problem, a distributed protocol is designed using dual decomposition of the Nash bargaining problem. Numerical simulations are performed to demonstrate the efficacy of our proposed allocation method.
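
    Maximizing the product of utilities is equivalent to maximizing the sum of log-utilities, which dual decomposition handles naturally. The sketch below assumes logarithmic utilities of allocated power with per-building flexibility bounds; the paper's MPC-based flexibility models are much richer.

    ```python
    # Nash-bargaining-style allocation via dual decomposition (toy bounds/limit).
    import numpy as np

    lo = np.array([20.0, 30.0, 25.0])    # building flexibility lower bounds (kW)
    hi = np.array([80.0, 90.0, 70.0])    # upper bounds (kW)
    P_total = 150.0                      # campus power limit
    lam, step = 0.01, 1e-4               # coordinator price and ascent step

    for _ in range(500):
        # Each building maximizes log(p) - lam*p locally: p* = 1/lam, clipped.
        p = np.clip(1.0 / lam, lo, hi)
        lam = max(1e-6, lam + step * (p.sum() - P_total))  # price update

    print("allocations:", p, "sum:", p.sum())
    ```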

  17. Thermal environment analysis and energy conservation research of rural residence in cold regions of China based on BIM platform

    NASA Astrophysics Data System (ADS)

    Dong, J. Y.; Cheng, W.; Ma, C. P.; Xin, L. S.; Tan, Y. T.

    2017-06-01

    In order to study rural residential energy consumption in cold regions of China, an architectural prototype was modeled on a BIM platform according to the factors affecting the rural residential thermal environment; the virtual model containing building information was imported into energy analysis tools and the appropriate building orientation was chosen. By analyzing the energy consumption of residential buildings with different enclosure structure forms, we designed the optimal energy-saving residence form. This method has a certain application value for researching energy consumption and energy-saving design for rural residences in cold regions of China.

  18. City model enrichment

    NASA Astrophysics Data System (ADS)

    Smart, Philip D.; Quinn, Jonathan A.; Jones, Christopher B.

    The combination of mobile communication technology with location and orientation aware digital cameras has introduced increasing interest in the exploitation of 3D city models for applications such as augmented reality and automated image captioning. The effectiveness of such applications is, at present, severely limited by the often poor quality of semantic annotation of the 3D models. In this paper, we show how freely available sources of georeferenced Web 2.0 information can be used for automated enrichment of 3D city models. Point referenced names of prominent buildings and landmarks mined from Wikipedia articles and from the OpenStreetMaps digital map and Geonames gazetteer have been matched to the 2D ground plan geometry of a 3D city model. In order to address the ambiguities that arise in the associations between these sources and the city model, we present procedures to merge potentially related buildings and implement fuzzy matching between reference points and building polygons. An experimental evaluation demonstrates the effectiveness of the presented methods.
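
    The point-to-footprint matching step can be sketched with shapely, assuming a simple distance tolerance in place of the paper's full fuzzy-matching procedure; coordinates, names, and the 25 m threshold are illustrative.

    ```python
    # Matching mined point-referenced names to building footprints.
    from shapely.geometry import Point, Polygon

    buildings = {
        "cardiff_castle": Polygon([(0, 0), (0, 60), (80, 60), (80, 0)]),
        "city_hall": Polygon([(200, 0), (200, 40), (260, 40), (260, 0)]),
    }
    mined_names = {"Cardiff Castle": Point(35, 30), "City Hall": Point(210, 55)}

    for name, pt in mined_names.items():
        # Score each footprint: 0 if the point lies inside, else the distance.
        scored = sorted(buildings.items(), key=lambda kv: kv[1].distance(pt))
        best, footprint = scored[0]
        if footprint.distance(pt) <= 25.0:     # tolerance for georeferencing noise
            print(f"{name} -> {best} (distance {footprint.distance(pt):.1f} m)")
        else:
            print(f"{name} -> unmatched")
    ```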

  19. Estimated damage from the Cascadia Subduction Zone tsunami: A model comparison using fragility curves

    NASA Astrophysics Data System (ADS)

    Wiebe, D. M.; Cox, D. T.; Chen, Y.; Weber, B. A.; Chen, Y.

    2012-12-01

    Building damage from a hypothetical Cascadia Subduction Zone tsunami was estimated using two methods applied at the community scale. The first method applies proposed guidelines for a new ASCE 7 standard to calculate the flow depth, flow velocity, and momentum flux from a known runup limit and an estimate of the total tsunami energy at the shoreline. This procedure is based on a potential energy budget, uses the energy grade line, and accounts for frictional losses. The second method utilized numerical model results from previous studies to determine maximum flow depth, velocity, and momentum flux throughout the inundation zone. The towns of Seaside and Cannon Beach, Oregon, were selected for analysis due to the availability of existing data from previously published works. Fragility curves, based on the hydrodynamic features of the tsunami flow (inundation depth, flow velocity, and momentum flux) and proposed design standards from ASCE 7, were used to estimate the probability of damage to structures located within the inundation zone. The analysis proceeded at the parcel level, using tax-lot data to identify construction type (wood, steel, and reinforced concrete) and age, which were used as performance measures when applying the fragility curves and design standards. The overall probability of damage to civil buildings was integrated for comparison between the two methods, and also analyzed spatially for damage patterns, which could be controlled by local bathymetric features. The two methods were compared to assess the sensitivity of the results to the uncertainty in the input hydrodynamic conditions and fragility curves, and the potential advantages of each method were discussed. On-going work includes coupling the results of building damage and vulnerability to an economic input-output model. This model assesses trade between business sectors located inside and outside the inundation zone, and is used to measure the impact on the regional economy. Results highlight business sectors and infrastructure critical to the economic recovery effort, which could be retrofitted or relocated to survive the event. The results of this study improve community understanding of the tsunami hazard for civil buildings.
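
    Fragility-curve damage estimation at the parcel level reduces to evaluating a lognormal CDF at the local flow intensity. The sketch below uses flow depth with placeholder median/dispersion parameters per construction type, not the study's calibrated curves.

    ```python
    # Lognormal fragility curve applied per parcel (illustrative parameters).
    from math import log, sqrt, erf

    def damage_probability(depth_m, median_m, beta):
        """Lognormal fragility: P(damage | depth) = Phi(ln(d / median) / beta)."""
        z = log(depth_m / median_m) / beta
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    fragility = {"wood": (2.0, 0.5), "steel": (4.0, 0.5), "rc": (6.5, 0.6)}

    # Example parcels: (construction type, modeled flow depth in meters).
    parcels = [("wood", 3.1), ("steel", 3.1), ("rc", 3.1)]
    for ctype, depth in parcels:
        med, beta = fragility[ctype]
        print(ctype, round(damage_probability(depth, med, beta), 3))
    ```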

  20. CCBuilder: an interactive web-based tool for building, designing and assessing coiled-coil protein assemblies.

    PubMed

    Wood, Christopher W; Bruning, Marc; Ibarra, Amaurys Á; Bartlett, Gail J; Thomson, Andrew R; Sessions, Richard B; Brady, R Leo; Woolfson, Derek N

    2014-11-01

    The ability to accurately model protein structures at the atomistic level underpins efforts to understand protein folding, to engineer natural proteins predictably and to design proteins de novo. Homology-based methods are well established and produce impressive results. However, these are limited to structures presented by and resolved for natural proteins. Addressing this problem more widely and deriving truly ab initio models requires mathematical descriptions for protein folds; the means to decorate these with natural, engineered or de novo sequences; and methods to score the resulting models. We present CCBuilder, a web-based application that tackles the problem for a defined but large class of protein structure, the α-helical coiled coils. CCBuilder generates coiled-coil backbones, builds side chains onto these frameworks and provides a range of metrics to measure the quality of the models. Its straightforward graphical user interface provides broad functionality that allows users to build and assess models, in which helix geometry, coiled-coil architecture and topology and protein sequence can be varied rapidly. We demonstrate the utility of CCBuilder by assembling models for 653 coiled-coil structures from the PDB, which cover >96% of the known coiled-coil types, and by generating models for rarer and de novo coiled-coil structures. CCBuilder is freely available, without registration, at http://coiledcoils.chm.bris.ac.uk/app/cc_builder/. © The Author 2014. Published by Oxford University Press.

  1. Model of slums rejuvenation in Telaga Tujuh village: the case of Langsa city, Aceh, Indonesia

    NASA Astrophysics Data System (ADS)

    Irwansyah, Mirza; Caisarina, Irin; Solehati, Dini

    2018-05-01

    Telaga Tujuh village is the only inhabited island among the islands of Langsa City, Aceh. Most of the houses are on stilts with very limited infrastructure, such as a lack of road facilities, local drainage, drinking water, wastewater management, and garbage disposal. In determining the model for arranging the slum settlements of Telaga Tujuh village, it is necessary to know the characteristics of the slums themselves and the causes of slum settlement. The aim of this study is to determine the model of slum settlement arrangement that is suitable to be applied at the location. The method used is qualitative, with a sampling technique and qualitative analysis. Primary data were obtained through observation, questionnaires, and interviews; secondary data were obtained from agencies related to slum settlement arrangement. The characteristic analysis found that all 365 residential buildings (100%) are irregular, lack a safe drinking-water supply, and lack wastewater management. The analysis shows that the appropriate model to be applied to Telaga Tujuh village is the rejuvenation model with a land consolidation system, through re-arrangement of the land into two parts: 60% for existing residential development and 40% for commercial development.

  2. Estimating rooftop solar technical potential across the US using a combination of GIS-based methods, lidar data, and statistical modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gagnon, Pieter; Margolis, Robert; Melius, Jennifer

    We provide a detailed estimate of the technical potential of rooftop solar photovoltaic (PV) electricity generation throughout the contiguous United States. This national estimate is based on an analysis of select US cities that combines light detection and ranging (lidar) data with a validated analytical method for determining rooftop PV suitability employing geographic information systems. We use statistical models to extend this analysis to estimate the quantity and characteristics of roofs in areas not covered by lidar data. Finally, we model PV generation for all rooftops to yield technical potential estimates. At the national level, 8.13 billion m2 of suitable roof area could host 1118 GW of PV capacity, generating 1432 TWh of electricity per year. This would equate to 38.6% of the electricity that was sold in the contiguous United States in 2013. This estimate is substantially higher than a previous estimate made by the National Renewable Energy Laboratory. The difference can be attributed to increases in PV module power density, improved estimation of building suitability, higher estimates of total number of buildings, and improvements in PV performance simulation tools that previously tended to underestimate productivity. Also notable, the nationwide percentage of buildings suitable for at least some PV deployment is high—82% for buildings smaller than 5000 ft2 and over 99% for buildings larger than that. In most states, rooftop PV could enable small, mostly residential buildings to offset the majority of average household electricity consumption. Even in some states with a relatively poor solar resource, such as those in the Northeast, the residential sector has the potential to offset around 100% of its total electricity consumption with rooftop PV.

  3. Estimating rooftop solar technical potential across the US using a combination of GIS-based methods, lidar data, and statistical modeling

    DOE PAGES

    Gagnon, Pieter; Margolis, Robert; Melius, Jennifer; ...

    2018-01-05

    We provide a detailed estimate of the technical potential of rooftop solar photovoltaic (PV) electricity generation throughout the contiguous United States. This national estimate is based on an analysis of select US cities that combines light detection and ranging (lidar) data with a validated analytical method for determining rooftop PV suitability employing geographic information systems. We use statistical models to extend this analysis to estimate the quantity and characteristics of roofs in areas not covered by lidar data. Finally, we model PV generation for all rooftops to yield technical potential estimates. At the national level, 8.13 billion m2 of suitable roof area could host 1118 GW of PV capacity, generating 1432 TWh of electricity per year. This would equate to 38.6% of the electricity that was sold in the contiguous United States in 2013. This estimate is substantially higher than a previous estimate made by the National Renewable Energy Laboratory. The difference can be attributed to increases in PV module power density, improved estimation of building suitability, higher estimates of total number of buildings, and improvements in PV performance simulation tools that previously tended to underestimate productivity. Also notable, the nationwide percentage of buildings suitable for at least some PV deployment is high—82% for buildings smaller than 5000 ft2 and over 99% for buildings larger than that. In most states, rooftop PV could enable small, mostly residential buildings to offset the majority of average household electricity consumption. Even in some states with a relatively poor solar resource, such as those in the Northeast, the residential sector has the potential to offset around 100% of its total electricity consumption with rooftop PV.

  4. Estimating rooftop solar technical potential across the US using a combination of GIS-based methods, lidar data, and statistical modeling

    NASA Astrophysics Data System (ADS)

    Gagnon, Pieter; Margolis, Robert; Melius, Jennifer; Phillips, Caleb; Elmore, Ryan

    2018-02-01

    We provide a detailed estimate of the technical potential of rooftop solar photovoltaic (PV) electricity generation throughout the contiguous United States. This national estimate is based on an analysis of select US cities that combines light detection and ranging (lidar) data with a validated analytical method for determining rooftop PV suitability employing geographic information systems. We use statistical models to extend this analysis to estimate the quantity and characteristics of roofs in areas not covered by lidar data. Finally, we model PV generation for all rooftops to yield technical potential estimates. At the national level, 8.13 billion m2 of suitable roof area could host 1118 GW of PV capacity, generating 1432 TWh of electricity per year. This would equate to 38.6% of the electricity that was sold in the contiguous United States in 2013. This estimate is substantially higher than a previous estimate made by the National Renewable Energy Laboratory. The difference can be attributed to increases in PV module power density, improved estimation of building suitability, higher estimates of total number of buildings, and improvements in PV performance simulation tools that previously tended to underestimate productivity. Also notable, the nationwide percentage of buildings suitable for at least some PV deployment is high—82% for buildings smaller than 5000 ft2 and over 99% for buildings larger than that. In most states, rooftop PV could enable small, mostly residential buildings to offset the majority of average household electricity consumption. Even in some states with a relatively poor solar resource, such as those in the Northeast, the residential sector has the potential to offset around 100% of its total electricity consumption with rooftop PV.
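
    The headline numbers above are internally consistent, as a quick worked check shows (this is arithmetic on the reported figures, not new data):

    ```python
    # Consistency check of the abstract's reported totals.
    capacity_gw = 1118.0
    generation_twh = 1432.0

    # Implied average capacity factor of the rooftop fleet:
    cf = generation_twh * 1e12 / (capacity_gw * 1e9 * 8760)
    print(f"implied capacity factor: {cf:.1%}")        # ~14.6%

    # Implied 2013 contiguous-US retail sales from the 38.6% figure:
    sales_twh = generation_twh / 0.386
    print(f"implied 2013 sales: {sales_twh:.0f} TWh")  # ~3710 TWh
    ```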

  5. Prediction model of sinoatrial node field potential using high order partial least squares.

    PubMed

    Feng, Yu; Cao, Hui; Zhang, Yanbin

    2015-01-01

    High order partial least squares (HOPLS) is a novel data processing method. It is highly suitable for building prediction models that have tensor input and output. The objective of this study is to build a prediction model of the relationship between the sinoatrial node field potential and high glucose using HOPLS. The three sub-signals of the sinoatrial node field potential made up the model's input; the concentration and the actuation duration of high glucose made up the model's output. The results showed that, when predicting two-dimensional variables, HOPLS had the same predictive ability and a lower dispersion degree compared with partial least squares (PLS).

  6. Beyond the scope of Free-Wilson analysis: building interpretable QSAR models with machine learning algorithms.

    PubMed

    Chen, Hongming; Carlsson, Lars; Eriksson, Mats; Varkonyi, Peter; Norinder, Ulf; Nilsson, Ingemar

    2013-06-24

    A novel methodology was developed to build Free-Wilson like local QSAR models by combining R-group signatures and the SVM algorithm. Unlike Free-Wilson analysis this method is able to make predictions for compounds with R-groups not present in a training set. Eleven public data sets were chosen as test cases for comparing the performance of our new method with several other traditional modeling strategies, including Free-Wilson analysis. Our results show that the R-group signature SVM models achieve better prediction accuracy compared with Free-Wilson analysis in general. Moreover, the predictions of R-group signature models are also comparable to the models using ECFP6 fingerprints and signatures for the whole compound. Most importantly, R-group contributions to the SVM model can be obtained by calculating the gradient for R-group signatures. For most of the studied data sets, a significant correlation with that of a corresponding Free-Wilson analysis is shown. These results suggest that the R-group contribution can be used to interpret bioactivity data and highlight that the R-group signature based SVM modeling method is as interpretable as Free-Wilson analysis. Hence the signature SVM model can be a useful modeling tool for any drug discovery project.
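
    For a linear kernel the SVM's gradient with respect to the input features is simply its weight vector, so per-R-group contributions can be read off directly. The sketch below uses toy one-hot R-group indicators and activities; real R-group signatures are richer descriptors.

    ```python
    # Per-R-group contributions from a linear SVM (toy one-hot encoding).
    import numpy as np
    from sklearn.svm import SVR

    r_groups = ["H", "CH3", "Cl", "OCH3"]
    # Rows: compounds encoded by which R-group occupies one position (one-hot).
    X = np.eye(4)
    y = np.array([5.0, 5.8, 6.4, 5.2])   # toy pIC50-like activities

    model = SVR(kernel="linear", C=10.0).fit(X, y)
    contributions = model.coef_.ravel()   # gradient of the linear model
    for g, c in zip(r_groups, contributions):
        print(f"R-group {g}: contribution {c:+.2f}")
    ```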

  7. Organism-level models: When mechanisms and statistics fail us

    NASA Astrophysics Data System (ADS)

    Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.

    2014-03-01

    Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to arrive at a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.
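
    A Bayesian network of even two nodes already illustrates the mechanics: marginalize over what is unobserved. The sketch below encodes a toy occult-disease/treatment question with illustrative probabilities, not the paper's clinical estimates.

    ```python
    # Inference by enumeration in a two-node Bayesian network (toy numbers).
    p_occult = 0.25                                  # P(occult nodal disease)
    p_control = {                                    # P(control | occult, treated)
        (True, True): 0.90, (True, False): 0.55,
        (False, True): 0.97, (False, False): 0.95,
    }

    def p_regional_control(treated):
        """Marginalize over the unobserved occult-disease state."""
        return sum(
            (p_occult if occ else 1 - p_occult) * p_control[(occ, treated)]
            for occ in (True, False)
        )

    print("treat nodes: ", round(p_regional_control(True), 3))
    print("observe only:", round(p_regional_control(False), 3))
    ```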

  8. Spatial Information in Support of 3D Flood Damage Assessment of Buildings at Micro Level: A Review

    NASA Astrophysics Data System (ADS)

    Amirebrahimi, S.; Rajabifard, A.; Sabri, S.; Mendis, P.

    2016-10-01

    Floods, as the most common and costliest natural disaster around the globe, have adverse impacts on buildings which are considered as major contributors to the overall economic damage. With emphasis on risk management methods for reducing the risks to structures and people, estimating damage from potential flood events becomes an important task for identifying and implementing the optimal flood risk-reduction solutions. While traditional Flood Damage Assessment (FDA) methods focus on simple representation of buildings for large-scale damage assessment purposes, recent emphasis on buildings' flood resilience resulted in development of a sophisticated method that allows for a detailed and effective damage evaluation at the scale of building and its components. In pursuit of finding the suitable spatial information model to satisfy the needs of implementing such frameworks, this article explores the technical developments for an effective representation of buildings, floods and other required information within the built environment. The search begins with the Geospatial domain and investigates the state-of-the-art and relevant developments from data point of view in this area. It is further extended to other relevant disciplines in the Architecture, Engineering and Construction domain (AEC/FM) and finally, even some overlapping areas between these domains are considered and explored.

  9. Semantic Segmentation of Building Elements Using Point Cloud Hashing

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.

    2018-05-01

    For the interpretation of point clouds, the semantic definition of segments extracted from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, buildings can be classified quite well and simply by a human operator (e.g. an architect) into different building types and structural elements (dome, nave, transept, etc.), including particular building parts that are visually detected. The key part of the procedure is a novel method based on hashing, in which point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is also suitable for other buildings and objects characterized by a particular constructional typology (e.g. industrial objects in standardized environments with strict component design allowing clear semantic modelling).
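
    A toy version of the hashing idea, with all details assumed rather than taken from the paper: project a point cloud onto a plane, rasterize the projection into a binary image, and hash the bit pattern into a compact key that could be looked up against a catalogue of element types.

      import hashlib
      import numpy as np

      def projection_hash(points, resolution=0.5):
          # Project to the XY plane, rasterize into a binary occupancy image,
          # then hash the packed bit pattern as a compact shape key.
          idx = np.floor((points[:, :2] - points[:, :2].min(axis=0)) / resolution).astype(int)
          grid = np.zeros(idx.max(axis=0) + 1, dtype=np.uint8)
          grid[idx[:, 0], idx[:, 1]] = 1
          return hashlib.sha1(np.packbits(grid).tobytes()).hexdigest()

      cloud = np.random.rand(1000, 3) * 20.0   # synthetic point cloud
      print(projection_hash(cloud))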

  10. Verification and Updating of the Database of Topographic Objects with Geometric Information About Buildings by Means of Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Mendela-Anzlik, Małgorzata; Borkowski, Andrzej

    2017-06-01

    Airborne laser scanning (ALS) data are used mainly for the creation of precise digital elevation models. However, the informative potential stored in ALS data can also be used for updating spatial databases, including the Database of Topographic Objects (BDOT10k). Typically, geometric representations of buildings in the BDOT10k are equal to their entities in the Land and Property Register (EGiB). In this study, ALS is considered as a supporting data source. The thresholding method for original ALS data with the use of the alpha shape algorithm, proposed in this paper, allows for the extraction of points that represent the horizontal cross section of building walls, leading to the creation of vector geometric models of buildings that can then be used for updating the BDOT10k. This method also gives the possibility of easy verification of the up-to-dateness of both the BDOT10k and the district EGiB databases with respect to geometric information about buildings. For verification of the proposed methodology, classified ALS data acquired with a density of 4 points/m2 were used. The accuracy assessment of the identified building outlines was carried out by comparing them to the corresponding EGiB objects. The RMSE values for 78 buildings range from a few to tens of centimeters, with an average of about 0.5 m. At the same time, huge geometric discrepancies were revealed for several objects. Further analyses showed that these discrepancies could result from incorrect representations of buildings in the EGiB database.
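
    The outline-extraction step can be sketched with off-the-shelf tools. The following toy example (assuming the third-party alphashape and shapely packages, with synthetic points inside a rectangular footprint in place of thresholded ALS returns) recovers a footprint polygon; the alpha value and point density are invented.

      import numpy as np
      import alphashape

      rng = np.random.default_rng(0)
      # synthetic returns inside a 10 m x 6 m building footprint
      pts = rng.uniform((0.0, 0.0), (10.0, 6.0), size=(1500, 2))

      outline = alphashape.alphashape([tuple(p) for p in pts], alpha=1.0)
      print(outline.geom_type, round(outline.area, 1))   # shapely polygon, area near 60 m^2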

  11. Building Quakes: Detection of Weld Fractures in Buildings using High-Frequency Seismic Techniques

    NASA Astrophysics Data System (ADS)

    Heckman, V.; Kohler, M. D.; Heaton, T. H.

    2009-12-01

    Catastrophic fracture of welded beam-column connections in buildings was observed in the Northridge and Kobe earthquakes. Despite the structural importance of such connections, it can be difficult to locate damage in structural members underneath superficial building features. We have developed a novel technique to locate fracturing welds in buildings in real time using high-frequency information from seismograms. Numerical and experimental methods were used to investigate an approach for detecting the brittle fracture of welds of beam-column connections in instrumented steel moment-frame buildings through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalogue of Green’s functions for an instrumented building to detect high-frequency failure events in the building during a later earthquake by screening continuous data for the presence of one or more of the events. This was explored experimentally by comparing structural responses of a small-scale laboratory structure under a variety of loading conditions. Experimentation was conducted on a polyvinyl chloride frame model structure with data recorded at a sample rate of 2000 Hz using piezoelectric accelerometers and a 24-bit digitizer. Green’s functions were obtained by applying impulsive force loads at various locations along the structure with a rubber-tipped force transducer hammer. We performed a blind test using cross-correlation techniques to determine if it was possible to use the catalogue of Green’s functions to pinpoint the absolute times and locations of subsequent, induced failure events in the structure. A finite-element method was used to simulate the response of the model structure to various source mechanisms in order to determine the types of elastic waves that were produced as well as to obtain a general understanding of the structural response to localized loading and fracture.
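
    The screening step, matching continuous records against a catalogue of prerecorded Green's functions, amounts to sliding normalized cross-correlation. A minimal sketch with synthetic signals (not the authors' processing chain):

      import numpy as np

      def sliding_ncc(record, template):
          # Normalized cross-correlation of a template against a longer record.
          n = len(template)
          t = (template - template.mean()) / template.std()
          out = np.empty(len(record) - n + 1)
          for i in range(len(out)):
              w = record[i:i + n]
              out[i] = (t * (w - w.mean())).sum() / (n * w.std() + 1e-12)
          return out

      # bury a "fracture event" template in noise and recover its onset time
      rng = np.random.default_rng(1)
      template = rng.normal(size=200)
      record = rng.normal(scale=0.5, size=5000)
      record[1234:1434] += template
      print("detected onset sample:", sliding_ncc(record, template).argmax())  # ~1234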

  12. Environmental and Energy Aspects of Construction Industry and Green Buildings

    NASA Astrophysics Data System (ADS)

    Kauskale, L.; Geipele, I.; Zeltins, N.; Lecis, I.

    2017-04-01

    Green building is an important component of sustainable real estate market development, one reason being that the construction industry consumes a large amount of resources. The energy consumption of the construction industry results in greenhouse gas emissions, so green buildings, energy systems, building technologies, and other aspects play an important role in the sustainable development of the real estate market, construction, and the environment. The aim of the research is to analyse environmental aspects of sustainable real estate market development, focusing on the importance of green buildings at the industry level and related energy aspects. Literature review, historical and statistical data analysis, and logical access methods have been used in the research. The research confirmed the strong environmental rationale for and importance of environment-friendly buildings, and there are many green building benefits over the building life cycle. A future research direction is the environmental information process and its models.

  13. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
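
    The flavor of the Bayesian fusion PDT performs can be gestured at with a one-parameter example: an expert prior on ambient occupancy for a building type is combined with hypothetical survey observations via a conjugate normal-normal update (the actual PDT machinery is far richer; every number here is invented).

      import numpy as np

      prior_mean, prior_var = 2.0, 1.0          # expert judgment, people/1000 ft²
      obs = np.array([1.6, 2.3, 1.9, 2.8])      # hypothetical survey estimates
      obs_var = 0.25                            # assumed measurement variance

      post_var = 1.0 / (1.0 / prior_var + len(obs) / obs_var)
      post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
      print(f"posterior occupancy: {post_mean:.2f} +/- {np.sqrt(post_var):.2f}")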

  14. Real-time identification of indoor pollutant source positions based on neural network locator of contaminant sources and optimized sensor networks.

    PubMed

    Vukovic, Vladimir; Tabares-Velasco, Paulo Cesar; Srebric, Jelena

    2010-09-01

    A growing interest in security and occupant exposure to contaminants has revealed a need for fast and reliable identification of contaminant sources during incidental situations. To determine potential contaminant source positions in outdoor environments, current state-of-the-art modeling methods use computational fluid dynamics simulations on parallel processors. In indoor environments, current tools match accidental contaminant distributions with cases from precomputed databases of possible concentration distributions. These methods require intensive computations in pre- and postprocessing. On the other hand, neural networks have emerged as a tool for rapid concentration forecasting of outdoor environmental contaminants such as nitrogen oxides or sulfur dioxide. All of these modeling methods depend on the type of sensors used for real-time measurements of contaminant concentrations. A review of the existing sensor technologies revealed that no perfect sensor exists, but the intensity of work in this area promises better sensors in the near future. The main goal of the presented research study was to extend neural network modeling from outdoor to indoor identification of source positions, making this technology applicable to building indoor environments. The developed neural network Locator of Contaminant Sources was also used to optimize the number and allocation of contaminant concentration sensors for real-time prediction of indoor contaminant source positions. Such a prediction should take place within seconds of receiving real-time contaminant concentration sensor data. For the purpose of neural network training, a multizone program provided distributions of contaminant concentrations for known source positions throughout a test building. Trained networks had an output indicating contaminant source positions based on measured concentrations in different building zones. A validation case based on a real building layout and experimental data demonstrated the ability of this method to identify contaminant source positions. Future research will focus on integration with real sensor networks and on model improvements for more complicated contamination scenarios.
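
    The core of such a locator, a network mapping zone concentration readings to the index of the source zone, can be sketched with synthetic data in place of multizone-program output (the architecture and the concentration decay model below are assumptions, not the paper's):

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      n_zones, X, y = 8, [], []
      for _ in range(2000):
          src = rng.integers(n_zones)
          conc = np.exp(-0.7 * np.abs(np.arange(n_zones) - src))  # falls off with zone distance
          X.append(conc + rng.normal(scale=0.05, size=n_zones))   # sensor noise
          y.append(src)

      clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
      clf.fit(np.asarray(X), np.asarray(y))
      print("training accuracy:", clf.score(np.asarray(X), np.asarray(y)))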

  15. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  16. The effect of building façade on natural lighting (Case study: Building of phinisi tower UNM)

    NASA Astrophysics Data System (ADS)

    Jamala, Nurul

    2017-04-01

    Utilization of natural lighting is one factor that can lower the energy consumption of a building, and the building facade model affects how much natural light can be admitted into the building. The UNM Phinisi Tower building is a metaphor for the phinisi boat and uses a hyperbolic-paraboloid facade whose futuristic form reflects the application of science and technology; this object is therefore the focus of research on the effect of the building facade on natural lighting. A quantitative research method using the Autodesk Ecotect program was applied to determine illuminance values from natural lighting inside the building, both with and without the facade. The aim of the research is to determine the percentage of natural light entering the building when the building facade is used. The study concluded that illuminance values decline by 49%-74% (mean 60.3%) after the facade is added, so the building facade has a significant effect on natural lighting.

  17. Construction of ground-state preserving sparse lattice models for predictive materials simulations

    NASA Astrophysics Data System (ADS)

    Huang, Wenxuan; Urban, Alexander; Rong, Ziqin; Ding, Zhiwei; Luo, Chuan; Ceder, Gerbrand

    2017-08-01

    First-principles based cluster expansion models are the dominant approach in ab initio thermodynamics of crystalline mixtures, enabling the prediction of phase diagrams and novel ground states. However, despite recent advances, the construction of accurate models still requires a careful and time-consuming manual parameter tuning process for ground-state preservation, since this property is not guaranteed by default. In this paper, we present a systematic and mathematically sound method to obtain cluster expansion models that are guaranteed to preserve the ground states of their reference data. The method builds on the recently introduced compressive sensing paradigm for cluster expansion and employs quadratic programming to impose constraints on the model parameters. The robustness of our methodology is illustrated for two lithium transition metal oxides with relevance for Li-ion battery cathodes, i.e., Li2xFe2(1-x)O2 and Li2xTi2(1-x)O2, for which the construction of cluster expansion models with compressive sensing alone has proven to be challenging. We demonstrate that our method not only guarantees ground-state preservation on the set of reference structures used for the model construction, but also that out-of-sample ground-state preservation up to relatively large supercell sizes is achievable through a rapidly converging iterative refinement. This method provides a general tool for building robust, compressed and constrained physical models with predictive power.
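
    The essence of the approach, least-squares fitting of effective cluster interactions subject to linear ground-state inequalities, can be written as a small quadratic program. A sketch assuming the cvxpy package, with a random correlation matrix standing in for real cluster correlations:

      import numpy as np
      import cvxpy as cp

      rng = np.random.default_rng(0)
      Pi = rng.normal(size=(40, 12))   # correlation matrix (structures x clusters)
      E = Pi @ rng.normal(size=12)     # reference energies
      gs, other = 0, np.arange(1, 40)  # pretend structure 0 is a known ground state

      J = cp.Variable(12)              # effective cluster interactions
      fit = cp.sum_squares(Pi @ J - E) + 0.1 * cp.norm1(J)   # compressive-sensing-style fit
      constraints = [Pi[gs] @ J <= Pi[i] @ J for i in other] # keep structure 0 lowest in energy
      cp.Problem(cp.Minimize(fit), constraints).solve()
      print(np.round(J.value, 3))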

  18. Building of Reusable Reverse Logistics Model and its Optimization Considering the Decision of Backorder or Next Arrival of Goods

    NASA Astrophysics Data System (ADS)

    Lee, Jeong-Eun; Gen, Mitsuo; Rhee, Kyong-Gu; Lee, Hee-Hyol

    This paper deals with building a reusable reverse logistics model that considers the decision between a backorder and the next arrival of goods. An optimization method is proposed to minimize the transportation cost and the volume of backorders or next-arrival goods arising from Just-in-Time delivery at the final delivery stage between the manufacturer and the processing center. Sub-optimal delivery routes are determined through optimization algorithms using a priority-based genetic algorithm and a hybrid genetic algorithm. Based on a case study of a distilling and sales company in Busan, Korea, a new model of reusable reverse logistics for empty bottles is built, and the effectiveness of the proposed method is verified.

  19. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.

    This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  20. Environmental management and monitoring for education building development

    NASA Astrophysics Data System (ADS)

    Masri, R. M.

    2018-05-01

    The purposes of the research were: (1) to design and implement a conceptual and functional model for environmental management and monitoring of education building development; (2) to prepare standard operational procedures for such management and monitoring; (3) to assess the physical-chemical, biological, and social-economic environmental components so that development is sustainable; and (4) to design environmental management and monitoring programs that decrease negative and increase positive impacts of education building development activities. A descriptive method was used for the research. The Cibiru UPI Campus, Bandung, West Java, Indonesia was the study location. The research was conducted from July 2016 to January 2017. Spatial and activity analyses were used to assess the physical-chemical, biological, and social-economic environmental components. Environmental management and monitoring for education building development can decrease water, air, and soil pollution and environmental degradation caused by education building development activities.

  1. Fuzzy control for nonlinear structure with semi-active friction damper

    NASA Astrophysics Data System (ADS)

    Zhao, Da-Hai; Li, Hong-Nan

    2007-04-01

    The implementation of a semi-active friction damper for vibration mitigation of seismic structures generally requires an efficient control strategy. In this paper, fuzzy logic based on the Takagi-Sugeno model is proposed for controlling a semi-active friction damper installed on a nonlinear building subjected to strong earthquakes. The continuous Bouc-Wen hysteretic model for the stiffness is used to describe the nonlinear characteristics of the building. The optimal sliding force of the friction damper is determined by nonlinear time history analysis under normal earthquakes. The Takagi-Sugeno fuzzy logic model is employed to adjust the clamping force acting on the friction damper according to the semi-active control strategy. Numerical simulation results demonstrate that the proposed method is very efficient in reducing the peak inter-story drift and acceleration of the nonlinear building structure under earthquake excitations.
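
    For reference, a Bouc-Wen single-degree-of-freedom storey under sinusoidal base excitation can be integrated directly; all parameters below are illustrative, and the fuzzy clamping-force controller itself is omitted.

      import numpy as np
      from scipy.integrate import solve_ivp

      m, c, k, alpha = 1.0e5, 2.0e4, 4.0e6, 0.6   # mass, damping, stiffness, post-yield ratio
      A, beta, gamma_, n = 1.0, 0.5, 0.5, 1.0     # Bouc-Wen shape parameters

      def rhs(t, s):
          x, v, z = s
          ag = 0.3 * 9.81 * np.sin(2 * np.pi * t)          # ground acceleration, m/s^2
          f = alpha * k * x + (1 - alpha) * k * z          # hysteretic restoring force
          dz = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma_ * v * abs(z) ** n
          return [v, (-c * v - f) / m - ag, dz]

      sol = solve_ivp(rhs, (0, 10), [0.0, 0.0, 0.0], max_step=0.001)
      print("peak drift (m):", np.abs(sol.y[0]).max())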

  2. Design of passive interconnections in tall buildings subject to earthquake disturbances to suppress inter-storey drifts

    NASA Astrophysics Data System (ADS)

    Yamamoto, K.; Smith, MC

    2016-09-01

    This paper studies the problem of passive control of a multi-storey building subjected to an earthquake disturbance. The building is represented as a homogeneous mass chain model, i.e., a chain of identical masses in which there is an identical passive connection between neighbouring masses and a similar connection to a movable point. The paper considers passive interconnections of the most general type, which may require the use of inerters in addition to springs and dampers. It is shown that the scalar transfer functions from the disturbance to a given inter-storey drift can be represented as complex iterative maps. Using these expressions, two graphical approaches are proposed: one gives a method to achieve a prescribed value for the uniform boundedness of these transfer functions independent of the length of the mass chain, and the other is for a fixed length of the mass chain. A case study is presented to demonstrate the effectiveness of the proposed techniques using a 10-storey building model. The disturbance suppression performance of the designed interconnection is also verified for a 10-storey building model which has a different stiffness distribution but with the same undamped first natural frequency as the homogeneous model.

  3. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

    EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation, depending on the size of the building. Manually creating these files is a time-consuming process that would not be practical when trying to create input files for the thousands of buildings needed to simulate national building energy performance. To streamline the creation of EnergyPlus input files, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine, while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called "make", the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the results to national energy consumption estimates.
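
    The CSV-to-idf expansion can be illustrated with plain text templating; the template fields and CSV columns below are invented for illustration, not the NREL Preprocessor's actual interface, and a buildings.csv with a name column is assumed to exist.

      import csv
      import pathlib
      import string

      # Invented one-object template; a real idf holds hundreds of objects.
      TEMPLATE = string.Template(
          "Building, $name, 0.0, City, 0.04, 0.4, FullExterior, 25, 6;\n")

      with open("buildings.csv", newline="") as fh:
          for row in csv.DictReader(fh):       # one row of high-level parameters per building
              idf = TEMPLATE.substitute(name=row["name"])
              pathlib.Path(f"{row['name']}.idf").write_text(idf)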

  4. Illuminating Tradespace Decisions Using Efficient Experimental Space-Filling Designs for the Engineered Resilient System Architecture

    DTIC Science & Technology

    2015-06-30

    7. Building Statistical Metamodels using Simulation Experimental Designs; 7.1. Statistical Design … system design drivers across several different domain models; our methodology uses statistical metamodeling to approximate the simulations' behavior. … We build metamodels using a number of statistical methods that include stepwise regression, boosted trees, neural nets, and bootstrap forest.

  5. Illuminating Tradespace Decisions Using Efficient Experimental Space-Filling Designs for the Engineered Resilient System Architecture

    DTIC Science & Technology

    2015-06-01

    7. Building Statistical Metamodels using Simulation Experimental Designs; 7.1. Statistical Design … system design drivers across several different domain models; our methodology uses statistical metamodeling to approximate the simulations' behavior. … We build metamodels using a number of statistical methods that include stepwise regression, boosted trees, neural nets, and bootstrap forest.

  6. Building SWPBIS Capacity in Rural Schools through Building-Based Coaching: Early Findings from a District-Based Model

    ERIC Educational Resources Information Center

    Cavanaugh, Brian; Swan, Meaghan

    2015-01-01

    School-wide Positive Behavioral Interventions and Supports (SWPBIS) is a widely used framework for supporting student social and academic behavior. Implementation science indicates that one effective way to implement and scale-up practices, such as SWPBIS, is through coaching; thus, there is a need for efficient, cost-effective methods to develop…

  7. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing, the Klemeš Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions, and ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model building decisions. In one case, we show the set of model building decisions has a low probability of correctly supporting the upgrade decision. In the other case, we show evidence suggesting another set of model building decisions has a high probability of correctly supporting the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.
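
    A toy numeric version of a DCT-style calculation (all numbers invented) maps an ensemble of flood-peak predictions, spread by model-building uncertainty, onto the binary upgrade decision and reports how often the correct decision is supported:

      import numpy as np

      rng = np.random.default_rng(2)
      design_capacity = 850.0                         # m^3/s, decision threshold
      true_peak = 910.0                               # so "upgrade" is the right call
      ensemble = true_peak + rng.normal(0, 80, 500)   # model + parameter uncertainty

      decisions = ensemble > design_capacity          # model recommends "upgrade"
      print("P(correct decision) =", decisions.mean())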

  8. ROOFN3D: Deep Learning Training Data for 3d Building Reconstruction

    NASA Astrophysics Data System (ADS)

    Wichmann, A.; Agoub, A.; Kada, M.

    2018-05-01

    Machine learning methods have gained in importance through the latest development of artificial intelligence and computer hardware. Particularly approaches based on deep learning have shown that they are able to provide state-of-the-art results for various tasks. However, the direct application of deep learning methods to improve the results of 3D building reconstruction is often not possible due, for example, to the lack of suitable training data. To address this issue, we present RoofN3D which provides a new 3D point cloud training dataset that can be used to train machine learning models for different tasks in the context of 3D building reconstruction. It can be used, among others, to train semantic segmentation networks or to learn the structure of buildings and the geometric model construction. Further details about RoofN3D and the developed data preparation framework, which enables the automatic derivation of training data, are described in this paper. Furthermore, we provide an overview of other available 3D point cloud training data and approaches from current literature in which solutions for the application of deep learning to unstructured and not gridded 3D point cloud data are presented.

  9. Assessment of Masonry Buildings Subjected to Landslide-Induced Settlements: From Load Path Method to Evolutionary Optimization Method

    NASA Astrophysics Data System (ADS)

    Palmisano, Fabrizio; Elia, Angelo

    2017-10-01

    One of the main difficulties when dealing with landslide structural vulnerability is the diagnosis of the causes of crack patterns. This is partly due to the excessive complexity of models based on classical structural mechanics, which makes them inappropriate especially when a rapid vulnerability assessment has to be performed at the territorial scale. This is why a new approach based on a 'simple model' (i.e. the Load Path Method, LPM) has been proposed by Palmisano and Elia for interpreting the behaviour of masonry buildings subjected to landslide-induced settlements. However, the LPM is most useful for rapidly finding the 'most plausible solution' rather than the exact solution, and optimization algorithms are necessary to find that solution. In this scenario, this article aims to show how the Bidirectional Evolutionary Structural Optimization method by Huang and Xie can be very useful for optimizing the strut-and-tie models obtained with the Load Path Method.

  10. Design and implementation of an air-conditioning system with storage tank for load shifting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, Y.Y.; Wu, C.J.; Liou, K.L.

    1987-11-01

    The experience with the design, simulation, and implementation of an air-conditioning system with a chilled water storage tank is presented in this paper. The system is used to shift the air-conditioning load of residential and commercial buildings from the on-peak to the off-peak period. Demand-side load management can thus be achieved if many buildings are equipped with such storage devices. In the design of this system, a lumped-parameter circuit model is first employed to simulate the heat transfer within the air-conditioned building so that the required capacity of the storage tank can be determined. Then, a set of desirable parameters for the temperature controller of the system is determined using the parameter plane method and the root locus method. The validity of the proposed mathematical model and design approach is verified by comparing the results obtained from field tests with those from the computer simulations. A cost-benefit analysis of the system is also discussed.
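
    The lumped-parameter idea reduces the building to a thermal RC circuit. A single-node sketch with invented resistance, capacitance, and chiller schedule:

      import numpy as np
      from scipy.integrate import solve_ivp

      R = 0.005   # K/W, effective envelope resistance (assumed)
      C = 4.0e7   # J/K, effective thermal capacitance (assumed)

      def rhs(t, T):
          T_out = 30 + 5 * np.sin(2 * np.pi * t / 86400)   # outdoor temperature, deg C
          q_cool = 5000.0 if 9 * 3600 < t % 86400 < 18 * 3600 else 0.0  # chiller, W
          return [((T_out - T[0]) / R - q_cool) / C]       # energy balance on one node

      sol = solve_ivp(rhs, (0, 2 * 86400), [26.0], max_step=60.0)
      print("indoor temperature range:", sol.y[0].min(), sol.y[0].max())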

  11. Building Capacity to Use Earth Observations in Decision Making: A Case Study of NASA's DEVELOP National Program Methods and Best Practices

    NASA Astrophysics Data System (ADS)

    Childs-Gleason, L. M.; Ross, K. W.; Crepps, G.; Miller, T. N.; Favors, J. E.; Rogers, L.; Allsbrook, K. N.; Bender, M. R.; Ruiz, M. L.

    2015-12-01

    NASA's DEVELOP National Program fosters an immersive research environment for dual capacity building. Through rapid feasibility Earth science projects, the future workforce and current decision makers are engaged in research projects to build skills and capabilities to use Earth observation in environmental management and policy making. DEVELOP conducts over 80 projects annually, successfully building skills through partnerships with over 150 organizations and providing over 350 opportunities for project participants each year. Filling a void between short-term training courses and long-term research projects, the DEVELOP model has been successful in supporting state, local, federal and international government organizations to adopt methodologies and enhance decision making processes. This presentation will highlight programmatic best practices, feedback from participants and partner organizations, and three sample case studies of successful adoption of methods in the decision making process.

  12. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy-to-use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
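
    The end of such a pipeline hands off a standard SBML file. A minimal sanity check of a generated model, assuming the python-libsbml package and a file design.xml produced by a workflow like the one described:

      import libsbml

      doc = libsbml.readSBMLFromFile("design.xml")   # hypothetical pipeline output
      if doc.getNumErrors() > 0:
          doc.printErrors()
      else:
          model = doc.getModel()
          print(model.getId(), "species:", model.getNumSpecies(),
                "reactions:", model.getNumReactions())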

  13. Evaluation of soil-foundation-structure interaction effects on seismic response demands of multi-story MRF buildings on raft foundations

    NASA Astrophysics Data System (ADS)

    Abdel Raheem, Shehata E.; Ahmed, Mohamed M.; Alazrak, Tarek M. A.

    2015-03-01

    Soil conditions have a great deal to do with damage to structures during earthquakes. Hence, investigation of the energy transfer mechanism from soils to buildings during earthquakes is critical for the seismic design of multi-story buildings and for upgrading existing structures, and the need for research into soil-structure interaction (SSI) problems is greater than ever. Moreover, recent studies show that the effects of SSI may be detrimental to the seismic response of a structure, and neglecting SSI in analysis may lead to an un-conservative design. Despite this, the conventional design procedure usually assumes fixity at the base of the foundation, neglecting the flexibility of the foundation, the compressibility of the underlying soil and, consequently, the effect of foundation settlement on further redistribution of bending moment and shear force demands. Hence, the SSI analysis of multi-story buildings is the main focus of this research; the effects of SSI are analyzed for a typical multi-story building resting on a raft foundation. Three methods of analysis are used to evaluate the seismic demands of the target moment-resistant frame buildings: the equivalent static load method, the response spectrum method, and nonlinear time history analysis with a suite of nine time-history records. A three-dimensional FE model is constructed to investigate the effects of different soil conditions and numbers of stories on the vibration characteristics and seismic response demands of building structures. Numerical results obtained using the SSI model with different soil conditions are compared to those corresponding to the fixed-base modeling assumption. The peak responses of story shear, story moment, story displacement, story drift, moments at beam ends, and forces in the inner columns are analyzed. The results of the different analysis approaches are used to evaluate the advantages, limitations, and ease of application of each approach for seismic analysis.

  14. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
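
    A greatly simplified excitable-media cellular automaton (a Greenberg-Hastings-style rule, not the authors' quantitative automaton) conveys the state-update idea that the paper extends with quantitative cell models:

      import numpy as np

      REST, EXCITED, REFRACTORY = 0, 1, 2
      grid = np.zeros((100, 100), dtype=np.uint8)
      grid[50, 50] = EXCITED                           # point stimulus

      for step in range(100):
          excited_nbr = np.zeros_like(grid, dtype=bool)
          for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              excited_nbr |= np.roll(grid, shift, axis=(0, 1)) == EXCITED
          new = grid.copy()
          new[(grid == REST) & excited_nbr] = EXCITED  # wavefront spreads
          new[grid == EXCITED] = REFRACTORY            # then cells recover
          new[grid == REFRACTORY] = REST
          grid = new
      print("excited cells after 100 steps:", int((grid == EXCITED).sum()))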

  15. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that allow better identification of risks in the construction of high-rise buildings and their management throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessment of the potential impact of project risks and of their probabilities. The paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the regression analysis of project data. The suggested algorithms for estimating the parameters of statistical models yield reliable estimates. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
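
    The contrast between classical and robust estimation is easy to reproduce. The sketch below (synthetic project data with one-sided outliers mimicking asymmetric contamination) compares ordinary least squares with Huber M-estimation as implemented in scikit-learn:

      import numpy as np
      from sklearn.linear_model import HuberRegressor, LinearRegression

      rng = np.random.default_rng(3)
      X = rng.uniform(0, 10, size=(100, 1))            # e.g., project duration driver
      y = 2.5 * X.ravel() + rng.normal(0, 0.5, 100)    # clean linear relationship
      y[:10] += rng.uniform(5, 15, 10)                 # asymmetric "contamination"

      print("OLS slope:  ", LinearRegression().fit(X, y).coef_[0])
      print("Huber slope:", HuberRegressor().fit(X, y).coef_[0])   # closer to 2.5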

  16. Refining atmosphere light to improve the dark channel prior algorithm

    NASA Astrophysics Data System (ADS)

    Gan, Ling; Li, Dagang; Zhou, Can

    2017-05-01

    The defogged image obtained through the dark channel prior algorithm has some shortcomings, such as color distortion, dim light, and loss of detail near the observer. The main reasons are that the atmospheric light is estimated as a single value and its variation with scene depth is not considered. We therefore model the atmospheric light, one parameter of the defogging model. First, we discretize the atmospheric light into equivalent points and build a discrete model of the light. Second, we build several rough candidate models by analyzing the relationship between the atmospheric light and the medium transmission. Finally, by analyzing the results of many experiments qualitatively and quantitatively, we obtain the selected, optimized model. Although this method increases the run time slightly, the evaluation metrics, histogram correlation coefficient and peak signal-to-noise ratio, improve significantly, and the defogged result conforms better to human vision. The color and the details near the observer in the defogged image are also better than those achieved by the original method.
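
    For orientation, the prior itself, before any of the paper's refinements, computes a per-pixel minimum over color channels and a local patch; the single-valued atmospheric-light estimate shown at the end is exactly the simplification the paper argues against.

      import numpy as np

      def dark_channel(img, patch=15):
          # img: HxWx3 float array in [0, 1]; returns the per-pixel dark channel.
          mins = img.min(axis=2)                     # minimum over RGB
          pad = patch // 2
          padded = np.pad(mins, pad, mode="edge")
          out = np.empty_like(mins)
          for i in range(mins.shape[0]):
              for j in range(mins.shape[1]):         # min filter over a local patch
                  out[i, j] = padded[i:i + patch, j:j + patch].min()
          return out

      hazy = np.random.rand(64, 64, 3)               # synthetic stand-in image
      dc = dark_channel(hazy)
      # crude single-valued estimate: mean of the top 0.1% dark-channel values
      A = np.sort(dc.ravel())[-int(64 * 64 * 0.001):].mean()
      print("single-valued atmospheric light estimate:", A)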

  17. [Measurement of Water COD Based on UV-Vis Spectroscopy Technology].

    PubMed

    Wang, Xiao-ming; Zhang, Hai-liang; Luo, Wei; Liu, Xue-mei

    2016-01-01

    Ultraviolet/visible (UV/Vis) spectroscopy technology was used to measure water COD. A total of 135 water samples were collected from Zhejiang province. Raw spectra with three different pretreatment methods (Multiplicative Scatter Correction (MSC), Standard Normal Variate (SNV), and 1st derivatives) were compared to determine the optimal pretreatment method for analysis. Spectral variable selection is an important strategy in spectrum modeling, because it leads to a parsimonious data representation and can produce multivariate models with better performance. In order to simplify the calibration models, the preprocessed spectra were used to select sensitive wavelengths by competitive adaptive reweighted sampling (CARS), Random Frog, and Successive Genetic Algorithm (GA) methods. Different numbers of sensitive wavelengths were selected by the different variable selection methods with the SNV preprocessing method. Partial least squares (PLS) was used to build models with the full spectra, and an Extreme Learning Machine (ELM) was applied to build models with the selected wavelength variables. The overall results showed that the ELM models performed better than the PLS model, and the ELM model with wavelengths selected by CARS obtained the best results, with a determination coefficient (R²), RMSEP, and RPD of 0.82, 14.48, and 2.34 for the prediction set. The results indicate that it is feasible to use UV/Vis spectroscopy with characteristic wavelengths obtained by the CARS variable selection method, combined with ELM calibration, for the rapid and accurate determination of COD in aquaculture water. Moreover, this study lays the foundation for further implementation of online analysis of aquaculture water and rapid determination of other water quality parameters.
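
    The PLS baseline is straightforward to reproduce; the sketch below uses synthetic spectra (scikit-learn has no ELM or CARS implementation, so only the full-spectrum PLS step is shown, and the data are invented):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(4)
      spectra = rng.normal(size=(135, 300))   # 135 samples x 300 wavelengths
      cod = spectra[:, 40] * 3 + spectra[:, 120] * 2 + rng.normal(scale=0.2, size=135)

      Xtr, Xte, ytr, yte = train_test_split(spectra, cod, random_state=0)
      pls = PLSRegression(n_components=5).fit(Xtr, ytr)
      print("R² on prediction set:", pls.score(Xte, yte))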

  18. Direct-method SAD phasing with partial-structure iteration: towards automation.

    PubMed

    Wang, J W; Chen, J R; Gu, Y X; Zheng, C D; Fan, H F

    2004-11-01

    The probability formula for direct-method SAD (single-wavelength anomalous diffraction) phasing proposed by Fan & Gu (1985, Acta Cryst. A41, 280-284) contains partial-structure information in the form of a Sim-weighting term. Previously, only the substructure of anomalous scatterers had been included in this term. In the case that the subsequent density modification and model building yield only structure fragments, which do not straightforwardly lead to the complete solution, the partial structure can be fed back into the Sim-weighting term of the probability formula in order to strengthen its phasing power and to benefit the subsequent automatic model building. The procedure has been tested with experimental SAD data from two known proteins with copper and sulfur as the anomalous scatterers.

  19. An Improved Snake Model for Refinement of Lidar-Derived Building Roof Contours Using Aerial Images

    NASA Astrophysics Data System (ADS)

    Chen, Qi; Wang, Shugen; Liu, Xiuguo

    2016-06-01

    Building roof contours are considered very important geometric data and have been widely applied in many fields, including but not limited to urban planning, land investigation, change detection, and military reconnaissance. Currently, the demand for building contours at a finer scale (especially in urban areas) has been raised in a growing number of studies, such as urban environment quality assessment, urban sprawl monitoring, and urban air pollution modelling. LiDAR is known as an effective means of acquiring 3D roof points with high elevation accuracy. However, the precision of building contours obtained from LiDAR data is restricted by its relatively low scanning resolution. With the use of texture information from high-resolution imagery, the precision can be improved. In this study, an improved snake model is proposed to refine initial building contours extracted from LiDAR. First, the improved snake model is constructed with constraints on deviation angle, image gradient, and area. Then, the nodes of the contour are moved within a certain range to find the best result using a greedy algorithm. Considering both precision and efficiency, the candidate shift positions of the contour nodes are constrained, and the search strategy for candidate nodes is explicitly designed. Experiments on three datasets indicate that the proposed method for building contour refinement is effective and feasible. The average quality index improves from 91.66% to 93.34%, and the per-building evaluation shows that 77.0% of the contours are updated with a higher quality index.
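
    The greedy refinement loop can be miniaturized as follows: each contour node tests a small window of candidate positions and moves to the one with the strongest image response. Only the image term of the paper's energy is mimicked here; the deviation-angle and area constraints are omitted, and the "image" is synthetic.

      import numpy as np

      # Synthetic "image energy": gradient response along a rectangle's edges.
      energy = np.zeros((100, 100))
      energy[20:80, 20] = energy[20:80, 79] = 1.0
      energy[20, 20:80] = energy[79, 20:80] = 1.0

      nodes = [(17, 50), (50, 83), (83, 50), (50, 17)]   # rough initial contour nodes

      for _ in range(5):                                 # greedy refinement passes
          for k, (i, j) in enumerate(nodes):
              window = [(i + di, j + dj) for di in range(-3, 4) for dj in range(-3, 4)]
              # strongest image response wins; ties resolved by staying close
              nodes[k] = min(window, key=lambda p: (-energy[p], abs(p[0] - i) + abs(p[1] - j)))
      print(nodes)   # nodes snap onto the rectangle edges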

  20. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-08

    Earthquake is a destructive phenomenon that shakes different parts of the earth every year and causes much destruction. Iran is one of the most earthquake-prone (high-seismicity) parts of the world and suffers heavy economic damage and loss of life each year; schools are among the most important places to protect during such crises. There was no special supervision of the design and construction of school buildings in Tehran until the late 70's, and as Tehran sits on faults, the instability of such buildings may cause irrecoverable economic damage and, especially, loss of life; preventing this is therefore an urgent need. For this purpose, some of the schools built during 67-78, mostly with steel braced-frame structures, were selected. First, by evaluating the selected samples, gathering information, and conducting a visual survey, the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was devised and applied. Then, for quantitative evaluation, a number of the buildings from the qualitative evaluation were re-evaluated using computer 3D models and nonlinear static analysis methods, and finally the real behavior of the structures in earthquakes was studied with the nonlinear dynamic analysis method. The results of the qualitative and quantitative evaluations were compared, and a proper pattern for seismic evaluation of educational buildings is presented. The results can also guide those in charge of retrofitting or, if necessary, rebuilding the schools.

  1. Development and application of EEAST: a life cycle based model for use of harvested rainwater and composting toilets in buildings.

    PubMed

    Devkota, J; Schlachter, H; Anand, C; Phillips, R; Apul, Defne

    2013-11-30

    Harvested rainwater systems and composting toilets are expected to be an important part of sustainable solutions in buildings. Yet, to date, a model evaluating their economic and environmental impact has been missing. To address this need, a life cycle based model, EEAST, was developed. EEAST was designed to compare the business as usual (BAU) case of using potable water for toilet flushing and irrigation to alternative scenarios based on rainwater harvesting and composting toilet technologies. In EEAST, building characteristics, occupancy, and precipitation are used to size the harvested rainwater and composting toilet systems. Then, life cycle costing and life cycle assessment methods are used to estimate cost, energy, and greenhouse gas (GHG) emission payback periods (PPs) for five alternative scenarios. The scenarios modeled include use of harvested rainwater for toilet flushing, for irrigation, or both; and use of composting toilets with or without harvested rainwater use for irrigation. A sample simulation using EEAST showed that, for the office building modeled, the cost PPs were greater than the energy PPs, which in turn were greater than the GHG emission PPs. This was primarily due to the energy- and emission-intensive nature of the centralized water and wastewater infrastructure. The sample simulation also suggested that composting toilets may have the best performance in all criteria. However, EEAST does not explicitly model solids management and as such may give composting toilets an unfair advantage compared to flush-based toilets. EEAST results were found to be very sensitive to the cost values used in the model. With the availability of EEAST, analyses of life cycle cost, energy, and GHG emissions can now be performed fairly easily by building designers and researchers. Future work is recommended to further improve EEAST and to evaluate it for different types of buildings and climates so as to better understand when composting toilets and harvested rainwater systems outperform the BAU case in building design.
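
    The payback-period bookkeeping at the core of such a model is simple arithmetic once the life-cycle inventories are in hand; all values below are invented placeholders, not EEAST outputs:

      # extra capital cost or embodied burden divided by annual savings vs. BAU
      capital_extra = 18000.0        # USD, harvested-rainwater system premium
      annual_savings = 1500.0        # USD/yr, avoided potable water
      cost_pp = capital_extra / annual_savings

      embodied_energy_extra = 90.0   # GJ, extra embodied energy
      annual_energy_saved = 12.0     # GJ/yr, avoided centralized treatment energy
      energy_pp = embodied_energy_extra / annual_energy_saved

      print(f"cost payback: {cost_pp:.1f} yr, energy payback: {energy_pp:.1f} yr")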

  2. Estimation of the Relationship Between Remotely Sensed Anthropogenic Heat Discharge and Building Energy Use

    NASA Technical Reports Server (NTRS)

    Zhou, Yuyu; Weng, Qihao; Gurney, Kevin R.; Shuai, Yanmin; Hu, Xuefei

    2012-01-01

    This paper examined the relationship between remotely sensed anthropogenic heat discharge and energy use from residential and commercial buildings across multiple scales in the city of Indianapolis, Indiana, USA. The anthropogenic heat discharge was estimated with a remote sensing-based surface energy balance model, which was parameterized using land cover, land surface temperature, albedo, and meteorological data. The building energy use was estimated using a GIS-based building energy simulation model in conjunction with Department of Energy/Energy Information Administration survey data, the Assessor's parcel data, GIS floor areas data, and remote sensing-derived building height data. The spatial patterns of anthropogenic heat discharge and energy use from residential and commercial buildings were analyzed and compared. Quantitative relationships were evaluated across multiple scales from pixel aggregation to census block. The results indicate that anthropogenic heat discharge is consistent with building energy use in terms of the spatial pattern, and that building energy use accounts for a significant fraction of anthropogenic heat discharge. The research also implies that the relationship between anthropogenic heat discharge and building energy use is scale-dependent. The simultaneous estimation of anthropogenic heat discharge and building energy use via two independent methods improves the understanding of the surface energy balance in an urban landscape. The anthropogenic heat discharge derived from remote sensing and meteorological data may be able to serve as a spatial distribution proxy for spatially-resolved building energy use, and even for fossil-fuel CO2 emissions if additional factors are considered.

  3. Theoretical basis of the DOE-2 building energy use analysis program

    NASA Astrophysics Data System (ADS)

    Curtis, R. B.

    1981-04-01

    A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting-factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.

  4. Automatic Generation of Building Models with Levels of Detail 1-3

    NASA Astrophysics Data System (ADS)

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2016-06-01

    We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start by orienting unsorted image sets (Mayer et al., 2012), then compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  5. Reduced modeling of signal transduction – a modular approach

    PubMed Central

    Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter

    2007-01-01

    Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches, further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which allows the model to be dissected into smaller modules, called layers, that can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable. Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494

  6. Estimate Tsunami Flow Conditions and Large-Debris Tracks for the Design of Coastal Infrastructures along Coastlines of the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.

    2017-12-01

    The increasing potential tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR using high-resolution grids, and computes tsunami inundation depths and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the Cascadia Subduction Zone. Three Cascadia scenarios (two deterministic scenarios, XXL1 and L1, and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard) are simulated using a combination of a depth-averaged shallow-water model for offshore propagation and a Boussinesq-type model for onshore inundation. We discuss the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of the model results, particularly the inundation depth and flow speed for a new building at Newport, Oregon, which will also be designated as a tsunami vertical evacuation shelter. We show that the ASCE 7-16 consistent hazards fall between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we utilize the inundation model results to numerically compute tracks of large vessels in the vicinity of the building site and to estimate whether these vessels would strike the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios. A two-step study is carried out: first, tracks of massless particles are studied; then, large vessels with assigned mass are simulated, considering drag force, inertial force, ship grounding, and mooring. The simulation results show that none of the large vessels would strike the building site in any of the tested scenarios.
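
    The two-step debris-tracking idea (massless tracers, then massive vessels with drag) can be sketched as below; the flow field, drag coefficient, and vessel mass are invented for illustration and are not the paper's Boussinesq model output:

    ```python
    import numpy as np

    def flow(x, t):
        """Toy depth-averaged flow field (m/s); stands in for inundation model output."""
        return np.array([1.5, 0.3 * np.sin(0.1 * x[0])])

    def track(x0, mass=None, drag=50.0, dt=1.0, steps=200):
        x, v = np.asarray(x0, float), np.zeros(2)
        for k in range(steps):
            u = flow(x, k * dt)
            if mass is None:
                v = u                               # massless tracer follows the flow
            else:
                rel = u - v                         # quadratic drag toward flow velocity
                v = v + dt * (drag / mass) * np.linalg.norm(rel) * rel
            x = x + dt * v
        return x

    print("tracer end:", track([0.0, 0.0]))
    print("vessel end:", track([0.0, 0.0], mass=2.0e4))   # heavier body lags the flow
    ```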

  7. Scalable Deployment of Advanced Building Energy Management Systems

    DTIC Science & Technology

    2013-06-01

    From the report's acronym list and abstract: Building Automation and Control Network; BDAS, Building Data Acquisition System; BEM, building energy model; BIM, building information modeling; BMS... A prototype toolkit to seamlessly and automatically transfer a Building Information Model (BIM) to a Building Energy Model (BEM) has been developed to circumvent the need to manually construct and maintain a detailed building energy simulation model.

  8. Causes and Solutions for High Energy Consumption in Traditional Buildings Located in Hot Climate Regions

    NASA Astrophysics Data System (ADS)

    Barayan, Olfat Mohammad

    A considerable amount of money is spent on high energy consumption in traditional buildings located in hot climate regions. High energy consumption is significantly influenced by several factors, including building materials, orientation, mass, and opening sizes. This paper aims to identify these causes and find practical solutions to reduce the annual cost of energy bills. For the purpose of this study, a simulation research method was followed. A comparison between two Revit models was also created to point out the major cause of high energy consumption. By analysing different orientations, wall insulation, and window glazing, and by applying other high-performance building techniques, it was confirmed that appropriate building materials play a vital role in energy cost. Therefore, the ability to reduce energy cost by more than 50% in traditional buildings depends on a careful balance of building materials, mass, orientation, and type of window glazing.

  9. A concept of integrated environmental approach for building upgrades and new construction: Part 1—setting the stage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bomberg, Mark; Gibson, Michael; Zhang, Jian

    This article highlights the need for an active role for building physics in the development of near-zero energy buildings while analyzing an example of an integrated system for the upgrade of existing buildings. The science called either Building Physics in Europe or Building Science in North America has so far played a passive role, explaining observed failures in construction practice. In its new role, it would integrate modeling and testing to provide the predictive capability that is much needed in the development of near-zero energy buildings. The authors attempt to create a compact package, applicable to different climates with small modifications of some hygrothermal properties of materials. This universal solution is based on a systems approach that is routine for building physics but stands in contrast to the separately conceived sub-systems that are typical of building design today. It is known that building structure, energy efficiency, indoor environmental quality, and moisture management all need to be considered to ensure the durability of materials and to control the cost of near-zero energy buildings. These factors must be addressed through contributions of the whole design team, and the same approach must be used for the retrofit of buildings. As this integrated design paradigm resulted from the demands of the sustainable built environment approach, building physics must drop its passive role and improve two critical domains of analysis: (i) linked, real-time hygrothermal and energy models capable of predicting the performance of existing buildings after renovation and (ii) basic methods of indoor environment and moisture management when the exterior of the building cannot be modified.

  10. Smart glass as the method of improving the energy efficiency of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Gamayunova, Olga; Gumerova, Eliza; Miloradova, Nadezda

    2018-03-01

    A question that has to be answered for high-rise buildings is the choice of glazing and its service-life conditions. The contemporary market offers several types of window units, for instance wooden, aluminum, PVC, and combined models. Wooden and PVC windows have become the most widespread and compete with each other. More recently, design engineers have begun to choose smart glass. In this article, the advantages and drawbacks of all types of windows are reviewed, and recommendations are given on the choice of window type in order to improve the energy efficiency of buildings.

  11. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  12. Wave-propagation formulation of seismic response of multistory buildings

    USGS Publications Warehouse

    Safak, E.

    1999-01-01

    This paper presents a discrete-time wave-propagation method to calculate the seismic response of multistory buildings founded on layered soil media and subjected to vertically propagating shear waves. Buildings are modeled as an extension of the layered soil media by considering each story as another layer in the wave-propagation path. The seismic response is expressed in terms of wave travel times between the layers and wave reflection and transmission coefficients at layer interfaces. The method accounts for the filtering effects of the concentrated foundation and floor masses. Compared with the commonly used vibration formulation, the wave-propagation formulation provides several advantages, including simplicity, improved accuracy, better representation of damping, the ability to incorporate the soil layers under the foundation, and better tools for identification and damage detection from seismic records. Examples are presented to show the versatility and the superiority of the method.
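
    To make the layer formulation concrete: each layer contributes a travel time tau = h/Vs, and each interface contributes displacement reflection/transmission coefficients set by the shear impedance contrast Z = rho*Vs. A small sketch with illustrative (not the paper's) properties for a soil layer topped by two stories:

    ```python
    # (density kg/m^3, shear-wave velocity m/s, thickness m)
    layers = [
        (1800.0, 300.0, 30.0),   # soil layer
        (250.0, 120.0, 3.0),     # first story, treated as another layer
        (250.0, 120.0, 3.0),     # second story
    ]

    for (rho1, v1, h1), (rho2, v2, _) in zip(layers, layers[1:]):
        z1, z2 = rho1 * v1, rho2 * v2          # shear impedances
        refl = (z1 - z2) / (z1 + z2)           # reflection of an upgoing wave
        trans = 2.0 * z1 / (z1 + z2)           # transmission into the layer above
        print(f"tau = {h1 / v1:.3f} s,  R = {refl:+.3f},  T = {trans:.3f}")
    ```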

  13. Airside HVAC BESTEST. Adaptation of ASHRAE RP 865 Airside HVAC Equipment Modeling Test Cases for ASHRAE Standard 140. Volume 1, Cases AE101-AE445

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neymark, J.; Kennedy, M.; Judkoff, R.

    This report documents a set of diagnostic analytical verification cases for testing the ability of whole building simulation software to model the air distribution side of typical heating, ventilating and air conditioning (HVAC) equipment. These cases complement the unitary equipment cases included in American National Standards Institute (ANSI)/American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, which test the ability to model the heat-transfer fluid side of HVAC equipment.

  14. The acute social defeat stress and nest-building test paradigm: A potential new method to screen drugs for depressive-like symptoms.

    PubMed

    Otabi, Hikari; Goto, Tatsuhiko; Okayama, Tsuyoshi; Kohari, Daisuke; Toyoda, Atsushi

    2017-02-01

    Psychosocial stress can cause mental conditions such as depression in humans. To develop drug therapies for the treatment of depression, it is necessary to use animal models of depression to screen drug candidates that exhibit anti-depressive effects. Unfortunately, the present methods of drug screening for antidepressants, the forced-swim test and the tail-suspension test, are limiting factors in drug discovery because they are not based on the construct validity of objective phenotypes in depression. Previously, we discovered that the onset of nest building is severely delayed in mice exposed to subchronic mild social defeat stress (sCSDS). Therefore, a novel paradigm combining acute social defeat stress (ASDS) and the nest-building test (SNB) was established for the efficient screening of drugs for depressive-like symptoms. Since ASDS severely delayed the nest-building process, as shown in chronically socially defeated mice, we sought to rescue the delayed nest-building behavior in ASDS mice. Injection of a specific serotonin 2A receptor antagonist (SR-46349B) partially rescued the nest-building deficit exhibited by ASDS mice. On the other hand, a selective serotonin reuptake inhibitor (fluoxetine) did not rescue the nest-building deficit in ASDS mice. Therefore, we conclude that the SNB paradigm is another potential behavioral method for screening drugs for depressive-like symptoms, including attention deficit, anxiety, low locomotion, and decreased motivation. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Flood management: prediction of microbial contamination in large-scale floods in urban environments.

    PubMed

    Taylor, Jonathon; Lai, Ka Man; Davies, Mike; Clifton, David; Ridley, Ian; Biddulph, Phillip

    2011-07-01

    With a changing climate and increased urbanisation, the occurrence and the impact of flooding are expected to increase significantly. Floods can bring pathogens into homes and cause lingering damp and microbial growth in buildings, with the level of growth and persistence dependent on the volume and the chemical and biological content of the flood water, the properties of the contaminating microbes, and the surrounding environmental conditions, including the restoration time and methods, the heat and moisture transport properties of the envelope design, and the ability of the construction material to sustain microbial growth. The public health risk will depend on the interaction of these complex processes and the vulnerability and susceptibility of occupants in the affected areas. After the 2007 floods in the UK, the Pitt review noted that there is a lack of relevant scientific evidence and consistency with regard to the management and treatment of flooded homes, which not only put the local population at risk but also caused unnecessary delays in the restoration effort. Understanding the drying behaviour of flooded buildings in the UK building stock under different scenarios, and the ability of microbial contaminants to grow, persist, and produce toxins within these buildings, can help inform recovery efforts. To contribute to future flood management, this paper proposes the use of building simulations and biological models to predict the risk of microbial contamination in typical UK buildings. We review the state of the art with regard to biological contamination following flooding, relevant building simulation, simulation-linked microbial modelling, and current practical considerations in flood remediation. Using the city of London as an example, a methodology is proposed that uses GIS as a platform to integrate drying models and microbial risk models with the local building stock and flood models. The integrated tool will help local governments, health authorities, insurance companies and residents to better understand, prepare for and manage a large-scale flood in urban environments. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Dynamical Analysis in the Mathematical Modelling of Human Blood Glucose

    ERIC Educational Resources Information Center

    Bae, Saebyok; Kang, Byungmin

    2012-01-01

    We want to apply the geometrical method to a dynamical system of human blood glucose. Due to the educational importance of model building, we show a relatively general modelling process using observational facts. Next, two models of some concrete forms are analysed in the phase plane by means of linear stability, phase portrait and vector…

  17. Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM

    ERIC Educational Resources Information Center

    Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman

    2012-01-01

    This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…

  18. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
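
    The combinatorial problem that motivates rule-based specification is easy to make concrete: species counts grow exponentially with the number of independent modification sites, while the number of rules grows only linearly. A few lines of Python:

    ```python
    from itertools import product

    # A molecule with n independent binary sites (e.g., phosphorylated or not)
    # has 2**n distinct species, but only n "modify site i" rules.
    def count_species(n_sites):
        return sum(1 for _ in product((0, 1), repeat=n_sites))  # enumerates 2**n states

    for n in (4, 8, 16):
        print(f"{n:2d} sites -> {count_species(n):5d} species, {n} rules")
    ```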

  20. Tools for Evaluating Fault Detection and Diagnostic Methods for HVAC Secondary Systems

    NASA Astrophysics Data System (ADS)

    Pourarian, Shokouh

    Although modern buildings are using increasingly sophisticated energy management and control systems that have tremendous control and monitoring capabilities, building systems routinely fail to perform as designed. More advanced building control, operation, and automated fault detection and diagnosis (AFDD) technologies are needed to achieve the goal of net-zero energy commercial buildings. Much effort has been devoted to developing such technologies for primary heating, ventilating, and air conditioning (HVAC) systems and for some secondary systems. However, secondary systems such as fan coil units and dual duct systems, although widely used in commercial, industrial, and multifamily residential buildings, have received very little attention. This research study aims at developing tools that provide simulation capabilities for developing and evaluating advanced control, operation, and AFDD technologies for these less studied secondary systems. In this study, HVACSIM+ is selected as the simulation environment. Besides developing dynamic models for the above-mentioned secondary systems, two other issues related to the HVACSIM+ environment are also investigated. One issue is the nonlinear equation solver used in HVACSIM+ (Powell's Hybrid method in subroutine SNSQ): several previous research projects (ASHRAE RP 825 and 1312) found that SNSQ is especially unstable at the beginning of a simulation and is sometimes unable to converge to a solution. The other issue relates to the zone model in the HVACSIM+ library of components. Dynamic simulation of secondary HVAC systems unavoidably requires a zone model that interacts dynamically with the building's surroundings; the accuracy and reliability of the zone model therefore affect the operational data that the developed tool generates for predicting secondary-system behavior. The available model does not simulate the impact of direct solar radiation entering a zone through glazing, so the zone model study was conducted to address this limitation. In this research project, the following tasks are completed and summarized in this report: 1. Develop dynamic simulation models in the HVACSIM+ environment for common fan coil unit and dual duct system configurations; the developed models can produce both fault-free and faulty operational data under a wide variety of faults and severity levels for advanced control, operation, and AFDD technology development and evaluation. 2. Develop a model structure (grouping of blocks and superblocks, treatment of state variables, initial and boundary conditions, and selection of the equation solver) that can simulate a dual duct system efficiently with satisfactory stability. 3. Design and conduct a comprehensive and systematic validation procedure using collected experimental data to validate the developed simulation models under both fault-free and faulty operational conditions. 4. Conduct a numerical study comparing two solution techniques, Powell's Hybrid (PH) and Levenberg-Marquardt (LM), in terms of robustness and accuracy. 5. Modify the thermal state of the existing building zone model in the HVACSIM+ library of components; this component is revised to treat heat transmitted through glazing as a heat source for transient zone load prediction. In this report, the literature, including existing HVAC dynamic modeling environments and models, HVAC model validation methodologies, and fault modeling and validation methodologies, is reviewed. The overall methodologies used for fault-free and fault model development and validation are introduced. Detailed model development and validation results for the two secondary systems, i.e., the fan coil unit and the dual duct system, are summarized. Experimental data, mostly from the Iowa Energy Center Energy Resource Station, are used to validate the models developed in this project. Satisfactory model performance in both fault-free and fault simulation studies is observed for all studied systems.
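
    Both solution techniques compared in task 4 are exposed by SciPy's scipy.optimize.root (Powell's hybrid method as 'hybr', the same algorithm family as HVACSIM+'s SNSQ, and Levenberg-Marquardt as 'lm'), so the flavor of the comparison can be sketched on a toy residual system; this is an illustration, not the HVACSIM+ implementation:

    ```python
    import numpy as np
    from scipy.optimize import root

    def residuals(x):
        """Toy nonlinear system standing in for an HVAC component network."""
        return [x[0] ** 2 + x[1] ** 2 - 4.0,      # circle of radius 2
                np.exp(x[0]) + x[1] - 1.0]        # exponential constraint

    for method in ("hybr", "lm"):                 # Powell's hybrid vs. Levenberg-Marquardt
        sol = root(residuals, x0=[1.0, 1.0], method=method)
        print(f"{method}: success={sol.success}, x={np.round(sol.x, 4)}")
    ```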

  1. Classification of building infrastructure and automatic building footprint delineation using airborne laser swath mapping data

    NASA Astrophysics Data System (ADS)

    Caceres, Jhon

    Three-dimensional (3D) models of urban infrastructure comprise critical data for planners working on problems in wireless communications, environmental monitoring, civil engineering, and urban planning, among other tasks. Photogrammetric methods have been the most common approach to date for extracting building models. However, Airborne Laser Swath Mapping (ALSM) observations offer a competitive alternative because they overcome some of the ambiguities that arise when trying to extract 3D information from 2D images. Regardless of the source data, the building extraction process requires segmentation and classification of the data and building identification. In this work, approaches for classifying ALSM data, separating building and tree points, and delineating ALSM footprints from the classified data are described. Digital aerial photographs are used in some cases to verify results, but the objective of this work is to develop methods that can work on ALSM data alone. A robust approach for separating tree and building points in ALSM data is presented. The method is based on supervised learning of the classes (tree vs. building) in a high-dimensional feature space that yields good class separability. Features used for classification are based on the generation of local mappings, from three-dimensional space to two-dimensional space, known as "spin images" for each ALSM point to be classified. The method discriminates ALSM returns in compact spaces and even where the classes are very close together or overlapping spatially. A modified Hough Transform algorithm is used to orient the spin images, and the spin image parameters are specified such that the mutual information between the spin image pixel values and class labels is maximized. This new approach to ALSM classification allows us to fully exploit the 3D point information in the ALSM data while still achieving good class separability, which has been a difficult trade-off in the past. Supported by the spin image analysis for obtaining an initial classification, an automatic approach for delineating accurate building footprints is presented. The physical fact that laser pulses that happen to strike building edges can produce very different first- and last-return elevations has long been recognized. However, in older generation ALSM systems (<50 kHz pulse rates) such points were too few and far between to delineate building footprints precisely. Furthermore, without the robust separation of nearby trees and vegetation from the buildings, simply extracting ALSM shots where the elevation of the first return was much higher than the elevation of the last return was not a reliable means of identifying building footprints. However, with the advent of ALSM systems with pulse rates in excess of 100 kHz, and by using spin-image-based segmentation, it is now possible to extract building edges from the point cloud. A refined classification resulting from incorporating "on-edge" information is developed for obtaining quadrangular footprints. The footprint fitting process involves line generalization, least-squares-based clustering, and dominant-point finding for segmenting individual building edges. In addition, an algorithm for fitting complex footprints using the segmented edges and data inside footprints is also proposed.
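
    A hedged sketch of the spin-image construction at the heart of this classification (bin counts and support size are illustrative, and the Hough-based orientation step described above is omitted): each neighbor of an oriented point is mapped to a signed height along the normal and a radial distance from the normal axis, then accumulated into a 2-D histogram.

    ```python
    import numpy as np

    def spin_image(p, n, cloud, n_bins=8, support=2.0):
        """2-D histogram of (alpha, beta) neighbor coordinates around point p."""
        n = n / np.linalg.norm(n)
        d = cloud - p
        beta = d @ n                                       # height along the normal
        alpha = np.sqrt(np.maximum((d * d).sum(axis=1) - beta ** 2, 0.0))  # radial dist.
        keep = (alpha < support) & (np.abs(beta) < support)
        img, _, _ = np.histogram2d(alpha[keep], beta[keep], bins=n_bins,
                                   range=[[0.0, support], [-support, support]])
        return img

    cloud = np.random.default_rng(0).normal(size=(500, 3))  # toy point cloud
    img = spin_image(np.zeros(3), np.array([0.0, 0.0, 1.0]), cloud)
    print(img.shape)   # (8, 8) feature vector source, usable in a classifier
    ```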

  2. Working Toward the Very Low Energy Consumption Building of the Future |

    Science.gov Websites

    ... systems engineering methods that have transformed other industries, including the aircraft and automobile industries ... Merced and United Technologies are studying the use of sensors and occupancy-estimating methods, occupancy dynamics models, and energy control methods. The team will test whether this technology can ...

  3. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    ... focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force ... probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that ...

  4. Accounting for Co-Teaching: A Guide for Policymakers and Developers of Value-Added Models

    ERIC Educational Resources Information Center

    Isenberg, Eric; Walsh, Elias

    2015-01-01

    We outline the options available to policymakers for addressing co-teaching in a value-added model. Building on earlier work, we propose an improvement to a method of accounting for co-teaching that treats co-teachers as teams, with each teacher receiving equal credit for co-taught students. Hock and Isenberg (2012) described a method known as the…

  5. Geoinformation techniques for the 3D visualisation of historic buildings and representation of a building's pathology

    NASA Astrophysics Data System (ADS)

    Tsilimantou, Elisavet; Delegou, Ekaterini; Ioannidis, Charalabos; Moropoulou, Antonia

    2016-08-01

    In this paper, the documentation of a historic building registered as a Cultural Heritage asset is presented. The aim of the survey is to create a 3D geometric representation of the historic building and, in combination with a multidisciplinary study, to extract useful information regarding the extent of degradation, the construction's durability, etc. For the implementation of the survey, a combination of different acquisition technologies is used. The project focuses on the study of Villa Klonaridi in Athens, Greece. For the complete documentation of the building, conventional topography, photogrammetric, and laser scanning techniques are combined. Close-range photogrammetric techniques are used for the acquisition of the façades and architectural details. One of the main objectives is the development of an accurate 3D model in which a photorealistic representation of the building is achieved, along with its decay pathology, historical phases, and architectural components. In order to achieve a suitable graphical representation for the study of the material and decay patterns beyond 2D representation, 3D modelling and additional information modelling are performed for comparative analysis. The study provides various conclusions regarding the scale of deterioration obtained from the 2D and 3D analyses, respectively. Considering the variation in material and decay patterns, comparative results are obtained regarding the degradation of the building. Overall, the paper describes a process performed on a historic building in which the 3D digital acquisition of the monument's structure is realized through the combination of close-range surveying and laser scanning methods.

  6. Time-varying metamaterials based on graphene-wrapped microwires: Modeling and potential applications

    NASA Astrophysics Data System (ADS)

    Salary, Mohammad Mahdi; Jafar-Zanjani, Samad; Mosallaei, Hossein

    2018-03-01

    The successful realization of metamaterials and metasurfaces requires the judicious choice of constituent elements. In this paper, we demonstrate the implementation of time-varying metamaterials in the terahertz frequency regime by utilizing graphene-wrapped microwires as building blocks and modulation of graphene conductivity through exterior electrical gating. These elements enable enhancement of light-graphene interaction by utilizing optical resonances associated with Mie scattering, yielding a large tunability and modulation depth. We develop a semianalytical framework based on transition-matrix formulation for modeling and analysis of periodic and aperiodic arrays of such time-varying building blocks. The proposed method is validated against full-wave numerical results obtained using the finite-difference time-domain method. It provides an ideal tool for mathematical synthesis and analysis of space-time gradient metamaterials, eliminating the need for computationally expensive numerical models. Moreover, it allows for a wider exploration of exotic space-time scattering phenomena in time-modulated metamaterials. We apply the method to explore the role of modulation parameters in the generation of frequency harmonics and their emerging wavefronts. Several potential applications of such platforms are demonstrated, including frequency conversion, holographic generation of frequency harmonics, and spatiotemporal manipulation of light. The presented results provide key physical insights to design time-modulated functional metadevices using various building blocks and open up new directions in the emerging paradigm of time-modulated metamaterials.
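
    The frequency-harmonic generation discussed above can be illustrated with a scalar toy model, far simpler than the transition-matrix analysis: multiplying a carrier by a periodically modulated response creates sidebands at the carrier frequency plus and minus multiples of the modulation frequency.

    ```python
    import numpy as np

    fs, f0, fm, depth = 2000.0, 200.0, 10.0, 0.5      # sample, carrier, modulation (Hz)
    t = np.arange(0, 1.0, 1.0 / fs)
    carrier = np.cos(2 * np.pi * f0 * t)
    response = 1.0 + depth * np.cos(2 * np.pi * fm * t)   # time-modulated "conductivity"
    spectrum = np.abs(np.fft.rfft(carrier * response))
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    print(freqs[spectrum > 0.1 * spectrum.max()])      # -> [190. 200. 210.] Hz sidebands
    ```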

  7. Quantifying Earthquake Collapse Risk of Tall Steel Braced Frame Buildings Using Rupture-to-Rafters Simulations

    NASA Astrophysics Data System (ADS)

    Mourhatch, Ramses

    This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an artificial correction-free approach to broadband ground motion simulation. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis. As a first step, kinematic source inversions of past earthquakes in the magnitude range of 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake, a 30-year occurrence probability is calculated, and a rational method is presented to redistribute the forecast earthquake probabilities from UCERF to the simulated scenario earthquakes; the inner workings of the method are illustrated through an example involving earthquakes on the San Andreas fault in southern California. Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on top of long-period (> 2.0 s) seismograms computed from the kinematic source models using the spectral element method. Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced frame building, designed for three soil types using the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. The results are combined with the 30-year occurrence probabilities of the San Andreas scenario earthquakes using the PEER performance-based earthquake engineering framework to determine the probability of exceedance of these limit states over the next 30 years.
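
    The superposition step can be sketched with complementary low-pass/high-pass filters at the 0.5 Hz (2 s) crossover; the random inputs below merely stand in for the spectral-element and empirical Green's function synthetics:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfilt

    fs, fc = 20.0, 0.5                       # sample rate and crossover frequency (Hz)
    rng = np.random.default_rng(0)
    long_period = rng.normal(size=2000)      # stand-in for spectral-element synthetics
    short_period = rng.normal(size=2000)     # stand-in for empirical Green's functions

    lo = butter(4, fc, btype="lowpass", fs=fs, output="sos")
    hi = butter(4, fc, btype="highpass", fs=fs, output="sos")
    broadband = sosfilt(lo, long_period) + sosfilt(hi, short_period)
    print(broadband[:5])                     # hybrid broadband trace
    ```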

  8. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
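
    A hedged sketch of the LASSO-type step on synthetic stand-in data, using scikit-learn's L1-penalized logistic regression with cross-validation (this is not the authors' code, dataset, or tuning):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 20))                   # 20 candidate predictors
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)

    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    auc = cross_val_score(lasso, X, y, cv=5, scoring="roc_auc")
    lasso.fit(X, y)
    print(f"CV AUC = {auc.mean():.2f}, nonzero coefficients = {(lasso.coef_ != 0).sum()}")
    ```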

  9. Time series modeling and forecasting using memetic algorithms for regime-switching models.

    PubMed

    Bergmeir, Christoph; Triguero, Isaac; Molina, Daniel; Aznarte, José Luis; Benitez, José Manuel

    2012-11-01

    In this brief, we present a novel model fitting procedure for the neuro-coefficient smooth transition autoregressive model (NCSTAR), as presented by Medeiros and Veiga. The model is endowed with a statistically founded iterative building procedure and can be interpreted in terms of fuzzy rule-based systems. The interpretability of the generated models and a mathematically sound building procedure are two very important properties of forecasting models. The model fitting procedure employed by the original NCSTAR is a combination of initial parameter estimation by a grid search procedure with a traditional local search algorithm. We propose a different fitting procedure, using a memetic algorithm, in order to obtain more accurate models. An empirical evaluation of the method is performed, applying it to various real-world time series originating from three forecasting competitions. The results indicate that we can significantly enhance the accuracy of the models, making them competitive to models commonly used in the field.

  10. An Integrated Fuselage-Sting Balance for a Sonic-Boom Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.

    2004-01-01

    Measured and predicted pressure signatures from a lifting wind-tunnel model can be compared when the lift on the model is accurately known. The model's lift can be set by bending the support sting to a desired angle of attack. This method is simple in practice, but difficult to accurately apply. A second method is to build a normal force/pitching moment balance into the aft end of the sting, and use an angle-of-attack mechanism to set model attitude. In this report, a method for designing a sting/balance into the aft fuselage/sting of a sonic-boom model is described. A computer code is given, and a sample sting design is outlined to demonstrate the method.

  11. Time on Your Hands: Modeling Time

    ERIC Educational Resources Information Center

    Finson, Kevin; Beaver, John

    2007-01-01

    Building physical models relative to a concept can be an important activity to help students develop and manipulate abstract ideas and mental models that often prove difficult to grasp. One such concept is "time". A method for helping students understand the cyclical nature of time involves the construction of a Time Zone Calculator through a…

  12. Building Pre-Service Teaching Efficacy: A Comparison of Instructional Models

    ERIC Educational Resources Information Center

    Cohen, Rona; Zach, Sima

    2013-01-01

    Background: Cooperative Learning (CL) is an inclusive name for various models of teaching/learning methods, all of which emphasize the fundamental of meaningful collaboration among learners during their learning activities. Purpose: The purpose of this study was to examine whether the CL teaching model contributed to the teaching efficacy and…

  13. FRF-based structural damage detection of controlled buildings with podium structures: Experimental investigation

    NASA Astrophysics Data System (ADS)

    Xu, Y. L.; Huang, Q.; Zhan, S.; Su, Z. Q.; Liu, H. J.

    2014-06-01

    How to use control devices to enhance system identification and damage detection in relation to a structure that requires both vibration control and structural health monitoring is an interesting yet practical topic. In this study, the possibility of using the added stiffness provided by control devices and frequency response functions (FRFs) to detect damage in a building complex was explored experimentally. Scale models of a 12-storey main building and a 3-storey podium structure were built to represent a building complex. Given that the connection between the main building and the podium structure is most susceptible to damage, damage to the building complex was experimentally simulated by changing the connection stiffness. To simulate the added stiffness provided by a semi-active friction damper, a steel circular ring was designed and used to add the related stiffness to the building complex. By varying the connection stiffness using an eccentric wheel excitation system and by adding or not adding the circular ring, eight cases were investigated and eight sets of FRFs were measured. The experimental results were used to detect damage (changes in connection stiffness) using a recently proposed FRF-based damage detection method. The experimental results showed that the FRF-based damage detection method could satisfactorily locate and quantify damage.
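
    Measuring an FRF from excitation/response records is commonly done with the H1 estimator H(f) = Pxy(f)/Pxx(f). A minimal sketch on a toy single-resonance system (the 12-storey experimental data and the damage detection algorithm itself are not reproduced):

    ```python
    import numpy as np
    from scipy.signal import csd, welch

    fs = 200.0
    x = np.random.default_rng(0).normal(size=12000)   # broadband excitation
    # Crude single-degree-of-freedom-like response with a resonance near 5 Hz:
    y = np.zeros_like(x)
    a1 = 2 * 0.97 * np.cos(2 * np.pi * 5.0 / fs)
    a2 = -0.97 ** 2
    for k in range(2, x.size):
        y[k] = x[k] + a1 * y[k - 1] + a2 * y[k - 2]

    f, Pxy = csd(x, y, fs=fs, nperseg=1024)           # cross-spectral density
    _, Pxx = welch(x, fs=fs, nperseg=1024)            # input auto-spectrum
    H1 = Pxy / Pxx
    print(f"FRF peak near {f[np.abs(H1).argmax()]:.2f} Hz")   # ~5 Hz
    ```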

  14. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, BEMS performance can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of data available, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest-neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.

  15. Datamining approaches for modeling tumor control probability.

    PubMed

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) in radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher-order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, a mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs), with over-fitting controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction, with rs = 0.68 on leave-one-out testing compared to logistic regression (rs = 0.4), Poisson-based TCP (rs = 0.33), and the cell kill equivalent uniform dose model (rs = 0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
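
    The reported comparison can be sketched as leave-one-out predictions scored by Spearman's rank correlation, here on synthetic stand-ins for the two selected predictors rather than the institutional dataset:

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(56, 2))                       # stand-ins for GTV volume, V75
    y = (0.8 * X[:, 0] + 0.6 * X[:, 1] ** 2
         + rng.normal(scale=0.8, size=56) > 0.5).astype(int)

    for name, model in [("SVM (RBF kernel)", SVC(probability=True)),
                        ("logistic regression", LogisticRegression())]:
        prob = cross_val_predict(model, X, y, cv=LeaveOneOut(),
                                 method="predict_proba")[:, 1]
        print(f"{name}: rs = {spearmanr(prob, y)[0]:.2f}")
    ```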

  16. Case study of odor and indoor air quality assessment in the dewatering building at the Stickney Water Reclamation Plant.

    PubMed

    Sharma, Manju; O'Connell, Susan; Garelli, Brett; Sattayatewa, Chakkrid; Moschandreas, Demetrios; Pagilla, Krishna

    2012-01-01

    Indoor air quality (IAQ) and odors were determined using sampling/monitoring, measurement, and modeling methods in a large dewatering building at a very large water reclamation plant. The ultimate goal was to determine control strategies to reduce the sensory impacts on the workforce and achieve odor reduction within the building. Study approaches included: (1) investigation of air mixing using CO(2) as an indicator, (2) measurement of the airflow capacity of ventilation fans, (3) measurement of odors and odorants, (4) development of statistical and IAQ models, and (5) recommendation of control strategies. The results showed that air quality in the building complies with occupational safety and health guidelines; however, nuisance odors that can increase stress and reduce productivity still persist. Excess roof fan capacity induced odor dispersion to the upper levels. The lack of a local air exhaust system of sufficient capacity and optimum design was found to be the contributor to occasionally less than adequate indoor air quality and odors. The overall air ventilation rate in the building has comparatively little effect on the persistence of odors. Odor/odorant emission rates from centrifuge drops were approximately 100 times higher than those from the open conveyors. Based on measurements and modeling, the key control strategies recommended include increasing local air exhaust capacity and relocating exhaust hoods closer to the centrifuge drops.
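
    The CO2 air-mixing investigation in approach (1) rests on the standard tracer-gas decay relation: in a well-mixed zone the excess concentration decays exponentially, so the air-change rate is ACH = ln((C0 - Cout)/(Ct - Cout)) / t. A worked example with illustrative concentrations:

    ```python
    import math

    c0, ct, c_out = 1800.0, 900.0, 400.0   # ppm: initial, after t hours, outdoors
    t_hours = 1.5
    ach = math.log((c0 - c_out) / (ct - c_out)) / t_hours
    print(f"air-change rate ~ {ach:.2f} per hour")   # ~0.69 1/h
    ```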

  17. Computational Design of Self-Assembling Protein Nanomaterials with Atomic Level Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Neil P.; Sheffler, William; Sawaya, Michael R.

    2015-09-17

    We describe a general computational method for designing proteins that self-assemble to a desired symmetric architecture. Protein building blocks are docked together symmetrically to identify complementary packing arrangements, and low-energy protein-protein interfaces are then designed between the building blocks in order to drive self-assembly. We used trimeric protein building blocks to design a 24-subunit, 13-nm diameter complex with octahedral symmetry and a 12-subunit, 11-nm diameter complex with tetrahedral symmetry. The designed proteins assembled to the desired oligomeric states in solution, and the crystal structures of the complexes revealed that the resulting materials closely match the design models. The method can be used to design a wide variety of self-assembling protein nanomaterials.

  18. A Bayesian prediction model between a biomarker and the clinical endpoint for dichotomous variables.

    PubMed

    Jiang, Zhiwei; Song, Yang; Shou, Qiong; Xia, Jielai; Wang, William

    2014-12-20

    Early biomarkers are helpful for predicting clinical endpoints and for evaluating efficacy in clinical trials, even if the biomarker cannot replace the clinical outcome as a surrogate. The building and the evaluation of an association model between biomarkers and clinical outcomes are two equally important concerns in the prediction of clinical outcome. This paper addresses both issues in a Bayesian framework. A Bayesian meta-analytic approach is proposed to build a prediction model between the biomarker and the clinical endpoint for dichotomous variables. Compared with other Bayesian methods, the proposed model requires only trial-level summary data of historical trials for model building. Using extensive simulations, we evaluate the link function and the application conditions of the proposed Bayesian model under scenarios of (i) equal positive predictive value (PPV) and negative predictive value (NPV) and (ii) higher NPV and lower PPV. In the simulations, patient-level data are generated to evaluate the meta-analytic model, and PPV and NPV are employed to describe the patient-level relationship between the biomarker and the clinical outcome. The minimum number of historical trials to be included in building the model is also considered. The simulations show that the logit link function performs better than the odds and cloglog functions under both scenarios. PPV/NPV ≥ 0.5 for equal PPV and NPV, and PPV + NPV ≥ 1 for higher NPV and lower PPV, are proposed in order to predict the clinical outcome accurately and precisely when the proposed model is used. Twenty historical trials are required for model building when PPV and NPV are equal; for unequal PPV and NPV, the minimum number of historical trials for model building is proposed to be five. A hypothetical example shows an application of the proposed model in global drug development. The proposed Bayesian model is able to predict the clinical endpoint well from the observed biomarker data for dichotomous variables as long as the conditions are satisfied, and it could be applied in drug development; however, practical problems in applications have to be studied in further research.

  19. Roof Type Selection Based on Patch-Based Classification Using Deep Learning for High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Partovi, T.; Fraundorfer, F.; Azimi, S.; Marmanis, D.; Reinartz, P.

    2017-05-01

    3D building reconstruction from satellite remote sensing data is still an active research topic and is very valuable for 3D city modelling. The roof model is the most important component in reconstructing Level of Detail 2 (LoD2) buildings in 3D modelling. While the general solution for roof modelling relies on detailed cues (such as lines, corners, and planes) extracted from a Digital Surface Model (DSM), correct detection of the roof type and its modelling can fail due to the low quality of DSMs generated by dense stereo matching. To reduce the dependency of roof modelling on DSMs, pansharpened satellite images are additionally used as a rich source of information. In this paper, two strategies are employed for roof type classification. In the first, building roof types are classified with a state-of-the-art supervised pre-trained convolutional neural network (CNN) framework. In the second, deep features are extracted from deep layers of different pre-trained CNN models, and an SVM with an RBF kernel is then employed to classify the building roof type. Based on the roof complexity of the scene, a roof library including seven roof types is defined. A new semi-automatic method is proposed to generate training and test patches for each roof type in the library. Using the pre-trained CNN model not only decreases the computation time for training significantly but also increases the classification accuracy.

  20. Intelligent Detection of Structure from Remote Sensing Images Based on Deep Learning Method

    NASA Astrophysics Data System (ADS)

    Xin, L.

    2018-04-01

    Utilizing high-resolution remote sensing images for earth observation has become the common method of land use monitoring. Traditional image interpretation requires substantial human participation, is inefficient, and makes accuracy difficult to guarantee. At present, artificial intelligence methods such as deep learning have numerous advantages for image recognition. By means of a large number of remote sensing image samples and deep neural network models, we can rapidly extract objects of interest such as buildings. In terms of both efficiency and accuracy, deep learning methods are superior. This paper describes research on deep learning methods using a large number of remote sensing image samples and verifies the feasibility of building extraction through experiments.

  1. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  2. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  3. Identifying unproven cancer treatments on the health web: addressing accuracy, generalizability and scalability.

    PubMed

    Aphinyanaphongs, Yin; Fu, Lawrence D; Aliferis, Constantin F

    2013-01-01

    Building machine learning models that identify unproven cancer treatments on the Health Web is a promising approach for dealing with the dissemination of false and dangerous information to vulnerable health consumers. Aside from the obvious requirement of accuracy, two issues are of practical importance in deploying these models in real world applications. (a) Generalizability: The models must generalize to all treatments (not just the ones used in the training of the models). (b) Scalability: The models can be applied efficiently to billions of documents on the Health Web. First, we provide methods and related empirical data demonstrating strong accuracy and generalizability. Second, by combining the MapReduce distributed architecture and high dimensionality compression via Markov Boundary feature selection, we show how to scale the application of the models to WWW-scale corpora. The present work provides evidence that (a) a very small subset of unproven cancer treatments is sufficient to build a model to identify unproven treatments on the web; (b) unproven treatments use distinct language to market their claims and this language is learnable; (c) through distributed parallelization and state of the art feature selection, it is possible to prepare the corpora and build and apply models with large scalability.

  4. Building a Relationship between Elements of Product Form Features and Vocabulary Assessment Models

    ERIC Educational Resources Information Center

    Lo, Chi-Hung

    2016-01-01

    Based on the characteristic feature parameterization and the superiority evaluation method (SEM) in extension engineering, a product-shape design method was proposed in this study. The first step of this method is to decompose the basic feature components of a product. After that, the morphological chart method is used to segregate the ideas so as…

  5. The post-evaluation of green residential building in Ningxia

    NASA Astrophysics Data System (ADS)

    Wu, Yunna; Wang, Zhen

    2017-06-01

    Green residential buildings are attracting increasing attention. However, their development has been limited by single-standard requirements and a lack of multi-objective performance measures, and the existing evaluation criteria system for green residential buildings is not sufficiently comprehensive. First, resident questionnaire surveys were analyzed using SPSS software, which showed that experts and residents judge the green elements inconsistently; therefore, combined with expert interviews, owners' satisfaction was included in the post-evaluation criteria system for green residential buildings in the Ningxia area, covering five aspects: the preliminary work of construction, the construction process, economic benefits, social benefits, and owner satisfaction. Second, because many expert judgment matrices fail to meet the consistency requirement in post-evaluation, judgment matrix consistency was adjusted using a MATLAB program, and the weights of the criteria, sub-criteria, and experts were determined using a group AHP method. Finally, the grey clustering method was used to establish the post-evaluation model, which was applied to the real case of the Sai-shang project. The results obtained with the improved criteria system and method agree closely with the actual outcome.
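
    For context, the consistency check at the heart of AHP reduces to the consistency ratio CR = CI/RI with CI = (lambda_max - n)/(n - 1). Below is a minimal numpy sketch of that check, assuming the standard random-index table; the example matrix is illustrative, not data from the study.

      import numpy as np

      # Random consistency indices for matrices of size n = 1..9 (standard AHP table).
      RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
            6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

      def consistency_ratio(A):
          n = A.shape[0]
          lam_max = max(np.linalg.eigvals(A).real)  # principal eigenvalue
          CI = (lam_max - n) / (n - 1)              # consistency index
          return CI / RI[n]                         # CR < 0.1 is conventionally acceptable

      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      print(consistency_ratio(A))  # matrices failing the test would be adjusted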

  6. Tsunami Simulators in Physical Modelling - Concept to Practical Solutions

    NASA Astrophysics Data System (ADS)

    Chandler, Ian; Allsop, William; Robinson, David; Rossetto, Tiziana; McGovern, David; Todd, David

    2017-04-01

    Whilst many researchers have conducted simple 'tsunami impact' studies, few engineering tools are available to assess the onshore impacts of tsunami, with no agreed methods available to predict loadings on coastal defences, buildings or related infrastructure. Most previous impact studies have relied upon unrealistic waveforms (solitary or dam-break waves and bores) rather than full-duration tsunami waves, or have used simplified models of nearshore and over-land flows. Over the last 10+ years, pneumatic Tsunami Simulators for the hydraulic laboratory have been developed into an exciting and versatile technology, allowing the forces of real-world tsunami to be reproduced and measured in a laboratory environment for the first time. These devices have been used to model generic elevated and N-wave tsunamis up to and over simple shorelines, and at example coastal defences and infrastructure. They have also reproduced full-duration tsunamis including Mercator 2004 and Tohoku 2011, both at 1:50 scale. Engineering scale models of these tsunamis have measured wave run-up on simple slopes, forces on idealised sea defences, pressures / forces on buildings, and scour at idealised buildings. This presentation will describe how these Tsunami Simulators work, demonstrate how they have generated tsunami waves longer than the facilities within which they operate, and will present research results from three generations of Tsunami Simulators. Highlights of direct importance to natural hazard modellers and coastal engineers include measurements of wave run-up levels, forces on single and multiple buildings and comparison with previous theoretical predictions. Multiple buildings have two malign effects. The density of buildings to flow area (blockage ratio) increases water depths and flow velocities in the 'streets'. But increased building densities also concentrate the potential losses per unit area (both personal and monetary). The most recent study with the Tsunami Simulators therefore focussed on the influence of multiple buildings (up to 4 rows) which showed (for instance) that the greatest forces can act on the landward (not seaward) rows of buildings. Studies in the 70m long, 4m wide main channel of the Fast Flow Facility on tsunami defence structures have also measured forces on buildings in the lee of a failed defence wall and tsunami induced scour. Supporting presentations at this conference: McGovern et al on tsunami induced scour at coastal structures and Foster et al on building loads.

  7. Use of NARCCAP Model Projections to Develop a Future Typical Meteorological Year and Estimate the Impact of a Changing Climate on Building Energy Consumption

    NASA Astrophysics Data System (ADS)

    Patton, S. L.; Takle, E. S.; Passe, U.; Kalvelage, K.

    2013-12-01

    Current simulations of building energy consumption use weather input files based on the past thirty years of climate observations. These 20th century climate conditions may be inadequate when designing buildings meant to function well into the 21st century. An alternative is using model projections of climate change to estimate future risk to the built environment. In this study, model-projected changes in climate were combined with existing typical meteorological year data to create future typical meteorological year data. These data were then formatted for use in EnergyPlus simulation software to evaluate their potential impact on commercial building energy consumption. The modeled climate data were taken from the North American Regional Climate Change Assessment Program (NARCCAP). NARCCAP uses results of global climate models to drive regional climate models, also known as dynamical downscaling. This downscaling gives higher resolution results over specific locations, and the multiple global/regional climate model combinations provide a unique opportunity to quantify the uncertainty of climate change projections and their impacts. Our results show a projected decrease in heating energy consumption and a projected increase in cooling energy consumption for nine locations across the United States for all model combinations. Warmer locations may expect a decrease in heating load of around 30% to 45% and an increase in cooling load of around 25% to 35%. Colder locations may expect a decrease in heating load of around 15% to 25% and an increase in cooling load of around 40% to 70%. The change in net energy consumption is determined by the balance between the magnitudes of heating change and cooling change. Net energy consumption is projected to increase by an average of 5% for lower-latitude locations and decrease by an average of 5% for higher-latitude locations. These projected annual and seasonal changes present strong evidence that current building practices will not hold up under future climate change; we recommend using our methods and results to modify and adapt existing buildings and to aid in the design of future buildings.
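
    The combination step can be pictured with the common 'delta' (morphing) idea: shift each hour of the existing typical meteorological year by the model-projected monthly mean change. The numpy sketch below assumes that approach and uses made-up arrays as stand-ins; it is not the paper's exact procedure or data.

      import numpy as np

      # Hourly TMY dry-bulb temperatures (deg C) and a crude month index; both
      # arrays are illustrative stand-ins for real TMY files.
      tmy_temp = np.random.normal(12, 8, size=8760)
      month_of_hour = np.repeat(np.arange(12), 730)

      # Model-projected change in monthly mean temperature (deg C), assumed values.
      projected_delta = np.array([2.1, 2.0, 1.8, 1.5, 1.4, 1.6,
                                  1.9, 2.0, 1.7, 1.6, 1.9, 2.2])

      future_temp = tmy_temp + projected_delta[month_of_hour]
      print(future_temp.mean() - tmy_temp.mean())  # approximate annual warming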

  8. Finite difference and Runge-Kutta methods for solving vibration problems

    NASA Astrophysics Data System (ADS)

    Lintang Renganis Radityani, Scolastika; Mungkasi, Sudi

    2017-11-01

    The vibration of a multi-storey building can be modelled as a system of second order ordinary differential equations. If the number of floors of a building is large, the result is a large scale system of second order ordinary differential equations. Such large scale systems are difficult to solve, and even when they can be solved, the solution may not be accurate. Therefore, in this paper, we seek accurate methods for solving vibration problems. We compare the performance of numerical finite difference and Runge-Kutta methods for solving large scale systems of second order ordinary differential equations. The finite difference methods include the forward and central differences. The Runge-Kutta methods include the Euler and Heun methods. Our research results show that the central finite difference and the Heun methods produce more accurate solutions than the forward finite difference and the Euler methods do.
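
    To make the comparison concrete, here is a minimal sketch contrasting Euler and Heun steps on a single undamped oscillator x'' = -(k/m)x written as a first-order system; the parameters and step counts are illustrative, not the paper's test problems.

      import numpy as np

      k, m, dt, steps = 1.0, 1.0, 0.01, 1000

      def f(y):
          x, v = y
          return np.array([v, -(k / m) * x])   # y = (displacement, velocity)

      def euler_step(y):
          return y + dt * f(y)

      def heun_step(y):
          pred = y + dt * f(y)                     # Euler predictor
          return y + 0.5 * dt * (f(y) + f(pred))  # trapezoidal corrector

      y_e = np.array([1.0, 0.0])
      y_h = np.array([1.0, 0.0])
      for _ in range(steps):
          y_e, y_h = euler_step(y_e), heun_step(y_h)

      exact = np.cos(np.sqrt(k / m) * dt * steps)       # analytical displacement
      print(abs(y_e[0] - exact), abs(y_h[0] - exact))   # Heun error is far smaller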

  9. From OSS CAD to BIM for Cultural Heritage Digital Representation

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Stylianidis, E.

    2017-02-01

    The paper illustrates the use of open source computer-aided design (CAD) environments to develop Building Information Modelling (BIM) tools able to manage 3D models in the field of cultural heritage. Nowadays, the development of Free and Open Source Software (FOSS) is growing rapidly and its use is becoming consolidated. Although BIM technology is widely known and used, there is a lack of integrated open source platforms able to support all stages of Historic Building Information Modelling (HBIM) processes. The present research aims to use a FOSS CAD environment to develop BIM plug-ins able to import and edit digital representations of cultural heritage models derived by photogrammetric methods.

  10. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  11. Modeling arson - An exercise in qualitative model building

    NASA Technical Reports Server (NTRS)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (in the theory of games and economic behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider is the class of agents to which the model is applicable, and accordingly more confidence is put in the derived results. A methodological discussion on modeling human behavior is included.

  12. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  13. Basics of Bayesian methods.

    PubMed

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of the Bayesian inferential method is its logical foundation, which provides a coherent framework for utilizing not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution, which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge that gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
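
    The prior-times-likelihood mechanism can be shown in a few lines with a conjugate beta-binomial example; this is a generic illustration assuming scipy, not an analysis from the chapter.

      from scipy import stats

      # Beta(2, 2) prior encodes mild prior knowledge about a success probability.
      a, b = 2, 2
      successes, n = 7, 10             # current data, combined via the likelihood

      posterior = stats.beta(a + successes, b + n - successes)
      print(posterior.mean())          # posterior point estimate
      print(posterior.interval(0.95))  # 95% credible interval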

  14. A polynomial-chaos-expansion-based building block approach for stochastic analysis of photonic circuits

    NASA Astrophysics Data System (ADS)

    Waqas, Abi; Melati, Daniele; Manfredi, Paolo; Grassi, Flavia; Melloni, Andrea

    2018-02-01

    The Building Block (BB) approach has recently emerged in photonics as a suitable strategy for the analysis and design of complex circuits. Each BB can be foundry related and contains a mathematical macro-model of its functionality. As is well known, statistical variations in fabrication processes can have a strong effect on functionality and ultimately affect the yield. In order to predict the statistical behavior of the circuit, proper analysis of the effects of these uncertainties is crucial. This paper presents a method to build a novel class of Stochastic Process Design Kits for the analysis of photonic circuits. The proposed design kits directly store the information on the stochastic behavior of each building block in the form of a generalized-polynomial-chaos-based augmented macro-model obtained by properly exploiting stochastic collocation and Galerkin methods. Using this approach, we demonstrate that the augmented macro-models of the BBs can be calculated once, stored in a BB (foundry dependent) library, and then used for the analysis of any desired circuit. The main advantage of this approach, shown here for the first time in photonics, is that the stochastic moments of an arbitrary photonic circuit can be evaluated by a single simulation only, without the need for repeated simulations. The accuracy and the significant speed-up with respect to classical Monte Carlo analysis are verified by means of a classical photonic circuit example with multiple uncertain variables.
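
    As a toy version of the underlying idea, the sketch below builds a one-dimensional polynomial chaos expansion of a function of a standard normal parameter by stochastic collocation (Gauss-Hermite quadrature), then reads the mean and variance directly off the coefficients. It assumes only numpy; the function and expansion order are illustrative, and real photonic macro-models involve many correlated parameters.

      import math
      import numpy as np
      from numpy.polynomial import hermite_e as He

      # Coefficients c_k of f(xi) = sum_k c_k He_k(xi), xi ~ N(0, 1), obtained by
      # projecting f onto probabilists' Hermite polynomials at quadrature nodes.
      def pce_coeffs(f, order, nquad=20):
          x, w = He.hermegauss(nquad)
          w = w / np.sqrt(2 * np.pi)   # normalise weights to the N(0,1) measure
          return np.array([
              np.sum(w * f(x) * He.hermeval(x, [0] * k + [1])) / math.factorial(k)
              for k in range(order + 1)
          ])

      c = pce_coeffs(lambda s: np.exp(0.3 * s), order=4)
      mean = c[0]                                  # E[f] is the 0th coefficient
      var = sum(math.factorial(k) * c[k] ** 2 for k in range(1, len(c)))
      print(mean, var)  # moments from deterministic solves, no Monte Carlo loop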

  15. Fundamental mass transfer modeling of emission of volatile organic compounds from building materials

    NASA Astrophysics Data System (ADS)

    Bodalal, Awad Saad

    In this study, a mass transfer theory based model is presented for characterizing VOC emissions from building materials. A 3-D diffusion model is developed to describe the emissions of volatile organic compounds (VOCs) from individual sources. The formulation is then extended to include emissions from composite sources (systems comprising an assemblage of individual sources). The key parameters for the model (the diffusion coefficient D of the VOC in the source material and the equilibrium partition coefficient ke) were determined independently, without the use of chamber emission data. This procedure largely eliminated the need for emission testing using environmental chambers, which is costly, time consuming, and may be subject to confounding sink effects. An experimental method is developed and implemented to measure the internal diffusion coefficient (D) and partition coefficient (ke) directly. The use of the method is illustrated for three types of VOCs, (i) aliphatic hydrocarbons, (ii) aromatic hydrocarbons and (iii) aldehydes, through typical dry building materials (carpet, plywood, particleboard, vinyl floor tile, gypsum board, sub-floor tile and OSB). Correlations for predicting D and ke based solely on commonly available properties such as molecular weight and vapour pressure were then proposed for each product and type of VOC. These correlations can be used to estimate D and ke when direct measurement data are not available, and thus facilitate the prediction of VOC emissions from building materials using mass transfer theory. The VOC emissions from a sub-floor material (made of recycled automobile tires) and a particleboard are measured and predicted. Finally, a mathematical model to predict the diffusion coefficient through complex sources (floor adhesive) as a function of time was developed. This model was then used to predict the emission rate from a material system (namely, substrate//glue//vinyl tile).

  16. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning and the creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, are labour intensive, need well-trained personnel, and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance of the system, based on visual inspection and on comparing the measured coordinates to reference data, respectively. In the course of this process, a mean area coverage of 98% was achieved for the Dar Es Salaam test sites, which globally indicated that the extracted building polygons were close to the ground truth data. Furthermore, the proposed system reduced the time to extract a single building by 32%. Although the extracted building polygons are within the perimeter of the ground truth data, some of them appeared visually somewhat distorted. This implies that an interactive post-editing process is necessary for cartographic representation.

  17. Development of algorithms for building inventory compilation through remote sensing and statistical inferencing

    NASA Astrophysics Data System (ADS)

    Sarabandi, Pooya

    Building inventories are one of the core components of disaster vulnerability and loss estimation models, and as such play a key role in providing decision support for risk assessment, disaster management and emergency response efforts. In many parts of the world, comprehensive building inventories suitable for use in catastrophe models cannot be found. Furthermore, there are serious shortcomings in the existing building inventories, including incomplete or out-dated information on critical attributes as well as missing or erroneous attribute values. In this dissertation, a set of methodologies for updating spatial and geometric information of buildings from single and multiple high-resolution optical satellite images are presented. Basic concepts, terminologies and fundamentals of 3-D terrain modeling from satellite images are first introduced. Different sensor projection models are then presented and sources of optical noise such as lens distortions are discussed. An algorithm for extracting height and creating 3-D building models from a single high-resolution satellite image is formulated. The proposed algorithm is a semi-automated supervised method capable of extracting attributes such as longitude, latitude, height, square footage, perimeter, irregularity index, etc. The errors associated with the interactive nature of the algorithm are quantified and solutions for minimizing the human-induced errors are proposed. The height extraction algorithm is validated against independent survey data; the validation results show that an average height modeling accuracy of 1.5% can be achieved using this algorithm. Furthermore, the concept of cross-sensor data fusion for the purpose of 3-D scene reconstruction using quasi-stereo images is developed in this dissertation. The developed algorithm utilizes two or more single satellite images acquired from different sensors and provides the means to construct 3-D building models in a more economical way. A terrain-dependent search algorithm is formulated to facilitate the search for correspondences in a quasi-stereo pair of images. The heights calculated for sample buildings using the cross-sensor data fusion algorithm show an average coefficient of variation of 1.03%. In order to infer the structural type and occupancy type, i.e. engineering attributes, of buildings from the spatial and geometric attributes of 3-D models, a statistical data analysis framework is formulated. Applications of "Classification Trees" and "Multinomial Logistic Models" in modeling the marginal probabilities of class membership of engineering attributes are investigated. Adaptive statistical models that incorporate different spatial and geometric attributes of buildings while inferring the engineering attributes are developed in this dissertation. The inferred engineering attributes, in conjunction with the spatial and geometric attributes derived from the imagery, can be used to augment regional building inventories and therefore enhance the results of catastrophe models. In the last part of the dissertation, a set of empirically derived motion-damage relationships based on the correlation of observed building performance with measured ground-motion parameters from the 1994 Northridge and 1999 Chi-Chi, Taiwan earthquakes are developed. Fragility functions in the form of cumulative lognormal distributions and damage probability matrices for several classes of buildings (wood, steel and concrete), as well as a number of ground-motion intensity measures, are developed and compared to currently used motion-damage relationships.
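
    Lognormal fragility functions of this kind have the closed form P(DS >= ds | IM = im) = Phi(ln(im/theta)/beta). A minimal sketch, assuming scipy and illustrative theta (median capacity) and beta (lognormal dispersion) values rather than the dissertation's fitted parameters:

      from math import log
      from scipy.stats import norm

      # P(damage state reached | ground-motion intensity im), cumulative lognormal.
      def fragility(im, theta=0.4, beta=0.6):   # theta and beta are illustrative
          return norm.cdf(log(im / theta) / beta)

      for im in (0.1, 0.4, 1.0):                # e.g., spectral acceleration in g
          print(im, fragility(im))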

  18. LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.

    PubMed

    Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S

    2016-09-22

    Recent advances in multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are public and accessible to other researchers. However, most current research effort still focuses on building new concepts from scratch; very little work explores appropriate methods for constructing new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both new target training examples and existing primitive concept models to infer more complex concept models. LEGO-MM treats the primitive concept models as Lego toys with which to construct a potentially unlimited vocabulary of new concepts. Specifically, we first formulate logic operations as the Lego connectors that combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct the error propagation. Extensive experiments are conducted on a large vehicle-domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.

  19. Numerical Solutions for Nonlinear High Damping Rubber Bearing Isolators: Newmark's Method with Newton-Raphson Iteration Revisited

    NASA Astrophysics Data System (ADS)

    Markou, A. A.; Manolis, G. D.

    2018-03-01

    Numerical methods for the solution of dynamical problems in engineering go back to at least 1950. The most famous and widely used time stepping algorithm was developed by Newmark in 1959. In the present study, for the first time, the Newmark algorithm is developed for the case of the trilinear hysteretic model, a model that has been used to describe the shear behaviour of high damping rubber bearings. This model is calibrated against free-vibration field tests implemented on a hybrid base-isolated building, namely the Solarino project in Italy, as well as against laboratory experiments. A single-degree-of-freedom system is used to describe the behaviour of a low-rise building isolated with a hybrid system comprising high damping rubber bearings and low friction sliding bearings. The behaviour of the high damping rubber bearings is simulated by the trilinear hysteretic model, while the behaviour of the low friction sliding bearings is modeled by a linear Coulomb friction model. In order to prove the effectiveness of the numerical method, we compare the analytically solved trilinear hysteretic model calibrated from the free-vibration field tests (Solarino project) against the same model solved with the Newmark method with Newton-Raphson iteration. Almost perfect agreement is observed between the semi-analytical solution and the fully numerical solution with Newmark's time integration algorithm. This will allow for extension of the trilinear mechanical models to bidirectional horizontal motion, to time-varying vertical loads, to multi-degree-of-freedom systems, as well as to generalized models connected in parallel, where only numerical solutions are possible.
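
    For reference, one Newmark average-acceleration step for a linear SDOF oscillator m u'' + c u' + k u = p(t) looks as follows. This is a minimal sketch with made-up parameters; the paper's trilinear hysteretic spring would replace the linear k*u term and bring in the Newton-Raphson loop at each step.

      m, c, k = 1.0, 0.1, 40.0          # illustrative mass, damping, stiffness
      beta, gamma, dt = 0.25, 0.5, 0.01  # average-acceleration Newmark constants

      def newmark_step(u, v, a, p_next):
          # Effective stiffness and load for the linear case (one solve per step).
          keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
          peff = (p_next
                  + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                  + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                         + dt / 2 * (gamma / beta - 2) * a))
          u_n = peff / keff
          v_n = (gamma / (beta * dt) * (u_n - u) + (1 - gamma / beta) * v
                 + dt * (1 - gamma / (2 * beta)) * a)
          a_n = (u_n - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
          return u_n, v_n, a_n

      u, v = 0.1, 0.0                   # initial displacement, at rest
      a = (0.0 - c * v - k * u) / m     # initial acceleration from equilibrium
      for _ in range(500):              # free vibration, p = 0
          u, v, a = newmark_step(u, v, a, 0.0)
      print(u)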

  20. Hurricane Harvey Building Damage Assessment Using UAV Data

    NASA Astrophysics Data System (ADS)

    Yeom, J.; Jung, J.; Chang, A.; Choi, I.

    2017-12-01

    Hurricane Harvey, an extremely destructive major hurricane, struck southern Texas, U.S.A. on August 25, 2017, causing catastrophic flooding and storm damage. We visited Rockport, which suffered severe building destruction, and conducted UAV (Unmanned Aerial Vehicle) surveying for building damage assessment. UAVs provide very high resolution images compared with traditional remote sensing data. In addition, prompt and cost-effective damage assessment can be performed regardless of several limitations of other remote sensing platforms, such as the revisit interval of satellites, complicated flight planning in aerial surveying, and cloud cover. In this study, UAV flights and GPS surveying were conducted two weeks after the hurricane damage to generate an orthomosaic image and a DEM (Digital Elevation Model). A 3D region growing scheme is proposed to quantitatively estimate building damage, considering the elevation change and spectral difference of building debris. The results show that the proposed method can be used for high-definition building damage assessment in a time- and cost-effective way.

  1. Elaborating the Conceptual Space of Information-Seeking Phenomena

    ERIC Educational Resources Information Center

    Savolainen, Reijo

    2016-01-01

    Introduction: The article contributes to conceptual studies of information behaviour research by examining the conceptualisations of information seeking and related terms such as information search and browsing. Method: The study builds on Bates' integrated model of information seeking and searching, originally presented in 2002. The model was…

  2. From Point Cloud to BIM: A Modelling Challenge in the Cultural Heritage Field

    NASA Astrophysics Data System (ADS)

    Tommasi, C.; Achille, C.; Fassi, F.

    2016-06-01

    In Cultural Heritage modelling, it is nowadays no longer enough to build a mute model of a monument; the model has to contain plenty of information, especially when it refers to an existing construction. For this reason, the aim of this research is to bring an historical building into a BIM process, proposing a working method that can build a reality-based model and preserve the uniqueness of its elements. The question is: "Which approach is most useful in terms of survey data management, level of detail, information content, and time savings?" To test the potentialities and the limits of this process, we employed the most widely used software on the international market, taking as examples some composite elements made of regular and complex, but also modular, parts. Once a final model is obtained, it is necessary to test the interoperability between the software modules used, in order to give a general picture of the state of the art and to contribute to further studies on this subject.

  3. Converting HAZUS capacity curves to seismic hazard-compatible building fragility functions: effect of hysteretic models

    USGS Publications Warehouse

    Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem

    2008-01-01

    A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In the methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to the nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for the nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, here we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on resulting fragility functions through sensitivity analysis. The findings improve capacity curves and thereby fragility and/or vulnerability models for generic types of structures.

  4. Effect of concrete strength gradation to the compressive strength of graded concrete, a numerical approach

    NASA Astrophysics Data System (ADS)

    Pratama, M. Mirza Abdillah; Aylie, Han; Gan, Buntara Sthenly; Umniati, B. Sri; Risdanareni, Puput; Fauziyah, Shifa

    2017-09-01

    Concrete casting, the compacting method, and the characteristics of the concrete material determine the performance of concrete as a building element, owing to material uniformity issues. Previous studies show that gradation in strength exists in building members by nature and negatively influences the load carrying capacity of the member. A pilot study modeled concrete gradation in strength with controllable variables and observed, through uniaxial compressive loading tests, that the weakest material determines the strength of graded concrete. This research intends to confirm that finding by a numerical approach with extensive variables of strength disparity. The finite element analysis was conducted using the Strand7 nonlinear program. The results show that an increase of strength disparity in graded concrete models leads to a slight reduction of model strength. A substantial difference in displacement response is encountered in the models for small disparities of concrete strength. However, the higher strength concrete mix in the graded models contributes to a rise in material stiffness that benefits the serviceability of building members.

  5. Investigating the Performance of Alternate Regression Weights by Studying All Possible Criteria in Regression Models with a Fixed Set of Predictors

    ERIC Educational Resources Information Center

    Waller, Niels; Jones, Jeff

    2011-01-01

    We describe methods for assessing all possible criteria (i.e., dependent variables) and subsets of criteria for regression models with a fixed set of predictors, x (where x is an n x 1 vector of independent variables). Our methods build upon the geometry of regression coefficients (hereafter called regression weights) in n-dimensional space. For a…

  6. Homeland Security Collaboration: Catch Phrase or Preeminent Organizational Construct?

    DTIC Science & Technology

    2009-09-01

    collaborative effort? C. RESEARCH METHODOLOGY: This research project utilized a modified case study methodology. The traditional case study method ... discussing the research method, offering smart practices and culminating with findings and recommendations. Chapter II: Homeland Security Collaboration ... Centers for Regional Excellence, "Building Models." Chapter III: Research Methodology: Modified Case Study Method is

  7. Modeling Radioactive Decay Chains with Branching Fraction Uncertainties

    DTIC Science & Technology

    2013-03-01

    moments methods with transmutation matrices. Uncertainty from both half-lives and branching fractions is carried through these calculations by Monte ... moment methods, methods for sampling from normal distributions for half-life uncertainty, and the use of transmutation matrices were leveraged. This ... distributions for half-life and branching fraction uncertainties, building decay chains and generating the transmutation matrix (T-matrix

  8. Information Fusion - Methods and Aggregation Operators

    NASA Astrophysics Data System (ADS)

    Torra, Vicenç

    Information fusion techniques are commonly applied in Data Mining and Knowledge Discovery. In this chapter, we give an overview of such applications considering their three main uses; that is, we consider fusion methods for data preprocessing, model building and information extraction. Some aggregation operators (i.e. particular fusion methods) and their properties are briefly described as well.
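
    Two of the most common aggregation operators can be stated in a few lines: the weighted mean attaches weights to sources, while the OWA (ordered weighted averaging) operator attaches weights to ranks of the sorted inputs. A minimal numpy sketch with illustrative vectors:

      import numpy as np

      def weighted_mean(x, w):
          return np.dot(x, w) / np.sum(w)

      def owa(x, w):
          # Weights apply to the values sorted in decreasing order, not to sources.
          return np.dot(np.sort(x)[::-1], w)

      x = np.array([0.2, 0.9, 0.6])     # inputs to be fused
      w = np.array([0.5, 0.3, 0.2])     # weights; OWA assumes they sum to 1
      print(weighted_mean(x, w), owa(x, w))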

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.

  10. Automatic building of a web-like structure based on thermoplastic adhesive.

    PubMed

    Leach, Derek; Wang, Liyu; Reusser, Dorothea; Iida, Fumiya

    2014-09-01

    Animals build structures to extend their control over certain aspects of the environment; e.g., orb-weaver spiders build webs to capture prey. Inspired by this behaviour, we attempt to develop robotics technology that allows a robot to automatically build structures that help it accomplish certain tasks. In this paper we show the automatic building of a web-like structure with a robot arm based on thermoplastic adhesive (TPA) material. The material properties of TPA, such as elasticity, adhesiveness, and low melting temperature, make it possible for a robot to form threads across an open space by an extrusion-drawing process and then combine several of these threads into a web-like structure. The problems addressed here are discovering which parameters determine the thickness of a thread and determining how web-like structures may be used for certain tasks. We first present a model for the extrusion and drawing of TPA threads which also includes the temperature-dependent material properties. The model verification results show that the increasing relative surface area of the TPA thread as it is drawn thinner increases the heat loss of the thread, and that by controlling how quickly the thread is drawn, a range of diameters from 0.2 to 0.75 mm can be achieved. We then present a method based on a generalized nonlinear finite element truss model. The model was validated and could predict the deformation of various web-like structures when payloads are added. Finally, we demonstrate the automatic building of a web-like structure for payload bearing.

  11. Architectural Heritage Visualization Using Interactive Technologies

    NASA Astrophysics Data System (ADS)

    Albourae, A. T.; Armenakis, C.; Kyan, M.

    2017-08-01

    With the increased exposure to tourists, historical monuments are at an ever-growing risk of disappearing. Building Information Modelling (BIM) offers a process of digitally documenting of all the features that are made or incorporated into the building over its life-span, thus affords unique opportunities for information preservation. BIM of historical buildings are called Historical Building Information Models (HBIM). This involves documenting a building in detail throughout its history. Geomatics professionals have the potential to play a major role in this area as they are often the first professionals involved on construction development sites for many Architectural, Engineering, and Construction (AEC) projects. In this work, we discuss how to establish an architectural database of a heritage site, digitally reconstruct, preserve and then interact with it through an immersive environment that leverages BIM for exploring historic buildings. The reconstructed heritage site under investigation was constructed in the early 15th century. In our proposed approach, the site selection was based on many factors such as architectural value, size, and accessibility. The 3D model is extracted from the original collected and integrated data (Image-based, range-based, CAD modelling, and land survey methods), after which the elements of the 3D objects are identified by creating a database using the BIM software platform (Autodesk Revit). The use of modern and widely accessible game engine technology (Unity3D) is explored, allowing the user to fully embed and interact with the scene using handheld devices. The details of implementing an integrated pipeline between HBIM, GIS and augmented and virtual reality (AVR) tools and the findings of the work are presented.

  12. An innovative time-cost-quality tradeoff modeling of building construction project based on resource allocation.

    PubMed

    Hu, Wenfa; He, Xinhua

    2014-01-01

    Time, quality, and cost are three important but conflicting objectives in a building construction project. It is a tough challenge for project managers to optimize them, since they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model is based on the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is finally generated based on correlations between construction activities. A genetic algorithm is applied in the model to solve the comprehensive nonlinear time-cost-quality problems. The building of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves from the case study show traditional cost-time assumptions to be reasonable and demonstrate the sophistication of the trade-off model.

  13. Application of a single-objective, hybrid genetic algorithm approach to pharmacokinetic model building.

    PubMed

    Sherer, Eric A; Sale, Mark E; Pollock, Bruce G; Belani, Chandra P; Egorin, Merrill J; Ivy, Percy S; Lieberman, Jeffrey A; Manuck, Stephen B; Marder, Stephen R; Muldoon, Matthew F; Scher, Howard I; Solit, David B; Bies, Robert R

    2012-08-01

    A limitation in traditional stepwise population pharmacokinetic model building is the difficulty in handling interactions between model components. To address this issue, a method was previously introduced which couples NONMEM parameter estimation and model fitness evaluation to a single-objective, hybrid genetic algorithm for global optimization of the model structure. In this study, the generalizability of this approach for pharmacokinetic model building is evaluated by comparing (1) correct and spurious covariate relationships in a simulated dataset resulting from automated stepwise covariate modeling, Lasso methods, and single-objective hybrid genetic algorithm approaches to covariate identification and (2) information criteria values, model structures, convergence, and model parameter values resulting from manual stepwise versus single-objective, hybrid genetic algorithm approaches to model building for seven compounds. Both manual stepwise and single-objective, hybrid genetic algorithm approaches to model building were applied, blinded to the results of the other approach, for selection of the compartment structure as well as inclusion and model form of inter-individual and inter-occasion variability, residual error, and covariates from a common set of model options. For the simulated dataset, stepwise covariate modeling identified three of four true covariates and two spurious covariates; Lasso identified two of four true and 0 spurious covariates; and the single-objective, hybrid genetic algorithm identified three of four true covariates and one spurious covariate. For the clinical datasets, the Akaike information criterion was a median of 22.3 points lower (range of 470.5 point decrease to 0.1 point decrease) for the best single-objective hybrid genetic-algorithm candidate model versus the final manual stepwise model: the Akaike information criterion was lower by greater than 10 points for four compounds and differed by less than 10 points for three compounds. The root mean squared error and absolute mean prediction error of the best single-objective hybrid genetic algorithm candidates were a median of 0.2 points higher (range of 38.9 point decrease to 27.3 point increase) and 0.02 points lower (range of 0.98 point decrease to 0.74 point increase), respectively, than that of the final stepwise models. In addition, the best single-objective, hybrid genetic algorithm candidate models had successful convergence and covariance steps for each compound, used the same compartment structure as the manual stepwise approach for 6 of 7 (86 %) compounds, and identified 54 % (7 of 13) of covariates included by the manual stepwise approach and 16 covariate relationships not included by manual stepwise models. The model parameter values between the final manual stepwise and best single-objective, hybrid genetic algorithm models differed by a median of 26.7 % (q₁ = 4.9 % and q₃ = 57.1 %). Finally, the single-objective, hybrid genetic algorithm approach was able to identify models capable of estimating absorption rate parameters for four compounds that the manual stepwise approach did not identify. The single-objective, hybrid genetic algorithm represents a general pharmacokinetic model building methodology whose ability to rapidly search the feasible solution space leads to nearly equivalent or superior model fits to pharmacokinetic data.

  14. DNA Dynamics.

    ERIC Educational Resources Information Center

    Warren, Michael D.

    1997-01-01

    Explains a method to enable students to understand DNA and protein synthesis using model-building and role-playing. Acquaints students with the triplet code and transcription. Includes copies of the charts used in this technique. (DDR)

  15. Application of the Monte Carlo method for building up models for octanol-water partition coefficient of platinum complexes

    NASA Astrophysics Data System (ADS)

    Toropov, Andrey A.; Toropova, Alla P.

    2018-06-01

    A predictive model of logP for Pt(II) and Pt(IV) complexes, built with the Monte Carlo method using the CORAL software, has been validated with six different splits into training and validation sets. The predictive potential of the models for the six splits was improved using the so-called index of ideality of correlation. The suggested models make it possible to extract molecular features that cause logP to increase or, vice versa, decrease.

  16. Rapid model building of beta-sheets in electron-density maps.

    PubMed

    Terwilliger, Thomas C

    2010-03-01

    A method for rapidly building beta-sheets into electron-density maps is presented. beta-Strands are identified as tubes of high density adjacent to and nearly parallel to other tubes of density. The alignment and direction of each strand are identified from the pattern of high density corresponding to carbonyl and C(beta) atoms along the strand averaged over all repeats present in the strand. The beta-strands obtained are then assembled into a single atomic model of the beta-sheet regions. The method was tested on a set of 42 experimental electron-density maps at resolutions ranging from 1.5 to 3.8 A. The beta-sheet regions were nearly completely built in all but two cases, the exceptions being one structure at 2.5 A resolution in which a third of the residues in beta-sheets were built and a structure at 3.8 A in which under 10% were built. The overall average r.m.s.d. of main-chain atoms in the residues built using this method compared with refined models of the structures was 1.5 A.

  17. Improving the quality of learning in science through optimization of lesson study for learning community

    NASA Astrophysics Data System (ADS)

    Setyaningsih, S.

    2018-03-01

    Lesson Study for Learning Community is a system for building the lecturing profession through collaborative and continuous study of learning, based on the principles of openness, collegiality, and mutual learning, with the goal of forming a professional learning community. Achieving this requires a strategy and learning method with specific techniques. This paper describes how the quality of learning in science can be improved by implementing suitable strategies and methods, namely by applying Lesson Study for Learning Community optimally. Initially, this research focused on the study of instructional techniques. The learning methods used were the Contextual Teaching and Learning (CTL) model and the Problem Based Learning (PBL) model. The results showed a significant increase in competence, attitudes, and psychomotor skills in the four study programs where the approach was modelled. Therefore, it can be concluded that implementing learning strategies in Lesson Study for Learning Community is needed to improve the competence, attitudes, and psychomotor skills of science students.

  18. Software Tools For Building Decision-support Models For Flood Emergency Situations

    NASA Astrophysics Data System (ADS)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  19. Intelligent demand side management of residential building energy systems

    NASA Astrophysics Data System (ADS)

    Sinha, Maruti N.

    The advent of modern sensing technologies and data processing capabilities, together with the rising cost of energy, is driving the implementation of intelligent systems in buildings and houses, which constitute 41% of total energy consumption. The primary motivation has been to provide a framework for demand-side management and to improve overall reliability. The entire formulation is to be implemented on a NILM (Non-Intrusive Load Monitoring) system, a smart meter, which is going to play a vital role in the future of demand-side management. Utilities throughout the world have started deploying smart meters, which will essentially help to establish communication between utilities and consumers. This research focuses on investigating a suitable thermal model of a residential house, building up a control system, and developing diagnostic and energy usage forecasting tools. The present work follows a measurement-based approach. Identification of building thermal parameters is the very first step towards developing performance measurement and controls. The proposed identification technique is a PEM (Prediction Error Method) based discrete state-space model. Two different models have been devised. The first model is directed toward energy usage forecasting and diagnostics; here a novel idea is investigated that uses the integral of thermal capacity to identify the thermal model of the house. The purpose of the second identification is to build a model for the control strategy. The controller should be able to take into account weather forecast information, deal with operating-point constraints, and at the same time minimize energy consumption. To design an optimal controller, an MPC (Model Predictive Control) scheme, a receding-horizon approach, has been implemented in place of the present thermostatic/hysteretic control. The capability of the proposed schemes has also been investigated.

  20. Teacher Stress: Complex Model Building with LISREL. Pedagogical Reports, No. 16.

    ERIC Educational Resources Information Center

    Tellenback, Sten

    This paper presents a complex causal model of teacher stress based on data received from the responses of 1,466 teachers from Malmo, Sweden to a questionnaire. Also presented is a method for treating the model variables as higher-order factors or higher-order theoretical constructs. The paper's introduction presents a brief review of teacher…

  1. Data-driven sampling method for building 3D anatomical models from serial histology

    NASA Astrophysics Data System (ADS)

    Salunke, Snehal Ulhas; Ablove, Tova; Danforth, Theresa; Tomaszewski, John; Doyle, Scott

    2017-03-01

    In this work, we investigate the effect of slice sampling on 3D models of tissue architecture using serial histopathology. We present a method for using a single fully-sectioned tissue block as pilot data, whereby we build a fully-realized 3D model and then determine the optimal set of slices needed to reconstruct the salient features of the model objects under biological investigation. In our work, we are interested in the 3D reconstruction of microvessel architecture in the trigone region between the vagina and the bladder. This region serves as a potential avenue for drug delivery to treat bladder infection. We collect and co-register 23 serial sections of CD31-stained tissue images (6 μm thick sections), from which four microvessels are selected for analysis. To build each model, we perform semi-automatic segmentation of the microvessels. Subsampled meshes are then created by removing slices from the stack, interpolating the missing data, and reconstructing the mesh. We calculate the Hausdorff distance between the full and subsampled meshes to determine the optimal sampling rate for the modeled structures. In our application, we found that a sampling rate of 50% (corresponding to just 12 slices) was sufficient to recreate the structure of the microvessels without significant deviation from the fully-rendered mesh. This pipeline effectively minimizes the number of histopathology slides required for 3D model reconstruction, and can be utilized to either (1) reduce the overall costs of a project, or (2) enable additional analysis on the intermediate slides.
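
    The subsampling analysis can be illustrated as follows: the sketch rebuilds a synthetic tube-like structure from every other slice (the 50% rate, i.e., 12 of 23 slices) and measures the symmetric Hausdorff distance against the fully sampled version. The toy geometry and the midpoint interpolation are invented placeholders for the paper's segmented-mesh pipeline.

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    # Synthetic "microvessel": a slightly bending tube sampled on 23 slices.
    z = np.arange(23)                                   # 23 serial sections
    theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    full = np.array([[np.cos(t) + 0.1 * zi, np.sin(t), zi]
                     for zi in z for t in theta])       # full 3D model points

    kept = full[np.isin(full[:, 2], z[::2])]            # keep every other slice
    # Interpolate the skipped slices from their neighbours (midpoint scheme).
    interp = []
    for zi in z[1::2]:
        lo = kept[kept[:, 2] == zi - 1]
        hi = kept[kept[:, 2] == zi + 1]
        m = min(len(lo), len(hi))
        mid = (lo[:m] + hi[:m]) / 2.0
        mid[:, 2] = zi
        interp.append(mid)
    sub = np.vstack([kept] + interp)

    # Symmetric Hausdorff distance between full and subsampled models.
    d = max(directed_hausdorff(full, sub)[0], directed_hausdorff(sub, full)[0])
    print(f"Hausdorff distance at 50% sampling: {d:.3f}")
    ```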

  2. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on model development, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended in building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could undermine model development, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the probability estimates for predicting postoperative pancreatic fistula.

  3. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    This project applies computational process and material modeling to powder-bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase the reliability of builds, decrease the time to adoption of the process for critical hardware, and potentially decrease post-build heat treatments. The work plan is to conduct single-track and coupon builds at various build parameters; record build-parameter information and QM Meltpool data; refine the Applied Optimization powder-bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report the STK modeling results; and validate the modeling with an additional build. Initial observations: photodiode intensity measurements are highly linear with power input; melt-pool intensity is highly correlated with melt-pool size; and melt-pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  4. Mechanical modeling for magnetorheological elastomer isolators based on constitutive equations and electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping

    2018-06-01

    Because constitutive models are too complicated and existing mechanical models lack universality, neither is satisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. A constitutive model and electromagnetic analysis were applied in this method to ensure universality, while a series of derivations and simplifications were carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator was introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations and electromagnetic analysis. An iteration model and a highly efficient model based on a differential equation editor were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimization algorithm. In the end, verification tests of the isolator proved the accuracy of the derived mechanical model and the modeling method.

  5. Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2005-01-01

    In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…

  6. Model Building to Facilitate Understanding of Holliday Junction and Heteroduplex Formation, and Holliday Junction Resolution

    ERIC Educational Resources Information Center

    Selvarajah, Geeta; Selvarajah, Susila

    2016-01-01

    Students frequently expressed difficulty in understanding the molecular mechanisms involved in chromosomal recombination. Therefore, we explored alternative methods for presenting the two concepts of the double-strand break model: Holliday junction and heteroduplex formation, and Holliday junction resolution. In addition to a lecture and…

  7. Rank and Sparsity in Language Processing

    ERIC Educational Resources Information Center

    Hutchinson, Brian

    2013-01-01

    Language modeling is one of many problems in language processing that have to grapple with naturally high ambient dimensions. Even in large datasets, the number of unseen sequences is overwhelmingly larger than the number of observed ones, posing clear challenges for estimation. Although existing methods for building smooth language models tend to…

  8. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations at a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost of surrogate construction and consequently improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity, in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
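
    A schematic of such an adaptive design loop is sketched below: candidates are scored by a hybrid of distance to existing samples and disagreement between the surrogate and its first-order Taylor expansion from the nearest sample. The test function, the equal weighting, the RBF surrogate, and the fixed iteration count (in place of TEAD's stopping criterion) are all assumptions for illustration.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def f(x):                                   # "expensive" model (placeholder)
        return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, (10, 2))             # initial design
    y = f(X)

    for it in range(20):
        surr = RBFInterpolator(X, y)            # cheap surrogate of f
        cand = rng.uniform(-1, 1, (500, 2))     # candidate pool
        d = np.linalg.norm(cand[:, None] - X[None], axis=2)
        nearest = d.argmin(axis=1)              # index of closest sample
        dmin = d.min(axis=1)                    # exploration term
        # Numerical gradient of the surrogate at the nearest samples.
        eps = 1e-4
        g = np.stack([(surr(X[nearest] + eps * e) - surr(X[nearest] - eps * e))
                      / (2 * eps)
                      for e in np.eye(2)], axis=1)
        # First-order Taylor prediction vs. surrogate: nonlinearity term.
        taylor = y[nearest] + np.einsum('ij,ij->i', g, cand - X[nearest])
        residual = np.abs(surr(cand) - taylor)
        score = dmin / dmin.max() + residual / (residual.max() + 1e-12)
        xnew = cand[score.argmax()]             # most informative candidate
        X = np.vstack([X, xnew])
        y = np.append(y, f(xnew[None]))
    print(f"final design size: {len(X)}")
    ```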

  9. Detection of collapsed buildings from lidar data due to the 2016 Kumamoto earthquake in Japan

    NASA Astrophysics Data System (ADS)

    Moya, Luis; Yamazaki, Fumio; Liu, Wen; Yamada, Masumi

    2018-01-01

    The 2016 Kumamoto earthquake sequence was triggered by an Mw 6.2 event at 21:26 on 14 April. Approximately 28 h later, at 01:25 on 16 April, an Mw 7.0 event (the mainshock) followed. The epicenters of both events were located near the residential area of Mashiki and affected the region nearby. Due to very strong seismic ground motion, the earthquake produced extensive damage to buildings and infrastructure. In this paper, collapsed buildings were detected using a pair of digital surface models (DSMs), taken before and after the 16 April mainshock by airborne light detection and ranging (lidar) flights. Different methods were evaluated to identify collapsed buildings from the DSMs. The change in average elevation within a building footprint was found to be the most important factor. Finally, the distribution of collapsed buildings in the study area was presented, and the result was consistent with that of a building damage survey performed after the earthquake.
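
    A minimal sketch of the elevation-change criterion, assuming co-registered pre- and post-event DSM rasters and a raster of building-footprint labels; the arrays and the 1 m drop threshold are illustrative, not the study's values.

    ```python
    import numpy as np

    def collapsed_buildings(dsm_pre, dsm_post, footprints, drop_threshold=1.0):
        """Return IDs of footprints with mean elevation drop > threshold [m]."""
        diff = dsm_pre - dsm_post                # positive where height was lost
        ids = np.unique(footprints)
        ids = ids[ids != 0]                      # 0 = background, not a building
        return [int(i) for i in ids
                if np.nanmean(diff[footprints == i]) > drop_threshold]

    # Toy example: one building drops 3 m (collapse), one is unchanged.
    pre = np.zeros((50, 50)); post = pre.copy()
    labels = np.zeros((50, 50), dtype=int)
    labels[5:15, 5:15] = 1;   pre[5:15, 5:15] = 6.0;   post[5:15, 5:15] = 3.0
    labels[30:40, 30:40] = 2; pre[30:40, 30:40] = 6.0; post[30:40, 30:40] = 6.0
    print(collapsed_buildings(pre, post, labels))   # -> [1]
    ```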

  10. Solving the problem of building models of crosslinked polymers: an example focussing on validation of the properties of crosslinked epoxy resins.

    PubMed

    Hall, Stephen A; Howlin, Brendan J; Hamerton, Ian; Baidak, Alex; Billaud, Claude; Ward, Steven

    2012-01-01

    The construction of molecular models of crosslinked polymers is an area of some difficulty and considerable interest. We report here a new method of constructing these models and validate the method by modelling three epoxy systems based on the epoxy monomers bisphenol F diglycidyl ether (BFDGE) and triglycidyl-p-amino phenol (TGAP) with the curing agent diamino diphenyl sulphone (DDS). The main emphasis of the work concerns the improvement of the techniques for the molecular simulation of these epoxies and specific attention is paid towards model construction techniques, including automated model building and prediction of glass transition temperatures (T(g)). Typical models comprise some 4200-4600 atoms (ca. 120-130 monomers). In a parallel empirical study, these systems have been cast, cured and analysed by dynamic mechanical thermal analysis (DMTA) to measure T(g). Results for the three epoxy systems yield good agreement with experimental T(g) ranges of 200-220°C, 270-285°C and 285-290°C with corresponding simulated ranges of 210-230°C, 250-300°C, and 250-300°C respectively.

  11. Solving the Problem of Building Models of Crosslinked Polymers: An Example Focussing on Validation of the Properties of Crosslinked Epoxy Resins

    PubMed Central

    Hall, Stephen A.; Howlin, Brendan J; Hamerton, Ian; Baidak, Alex; Billaud, Claude; Ward, Steven

    2012-01-01

    The construction of molecular models of crosslinked polymers is an area of some difficulty and considerable interest. We report here a new method of constructing these models and validate the method by modelling three epoxy systems based on the epoxy monomers bisphenol F diglycidyl ether (BFDGE) and triglycidyl-p-amino phenol (TGAP) with the curing agent diamino diphenyl sulphone (DDS). The main emphasis of the work concerns the improvement of the techniques for the molecular simulation of these epoxies and specific attention is paid towards model construction techniques, including automated model building and prediction of glass transition temperatures (Tg). Typical models comprise some 4200–4600 atoms (ca. 120–130 monomers). In a parallel empirical study, these systems have been cast, cured and analysed by dynamic mechanical thermal analysis (DMTA) to measure Tg. Results for the three epoxy systems yield good agreement with experimental Tg ranges of 200–220°C, 270–285°C and 285–290°C with corresponding simulated ranges of 210–230°C, 250–300°C, and 250–300°C respectively. PMID:22916182

  12. CONFOLD2: improved contact-driven ab initio protein structure modeling.

    PubMed

    Adhikari, Badri; Cheng, Jianlin

    2018-01-25

    Contact-guided protein structure prediction methods are becoming more and more successful because of the latest advances in residue-residue contact prediction. To support contact-driven structure prediction, effective tools that can quickly build tertiary structural models of good quality from predicted contacts need to be developed. We develop an improved contact-driven protein modelling method, CONFOLD2, and study how it may be effectively used for ab initio protein structure prediction with predicted contacts as input. It builds models using various subsets of the input contacts to explore the fold space under the guidance of a soft square energy function, and then clusters the models to obtain the top five models. CONFOLD2 obtains an average reconstruction accuracy of 0.57 TM-score for the 150 proteins in the PSICOV contact prediction dataset. When benchmarked on the CASP11 contacts predicted using CONSIP2 and the CASP12 contacts predicted using Raptor-X, CONFOLD2 achieves a mean TM-score of 0.41 on both datasets. CONFOLD2 makes it possible to quickly generate the top five structural models for a protein sequence when its predicted secondary structures and contacts are at hand. The source code of CONFOLD2 is publicly available at https://github.com/multicom-toolbox/CONFOLD2/.

  13. Systems biology by the rules: hybrid intelligent systems for pathway modeling and discovery.

    PubMed

    Bosl, William J

    2007-02-15

    Expert knowledge in journal articles is an important source of data for reconstructing biological pathways and creating new hypotheses. An important need for medical research is to integrate this data with high-throughput sources to build useful models that span several scales. Researchers traditionally use mental models of pathways to integrate information and develop new hypotheses. Unfortunately, the amount of information is often overwhelming, and mental models are inadequate for predicting the dynamic response of complex pathways. Hierarchical computational models that allow exploration of semi-quantitative dynamics are useful systems biology tools for theoreticians, experimentalists and clinicians and may provide a means for cross-communication. A novel approach for biological pathway modeling based on hybrid intelligent systems or soft computing technologies is presented here. Hybrid intelligent systems, which refer to several related computing methods such as fuzzy logic, neural nets, genetic algorithms, and statistical analysis, have become ubiquitous in engineering applications for complex control system modeling and design. Biological pathways may be considered to be complex control systems, which medicine tries to manipulate to achieve desired results. Thus, hybrid intelligent systems may provide a useful tool for modeling biological system dynamics and computational exploration of new drug targets. A new modeling approach based on these methods is presented in the context of hedgehog regulation of the cell cycle in granule cells. Code and input files can be found at the Bionet website: www.chip.ord/~wbosl/Software/Bionet. This paper presents the algorithmic methods needed for modeling complicated biochemical dynamics using rule-based models to represent expert knowledge in the context of cell cycle regulation and tumor growth. A notable feature of this modeling approach is that it allows biologists to build complex models from their knowledge base without the need to translate that knowledge into mathematical form. Dynamics on several levels, from molecular pathways to tissue growth, are seamlessly integrated. A number of common network motifs are examined and used to build a model of hedgehog regulation of the cell cycle in cerebellar neurons, which is believed to play a key role in the etiology of medulloblastoma, a devastating childhood brain cancer.

  14. Reconstructing Buildings with Discontinuities and Roof Overhangs from Oblique Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Frommholz, D.; Linkiewicz, M.; Meissner, H.; Dahlke, D.

    2017-05-01

    This paper proposes a two-stage method for the reconstruction of city buildings with discontinuities and roof overhangs from oriented nadir and oblique aerial images. To model the structures, the input data is transformed into a dense point cloud, segmented and filtered with a modified marching cubes algorithm to reduce the positional noise. Assuming a monolithic building, the remaining vertices are initially projected onto a 2D grid and passed to RANSAC-based regression and topology analysis to geometrically determine finite wall, ground and roof planes. If this fails due to the presence of discontinuities, the regression is repeated on a 3D level by traversing voxels within the regularly subdivided bounding box of the building point set. For each cube, a planar piece of the current surface is approximated and expanded. The resulting segments are mutually intersected, yielding both topological and geometrical nodes and edges. These entities are eliminated if their distance-based affiliation to the defining point sets is violated, leaving a consistent building hull including its structural breaks. To add the roof overhangs, the computed polygonal meshes are projected onto the digital surface model derived from the point cloud. Their shapes are offset equally along the edge normals with subpixel accuracy by detecting the zero-crossings of the second-order directional derivative in the gradient direction of the height bitmap, and translated back into world space to become a component of the building. Once the reconstructed objects are finished, the aerial images are further used to generate a compact texture atlas for visualization purposes. An optimized atlas bitmap is generated that allows perspective-correct multi-source texture mapping without prior rectification, using a partially parallel placement algorithm. Moreover, the texture atlases undergo object-based image analysis (OBIA) to detect window areas, which are reintegrated into the building models. To evaluate the performance of the proposed method, a proof-of-concept test on sample structures obtained from real-world data of Heligoland/Germany has been conducted. It revealed good reconstruction accuracy in comparison to the cadastral map, a speed-up in texture atlas optimization, and visually attractive render results.
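
    The RANSAC-based plane regression at the core of the first stage can be sketched as follows; the iteration count, inlier tolerance, and toy point cloud are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    def ransac_plane(points, n_iter=200, tol=0.05, rng=None):
        """Fit a plane (unit normal n, offset d with n.x = d) to 3D points."""
        rng = rng or np.random.default_rng()
        best_inliers, best = None, (None, None)
        for _ in range(n_iter):
            p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
            n = np.cross(p1 - p0, p2 - p0)
            norm = np.linalg.norm(n)
            if norm < 1e-9:                      # degenerate (collinear) sample
                continue
            n = n / norm
            d = n @ p0
            dist = np.abs(points @ n - d)        # point-to-plane distances
            inliers = dist < tol
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers, best = inliers, (n, d)
        return best, best_inliers

    # Toy cloud: a noisy horizontal roof plane z = 10 plus random outliers.
    rng = np.random.default_rng(2)
    roof = np.column_stack([rng.uniform(0, 5, (200, 2)),
                            10 + rng.normal(0, 0.02, 200)])
    noise = rng.uniform(0, 12, (40, 3))
    (normal, d), inl = ransac_plane(np.vstack([roof, noise]), rng=rng)
    print(normal.round(2), round(d, 2), inl.sum())
    ```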

  15. Automated crystallographic ligand building using the medial axis transform of an electron-density isosurface.

    PubMed

    Aishima, Jun; Russel, Daniel S; Guibas, Leonidas J; Adams, Paul D; Brunger, Axel T

    2005-10-01

    Automatic fitting methods that build molecules into electron-density maps usually fail below 3.5 Å resolution. As a first step towards addressing this problem, an algorithm has been developed using an approximation of the medial axis to simplify an electron-density isosurface. This approximation captures the central axis of the isosurface with a graph, which is then matched against a graph of the molecular model. One of the first applications of the medial axis to X-ray crystallography is presented here. When applied to ligand fitting, the method performs at least as well as methods based on selecting peaks in electron-density maps. Generalization of the method to recognition of common features across multiple contour levels could lead to powerful automatic fitting methods that perform well even at low resolution.

  16. Interdiffusion of Polycarbonate in Fused Deposition Modeling Welds

    NASA Astrophysics Data System (ADS)

    Seppala, Jonathan; Forster, Aaron; Satija, Sushil; Jones, Ronald; Migler, Kalman

    2015-03-01

    Fused deposition modeling (FDM), a now common and inexpensive additive manufacturing method, produces 3D objects by extruding molten polymer layer-by-layer. Compared to traditional polymer processing methods (injection, vacuum, and blow molding), FDM parts have inferior mechanical properties, surface finish, and dimensional stability. From a polymer processing point of view, the polymer-polymer weld between each layer limits the mechanical strength of the final part. Unlike traditional processing methods, where the polymer is uniformly melted and entangled, FDM welds are typically weaker due to the short time available for polymer interdiffusion and entanglement. To emulate the FDM process, thin-film bilayers of polycarbonate/d-polycarbonate were annealed using scaled times and temperatures accessible in FDM. Shift factors from time-temperature superposition, measured by small-amplitude oscillatory shear, were used to calculate reasonable annealing times (minutes) at temperatures below the actual extrusion temperature. The extent of interdiffusion was then measured using neutron reflectivity. Analogous specimens were prepared to characterize the mechanical properties. FDM build parameters were then related to interdiffusion between welded layers and to mechanical properties. Understanding the relationship between build parameters, interdiffusion, and mechanical strength will allow FDM users to print stronger parts in an intelligent manner rather than relying on trial-and-error and build-parameter lock-in.
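
    The "scaled times and temperatures" idea can be illustrated with a time-temperature superposition calculation; the WLF form with its textbook constants and the temperatures below are generic placeholders, not the shift factors measured in this work.

    ```python
    import numpy as np

    # Time-temperature superposition sketch: annealing for time t at
    # temperature T is equivalent to time t/aT at the reference
    # temperature, with aT from the WLF equation. C1, C2 and the
    # temperatures are generic placeholders, not measured values.

    def wlf_shift(T, Tref, C1=17.44, C2=51.6):
        """log10(aT) per the WLF equation (valid roughly Tg..Tg+100 K)."""
        return -C1 * (T - Tref) / (C2 + (T - Tref))

    Tref = 150.0                      # reference temperature [C] (placeholder)
    for T in (170.0, 190.0, 210.0):
        aT = 10 ** wlf_shift(T, Tref)
        print(f"T={T:.0f} C: aT={aT:.2e}, 1 min here ~ {1/aT:.2e} min at Tref")
    ```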

  17. Semantic Segmentation of Indoor Point Clouds Using Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Babacan, K.; Chen, L.; Sohn, G.

    2017-11-01

    As Building Information Modelling (BIM) thrives, geometry alone is no longer sufficient; an ever-increasing variety of semantic information is needed to express an indoor model adequately. On the other hand, for existing buildings, automatically generating semantically enriched BIM from point cloud data is in its infancy. Previous research to enhance semantic content relies on frameworks in which specific rules and/or features are hand-coded by specialists. These methods inherently lack generalization and easily break in different circumstances. On this account, a generalized framework is urgently needed to automatically and accurately generate semantic information. We therefore propose to employ deep learning techniques for the semantic segmentation of point clouds into meaningful parts. More specifically, we build a volumetric data representation in order to efficiently generate the high number of training samples needed to train a convolutional neural network architecture. Feedforward propagation is used to perform classification at the voxel level, achieving semantic segmentation. The method is tested on both a mobile laser scanner point cloud and larger-scale synthetically generated data. We also demonstrate a case study in which our method can be effectively used to leverage the extraction of planar surfaces in challenging, cluttered indoor environments.
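
    The volumetric data representation can be sketched as a simple occupancy-grid rasterization of the point cloud; the grid size, normalization, and random input are assumptions for illustration, not the paper's configuration.

    ```python
    import numpy as np

    def voxelize(points, grid=32):
        """Return a (grid, grid, grid) float32 occupancy volume."""
        mins = points.min(axis=0)
        span = np.ptp(points, axis=0).max() + 1e-9   # cubic cells, keep aspect
        idx = ((points - mins) / span * (grid - 1)).astype(int)
        vol = np.zeros((grid, grid, grid), dtype=np.float32)
        vol[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0   # mark occupied cells
        return vol

    pts = np.random.default_rng(3).uniform(0, 4, (10000, 3))   # placeholder scan
    v = voxelize(pts)
    print(v.shape, int(v.sum()), "occupied voxels")
    ```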

  18. Quantifying Impacts of Urban Growth Potential on Army Training Capabilities

    DTIC Science & Technology

    2017-09-12

    Building on previous studies of urban growth and population effects on U.S. military installations and ... combat team studies, CAA has developed an iterative process that builds on Military Value Analysis (MVA) models that include a set of attributes that ... Methods and tools were developed to support a nationwide analysis. This study focused on installations operating training areas that were high

  19. Objected-oriented remote sensing image classification method based on geographic ontology model

    NASA Astrophysics Data System (ADS)

    Chu, Z.; Liu, Z. J.; Gu, H. Y.

    2016-11-01

    Nowadays, with the development of high-resolution remote sensing imagery and the wide application of laser point cloud data, object-oriented remote sensing classification based on the characteristic knowledge of multi-source spatial data has become an important trend in the field of remote sensing image classification, gradually replacing the traditional approach of improving algorithms to optimize classification results. To this end, the paper puts forward a remote sensing image classification method that uses the characteristic knowledge of multi-source spatial data to build a geographic ontology semantic network model, and carries out an object-oriented classification experiment on urban features. The experiment uses the Protégé software, developed by Stanford University in the United States, and the intelligent image analysis software eCognition as the experimental platform, with hyperspectral imagery and Lidar data acquired by flight over DaFeng City, JiangSu, as the main data sources. First, the hyperspectral imagery is used to obtain feature knowledge of the remote sensing image and related spectral indices; second, the Lidar data are used to generate an nDSM (Normalized Digital Surface Model), providing elevation information; finally, the image feature knowledge, spectral indices and elevation information are combined to build the geographic ontology semantic network model that performs urban feature classification. The experimental results show that this method achieves significantly higher classification accuracy than traditional classification algorithms, and performs especially well for building classification. The method not only exploits the advantages of multi-source spatial data, for example remote sensing imagery and Lidar data, but also realizes the integration of multi-source spatial data knowledge and its application to remote sensing image classification, providing an effective way forward for object-oriented remote sensing image classification.
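
    The nDSM step can be sketched as follows: the Lidar-derived DSM is normalized by a terrain model so that pixel values become heights above ground, yielding an elevation cue for the ontology model; the function name, toy rasters, and the 2.5 m height cue are invented for illustration.

    ```python
    import numpy as np

    def ndsm(dsm, dtm):
        """Normalized DSM: per-pixel height above ground, clipped at 0."""
        return np.clip(dsm - dtm, 0.0, None)

    dsm = np.array([[31.0, 30.2], [35.5, 30.1]])   # surface elevations [m]
    dtm = np.array([[30.0, 30.0], [30.0, 30.0]])   # terrain elevations [m]
    h = ndsm(dsm, dtm)
    building_cue = h > 2.5                          # elevated-object evidence
    print(h, building_cue, sep="\n")
    ```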

  20. Building and testing models with extended Higgs sectors

    NASA Astrophysics Data System (ADS)

    Ivanov, Igor P.

    2017-07-01

    Models with non-minimal Higgs sectors represent a mainstream direction in the theoretical exploration of physics opportunities beyond the Standard Model. Extended scalar sectors help alleviate difficulties of the Standard Model and lead to a rich spectrum of characteristic collider signatures and astroparticle consequences. In this review, we introduce the reader to the world of extended Higgs sectors. Without pretending to exhaustively cover the entire body of literature, we walk through a selection of the most popular examples: the two- and multi-Higgs-doublet models, as well as singlet and triplet extensions. We show how one typically builds models with extended Higgs sectors, describe the main goals and the challenges that arise along the way, and mention some methods to overcome them. We also describe how such models can be tested, which key observables one focuses on, and illustrate the general strategy with a subjective selection of results.

  1. Demonstration of reduced-order urban scale building energy models

    DOE PAGES

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...

    2017-09-08

    The aim of this study is to demonstrate a developed framework to rapidly create urban-scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies by only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified data exchange. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
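
    The flavor of a reduced-order building energy model can be conveyed with a single-zone lumped RC simulation; the resistance, capacitance, time step, and gain profile below are invented placeholders and are not produced by the OpenStudio-based framework described above.

    ```python
    import numpy as np

    # Illustrative reduced-order (single-zone lumped RC) model: one thermal
    # capacitance C and one envelope resistance R driven by outdoor
    # temperature and internal gains, integrated with explicit Euler.

    R = 0.005          # envelope resistance [K/W] (placeholder)
    C = 4.0e7          # zone thermal capacitance [J/K] (placeholder)
    dt = 600.0         # time step [s]
    hours = np.arange(0, 24, dt / 3600.0)
    T_out = 5 + 8 * np.sin((hours - 9) / 24 * 2 * np.pi)          # outdoor [C]
    gains = np.where((hours > 8) & (hours < 18), 2000.0, 200.0)   # internal [W]

    T = np.empty_like(hours)
    T[0] = 20.0
    for k in range(len(hours) - 1):
        # C * dT/dt = (T_out - T)/R + gains
        T[k + 1] = T[k] + dt / C * ((T_out[k] - T[k]) / R + gains[k])
    print(f"zone temperature range: {T.min():.1f} .. {T.max():.1f} C")
    ```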

  2. Demonstration of reduced-order urban scale building energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew

    The aim of this study is to demonstrate a developed framework to rapidly create urban-scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies by only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified data exchange. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.

  3. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for the storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on the storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  4. Oblique Photogrammetry Supporting 3d Urban Reconstruction of Complex Scenarios

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Ramos, M. M.; Nocerino, E.; Menna, F.; Remondino, F.; Moe, K.; Poli, D.; Legat, K.; Fassi, F.

    2017-05-01

    Accurate 3D city models represent an important source of geospatial information to support various "smart city" applications, such as space management, energy assessment, 3D cartography, noise and pollution mapping as well as disaster management. Even though remarkable progress has been made in recent years, there are still many open issues, especially when it comes to the 3D modelling of complex urban scenarios like historical and densely-built city centres featuring narrow streets and non-conventional building shapes. Most approaches introduce strong building priors/constraints on symmetry and roof typology that penalize urban environments having high variations of roof shapes. Furthermore, although oblique photogrammetry is rapidly maturing, the use of slanted views for façade reconstruction is not completely included in the reconstruction pipeline of state-of-the-art software. This paper aims to investigate state-of-the-art methods for 3D building modelling in complex urban scenarios with the support of oblique airborne images. A reconstruction approach based on roof primitives fitting is tested. Oblique imagery is then exploited to support the manual editing of the generated building models. At the same time, mobile mapping data are collected at cm resolution and then integrated with the aerial ones. All approaches are tested on the historical city centre of Bergamo (Italy).

  5. Continuation Power Flow Analysis for PV Integration Studies at Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiyu; Zhu, Xiangqi; Lubkeman, David L.

    2017-10-30

    This paper presents a method for conducting continuation power flow simulation on high-solar-penetration distribution feeders. A load disaggregation method is developed to disaggregate the daily feeder load profiles collected at substations down to each load node, where the electricity consumption of residential houses and commercial buildings is modeled using actual data collected from single-family houses and commercial buildings. This allows the modeling of power flow and voltage profiles along a distribution feeder in a continuous fashion over a 24-hour period at minute-by-minute resolution. By separating the feeder into load zones based on the distance between the load node and the feeder head, we studied the impact of PV penetration on distribution grid operation in different seasons and under different weather conditions for different PV placements.
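
    The load-disaggregation step can be sketched as a proportional allocation of the substation profile across load nodes; the weights, profile shape, and node count are illustrative assumptions, not the paper's consumption models.

    ```python
    import numpy as np

    def disaggregate(feeder_profile, node_weights):
        """Allocate each time step of the feeder profile to load nodes."""
        w = np.asarray(node_weights, dtype=float)
        w = w / w.sum()                          # normalize to fractions
        return np.outer(feeder_profile, w)       # shape: (time, nodes)

    minutes = np.arange(1440)                                    # 24 h, 1-min steps
    feeder = 800 + 300 * np.sin((minutes - 840) / 1440 * 2 * np.pi)  # feeder kW
    weights = [120, 80, 250, 50]                 # e.g., per-node peak ratings
    node_loads = disaggregate(feeder, weights)
    # Per-node loads sum back to the feeder profile at every time step.
    print(node_loads.shape, node_loads.sum(axis=1)[0].round(1), feeder[0].round(1))
    ```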

  6. Use of an expert system data analysis manager for space shuttle main engine test evaluation

    NASA Technical Reports Server (NTRS)

    Abernethy, Ken

    1988-01-01

    The ability to articulate, collect, and automate the application of the expertise needed for the analysis of space shuttle main engine (SSME) test data would be of great benefit to NASA liquid rocket engine experts. This paper describes a project whose goal is to build a rule-based expert system which incorporates such expertise. Experiential expertise, collected directly from the experts currently involved in SSME data analysis, is used to build a rule base to identify engine anomalies similar to those analyzed previously. Additionally, an alternate method of expertise capture is being explored. This method would generate rules inductively based on calculations made using a theoretical model of the SSME's operation. The latter rules would be capable of diagnosing anomalies which may not have appeared before, but whose effects can be predicted by the theoretical model.

  7. Numerical Modelling of Connections Between Stones in Foundations of Historical Buildings

    NASA Astrophysics Data System (ADS)

    Przewlocki, Jaroslaw; Zielinska, Monika; Grebowski, Karol

    2017-12-01

    The aim of this paper is to analyse, based on numerical analysis, the behaviour of old building foundations composed of stones (the main load-bearing elements) and mortar. Some basic aspects of historical foundations are briefly discussed, with an emphasis on their development, techniques, and materials. The behaviour of a foundation subjected to the loads transmitted from the upper parts of the structure is described using the finite element method (FEM). The main problems in analysing the foundations of historical buildings are determining the characteristics of the materials and the degree of degradation of the mortar, which is the weakest part of the foundation. Mortar is modelled using the damaged-plastic model, in which exceeding the bearing capacity occurs due to the degradation of the material. The damaged-plastic model is the most accurate model for describing the behaviour and properties of mortar because it captures what happens to the material throughout its entire load history. For a uniformly loaded fragment of the foundation, both stresses and strains were analysed. The results of the analysis presented in this paper contribute to further research on the behaviour and modelling of historical buildings' foundations.

  8. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models requires new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) and multiple levels of detail (Mixed and Reverse LoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure, the courtyard of the West Block on Parliament Hill in Ottawa (Ontario), and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  9. Scalable Learning for Geostatistics and Speaker Recognition

    DTIC Science & Technology

    2011-01-01

    of prior knowledge of the model or due to improved robustness requirements). Both these methods have their own advantages and disadvantages. The use...application. If the data is well-correlated and low-dimensional, any prior knowledge available on the data can be used to build a parametric model. In the...absence of prior knowledge, non-parametric methods can be used. If the data is high-dimensional, PCA-based dimensionality reduction is often the first

  10. Template based protein structure modeling by global optimization in CASP11.

    PubMed

    Joo, Keehyoung; Joung, InSuk; Lee, Sun Young; Kim, Jong Yun; Cheng, Qianyi; Manavalan, Balachandran; Joung, Jong Young; Heo, Seungryong; Lee, Juyong; Nam, Mikyung; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2016-09-01

    For the template-based modeling (TBM) of CASP11 targets, we have developed three new protein modeling protocols (nns for server prediction and LEE and LEER for human prediction) by improving upon our previous CASP protocols (CASP7 through CASP10). We applied the powerful global optimization method of conformational space annealing to three stages of optimization: multiple sequence-structure alignment, three-dimensional (3D) chain building, and side-chain remodeling. For more successful fold recognition, a new alignment method called CRFalign was developed; it can incorporate sensitive positional and environmental dependence in alignment scores as well as strong nonlinear correlations among various features. Modifications and adjustments were made to the form of the energy function and the weight parameters pertaining to the chain-building procedure. For the side-chain remodeling step, residue-type dependence was introduced to the cutoff value that determines the entry of a rotamer into the side-chain modeling library. The improved performance of the nns server method is attributed to successful fold recognition achieved by combining several methods including CRFalign, and to the current modeling formulation, which can incorporate native-like structural aspects present in multiple templates. The LEE protocol is identical to the nns one except that CASP11-released server models are used as templates. The success of LEE in utilizing CASP11 server models indicates that proper template screening and template clustering, assisted by appropriate cluster ranking, promise a new direction to enhance protein 3D modeling. Proteins 2016; 84(Suppl 1):221-232. © 2015 Wiley Periodicals, Inc.

  11. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery.

    PubMed

    Siddiqui, Fasahat Ullah; Teng, Shyh Wei; Awrangjeb, Mohammad; Lu, Guojun

    2016-07-19

    Existing automatic building extraction methods are not effective in extracting buildings that are small in size or have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation; the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane, whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes, with transparent or opaque roofs, can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting, and all parameters are set automatically from the data. The other post-processing stages, including variance, point density and shadow elimination, are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets using object- and pixel-based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs. When compared with current state-of-the-art building extraction methods, the proposed method outperforms the existing methods in various evaluation metrics.
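
    The roof-versus-tree gradient cue can be sketched as follows: on a height-derived intensity image, roof planes show near-constant gradients along the slope while canopies show erratic ones, so low local gradient variance is evidence for a roof. The window size, toy image, and variance measure are assumptions for illustration, not GBE's actual operators.

    ```python
    import numpy as np

    def gradient_variance(height_img, win=3):
        """Local variance of gradient magnitude (low => planar, roof-like)."""
        gy, gx = np.gradient(height_img)
        mag = np.hypot(gx, gy)
        pad = win // 2
        padded = np.pad(mag, pad, mode='edge')
        out = np.empty_like(mag)
        for i in range(mag.shape[0]):
            for j in range(mag.shape[1]):
                out[i, j] = padded[i:i + win, j:j + win].var()
        return out

    rng = np.random.default_rng(4)
    img = np.zeros((40, 40))
    img[5:20, 5:20] = np.linspace(3, 6, 15)[None, :]      # sloped roof plane
    img[25:38, 25:38] = 4 + rng.normal(0, 1.0, (13, 13))  # tree canopy
    v = gradient_variance(img)
    print("roof var ~", v[10:15, 10:15].mean().round(4),
          " tree var ~", v[28:34, 28:34].mean().round(4))
    ```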

  12. A Robust Gradient Based Method for Building Extraction from LiDAR and Photogrammetric Imagery

    PubMed Central

    Siddiqui, Fasahat Ullah; Teng, Shyh Wei; Awrangjeb, Mohammad; Lu, Guojun

    2016-01-01

    Existing automatic building extraction methods are not effective in extracting buildings that are small in size or have transparent roofs. The application of a large area threshold prohibits detection of small buildings, and the use of ground points in generating the building mask prevents detection of transparent buildings. In addition, existing methods use numerous parameters to extract buildings in complex environments, e.g., hilly areas and high vegetation; the empirical tuning of a large number of parameters reduces the robustness of building extraction methods. This paper proposes a novel Gradient-based Building Extraction (GBE) method to address these limitations. The proposed method transforms the Light Detection And Ranging (LiDAR) height information into an intensity image without interpolation of point heights and then analyses the gradient information in the image. Generally, building roof planes have a constant height change along the slope of a roof plane, whereas trees have a random height change. With such an analysis, buildings of a greater range of sizes, with transparent or opaque roofs, can be extracted. In addition, a local colour matching approach is introduced as a post-processing stage to eliminate trees. This stage of our proposed method does not require any manual setting, and all parameters are set automatically from the data. The other post-processing stages, including variance, point density and shadow elimination, are also applied to verify the extracted buildings, where comparatively fewer empirically set parameters are used. The performance of the proposed GBE method is evaluated on two benchmark data sets using object- and pixel-based metrics (completeness, correctness and quality). Our experimental results show the effectiveness of the proposed method in eliminating trees, extracting buildings of all sizes, and extracting buildings with and without transparent roofs. When compared with current state-of-the-art building extraction methods, the proposed method outperforms the existing methods in various evaluation metrics. PMID:27447631

  13. Object-Based Dense Matching Method for Maintaining Structure Characteristics of Linear Buildings

    PubMed Central

    Yan, Yiming; Qiu, Mingjie; Zhao, Chunhui; Wang, Liguo

    2018-01-01

    In this paper, we propose a novel object-based dense matching method designed specifically for high-precision disparity maps of building objects in urban areas, which can maintain accurate object structure characteristics. The proposed framework mainly includes three stages. First, an improved edge-line extraction method is proposed so that the edge segments fit closely to building outlines. Second, a fusion method is proposed for the outlines under the constraint of straight lines, which can maintain the building structural attribute of parallel or vertical edges; this is very useful for the dense matching method. Finally, we propose an edge constraint and outline compensation (ECAOC) dense matching method to maintain building object structural characteristics in the disparity map. In the proposed method, the improved edge lines are used to optimize the matching search scope and matching template window, and the high-precision building outlines are used to compensate for the shape features of building objects. Our method can greatly increase the matching accuracy of building objects in urban areas, especially at building edges. For the outline extraction experiments, our fusion method demonstrates its superiority and robustness on panchromatic images from different satellites and at different resolutions. For the dense matching experiments, our ECAOC method shows great advantages in matching accuracy for building objects in urban areas compared with three other methods. PMID:29596393

  14. Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Zhang, Jian; Chen, Yan

    This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017, NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA has continued to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of the proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with the New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the ongoing code development process. PNNL analyzed these measures using whole-building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method used to conduct the energy savings analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and the stretch code in all climate zones.

  15. Building analytical three-field cosmological models

    NASA Astrophysics Data System (ADS)

    Santos, J. R. L.; Moraes, P. H. R. S.; Ferreira, D. A.; Neta, D. C. Vilar

    2018-02-01

    A difficult task to deal with is the analytical treatment of models composed of three real scalar fields, as their equations of motion are in general coupled and hard to integrate. In order to overcome this problem, we introduce a methodology to construct three-field models based on the so-called "extension method". The fundamental idea of the procedure is to combine three one-field systems in a non-trivial way to construct an effective three-scalar-field model. An interesting scenario where the method can be implemented is that of inflationary models, where the Einstein-Hilbert Lagrangian is coupled with the scalar field Lagrangian. We exemplify how a new model constructed from our method can lead to non-trivial behaviors for cosmological parameters.
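
    A schematic of the first-order (superpotential) setting that such constructions typically start from is given below; the deformation functions f and g and the generic W are placeholders sketching the idea, not the specific models built in the paper.

    ```latex
    % Schematic of the first-order framework behind the extension method:
    % three one-field systems, each driven by a superpotential W, are
    % linked by deformation functions (placeholders) and recombined into
    % one effective three-field model that retains first-order form.
    \begin{align*}
      \dot{\phi} &= W_{\phi}(\phi), &
      \dot{\chi} &= W_{\chi}(\chi), &
      \dot{\xi}  &= W_{\xi}(\xi), \\
      \chi &= f(\phi), &
      \xi &= g(\phi), &
      &\Rightarrow\ \text{effective } W(\phi,\chi,\xi)
    \end{align*}
    ```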

  16. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2012-04-01

    In this presentation, interventions on seismically vulnerable early reinforced concrete skeleton buildings from the interwar period are considered at different performance levels, from avoiding collapse up to assuring immediate post-earthquake functionality. Between these two poles there are degrees of damage, depending on the performance aim set. The costs of retrofit and of post-earthquake repair differ depending on the targeted performance. Not only the earthquake has an impact on a heritage building, but also the retrofit measure, for example on its appearance or its functional layout. For this reason, criteria of the structural engineer, the investor, the architect/conservator/urban planner, and the owner/inhabitants of the neighbourhood are considered in taking a benefit-cost decision. Decision-making based on benefit-cost analysis is one element of a risk management process. A solution must be found on how much change to accept for retrofit and how much repairable damage to take into account. There are two impact studies. Numerical simulation was run for the building typology under consideration for successive earthquakes selected in a deterministic way (1977, 1986 and two in 1991 from Vrancea, Romania, and 1978 from Thessaloniki, Greece), also considering the case when retrofit is done between two earthquakes. The building typology itself was studied not only for Greece and Romania but for numerous European countries, including Italy. The typology was compared to earlier reinforced concrete buildings with the Hennebique system, in order to see to what extent these can count as structural heritage and to shape the criteria of the architect/conservator. Based on the typology study, two model buildings were designed; for one of them, different retrofit measures (side walls, structural walls, steel braces, steel jacketing) were considered, while for the other, one retrofit technique (diagonal braces, which also permit adding active measures such as energy dissipaters) was considered in different amounts and locations in the building. Device computations, a civil engineering method for building economics (which, before statistics existed, was also the method for computing the costs of general building upgrades), were carried out for the retrofit and for the repair measures; the approach can be applied in different countries, including those with no database on existing seismic retrofit projects. The building elements for which the device computations were done are named "retrofit elements", and they can be new elements, modified elements or replaced elements of the initial building. The addition of the devices is simple, as the corresponding row in project management was, but, for the sake of comparison, complex project management costings computed in other works were also compared for innovative measures such as FRP (with glass and fibre). The theoretical costs for the model measures were compared to the way the costs of real retrofits of this building type (with reinforced concrete jacketing and FRP) are computed in Greece. The theoretically proposed measures were also compared in general to those applied in practice in Romania and Italy. A further study will include these, as in Italy diagonal braces with dissipation have been used. The typology of braces is relevant also for the local seismic culture, possibly extending to another type of skeleton structure whose distribution has been studied: the timber skeleton. A subtype of Romanian reinforced concrete skeleton buildings includes diagonal braces.
    In order to assess the costs of rebuilding or of a general upgrade without retrofit, architectural methods for building economics based on floor area are considered. Diagrams were built to show how the total cost varies as the sum of preventive retrofit and post-earthquake repair, and tables compare these with the costs of rebuilding, starting from a model of the addition of day-lighting atria to buildings: the moment when a repair measure has to be applied, as a function of the recurrence period of earthquakes, plays a role analogous to the depth of the atria. The stronger the expected earthquake, the more extensive the retrofit required in order to decrease repair costs. A further study would convert the device computations into floor-area costs, in order both to implement them in an ICT environment by means of an ontology and BIM and to scale them to the urban level; for the latter, the probabilistic application of structural-mechanics models instead of observation-based statistics can be considered. First, however, the socio-economic models of construction-management games will be considered, both computer games and board games, starting with SimCity, which initially included the San Francisco 1906 earthquake, in order to see how the required resources can be modelled. All criteria together build the taxonomy of the decision; among them, different ways to perform the cost-benefit analysis exist, from weighted trees to pair-wise comparison. The taxonomy was modelled as a decision tree, which forms the basis for an ontology.
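
    A rough numerical illustration of the total-cost diagrams described above can be sketched as follows; all cost figures and the event probability are hypothetical, and the study's device computations are far more detailed.

```python
# Illustrative sketch only: hypothetical cost figures, not the study's data.
# Total cost of a strategy = up-front retrofit cost + expected post-earthquake
# repair cost; the repair cost falls as the retrofit becomes more extensive.

retrofit_options = {
    # option: (retrofit cost, repair cost if the design earthquake occurs)
    "none":             (0.0, 900.0),
    "diagonal braces":  (250.0, 400.0),
    "structural walls": (450.0, 150.0),
    "full upgrade":     (700.0, 50.0),
}

def total_cost(retrofit: float, repair: float, p_event: float = 0.4) -> float:
    """Retrofit cost plus repair cost weighted by the chance that the design
    earthquake occurs within the building's remaining service life."""
    return retrofit + p_event * repair

for name, (c_retrofit, c_repair) in retrofit_options.items():
    print(f"{name:16s} total expected cost: {total_cost(c_retrofit, c_repair):7.1f}")
```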

  17. From complex questionnaire and interviewing data to intelligent Bayesian Network models for medical decision support

    PubMed Central

    Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz

    2016-01-01

    Objectives: 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) to exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) to ensure the BN model can be used for interventional analysis; 4) to demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. Method: The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recently validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results: When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. Conclusions: This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available, and hence forces decision scientists to use available data in a much smarter way. PMID:26830286
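
    The workflow of encoding an expert-defined structure and then querying it under intervention-style evidence can be sketched with a generic BN library such as pgmpy; the variables, states and probabilities below are invented and bear no relation to the DSVM models.

```python
# Toy expert-structured BN (not the DSVM models): a risk factor and an
# intervention jointly determine an outcome. All numbers are invented.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Risk", "Outcome"), ("Intervention", "Outcome")])
cpd_risk = TabularCPD("Risk", 2, [[0.7], [0.3]])         # P(Risk)
cpd_int = TabularCPD("Intervention", 2, [[0.5], [0.5]])  # P(Intervention)
cpd_out = TabularCPD(
    "Outcome", 2,
    # Columns enumerate parent states with the last parent varying fastest:
    # (R=0,I=0), (R=0,I=1), (R=1,I=0), (R=1,I=1).
    [[0.95, 0.97, 0.40, 0.75],   # Outcome = 0 (no adverse event)
     [0.05, 0.03, 0.60, 0.25]],  # Outcome = 1 (adverse event)
    evidence=["Risk", "Intervention"], evidence_card=[2, 2],
)
model.add_cpds(cpd_risk, cpd_int, cpd_out)
assert model.check_model()

# Query the outcome for a high-risk case with the intervention applied.
infer = VariableElimination(model)
print(infer.query(["Outcome"], evidence={"Risk": 1, "Intervention": 1}))
```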

  18. Individualized Cognitive Modeling for Close-Loop Task Mitigation

    NASA Technical Reports Server (NTRS)

    Zhang, Guangfan; Xu, Roger; Wang, Wei; Li, Jiang; Schnell, Tom; Keller, Mike

    2010-01-01

    An accurate real-time operator functional state assessment makes it possible to perform task management, minimize risks, and improve mission performance. In this paper, we discuss the development of an individualized operator functional state assessment model that identifies states likely to lead to operational errors. To address large individual variations, we use two different approaches to build a model for each individual, using that individual's data as well as data from subjects with similar responses. If a subject's response is similar to that of the individual of interest in a specific functional state, all the training data from that subject are used to build the individual model. The individualization methods have been successfully verified and validated with a driving test data set provided by the University of Iowa. With the individualized models, the mean squared error can be significantly decreased (by around 20%).
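
    The pooling idea, using another subject's data when their responses are similar, can be sketched as below; the similarity measure, model class and threshold are stand-ins for illustration, not the paper's actual algorithm.

```python
# Sketch of similarity-based data pooling (names and criteria invented):
# train an individual's model on their own data plus all data from
# subjects whose response profile is similar enough to the target's.
import numpy as np
from sklearn.linear_model import Ridge

def build_individual_model(target_X, target_y, others, threshold=0.8):
    """others: list of (X, y) numpy arrays, one pair per subject. A
    subject's data is pooled in when the correlation of their mean
    response profile with the target's exceeds `threshold`."""
    X_pool, y_pool = [target_X], [target_y]
    target_profile = target_X.mean(axis=0)
    for X, y in others:
        sim = np.corrcoef(target_profile, X.mean(axis=0))[0, 1]
        if sim >= threshold:
            X_pool.append(X)
            y_pool.append(y)
    model = Ridge(alpha=1.0)
    model.fit(np.vstack(X_pool), np.concatenate(y_pool))
    return model
```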

  19. Earthquake Vulnerability Assessment for Hospital Buildings Using a Gis-Based Group Multi Criteria Decision Making Approach: a Case Study of Tehran, Iran

    NASA Astrophysics Data System (ADS)

    Delavar, M. R.; Moradi, M.; Moshiri, B.

    2015-12-01

    Nowadays, urban areas are threatened by a number of natural hazards such as floods, landslides and earthquakes. These hazards can cause huge damage to buildings and human beings, which necessitates disaster mitigation and preparedness. One of the most important steps in disaster management is to understand all the impacts and effects of a disaster on urban facilities. Given that hospitals take care of vulnerable people, the response of hospital buildings to an earthquake is vital. In this research, the vulnerability of hospital buildings to earthquakes is analysed. The vulnerability of a building is related to a number of criteria, including the age of the building, the number of floors, the quality of the materials and the intensity of the earthquake. The problem of seismic vulnerability assessment is therefore a multi-criteria assessment problem, and multi-criteria decision making methods can be used to address it. In this paper a group multi-criteria decision making model is applied, because using only one expert's judgments can produce biased vulnerability maps. The Sugeno integral, which is able to take into account the interaction among criteria, is employed to assess the vulnerability degree of buildings. Fuzzy capacities, which are similar to layer weights in a weighted linear averaging operator, are calculated using particle swarm optimization. The calculated fuzzy capacities are then included in the model to compute a vulnerability degree for each hospital.
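
    The Sugeno integral itself is compact enough to sketch; the criteria, scores and fuzzy capacity values below are invented, and in the study the capacities are learned with particle swarm optimization rather than set by hand.

```python
# Minimal Sugeno integral sketch; the fuzzy capacity here is a toy dict.
def sugeno_integral(scores, capacity):
    """scores: {criterion: value in [0,1]}; capacity: fuzzy measure mapping
    frozensets of criteria to [0,1], monotone with capacity(all) = 1."""
    # Sort criteria by ascending score; the coalition at step i is the set
    # of criteria whose score is at least the i-th smallest score.
    items = sorted(scores.items(), key=lambda kv: kv[1])
    best = 0.0
    for i, (_, value) in enumerate(items):
        coalition = frozenset(name for name, _ in items[i:])
        best = max(best, min(value, capacity[coalition]))
    return best

# Toy example: three vulnerability criteria for one hospital building.
scores = {"age": 0.8, "floors": 0.5, "materials": 0.6}
capacity = {
    frozenset(["age", "floors", "materials"]): 1.0,
    frozenset(["floors", "materials"]): 0.7,
    frozenset(["age", "materials"]): 0.8,
    frozenset(["age", "floors"]): 0.75,
    frozenset(["age"]): 0.4,
    frozenset(["floors"]): 0.3,
    frozenset(["materials"]): 0.35,
}
print(sugeno_integral(scores, capacity))  # -> 0.6 for this toy data
```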

  20. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard.

    PubMed

    Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.

  1. SU-E-T-59: Calculations of Collimator Scatter Factors (Sc) with and Without Custom-Made Build-Up Caps for CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wokoma, S; Yoon, J; Jung, J

    2014-06-01

    Purpose: To investigate the impact of custom-made build-up caps for a diode detector in robotic radiosurgery radiation fields with a variable collimator (IRIS) for collimator scatter factor (Sc) calculation. Methods: An acrylic cap was custom-made to fit our SFD (IBA Dosimetry, Germany) diode detector. The cap has a thickness of 5 cm, corresponding to a depth beyond electron contamination. IAEA phase space data were used for beam modeling and the DOSRZnrc code was used to model the detector. The detector was positioned at 80 cm source-to-detector distance. Calculations were performed with the SFD, with and without the build-up cap, for clinical IRIS settings ranging from 7.5 to 60 mm. Results: The collimator scatter factors were calculated with and without the 5 cm build-up cap. They agreed within 3% except for the 15 mm cone, for which the Sc factor without buildup was 13.2% lower than that with buildup. Conclusion: Sc data is a critical component in advanced algorithms for treatment planning in order to calculate the dose accurately. After incorporating the build-up cap, we discovered differences of up to 13.2% in Sc factors for the SFD detector when compared against in-air measurements without build-up caps.

  2. Materials used in historical buildings, analysis methods and solution proposals

    NASA Astrophysics Data System (ADS)

    Döndüren, M. Sami; Sişik, Ozlem

    2017-10-01

    Most historical buildings were built on the compression principle and have the characteristics of masonry structures. Their structural components therefore consist of bearing walls, columns, buttresses, vaults and domes. Natural stone, cut stone, rubble stone, brick or alternating materials were used in the bearing elements, while brick dust and mortars with higher binding capacity were used as bonding materials. Over time, various effects have caused problems in the materials used and in the structures themselves, so that various interventions within the framework of repair and strengthening become necessary. In this study, the restoration of historical buildings and the verification of the adequacy of the bearing system, as one of the most important parts of the structure, were examined. For this purpose, a static analysis of the Demirtaş (Timurtaş) mosque in central Edirne was carried out; such analyses can reveal whether interventions are needed and suggest how they might be applied. The structure was modelled with the finite element package SAP2000, and the forces and stresses generated under various loads, the resulting deformations, and whether the allowable stresses were exceeded were investigated. As a result of this study, the maximum compressive stress in the structure was calculated as 1.1 MPa.

  3. Thermal Insulating Concrete Wall Panel Design for Sustainable Built Environment

    PubMed Central

    Zhou, Ao; Wong, Kwun-Wah

    2014-01-01

    Air-conditioning systems play a significant role in providing users with a thermally comfortable indoor environment, which is a necessity in modern buildings. In order to save the vast amount of energy consumed by air-conditioning systems, the building envelopes of envelope-load dominated buildings should be well designed so that unwanted heat gain from and loss to the environment can be minimized. In this paper, a new design of concrete wall panel that enhances the thermal insulation of buildings by adding a gypsum layer inside the concrete is presented. Experiments have been conducted to monitor the temperature variation in both the proposed sandwich wall panel and a conventional concrete wall panel under a heat radiation source. To further understand the thermal effect of such a sandwich wall panel design at building scale, two three-story building models adopting the different wall panel designs were constructed to evaluate the temperature distribution of entire buildings using the finite element method. Both the experimental and simulation results show that the gypsum layer improves the thermal insulation performance by retarding the heat transfer across the building envelopes. PMID:25177718

  4. Thermal insulating concrete wall panel design for sustainable built environment.

    PubMed

    Zhou, Ao; Wong, Kwun-Wah; Lau, Denvid

    2014-01-01

    Air-conditioning systems play a significant role in providing users with a thermally comfortable indoor environment, which is a necessity in modern buildings. In order to save the vast amount of energy consumed by air-conditioning systems, the building envelopes of envelope-load dominated buildings should be well designed so that unwanted heat gain from and loss to the environment can be minimized. In this paper, a new design of concrete wall panel that enhances the thermal insulation of buildings by adding a gypsum layer inside the concrete is presented. Experiments have been conducted to monitor the temperature variation in both the proposed sandwich wall panel and a conventional concrete wall panel under a heat radiation source. To further understand the thermal effect of such a sandwich wall panel design at building scale, two three-story building models adopting the different wall panel designs were constructed to evaluate the temperature distribution of entire buildings using the finite element method. Both the experimental and simulation results show that the gypsum layer improves the thermal insulation performance by retarding the heat transfer across the building envelopes.
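
    The series-resistance reasoning behind the sandwich panel can be illustrated with handbook-style numbers; the thicknesses and conductivities below are typical values chosen for illustration, not the paper's measured data.

```python
# Back-of-envelope sketch of why the gypsum layer retards heat transfer:
# layer resistances add in series, so U = 1 / (R_surfaces + sum(t_i / k_i)).
# Conductivities are typical handbook values (concrete ~1.7 W/mK, gypsum
# ~0.25 W/mK), not the paper's measurements.
layers_conventional = [(0.150, 1.7)]  # (thickness m, k W/mK): concrete only
layers_sandwich = [(0.065, 1.7), (0.020, 0.25), (0.065, 1.7)]  # concrete/gypsum/concrete

def u_value(layers, r_surfaces=0.17):
    """U-value in W/m2K; r_surfaces approximates indoor + outdoor film resistances."""
    r = r_surfaces + sum(t / k for t, k in layers)
    return 1.0 / r

print(f"conventional panel U = {u_value(layers_conventional):.2f} W/m2K")
print(f"sandwich panel     U = {u_value(layers_sandwich):.2f} W/m2K")
```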

  5. Analysis of ecstasy tablets: comparison of reflectance and transmittance near infrared spectroscopy.

    PubMed

    Schneider, Ralph Carsten; Kovar, Karl-Artur

    2003-07-08

    Calibration models for the quantitation of commonly used ecstasy substances have been developed using near infrared spectroscopy (NIR) in diffuse reflectance and in transmission mode, using seized ecstasy tablets for model building and validation. The samples contained amphetamine, N-methyl-3,4-methylenedioxy-amphetamine (MDMA) and N-ethyl-3,4-methylenedioxy-amphetamine (MDE) in different concentrations. All tablets were analyzed using high performance liquid chromatography (HPLC) with diode array detection as the reference method. We evaluated the performance of each NIR measurement method with regard to its ability to predict the content of each tablet with a low root mean square error of prediction (RMSEP). The best calibration models were generated using NIR measurement in transmittance mode with wavelength selection and 1/x-transformation of the raw data. The models built in reflectance mode showed higher RMSEPs; there, wavelength selection, 1/x-transformation and a second-order Savitzky-Golay derivative with five-point smoothing were applied as data pretreatment to obtain the best models. To estimate the influence of inhomogeneities in the illegal tablets, a calibration on the destroyed, i.e. triturated, samples was built and compared to the corresponding data for the whole tablets; the calibrations using these homogenized tablets showed lower RMSEPs. We conclude that NIR analysis of ecstasy tablets in transmission mode is more suitable than measurement in diffuse reflectance for obtaining quantification models for their active ingredients with low errors of prediction, and that inhomogeneities in the samples are equalized when the tablets are measured as powdered samples.
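
    A generic calibration-plus-RMSEP loop of the kind described can be sketched with scikit-learn's partial least squares; the spectra below are synthetic stand-ins, and the paper's pretreatments (wavelength selection, 1/x-transformation, Savitzky-Golay derivatives) are omitted.

```python
# Generic NIR-style calibration sketch: PLS regression plus RMSEP on a
# held-out set. The "spectra" and analyte contents are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))                        # samples x wavelengths
y = X[:, 50] * 2.0 + rng.normal(scale=0.1, size=120)   # stand-in analyte content

X_cal, X_test, y_cal, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
y_pred = pls.predict(X_test).ravel()
rmsep = np.sqrt(np.mean((y_test - y_pred) ** 2))  # root mean square error of prediction
print(f"RMSEP: {rmsep:.3f}")
```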

  6. Strategies for carbohydrate model building, refinement and validation

    PubMed Central

    2017-01-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein–sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade. PMID:28177313

  7. Challenges in microbial ecology: building predictive understanding of community function and dynamics

    PubMed Central

    Widder, Stefanie; Allen, Rosalind J; Pfeiffer, Thomas; Curtis, Thomas P; Wiuf, Carsten; Sloan, William T; Cordero, Otto X; Brown, Sam P; Momeni, Babak; Shou, Wenying; Kettle, Helen; Flint, Harry J; Haas, Andreas F; Laroche, Béatrice; Kreft, Jan-Ulrich; Rainey, Paul B; Freilich, Shiri; Schuster, Stefan; Milferstedt, Kim; van der Meer, Jan R; Großkopf, Tobias; Huisman, Jef; Free, Andrew; Picioreanu, Cristian; Quince, Christopher; Klapper, Isaac; Labarthe, Simon; Smets, Barth F; Wang, Harris; Soyer, Orkun S

    2016-01-01

    The importance of microbial communities (MCs) cannot be overstated. MCs underpin the biogeochemical cycles of the earth's soil, oceans and the atmosphere, and perform ecosystem functions that impact plants, animals and humans. Yet our ability to predict and manage the function of these highly complex, dynamically changing communities is limited. Building predictive models that link MC composition to function is a key emerging challenge in microbial ecology. Here, we argue that addressing this challenge requires close coordination of experimental data collection and method development with mathematical model building. We discuss specific examples where model–experiment integration has already resulted in important insights into MC function and structure. We also highlight key research questions that still demand better integration of experiments and models. We argue that such integration is needed to achieve significant progress in our understanding of MC dynamics and function, and we make specific practical suggestions as to how this could be achieved. PMID:27022995

  8. Strategies for carbohydrate model building, refinement and validation.

    PubMed

    Agirre, Jon

    2017-02-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein-sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade.

  9. The Creation of Space Vector Models of Buildings From RPAS Photogrammetry Data

    NASA Astrophysics Data System (ADS)

    Trhan, Ondrej

    2017-06-01

    The results of Remote Piloted Aircraft System (RPAS) photogrammetry are digital surface models and orthophotos. The main problem with the digital surface models obtained is that buildings are not perpendicular and the shapes of roofs are deformed. The task of this paper is to obtain a more accurate digital surface model through building reconstruction. The paper discusses the problems of obtaining and approximating building footprints, reconstructing the final spatial vector digital building model, and modifying the buildings on the digital surface model.
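
    One step named above, approximating building footprints, is commonly done with polygon simplification; the sketch below uses shapely's Douglas-Peucker simplify on an invented noisy outline and is not necessarily the paper's algorithm.

```python
# Footprint approximation sketch: simplify a noisy building outline so
# that near-collinear vertices are dropped (a common choice; the paper's
# exact approximation method may differ).
from shapely.geometry import Polygon

noisy_footprint = Polygon([
    (0, 0), (5.02, 0.04), (5.0, 2.97), (2.51, 3.01), (2.49, 5.02), (0.03, 4.98),
])
# Tolerance in metres: vertices closer than this to the simplified edge
# are removed, pushing the outline toward clean wall segments.
simplified = noisy_footprint.simplify(0.1, preserve_topology=True)
print(list(simplified.exterior.coords))
```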

  10. A Simplified Method for the 3D Printing of Molecular Models for Chemical Education

    ERIC Educational Resources Information Center

    Jones, Oliver A. H.; Spencer, Michelle J. S.

    2018-01-01

    Using tangible models to help students visualize chemical structures in three dimensions has been a mainstay of chemistry education for many years. Conventional chemistry modeling kits are, however, limited in the types and accuracy of the molecules, bonds and structures they can be used to build. The recent development of 3D printing technology…

  11. Faults Discovery By Using Mined Data

    NASA Technical Reports Server (NTRS)

    Lee, Charles

    2005-01-01

    Fault discovery in complex systems involves model-based reasoning, fault tree analysis, rule-based inference methods, and other approaches. Model-based reasoning builds models of the systems either from mathematical formulations or from experimental models. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model from expert knowledge. These models and methods have one thing in common: they presume certain prior conditions. Complex systems often use fault trees to analyze faults. When an error occurs, fault diagnosis is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on the data fed back from the system, and decisions are made on the basis of threshold values using fault trees. Since these decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and to capture the contents of fault trees as the initial state of the trees.
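
    The direction described, learning decision trees from telemetry so that thresholds are mined rather than presumed, can be sketched with scikit-learn; the feature names, data and thresholds below are invented.

```python
# Sketch of decision-tree fault discovery from telemetry (data invented).
from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: pressure, temperature, vibration; label 1 = fault observed.
X = [[0.9, 21.0, 0.02], [1.1, 22.5, 0.03], [2.4, 35.1, 0.40],
     [0.8, 20.2, 0.01], [2.2, 33.8, 0.35], [1.0, 24.0, 0.05]]
y = [0, 0, 1, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
# The learned splits play the role of fault-tree thresholds, but they are
# mined from mission data instead of presumed in advance.
print(export_text(tree, feature_names=["pressure", "temperature", "vibration"]))
print(tree.predict([[2.3, 34.0, 0.38]]))  # -> [1], flagged as a fault
```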

  12. Sustainable Design Approach: A case study of BIM use

    NASA Astrophysics Data System (ADS)

    Abdelhameed, Wael

    2017-11-01

    Achieving sustainable design in areas such as energy-efficient design depends largely on the accuracy of the analysis performed after the design is completed with all its components and material details. Different analysis approaches and methods exist that predict relevant values and metrics such as the U-value, energy use and energy savings. Although certain differences in the accuracy of these approaches and methods have been recorded, this research paper does not focus on that matter, since determining the reason for discrepancies between the approaches and methods is difficult because all error sources act simultaneously. The paper instead introduces an approach through which BIM (building information modelling) can be utilised during the initial phases of the design process, by analysing the values and metrics of sustainable design before going into the design details of a building. Managing all of the project drawings in a single file, BIM is well known as a digital platform that offers a multidisciplinary detailed design, the AEC model (Barison and Santos, 2010; Welle et al., 2011). The paper first presents BIM use in the early phases of the design process in order to achieve certain required areas of sustainable design, and then introduces BIM use in specific areas such as site selection, wind velocity and building orientation, in terms of reaching the most sustainable solution possible. In the initial phases of designing, material details and building components are not yet fully specified or selected; the designer usually focuses on zoning, topology, circulation, and other design requirements. The proposed approach employs the strategies and analyses of BIM during those initial design phases in order to obtain the analysis results of each solution or alternative design. Stakeholders and designers would thus have a more effective decision-making process, with full clarity about the consequences of each alternative, and the architect can proceed with the alternative design that has the best sustainability analysis. In later design stages, using sustainable types of materials such as insulation, cladding, etc., and applying sustainable building components such as doors, windows, etc. would add further improvements in reaching better values and metrics. The paper describes the methodology of this design approach through the BIM strategies adopted in design creation, and case studies of architectural designs are used to highlight the details and benefits of the proposed approach.

  13. Diffusion of Energy Efficient Technology in Commercial Buildings: An Analysis of the Commercial Building Partnerships Program

    NASA Astrophysics Data System (ADS)

    Antonopoulos, Chrissi Argyro

    This study presents findings from survey and interview data investigating replication of green building measures by Commercial Building Partnership (CBP) partners that worked directly with the Pacific Northwest National Laboratory (PNNL). PNNL partnered directly with 12 organizations on new and retrofit construction projects, which represented approximately 28 percent of the entire U.S. Department of Energy (DOE) CBP program. Through a feedback survey mechanism, along with personal interviews, quantitative and qualitative data were gathered relating to replication efforts by each organization. These data were analyzed to provide insight into two primary research areas: 1) CBP partners' replication efforts of green building approaches used in the CBP project to the rest of the organization's building portfolio, and, 2) the market potential for technology diffusion into the total U.S. commercial building stock, as a direct result of the CBP program. The first area of this research focused specifically on replication efforts underway or planned by each CBP program participant. The second area of this research develops a diffusion of innovations model to analyze potential broad market impacts of the CBP program on the commercial building industry in the United States. Findings from this study provided insight into motivations and objectives CBP partners had for program participation. Factors that impact replication include motivation, organizational structure and objectives firms have for implementation of energy efficient technologies. Comparing these factors between different CBP partners revealed patterns in motivation for constructing energy efficient buildings, along with better insight into market trends for green building practices. The optimized approach to the CBP program allows partners to develop green building parameters that fit the specific uses of their building, resulting in greater motivation for replication. In addition, the diffusion model developed for this analysis indicates that this method of market prediction may be used to adequately capture cumulative construction metrics for a whole-building analysis as opposed to individual energy efficiency measures used in green building.
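
    Diffusion-of-innovations analyses of this kind are often built on the Bass model; the abstract does not specify the study's exact formulation, so the sketch below is a generic Bass update with illustrative coefficients.

```python
# Generic Bass diffusion sketch; p, q and market size M are illustrative
# only and are not the study's fitted parameters.
import numpy as np

def bass_adopters(p, q, M, years):
    """Cumulative adopters N(t) via the discrete Bass update:
    dN = (p + q * N / M) * (M - N)."""
    N = 0.0
    out = []
    for _ in range(years):
        N += (p + q * N / M) * (M - N)
        out.append(N)
    return np.array(out)

# p: innovation coefficient, q: imitation coefficient, M: reachable buildings.
trajectory = bass_adopters(p=0.03, q=0.38, M=10_000, years=15)
for year, n in enumerate(trajectory, start=1):
    print(f"year {year:2d}: {n:8.0f} buildings adopting")
```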

  14. GC-ASM: Synergistic Integration of Graph-Cut and Active Shape Model Strategies for Medical Image Segmentation

    PubMed Central

    Chen, Xinjian; Udupa, Jayaram K.; Alavi, Abass; Torigian, Drew A.

    2013-01-01

    Image segmentation methods may be classified into two categories: purely image based and model based. Each of these two classes has its own advantages and disadvantages. In this paper, we propose a novel synergistic combination of the image based graph-cut (GC) method with the model based ASM method to arrive at the GC-ASM method for medical image segmentation. A multi-object GC cost function is proposed which effectively integrates the ASM shape information into the GC framework. The proposed method consists of two phases: model building and segmentation. In the model building phase, the ASM model is built and the parameters of the GC are estimated. The segmentation phase consists of two main steps: initialization (recognition) and delineation. For initialization, an automatic method is proposed which estimates the pose (translation, orientation, and scale) of the model, and obtains a rough segmentation result which also provides the shape information for the GC method. For delineation, an iterative GC-ASM algorithm is proposed which performs finer delineation based on the initialization results. The proposed methods are implemented to operate on 2D images and evaluated on clinical chest CT, abdominal CT, and foot MRI data sets. The results show the following: (a) An overall delineation accuracy of TPVF > 96%, FPVF < 0.6% can be achieved via GC-ASM for different objects, modalities, and body regions. (b) GC-ASM improves over ASM in its accuracy and precision to search region. (c) GC-ASM requires far fewer landmarks (about 1/3 of ASM) than ASM. (d) GC-ASM achieves full automation in the segmentation step compared to GC which requires seed specification and improves on the accuracy of GC. (e) One disadvantage of GC-ASM is its increased computational expense owing to the iterative nature of the algorithm. PMID:23585712

  15. GC-ASM: Synergistic Integration of Graph-Cut and Active Shape Model Strategies for Medical Image Segmentation.

    PubMed

    Chen, Xinjian; Udupa, Jayaram K; Alavi, Abass; Torigian, Drew A

    2013-05-01

    Image segmentation methods may be classified into two categories: purely image based and model based. Each of these two classes has its own advantages and disadvantages. In this paper, we propose a novel synergistic combination of the image based graph-cut (GC) method with the model based ASM method to arrive at the GC-ASM method for medical image segmentation. A multi-object GC cost function is proposed which effectively integrates the ASM shape information into the GC framework. The proposed method consists of two phases: model building and segmentation. In the model building phase, the ASM model is built and the parameters of the GC are estimated. The segmentation phase consists of two main steps: initialization (recognition) and delineation. For initialization, an automatic method is proposed which estimates the pose (translation, orientation, and scale) of the model, and obtains a rough segmentation result which also provides the shape information for the GC method. For delineation, an iterative GC-ASM algorithm is proposed which performs finer delineation based on the initialization results. The proposed methods are implemented to operate on 2D images and evaluated on clinical chest CT, abdominal CT, and foot MRI data sets. The results show the following: (a) An overall delineation accuracy of TPVF > 96%, FPVF < 0.6% can be achieved via GC-ASM for different objects, modalities, and body regions. (b) GC-ASM improves over ASM in its accuracy and precision to search region. (c) GC-ASM requires far fewer landmarks (about 1/3 of ASM) than ASM. (d) GC-ASM achieves full automation in the segmentation step compared to GC which requires seed specification and improves on the accuracy of GC. (e) One disadvantage of GC-ASM is its increased computational expense owing to the iterative nature of the algorithm.
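
    The core idea, folding shape information into the unary terms of a graph cut, can be sketched in a binary toy setting with the generic PyMaxflow min-cut library; the paper's multi-object cost function and ASM fitting are considerably more involved than this.

```python
# Toy graph-cut with a shape prior folded into the unary terms
# (PyMaxflow stands in for the paper's GC machinery; data invented).
import numpy as np
import maxflow

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
img += np.random.default_rng(0).normal(scale=0.3, size=img.shape)
img = np.clip(img, 0.0, 1.0)                 # keep capacities non-negative
shape_prior = np.zeros_like(img)
shape_prior[10:22, 10:22] = 1.0              # stand-in for a fitted ASM shape

lam = 0.5                                    # weight of the shape term
src = (1 - img) + lam * (1 - shape_prior)    # penalty for one label
snk = img + lam * shape_prior                # penalty for the other label

g = maxflow.Graph[float]()
nodes = g.add_grid_nodes(img.shape)
g.add_grid_edges(nodes, 0.8)                 # pairwise smoothness term
g.add_grid_tedges(nodes, src, snk)
g.maxflow()
segmentation = g.get_grid_segments(nodes)    # boolean mask from the min cut
print(segmentation.sum(), "pixels on the object side of the cut")
```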

  16. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter and data gaps, as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Reconstruction is therefore often supported by general rules for the perpendicularity and parallelism that predominate in man-made structures. Indoor environments of large public buildings, however, often also follow higher-level rules such as symmetry and the repetition of, e.g., room sizes and corridor widths. In the context of reconstructing city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems, which were originally developed for the computer-based modelling of plant growth, to model and reproduce the layout of indoor environments in 2D.
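
    The rewriting mechanism of a Lindenmayer system is easy to sketch; the grammar below, with corridors spawning rooms, is an invented toy, not the paper's production rules.

```python
# Minimal L-system sketch: parallel string rewriting of the kind the
# paper adapts to indoor layouts. This toy grammar is invented.
def lsystem(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        # Rewrite every symbol in parallel; symbols without a rule persist.
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# C: corridor segment, R: room, [ ]: branch off the corridor.
rules = {"C": "C[R]C"}
print(lsystem("C", rules, 3))  # -> "C[R]C[R]C[R]C[R]C[R]C[R]C[R]C"
```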

  17. Research on the key technologies of 3D spatial data organization and management for virtual building environments

    NASA Astrophysics Data System (ADS)

    Gong, Jun; Zhu, Qing

    2006-10-01

    As a special case of VGE in the field of AEC (architecture, engineering and construction), the Virtual Building Environment (VBE) has attracted broad attention. Highly complex, large-scale 3D spatial data are the main bottleneck of VBE applications, so 3D spatial data organization and management become the core technology for VBE. This paper puts forward a 3D spatial data model for VBE that can be implemented with high performance. The inherent storage method of CAD data introduces redundancy and does not address efficient visualization, which is a practical bottleneck for integrating CAD models, so an efficient method to integrate CAD model data is put forward. Moreover, since 3D spatial indices based on the R-tree are usually limited by low efficiency due to severe overlap of sibling nodes and uneven node sizes, a new node-choosing algorithm for the R-tree is proposed.
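
    The indexing workflow under discussion can be sketched with the stock rtree library; the paper proposes its own node-choosing algorithm, which this generic index does not implement, and the element bounding boxes below are invented.

```python
# 3D spatial index sketch with the stock `rtree` library (the paper's
# node-choosing algorithm is not implemented here; boxes are invented).
from rtree import index

p = index.Property()
p.dimension = 3
idx = index.Index(properties=p, interleaved=True)

# Insert building-element bounding boxes: (xmin, ymin, zmin, xmax, ymax, zmax).
idx.insert(1, (0, 0, 0, 10, 8, 3))   # ground-floor slab
idx.insert(2, (0, 0, 3, 10, 8, 6))   # first-floor slab
idx.insert(3, (4, 3, 0, 5, 4, 6))    # stair core spanning both floors

# Range query: which elements intersect this view volume?
print(list(idx.intersection((3, 2, 0, 6, 5, 3))))
```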

  18. The benefit of 3D laser scanning technology in the generation and calibration of FEM models for health assessment of concrete structures.

    PubMed

    Yang, Hao; Xu, Xiangyang; Neumann, Ingo

    2014-11-19

    Terrestrial laser scanning (TLS) is a new technique for quickly acquiring three-dimensional information. In this paper we research the health assessment of concrete structures with a Finite Element Method (FEM) model based on TLS. The work focuses on the benefits of 3D TLS in the generation and calibration of FEM models, in order to build a convenient, efficient and intelligent model which can be widely used for the detection and assessment of bridges, buildings, subways and other objects. After comparing the finite element simulation with surface-based measurement data from TLS, the FEM model is determined to be acceptable with an error of less than 5%. The benefit of TLS lies mainly in the possibility of a surface-based validation of the results predicted by the FEM model.
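
    The surface-based comparison between FEM predictions and TLS points reduces to nearest-neighbour deviations, which can be sketched as follows on synthetic points; the study's 5% acceptance figure is only echoed in the comments.

```python
# Sketch of the validation step: nearest-neighbour deviations between an
# FEM-predicted surface and TLS points (toy data; the study accepted the
# model at a deviation below 5%).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
fem_surface = rng.uniform(size=(1000, 3))                            # predicted surface points
tls_points = fem_surface + rng.normal(scale=0.002, size=(1000, 3))   # scanned points

tree = cKDTree(fem_surface)
dist, _ = tree.query(tls_points)   # distance from each TLS point to the FEM surface
print(f"mean deviation: {dist.mean():.4f}, 95th percentile: {np.percentile(dist, 95):.4f}")
```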

  19. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    building information models (BIM) at the coordinated design stage of building construction. 1.3 Approach To... standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, currently supported by... specifications, Construction Operations Building information exchange (COBie), Building Information Modeling (BIM)

  20. Understanding the Effects of Different Study Methods on Retention of Information and Transfer of Learning

    ERIC Educational Resources Information Center

    Egan, Rylan G.

    2012-01-01

    Introduction: The following study investigates relationships between spaced practice (re-studying after a delay) and transfer of learning; specifically, it examines the impact on learners' ability to transfer learning after participating in spaced model-building or unstructured study of narrated text. Method: Subjects were randomly assigned either to a…

  1. PREDICTION OF THE ACUTE TOXICITY OF ORGANIC COMPOUNDS TO THE FATHEAD MINNOW (PIMEPHALES PROMELAS) USING A GROUP CONTRIBUTION METHOD

    EPA Science Inventory

    A group contribution method has been developed to correlate the acute toxicity (96 h LC50) to the fathead minnow (Pimephales promelas) for 379 organic chemicals. Multilinear regression and computational neural networks (CNNs) were used for model building. The multilinear linear m...
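
    The group-contribution regression part can be sketched generically; the functional-group counts and toxicity values below are invented, and the neural network component is omitted.

```python
# Group-contribution regression sketch: toxicity modelled as a linear
# function of functional-group counts (all values invented).
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows: chemicals; columns: counts of e.g. -OH, -Cl, aromatic rings.
group_counts = np.array([[1, 0, 1], [0, 2, 1], [2, 1, 0], [0, 0, 2], [1, 2, 1]])
log_lc50 = np.array([-0.5, -1.8, 0.2, -1.1, -1.6])  # 96 h LC50, log scale

model = LinearRegression().fit(group_counts, log_lc50)
print("group contributions:", model.coef_, "intercept:", model.intercept_)
print("predicted log LC50:", model.predict([[1, 1, 1]]))
```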

  2. Molecule kernels: a descriptor- and alignment-free quantitative structure-activity relationship approach.

    PubMed

    Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus

    2008-09-01

    Quantitative structure activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules, measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set, and do not require an explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.
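
    The mechanism of learning from a precomputed similarity matrix can be sketched with scikit-learn's SVM in place of the potential SVM used in the paper; the "kernel" below is a trivial placeholder, not a real molecule kernel.

```python
# Precomputed-kernel learning sketch (scikit-learn's SVC stands in for
# the P-SVM; the similarity function and data are placeholders).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 8))             # stand-in molecular representations
y = (X[:, 0] > 0).astype(int)

def molecule_kernel(A, B):
    """Placeholder similarity; a real molecule kernel would compare 3D
    structures without requiring spatial prealignment."""
    return A @ B.T

K_train = molecule_kernel(X, X)
clf = SVC(kernel="precomputed").fit(K_train, y)

X_new = rng.normal(size=(5, 8))
K_test = molecule_kernel(X_new, X)       # similarities to the training set
print(clf.predict(K_test))
```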

  3. Assessment of Automated Measurement and Verification (M&V) Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
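
    Baseline-model accuracy in M&V is conventionally summarised with CV(RMSE) and NMBE (e.g. in ASHRAE Guideline 14); the sketch below computes both on invented data and simplifies the degrees-of-freedom correction.

```python
# Two standard baseline-model accuracy metrics used in M&V: CV(RMSE) and
# NMBE. Data are invented; the denominator correction for model degrees
# of freedom is omitted for brevity.
import numpy as np

measured = np.array([100.0, 120.0, 95.0, 130.0, 110.0])   # e.g. daily kWh
predicted = np.array([98.0, 125.0, 90.0, 128.0, 115.0])   # baseline model output

resid = measured - predicted
n, mean = len(measured), measured.mean()
cv_rmse = np.sqrt((resid ** 2).mean()) / mean * 100   # coefficient of variation of RMSE
nmbe = resid.sum() / (n * mean) * 100                 # normalized mean bias error
print(f"CV(RMSE) = {cv_rmse:.1f}%  NMBE = {nmbe:.1f}%")
```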

  4. Body-wide hierarchical fuzzy modeling, recognition, and delineation of anatomy in medical images.

    PubMed

    Udupa, Jayaram K; Odhner, Dewey; Zhao, Liming; Tong, Yubing; Matsumoto, Monica M S; Ciesielski, Krzysztof C; Falcao, Alexandre X; Vaideeswaran, Pavithra; Ciesielski, Victoria; Saboury, Babak; Mohammadianrasanani, Syedmehrdad; Sin, Sanghun; Arens, Raanan; Torigian, Drew A

    2014-07-01

    To make Quantitative Radiology (QR) a reality in radiological practice, computerized body-wide Automatic Anatomy Recognition (AAR) becomes essential. With the goal of building a general AAR system that is not tied to any specific organ system, body region, or image modality, this paper presents an AAR methodology for localizing and delineating all major organs in different body regions based on fuzzy modeling ideas and a tight integration of fuzzy models with an Iterative Relative Fuzzy Connectedness (IRFC) delineation algorithm. The methodology consists of five main steps: (a) gathering image data for both building models and testing the AAR algorithms from patient image sets existing in our health system; (b) formulating precise definitions of each body region and organ and delineating them following these definitions; (c) building hierarchical fuzzy anatomy models of organs for each body region; (d) recognizing and locating organs in given images by employing the hierarchical models; and (e) delineating the organs following the hierarchy. In Step (c), we explicitly encode object size and positional relationships into the hierarchy and subsequently exploit this information in object recognition in Step (d) and delineation in Step (e). Modality-independent and dependent aspects are carefully separated in model encoding. At the model building stage, a learning process is carried out for rehearsing an optimal threshold-based object recognition method. The recognition process in Step (d) starts from large, well-defined objects and proceeds down the hierarchy in a global to local manner. A fuzzy model-based version of the IRFC algorithm is created by naturally integrating the fuzzy model constraints into the delineation algorithm. The AAR system is tested on three body regions - thorax (on CT), abdomen (on CT and MRI), and neck (on MRI and CT) - involving a total of over 35 organs and 130 data sets (the total used for model building and testing). The training and testing data sets are divided into equal size in all cases except for the neck. Overall the AAR method achieves a mean accuracy of about 2 voxels in localizing non-sparse blob-like objects and most sparse tubular objects. The delineation accuracy in terms of mean false positive and negative volume fractions is 2% and 8%, respectively, for non-sparse objects, and 5% and 15%, respectively, for sparse objects. The two object groups achieve mean boundary distance relative to ground truth of 0.9 and 1.5 voxels, respectively. Some sparse objects - venous system (in the thorax on CT), inferior vena cava (in the abdomen on CT), and mandible and naso-pharynx (in neck on MRI, but not on CT) - pose challenges at all levels, leading to poor recognition and/or delineation results. The AAR method fares quite favorably when compared with methods from the recent literature for liver, kidneys, and spleen on CT images. We conclude that separation of modality-independent from dependent aspects, organization of objects in a hierarchy, encoding of object relationship information explicitly into the hierarchy, optimal threshold-based recognition learning, and fuzzy model-based IRFC are effective concepts which allowed us to demonstrate the feasibility of a general AAR system that works in different body regions on a variety of organs and on different modalities. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. Additionally, PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS, and also as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.

  6. A Feature and Algorithm Selection Method for Improving the Prediction of Protein Structural Class.

    PubMed

    Ni, Qianwu; Chen, Lei

    2017-01-01

    Correct prediction of protein structural class is beneficial to the investigation of protein functions, regulations and interactions. In recent years, several computational methods have been proposed in this regard. However, it is still a great challenge to select a proper classification algorithm and to extract the essential features to participate in classification. In this study, a feature and algorithm selection method is presented for improving the accuracy of protein structural class prediction. Amino acid compositions and physiochemical features were adopted to represent the features, and the thirty-eight machine learning algorithms collected in Weka were employed. All features were first analyzed by a feature selection method, minimum redundancy maximum relevance (mRMR), producing a feature list. Then, several feature sets were constructed by adding features from the list one by one. For each feature set, the thirty-eight algorithms were executed on a dataset in which proteins were represented by the features in the set. The classes predicted by these algorithms and the true class of each protein were collected to construct a dataset, which was analyzed by the mRMR method, yielding an algorithm list. From the algorithm list, algorithms were taken one by one to build an ensemble prediction model, and the ensemble prediction model with the best performance was selected as the optimal one. Experimental results indicate that the constructed model is much superior to models using a single algorithm and to models that adopt only the feature selection procedure or only the algorithm selection procedure; both procedures are genuinely helpful for building an ensemble prediction model that yields better performance. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
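
    The mRMR ranking at the heart of the procedure can be sketched greedily; absolute correlation stands in here for the mutual-information criteria, and the data are synthetic.

```python
# Greedy mRMR-style ranking sketch: maximise relevance to the target while
# penalising redundancy with features already selected. Correlation stands
# in for mutual information; data are synthetic.
import numpy as np

def mrmr_rank(X, y, k):
    """Return indices of k features, greedily maximising
    relevance(f, y) - mean redundancy(f, selected)."""
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    selected = [int(relevance.argmax())]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 10))
y = X[:, 2] + 0.5 * X[:, 7] + rng.normal(size=200)
print(mrmr_rank(X, y, 3))  # features 2 and 7 should rank highly
```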

  7. Electron-Ion Dynamics with Time-Dependent Density Functional Theory: Towards Predictive Solar Cell Modeling: Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maitra, Neepa

    2016-07-14

    This project investigates the accuracy of currently-used functionals in time-dependent density functional theory, which is today routinely used to predict and design materials and computationally model processes in solar energy conversion. The rigorously-based electron-ion dynamics method developed here sheds light on traditional methods and overcomes challenges those methods have. The fundamental research undertaken here is important for building reliable and practical methods for materials discovery. The ultimate goal is to use these tools for the computational design of new materials for solar cell devices of high efficiency.

  8. Evaluating capacity-building for mental health system strengthening in low- and middle-income countries for service users and caregivers, service planners and researchers.

    PubMed

    Hanlon, C; Semrau, M; Alem, A; Abayneh, S; Abdulmalik, J; Docrat, S; Evans-Lacko, S; Gureje, O; Jordans, M; Lempp, H; Mugisha, J; Petersen, I; Shidhaye, R; Thornicroft, G

    2018-02-01

    Efforts to support the scale-up of integrated mental health care in low- and middle-income countries (LMICs) need to focus on building human resource capacity in health system strengthening, as well as in the direct provision of mental health care. In a companion editorial, we describe a range of capacity-building activities that are being implemented by a multi-country research consortium (Emerald: Emerging mental health systems in low- and middle-income countries) for (1) service users and caregivers, (2) service planners and policy-makers and (3) researchers in six LMICs (Ethiopia, India, Nepal, Nigeria, South Africa and Uganda). In this paper, we focus on the methodology being used to evaluate the impact of capacity-building in these three target groups. We first review the evidence base for approaches to evaluation of capacity-building, highlighting the gaps in this area. We then describe the adaptation of best practice for the Emerald capacity-building evaluation. The resulting mixed method evaluation framework was tailored to each target group and to each country context. We identified a need to expand the evidence base on indicators of successful capacity-building across the different target groups. To address this, we developed an evaluation plan to measure the adequacy and usefulness of quantitative capacity-building indicators when compared with qualitative evaluation. We argue that evaluation needs to be an integral part of capacity-building activities and that expertise needs to be built in methods of evaluation. The Emerald evaluation provides a potential model for capacity-building evaluation across key stakeholder groups and promises to extend understanding of useful indicators of success.

  9. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  10. OPEN AIR DEMOLITION OF FACILITIES HIGHLY CONTAMINATED WITH PLUTONIUM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LLOYD, E.R.

    2007-05-31

    The demolition of highly contaminated plutonium buildings usually is a long and expensive process that involves decontaminating the building to near free-release standards and then using conventional methods to remove the structure. It doesn't, however, have to be that way. Fluor has torn down buildings highly contaminated with plutonium without excessive decontamination. By removing the select source term and fixing the remaining contamination on the walls, ceilings, floors, and equipment surfaces, open-air demolition is not only feasible, but it can be done cheaper, better (safer), and faster. Open-air demolition techniques were used to demolish two highly contaminated buildings to slab-on-grade. These facilities on the Department of Energy's Hanford Site were located in, or very near, compounds of operating nuclear facilities that housed hundreds of people working on a daily basis. To keep the facilities operating and the personnel safe, the projects had to be creative in demolishing the structures. Several key techniques were used to control contamination and keep it within the confines of the demolition area: spraying fixatives before demolition; applying fixative and misting with a fine spray of water as the buildings were being taken down; and demolishing the buildings in a controlled and methodical manner. In addition, detailed air-dispersion modeling was done to establish necessary building and meteorological conditions and to confirm the adequacy of the proposed methods. Both demolition projects were accomplished without any spread of contamination outside the modest buffer areas established for contamination control. Furthermore, personnel exposure to radiological and physical hazards was significantly reduced by using heavy equipment rather than ''hands on'' techniques.

  11. Building an ontology of pulmonary diseases with natural language processing tools using textual corpora.

    PubMed

    Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine

    2007-01-01

    Pathologies and acts are classified in thesauri to help physicians code their activity. In practice, the use of thesauri is not sufficient to reduce variability in coding, and thesauri are not suitable for computer processing. We think the automation of the coding task requires a conceptual model of medical items: an ontology. Our task is to help lung specialists code acts and diagnoses with software that represents the medical knowledge of this specialty through an ontology. The objective of the reported work was to build an ontology of pulmonary diseases dedicated to the coding process. To carry out this objective, we developed a precise methodological process for the knowledge engineer for building various types of medical ontologies. This process is based on the need to express precisely in natural language the meaning of each concept using differential semantics principles. A differential ontology is a hierarchy of concepts and relationships organized according to their similarities and differences. Our main research hypothesis is that natural language processing tools applied to corpora can provide the resources needed to build the ontology. We consider two corpora, one composed of patient discharge summaries and the other a teaching book. We propose to combine two approaches to enrich the ontology building: (i) a method which consists of building terminological resources through distributional analysis and (ii) a method based on the observation of corpus sequences in order to reveal semantic relationships. Our ontology currently includes 1550 concepts, and the software implementing the coding process is still under development. Results show that the proposed approach is operational and indicate that the combination of these methods and the comparison of the resulting terminological structures give interesting clues to a knowledge engineer for the building of an ontology.
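
    The distributional-analysis step can be caricatured as co-occurrence counting, where terms sharing contexts become candidate neighbours in the concept hierarchy; the corpus below is an invented toy, far removed from real discharge summaries.

```python
# Toy sketch of the distributional-analysis idea: count how often term
# pairs co-occur in the same context. Corpus and terms are invented.
from collections import Counter
from itertools import combinations

docs = [
    "chronic obstructive pulmonary disease",
    "obstructive sleep apnea",
    "pulmonary fibrosis disease",
    "sleep apnea syndrome",
]
cooc = Counter()
for doc in docs:
    words = sorted(set(doc.split()))
    for a, b in combinations(words, 2):
        cooc[(a, b)] += 1

# Term pairs appearing together most often suggest semantic proximity.
print(cooc.most_common(5))
```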

  12. Assessing Graduate Attributes: Building a Criteria-Based Competency Model

    ERIC Educational Resources Information Center

    Ipperciel, Donald; ElAtia, Samira

    2014-01-01

    Graduate attributes (GAs) have become a necessary framework of reference for the 21st century competency-based model of higher education. However, the issue of evaluating and assessing GAs still remains unchartered territory. In this article, we present a criteria-based method of assessment that allows for an institution-wide comparison of the…

  13. Conflicts Management Model in School: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The object of this study is to evaluate the reasons for conflicts occurring in school according to perceptions and views of teachers and resolution strategies used for conflicts and to build a model based on the results obtained. In the research, explanatory design including quantitative and qualitative methods has been used. The quantitative part…

  14. Intelligent tutoring systems for systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Meyer, Richard J.; Toland, Joel; Decker, Louis

    1991-01-01

    The general goal is to provide the technology required to build systems that can provide intelligent tutoring in IDEF (Integrated Computer Aided Manufacturing Definition Method) modeling. The following subject areas are covered: intelligent tutoring systems for systems analysis methodologies; IDEF tutor architecture and components; developing cognitive skills for IDEF modeling; experimental software; and PC based prototype.

  15. Demographic Accounting and Model-Building. Education and Development Technical Reports.

    ERIC Educational Resources Information Center

    Stone, Richard

    This report describes and develops a model for coordinating a variety of demographic and social statistics within a single framework. The framework proposed, together with its associated methods of analysis, serves both general and specific functions. The general aim of these functions is to give numerical definition to the pattern of society and…

  16. Knowledge Management in Preserving Ecosystems: The Case of Seoul

    ERIC Educational Resources Information Center

    Lee, Jeongseok

    2009-01-01

    This study explores the utility of employing knowledge management as a framework for understanding how public managers perform ecosystem management. It applies the grounded theory method to build a model. The model is generated by applying the concept of knowledge process to an investigation of how the urban ecosystem is publicly managed by civil…

  17. Building Comprehensive High School Guidance Programs through the Smaller Learning Communities Model

    ERIC Educational Resources Information Center

    Harper, Geralyn

    2013-01-01

    Despite many reform initiatives, including the federally funded initiative titled the Smaller Learning Communities (SLC) Model, many students are still underexposed to comprehensive guidance programs. The purpose of this mixed-method project study was to examine which components in a comprehensive guidance program for the learning academies at a…

  18. 4D modeling in high-rise construction

    NASA Astrophysics Data System (ADS)

    Balakina, Anastasiya; Simankina, Tatyana; Lukinov, Vitaly

    2018-03-01

    High-rise construction is a complex construction process that requires more refined and sophisticated tools for design, planning and construction management. The use of BIM technologies makes it possible to minimize the risks associated with design errors and with errors that occur during construction. This article discusses a visual planning method using a 4D model, which allows the project team to create an accurate and complete construction plan that would be much harder to achieve with traditional planning methods. The use of a 4D model in the construction of a 70-story building made it possible to detect spatial and temporal errors before the start of construction work. Beyond identifying design errors, 4D modeling also made it possible to optimize construction: the operation of cranes, the placement of building structures and materials at the various stages of construction, and the organization of the work itself, as well as to monitor site-preparation activities for compliance with labor protection and safety requirements, all of which saved money and time.

  19. Analytical Tools for Functional Assessment of Architectural Layouts

    NASA Astrophysics Data System (ADS)

    Bąkowski, Jarosław

    2017-10-01

    The functional layout of a building, understood as the layout or set of the facility's rooms (or groups of rooms) together with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the building. Achieving a spatial arrangement that is optimal from the occupants' point of view requires activities that often go beyond the stage of architectural design. Adopted during architectural design, most often through a trial-and-error process or on the basis of previous experience (evidence-based design), the functional layout is subject to continuous evaluation and dynamic change from the beginning of its use. Verification during the occupancy phase makes it possible to plan future transformations and to develop model solutions for use in other settings. In broader terms, the research hypothesis is that the datasets collected about a facility and its utilization can be used to develop methods for assessing the functional layout of buildings; in other words, that an objective method of assessing functional layouts can be developed from a set of building parameters (technical, technological and functional), and that this method yields a set of tools enhancing the design methodology of functionally complex buildings. By linking the design phase with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (reducing the property's impact on the environment), economic (optimizing its cost) and social (implementing a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The "lean" approach means analyzing the existing scheme and, consequently, finding its weak points and the means to eliminate them. This approach, supplemented by reverse engineering, means that essential knowledge about the functioning of the facility is available already in the design phase; knowledge that goes far beyond intuition based on standards and specifications. Within the reverse-engineering methods, the subject of the research is an audit of the product (i.e., the architectural design, especially the built spatial layout) in order to determine exactly how it works. Information gained in this way helps in building a decision-support system for preparing design solutions for future investments, and the functional analysis itself becomes an essential part of setting up the building information process. The data are presented graphically as networks of different factors between rooms. The direct analytical method is to determine functional collisions between users' paths, to find the shortest paths connecting the analyzed rooms and, finally, to identify the optimal location of these rooms (each according to a different factor), as sketched below. The measurement data are supplemented by the results of surveys conducted among hospital users and by statistics on the medical procedures performed in the studied section of the hospital.
    The results of the research are transferred to and integrated with a BIM (building information modelling) system and included in the IFC (Industry Foundation Classes) specifications, especially at the level of information on the relationships between the individual properties associated with elements (in the case of hospitals this may be information about necessary connections with other rooms, access times from or to specific rooms, room utilization conditions, fire safety conditions, and many others). At the level of the BIM specification, the model data are integrated at BIM 6D (an extension of the model data with a range of functional analyses) or even BIM 7D (additional integration with the systems used during the operation and maintenance of the facility).
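
    A minimal sketch of the kind of room-network analysis described above, using the networkx library on a hypothetical room graph; the room names, edge weights and the centrality measure are illustrative assumptions, not the study's actual data or tooling.

```python
# Minimal sketch of room-network analysis: rooms are nodes, edge
# weights stand for walking distance between rooms. Data are hypothetical.
import networkx as nx

rooms = nx.Graph()
rooms.add_weighted_edges_from([
    ("reception", "waiting", 12.0),
    ("waiting", "exam_1", 8.0),
    ("waiting", "exam_2", 10.0),
    ("exam_1", "treatment", 15.0),
    ("exam_2", "treatment", 9.0),
    ("treatment", "recovery", 7.0),
])

# Shortest path between two analyzed rooms (one factor: distance).
path = nx.shortest_path(rooms, "reception", "recovery", weight="weight")
length = nx.shortest_path_length(rooms, "reception", "recovery", weight="weight")
print(path, length)

# Closeness centrality as a simple indicator of an "optimal" location:
# rooms with high closeness minimize average distance to all others.
print(nx.closeness_centrality(rooms, distance="weight"))
```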

  20. New strategy for determination of anthocyanins, polyphenols and antioxidant capacity of Brassica oleracea liquid extract using infrared spectroscopies and multivariate regression

    NASA Astrophysics Data System (ADS)

    de Oliveira, Isadora R. N.; Roque, Jussara V.; Maia, Mariza P.; Stringheta, Paulo C.; Teófilo, Reinaldo F.

    2018-04-01

    A new method was developed to determine the antioxidant properties of red cabbage extract (Brassica oleracea) by mid (MID) and near (NIR) infrared spectroscopies and partial least squares (PLS) regression. A 70% (v/v) ethanolic extract of red cabbage was concentrated to 9° Brix and further diluted (12 to 100%) in water. The dilutions were used as external standards for building the PLS models; this is the first time this strategy has been applied to building multivariate regression models. Reference analyses and spectral data were obtained from the diluted extracts. The properties determined were total and monomeric anthocyanins, total polyphenols, and antioxidant capacity by the ABTS (2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonate)) and DPPH (2,2-diphenyl-1-picrylhydrazyl) methods. Ordered predictors selection (OPS) and a genetic algorithm (GA) were used for feature selection before PLS regression (PLS-1). In addition, PLS-2 regression was applied to all properties simultaneously. PLS-1 produced more predictive models than PLS-2 regression. PLS-OPS and PLS-GA models presented excellent prediction results, with correlation coefficients higher than 0.98. The best models, however, were obtained using PLS with OPS variable selection, and the models based on NIR spectra were the most predictive for all properties. These models therefore provide a simple, rapid and accurate method for determining the antioxidant properties of red cabbage extract, suitable for use in the food industry.
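
    As an illustration of the PLS-1 calibration step, the sketch below fits a PLS model to synthetic "spectra" with scikit-learn; since the OPS and GA selection algorithms are specific to the paper, a simple correlation-based filter stands in for variable selection here.

```python
# Minimal sketch of PLS-1 calibration on spectra. Data are synthetic;
# a correlation filter stands in for the paper's OPS/GA variable selection.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 400
X = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)  # smooth "spectra"
y = X[:, 150] - 0.5 * X[:, 300] + rng.normal(scale=0.1, size=n_samples)

# Crude variable selection: keep the wavelengths best correlated with y.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_wavelengths)])
selected = np.argsort(corr)[-50:]

model = PLSRegression(n_components=5)
y_cv = cross_val_predict(model, X[:, selected], y, cv=10)
print("CV correlation:", np.corrcoef(y, y_cv.ravel())[0, 1])
```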

  1. Protein structure modeling and refinement by global optimization in CASP12.

    PubMed

    Hong, Seung Hwan; Joung, InSuk; Flores-Canales, Jose C; Manavalan, Balachandran; Cheng, Qianyi; Heo, Seungryong; Kim, Jong Yun; Lee, Sun Young; Nam, Mikyung; Joo, Keehyoung; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2018-03-01

    For protein structure modeling in the CASP12 experiment, we have developed a new protocol based on our previous CASP11 approach. The global optimization method of conformational space annealing (CSA) was applied to 3 stages of modeling: multiple sequence-structure alignment, three-dimensional (3D) chain building, and side-chain re-modeling. For better template selection and model selection, we updated our model quality assessment (QA) method with the newly developed SVMQA (support vector machine for quality assessment). For 3D chain building, we updated our energy function by including restraints generated from predicted residue-residue contacts. New energy terms for the predicted secondary structure and predicted solvent accessible surface area were also introduced. For difficult targets, we proposed a new method, LEEab, where the template term played a less significant role than it did in LEE, complemented by increased contributions from other terms such as the predicted contact term. For TBM (template-based modeling) targets, LEE performed better than LEEab, but for FM targets, LEEab was better. For model refinement, we modified our CASP11 molecular dynamics (MD) based protocol by using explicit solvents and tuning down restraint weights. Refinement results from MD simulations that used a new augmented statistical energy term in the force field were quite promising. Finally, when using inaccurate information (such as the predicted contacts), it was important to use the Lorentzian function for which the maximal penalty arising from wrong information is always bounded. © 2017 Wiley Periodicals, Inc.
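
    The point about the bounded Lorentzian penalty can be made concrete with a small comparison; the functional form and weights below are illustrative assumptions, not the authors' actual force-field parameters.

```python
# Illustration of why a Lorentzian restraint bounds the damage from a
# wrong predicted contact: the penalty saturates instead of growing
# without bound. Functional forms and weights are illustrative only.
import numpy as np

def harmonic(d, d0=8.0, w=1.0):
    # Penalty grows without bound as the violation grows.
    return w * (d - d0) ** 2

def lorentzian(d, d0=8.0, w=1.0, s=2.0):
    # Penalty saturates at w, so a wrong restraint can only hurt so much.
    v = (d - d0) ** 2
    return w * v / (s ** 2 + v)

for d in (8.0, 10.0, 20.0, 50.0):
    print(f"d={d:5.1f}  harmonic={harmonic(d):9.1f}  lorentzian={lorentzian(d):5.3f}")
```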

  2. Linear solvation energy relationships in normal phase chromatography based on gradient separations.

    PubMed

    Wu, Di; Lucy, Charles A

    2017-09-22

    By coupling the modified Soczewiński model with a single gradient run, a gradient method was developed to build a linear solvation energy relationship (LSER) for normal phase chromatography. The gradient method was tested on dinitroanilinopropyl (DNAP) and silica columns with hexane/dichloromethane (DCM) mobile phases. LSER models built from the gradient separation agree with those derived from a series of isocratic separations. Both models have similar LSER coefficients and comparable goodness of fit, but the LSER model based on gradient separation required fewer trial-and-error experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
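
    The record does not reproduce the model equation itself; in the Abraham solvation-parameter formulation that LSER studies of this kind typically use, retention is modeled as

```latex
% Standard Abraham solvation-parameter form of an LSER; the record does
% not give the exact equation, so this is the usual formulation.
\log k = c + eE + sS + aA + bB + vV
```

    where E, S, A, B and V are solute descriptors (excess molar refraction, dipolarity/polarizability, hydrogen-bond acidity and basicity, and McGowan volume) and the lowercase coefficients, fitted by multiple linear regression over a set of solutes, characterize the chromatographic system.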

  3. Improved Maximum Parsimony Models for Phylogenetic Networks.

    PubMed

    Van Iersel, Leo; Jones, Mark; Scornavacca, Celine

    2018-05-01

    Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that makes it possible to model biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.

  4. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Application of Model View Definition Attributes

    DTIC Science & Technology

    2013-06-01

    Building information exchange (COBie), Building Information Modeling (BIM). …to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information… …develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the…

  5. An algorithm to estimate building heights from Google street-view imagery using single view metrology across a representational state transfer system

    NASA Astrophysics Data System (ADS)

    Díaz, Elkin; Arguello, Henry

    2016-05-01

    Urban ecosystem studies require monitoring, controlling and planning to analyze building density, urban density, urban planning, atmospheric modeling and land use. In urban planning there are many methods for building height estimation that use optical remote sensing images. These methods, however, depend heavily on sun illumination and cloud-free weather. In contrast, high-resolution synthetic aperture radar provides images independent of daytime and weather conditions, although these images require special hardware and expensive acquisition. Most of the biggest cities around the world have been photographed by Google Street View under different conditions, so thousands of images from the principal streets of a city can be accessed online. The availability of this and similar rich city imagery, such as StreetSide from Microsoft, represents a huge opportunity in computer vision, because these images can be used as input for many applications such as 3D modeling, segmentation, recognition and stereo correspondence. This paper proposes a novel algorithm to estimate building heights from public Google Street View imagery. The objective of this work is to obtain thousands of geo-referenced images from Google Street View through a representational state transfer system and estimate average building heights using single view metrology. Furthermore, the resulting measurements and image metadata are used to derive a layer of heights on a Google map available online. The experimental results show that the proposed algorithm can estimate an accurate average building height map from thousands of Google Street View images of any city.
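
    The single view metrology step can be sketched for the simplest case of a level camera: with the horizon row and the camera height known, a building's height follows from the image rows of its base and top. The formula below is the textbook simplification, not necessarily the paper's exact algorithm, and the numbers are hypothetical.

```python
# Minimal single-view-metrology sketch for a level (no roll/pitch) pinhole
# camera: for a vertical building on the ground plane,
#   H = h_cam * (v_base - v_top) / (v_base - v_horizon)
# with image rows increasing downward. All values are hypothetical.
def building_height(v_horizon, v_base, v_top, camera_height_m):
    base_offset = v_base - v_horizon   # base lies below the horizon line
    extent = v_base - v_top            # apparent building extent in rows
    return camera_height_m * extent / base_offset

# Example: horizon at row 400, base at row 700, top at row 100,
# Street View camera assumed ~2.5 m above the ground.
print(building_height(400, 700, 100, 2.5))  # -> 5.0 m
```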

  6. Compact and Hybrid Feature Description for Building Extraction

    NASA Astrophysics Data System (ADS)

    Li, Z.; Liu, Y.; Hu, Y.; Li, P.; Ding, Y.

    2017-05-01

    Building extraction in aerial orthophotos is crucial for various applications. Deep learning has been shown to address building extraction with high accuracy and high robustness, but training a deep learning classifier requires quite a large number of samples. In order to realize accurate and semi-interactive labelling, the performance of the feature description is crucial, as it has a significant effect on the accuracy of classification. In this paper we bring forward a compact, hybrid feature description method that guarantees desirable classification accuracy for the corners on building roof contours. The proposed descriptor is a hybrid description of an image patch constructed from 4 sets of binary intensity tests. Experiments show that, benefiting from binary description and making full use of the color channels, this descriptor is not only computationally frugal but also more accurate than SURF for building extraction.
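
    A BRIEF-style sketch of binary intensity tests extended across color channels is shown below; the patch size, number of tests and sampling pattern are illustrative assumptions, not the exact descriptor proposed in the paper.

```python
# Minimal sketch of a binary descriptor built from pairwise intensity
# tests across color channels. Pattern and sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
PATCH, N_TESTS = 16, 128
# Each test compares two random pixels in a random color channel.
p1 = rng.integers(0, PATCH, size=(N_TESTS, 2))
p2 = rng.integers(0, PATCH, size=(N_TESTS, 2))
chan = rng.integers(0, 3, size=N_TESTS)

def describe(patch_rgb):
    """patch_rgb: (PATCH, PATCH, 3) array -> binary descriptor of N_TESTS bits."""
    a = patch_rgb[p1[:, 0], p1[:, 1], chan]
    b = patch_rgb[p2[:, 0], p2[:, 1], chan]
    return (a < b).astype(np.uint8)

def hamming(d1, d2):
    # Matching cost between two descriptors: number of differing bits.
    return int(np.count_nonzero(d1 != d2))

patch = rng.integers(0, 256, size=(PATCH, PATCH, 3))
print(describe(patch)[:16], hamming(describe(patch), describe(patch)))
```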

  7. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution, and several researchers are making headway on this problem. However, the inability to easily determine the magnitude of a building's effective thermal mass, and how the heating, ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems that utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
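
    The adaptation can be sketched on a lumped thermal network: for C dT/dt = -K T, the modes are the eigenvectors of C^-1 K and the eigenvalues are inverse time constants. The three-mass network below is a hypothetical illustration, not the paper's building model.

```python
# Minimal modal-analysis sketch for a lumped thermal model C dT/dt = -K T.
# Modes are eigenvectors of C^-1 K; eigenvalues are inverse time constants.
import numpy as np

C = np.diag([5.0e6, 2.0e7, 1.0e8])   # J/K: e.g. air, furniture, structure
# Symmetric conductance matrix (W/K); positive row sums model losses
# to ambient, so every mode has a finite time constant.
K = np.array([[350.0, -200.0, -100.0],
              [-200.0, 270.0, -50.0],
              [-100.0, -50.0, 180.0]])

evals, evecs = np.linalg.eig(np.linalg.solve(C, K))
for i in np.argsort(evals):
    tau_hours = 1.0 / evals[i] / 3600.0
    print(f"mode time constant: {tau_hours:8.1f} h, shape: {evecs[:, i].round(2)}")
```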

  8. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  9. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505.1...

  10. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  11. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  12. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  13. Method of Testing and Predicting Failures of Electronic Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, Frances A.

    1996-01-01

    A method is disclosed that employs a knowledge base of human expertise derived from reliability model analysis and implements it in diagnostic routines. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.

  14. [Study of building quantitative analysis model for chlorophyll in winter wheat with reflective spectrum using MSC-ANN algorithm].

    PubMed

    Liang, Xue; Ji, Hai-yan; Wang, Peng-xin; Rao, Zhen-hong; Shen, Bing-hui

    2010-01-01

    The multiplicative scatter correction (MSC) preprocessing method was used to effectively reject the noise produced in the original spectra by environmental physical factors; the principal components of the near-infrared spectra were then calculated by nonlinear iterative partial least squares (NIPALS) before building the back-propagation artificial neural network (BP-ANN) model, with the number of principal components chosen by cross-validation. The calculated principal components were used as the inputs of the artificial neural network model, which was used to find the relation between chlorophyll in winter wheat and the reflective spectrum and thus to predict the chlorophyll content of winter wheat. The correlation coefficient (r) of the calibration set was 0.9604, while the standard deviation (SD) and relative standard deviation (RSD) were 0.187 and 5.18%, respectively. The correlation coefficient (r) of the prediction set was 0.9600, and the standard deviation (SD) and relative standard deviation (RSD) were 0.145 and 4.21%, respectively. These results indicate that the MSC-ANN algorithm can effectively reject noise produced in the original spectra by environmental physical factors and establish an accurate model for predicting the chlorophyll content of living leaves, able to replace the classical method and meet the need for fast analysis of agricultural products.
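
    MSC itself is simple enough to sketch: each spectrum is regressed against the mean spectrum, and the fitted offset and gain are removed. The data below are synthetic; only the correction step follows the standard MSC definition.

```python
# Minimal sketch of multiplicative scatter correction (MSC): each spectrum
# is regressed against the mean spectrum and corrected as
# (x - intercept) / slope. Data here are synthetic.
import numpy as np

def msc(spectra):
    """spectra: (n_samples, n_wavelengths). Returns MSC-corrected spectra."""
    reference = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        slope, intercept = np.polyfit(reference, x, deg=1)
        corrected[i] = (x - intercept) / slope
    return corrected

rng = np.random.default_rng(1)
base = np.sin(np.linspace(0, 3, 200))
# Simulated scatter: multiplicative gain and additive offset per sample.
spectra = rng.uniform(0.8, 1.2, (10, 1)) * base + rng.uniform(-0.1, 0.1, (10, 1))
print(np.std(msc(spectra), axis=0).max())  # near zero after correction
```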

  15. Digital Microdroplet Ejection Technology-Based Heterogeneous Objects Prototyping

    PubMed Central

    Li, Na; Yang, Jiquan; Feng, Chunmei; Yang, Jianfei; Zhu, Liya; Guo, Aiqing

    2016-01-01

    An integrated fabrication framework is presented for building heterogeneous objects (HEO) using digital microdroplet jetting technology and rapid prototyping. The heterogeneous-materials part design and manufacturing method, covering both structure and material, was used in place of the traditional process. The net node method was used for digital modeling, which can configure multiple materials in time, and the relationship between material, color, and jetting nozzle was established. The main contribution is to combine structure, material, and visualization in one process and to give a digital model for manufacture. From the given model, it is concluded that the method is effective for HEO: using microdroplet rapid prototyping and the model given in the paper, HEO can be fabricated. The model could be used in 3D biomanufacturing. PMID:26981110

  16. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    PubMed

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied when building the prediction models to deal with the class-imbalanced dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflect the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between any pair of approaches. Only small changes in the AUCs occurred after applying undersampling to build the models. We developed a virtual health check-up that predicts the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability, and undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
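
    The comparison described above can be sketched with scikit-learn on synthetic stand-in data; the undersampling here is a plain random majority-class reduction, which may differ from the authors' exact procedure.

```python
# Minimal sketch: GBDT, RF and LR trained on an imbalanced dataset with
# random undersampling, compared by ROC AUC. Data are synthetic stand-ins.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=20, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Random undersampling of the majority class in the training set.
rng = np.random.default_rng(0)
pos = np.flatnonzero(y_tr == 1)
neg = rng.choice(np.flatnonzero(y_tr == 0), size=len(pos), replace=False)
idx = np.concatenate([pos, neg])

for name, clf in [("GBDT", GradientBoostingClassifier()),
                  ("RF", RandomForestClassifier()),
                  ("LR", LogisticRegression(max_iter=1000))]:
    clf.fit(X_tr[idx], y_tr[idx])
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```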

  17. Study of City Landscape Heritage Using Lidar Data and 3D-City Models

    NASA Astrophysics Data System (ADS)

    Rubinowicz, P.; Czynska, K.

    2015-04-01

    In contemporary town planning, protection of the urban landscape is a significant issue, especially in cities whose urban structures are the result of ages of evolution and the layering of historical development. Specific panoramas and other strategic views with historic city dominants can be an important part of the cultural heritage and genius loci. On the other hand, protecting such views introduces limitations on future city development. Digital Earth observation techniques create new possibilities for more accurate urban studies, monitoring of urbanization processes and measuring of city landscape parameters. The paper examines possibilities of applying Lidar data and digital 3D-city models for: a) evaluation of strategic city views, b) mapping landscape absorption limits, and c) determining protection zones where urbanization and building heights should be limited. To this end, the paper introduces a method of computational analysis of the city landscape called Visual Protection Surface (VPS). The method emulates a virtual surface above the city that protects selected strategic views: the surface defines the maximum height of buildings in such a way that no new building is visible in any of the selected views. The research also includes analyses of the quality of the simulations according to the form and precision of the input data: airborne Lidar/DSM models and more advanced 3D-city models (including the semantics of the geometry, as in the CityGML format). The outcome can support professional planning of tall-building development. The VPS method has been implemented in a computer program developed by the authors (C++). Simulations were carried out on the example of the city of Dresden.
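
    The core geometric rule behind a VPS-like surface can be sketched as line-of-sight interpolation: along the ray from a viewpoint to a protected dominant, the permissible height at distance d is bounded by the line from eye level to the dominant's top, and the surface is the minimum over all protected views. The geometry below is hypothetical, and the authors' C++ implementation is certainly more elaborate.

```python
# Minimal sketch of a sight-line height limit for one protected view.
# Along the ray from the viewpoint to the dominant, any new building
# must stay below the line from eye level to the dominant's top.
import numpy as np

def vps_height_limit(d, view_eye_h, dominant_h, dominant_dist):
    """Max height at distance d from the viewpoint (0 <= d <= dominant_dist)
    so that the dominant's silhouette stays unobstructed."""
    return view_eye_h + (dominant_h - view_eye_h) * d / dominant_dist

# One protected view: eye at 2 m, dominant 90 m tall, 3 km away.
distances = np.linspace(100, 3000, 6)
limits = vps_height_limit(distances, 2.0, 90.0, 3000.0)
for d, h in zip(distances, limits):
    print(f"{d:6.0f} m from viewpoint -> max building height {h:5.1f} m")
# With several protected views, the VPS at a cell is the minimum of
# the per-view limits.
```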

  18. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard

    PubMed Central

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Zwart, Peter H.; Hung, Li-Wei; Read, Randy J.; Adams, Paul D.

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution. PMID:18094468

  19. An Innovative Time-Cost-Quality Tradeoff Modeling of Building Construction Project Based on Resource Allocation

    PubMed Central

    2014-01-01

    Time, quality, and cost are three important but contradictory objectives in a building construction project, and optimizing them is a tough challenge for project managers because they are interdependent parameters measured on different scales. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model follows the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is finally generated based on the correlations between construction activities. A genetic algorithm is applied in the model to solve the comprehensive nonlinear time-cost-quality problem. The building of a three-storey house serves as an example to illustrate the implementation of the model, demonstrate its advantages in trading off construction time, cost, and quality, and help make winning decisions in construction practice. The computed time-cost-quality curves from the case study support traditional cost-time assumptions and demonstrate the soundness of the trade-off model. PMID:24672351
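
    A toy version of the genetic-algorithm step can be sketched as follows: each activity has alternative resource modes with different time, cost and quality, a chromosome selects one mode per activity, and fitness is a weighted sum of normalized objectives. The activities, weights and GA settings below are hypothetical, and activities are treated as serial for simplicity.

```python
# Minimal GA sketch for a time-cost-quality trade-off. A chromosome picks
# one resource mode per activity; fitness combines normalized objectives.
import random

random.seed(0)
# (time in days, cost in k$, quality 0-1) options per activity; hypothetical.
ACTIVITIES = [
    [(10, 50, 0.90), (8, 65, 0.85), (12, 40, 0.95)],
    [(20, 120, 0.92), (15, 160, 0.88)],
    [(5, 30, 0.80), (4, 45, 0.90), (6, 25, 0.85)],
]
W_TIME, W_COST, W_QUAL = 0.4, 0.4, 0.2

def fitness(chromosome):
    modes = [acts[g] for acts, g in zip(ACTIVITIES, chromosome)]
    t = sum(m[0] for m in modes)   # activities treated as serial
    c = sum(m[1] for m in modes)
    q = sum(m[2] for m in modes) / len(modes)
    # 38 and 270 are the maximum possible serial time and cost here.
    return -(W_TIME * t / 38 + W_COST * c / 270) + W_QUAL * q

def evolve(pop_size=30, generations=50):
    pop = [[random.randrange(len(a)) for a in ACTIVITIES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ACTIVITIES))  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(len(ACTIVITIES))       # point mutation
            child[i] = random.randrange(len(ACTIVITIES[i]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best modes:", best, "fitness:", round(fitness(best), 3))
```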
