Building Energy Modeling and Control Methods for Optimization and Renewables Integration
NASA Astrophysics Data System (ADS)
Burger, Eric M.
This dissertation presents techniques for the numerical modeling and control of building systems, with an emphasis on thermostatically controlled loads. The primary objective of this work is to address technical challenges related to the management of energy use in commercial and residential buildings. This work is motivated by the need to enhance the performance of building systems and by the potential for aggregated loads to perform load following and regulation ancillary services, thereby enabling the further adoption of intermittent renewable energy generation technologies. To increase the generalizability of the techniques, an emphasis is placed on recursive and adaptive methods which minimize the need for customization to specific buildings and applications. The techniques presented in this dissertation can be divided into two general categories: modeling and control. Modeling techniques encompass the processing of data streams from sensors and the training of numerical models. These models enable us to predict the energy use of a building and of sub-systems, such as a heating, ventilation, and air conditioning (HVAC) unit. Specifically, we first present an ensemble learning method for the short-term forecasting of total electricity demand in buildings. As the deployment of intermittent renewable energy resources continues to rise, the generation of accurate building-level electricity demand forecasts will be valuable to both grid operators and building energy management systems. Second, we present a recursive parameter estimation technique for identifying a thermostatically controlled load (TCL) model that is non-linear in the parameters. For TCLs to perform demand response services in real-time markets, online methods for parameter estimation are needed. Third, we develop a piecewise linear thermal model of a residential building and train the model using data collected from a custom-built thermostat. This model is capable of approximating unmodeled dynamics within a building by learning from sensor data. Control techniques encompass the application of optimal control theory, model predictive control, and convex distributed optimization to TCLs. First, we present the alternative control trajectory (ACT) representation, a novel method for the approximate optimization of non-convex discrete systems. This approach enables the optimal control of a population of non-convex agents using distributed convex optimization techniques. Second, we present a distributed convex optimization algorithm for the control of a TCL population. Experimental results demonstrate the application of this algorithm to the problem of renewable energy generation following. This dissertation contributes to the development of intelligent energy management systems for buildings by presenting a suite of novel and adaptable modeling and control techniques. Applications focus on optimizing the performance of building operations and on facilitating the integration of renewable energy resources.
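The hysteretic on/off dynamics that make TCL populations non-convex can be illustrated with a minimal first-order model. The sketch below is a common textbook-style TCL formulation, not the dissertation's model (which is non-linear in the parameters); all parameter names and values are illustrative.

```python
import numpy as np

# Minimal sketch of a first-order thermostatically controlled load (TCL):
# T[k+1] = a*T[k] + (1-a)*(T_out - m[k]*R*P), with a hysteretic on/off
# thermostat. Parameters (R, C, P, deadband) are illustrative only.

def simulate_tcl(T0, T_out, R=2.0, C=10.0, P=14.0, setpoint=20.0,
                 deadband=0.5, dt=1.0 / 60, steps=1440):
    """Simulate indoor temperature and cooling state over `steps` intervals."""
    a = np.exp(-dt / (R * C))        # discrete-time thermal decay factor
    T, m = T0, 0                     # temperature (C) and on/off state
    temps, states = [], []
    for _ in range(steps):
        T = a * T + (1 - a) * (T_out - m * R * P)
        if T > setpoint + deadband:  # too warm -> compressor on
            m = 1
        elif T < setpoint - deadband:  # too cool -> compressor off
            m = 0
        temps.append(T)
        states.append(m)
    return np.array(temps), np.array(states)

temps, states = simulate_tcl(T0=21.0, T_out=32.0)
print(f"duty cycle: {states.mean():.2f}, mean temp: {temps.mean():.1f} C")
```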
NASA Technical Reports Server (NTRS)
Peabody, Hume L.
2017-01-01
This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo ray tracing for radiation exchange; lumped parameter, finite difference for the thermal solution) used by the aerospace industry. This is not intended to be a "How to Use Thermal Desktop" course. It is intended to be a "How to Build Thermal Models" course, and the techniques will be demonstrated using the capabilities of Thermal Desktop (TD). Other codes may or may not have similar capabilities. The general model building process can be broken into four top-level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.
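The "Execute Model" step ultimately reduces to advancing a lumped-parameter, finite-difference network in time. As a hedged illustration of that numerical core (not Thermal Desktop's actual solver), the sketch below steps a three-node conductance network with explicit forward Euler; all capacitances, conductances, and loads are hypothetical.

```python
import numpy as np

# Illustrative lumped-parameter / finite-difference thermal solver sketch:
# dT_i/dt = (sum_j G_ij*(T_j - T_i) + Q_i) / C_i, integrated explicitly.

C = np.array([500.0, 800.0, 300.0])      # nodal capacitances (J/K)
G = np.array([[0.0, 2.0, 0.0],           # conductances G[i][j] (W/K)
              [2.0, 0.0, 1.5],
              [0.0, 1.5, 0.0]])
Q = np.array([50.0, 0.0, -20.0])         # applied heat loads (W)
T = np.array([290.0, 295.0, 300.0])      # initial temperatures (K)

dt = 1.0                                 # time step (s), within stability limit
for step in range(3600):                 # one hour of transient solution
    # net heat flow into each node from coupled nodes, plus sources
    dTdt = (G @ T - G.sum(axis=1) * T + Q) / C
    T = T + dt * dTdt

print("temperatures after 1 h:", np.round(T, 2))
```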
The Building Blocks of Geology.
ERIC Educational Resources Information Center
Gibson, Betty O.
2001-01-01
Discusses techniques for teaching about rocks, minerals, and the differences between them. Presents a model-building activity that uses plastic building blocks to build crystal and rock models. (YDS)
A Learning Framework for Control-Oriented Modeling of Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.
Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework, that can drive several use cases related to building energy management.
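As a rough illustration of the class of model the paper advocates (not the authors' architecture), the sketch below trains a small recurrent network to map a window of past sensor readings to the next consumption value, using synthetic data and the widely used tf.keras API.

```python
import numpy as np
import tensorflow as tf

# Minimal RNN sketch for building energy prediction: a 24-step window of
# 4 sensor channels predicts the next consumption value. Data is synthetic.

lookback, n_features = 24, 4
X = np.random.rand(1000, lookback, n_features).astype("float32")
y = X[:, -1, 0] * 0.8 + 0.1              # toy target for demonstration

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(lookback, n_features)),
    tf.keras.layers.Dense(1),             # predicted energy consumption
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("test prediction:", model.predict(X[:1], verbose=0).ravel())
```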
NASA Astrophysics Data System (ADS)
Rüther, Heinz; Martine, Hagai M.; Mtalo, E. G.
This paper presents a novel approach to semiautomatic building extraction in informal settlement areas from aerial photographs. The proposed approach uses a strategy of delineating buildings by optimising their approximate building contour position. Approximate building contours are derived automatically by locating elevation blobs in digital surface models. Building extraction is then effected by means of the snakes algorithm and the dynamic programming optimisation technique. With dynamic programming, the building contour optimisation problem is realized through a discrete multistage process and solved by the "time-delayed" algorithm, as developed in this work. The proposed building extraction approach is a semiautomatic process, with user-controlled operations linking fully automated subprocesses. Inputs into the proposed building extraction system are ortho-images and digital surface models, the latter being generated through image matching techniques. Buildings are modeled as "lumps" or elevation blobs in digital surface models, which are derived by altimetric thresholding of digital surface models. Initial windows for building extraction are provided by projecting the elevation blobs' centre points onto an ortho-image. In the next step, approximate building contours are extracted from the ortho-image by region growing constrained by edges. Approximate building contours thus derived are inputs into the dynamic programming optimisation process in which final building contours are established. The proposed system is tested in two study areas: Marconi Beam in Cape Town, South Africa, and Manzese in Dar es Salaam, Tanzania. Sixty percent of the buildings in the study areas have been extracted and verified, and it is concluded that the proposed approach contributes meaningfully to the extraction of buildings in moderately complex and crowded informal settlement areas.
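The dynamic-programming stage can be pictured with a simplified discrete multistage formulation. The sketch below is a generic Viterbi-style contour optimisation in the spirit of DP snakes, not the paper's "time-delayed" algorithm: each contour vertex picks one of several candidate positions, trading image energy against a smoothness penalty.

```python
import numpy as np

# Simplified dynamic-programming contour optimisation: each vertex chooses
# among candidate offsets; DP minimises image energy plus a smoothness
# penalty between consecutive vertices. Energies here are random toys.

def dp_contour(image_energy, smooth_weight=1.0):
    """image_energy: (n_vertices, n_candidates) cost of each candidate."""
    n, m = image_energy.shape
    cost = image_energy[0].copy()         # best cost ending in candidate j
    back = np.zeros((n, m), dtype=int)    # backpointers for reconstruction
    offsets = np.arange(m)
    for i in range(1, n):
        # transition cost penalises jumps between neighbouring vertices
        trans = smooth_weight * np.abs(offsets[:, None] - offsets[None, :])
        total = cost[:, None] + trans     # total[j_prev, j]
        back[i] = total.argmin(axis=0)
        cost = total.min(axis=0) + image_energy[i]
    path = [int(cost.argmin())]
    for i in range(n - 1, 0, -1):
        path.append(int(back[i, path[-1]]))
    return path[::-1], float(cost.min())

E = np.random.rand(20, 7)                 # 20 vertices, 7 candidates each
path, total_cost = dp_contour(E)
print("optimal candidate per vertex:", path)
```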
Geospatial database for heritage building conservation
NASA Astrophysics Data System (ADS)
Basir, W. N. F. W. A.; Setan, H.; Majid, Z.; Chong, A.
2014-02-01
Heritage buildings are icons from the past that exist in the present time. Through heritage architecture, we can learn about the economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disasters, uncertain weather, pollution and other factors. In order to preserve this heritage for future generations, the recording and documenting of heritage buildings are required. With the development of information systems and data collection techniques, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D. They can provide a better platform for communication and understanding of heritage buildings. Combining 3D modelling with Geographic Information System (GIS) technology will create a database that supports various analyses of spatial data in the form of a 3D model. The objectives of this research are to determine the reliability of the Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage buildings and to develop a geospatial database for heritage building conservation purposes. The results from data acquisition will become a guideline for 3D model development. This 3D model will be exported to GIS format in order to develop a database for heritage building conservation. In this database, the requirements for the heritage building conservation process are included. Through this research, a proper database for storing and documenting heritage building conservation data will be developed.
Toward a virtual building laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klems, J.H.; Finlayson, E.U.; Olsen, T.H.
1999-03-01
In order to achieve in a timely manner the large energy and dollar savings technically possible through improvements in building energy efficiency, it will be necessary to solve the problem of design failure risk. The most economical method of doing this would be to learn to calculate building performance with sufficient detail, accuracy and reliability to avoid design failure. Existing building simulation models (BSMs) are a large step in this direction, but are still not capable of this level of modeling. Developments in computational fluid dynamics (CFD) techniques now allow one to construct a road map from present BSMs to a complete building physical model. The most useful first step is a building interior model (BIM) that would allow prediction of local conditions affecting occupant health and comfort. To provide reliable prediction, a BIM must incorporate the correct physical boundary conditions on a building interior. Doing so raises a number of specific technical problems and research questions. The solution of these within a context useful for building research and design is not likely to result from other research on CFD, which is directed toward the solution of different types of problems. A six-step plan for incorporating the correct boundary conditions within the context of the model problem of a large atrium has been outlined. A promising strategy for constructing a BIM is the overset grid technique for representing a building space in a CFD calculation. This technique promises to adapt well to building design and allows a step-by-step approach. A state-of-the-art CFD computer code using this technique has been adapted to the problem and can form the departure point for this research.
A Method to Test Model Calibration Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ron; Polly, Ben; Neymark, Joel
This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique: 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
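A compact way to picture the three figures of merit is as three error measures computed against the simulator's surrogate "truth". The sketch below is an illustrative formulation (the function name, inputs, and numbers are invented; CV(RMSE) is used here as one common goodness-of-fit statistic):

```python
import numpy as np

# Sketch of the three figures of merit for testing a calibration technique
# against simulator-generated surrogate data. All values are hypothetical.

def figures_of_merit(true_savings, pred_savings,
                     true_params, calib_params,
                     utility_bills, model_bills):
    # 1) accuracy of the post-retrofit energy savings prediction
    savings_error = abs(pred_savings - true_savings) / true_savings
    # 2) closure on the "true" input parameter values
    param_closure = (np.linalg.norm(calib_params - true_params)
                     / np.linalg.norm(true_params))
    # 3) goodness of fit to the (surrogate) utility bill data: CV(RMSE)
    rmse = np.sqrt(np.mean((utility_bills - model_bills) ** 2))
    cv_rmse = rmse / np.mean(utility_bills)
    return savings_error, param_closure, cv_rmse

fom = figures_of_merit(
    true_savings=1200.0, pred_savings=1050.0,
    true_params=np.array([0.5, 12.0, 3.1]),
    calib_params=np.array([0.6, 11.2, 3.4]),
    utility_bills=np.array([900.0, 950.0, 870.0, 1020.0]),
    model_bills=np.array([880.0, 980.0, 900.0, 1000.0]),
)
print("savings error, parameter closure, CV(RMSE):", fom)
```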
A Method to Test Model Calibration Techniques: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ron; Polly, Ben; Neymark, Joel
This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique: 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
Combined Use of Terrestrial Laser Scanning and IR Thermography Applied to a Historical Building
Costanzo, Antonio; Minasi, Mario; Casula, Giuseppe; Musacchio, Massimo; Buongiorno, Maria Fabrizia
2015-01-01
The conservation of architectural heritage usually requires a multidisciplinary approach involving a variety of specialist expertise and techniques. Nevertheless, destructive techniques should be avoided wherever possible in order to preserve the integrity of historical buildings; the development of non-destructive and non-contact techniques is therefore extremely important. In this framework, a methodology combining terrestrial laser scanning and infrared thermal imaging is proposed in order to obtain a reconnaissance of the conservation state of a historical building. The case study is the St. Augustine Monumental Compound, located in the historical centre of the town of Cosenza (Calabria, South Italy). Adopting the proposed methodology, the paper illustrates the main results obtained for the test building by overlaying and comparing the data collected with both techniques, in order to outline their capability both to detect anomalies and to improve knowledge of the health state of the masonry building. The 3D model also provides a reference model, laying the groundwork for the implementation of a multisensor monitoring system based on the use of non-destructive techniques. PMID:25609042
Combined use of terrestrial laser scanning and IR thermography applied to a historical building.
Costanzo, Antonio; Minasi, Mario; Casula, Giuseppe; Musacchio, Massimo; Buongiorno, Maria Fabrizia
2014-12-24
The conservation of architectural heritage usually requires a multidisciplinary approach involving a variety of specialist expertise and techniques. Nevertheless, destructive techniques should be avoided wherever possible in order to preserve the integrity of historical buildings; the development of non-destructive and non-contact techniques is therefore extremely important. In this framework, a methodology combining terrestrial laser scanning and infrared thermal imaging is proposed in order to obtain a reconnaissance of the conservation state of a historical building. The case study is the St. Augustine Monumental Compound, located in the historical centre of the town of Cosenza (Calabria, South Italy). Adopting the proposed methodology, the paper illustrates the main results obtained for the test building by overlaying and comparing the data collected with both techniques, in order to outline their capability both to detect anomalies and to improve knowledge of the health state of the masonry building. The 3D model also provides a reference model, laying the groundwork for the implementation of a multisensor monitoring system based on the use of non-destructive techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This appendix summarizes building characteristics used to determine heating and cooling loads for each of the five building types in each of the four regions. For the selected five buildings, the following data are attached: new and existing construction characteristics; new and existing construction thermal resistance; floor plan and elevation; people load schedule; lighting load schedule; appliance load schedule; ventilation schedule; and hot water use schedule. For the five building types (single family, apartment buildings, commercial buildings, office buildings, and schools), data are compiled in 10 appendices. These are Building Characteristics; Alternate Energy Sources and Energy Conservation Techniques Description, Costs, Fuel Price Scenarios; Life Cycle Cost Model; Simulation Models; Solar Heating/Cooling System; Condensed Weather; Single and Multi-Family Dwelling Characteristics and Energy Conservation Techniques; Mixed Strategies for Energy Conservation and Alternative Energy Utilization in Buildings. An extensive bibliography is given in the final appendix. (MCW)
NASA Astrophysics Data System (ADS)
Ergun, Bahadir
2007-07-01
Today, terrestrial laser scanning is a frequently used methodology for the documentation of historical buildings and cultural heritage. The study area, the Historical Peninsula region of Istanbul, is the site of the documented historical buildings and covers approximately 1500 ha. Terrestrial laser scanning and close range image photogrammetry techniques are integrated with each other to create a 3D urban model of Istanbul including the most important landmarks and the buildings reflecting the most brilliant eras of the Byzantine and Ottoman Empires.
NASA Astrophysics Data System (ADS)
Kaskhedikar, Apoorva Prakash
According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology is that it allows for identifying important categorical variables, and then incorporating them in a local, as against a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in the practical implementation of benchmarking tools which rely on query-based building and HVAC variable filters specified by the user.
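The variable-ranking step can be sketched with scikit-learn's random forest: fit EUI against building parameters and inspect feature importances. The data below is synthetic, and the field names merely echo the CBECS variables mentioned in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of ranking building parameters by influence on EUI with a
# random forest. Synthetic stand-in data; not the CBECS database.

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(500, 50000, n),     # floor area
    rng.integers(1, 500, n),        # number of workers
    rng.integers(1, 400, n),        # number of PCs
    rng.integers(0, 5, n),          # main cooling equipment (encoded)
])
eui = 20 + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 5, n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, eui)
names = ["floor_area", "workers", "pcs", "cooling_equipment"]
for name, imp in sorted(zip(names, forest.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name:18s} importance {imp:.3f}")
```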
Development of an automated energy audit protocol for office buildings
NASA Astrophysics Data System (ADS)
Deb, Chirag
This study aims to enhance the building energy audit process and bring about a reduction in the time and cost required to conduct a full physical audit. For this, a total of five Energy Service Companies in Singapore collaborated and provided energy audit reports for 62 office buildings. Several statistical techniques are adopted to analyse these reports. These techniques comprise cluster analysis and the development of prediction models to predict energy savings for buildings. The cluster analysis shows that there are 3 clusters of buildings experiencing different levels of energy savings. To understand the effect of building variables on the change in EUI, a robust iterative process for selecting the appropriate variables is developed. The results show that the 4 variables of GFA, non-air-conditioning energy consumption, average chiller plant efficiency and installed capacity of chillers should be taken for clustering. This analysis is extended to the development of prediction models using linear regression and artificial neural networks (ANN). An exhaustive variable selection algorithm is developed to select the input variables for the two energy saving prediction models. The results show that the ANN prediction model can predict the energy saving potential of a given building with an accuracy of +/-14.8%.
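A minimal sketch of the two analysis stages, assuming scikit-learn and synthetic stand-in data (the four variable names follow the abstract; the numbers do not come from the study):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Stage 1: cluster buildings on the four selected variables.
# Stage 2: fit an ANN to predict energy savings. Data is synthetic.

rng = np.random.default_rng(1)
n = 62                                   # number of audited buildings
X = np.column_stack([
    rng.uniform(1e4, 1e5, n),            # GFA
    rng.uniform(0.1, 0.5, n),            # non-AC energy fraction
    rng.uniform(0.5, 0.9, n),            # average chiller plant efficiency
    rng.uniform(500, 5000, n),           # installed chiller capacity
])
savings = 5 + 20 * X[:, 2] + rng.normal(0, 2, n)   # toy savings (%)

Xs = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(Xs)
print("buildings per cluster:", np.bincount(labels))

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=1).fit(Xs, savings)
print("predicted saving for first building: %.1f%%" % ann.predict(Xs[:1])[0])
```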
Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...
Cruz, Antonio M; Barr, Cameron; Puñales-Pozo, Elsa
2008-01-01
This research's main goals were to build a predictor for a turnaround time (TAT) indicator for estimating its values and to use a numerical clustering technique for finding possible causes of undesirable TAT values. The following stages were used: domain understanding, data characterisation and sample reduction, and insight characterisation. Multiple linear regression and clustering techniques were used to build the TAT indicator predictor and to improve corrective maintenance task efficiency in a clinical engineering department (CED). The indicator being studied was turnaround time (TAT). Multiple linear regression was used for building a predictive TAT value model. The variables contributing to such a model were clinical engineering department response time (CE(rt), 0.415 positive coefficient), stock service response time (Stock(rt), 0.734 positive coefficient), priority level (0.21 positive coefficient) and service time (0.06 positive coefficient). The regression process showed heavy reliance on Stock(rt), CE(rt) and priority, in that order. Clustering techniques revealed the main causes of high TAT values. This examination has provided a means for analysing current technical service quality and effectiveness. In doing so, it has demonstrated a process for identifying areas and methods of improvement and a model against which to analyse these methods' effectiveness.
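The TAT predictor is a multiple linear regression, so it can be sketched directly. Below, synthetic data is generated using the coefficient magnitudes reported in the abstract and the regression is refit to recover them; everything else is illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch of the TAT predictor: multiple linear regression of turnaround
# time on the four variables named in the abstract. Synthetic data; the
# reported coefficients (0.415, 0.734, 0.21, 0.06) only seed the toy data.

rng = np.random.default_rng(2)
n = 200
ce_rt = rng.uniform(0, 48, n)           # CED response time (h)
stock_rt = rng.uniform(0, 72, n)        # stock service response time (h)
priority = rng.integers(1, 4, n)        # priority level
service = rng.uniform(0, 24, n)         # service time (h)

tat = (0.415 * ce_rt + 0.734 * stock_rt + 0.21 * priority
       + 0.06 * service + rng.normal(0, 1, n))

X = np.column_stack([ce_rt, stock_rt, priority, service])
model = LinearRegression().fit(X, tat)
print("recovered coefficients:", np.round(model.coef_, 3))
```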
Building energy modeling for green architecture and intelligent dashboard applications
NASA Astrophysics Data System (ADS)
DeBlois, Justin
Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied a passive architectural technique, the roof solar chimney, for reducing the cooling load in homes. Three models of the chimney were created: a zonal building energy model, a computational fluid dynamics model, and a numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it for building simulation reliably. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high and low efficiency constructions, and three ventilation strategies. The results showed that in four US climates, the roof solar chimney results in significant cooling load energy savings of up to 90%. After developing this new method for the small scale representation of a passive architecture technique in BEM, the study expanded the scope to address a fundamental issue in modeling - the implementation of the uncertainty from and improvement of occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration into the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the representation of unpredictable occupancy patterns on model results. Combined, these studies inform modelers and researchers on frameworks for simulating holistically designed architecture and improving the interaction between models and building occupants, in residential and commercial settings.
Style grammars for interactive visualization of architecture.
Aliaga, Daniel G; Rosen, Paul A; Bekins, Daniel R
2007-01-01
Interactive visualization of architecture provides a way to quickly visualize existing or novel buildings and structures. Such applications require both fast rendering and an effortless input regimen for creating and changing architecture using high-level editing operations that automatically fill in the necessary details. Procedural modeling and synthesis is a powerful paradigm that yields high data amplification and can be coupled with fast-rendering techniques to quickly generate plausible details of a scene without much or any user interaction. Previously, forward generating procedural methods have been proposed where a procedure is explicitly created to generate particular content. In this paper, we present our work in inverse procedural modeling of buildings and describe how to use an extracted repertoire of building grammars to facilitate the visualization and quick modification of architectural structures and buildings. We demonstrate an interactive application where the user draws simple building blocks and, using our system, can automatically complete the building "in the style of" other buildings using view-dependent texture mapping or nonphotorealistic rendering techniques. Our system supports an arbitrary number of building grammars created from user subdivided building models and captured photographs. Using only edit, copy, and paste metaphors, entire building styles can be altered and transferred from one building to another in a few operations, enhancing the ability to modify an existing architectural structure or to visualize a novel building in the style of the others.
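Forward grammar expansion, the building block that the inverse procedure recovers, can be sketched in a few lines. The rewrite rules below are invented toy rules, not a grammar extracted from real buildings.

```python
import random

# Toy forward grammar expansion in the spirit of procedural building
# modelling: rewrite rules split a facade into floors and tiles.

RULES = {
    "facade": [["ground_floor", "upper_floors", "roof"]],
    "upper_floors": [["floor"], ["floor", "upper_floors"]],
    "floor": [["window", "window", "window"],
              ["window", "balcony", "window"]],
}

def expand(symbol, depth=0, max_depth=6):
    """Recursively rewrite a symbol until only terminals remain."""
    if symbol not in RULES or depth >= max_depth:
        return [symbol]                   # terminal element
    production = random.choice(RULES[symbol])
    out = []
    for s in production:
        out.extend(expand(s, depth + 1, max_depth))
    return out

random.seed(4)
print(expand("facade"))
```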
Research on BIM-based building information value chain reengineering
NASA Astrophysics Data System (ADS)
Hui, Zhao; Weishuang, Xie
2017-04-01
Value and value-added factors in building engineering information are achieved through a chain flow, that is, the building information value chain. Based on a deconstruction of the information chain for construction information in the traditional information mode, this paper clarifies the value characteristics and requirements of each stage of a construction project. In order to achieve building information value-added, the paper deconstructs the traditional building information value chain and reengineers the information value chain model on the basis of BIM theory and techniques, in order to build a value-added management model and analyse the value of the model.
Noah joined NREL in 2017 after having worked as a consulting building energy analyst. His work aims to smooth the integration of building energy modeling into the building design process. Noah applies a variety of analytical techniques to solve problems associated with building performance as they
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
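The idea of a parametric object whose geometry follows proportion rules can be sketched outside GDL. The Python sketch below generates window rectangles for a façade grid; the proportion rules and numbers are illustrative, not taken from the pattern books the paper uses.

```python
# Sketch of a parametric facade in the spirit of HBIM library objects:
# window positions derived from facade dimensions and proportion rules.

def parametric_facade(width, height, storeys, bays, window_ratio=0.3):
    """Return (x, y, w, h) rectangles for windows on a facade grid."""
    bay_w = width / bays
    storey_h = height / storeys
    win_w = bay_w * window_ratio * 2      # window spans part of its bay
    win_h = storey_h * 0.5
    windows = []
    for s in range(storeys):
        for b in range(bays):
            x = b * bay_w + (bay_w - win_w) / 2
            y = s * storey_h + (storey_h - win_h) / 2
            windows.append((round(x, 2), round(y, 2),
                            round(win_w, 2), round(win_h, 2)))
    return windows

for rect in parametric_facade(width=12.0, height=9.0, storeys=3, bays=4):
    print(rect)
```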
Knowledge-based model building of proteins: concepts and examples.
Bajorath, J.; Stenkamp, R.; Aruffo, A.
1993-01-01
We describe how to build protein models from structural templates. Methods to identify structural similarities between proteins in cases of significant, moderate to low, or virtually absent sequence similarity are discussed. The detection and evaluation of structural relationships is emphasized as a central aspect of protein modeling, distinct from the more technical aspects of model building. Computational techniques to generate and complement comparative protein models are also reviewed. Two examples, P-selectin and gp39, are presented to illustrate the derivation of protein model structures and their use in experimental studies. PMID:7505680
NASA Astrophysics Data System (ADS)
Tsilimantou, Elisavet; Delegou, Ekaterini; Ioannidis, Charalabos; Moropoulou, Antonia
2016-08-01
In this paper, the documentation of a historic building registered as a Cultural Heritage asset is presented. The aim of the survey is to create a 3D geometric representation of a historic building and, in accordance with a multidisciplinary study, to extract useful information regarding the extent of degradation, the construction's durability, etc. For the implementation of the survey, a combination of different types of acquisition technologies is used. The project focuses on the study of Villa Klonaridi in Athens, Greece. For the complete documentation of the building, conventional topographic, photogrammetric and laser scanning techniques are combined. Close range photogrammetric techniques are used for the acquisition of the façades and architectural details. One of the main objectives is the development of an accurate 3D model, where the photorealistic representation of the building is achieved, along with the decay pathology, historical phases and architectural components. In order to achieve a suitable graphical representation for the study of the material and decay patterns beyond the 2D representation, 3D modelling and additional information modelling are performed for comparative analysis. The study provides various conclusions regarding the scale of deterioration obtained by the 2D and 3D analyses respectively. Considering the variation in material and decay patterns, comparative results are obtained regarding the degradation of the building. Overall, the paper describes a process performed on a historic building, where the 3D digital acquisition of the monument's structure is realized with the combination of close range surveying and laser scanning methods.
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
Agent-based modeling: a new approach for theory building in social psychology.
Smith, Eliot R; Conrey, Frederica R
2007-02-01
Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
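A minimal ABM makes the contrast with variable-based modeling concrete: only local interaction rules are specified, and the global pattern emerges. The sketch below is a Schelling-style relocation model, a standard ABM demonstration rather than the authors' own example.

```python
import random

# Minimal agent-based model: agents relocate when too few neighbours
# share their group, and segregation emerges from purely local rules.

random.seed(0)
SIZE, THRESHOLD, STEPS = 20, 0.5, 20000
grid = [[random.choice([0, 1, None]) for _ in range(SIZE)]
        for _ in range(SIZE)]

def neighbours(r, c):
    """Yield the eight (toroidal) neighbours of cell (r, c)."""
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr or dc:
                yield grid[(r + dr) % SIZE][(c + dc) % SIZE]

def similarity(r, c):
    """Fraction of occupied neighbours in the same group as (r, c)."""
    occ = [n for n in neighbours(r, c) if n is not None]
    if not occ:
        return 1.0
    return sum(n == grid[r][c] for n in occ) / len(occ)

for _ in range(STEPS):
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    if grid[r][c] is not None and similarity(r, c) < THRESHOLD:
        empties = [(i, j) for i in range(SIZE) for j in range(SIZE)
                   if grid[i][j] is None]
        i, j = random.choice(empties)
        grid[i][j], grid[r][c] = grid[r][c], None   # unhappy agent moves

scores = [similarity(r, c) for r in range(SIZE) for c in range(SIZE)
          if grid[r][c] is not None]
print("mean neighbour similarity: %.2f" % (sum(scores) / len(scores)))
```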
Model-based tomographic reconstruction
Chambers, David H; Lehman, Sean K; Goodman, Dennis M
2012-06-26
A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
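The estimation step can be caricatured in one dimension: a wall at range d produces an echo at delay 2d/c, and the wall position is the d minimising the mean-square error between the predicted echo and the time-gated data. The pulse shape, speed, and noise below are invented for illustration, and a grid search stands in for the paper's estimator.

```python
import numpy as np

# Toy model-based range estimation: grid-search the wall range d that
# minimises the MSE between a predicted echo and noisy "data".

c = 0.3                                  # propagation speed, m/ns
t = np.linspace(0, 100, 2000)            # time axis (ns)

def echo(d, width=2.0):
    """Gaussian pulse echo from a wall at range d (single reflection)."""
    delay = 2 * d / c
    return np.exp(-((t - delay) / width) ** 2)

true_wall = 6.0                          # metres
data = echo(true_wall) + 0.05 * np.random.default_rng(3).normal(size=t.size)

candidates = np.arange(1.0, 12.0, 0.01)
mse = [np.mean((echo(d) - data) ** 2) for d in candidates]
print("estimated wall range: %.2f m" % candidates[int(np.argmin(mse))])
```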
NLOS Correction/Exclusion for GNSS Measurement Using RAIM and City Building Models.
Hsu, Li-Ta; Gu, Yanlei; Kamijo, Shunsuke
2015-07-17
Currently, global navigation satellite system (GNSS) receivers can provide accurate and reliable positioning service in open-field areas. However, their performance in the downtown areas of cities is still affected by multipath and non-line-of-sight (NLOS) receptions. This paper proposes a new positioning method using 3D building models and the receiver autonomous integrity monitoring (RAIM) satellite selection method to achieve satisfactory positioning performance in urban areas. The 3D building model uses a ray-tracing technique to simulate the line-of-sight (LOS) and NLOS signal travel distance, known as the pseudorange, between the satellite and receiver. The proposed RAIM fault detection and exclusion (FDE) is able to compare the similarity between the raw pseudorange measurement and the simulated pseudorange. The measurement of a satellite will be excluded if the simulated and raw pseudoranges are inconsistent. Because of the assumption of a single reflection in the ray-tracing technique, an inconsistent case indicates a double or multiple reflected NLOS signal. According to the experimental results, the RAIM satellite selection technique can reduce the positioning solutions with large errors (solutions estimated on the wrong side of the road) for the 3D building model method by about 8.4% and 36.2% in the middle and deep urban canyon environments, respectively.
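The FDE consistency test reduces to comparing raw and ray-traced pseudoranges satellite by satellite. A hedged sketch, with invented satellite IDs, ranges, and threshold:

```python
import numpy as np

# Sketch of the consistency test: satellites whose raw pseudorange
# disagrees with the ray-traced (single-reflection) prediction beyond a
# threshold are flagged as multi-reflection NLOS and excluded.

satellites = ["G05", "G12", "G17", "G24", "G30"]
raw_pr = np.array([21512.4, 23997.1, 20488.9, 25104.6, 22890.3])   # metres
simulated_pr = np.array([21511.8, 23996.5, 20510.2, 25104.1, 22889.9])

THRESHOLD = 5.0                          # metres of allowed disagreement
residual = np.abs(raw_pr - simulated_pr)
kept = [s for s, r in zip(satellites, residual) if r <= THRESHOLD]
excluded = [s for s, r in zip(satellites, residual) if r > THRESHOLD]
print("kept:", kept, "| excluded as multi-reflection NLOS:", excluded)
```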
Hybrid 3D reconstruction and image-based rendering techniques for reality modeling
NASA Astrophysics Data System (ADS)
Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.
2000-12-01
This paper presents a component approach that combines in a seamless way the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) panoramic images for distant background scenery where parallax is insignificant; (ii) photogrammetry for background buildings; and (iii) highly detailed laser-based models for the primary environment, the structure of building exteriors and room interiors. These techniques have a wide range of applications in visualization, virtual reality, cost effective as-built analysis of architectural and industrial environments, building facilities management, real estate, e-commerce, remote inspection of hazardous environments, TV production and many others.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loveday, D.L.; Craggs, C.
Box-Jenkins-based multivariate stochastic modeling is carried out using data recorded from a domestic heating system. The system comprises an air-source heat pump sited in the roof space of a house, solar assistance being provided by the conventional tile roof acting as a radiation absorber. Multivariate models are presented which illustrate the time-dependent relationships between three air temperatures - at external ambient, at entry to, and at exit from, the heat pump evaporator. Using a deterministic modeling approach, physical interpretations are placed on the results of the multivariate technique. It is concluded that the multivariate Box-Jenkins approach is a suitable technique for building thermal analysis. Application to multivariate model-based control is discussed, with particular reference to building energy management systems. It is further concluded that stochastic modeling of data drawn from a short monitoring period offers a means of retrofitting an advanced model-based control system in existing buildings, which could be used to optimize energy savings. An approach to system simulation is suggested.
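A close modern relative of multivariate Box-Jenkins modeling is a vector autoregression over the three temperature series. The sketch below fits one with statsmodels on synthetic data standing in for the monitored temperatures; it is not the paper's model.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Sketch: fit a vector autoregression to three coupled air temperatures
# (ambient, evaporator inlet, evaporator outlet). Synthetic data only.

rng = np.random.default_rng(5)
n = 500
ambient = 10 + np.cumsum(rng.normal(0, 0.1, n))       # slow drift
inlet = 0.8 * ambient + 2 + rng.normal(0, 0.2, n)     # follows ambient
outlet = 0.6 * inlet - 3 + rng.normal(0, 0.2, n)      # after evaporator

data = pd.DataFrame({"ambient": ambient, "inlet": inlet, "outlet": outlet})
results = VAR(data).fit(maxlags=4, ic="aic")
print("selected lag order:", results.k_ar)
```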
Masonry structures built with fictile tubules: Experimental and numerical analyses
NASA Astrophysics Data System (ADS)
Tiberti, Simone; Scuro, Carmelo; Codispoti, Rosamaria; Olivito, Renato S.; Milani, Gabriele
2017-11-01
Masonry structures built with fictile tubules are a distinctive building technique of the Mediterranean area. The technique dates back to Roman and early Christian times and was used to build vaulted constructions and domes with various geometrical forms by virtue of the tubules' modular structure. In the present work, experimental tests were carried out to identify the mechanical properties of hollow clay fictile tubules and a possible reinforcing technique for existing buildings employing such elements. The experimental results were then validated by devising and analyzing numerical models with the FE software Abaqus, also aimed at investigating the structural behavior of an arch via linear and nonlinear static analyses.
Hardware-in-the-Loop Modeling and Simulation Methods for Daylight Systems in Buildings
NASA Astrophysics Data System (ADS)
Mead, Alex Robert
This dissertation introduces hardware-in-the-loop modeling and simulation techniques to the daylighting community, with specific application to complex fenestration systems. No such application of this class of techniques, optimally combining mathematical-modeling and physical-modeling experimentation, is known to the author previously in the literature. Daylighting systems in buildings have a large impact on both the energy usage of a building as well as the occupant experience within a space. As such, a renewed interest has been placed on designing and constructing buildings with an emphasis on daylighting in recent times as part of the "green movement." Within daylighting systems, a specific subclass of building envelope is receiving much attention: complex fenestration systems (CFSs). CFSs are unique as compared to regular fenestration systems (e.g. glazing) in the regard that they allow for non-specular transmission of daylight into a space. This non-specular nature can be leveraged by designers to "optimize" the times of the day and the days of the year that daylight enters a space. Examples of CFSs include: Venetian blinds, woven fabric shades, and prismatic window coatings. In order to leverage the non-specular transmission properties of CFSs, however, engineering analysis techniques capable of faithfully representing the physics of these systems are needed. Traditionally, the analysis techniques available to the daylighting community fall broadly into three classes: simplified techniques, mathematical-modeling and simulation, and physical-modeling and experimentation. Simplified techniques use "rules-of-thumb" heuristics to provide insights for simple daylighting systems. Mathematical-modeling and simulation use complex numerical models to provide more detailed insights into system performance. Finally, physical-models can be instrumented and excited using artificial and natural light sources to provide performance insight into a daylighting system. Each class of techniques, broadly speaking, has advantages and disadvantages with respect to the cost of execution (e.g. money, time, expertise) and the fidelity of the provided insight into the performance of the daylighting system. This varying tradeoff of cost and insight between the techniques determines which techniques are employed for which projects. Daylighting systems with CFS components, however, when considered for simulation with respect to these traditional technique classes, defy high fidelity analysis. Simplified techniques are clearly not applicable. Mathematical-models must have great complexity in order to capture the non-specular transmission accurately, which greatly limits their applicability. This leaves physical modeling, the most costly, as the preferred method for CFSs. While mathematical-modeling and simulation methods do exist, they are in general costly and still approximations of the underlying CFS behavior. In fact, measurement is currently the only practical method to capture the behavior of CFSs. Traditional measurements of CFS transmission and reflection properties are conducted using an instrument called a goniophotometer and produce a measurement in the form of a Bidirectional Scatter Distribution Function (BSDF) based on the Klems basis. This measurement must be executed for each possible state of the CFS; hence only a subset of the possible behaviors can be captured for CFSs with continuously varying configurations. In the current era of rapid prototyping (e.g. 3D printing) and automated control of buildings including daylighting systems, a new analysis technique is needed which can faithfully represent these CFSs, which are being designed and constructed at an increasing rate. Hardware-in-the-loop modeling and simulation is a perfect fit to the current need of analyzing daylighting systems with CFSs. In the proposed hardware-in-the-loop modeling and simulation approach of this dissertation, physical-models of real CFSs are excited using either natural or artificial light. The exiting luminance distribution from these CFSs is measured and used as input to a Radiance mathematical-model of the interior of the space, which is proposed to be lit by the CFS-containing daylighting system. Hence, the components of the total daylighting and building system which are not mathematically-modeled well, the CFS, are physically excited and measured, while the components which are modeled properly, namely the interior building space, are mathematically-modeled. In order to excite and measure CFS behavior, a novel parallel goniophotometer, referred to as the CUBE 2.0, is developed in this dissertation. The CUBE 2.0 measures the input illuminance distribution and the output luminance distribution with respect to a CFS under test. Further, the process is fully automated, allowing for deployable experiments on proposed building sites as well as laboratory-based experiments. In this dissertation, three CFSs, two commercially available and one novel (Twitchell's Textilene 80 Black, Twitchell's Shade View Ebony, and Translucent Concrete Panels (TCP)), are simulated on the CUBE 2.0 system for daylong deployments at one-minute time steps. These CFSs are assumed to be placed in the glazing space within the Reference Office Radiance model, for which horizontal illuminance on a work plane of 0.8 m height is calculated for each time step. While Shade View Ebony and TCPs are unmeasured CFSs with respect to BSDF, Textilene 80 Black has been previously measured. As such, a validation of the CUBE 2.0 against the goniophotometer-measured BSDF is presented, with measurement errors of the horizontal illuminance between +3% and -10%. These error levels are considered to be valid within experimental daylighting investigations. Non-validated results are also presented in full for both Shade View Ebony and TCP. Concluding remarks and future directions for hardware-in-the-loop (HWiL) simulation close the dissertation.
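The software half of the hardware-in-the-loop chain can be sketched as a matrix multiply: measured exiting luminance on the Klems patches is propagated to workplane illuminance through a precomputed daylight coefficient matrix, in the manner of Radiance's matrix-based methods. Both arrays below are random placeholders for real Radiance output and CUBE 2.0 measurements.

```python
import numpy as np

# Sketch of propagating one time step of measured CFS luminance to
# workplane illuminance via a daylight coefficient matrix.

n_patches = 145                           # Klems full basis
n_sensors = 25                            # workplane illuminance sensors

# daylight coefficient matrix: sensor response to unit patch luminance
D = np.random.default_rng(6).uniform(0.0, 0.02, (n_sensors, n_patches))
# one time step of measured exiting luminance from the CFS (cd/m^2)
L_measured = np.random.default_rng(7).uniform(0, 500, n_patches)

workplane_lux = D @ L_measured            # illuminance at each sensor
print("mean workplane illuminance: %.0f lux" % workplane_lux.mean())
```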
Positional estimation techniques for an autonomous mobile robot
NASA Technical Reports Server (NTRS)
Nandhakumar, N.; Aggarwal, J. K.
1990-01-01
Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.
Boat Building Design and Construction Techniques in the Architectural Design Studio.
ERIC Educational Resources Information Center
Smith, Richard A.
1982-01-01
Describes a model boat building project for architectural design studios. Working from traditional sailboat designs, students study the "lines" drawings of boats, make full-size drawings from scale drawings, and then construct model wooden boats. Available from Carfax Publishing Company, P.O. Box 25, Abingdon, Oxfordshire OX14 1RW…
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
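The flavor of the Bayesian estimation can be conveyed with a conjugate normal update: an expert prior on occupancy density is combined with survey observations, and the posterior keeps the uncertainty. All numbers are invented, and PDT's actual model is richer than this sketch.

```python
import numpy as np

# Minimal conjugate-Bayes sketch: expert prior on ambient occupancy
# (people/1000 ft^2) updated with survey observations of known variance.

prior_mean, prior_var = 2.0, 1.0          # expert judgment
obs = np.array([2.8, 3.1, 2.5, 2.9])      # survey-derived estimates
obs_var = 0.5                             # assumed observation variance

n = obs.size
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)

print("posterior occupancy: %.2f +/- %.2f people/1000 ft^2"
      % (post_mean, post_var ** 0.5))
```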
NLOS Correction/Exclusion for GNSS Measurement Using RAIM and City Building Models
Hsu, Li-Ta; Gu, Yanlei; Kamijo, Shunsuke
2015-01-01
Currently, global navigation satellite system (GNSS) receivers can provide accurate and reliable positioning service in open-field areas. However, their performance in the downtown areas of cities is still affected by multipath and non-line-of-sight (NLOS) receptions. This paper proposes a new positioning method using 3D building models and the receiver autonomous integrity monitoring (RAIM) satellite selection method to achieve satisfactory positioning performance in urban areas. The 3D building model uses a ray-tracing technique to simulate the line-of-sight (LOS) and NLOS signal travel distance, known as the pseudorange, between the satellite and receiver. The proposed RAIM fault detection and exclusion (FDE) is able to compare the similarity between the raw pseudorange measurement and the simulated pseudorange. The measurement of a satellite will be excluded if the simulated and raw pseudoranges are inconsistent. Because of the assumption of a single reflection in the ray-tracing technique, an inconsistent case indicates a double or multiple reflected NLOS signal. According to the experimental results, the RAIM satellite selection technique can reduce the positioning solutions with large errors (solutions estimated on the wrong side of the road) for the 3D building model method by about 8.4% and 36.2% in the middle and deep urban canyon environments, respectively. PMID:26193278
NASA Astrophysics Data System (ADS)
Tarbotton, C.; Walters, R. A.; Goff, J. R.; Dominey-Howes, D.; Turner, I. L.
2012-12-01
As communities become increasingly aware of the risks posed by tsunamis, it is important to develop methods for predicting the damage they can cause to the built environment. This will provide the information needed to make informed decisions regarding land-use, building codes, and evacuation. At present, a number of tsunami-building vulnerability assessment models are available; however, the relative infrequency and destructive nature of tsunamis has long made it difficult to obtain the data necessary to adequately validate and compare them. Further complicating matters is that the inundation of a tsunami in the built environment is very difficult to model, as is the response of a building to the hydraulic forces that a tsunami generates. Variations in building design and condition will significantly affect a building's susceptibility to damage. Likewise, factors affecting the flow conditions at a building (i.e. surrounding structures and topography) will greatly affect its exposure. This presents significant challenges for practitioners, as they are often left in the dark on how to use hazard modeling and vulnerability assessment techniques together to conduct the community-scale impact studies required for tsunami planning. This paper presents the results of an in-depth case study of Yuriage, Miyagi Prefecture, a coastal city in Japan that was badly damaged by the 2011 Tohoku tsunami. The aim of the study was twofold: 1) to test and compare existing tsunami vulnerability assessment models and 2) to more effectively utilize hydrodynamic models in the context of tsunami impact studies. Following the 2011 Tohoku event, an unprecedented quantity of field data, imagery and video emerged. Yuriage in particular features a comprehensive set of street-level Google Street View imagery, available both before and after the event. This has enabled the collection of a large dataset describing the characteristics of the buildings existing before the event as well as the subsequent damage that they sustained. These data, together with detailed results from hydrodynamic models, have been used to provide the building, damage and hazard data necessary to rigorously test and compare existing vulnerability assessment techniques. The result is a much-improved understanding of the capabilities of existing vulnerability assessment techniques, as well as important improvements to their assessment framework. This provides much needed guidance to practitioners on how to conduct tsunami impact assessments in the future. Furthermore, the study introduces some new methods of integrating hydrodynamic models into vulnerability assessment models, offering guidance on how to more effectively model tsunami inundation in the built environment.
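One standard way to combine surveyed damage states with hydrodynamic model output is an empirical fragility curve, i.e. the probability of a damage state versus modelled flow depth, fit by logistic regression. The sketch below uses synthetic depths and damage flags, not the Yuriage dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch of an empirical fragility curve: P(collapse | flow depth)
# fit by logistic regression on synthetic stand-in data.

rng = np.random.default_rng(8)
depth = rng.uniform(0, 6, 300)                     # modelled flow depth (m)
p_true = 1 / (1 + np.exp(-(depth - 2.5) * 1.8))    # hidden fragility
collapsed = (rng.uniform(size=300) < p_true).astype(int)

model = LogisticRegression().fit(depth.reshape(-1, 1), collapsed)
for d in (1.0, 2.0, 3.0, 4.0):
    p = model.predict_proba([[d]])[0, 1]
    print(f"P(collapse | depth={d:.0f} m) = {p:.2f}")
```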
ERIC Educational Resources Information Center
Campbell, Robert E.; And Others
This handbook presents management techniques, program ideas, and student activities for building comprehensive secondary career guidance programs. Part 1 (chapter 1) traces the history of guidance to set the stage for the current emphasis on comprehensive programs, summarizes four representative models for designing comprehensive programs, and…
Simulating Building Fires for Movies
NASA Technical Reports Server (NTRS)
Rodriguez, Ricardo C.; Johnson, Randall P.
1987-01-01
Fire scenes for cinematography are staged at relatively low cost by a method that combines several existing techniques. Nearly realistic scenes, suitable for firefighter training, are produced with little specialized equipment. Sequences of scenes are set up quickly and easily, without compromising safety, because the model is not burned. Images of fire, steam, and smoke are superimposed on an image of the building to simulate its burning.
Modelling Technology for Building Fire Scene with Virtual Geographic Environment
NASA Astrophysics Data System (ADS)
Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.
2017-09-01
Building fire is a risky activity that can lead to disaster and massive destruction. The management and disposal of building fire has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and rich fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene are analysed in this paper. Then the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES), and the indoor crowd) are implemented, and the relationships between these elements are also discussed. Finally, following the theory and framework of VGE, a building fire scene system is designed around four environments: the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.
Information support of monitoring of technical condition of buildings in construction risk area
NASA Astrophysics Data System (ADS)
Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.
2018-05-01
The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. As a result of visual and instrumental surveys, as well as an analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of defects and damage to buildings located within a 30-meter risk area around the object under construction. A classification of the structures and defects of the surveyed buildings is presented. The functional capabilities of the developed model and the field of its practical applications are determined.
Building Quakes: Detection of Weld Fractures in Buildings using High-Frequency Seismic Techniques
NASA Astrophysics Data System (ADS)
Heckman, V.; Kohler, M. D.; Heaton, T. H.
2009-12-01
Catastrophic fracture of welded beam-column connections in buildings was observed in the Northridge and Kobe earthquakes. Despite the structural importance of such connections, it can be difficult to locate damage in structural members underneath superficial building features. We have developed a novel technique to locate fracturing welds in buildings in real time using high-frequency information from seismograms. Numerical and experimental methods were used to investigate an approach for detecting the brittle fracture of welds of beam-column connections in instrumented steel moment-frame buildings through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalogue of Green’s functions for an instrumented building to detect high-frequency failure events in the building during a later earthquake by screening continuous data for the presence of one or more of the events. This was explored experimentally by comparing structural responses of a small-scale laboratory structure under a variety of loading conditions. Experimentation was conducted on a polyvinyl chloride frame model structure with data recorded at a sample rate of 2000 Hz using piezoelectric accelerometers and a 24-bit digitizer. Green’s functions were obtained by applying impulsive force loads at various locations along the structure with a rubber-tipped force transducer hammer. We performed a blind test using cross-correlation techniques to determine if it was possible to use the catalogue of Green’s functions to pinpoint the absolute times and locations of subsequent, induced failure events in the structure. A finite-element method was used to simulate the response of the model structure to various source mechanisms in order to determine the types of elastic waves that were produced as well as to obtain a general understanding of the structural response to localized loading and fracture.
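The screening step described here is essentially matched filtering of continuous data against a catalogue of templates. A rough sketch under that reading (the threshold and all names are assumptions for illustration, not values from the study) could be:

```python
import numpy as np
from scipy.signal import correlate

def screen_for_fracture(record, templates, fs, threshold=0.7):
    """Scan a continuous acceleration record for high-frequency failure
    events by normalized cross-correlation against a catalogue of
    prerecorded Green's functions (one template per candidate source).

    Returns (template_index, time_s, peak_correlation) of the best
    detection, or None if no template exceeds the threshold.
    """
    best = None
    for i, g in enumerate(templates):
        xc = correlate(record, g, mode="valid")
        # sliding energy of the record so correlations are comparable
        energy = np.convolve(record**2, np.ones(len(g)), mode="valid")
        cc = xc / np.maximum(np.sqrt(energy * np.sum(g**2)), 1e-12)
        k = int(np.argmax(cc))
        if cc[k] > threshold and (best is None or cc[k] > best[2]):
            best = (i, k / fs, float(cc[k]))
    return best
```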
Discrepant Questioning as a Tool To Build Complex Mental Models of Respiration.
ERIC Educational Resources Information Center
Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria C.
Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…
A Comparison of Two Balance Calibration Model Building Methods
NASA Technical Reports Server (NTRS)
DeLoach, Richard; Ulbrich, Norbert
2007-01-01
Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
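For readers unfamiliar with the second method, a toy forward-selection loop conveys the idea of stepwise model building (an illustrative sketch, not the authors' algorithm; real calibration codes also apply statistical significance tests before accepting a term):

```python
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of a least-squares fit on the given columns."""
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    r = y - X[:, cols] @ beta
    return float(r @ r)

def forward_stepwise(X, y, names, max_terms=6):
    """Greedy forward selection: starting from the intercept, repeatedly
    add the candidate regressor that most reduces the residual sum of
    squares of the calibration fit."""
    chosen, pool = [0], list(range(1, X.shape[1]))  # column 0 = intercept
    while pool and len(chosen) < max_terms:
        best = min(pool, key=lambda j: rss(X, y, chosen + [j]))
        chosen.append(best)
        pool.remove(best)
    return [names[j] for j in chosen]
```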
NASA Astrophysics Data System (ADS)
Kumar, Manish; Sharma, Navjeet; Sarin, Amit
2018-05-01
Studies have confirmed that elevated levels of radon/thoron in human environments can substantially increase the risk of lung cancer in the general population. Building materials are the second largest contributors to indoor radon/thoron after the soil and bedrock beneath dwellings. In the present investigation, the exhalation rates of radon/thoron from different building material samples have been analysed using an active technique. Radon/thoron concentrations in a model room have been predicted based on the exhalation rates from the walls, floor, and roof. The indoor concentrations show significant variations depending upon the ventilation rate and the type of building materials used.
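The prediction described here is commonly done with a well-mixed room mass balance; a plausible form, assuming the paper follows the standard model, is:

```latex
\frac{dC}{dt} = \frac{\sum_i E_i A_i}{V} - (\lambda + \lambda_v)\,C
\qquad\Longrightarrow\qquad
C_{\mathrm{eq}} = \frac{\sum_i E_i A_i}{V(\lambda + \lambda_v)}
```

Here E_i are the measured exhalation rates of the walls, floor, and roof, A_i their areas, V the room volume, lambda the radon/thoron decay constant, and lambda_v the ventilation rate; the inverse dependence of the equilibrium concentration on the ventilation rate matches the ventilation sensitivity noted in the abstract.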
A sensitivity model for energy consumption in buildings. Part 1: Effect of exterior environment
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1981-01-01
A simple analytical model is developed for the simulation of seasonal heating and cooling loads of any class of buildings, to complement available computerized techniques which make hourly, daily, and monthly calculations. An expression for the annual energy utilization index, a common measure for rating buildings with the same functional utilization, is derived to include about 30 parameters covering both the building interior and exterior environments. The sensitivity of a general class of building to either controlled or uncontrolled weather parameters is examined. A hypothetical office-type building, located at the Goldstone Space Communication Complex, Goldstone, California, is selected as an example for the numerical sensitivity evaluations. Several expressions for variations in local outside air temperature, pressure, solar radiation, and wind velocity are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists also rely on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
Feasibility of Close-Range Photogrammetric Models for Geographic Information System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Luke; /Rice U.
2011-06-22
The objective of this project was to determine the feasibility of using close-range architectural photogrammetry as an alternative three-dimensional modeling technique in order to place the digital models in a geographic information system (GIS) at SLAC. With the available equipment and Australis photogrammetry software, the creation of full and accurate models of an example building, Building 281 on the SLAC campus, was attempted. After conducting several equipment tests to determine the achievable precision, a complete photogrammetric survey was attempted. The dimensions of the resulting models were then compared against the true dimensions of the building. A complete building model was not evidenced to be obtainable using the current equipment and software. This failure was likely attributable to the limits of the software rather than the precision of the physical equipment. However, partial models of the building were shown to be accurate and determined to still be usable in a GIS. With further development of the photogrammetric software and survey procedure, the desired generation of a complete three-dimensional model is likely still feasible.
González, Janneth; Gálvez, Angela; Morales, Ludis; Barreto, George E.; Capani, Francisco; Sierra, Omar; Torres, Yolima
2013-01-01
Three-dimensional models of the alpha- and beta-1 subunits of the calcium-activated potassium channel (BK) were predicted by threading modeling. A recursive approach comprising sequence alignment and model building based on three templates was used to build these models, with the refinement of non-conserved regions carried out using threading techniques. The complex formed by the subunits was studied by means of docking techniques, using 3D models of the two subunits and an approach based on rigid-body structures. Structural effects of the complex were analyzed with respect to hydrogen-bond interactions and binding-energy calculations. Potential interaction sites of the complex were determined by referencing a study of the difference accessible surface area (DASA) of the protein subunits in the complex. PMID:23492851
Haghighat, F; Lee, C S; Ghaly, W S
2002-06-01
The measurement and prediction of building material emission rates have been the subject of intensive research over the past decade, resulting in the development of advanced sensory and chemical analysis measurement techniques as well as analytical and numerical models. One of the important input parameters for these models is the diffusion coefficient. Several experimental techniques have been applied to estimate the diffusion coefficient. An extensive literature review of the techniques used to measure this coefficient for building materials exposed to volatile organic compounds (VOC) was carried out. This paper reviews these techniques; it also analyses the results and discusses the possible causes of the differences in the reported data. It was noted that the discrepancy between the different results was mainly due to the assumptions made in, and the techniques used to analyze, the data. For a given technique, the results show that there can be a difference of up to 700% in the reported data. Moreover, the paper proposes what is referred to as the mass exchanger method to calculate diffusion coefficients considering both diffusion and convection. The results obtained by this mass exchanger method were compared with those obtained by the existing method considering only diffusion. It was demonstrated that, for porous materials, the convection resistance cannot be ignored when compared with the diffusion resistance.
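The proposed consideration of both diffusion and convection suggests a series-resistance form; a hedged sketch of the idea (the paper's exact formulation may differ) is:

```latex
\frac{1}{K} = \frac{1}{h_m} + \frac{L}{D}
```

where K is the overall mass-transfer coefficient, h_m the convective mass-transfer coefficient at the material surface, L the material thickness, and D the diffusion coefficient. Neglecting the convective term 1/h_m attributes the whole resistance to diffusion, which is why the convection resistance cannot be ignored for porous materials.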
NASA Technical Reports Server (NTRS)
Garduno-Juarez, R.; Shibata, M.; Zielinski, T. J.; Rein, R.
1987-01-01
A model of the complex between the acetylcholine receptor and the snake neurotoxin cobratoxin was built by molecular model building and energy optimization techniques. The experimentally identified functionally important residues of cobratoxin and the dodecapeptide corresponding to residues 185-196 of the acetylcholine receptor alpha subunit were used to build the model. Both cis and trans conformers of the cyclic L-cystine portion of the dodecapeptide were examined. Binding residues independently identified on cobratoxin are shown to interact with the dodecapeptide AChR model.
Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data
Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho
2017-01-01
With rapid urbanization, highly accurate and semantically rich virtualization of building assets in 3D becomes more critical for supporting various applications, including urban planning, emergency response, and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of the building boundary in a progressive manner. This study covers a full chain of 3D building modeling, from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method. The performance of the proposed method is tested over the International Society for Photogrammetry and Remote Sensing (ISPRS) benchmark datasets. The results show that the proposed method can robustly produce accurate regularized 3D building rooftop models. PMID:28335486
Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data
NASA Astrophysics Data System (ADS)
Parida, G.; Rajan, K. S.
2017-05-01
Current methods for object segmentation, extraction, and classification of aerial LiDAR data are manual and tedious. This work proposes a technique for segmenting objects out of LiDAR data. A bottom-up, geometric rule-based approach was used initially to devise a way to segment buildings out of the LiDAR datasets. For curved wall surfaces, a comparison of localized surface normals was used to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of the building objects from a given scene in the case of the synthetic datasets, and promising results in the case of the real-world data. An advantage of the proposed work is that it depends on no form of data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, as is needed in supervised techniques. The crux of the proposed method is that it focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof. The current segmentation approach can be used to obtain 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to obtain footprints of buildings in urban landscapes, helping in urban planning and the smart cities endeavour.
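A compact way to obtain the localized surface normals used for the curved-wall case is PCA over each point's neighbourhood; the following sketch illustrates it (the neighbourhood size and the verticality rule in the comment are assumptions, not the paper's parameters):

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_normals(points, k=12):
    """Estimate a unit normal per 3D point as the eigenvector of the
    smallest eigenvalue of the covariance of its k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nb in enumerate(idx):
        q = points[nb] - points[nb].mean(axis=0)
        _, vecs = np.linalg.eigh(q.T @ q)   # eigenvalues in ascending order
        normals[i] = vecs[:, 0]
    return normals

# rule-based wall candidates: nearly horizontal normals (small z-component)
# walls = np.abs(surface_normals(pts)[:, 2]) < 0.2
```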
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance, and qualification for incentives. Beyond individual building applications, larger scale applications (across the stock of buildings at various scales: national, regional, and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings is typically simulated to predict the performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly-granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.
NASA Astrophysics Data System (ADS)
Todorov, Evgueni Iordanov
2017-04-01
The lack of validated nondestructive evaluation (NDE) techniques for examination during and after additive manufacturing (AM) component fabrication is one of the obstacles in the way of broadening use of AM for critical applications. Knowledge of electromagnetic properties of powder (e.g. feedstock) and solid AM metal components is necessary to evaluate and deploy electromagnetic NDE modalities for examination of AM components. The objective of this research study was to develop and implement techniques for measurement of powder and solid metal electromagnetic properties. Three materials were selected - Inconel 625, duplex stainless steel 2205, and carbon steel 4140. The powder properties were measured with alternate current (AC) model based eddy current technique and direct current (DC) resistivity measurements. The solid metal properties were measured with DC resistivity measurements, DC magnetic techniques, and AC model based eddy current technique. Initial magnetic permeability and electrical conductivity were acquired for both powder and solid metal. Additional magnetic properties such as maximum permeability, coercivity, retentivity, and others were acquired for 2205 and 4140. Two groups of specimens were tested along the build length and width respectively to investigate for possible anisotropy. There was no significant difference or anisotropy when comparing measurements acquired along build length to those along the width. A trend in AC measurements might be associated with build geometry. Powder electrical conductivity was very low and difficult to estimate reliably with techniques used in the study. The agreement between various techniques was very good where adequate comparison was possible.
Severity-Based Adaptation with Limited Data for ASR to Aid Dysarthric Speakers
Mustafa, Mumtaz Begum; Salim, Siti Salwah; Mohamed, Noraini; Al-Qatab, Bassam; Siong, Chng Eng
2014-01-01
Automatic speech recognition (ASR) is currently used in many assistive technologies, such as helping individuals with speech impairment in their communication ability. One challenge in ASR for speech-impaired individuals is the difficulty in obtaining a good speech database of impaired speakers for building an effective speech acoustic model. Because there are very few existing databases of impaired speech, which are also limited in size, the obvious solution for building a speech acoustic model of impaired speech is to employ adaptation techniques. However, issues that have not been addressed in existing studies in the area of adaptation for speech impairment are as follows: (1) identifying the most effective adaptation technique for impaired speech; and (2) the use of suitable source models to build an effective impaired-speech acoustic model. This research investigates these two issues for dysarthria, a type of speech impairment affecting millions of people. We applied both unimpaired and impaired speech as the source model with well-known adaptation techniques such as maximum likelihood linear regression (MLLR) and constrained MLLR (C-MLLR). The recognition accuracy of each impaired-speech acoustic model is measured in terms of word error rate (WER), with further assessments including phoneme insertion, substitution, and deletion rates. Unimpaired speech, when combined with limited high-quality impaired-speech data, improves the performance of ASR systems in recognising severely impaired dysarthric speech. The C-MLLR adaptation technique was also found to be better than MLLR in recognising mildly and moderately impaired speech, based on statistical analysis of the WER. Phoneme substitution was found to be the biggest contributor to the WER in dysarthric speech at all levels of severity. The results show that speech acoustic models derived from suitable adaptation techniques improve the performance of ASR systems in recognising impaired speech with limited adaptation data. PMID:24466004
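The WER metric used throughout such evaluations is the word-level edit distance between reference and hypothesis transcripts, normalized by reference length; a minimal implementation for context:

```python
import numpy as np

def wer(ref: str, hyp: str) -> float:
    """Word error rate: (substitutions + insertions + deletions) /
    number of reference words, computed by dynamic programming."""
    r, h = ref.split(), hyp.split()
    d = np.zeros((len(r) + 1, len(h) + 1), dtype=int)
    d[:, 0] = np.arange(len(r) + 1)   # all deletions
    d[0, :] = np.arange(len(h) + 1)   # all insertions
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = d[i - 1, j - 1] + (r[i - 1] != h[j - 1])
            d[i, j] = min(sub, d[i - 1, j] + 1, d[i, j - 1] + 1)
    return d[-1, -1] / max(len(r), 1)

# wer("the cat sat", "the cat sat down")  ->  0.333... (one insertion)
```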
On The Modeling of Educational Systems: II
ERIC Educational Resources Information Center
Grauer, Robert T.
1975-01-01
A unified approach to model building is developed from the separate techniques of regression, simulation, and factorial design. The methodology is applied in the context of a suburban school district. (Author/LS)
Dorninger, Peter; Pfeifer, Norbert
2008-01-01
Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
NASA Astrophysics Data System (ADS)
Homainejad, Amir S.; Satari, Mehran
2000-05-01
Virtual reality (VR) brings users to reality by computer, and a virtual environment (VE) is a simulated world that can take users to any point and direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique able to collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and methods of data collection vary based on the method of image acquisition. Indeed, VR involves real-time graphics, 3D models, and displays, and it has applications in the entertainment industry, flight simulators, and industrial design.
NASA Astrophysics Data System (ADS)
Schweier, C.; Markus, M.; Steinle, E.
2004-04-01
Catastrophic events like strong earthquakes can cause big losses in life and economic values. An increase in the efficiency of reconnaissance techniques could help to reduce the losses in life, as many victims die after, and not during, the event. A basic prerequisite to improving the rescue teams' work is improved planning of the measures. This can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, as that is where most losses occur. In this paper, the approach for a damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e. digital surface models (DSM) acquired through scanning of an area with pulsed laser light. To date, there are no laserscanning-derived DSMs available to the authors that were taken of areas that suffered damage from earthquakes. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper, two different methodologies used for simulating the data are presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially as if they had suffered serious damage. Then, a laserscanning data set is simulated based on these models, which can be compared with real laserscanning data acquired of the buildings (in their intact state). The other approach is to use measurements of actually damaged buildings and simulate their intact state. It is possible to model the geometrical structure of these damaged buildings based on digital photography taken after the event by evaluating the images with photogrammetric methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
Shock wave interaction with L-shaped structures
NASA Astrophysics Data System (ADS)
Miller, Richard C.
1993-12-01
This study investigated the interaction of shock waves with L-shaped structures using the CTH hydrodynamics code developed by Sandia National Laboratories. Computer models of shock waves traveling through air were developed using techniques similar to shock tube experiments. Models of L-shaped buildings were used to determine overpressures achieved by the reflecting shock versus angle of incidence of the shock front. An L-shaped building model rotated 45 degrees to the planar shock front produced the highest reflected overpressure of 9.73 atmospheres in the corner joining the two wings, a value 9.5 times the incident overpressure of 1.02 atmospheres. The same L-shaped building was modeled with the two wings separated by 4.24 meters to simulate an open courtyard. This open area provided a relief path for the incident shock wave, creating a peak overpressure of only 4.86 atmospheres on the building's wall surfaces from the same 1.02 atmosphere overpressure incident shock wave.
Use of Hypnosis in Self-Esteem Building: Review and Model for Rehabilitation.
ERIC Educational Resources Information Center
Klich, Beatriz de M.; Miller, Mary Ball
Hypnotherapeutic approaches in helping physically disabled patients cope with stress and plan further goals during the rehabilitation period are discussed. Several techniques possible in a rehabilitation setting are presented, including integration of ego strengthening and self-esteem building. It is noted that, in rehabilitation, a major goal of…
Neuro-fuzzy control of structures using acceleration feedback
NASA Astrophysics Data System (ADS)
Schurter, Kyle C.; Roschke, Paul N.
2001-08-01
This paper describes a new approach for the reduction of environmentally induced vibration in constructed facilities by way of a neuro-fuzzy technique. The new control technique is presented and tested in a numerical study that involves two types of building models. The energy of each building is dissipated through magnetorheological (MR) dampers whose damping properties are continuously updated by a fuzzy controller. This semi-active control scheme relies on the development of a correlation between the accelerations of the building (controller input) and the voltage applied to the MR damper (controller output). This correlation forms the basis for the development of an intelligent neuro-fuzzy control strategy. To establish a context for assessing the effectiveness of the semi-active control scheme, responses to earthquake excitation are compared with passive strategies that have similar control authority. According to numerical simulation, MR dampers are less effective control mechanisms than passive dampers with respect to a single-degree-of-freedom (DOF) building model. On the other hand, MR dampers are predicted to be superior when used with multiple-DOF structures for the reduction of lateral acceleration.
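For intuition, the input-output mapping of such a fuzzy controller can be sketched as a zero-order Sugeno system; the membership breakpoints and rule consequents below are invented for illustration (in the study they are tuned through neural-network training against simulation data):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_voltage(accel, v_max=5.0):
    """Map measured building acceleration (m/s^2) to an MR damper command
    voltage with three rules: small/medium/large response amplitude."""
    x = abs(accel)
    w = np.array([tri(x, -0.5, 0.0, 1.0),    # small  -> low voltage
                  tri(x, 0.5, 1.5, 2.5),     # medium -> half voltage
                  tri(x, 1.5, 3.0, 6.0)])    # large  -> full voltage
    v = np.array([0.0, 0.5 * v_max, v_max])  # rule consequents (assumed)
    return float(w @ v / max(w.sum(), 1e-9))
```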
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology.
Fais, Silvana; Casula, Giuseppe; Cuccuru, Francesco; Ligas, Paola; Bianchi, Maria Giovanna
2018-03-12
In the following we present a new non-invasive methodology aimed at the diagnosis of the stone building materials used in historical buildings and architectural elements. This methodology consists of the integrated, sequential application of in situ proximal sensing techniques, such as 3D terrestrial laser scanning for 3D modelling of the investigated objects, together with laboratory and in situ non-invasive multi-technique acoustic measurements, preceded by an accurate petrographic study of the investigated stone materials by optical and scanning electron microscopy. The increasing necessity to integrate different types of techniques in safeguarding Cultural Heritage is the result of the following two interdependent factors: 1) the diagnostic process applied to the building stone materials of monuments is increasingly focused on difficult targets in critical situations, and in these cases a diagnosis using only one type of non-invasive technique may not be sufficient to investigate the conservation status of the stone materials in the superficial and inner parts of the studied structures; 2) recent technological and scientific developments in the field of non-invasive diagnostic techniques for different types of materials favor and support the acquisition, processing, and interpretation of huge multidisciplinary datasets.
Current State of the Art Historic Building Information Modelling
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2017-08-01
In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.
NASA Astrophysics Data System (ADS)
Çaktı, Eser; Ercan, Tülay; Dar, Emrullah
2017-04-01
Istanbul's vast historical and cultural heritage is under constant threat of earthquakes. Historical records report repeated damages to the city's landmark buildings. Our efforts towards earthquake protection of several buildings in Istanbul involve earthquake monitoring via structural health monitoring systems, linear and non-linear structural modelling and analysis in search of past and future earthquake performance, shake-table testing of scaled models and non-destructive testing. More recently we have been using laser technology in monitoring structural deformations and damage in five monumental buildings which are Hagia Sophia Museum and Fatih, Sultanahmet, Süleymaniye and Mihrimah Sultan Mosques. This presentation is about these efforts with special emphasis on the use of laser scanning in monitoring of edifices.
NASA Astrophysics Data System (ADS)
Chiabrando, F.; Sammartano, G.; Spanò, A.
2016-06-01
This paper retraces some research activities and applications of 3D survey techniques and Building Information Modelling (BIM) in the environment of Cultural Heritage. It describes the diffusion in recent years of the as-built BIM approach in Heritage Assets management, the so-called Built Heritage Information Modelling/Management (BHIMM or HBIM), which is nowadays an important and sustainable perspective in the documentation and administration of historic buildings and structures. The work focuses on the documentation derived from 3D survey techniques, which can be understood as a significant and unavoidable knowledge base for BIM conception and modelling, in the perspective of a coherent and complete management and valorisation of CH. It explores the potential offered by integrated 3D survey techniques to acquire, productively and quite easily, many kinds of 3D information, not only geometrical but also radiometric attributes, helping the recognition, interpretation, and characterization of the state of conservation and degradation of architectural elements. From these data, they provide increasingly descriptive models corresponding to the geometrical complexity of buildings or aggregates in the well-known 5D (3D + time and cost dimensions). Point clouds derived from 3D survey acquisition (aerial and terrestrial photogrammetry, LiDAR, and their integration) are reality-based models that can be used in a semi-automatic way to manage, interpret, and moderately simplify the geometrical shapes of historical buildings, which are, as is well known, examples of non-regular and complex geometry, in contrast to modern constructions with simple and regular shapes. In the paper, some of these issues are addressed and analyzed through experiences regarding the creation and management of HBIM projects on historical heritage at different scales, using different platforms and various workflows. The paper focuses on LiDAR data handling with the aim of managing and extracting geometrical information; on the development and optimization of a semi-automatic process of segmentation, recognition, and modelling of the historical shapes of complex structures; and on the communication of historical heritage by virtual and augmented reality (VR/AR) in a 3D reconstruction of building aggregates from a LiDAR and UAV survey. The HBIM models have been implemented and optimized to be managed and browsed on mobile devices, not only for touristic or informative purposes, but also to ensure that HBIM platforms become easier and more valuable tools helping all AEC professionals involved in the documentation and valorisation processes that nowadays increasingly distinguish CH policies.
a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation
NASA Astrophysics Data System (ADS)
Kıvılcım, C. Ö.; Duran, Z.
2016-06-01
The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques, using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.
Naccarato, Attilio; Furia, Emilia; Sindona, Giovanni; Tagarelli, Antonio
2016-09-01
Four class-modeling techniques (soft independent modeling of class analogy (SIMCA), unequal dispersed classes (UNEQ), potential functions (PF), and multivariate range modeling (MRM)) were applied to multielement distributions to build chemometric models able to authenticate chili pepper samples grown in Calabria with respect to those grown outside of Calabria. The multivariate techniques were applied by considering both all the variables (32 elements: Al, As, Ba, Ca, Cd, Ce, Co, Cr, Cs, Cu, Dy, Fe, Ga, La, Li, Mg, Mn, Na, Nd, Ni, Pb, Pr, Rb, Sc, Se, Sr, Tl, Tm, V, Y, Yb, Zn) and variables selected by means of stepwise linear discriminant analysis (S-LDA). In the first case, satisfactory and comparable results in terms of CV efficiency were obtained with the use of SIMCA and MRM (82.3% and 83.2%, respectively), whereas MRM performed better than SIMCA in terms of forced model efficiency (96.5%). The selection of variables by S-LDA permitted the building of models characterized, in general, by higher efficiency. MRM again provided the best results for CV efficiency (87.7%, with an effective balance of sensitivity and specificity) as well as forced model efficiency (96.5%). Copyright © 2016 Elsevier Ltd. All rights reserved.
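Of the four techniques, SIMCA is the easiest to sketch: fit a principal component model to the target class only and accept new samples whose residual falls below a cut-off. A rough, illustrative implementation (the component count and quantile are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.decomposition import PCA

class SimcaLike:
    """One-class model in the spirit of SIMCA: PCA on the target class
    (e.g. Calabrian samples), acceptance by reconstruction error."""
    def fit(self, X, n_components=3, q=0.95):
        self.mu, self.sd = X.mean(0), X.std(0) + 1e-12
        Z = (X - self.mu) / self.sd
        self.pca = PCA(n_components=n_components).fit(Z)
        self.cut = np.quantile(self._err(Z), q)  # residual threshold
        return self

    def _err(self, Z):
        # squared reconstruction error per sample
        return ((Z - self.pca.inverse_transform(self.pca.transform(Z))) ** 2).sum(1)

    def accept(self, X):
        return self._err((X - self.mu) / self.sd) <= self.cut
```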
Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security
NASA Astrophysics Data System (ADS)
Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver
This paper outlines the scientific goals, ongoing work, and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while the pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.
Semiautomated model building for RNA crystallography using a directed rotameric approach.
Keating, Kevin S; Pyle, Anna Marie
2010-05-04
Structured RNA molecules play essential roles in a variety of cellular processes; however, crystallographic studies of such RNA molecules present a large number of challenges. One notable complication arises from the low resolutions typical of RNA crystallography, which results in electron density maps that are imprecise and difficult to interpret. This problem is exacerbated by the lack of computational tools for RNA modeling, as many of the techniques commonly used in protein crystallography have no equivalents for RNA structure. This leads to difficulty and errors in the model building process, particularly in modeling of the RNA backbone, which is highly error prone due to the large number of variable torsion angles per nucleotide. To address this, we have developed a method for accurately building the RNA backbone into maps of intermediate or low resolution. This method is semiautomated, as it requires a crystallographer to first locate phosphates and bases in the electron density map. After this initial trace of the molecule, however, an accurate backbone structure can be built without further user intervention. To accomplish this, backbone conformers are first predicted using RNA pseudotorsions and the base-phosphate perpendicular distance. Detailed backbone coordinates are then calculated to conform both to the predicted conformer and to the previously located phosphates and bases. This technique is shown to produce accurate backbone structure even when starting from imprecise phosphate and base coordinates. A program implementing this methodology is currently available, and a plugin for the Coot model building program is under development.
Virtual 3d City Modeling: Techniques and Applications
NASA Astrophysics Data System (ADS)
Singh, S. P.; Jain, K.; Mandla, V. R.
2013-08-01
A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to urban areas. Various terms are used for 3D city models, such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". 3D city models are basically computerized or digital models of a city containing the graphic representation of buildings and other objects in 2.5D or 3D. Generally, three main geomatics approaches are used for virtual 3D city model generation: in the first approach, researchers use conventional techniques such as vector map data, DEMs, and aerial images; the second approach is based on high-resolution satellite images with laser scanning; in the third method, researchers use terrestrial images through close-range photogrammetry with DSM and texture mapping. This paper begins with an introduction to the various geomatics techniques for 3D city modeling. These techniques are divided into two main categories: one based on the degree of automation (automatic, semi-automatic, and manual methods), and another based on data input techniques (photogrammetry and laser techniques). The paper then gives an overview of the techniques for generating virtual 3D city models using geomatics techniques and of the applications of such models, closing with conclusions, a justification and analysis, and present trends in 3D city modeling. Photogrammetry (close-range, aerial, satellite), lasergrammetry, GPS, or a combination of these modern geomatics techniques play a major role in creating a virtual 3D city model. Each technique and method has some advantages and some drawbacks; point cloud models are a modern trend for virtual 3D city models. A photo-realistic, scalable, geo-referenced virtual 3D city model is very useful for various kinds of applications, such as planning in navigation, tourism, disaster management, transportation, municipal administration, urban environmental management, and the real-estate industry. The construction of virtual 3D city models has therefore been a most interesting research topic in recent years.
Neuronize: a tool for building realistic neuronal cell morphologies
Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis; DeFelipe, Javier; Benavides-Piccione, Ruth
2013-01-01
This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments. PMID:23761740
Laboratory and Physical Modelling of Building Ventilation Flows
NASA Astrophysics Data System (ADS)
Hunt, Gary
2001-11-01
Heating and ventilating buildings accounts for a significant fraction of the total energy budget of cities, and an immediate challenge in building physics is the design of sustainable, low-energy buildings. Natural ventilation provides a low-energy solution, as it harnesses the buoyancy force associated with temperature differences between the internal and external environments, together with the wind, to drive a ventilating flow. Modern naturally ventilated buildings use innovative design solutions, e.g. glazed atria and solar chimneys, to enhance the ventilation, and demand for these and other designs has far outstripped our understanding of the fluid mechanics within these buildings. Developing an understanding of the thermal stratification and movement of air provides a considerable challenge, as the flows involve interactions between stratification and turbulence, often in complex geometries. An approach that has provided significant new insight into these flows, and which has led to the development of design guidelines for architects and ventilation engineers, is laboratory modelling at small scale in water tanks combined with physical modelling. Density differences to drive the flow in simplified plexiglass models of rooms or buildings are provided by fresh and salt water solutions, and wind flow is represented by a mean flow in a flume tank. In tandem with the experiments, theoretical models that capture the essential physics of these flows have been developed in order to generalise the experimental results to a wide range of typical building geometries and operating conditions. This paper describes the application and outcomes of these modelling techniques in the study of a variety of natural ventilation flows in buildings.
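A standard scaling in this literature relates the ventilating flow to the buoyancy force; assuming the usual displacement-ventilation form, with salt-bath experiments reproducing the reduced gravity through density differences:

```latex
g' = g\,\frac{\rho_e - \rho_i}{\rho_e} \approx g\,\frac{\Delta T}{T_e},
\qquad
Q = C_d\,A^{*}\sqrt{g' H}
```

Here Q is the ventilation flow rate, A* the effective opening area, H the height between low- and high-level openings, and C_d a discharge coefficient; matching the reduced gravity g' between a fresh/salt-water model and a full-scale building is what allows results from small-scale tanks to be generalised to real buildings.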
Regression Commonality Analysis: A Technique for Quantitative Theory Building
ERIC Educational Resources Information Center
Nimon, Kim; Reio, Thomas G., Jr.
2011-01-01
When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominantly on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
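For two predictors, commonality analysis decomposes the full-model R-squared into two unique components and one common component, computed from the R-squared values of the sub-models. A minimal sketch with synthetic data (the coefficients and sample size are arbitrary):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def r2(X, y):
    return LinearRegression().fit(X, y).score(X, y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.5 * x1 + rng.normal(size=200)          # deliberately correlated predictors
y = 1.0 * x1 + 0.8 * x2 + rng.normal(size=200)

X = np.column_stack([x1, x2])
r2_full = r2(X, y)
r2_x1 = r2(X[:, [0]], y)
r2_x2 = r2(X[:, [1]], y)

unique_x1 = r2_full - r2_x2                   # variance only x1 explains
unique_x2 = r2_full - r2_x1                   # variance only x2 explains
common = r2_full - unique_x1 - unique_x2      # variance shared by x1 and x2

print(f"unique x1={unique_x1:.3f}, unique x2={unique_x2:.3f}, common={common:.3f}")
```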
Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...
Automated Classification of Heritage Buildings for As-Built Bim Using Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Bassier, M.; Vergauwen, M.; Van Genechten, B.
2017-08-01
Semantically rich three dimensional models such as Building Information Models (BIMs) are increasingly used in digital heritage. They provide the required information to varying stakeholders during the different stages of a historic building's life cycle, which is crucial in the conservation process. The creation of as-built BIM models is based on point cloud data. However, manually interpreting this data is labour intensive and often leads to misinterpretations. By automatically classifying the point cloud, the information can be processed more efficiently. A key aspect in this automated scan-to-BIM process is the classification of building objects. In this research we look to automatically recognise elements in existing buildings to create compact semantic information models. Our algorithm efficiently extracts the main structural components such as floors, ceilings, roofs, walls and beams despite the presence of significant clutter and occlusions. More specifically, Support Vector Machines (SVM) are proposed for the classification. The algorithm is evaluated using real data of a variety of existing buildings. The results prove that the used classifier recognizes the objects with both high precision and recall. As a result, entire data sets are reliably labelled at once. The approach enables experts to better document and process heritage assets.
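A hedged sketch of the classification step: per-point geometric features feed a kernel SVM. The feature set, the class labels and the random stand-in data below are assumptions for illustration, not the paper's actual feature design.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-point features: height above ground, planarity of the
# local neighbourhood, verticality of the local normal.
# Hypothetical labels: 0=floor/ceiling, 1=wall, 2=roof/beam.
rng = np.random.default_rng(1)
n = 300
features = np.column_stack([
    rng.uniform(0.0, 12.0, n),   # height (m)
    rng.uniform(0.0, 1.0, n),    # planarity
    rng.uniform(0.0, 1.0, n),    # verticality
])
labels = rng.integers(0, 3, n)   # stand-in for manually labelled training points

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(features, labels)
print(clf.predict(features[:5]))
```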
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties (stocks, flows and feedbacks) capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time-graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can result ultimately in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
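Stocks, flows and feedbacks reduce to a simple numerical scheme: a stock is integrated over time while flows, some of which depend on the stock itself, move material in and out. A minimal sketch with invented rates, not calibrated to any study:

```python
def simulate(weeks=52, dt=1.0):
    """Euler integration of a single-stock model of misuse prevalence."""
    stock = 1000.0            # students currently misusing alcohol (stock)
    onset_rate = 40.0         # inflow: new cases per week
    recovery_time = 30.0      # weeks; outflow is stock / recovery_time
    history = []
    for _ in range(int(weeks / dt)):
        inflow = onset_rate
        outflow = stock / recovery_time   # feedback: outflow depends on the stock
        stock += dt * (inflow - outflow)
        history.append(stock)
    return history

# The stock approaches the equilibrium onset_rate * recovery_time = 1200.
print(round(simulate()[-1], 1))
```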
GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration
Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng
2015-01-01
The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315
NASA Astrophysics Data System (ADS)
Li, Xiwang
Buildings consume about 41.1% of primary energy and 74% of the electricity in the U.S. Moreover, it is estimated by the National Energy Technology Laboratory that more than 1/4 of the 713 GW of U.S. electricity demand in 2010 could be dispatchable if only buildings could respond to that dispatch through advanced building energy control and operation strategies and smart grid infrastructure. In this study, it is envisioned that neighboring buildings will have the tendency to form a cluster, an open cyber-physical system to exploit the economic opportunities provided by a smart grid, distributed power generation, and storage devices. Through optimized demand management, these building clusters will then reduce overall primary energy consumption and peak time electricity consumption, and be more resilient to power disruptions. Therefore, this project seeks to develop a Net-zero building cluster simulation testbed and high fidelity energy forecasting models for adaptive and real-time control and decision making strategy development that can be used in a Net-zero building cluster. The following research activities are summarized in this thesis: 1) Development of a building cluster emulator for building cluster control and operation strategy assessment; 2) Development of a novel building energy forecasting methodology using active system identification and data fusion techniques. In this methodology, a systematic approach for building energy system characteristic evaluation, system excitation and model adaptation is included. The developed methodology is compared with other literature-reported building energy forecasting methods; 3) Development of the high fidelity on-line building cluster energy forecasting models, which includes energy forecasting models for buildings, PV panels, batteries and ice tank thermal storage systems; 4) Small scale real building validation study to verify the performance of the developed building energy forecasting methodology. The outcomes of this thesis can be used for building cluster energy forecasting model development and model based control and operation optimization. The thesis concludes with a summary of the key outcomes of this research, as well as a list of recommendations for future work.
Structural Equation Model Trees
ERIC Educational Resources Information Center
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2013-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…
Modeling User Behavior in Computer Learning Tasks.
ERIC Educational Resources Information Center
Mantei, Marilyn M.
Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Custodio, Claudine
2016-04-16
Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming manual data acquisition and often do not deliver results until years after the program period has ended. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. These meter- and software-based approaches, increasingly referred to as “M&V 2.0”, are the subject of surging industry interest, particularly in the context of utility energy efficiency programs. Program administrators, evaluators, and regulators are asking how M&V 2.0 compares with more traditional methods, how proprietary software can be transparently performance tested, and how these techniques can be integrated into the next generation of whole-building focused efficiency programs. This paper expands recent analyses of public-domain whole-building M&V methods, focusing on more novel M&V 2.0 modeling approaches that are used in commercial technologies, as well as approaches that are documented in the literature and/or developed by the academic building research community. We present a testing procedure and metrics to assess the performance of whole-building M&V methods. We then illustrate the test procedure by evaluating the accuracy of ten baseline energy use models against measured data from a large dataset of 537 buildings. The results of this study show that the already available advanced interval data baseline models hold great promise for scaling the adoption of building measured savings calculations using Advanced Metering Infrastructure (AMI) data. Median coefficient of variation of the root mean squared error (CV(RMSE)) was less than 25% for every model tested when twelve months of training data were used. With even six months of training data, median CV(RMSE) for daily energy totals was under 25% for all models tested. Finally, these findings can be used to build confidence in model robustness and the readiness of these approaches for industry uptake and adoption.
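CV(RMSE), the headline metric above, is simple to reproduce. A minimal sketch with made-up daily energy totals (this uses the population form of the RMSE; ASHRAE Guideline 14 style calculations divide by n minus the number of model parameters instead):

```python
import numpy as np

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE, in percent."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((measured - predicted) ** 2))
    return 100.0 * rmse / measured.mean()

baseline = [120.0, 135.0, 150.0, 160.0, 142.0, 128.0]  # metered daily kWh
model    = [118.0, 140.0, 147.0, 155.0, 150.0, 125.0]  # baseline-model prediction
print(f"CV(RMSE) = {cv_rmse(baseline, model):.1f}%")
```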
Estimation of Solar Radiation on Building Roofs in Mountainous Areas
NASA Astrophysics Data System (ADS)
Agugiaro, G.; Remondino, F.; Stevanato, G.; De Filippi, R.; Furlanello, C.
2011-04-01
The aim of this study is to estimate solar radiation on building roofs in complex mountain landscape areas. A multi-scale solar radiation estimation methodology is proposed that combines 3D data ranging from the regional scale to the architectural one. Both the terrain and the nearby building shadowing effects are considered. The approach is modular, and several alternative roof models, obtained by surveying and modelling techniques at varying levels of detail, can be embedded in a DTM, e.g. that of an Alpine valley surrounded by mountains. The solar radiation maps obtained from raster models at different resolutions are compared and evaluated in order to obtain information regarding the benefits and disadvantages tied to each roof modelling approach. The solar radiation estimation is performed within the open-source GRASS GIS environment using r.sun and its ancillary modules.
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely-used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
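As a flavor of the machine-learning component, here is a minimal echo-state-style reservoir computer trained by ridge regression on a toy one-step-ahead prediction task. The reservoir size, spectral radius and regularization are illustrative choices, and the knowledge-based half of the hybrid scheme is omitted:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy task: one-step-ahead prediction of a noisy sine from past values.
t = np.linspace(0, 60, 3000)
u = np.sin(t) + 0.05 * rng.normal(size=t.size)

n_res, ridge = 200, 1e-6
w_in = rng.uniform(-0.5, 0.5, n_res)
w = rng.normal(size=(n_res, n_res))
w *= 0.9 / np.abs(np.linalg.eigvals(w)).max()   # scale spectral radius below 1

# Drive the reservoir with the input signal and record its states.
states = np.zeros((u.size - 1, n_res))
r = np.zeros(n_res)
for i in range(u.size - 1):
    r = np.tanh(w @ r + w_in * u[i])
    states[i] = r

# Ridge-regress a linear readout mapping reservoir state -> next measurement.
target = u[1:]
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ w_out
print("train RMSE:", float(np.sqrt(np.mean((pred - target) ** 2))))
```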
NASA Astrophysics Data System (ADS)
Chouika, N.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.
2018-05-01
A systematic approach for the model building of Generalized Parton Distributions (GPDs), based on their overlap representation within the DGLAP kinematic region and a further covariant extension to the ERBL one, is applied to the valence-quark pion's case, using light-front wave functions inspired by the Nakanishi representation of the pion Bethe-Salpeter amplitudes (BSA). This simple but fruitful pion GPD model illustrates the general model building technique and, in addition, allows for the ambiguities related to the covariant extension, grounded on the Double Distribution (DD) representation, to be constrained by requiring a soft-pion theorem to be properly observed.
Elia, Marinos; Betts, Peter; Jackson, Diane M; Mulligan, Jean
2007-09-01
Intrauterine programming of body composition [percentage body fat (%BF)] has been sparsely examined with multiple independent reference techniques in children. The effects on, and consequences of, body build (dimensions, mass, and length of body segments) are unclear. The study examined whether percentage fat and the relation of percentage fat to body mass index (BMI; in kg/m2) in prepubertal children are programmed during intrauterine development and are dependent on body build. It also aimed to examine the extent to which height can be predicted by parental height and birth weight. Eighty-five white children (44 boys, 41 girls; aged 6.5-9.1 y) had body composition measured with a 4-component model (n = 58), dual-energy X-ray absorptiometry (n = 84), deuterium dilution (n = 81), densitometry (n = 62), and skinfold thicknesses (n = 85). An increase in birth weight of 1 SD was associated with a decrease of 1.95% fat as measured by the 4-component model (P = 0.012) and 0.82-2.75% by the other techniques. These associations were independent of age, sex, socioeconomic status, physical activity, BMI, and body build. Body build did not decrease the strength of the associations. Birth weight was a significantly better predictor of height than was self-reported midparental height, accounting for 19.4% of the variability at 5 y of age and 10.3% at 7.8 y of age (17.8% and 8.8% of which were independent of parental height at these ages, respectively). Consistent trends across body-composition measurement techniques add strength to the suggestion that percentage fat in prepubertal children is programmed in utero (independently of body build and BMI). It also suggests birth weight is a better predictor of prepubertal height than is self-reported midparental height.
ERIC Educational Resources Information Center
Gálvez, Jaime; Conejo, Ricardo; Guzmán, Eduardo
2013-01-01
One of the most popular student modeling approaches is Constraint-Based Modeling (CBM). It is an efficient approach that can be easily applied inside an Intelligent Tutoring System (ITS). Even with these characteristics, building new ITSs requires carefully designing the domain model to be taught because different sources of errors could affect…
Jacquemin, Bénédicte; Lepeule, Johanna; Boudier, Anne; Arnould, Caroline; Benmerad, Meriem; Chappaz, Claire; Ferran, Joane; Kauffmann, Francine; Morelli, Xavier; Pin, Isabelle; Pison, Christophe; Rios, Isabelle; Temam, Sofia; Künzli, Nino; Slama, Rémy; Siroux, Valérie
2013-09-01
Errors in address geocodes may affect estimates of the effects of air pollution on health. We investigated the impact of four geocoding techniques on the association between urban air pollution estimated with a fine-scale (10 m × 10 m) dispersion model and lung function in adults. We measured forced expiratory volume in 1 sec (FEV1) and forced vital capacity (FVC) in 354 adult residents of Grenoble, France, who were participants in two well-characterized studies, the Epidemiological Study on the Genetics and Environment on Asthma (EGEA) and the European Community Respiratory Health Survey (ECRHS). Home addresses were geocoded using individual building matching as the reference approach and three spatial interpolation approaches. We used a dispersion model to estimate mean PM10 and nitrogen dioxide concentrations at each participant's address during the 12 months preceding their lung function measurements. Associations between exposures and lung function parameters were adjusted for individual confounders and same-day exposure to air pollutants. The geocoding techniques were compared with regard to geographical distances between coordinates, exposure estimates, and associations between the estimated exposures and health effects. Median distances between coordinates estimated using the building matching and the three interpolation techniques were 26.4, 27.9, and 35.6 m. Compared with exposure estimates based on building matching, PM10 concentrations based on the three interpolation techniques tended to be overestimated. When building matching was used to estimate exposures, a one-interquartile range increase in PM10 (3.0 μg/m3) was associated with a 3.72-point decrease in FVC% predicted (95% CI: -0.56, -6.88) and a 3.86-point decrease in FEV1% predicted (95% CI: -0.14, -3.24). The magnitude of associations decreased when other geocoding approaches were used [e.g., for FVC% predicted -2.81 (95% CI: -0.26, -5.35) using NavTEQ, or 2.08 (95% CI -4.63, 0.47, p = 0.11) using Google Maps]. Our findings suggest that the choice of geocoding technique may influence estimated health effects when air pollution exposures are estimated using a fine-scale exposure model.
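The distance comparison between geocoding techniques comes down to great-circle distances between coordinate pairs. A minimal sketch with hypothetical coordinates for one address geocoded two different ways:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two geocodes."""
    r = 6371000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical coordinates: building matching vs. street interpolation.
building_match = (45.1885, 5.7245)
interpolated   = (45.1887, 5.7248)
print(f"{haversine_m(*building_match, *interpolated):.1f} m apart")
```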
Saini, Harsh; Raicar, Gaurav; Dehzangi, Abdollah; Lal, Sunil; Sharma, Alok
2015-12-07
Protein subcellular localization is an important topic in proteomics since it is related to a protein's overall function, helps in the understanding of metabolic pathways, and in drug design and discovery. In this paper, a basic approximation technique from natural language processing called the linear interpolation smoothing model is applied for predicting protein subcellular localizations. The proposed approach extracts features from syntactical information in protein sequences to build probabilistic profiles using dependency models, which are used in linear interpolation to determine how likely is a sequence to belong to a particular subcellular location. This technique builds a statistical model based on maximum likelihood. It is able to deal effectively with high dimensionality that hinders other traditional classifiers such as Support Vector Machines or k-Nearest Neighbours without sacrificing performance. This approach has been evaluated by predicting subcellular localizations of Gram positive and Gram negative bacterial proteins. Copyright © 2015 Elsevier Ltd. All rights reserved.
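The core of linear interpolation smoothing is a weighted mixture of n-gram maximum-likelihood estimates. A minimal sketch over a toy amino-acid sequence; the mixture weights are illustrative (in practice they would be tuned, e.g. by maximum likelihood on held-out data):

```python
from collections import Counter

def interpolated_prob(seq, i, uni, bi, tri, lambdas=(0.2, 0.3, 0.5)):
    """P(seq[i]) as a weighted mix of uni-, bi- and trigram estimates."""
    l1, l2, l3 = lambdas                  # mixture weights, summing to 1
    a = seq[i]
    p1 = uni[a] / max(1, sum(uni.values()))
    p2 = bi[(seq[i-1], a)] / max(1, uni[seq[i-1]]) if i >= 1 else 0.0
    p3 = (tri[(seq[i-2], seq[i-1], a)] / max(1, bi[(seq[i-2], seq[i-1])])
          if i >= 2 else 0.0)
    return l1 * p1 + l2 * p2 + l3 * p3

protein = "MKVLAAGLLK"   # toy sequence; real profiles come from training sets
uni = Counter(protein)
bi = Counter(zip(protein, protein[1:]))
tri = Counter(zip(protein, protein[1:], protein[2:]))
print(f"{interpolated_prob(protein, 4, uni, bi, tri):.3f}")
```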
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil
2014-08-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922
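The flavor of recursive peeling can be seen in the classic PRIM-style loop below, simplified to a mean-response criterion on synthetic data; the paper's survival setting replaces this criterion with hazard-ratio or log-rank statistics:

```python
import numpy as np

def peel(X, y, alpha=0.1, min_support=20):
    """Greedy box peeling: repeatedly trim an alpha-fraction edge, along
    whichever variable most increases the mean response inside the box."""
    box = np.ones(len(y), dtype=bool)
    while box.sum() > min_support:
        best_gain, best_mask = 0.0, None
        for j in range(X.shape[1]):
            for lo, hi in [(alpha, None), (None, 1 - alpha)]:
                q = np.quantile(X[box, j], lo if lo else hi)
                cand = box & (X[:, j] >= q if lo else X[:, j] <= q)
                if cand.sum() >= min_support:
                    gain = y[cand].mean() - y[box].mean()
                    if gain > best_gain:
                        best_gain, best_mask = gain, cand
        if best_mask is None:      # no edge improves the box mean any further
            break
        box = best_mask
    return box

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))
y = (X[:, 0] > 0.7).astype(float) + 0.1 * rng.normal(size=500)  # bump in x0
inside = peel(X, y)
print(inside.sum(), "points in box, mean response", round(float(y[inside].mean()), 2))
```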
NASA Astrophysics Data System (ADS)
Arida, Maya Ahmad
The concept of sustainable development has existed since 1972 and has over the years become one of the most important approaches to conserving natural resources and energy; now, with rising energy costs and increasing awareness of the effects of global warming, the development of building energy saving methods and models becomes ever more necessary for a sustainable future. According to the U.S. Energy Information Administration (EIA), buildings in the U.S. today consume 72 percent of the electricity produced and use 55 percent of U.S. natural gas. Buildings account for about 40 percent of the energy consumed in the United States, more than industry and transportation. Of this energy, heating and cooling systems use about 55 percent. If energy-use trends continue, buildings will become the largest consumer of global energy by 2025. This thesis proposes procedures and analysis techniques for building energy system modeling and optimization using time-series autoregressive artificial neural networks. The model predicts whole-building energy consumption as a function of four input variables: dry bulb and wet bulb outdoor air temperatures, hour of day, and type of day. The proposed model and the optimization process are tested using data collected from an existing building located in Greensboro, NC. The testing results show that the model captures the system performance very well. The optimization method automates the process of finding the model structure that produces the most accurate prediction against the actual data. The results show that the developed model can provide results sufficiently accurate for use in various energy efficiency and savings estimation applications.
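A minimal sketch of such an autoregressive neural-network forecaster, using lagged loads plus exogenous inputs on synthetic data (the network size and two-lag structure are illustrative assumptions; the thesis's optimization searches over such structures):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
hours = np.arange(24 * 60)                                   # 60 days, hourly
t_out = 15 + 8 * np.sin(2 * np.pi * hours / 24)              # synthetic dry bulb
load = 50 + 2.0 * t_out + 10 * np.sin(2 * np.pi * hours / 24) \
       + rng.normal(scale=2.0, size=hours.size)              # synthetic kWh

# Features: previous two loads (autoregressive terms), outdoor temperature,
# and hour of day; type of day and wet bulb are omitted for brevity.
X = np.column_stack([load[1:-1], load[:-2], t_out[2:], hours[2:] % 24])
y = load[2:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-100], y[:-100])
print("held-out R^2:", round(model.score(X[-100:], y[-100:]), 3))
```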
Summary on several key techniques in 3D geological modeling.
Mei, Gang
2014-01-01
Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized.
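Of the three listed techniques, spatial interpolation is the easiest to illustrate. A minimal inverse-distance-weighting sketch that interpolates a geological interface elevation from hypothetical borehole picks:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of scattered elevation picks."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # eps guards against zero distance
    return (w @ z_known) / w.sum(axis=1)

# Hypothetical borehole intersections with one geological interface (x, y) -> z.
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
z = np.array([12.0, 15.0, 11.0, 18.0])
grid = np.array([[50.0, 50.0], [25.0, 75.0]])
print(idw(pts, z, grid))
```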
Modeling and managing risk early in software development
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.
1993-01-01
In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
Indoor Radon Concentration Related to Different Radon Areas and Indoor Radon Prediction
NASA Astrophysics Data System (ADS)
Juhásová Šenitková, Ingrid; Šál, Jiří
2017-12-01
Indoor radon has been observed in buildings located in areas with different radon risk potential. Preventive measures are based on the control of the main potential radon sources (soil gas, building material and supplied water) to avoid building new houses above the recommended indoor radon level of 200 Bq/m3. Radon risk (index) estimation of the bedrock at an individual building site when siting a new house, and building protection according to the technical building code, are obligatory. Remedial actions in buildings built in high radon risk areas were carried out principally by unforced ventilation and anti-radon insulation. Significant differences were found in the level of radon concentration between rooms where radon reduction techniques were applied and those where they were not. A mathematical model based on radon exhalation from soil has been developed to describe the physical processes determining indoor radon concentration. The model is focused on combined radon diffusion through the slab and advection through the gap from sub-slab soil. In this model, radon emanated from building materials is considered not to make a significant contribution to indoor radon concentration. Dimensional analysis and Gauss-Newton nonlinear least squares parametric regression were used to simplify the problem, identify essential input variables and find parameter values. The presented verification case study is introduced for real buildings with respect to various underground construction types. The present paper gives a picture of a possible mathematical approach to indoor radon concentration prediction.
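Whichever entry pathways dominate, a single well-mixed zone obeys a simple mass balance that is useful for orientation. A minimal sketch (not the paper's combined diffusion-advection model) with an illustrative entry rate, room volume and air-change rate:

```python
import math

def indoor_radon(entry_rate_bq_h, volume_m3, ach, t_hours, c0=0.0):
    """Well-mixed mass balance: dC/dt = S/V - (ach + decay) * C."""
    lam = ach + math.log(2) / (3.8235 * 24)   # ventilation + Rn-222 decay (1/h)
    c_eq = entry_rate_bq_h / (volume_m3 * lam)
    return c_eq + (c0 - c_eq) * math.exp(-lam * t_hours)

# Hypothetical dwelling: 10 kBq/h entry, 250 m3, 0.3 air changes per hour.
print(f"{indoor_radon(10000.0, 250.0, 0.3, t_hours=48):.0f} Bq/m3")
```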
ERIC Educational Resources Information Center
Warren, Michael D.
1997-01-01
Explains a method to enable students to understand DNA and protein synthesis using model-building and role-playing. Acquaints students with the triplet code and transcription. Includes copies of the charts used in this technique. (DDR)
Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.
2015-08-01
The automated and cost-effective detection of buildings in ultra high spatial resolution imagery is of major importance for various engineering and smart city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and fish-eye effects. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, through which the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.
Benedek, C; Descombes, X; Zerubia, J
2012-01-01
In this paper, we introduce a new probabilistic method which integrates building extraction with change detection in remotely sensed image pairs. A global optimization process attempts to find the optimal configuration of buildings, considering the observed data, prior knowledge, and interactions between the neighboring building parts. We present methodological contributions in three key issues: 1) We implement a novel object-change modeling approach based on Multitemporal Marked Point Processes, which simultaneously exploits low-level change information between the time layers and object-level building description to recognize and separate changed and unaltered buildings. 2) To answer the challenges of data heterogeneity in aerial and satellite image repositories, we construct a flexible hierarchical framework which can create various building appearance models from different elementary feature-based modules. 3) To simultaneously ensure the convergence, optimality, and computation complexity constraints raised by the increased data quantity, we adopt the quick Multiple Birth and Death optimization technique for change detection purposes, and propose a novel nonuniform stochastic object birth process which generates relevant objects with higher probability based on low-level image features.
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
NASA Astrophysics Data System (ADS)
Bittner, K.; d'Angelo, P.; Körner, M.; Reinartz, P.
2018-05-01
Three-dimensional building reconstruction from remote sensing imagery is one of the most difficult and important 3D modeling problems for complex urban environments. In remote sensing, the main data sources providing a digital representation of the Earth's surface and of the related natural, cultural, and man-made objects of urban areas are digital surface models (DSMs). The DSMs can be obtained either by light detection and ranging (LIDAR), SAR interferometry or from stereo images. Our approach relies on automatic global 3D building shape refinement from stereo DSMs using deep learning techniques. This refinement is necessary as the DSMs, which are extracted from image matching point clouds, suffer from occlusions, outliers, and noise. Though most previous works have shown promising results for building modeling, this topic remains an open research area. We present a new methodology which not only generates images with continuous values representing the elevation models but, at the same time, enhances the 3D object shapes, buildings in our case. Mainly, we train a conditional generative adversarial network (cGAN) to generate accurate LIDAR-like DSM height images from the noisy stereo DSM input. The obtained results demonstrate the strong potential of creating large-area remote sensing depth images where the buildings exhibit better-quality shapes and roof forms.
Cakmak, Ercan; Kirka, Michael M.; Watkins, Thomas R.; ...
2016-02-23
Theta-shaped specimens were additively manufactured out of Inconel 718 powders using an electron beam melting technique, as a model complex load bearing structure. We employed two different build strategies, producing two sets of specimens. Microstructural and micro-mechanical characterizations were performed using electron back-scatter, synchrotron x-ray and in-situ neutron diffraction techniques. In particular, the cross-members of the specimens were the focus of the synchrotron x-ray and in-situ neutron diffraction measurements. The build strategies employed resulted in the formation of distinct microstructures and crystallographic textures, signifying the importance of build-parameter manipulation for microstructural optimization. Large strain anisotropy of the different lattice planes was observed during in-situ loading. Texture was concluded to have a distinct effect upon both the axial and transverse strain responses of the cross-members. In particular, the (200), (220) and (420) transverse lattice strains all showed unexpected overlapping trends in both builds. This was related to the strong {200} textures along the build/loading direction, providing agreement between the experimental and calculated results.
NASA Astrophysics Data System (ADS)
Santagati, C.; Inzerillo, L.; Di Paola, F.
2013-07-01
3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from data set collections to rapidly build detailed 3D models. The simultaneous application of different algorithms (MVS) and of different techniques for image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing in order to carry out semi-automatic data processing, thus allowing the user to fulfill other tasks on his or her computer, whereas desktop systems demand long processing times and heavyweight workflows. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but approaches to verifying metric accuracy are few, and none address Autodesk 123D Catch applied to architectural heritage documentation. Our approach to this challenging problem is to compare the 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LIDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings for practitioner purposes.
Road traffic air and noise pollution exposure assessment - A review of tools and techniques.
Khan, Jibran; Ketzel, Matthias; Kakosimos, Konstantinos; Sørensen, Mette; Jensen, Steen Solvang
2018-09-01
Road traffic induces air and noise pollution in urban environments, with negative impacts on human health. Thus, estimating exposure to road traffic air and noise pollution (hereafter, air and noise pollution) is important in order to improve the understanding of human health outcomes in epidemiological studies. The aims of this review are (i) to summarize current practices of modelling and exposure assessment techniques for road traffic air and noise pollution and (ii) to highlight the potential of existing tools and techniques for combined exposure assessment of air and noise, together with associated challenges, research gaps and priorities. The study reviews the literature on air and noise pollution from urban road traffic, including other relevant characteristics such as the employed dispersion models, Geographic Information System (GIS)-based tools, the spatial scale of exposure assessment, study location, sample size, type of traffic data and building geometry information. Deterministic modelling is the most frequently used assessment technique for both short-term and long-term exposure to air and noise pollution. We observed a larger variety among air pollution models as compared to the applied noise models. Correlations between air and noise pollution vary significantly (0.05-0.74) and are affected by several parameters such as traffic attributes, building attributes and meteorology. Buildings act as screens for the dispersion of pollution, but the reduction effect is much larger for noise than for air pollution. Meteorology has a greater influence on air pollution levels than on noise, although it is also important for noise pollution. There is significant potential for developing a standard tool to assess the combined exposure of traffic-related air and noise pollution to facilitate health-related studies. GIS, due to its geographic nature, is well established and has a significant capability to simultaneously address both exposures. Copyright © 2018 Elsevier B.V. All rights reserved.
A compressed sensing method with analytical results for lidar feature classification
NASA Astrophysics Data System (ADS)
Allen, Josef D.; Yuan, Jiangbo; Liu, Xiuwen; Rahmes, Mark
2011-04-01
We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using compressed sensing methods via Orthogonal Matching Pursuit algorithms, utilizing a generalized K-Means clustering algorithm, to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost effective for large scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristics (ROC) curves to show Probability of Detection and False Alarm of buildings vs. vegetation classification. Histograms are shown with sample size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDE) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits such as Terrain Inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts.
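A hedged sketch of the sparse-coding classification idea: represent a point's neighborhood signal over a dictionary with Orthogonal Matching Pursuit and assign the class whose atoms absorb the most coefficient energy. The dictionary, class layout and signal below are random stand-ins, not the paper's trained atoms:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(7)

# Dictionary with one column block per class (bare earth / building / tree),
# standing in for atoms learned by clustering over labelled LiDAR patches.
D = rng.normal(size=(64, 30))
D /= np.linalg.norm(D, axis=0)
class_of_atom = np.repeat([0, 1, 2], 10)

signal = D[:, 13] * 2.0 + 0.05 * rng.normal(size=64)   # mostly a class-1 atom

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(D, signal)
energy = [np.abs(omp.coef_[class_of_atom == c]).sum() for c in range(3)]
print("assigned class:", int(np.argmax(energy)))
```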
NASA Astrophysics Data System (ADS)
Oniga, E.; Chirilă, C.; Stătescu, F.
2017-02-01
Nowadays, Unmanned Aerial Systems (UASs) are a widely used acquisition technique for creating building 3D models, providing a high number of images at very high resolution, or video sequences, in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created using these platforms must be evaluated. As a test case, the dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen: a complex-shaped building whose roof is formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three of them being used as GCPs and the remaining four as Check Points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing the building's characteristic points were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud which was automatically generated from digital images acquired with the low-cost UASs, using the image matching algorithm and different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds automatically generated using the above-mentioned software, and on GNSS data respectively, the parameters of the east-side hyperbolic paraboloid were calculated using the least squares method and statistical blunder detection. Then, in order to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof against reference data considered to have minimum errors: a TLS mesh for the facades and a GNSS mesh for the roof. Finally, the front facade of the building was created in 3D from its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results showed the high potential of using low-cost UASs for building 3D model creation, and that if the building 3D model is created from its characteristic points the accuracy is significantly improved.
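The paraboloid fitting step is ordinary linear least squares once the surface is written in polynomial form. A minimal sketch on synthetic roof points (the coefficients and noise level are invented; the statistical blunder detection is omitted):

```python
import numpy as np

# Fit z = a*x^2 + b*y^2 + c*x + d*y + e (a hyperbolic paraboloid when a*b < 0)
# to roof points by linear least squares; the points stand in for one roof
# face of the UAS point cloud.
rng = np.random.default_rng(5)
x = rng.uniform(-5, 5, 400)
y = rng.uniform(-5, 5, 400)
z = 0.08 * x**2 - 0.06 * y**2 + 9.0 + 0.02 * rng.normal(size=400)

A = np.column_stack([x**2, y**2, x, y, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - z) ** 2))
print(np.round(coef, 3), f"RMSE = {rmse * 100:.1f} cm")
```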
working at NREL, Maharshi was a graduate student at ASU where he focused on incorporation of machine learning techniques into the modeling of commercial building stocks that can assist the policy-level
Design and Implementation Skills for Social Innovation.
ERIC Educational Resources Information Center
Tornatzky, Louis G.; Fairweather, George W.
New models of research and training combined with dissemination techniques can contribute to relevant social change. The Ecological Psychology Program at Michigan State University, a graduate training program which focuses on model building and implementation research, offers ideas on the plausibility of social programming. The process would…
Data mining and statistical inference in selective laser melting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamath, Chandrika
2016-01-11
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J
2018-03-01
Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case. PMID:29533225
Occupancy schedules learning process through a data mining framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Oca, Simona; Hong, Tianzhen
2014-12-17
Building occupancy is a paramount factor in building energy simulations. Specifically, lighting, plug loads, HVAC equipment utilization, fresh air requirements and internal heat gain or loss greatly depend on the level of occupancy within a building. Appropriate methodologies to describe and reproduce the intricate network responsible for human-building interactions are needed. Extrapolation of patterns from big data streams is a powerful analysis technique which allows for a better understanding of energy usage in buildings. A three-step data mining framework is applied to discover occupancy patterns in office spaces. First, a data set of 16 offices with 10 minute interval occupancy data, over a two year period, is mined through a decision tree model which predicts occupancy presence. Then a rule induction algorithm is used to learn a pruned set of rules on the results from the decision tree model. Finally, a cluster analysis is employed in order to obtain consistent patterns of occupancy schedules. The identified occupancy rules and schedules are representative of four archetypal working profiles that can be used as input to current building energy modeling programs, such as EnergyPlus or IDA-ICE, to investigate the impact of occupant presence on design, operation and energy use in office buildings.
Exploitation of Digital Surface Models Generated from WORLDVIEW-2 Data for SAR Simulation Techniques
NASA Astrophysics Data System (ADS)
Ilehag, R.; Auer, S.; d'Angelo, P.
2017-05-01
GeoRaySAR, an automated SAR simulator developed at DLR, identifies buildings in high resolution SAR data by utilizing geometric knowledge extracted from digital surface models (DSMs). Hitherto, the simulator has utilized DSMs generated from LiDAR data acquired by airborne sensors, with vegetation pre-filtered. Discarding the need for pre-optimized model input, DSMs generated from high resolution optical data (acquired with WorldView-2) are used in this work for the extraction of building-related SAR image parts. An automatic preprocessing of the DSMs has been developed for separating buildings from elevated vegetation (trees, bushes) and reducing the noise level. Based on that, automated simulations are triggered considering the properties of real SAR images. Locations in three cities, Munich, London and Istanbul, were chosen as study areas to determine advantages and limitations of WorldView-2 DSMs as input for GeoRaySAR. In addition, the impact of DSM quality on building extraction is evaluated, as well as the use of a building DSM, a DSM containing only buildings. The results indicate that building extents can be detected from DSMs derived from optical satellite data with varying success, depending on the quality of the DSM as well as on the SAR imaging perspective.
Applying Regression Analysis to Problems in Institutional Research.
ERIC Educational Resources Information Center
Bohannon, Tom R.
1988-01-01
Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
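The following sketch illustrates two of the listed ideas, least-squares fitting and a variance inflation factor (VIF) check for multicollinearity; the variables and data are invented for illustration, not drawn from the article.

```python
# Minimal illustration of least-squares fitting, residual analysis,
# and a VIF check for multicollinearity. Data are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 200
hs_gpa = rng.normal(3.0, 0.4, n)
test_score = 200 * hs_gpa + rng.normal(0, 40, n)   # correlated predictor
college_gpa = 0.6 * hs_gpa + 0.002 * test_score + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), hs_gpa, test_score])
beta, *_ = np.linalg.lstsq(X, college_gpa, rcond=None)
resid = college_gpa - X @ beta          # residual analysis starts here

def vif(X, j):
    """VIF of column j: 1/(1-R^2) from regressing it on the others."""
    others = np.delete(X, j, axis=1)
    b, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    ss_res = np.sum((X[:, j] - others @ b) ** 2)
    ss_tot = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 / (1.0 - r2)

print(beta, vif(X, 1), vif(X, 2))  # large VIFs flag multicollinearity
```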
Vehicle System Management Modeling in UML for Ares I
NASA Technical Reports Server (NTRS)
Pearson, Newton W.; Biehn, Bradley A.; Curry, Tristan D.; Martinez, Mario R.
2011-01-01
The Spacecraft & Vehicle Systems Department of Marshall Space Flight Center is responsible for modeling the Vehicle System Management for the Ares I vehicle which was a part of the now canceled Constellation Program. An approach to generating the requirements for the Vehicle System Management was to use the Unified Modeling Language technique to build and test a model that would fulfill the Vehicle System Management requirements. UML has been used on past projects (flight software) in the design phase of the effort but this was the first attempt to use the UML technique from a top down requirements perspective.
Predicting the activity of drugs for a group of imidazopyridine anticoccidial compounds.
Si, Hongzong; Lian, Ning; Yuan, Shuping; Fu, Aiping; Duan, Yun-Bo; Zhang, Kejun; Yao, Xiaojun
2009-10-01
Gene expression programming (GEP) is a novel machine learning technique. GEP is used to build a nonlinear quantitative structure-activity relationship model for predicting the IC(50) of imidazopyridine anticoccidial compounds. This model is based on descriptors calculated from the molecular structure. Four descriptors are selected from the descriptor pool by the heuristic method (HM) to build a multivariable linear model. The GEP method produced a nonlinear quantitative model with a correlation coefficient and a mean error of 0.96 and 0.24 for the training set, and 0.91 and 0.52 for the test set, respectively. The GEP predictions are shown to be in good agreement with the experimental results.
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid under-segmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied and, for each connected component, its area, width, and height are estimated in order to ascertain whether it can be considered a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique offers not only higher performance in terms of completeness and correctness, but also a lower number of under-segmentation errors compared to its original counterpart. The proposed change detection technique produces no omission errors and thus can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his or her decision with a minimum number of mouse clicks.
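A minimal sketch of the connected component step is given below, simplified here to a rasterized change mask rather than the paper's LIDAR point clouds; the thresholds and toy mask are illustrative assumptions, with raster extents standing in for the estimated area, width, and height.

```python
# Label change pixels in a difference mask, then keep components whose
# size is plausible for a building part. Thresholds are invented.
import numpy as np
from scipy import ndimage

change_mask = np.zeros((60, 60), dtype=bool)
change_mask[5:25, 10:30] = True     # a building-sized change
change_mask[40:42, 50:52] = True    # noise-sized change

labels, n = ndimage.label(change_mask)
kept = []
for i, sl in enumerate(ndimage.find_objects(labels), start=1):
    height = sl[0].stop - sl[0].start   # rows
    width = sl[1].stop - sl[1].start    # columns
    area = np.sum(labels[sl] == i)
    if area >= 50 and width >= 5 and height >= 5:
        kept.append(i)                  # plausible demolished/new part
print(kept)   # -> [1]
```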
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Model-Driven Design: Systematically Building Integrated Blended Learning Experiences
ERIC Educational Resources Information Center
Laster, Stephen
2010-01-01
Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings and kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
Summary on Several Key Techniques in 3D Geological Modeling
2014-01-01
Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized. PMID:24772029
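As one concrete instance of the spatial interpolation step named above, the sketch below grids scattered interface elevations onto a planar mesh; the borehole coordinates and the choice of linear interpolation are invented for illustration.

```python
# Interpolate scattered borehole elevations onto a planar mesh to form
# a geological interface surface. Data are invented.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(8)
pts = rng.uniform(0, 100, size=(50, 2))              # borehole x, y
elev = 200 - 0.5 * pts[:, 0] + rng.normal(0, 1, 50)  # interface elevation

gx, gy = np.meshgrid(np.linspace(0, 100, 21), np.linspace(0, 100, 21))
surface = griddata(pts, elev, (gx, gy), method="linear")
print(np.nanmin(surface), np.nanmax(surface))  # NaN outside convex hull
```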
Banach, Marzena; Wasilewska, Agnieszka; Dlugosz, Rafal; Pauk, Jolanta
2018-05-18
Due to the problem of aging societies, there is a need for smart buildings to monitor and support people with various disabilities, including rheumatoid arthritis. The aim of this paper is to elaborate novel techniques for wireless motion capture systems for the monitoring and rehabilitation of disabled people, for application in smart buildings. The proposed techniques are based on cross-verification of distance measurements between markers and transponders in an environment with highly variable parameters. For their verification, algorithms were developed that enable comprehensive investigation of a system with different numbers of transponders and varying ambient parameters (temperature and noise). In the estimation of the real positions of markers, various linear and nonlinear filters were used. Several thousand tests were carried out for various system parameters and different marker locations. The results show that localization error may be reduced by as much as 90%. It was observed that repetition of measurement reduces localization error by as much as one order of magnitude. The proposed system, based on wireless techniques, offers high commercial potential. However, it requires extensive cooperation between teams, including hardware and software design, system modelling, and architectural design.
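A minimal sketch of the underlying localization idea, estimating a marker position from noisy distance measurements to fixed transponders and averaging repeated measurements, is shown below; the geometry, noise level, and repetition count are invented, and the paper's linear and nonlinear filters are not reproduced.

```python
# Multilateration by nonlinear least squares, with measurement
# repetition to reduce noise, as noted in the abstract.
import numpy as np
from scipy.optimize import least_squares

transponders = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
true_pos = np.array([2.0, 3.0])
rng = np.random.default_rng(2)

def measure(repeats):
    d = np.linalg.norm(transponders - true_pos, axis=1)
    noisy = d + rng.normal(0, 0.1, size=(repeats, len(d)))
    return noisy.mean(axis=0)       # averaging repeats shrinks the error

def residuals(p, dists):
    return np.linalg.norm(transponders - p, axis=1) - dists

est = least_squares(residuals, x0=[2.5, 2.5], args=(measure(10),)).x
print(est, np.linalg.norm(est - true_pos))
```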
NASA Astrophysics Data System (ADS)
Lee, Yu-Cheng; Yen, Tieh-Min; Tsai, Chih-Hung
This study provides an integrated model of Supplier Quality Performance Assessment (SQPA) activity for the semiconductor industry by introducing the ISO 9001 management framework, Importance-Performance Analysis (IPA), and Taguchi's Signal-to-Noise Ratio (S/N) techniques. This integrated model provides a SQPA methodology to create value for all members under mutual cooperation and trust in the supply chain. The method helps organizations build a complete SQPA framework, linking organizational objectives and SQPA activities and optimizing rating techniques to promote supplier quality improvement. The techniques used in SQPA activities are easily understood. A case involving a design house is illustrated to demonstrate the model.
Application of zonal model on indoor air sensor network design
NASA Astrophysics Data System (ADS)
Chen, Y. Lisa; Wen, Jin
2007-04-01
Growing concerns over the safety of the indoor environment have made the use of sensors ubiquitous. Sensors that detect chemical and biological warfare agents can offer early warning of dangerous contaminants. However, current sensor system design is informed more by intuition and experience than by systematic design. To develop a sensor system design methodology, a proper indoor airflow modeling approach is needed. Various indoor airflow modeling techniques, from complicated computational fluid dynamics approaches to simplified multi-zone approaches, exist in the literature. In this study, the effects of two airflow modeling techniques, the multi-zone modeling technique and the zonal modeling technique, on indoor air protection sensor system design are discussed. Common building attack scenarios, using a typical CBW agent, are simulated. Both multi-zone and zonal models are used to predict airflows and contaminant dispersion. A genetic algorithm is then applied to optimize the sensor locations and quantity. Differences in the sensor system design resulting from the two airflow models are discussed for a typical office environment and a large hall environment.
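The sketch below illustrates the optimization step in miniature: a simple genetic algorithm choosing sensor zones to minimize worst-case detection time. The detection-time matrix is random stand-in data, whereas in the study it would come from the multi-zone or zonal airflow and dispersion simulations.

```python
# GA over sensor placements: pick k zones so the worst-case (over
# attack scenarios) earliest detection time is small. Sizes invented.
import numpy as np

rng = np.random.default_rng(3)
n_zones, n_scenarios, k = 20, 30, 3
# detect[s, z]: time for a sensor in zone z to detect release scenario s
detect = rng.uniform(1.0, 60.0, size=(n_scenarios, n_zones))

def fitness(placement):
    # negative worst-case of the earliest detection time
    return -np.max(np.min(detect[:, placement], axis=1))

def random_placement():
    return rng.choice(n_zones, size=k, replace=False)

pop = [random_placement() for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:20]
    children = []
    for _ in range(20):
        a, b = rng.choice(len(parents), 2, replace=False)
        genes = np.union1d(parents[a], parents[b])   # crossover
        child = rng.choice(genes, size=k, replace=False)
        if rng.random() < 0.2:                       # mutation
            child[rng.integers(k)] = rng.integers(n_zones)
        if len(np.unique(child)) < k:                # repair duplicates
            child = random_placement()
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(sorted(best), -fitness(best))
```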
Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual
1988-12-01
The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.
NASA Astrophysics Data System (ADS)
Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk
2017-07-01
Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods, which are time-consuming and require a lot of manual labor. Fungi emit volatile organic compounds (VOCs) that can be detected in the indoor air using several detection techniques, e.g. chromatography. VOCs can also be detected using gas sensor arrays. All array sensors generate particular voltage signals that ought to be analyzed using properly selected statistical methods of interpretation. This work focuses on the attempt to apply statistical classification models to the evaluation of signals from a gas sensor matrix used to analyze air sampled from the headspace of various types of building materials at different levels of contamination, as well as clean reference materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riley, William Jowett
1996-05-01
Indoor air exposures to gaseous contaminants originating in soil can cause large human health risks. To predict and control these exposures, the mechanisms that affect vapor transport in near-surface soils need to be understood. In particular, radon exposure is a concern since average indoor radon concentrations lead to much higher risks than are generally accepted for exposure to other environmental contaminants. This dissertation examines an important component of the indoor radon problem: the impacts of wind on soil-gas and radon transport and entry into buildings. The research includes experimental and modeling studies of wind's interactions with a building's superstructure and the resulting soil-gas and radon flows in the surrounding soil. In addition to exploring the effects of steady winds, a novel modeling technique is developed to examine the impacts of fluctuating winds on soil-gas and radon transport.
Hall, Stephen A.; Howlin, Brendan J; Hamerton, Ian; Baidak, Alex; Billaud, Claude; Ward, Steven
2012-01-01
The construction of molecular models of crosslinked polymers is an area of some difficulty and considerable interest. We report here a new method of constructing these models and validate the method by modelling three epoxy systems based on the epoxy monomers bisphenol F diglycidyl ether (BFDGE) and triglycidyl-p-amino phenol (TGAP) with the curing agent diamino diphenyl sulphone (DDS). The main emphasis of the work concerns the improvement of the techniques for the molecular simulation of these epoxies and specific attention is paid towards model construction techniques, including automated model building and prediction of glass transition temperatures (Tg). Typical models comprise some 4200–4600 atoms (ca. 120–130 monomers). In a parallel empirical study, these systems have been cast, cured and analysed by dynamic mechanical thermal analysis (DMTA) to measure Tg. Results for the three epoxy systems yield good agreement with experimental Tg ranges of 200–220°C, 270–285°C and 285–290°C with corresponding simulated ranges of 210–230°C, 250–300°C, and 250–300°C respectively. PMID:22916182
Sariyar, Murat; Hoffmann, Isabell; Binder, Harald
2014-02-26
Molecular data, e.g. arising from microarray technology, are often used for predicting survival probabilities of patients. For multivariate risk prediction models on such high-dimensional data, there are established techniques that combine parameter estimation and variable selection. One big challenge is to incorporate interactions into such prediction models. In this feasibility study, we present building blocks for evaluating and incorporating interaction terms in high-dimensional time-to-event settings, especially for settings in which it is computationally too expensive to check all possible interactions. We use a boosting technique for estimation of effects and the following building blocks for pre-selecting interactions: (1) resampling, (2) random forests, and (3) orthogonalization as a data pre-processing step. In a simulation study, the strategy that uses all building blocks is able to detect true main effects and interactions with high sensitivity in different kinds of scenarios. The main challenge is interactions composed of variables that do not represent main effects, but our findings are also promising in this regard. Results on real-world data illustrate that effect sizes of interactions frequently may not be large enough to improve prediction performance, even though the interactions are potentially of biological relevance. Screening interactions through random forests is feasible and useful when one is interested in finding relevant two-way interactions. The other building blocks also contribute considerably to an enhanced pre-selection of interactions. We determined the limits of interaction detection in terms of necessary effect sizes. Our study emphasizes the importance of making full use of existing methods in addition to establishing new ones.
Method development of damage detection in asymmetric buildings
NASA Astrophysics Data System (ADS)
Wang, Yi; Thambiratnam, David P.; Chan, Tommy H. T.; Nguyen, Andy
2018-01-01
Aesthetics and functionality requirements have caused most buildings to be asymmetric in recent times. Such buildings exhibit complex vibration characteristics under dynamic loads, as there is coupling between the lateral and torsional components of vibration, and are referred to as torsionally coupled buildings. These buildings require three-dimensional modelling and analysis. In spite of much recent research and some successful applications of vibration-based damage detection methods to civil structures, application to asymmetric buildings has remained a challenging task for structural engineers. There has been relatively little research on detecting and locating damage specific to torsionally coupled asymmetric buildings. This paper aims to compare the difference in vibration behaviour between symmetric and asymmetric buildings and then use the vibration characteristics for predicting damage in them. The need for developing a special method to detect damage in asymmetric buildings thus becomes evident. Towards this end, this paper modifies the traditional modal strain energy based damage index by decomposing the mode shapes into their lateral and vertical components to form component-specific damage indices. The improved approach is then developed by combining the modified strain energy based damage indices with the modal flexibility method, modified to suit three-dimensional structures, to form a new damage indicator. The procedure is illustrated through numerical studies conducted on three-dimensional five-story symmetric and asymmetric frame structures with the same layout, after validating the modelling techniques through experimental testing of a laboratory-scale asymmetric building model. Vibration parameters obtained from finite element analysis of the intact and damaged building models are then applied to the proposed algorithms for detecting and locating single and multiple damage in these buildings. The results obtained from a number of different damage scenarios confirm the feasibility of the proposed vibration-based damage detection method for three-dimensional asymmetric buildings.
NASA Astrophysics Data System (ADS)
Barayan, Olfat Mohammad
A considerable amount of money is spent on high energy consumption in traditional buildings located in hot climate regions. High energy consumption is significantly influenced by several causes, including building materials, orientation, mass, and opening sizes. This paper aims to identify these causes and find practical solutions to reduce the annual cost of bills. For the purpose of this study, a simulation-based research method was followed. A comparison between two Revit models was also carried out to point out the major cause of high energy consumption. By analysing different orientations, wall insulation, and window glazing, and applying some other high-performance building techniques, it was confirmed that appropriate building materials play a vital role in energy cost. Therefore, the ability to reduce the energy cost by more than 50% in traditional buildings depends on a careful balance of building materials, mass, orientation, and type of window glazing.
Computational fluid dynamic modeling of the summit of Mt. Hopkins for the MMT Observatory
NASA Astrophysics Data System (ADS)
Callahan, S.
2010-07-01
Over the past three decades, the staff of the MMT Observatory used a variety of techniques to predict the summit wind characteristics, including wind tunnel modeling and the release of smoke bombs. With the planned addition of a new instrument repair facility to be constructed on the summit of Mt. Hopkins, new computational fluid dynamic (CFD) models were made to determine the building's influence on the thermal environment around the telescope. The models compared the wind profiles and density contours above the telescope enclosure with and without the new building. The results show that the steeply sided Mount Hopkins dominates the summit wind profiles. In typical winds, the height of the telescope remains above the ground layer and is sufficiently separated from the new facility to ensure that heat from the new building does not interfere with the telescope. The results also confirmed that the observatory's waste-heat exhaust duct needs to be relocated to prevent heat from being trapped in the wind shadow of the new building and lofting above the telescope. These models provide many insights into understanding the thermal environment of the summit.
OPEN AIR DEMOLITION OF FACILITIES HIGHLY CONTAMINATED WITH PLUTONIUM
DOE Office of Scientific and Technical Information (OSTI.GOV)
LLOYD, E.R.
2007-05-31
The demolition of highly contaminated plutonium buildings usually is a long and expensive process that involves decontaminating the building to near free-release standards and then using conventional methods to remove the structure. It doesn't, however, have to be that way. Fluor has torn down buildings highly contaminated with plutonium without excessive decontamination. By removing the select source term and fixing the remaining contamination on the walls, ceilings, floors, and equipment surfaces, open-air demolition is not only feasible, but it can be done cheaper, better (safer), and faster. Open-air demolition techniques were used to demolish two highly contaminated buildings to slab-on-grade. These facilities on the Department of Energy's Hanford Site were located in, or very near, compounds of operating nuclear facilities that housed hundreds of people working on a daily basis. To keep the facilities operating and the personnel safe, the projects had to be creative in demolishing the structures. Several key techniques were used to control contamination and keep it within the confines of the demolition area: spraying fixatives before demolition; applying fixative and misting with a fine spray of water as the buildings were being taken down; and demolishing the buildings in a controlled and methodical manner. In addition, detailed air-dispersion modeling was done to establish necessary building and meteorological conditions and to confirm the adequacy of the proposed methods. Both demolition projects were accomplished without any spread of contamination outside the modest buffer areas established for contamination control. Furthermore, personnel exposure to radiological and physical hazards was significantly reduced by using heavy equipment rather than "hands on" techniques.
NASA Astrophysics Data System (ADS)
de Marzo, C. N.
2002-06-01
Neutrino astronomy is one of the frontiers of high energy astrophysics. I discuss how to build a neutrino telescope and which requirements such a detector must fulfil. A measurable flux of astrophysical neutrinos is predicted by several models for a detector at the cubic kilometer scale. The way pursued until now in building such huge apparatuses is Cherenkov light detection in water or in ice. There have been attempts to build neutrino telescopes, and several projects are under construction or about to start. This situation is reviewed, and techniques alternative to Cherenkov light detection are also mentioned.
Ancient techniques for new materials
NASA Technical Reports Server (NTRS)
2000-01-01
NASA is looking to biological techniques that are millions of years old to help it develop new materials and technologies for the 21st century. Sponsored by NASA, Jeffrey Brinker of the University of New Mexico is studying how multiple elements can assemble themselves into a composite material that is clear, tough, and impermeable. His research is based on the model of how an abalone builds the nacre, also called mother-of-pearl, inside its shell. The mollusk layers bricks of calcium carbonate (the main ingredient in classroom chalk) and mortar of biopolymer to form a new material (top and bottom left) that is twice as hard and 1,000 times as tough as either of the original building materials.
1993-02-10
[Garbled front matter; recoverable content: the goal of the new technology is to have sufficient control of processing, describable by an appropriate electromagnetic model, to build useful devices (for example, waveguide modulators). The report is organized under the categories: A. Integrated Optical Devices and Technology; B. Integrated Optical Device and Circuit Modeling; C. Cryogenic Etching for Low...]
Mathematical Models in Educational Planning. Education and Development, Technical Reports.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
This volume contains papers, presented at a 1966 OECD meeting, on the possibilities of applying a number of related techniques such as mathematical model building, simulation, and systematic control theory to the problems of educational planning. The authors and their papers are (1) Richard Stone, "A View of the Conference," (2) Hector…
Parent Resources during Adolescence: Effects on Education and Careers in Young Adulthood
ERIC Educational Resources Information Center
Faas, Caitlin; Benson, Mark J.; Kaestle, Christine E.
2013-01-01
Building on the Wisconsin Model of Status Attainment, this study examined the contextual process of obtaining educational attainment and the subsequent work outcomes and career satisfaction. This study used the National Longitudinal Study of Adolescent Health (Add Health) with structural equation modeling techniques to assess US participants from…
Consensus Modeling in Support of a Semi-Automated Read-Across Application (SOT)
Read-across is a widely used technique to help fill data gaps in a risk assessment. With the increasing availability of large amounts of computable in vitro and in vivo data on chemicals, it should be possible to build a variety of computer models to help guide a risk assessor in...
Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling
ERIC Educational Resources Information Center
Dalal, Nikunj
2012-01-01
We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…
Advanced statistics: linear regression, part II: multiple linear regression.
Marill, Keith A
2004-01-01
The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
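A small worked example of the core technique, multiple linear regression with an interaction term, follows; the variable names, data, and coefficients are invented for illustration.

```python
# Fit outcome ~ age + dose + age:dose by ordinary least squares.
import numpy as np

rng = np.random.default_rng(4)
n = 300
age = rng.uniform(20, 80, n)
dose = rng.uniform(0, 10, n)
# outcome depends on both predictors and their interaction
outcome = (5 + 0.2 * age + 1.5 * dose - 0.02 * age * dose
           + rng.normal(0, 1, n))

X = np.column_stack([np.ones(n), age, dose, age * dose])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(beta)   # recovers approximately [5, 0.2, 1.5, -0.02]
```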
NASA Astrophysics Data System (ADS)
Yamamoto, K.; Smith, MC
2016-09-01
This paper studies the problem of passive control of a multi-storey building subjected to an earthquake disturbance. The building is represented as a homogeneous mass chain model, i.e., a chain of identical masses in which there is an identical passive connection between neighbouring masses and a similar connection to a movable point. The paper considers passive interconnections of the most general type, which may require the use of inerters in addition to springs and dampers. It is shown that the scalar transfer functions from the disturbance to a given inter-storey drift can be represented as complex iterative maps. Using these expressions, two graphical approaches are proposed: one gives a method to achieve a prescribed value for the uniform boundedness of these transfer functions independent of the length of the mass chain, and the other is for a fixed length of the mass chain. A case study is presented to demonstrate the effectiveness of the proposed techniques using a 10-storey building model. The disturbance suppression performance of the designed interconnection is also verified for a 10-storey building model which has a different stiffness distribution but with the same undamped first natural frequency as the homogeneous model.
Organism-level models: When mechanisms and statistics fail us
NASA Astrophysics Data System (ADS)
Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.
2014-03-01
Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to arrive at a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.
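A toy calculation of the kind such probabilistic models support, updating the probability of occult nodal disease given an imaging result via Bayes' rule, is sketched below; all probabilities are invented, not taken from the described networks.

```python
# Bayes' rule as the elementary building block of a Bayesian network.
p_disease = 0.25                 # prior: occult lymph node involvement
p_pos_given_d = 0.80             # imaging sensitivity (assumed)
p_pos_given_not_d = 0.10         # false-positive rate (assumed)

p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)
posterior = p_pos_given_d * p_disease / p_pos
print(f"P(disease | positive scan) = {posterior:.2f}")   # ~0.73
```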
Execution models for mapping programs onto distributed memory parallel computers
NASA Technical Reports Server (NTRS)
Sussman, Alan
1992-01-01
The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress, and strain histories throughout a complete flight mission.
INDUCTIVE SYSTEM HEALTH MONITORING WITH STATISTICAL METRICS
NASA Technical Reports Server (NTRS)
Iverson, David L.
2005-01-01
Model-based reasoning is a powerful method for performing system monitoring and diagnosis. Building models for model-based reasoning is often a difficult and time consuming process. The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real time monitoring. IMS processes nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. In particular, a clustering algorithm forms groups of nominal values for sets of related parameters. This establishes constraints on those parameter values that should hold during nominal operation. During monitoring, IMS provides a statistically weighted measure of the deviation of current system behavior from the established normal baseline. If the deviation increases beyond the expected level, an anomaly is suspected, prompting further investigation by an operator or automated system. IMS has shown potential to be an effective, low cost technique to produce system monitoring capability for a variety of applications. We describe the training and system health monitoring techniques of IMS. We also present the application of IMS to a data set from the Space Shuttle Columbia STS-107 flight. IMS was able to detect an anomaly in the launch telemetry shortly after a foam impact damaged Columbia's thermal protection system.
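A minimal sketch of the IMS idea, clustering nominal data to form a baseline and scoring new points by their deviation from the nearest cluster, is given below; the synthetic telemetry, cluster count, and deviation scaling are illustrative assumptions, not the IMS implementation.

```python
# Cluster nominal data, then flag points far from all nominal clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
nominal = rng.normal([50.0, 1.2], [2.0, 0.05], size=(2000, 2))  # temp, pressure
km = KMeans(n_clusters=8, n_init=10).fit(nominal)

# deviation = distance to nearest nominal cluster, scaled by the
# typical nominal deviation (a crude statistical weighting)
d_nominal = np.min(km.transform(nominal), axis=1)
scale = d_nominal.mean() + 3 * d_nominal.std()

def anomaly_score(x):
    return np.min(km.transform(np.atleast_2d(x)), axis=1) / scale

print(anomaly_score([50.5, 1.21]))  # ~ nominal, score < 1
print(anomaly_score([58.0, 0.9]))   # off-nominal, score > 1
```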
Hiyama, Kyosuke
2015-01-01
Applying data mining techniques on a database of BIM models could provide valuable insights in key design patterns implicitly present in these BIM models. The architectural designer would then be able to use previous data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has proposed the method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window to wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered to determine the ideal default values.
Learning Petri net models of non-linear gene interactions.
Mayo, Michael
2005-10-01
Understanding how an individual's genetic make-up influences their risk of disease is a problem of paramount importance. Although machine-learning techniques are able to uncover the relationships between genotype and disease, the problem of automatically building the best biochemical model or "explanation" of the relationship has received less attention. In this paper, I describe a method based on random hill climbing that automatically builds Petri net models of non-linear (or multi-factorial) disease-causing gene-gene interactions. Petri nets are a suitable formalism for this problem, because they are used to model concurrent, dynamic processes analogous to biochemical reaction networks. I show that this method is routinely able to identify perfect Petri net models for three disease-causing gene-gene interactions recently reported in the literature.
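Only the Petri net dynamics are sketched below, as a token game over pre- and post-arc matrices; the hill climber itself, which would mutate these matrices and keep changes that better reproduce the genotype-risk data, is omitted, and the net shown is an arbitrary toy example.

```python
# Minimal Petri net simulator: places hold tokens; a transition fires
# when each of its input places holds enough tokens.
import numpy as np

# arcs: pre[t, p] tokens consumed, post[t, p] tokens produced
pre = np.array([[1, 0, 0],
                [0, 1, 0]])
post = np.array([[0, 1, 0],
                 [0, 0, 1]])
marking = np.array([1, 0, 0])

for _ in range(5):
    enabled = [t for t in range(pre.shape[0])
               if np.all(marking >= pre[t])]
    if not enabled:
        break
    t = enabled[0]                       # deterministic firing choice
    marking = marking - pre[t] + post[t]
print(marking)   # -> [0 0 1]
```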
Reduced modeling of signal transduction – a modular approach
Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter
2007-01-01
Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which allows to dissect the model into smaller modules that are called layers and can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable. Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494
Procedural Modeling for Rapid-Prototyping of Multiple Building Phases
NASA Astrophysics Data System (ADS)
Saldana, M.; Johanson, C.
2013-02-01
RomeLab is a multidisciplinary working group at UCLA that uses the city of Rome as a laboratory for the exploration of research approaches and dissemination practices centered on the intersection of space and time in antiquity. In this paper we present a multiplatform workflow for the rapid-prototyping of historical cityscapes through the use of geographic information systems, procedural modeling, and interactive game development. Our workflow begins by aggregating archaeological data in a GIS database. Next, 3D building models are generated from the ArcMap shapefiles in Esri CityEngine using procedural modeling techniques. A GIS-based terrain model is also adjusted in CityEngine to fit the building elevations. Finally, the terrain and city models are combined in Unity, a game engine which we used to produce web-based interactive environments which are linked to the GIS data using Keyhole Markup Language (KML). The goal of our workflow is to demonstrate that knowledge generated within a first-person virtual world experience can inform the evaluation of data derived from textual and archaeological sources, and vice versa.
NASA Astrophysics Data System (ADS)
Amado, L.; Osma, G.; Villamizar, R.
2016-07-01
This paper presents the modelling of the lighting behaviour of a hybrid lighting system (HLS) in indoor spaces in a tropical climate. HLSs aim to mitigate the problem of the high electricity consumption of artificial lighting in buildings. These systems intelligently integrate daylight and artificial light through control strategies. However, the selection of these strategies usually depends on the expertise of the designer and on the available budget. In order to improve the selection process of the control strategies, this paper analyses the case of the Electrical Engineering Building (EEB): a model of lighting behaviour is first established for the HLS of a classroom and an office. This allows estimating the illuminance level of the mixed lighting in the space, as well as the energy consumption of the artificial light under different lighting control techniques, a control strategy based on occupancy, and a combination of them. The model uses the concept of Daylight Factor (DF) for estimating daylight illuminance on the work plane under tropical climatic conditions. The validation of the model was carried out by comparing measured and model-estimated indoor illuminances.
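A minimal sketch of the modelling idea, indoor daylight estimated via a Daylight Factor with artificial light dimmed to top up to a setpoint, follows; the DF, setpoint, and luminaire values are invented for illustration.

```python
# Hybrid lighting model: daylight via DF plus dimmed artificial top-up.
def hybrid_lighting(e_outdoor_lux, df=0.03, setpoint_lux=500.0,
                    luminaire_max_lux=600.0, luminaire_max_w=120.0):
    daylight = df * e_outdoor_lux            # DF: indoor/outdoor ratio
    deficit = max(0.0, setpoint_lux - daylight)
    dim = min(1.0, deficit / luminaire_max_lux)
    return daylight + dim * luminaire_max_lux, dim * luminaire_max_w

for e_out in (2_000, 10_000, 30_000):        # overcast to bright sky
    lux, watts = hybrid_lighting(e_out)
    print(f"outdoor {e_out:>6} lx -> indoor {lux:6.0f} lx, {watts:5.1f} W")
```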
The Buccaneer software for automated model building. 1. Tracing protein chains.
Cowtan, Kevin
2006-09-01
A new technique for the automated tracing of protein chains in experimental electron-density maps is described. The technique relies on the repeated application of an oriented electron-density likelihood target function to identify likely C(alpha) positions. This function is applied both in the location of a few promising 'seed' positions in the map and to grow those initial C(alpha) positions into extended chain fragments. Techniques for assembling the chain fragments into an initial chain trace are discussed.
NASA Astrophysics Data System (ADS)
Console, R.; Greco, M.; Colangelo, A.; Cioè, A.; Trivigno, L.; Chiappini, M.; Ponzo, F.
2015-12-01
Recognizing that the Italian territory is prone to disasters connected with seismic and hydro-geological risk, it has become necessary to define novel regulations and viable solutions aimed at channelling the economic resources of the Italian Government, too often spent on the management of post-event situations, towards prevention activities. This work summarizes the project developed by the CGIAM together with the INGV, which is open to collaboration with other Italian and international partners. The project is aimed at the development of a national system for the prevention and mitigation of earthquake damage, through the definition of a model that mitigates the risk of building collapse and consequently reduces casualties. Such a model is based on two main issues: a) a correct evaluation of risk, defined as a reliable assessment of the hazard expected at a given site and of the vulnerability of civil and industrial buildings, and b) the development of novel strategies for the safety of buildings. The hazard assessment is pursued through the application of innovative multidisciplinary geophysical methodologies and of a physically based earthquake simulator. The structural vulnerability of buildings is estimated by means of simplified techniques based on a few representative parameters (such as different structural typologies, dynamic soil-structure interaction, etc.) and, for detailed studies, standard protocols for model updating techniques. We analyze, through numerical and experimental approaches, new solutions for the use of innovative materials and new techniques for reducing the seismic vulnerability of structural, non-structural, and accessory elements, including low-cost options. The project activities are initially implemented in a study area in Southern Italy (Calabria), selected because of its tectonic complexity. The results are expected to be applicable to other seismically hazardous areas of Italy.
A framework for automatic feature extraction from airborne light detection and ranging data
NASA Astrophysics Data System (ADS)
Yan, Jianhua
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted increasing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, voluminous data pose a new challenge for automatically extracting geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction, and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region-growing algorithm based on a plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust it. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrated that the proposed framework achieves very good performance.
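The sketch below shows the progressive morphological filter in a 1D simplification: grey openings with growing windows and thresholds separate ground from buildings and vegetation. The window sizes, thresholds, and terrain are invented for illustration.

```python
# 1D progressive morphological filter: points lifted high above the
# opened surface at some window size are flagged as non-ground.
import numpy as np
from scipy.ndimage import grey_opening

x = np.arange(200)
ground = 0.05 * x                      # gentle slope
z = ground.copy()
z[60:80] += 8.0                        # a building
z[120:123] += 5.0                      # a tree
is_ground = np.ones_like(z, dtype=bool)

surface = z.copy()
for window, dh_max in [(3, 0.5), (9, 1.5), (27, 4.0), (81, 10.0)]:
    opened = grey_opening(surface, size=window)
    is_ground &= (surface - opened) <= dh_max
    surface = opened

print(np.where(~is_ground)[0])   # indices of building/tree returns
```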
Hong, Nian; Zhu, Panfeng; Liu, An
2017-12-01
Urban road stormwater is an alternative water resource that can mitigate water shortage issues worldwide. Heavy metals deposited (built up) on urban road surfaces can enter road stormwater runoff, undermining stormwater reuse safety. As heavy metal build-up loads show high spatial variability and are strongly influenced by surrounding land uses, it is essential to develop an approach to identify hot-spots where stormwater runoff could include high heavy metal concentrations and hence cannot be reused unless properly treated. This study developed a robust modelling approach to estimating heavy metal build-up loads on urban roads from land use fractions (the percentages of land uses within a given area) using an artificial neural network (ANN) model. Based on the modelling results, a series of heavy metal load spatial distribution maps and a comprehensive ecological risk map were generated. These maps provide a visualization platform to identify priority areas where stormwater can be safely reused. Additionally, they can be utilized as an urban land use planning tool in the context of effective stormwater reuse strategy implementation.
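A minimal sketch of the modelling approach, a small neural network mapping land use fractions to a heavy-metal build-up load, is given below; the synthetic relation (industrial land driving the load), units, and network size are invented for illustration.

```python
# Predict a heavy-metal build-up load from land use fractions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
n = 500
frac = rng.dirichlet([2.0, 2.0, 2.0], size=n)  # residential, commercial, industrial
load = (0.5 * frac[:, 0] + 1.5 * frac[:, 1] + 6.0 * frac[:, 2]
        + rng.normal(0, 0.1, n))               # e.g. mg/m^2 of zinc

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                     random_state=0).fit(frac, load)
cell = np.array([[0.2, 0.2, 0.6]])             # an industrial hot-spot cell
print(model.predict(cell))                     # high predicted build-up
```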
Fusion of 3D models derived from TLS and image-based techniques for CH enhanced documentation
NASA Astrophysics Data System (ADS)
Bastonero, P.; Donadio, E.; Chiabrando, F.; Spanò, A.
2014-05-01
Recognizing the various advantages offered by new 3D metric survey technologies in the Cultural Heritage documentation phase, this paper presents some tests of 3D model generation using different methods, and of their possible fusion. With the aim of defining the potentialities and problems deriving from the integration or fusion of metric data acquired with different survey techniques, the chosen test case is an outstanding Cultural Heritage item, presenting both widespread and specific complexities connected to the conservation of historical buildings. The site is the Staffarda Abbey, the most relevant evidence of medieval architecture in Piedmont. This application faced one of the most topical architectural issues: the opportunity to study and analyze an object as a whole, from two sensor locations, terrestrial and aerial. In particular, the work evaluates what can be gained from a simple union versus a fusion of different 3D cloud models of the abbey, achieved by multi-sensor techniques. The aerial survey is based on a photogrammetric RPAS (remotely piloted aircraft system) flight, while the terrestrial acquisition was fulfilled by a laser scanning survey. Both techniques allowed the extraction and processing of different point clouds and the generation of 3D continuous models, which are characterized by different scales, that is to say different resolutions and diverse levels of detail and precision. Starting from these models, the proposed process, applied to a sample area of the building, aimed to test the generation of a unique 3D model through the fusion of point clouds from different sensors. The descriptive potential and the metric and thematic gains achievable with the final model exceed those offered by the two separate models.
Associating clinical archetypes through UMLS Metathesaurus term clusters.
Lezcano, Leonardo; Sánchez-Alonso, Salvador; Sicilia, Miguel-Angel
2012-06-01
Clinical archetypes are modular definitions of clinical data, expressed using standard or open constraint-based data models such as CEN EN13606 and openEHR. There is increasing archetype specification activity, which raises the need for techniques to associate archetypes in order to support better management and user navigation in archetype repositories. This paper reports on a computational technique to generate tentative archetype associations by mapping them through term clusters obtained from the UMLS Metathesaurus. The terms are used to build a bipartite graph model, and graph connectivity measures can be used for deriving associations.
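A minimal sketch of the association idea follows: archetypes mapped to term sets and scored pairwise by overlap, a crude stand-in for the bipartite graph connectivity measures used in the paper; the archetype names and terms are invented.

```python
# Associate archetypes through shared terms (Jaccard overlap).
archetype_terms = {
    "blood_pressure_obs": {"blood pressure", "systolic", "diastolic"},
    "heart_rate_obs": {"heart rate", "pulse", "systolic"},
    "medication_order": {"drug", "dose"},
}

def jaccard(a, b):
    return len(a & b) / len(a | b)

names = list(archetype_terms)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        s = jaccard(archetype_terms[names[i]], archetype_terms[names[j]])
        if s > 0:   # tentative association via a shared term cluster
            print(names[i], "<->", names[j], round(s, 2))
```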
Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles
NASA Technical Reports Server (NTRS)
Gamble, Ed
2012-01-01
Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built, and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.
Plank, G; Prassl, AJ; Augustin, C
2014-01-01
Despite the evident multiphysics nature of the heart – it is an electrically controlled mechanical pump – most modeling studies have considered electrophysiology and mechanics in isolation. In no small part, this is due to the formidable modeling challenges involved in building strongly coupled, anatomically accurate, and biophysically detailed multi-scale multi-physics models of cardiac electro-mechanics. Among the main challenges are the selection of model components and their adjustment to achieve integration into a consistent organ-scale model; dealing with technical difficulties such as the exchange of data between the electrophysiological and mechanical models, particularly when using different spatio-temporal grids for discretization; and, finally, the implementation of advanced numerical techniques to deal with the substantial computational burden. In this study we report on progress made in developing a novel modeling framework suited to tackle these challenges. PMID:24043050
Improved Air Combat Awareness with AESA and Next-Generation Signal Processing
2002-09-01
[Slide extraction residue; recoverable items: competence network; building techniques; software development environment; communication; computer architecture; modeling; real-time programming; radar; memory access with skewed load and store at 3.2 GB/s bandwidth; performance of 400 MFLOPS; runtime environment with custom runtime routines, driver routines, and hardware.]
Snapshot imaging polarimeters using spatial modulation
NASA Astrophysics Data System (ADS)
Luo, Haitao
The recent demonstration of a novel snapshot imaging polarimeter using the fringe modulation technique shows promise for building a compact device free of moving parts. Having been demonstrated only in principle, this technique has not been adequately studied. To advance it, we build a complete theoretical framework that addresses the key issues regarding the polarization aberrations caused by the functional elements. With this model, we have the knowledge necessary to design, analyze, and optimize such systems. We also propose a broader technique that uses arbitrary modulation instead of sinusoidal fringes, which gives more engineering freedom and offers a route to achromatizing the system. On the hardware side, several important advances were made. We extend the polarimeter technique from the visible to the mid-wavelength infrared by using yttrium vanadate crystals. We also incorporate a Savart plate polarimeter into a fundus camera to measure the human eye's retinal retardance, useful information for glaucoma diagnosis. Finally, the world's smallest imaging polarimeter is proposed and demonstrated, which may open many applications in security, remote sensing, and bioscience.
NECAP 4.1: NASA's Energy-Cost Analysis Program input manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.
1982-01-01
The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool which embodies state-of-the-art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating the heat transfer through the building surfaces, which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
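The response factor technique itself reduces to a weighted sum over temperature histories. The sketch below is illustrative only; the factor values are invented, not NECAP's, which are derived from the layered wall construction:

    import numpy as np

    # Hypothetical wall response factors (W/m^2.K); real factors are computed
    # from the wall construction's thermal properties.
    X = np.array([4.2, -3.1, 0.6, -0.1])    # weights on outside temperature history
    Y = np.array([1.8, -1.2, 0.3, -0.05])   # weights on inside temperature history

    T_out = 10 + 8 * np.sin(2 * np.pi * np.arange(48) / 24)  # synthetic sol-air temp, deg C
    T_in = np.full(48, 21.0)                                  # constant room setpoint

    def heat_flux(t):
        # Inward conductive flux at hour t from past surface temperatures.
        return sum(X[j] * T_out[t - j] - Y[j] * T_in[t - j]
                   for j in range(len(X)) if t - j >= 0)

    print([round(heat_flux(t), 1) for t in range(4, 10)])

Because the factors weight past hours, the wall's thermal mass delays and damps the load, which is exactly what steady-state U-value calculations miss.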
Multi-scale modeling in cell biology
Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick
2009-01-01
Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems, such as the scale of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim to explore such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808
Use of Advanced Machine-Learning Techniques for Non-Invasive Monitoring of Hemorrhage
2010-04-01
that state-of-the-art machine learning techniques, when integrated with novel non-invasive monitoring technologies, could detect subtle physiological ... decompensation. Continuous, non-invasively measured hemodynamic signals (e.g., ECG, blood pressures, stroke volume) were used for the development of machine learning algorithms. Accuracy estimates were obtained by building models using 27 subjects and testing on the 28th. This process was repeated 28 times.
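The evaluation protocol described (train on 27 subjects, test on the held-out 28th, repeat for every subject) is leave-one-subject-out cross-validation. A minimal sketch with synthetic stand-in data follows; the real study used hemodynamic features, and the classifier choice here is arbitrary:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneGroupOut

    rng = np.random.default_rng(0)
    X = rng.normal(size=(280, 5))           # placeholder hemodynamic features
    y = rng.integers(0, 2, size=280)        # placeholder decompensation labels
    groups = np.repeat(np.arange(28), 10)   # 28 subjects, 10 windows each

    scores = []
    for train, test in LeaveOneGroupOut().split(X, y, groups):
        model = LogisticRegression().fit(X[train], y[train])
        scores.append(model.score(X[test], y[test]))
    print("mean accuracy over 28 folds:", round(float(np.mean(scores)), 3))

Grouping by subject, rather than by sample, prevents optimistic bias from within-subject correlation.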
Comparative analysis of economic models in selected solar energy computer programs
NASA Astrophysics Data System (ADS)
Powell, J. W.; Barnes, K. A.
1982-01-01
The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the points of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program, developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements, serves as the basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single family residence and a low rise office building.
Leveraging Modeling Approaches: Reaction Networks and Rules
Blinov, Michael L.; Moraru, Ion I.
2012-01-01
We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
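The combinatorial argument is easy to make concrete. For a molecule with n independent modification sites, explicit specification needs 2^n species, while a rule-based description needs only n rules. A schematic sketch, not tied to any particular tool:

    from itertools import product

    n_sites = 4
    # Explicit enumeration: every on/off combination is a distinct species.
    species = [frozenset(i for i, on in enumerate(bits) if on)
               for bits in product([0, 1], repeat=n_sites)]
    print(len(species), "explicit species")   # 2**4 = 16

    # Rule-based view: one modification rule per site, applicable to any
    # species in which that site is still unmodified.
    rules = [lambda s, i=i: s | {i} for i in range(n_sites)]
    print(len(rules), "rules")                # 4

With tens of sites or multivalent binding, the explicit list becomes intractable, which is the gap rule-based tools are designed to close.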
Integration of Infrared Thermography and Photogrammetric Surveying of Built Landscape
NASA Astrophysics Data System (ADS)
Scaioni, M.; Rosina, E.; L'Erario, A.; Dìaz-Vilariño, L.
2017-05-01
The thermal analysis of buildings represents a key step for the reduction of energy consumption, including in the case of Cultural Heritage, where the complexity of the constructions and of the adopted materials may require special analyses and tailored solutions. Infrared Thermography (IRT) is an important non-destructive investigation technique that can aid in the thermal analysis of buildings. The paper reports the application of IRT to a listed Cultural Heritage building and to a residential one, demonstrating that IRT is a suitable and convenient tool for analysing existing buildings. The purposes of the analysis are the assessment of damage and of the energy efficiency of the building envelope. Since the complex geometry of historic constructions may in many cases complicate the thermal analysis, the integration of IRT with accurate 3D models has been developed in recent years. Here the authors propose a solution based on up-to-date photogrammetric techniques for purely image-based 3D modelling, including automatic image orientation/sensor calibration using Structure-from-Motion and dense matching. An almost fully automatic pipeline for the generation of accurate 3D models showing the temperatures on a building skin in a realistic manner is thus described, in which the only manual task is the measurement of a few common points for co-registration of the RGB and IR photogrammetric projects.
3D-model building of the jaw impression
NASA Astrophysics Data System (ADS)
Ahmed, Moumen T.; Yamany, Sameh M.; Hemayed, Elsayed E.; Farag, Aly A.
1997-03-01
A novel approach is proposed to obtain a record of the patient's occlusion using computer vision. Data acquisition is performed using intra-oral video cameras. The technique utilizes shape from shading to extract 3D information from 2D views of the jaw, and a novel technique for 3D data registration using genetic algorithms. The resulting 3D model can be used for diagnosis, treatment planning, and implant purposes. The overall purpose of this research is to develop a model-based vision system for orthodontics to replace traditional approaches. This system will be flexible and accurate, and will reduce the cost of orthodontic treatments.
Global Dynamic Exposure and the OpenBuildingMap
NASA Astrophysics Data System (ADS)
Schorlemmer, D.; Beutin, T.; Hirata, N.; Hao, K. X.; Wyss, M.; Cotton, F.; Prehn, K.
2015-12-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to balance resolution against coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capture, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for us. More than 2.5 billion geographical nodes, more than 150 million building footprints (growing by ~100,000 per day), and a plethora of information about school, hospital, and other critical facility locations allow us to exploit this dataset for risk-related computations. We will harvest this dataset by collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. With this approach, we can increase the resolution of existing exposure models from fragility-class distributions via block-by-block specifications to building-by-building vulnerability. To increase coverage, we will provide a framework for collecting building data by any person or community. We will implement a double crowd-sourced approach to bring together the interest and enthusiasm of communities with the knowledge of earthquake and engineering experts. The first crowd-sourced approach aims at the collection of building properties in a community by local people and activists, supported by tailored building-capture tools for mobile devices enabling simple and fast capture of building properties. The second involves local experts in estimating building vulnerability, who will provide building classification rules that translate building properties into vulnerability and exposure indicators as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM). These indicators will then be combined with a hazard model using the GEM OpenQuake engine to compute a risk model. The free/open framework we will provide can be used on commodity hardware for local to regional exposure capture and for communities to understand their earthquake risk.
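As a flavor of how such harvesting can start, the snippet below is a hedged sketch: the bounding box is arbitrary and the tag set is only a small subset of useful exposure indicators. It pulls building footprints and level counts from OSM through the public Overpass API:

    import requests

    query = """
    [out:json][timeout:60];
    way["building"](52.50,13.36,52.52,13.40);
    out tags center;
    """
    resp = requests.post("https://overpass-api.de/api/interpreter",
                         data={"data": query})
    for way in resp.json().get("elements", [])[:5]:
        tags = way.get("tags", {})
        # Building type and level count are simple exposure indicators.
        print(way["id"], tags.get("building"), tags.get("building:levels"))

Translating such tags into Building Taxonomy 2.0 vulnerability classes is the expert-rule step the abstract describes.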
Additive Manufacturing Techniques for the Reconstruction of 3D Fetal Faces.
Speranza, Domenico; Citro, Daniela; Padula, Francesco; Motyl, Barbara; Marcolin, Federica; Calì, Michele; Martorelli, Massimo
2017-01-01
This paper deals with additive manufacturing techniques for the creation of 3D fetal face models starting from routine 3D ultrasound data. Two distinct themes are addressed. First, a method for processing and building 3D models based on medical image processing techniques is proposed. Second, preliminary results of a questionnaire distributed to expectant parents are presented, considering the use of these reconstructions from both an emotional and an affective point of view. In particular, the study focuses on the enhancement of the perception of maternity or paternity and on the improvement of the relationship between parents and physicians in cases of fetal malformation, in particular facial anomalies or cleft lip.
NASA Astrophysics Data System (ADS)
Setyaningsih, S.
2018-03-01
Lesson Study for Learning Community is a system for building the lecturer profession through collaborative and continuous learning study, based on the principles of openness, collegiality, and mutual learning, with the aim of forming a professional learning community. Achieving this requires a strategy and learning method with a specific implementation technique. This paper describes how the quality of learning in the sciences can be improved by implementing suitable strategies and methods, namely by applying Lesson Study for Learning Community optimally. Initially this research focused on the study of instructional techniques. The learning methods used were the Contextual Teaching and Learning (CTL) model and the Problem Based Learning (PBL) model. The results showed a significant increase in competence, attitudes, and psychomotor skills in the four study programs that were modelled. It can therefore be concluded that implementing learning strategies within Lesson Study for Learning Community is needed to improve the competence, attitudes, and psychomotor skills of science students.
Modeling and Analysis of Commercial Building Electrical Loads for Demand Side Management
NASA Astrophysics Data System (ADS)
Berardino, Jonathan
In recent years there has been a push in the electric power industry for more customer involvement in the electricity markets. Traditionally the end user has played a passive role in the planning and operation of the power grid. However, many energy markets have begun opening up opportunities to consumers who wish to commit a certain amount of their electrical load under various demand side management programs. The potential benefits of more demand participation include reduced operating costs and new revenue opportunities for the consumer, as well as more reliable and secure operations for the utilities. The management of these load resources creates challenges and opportunities for the end user that were not present in previous market structures. This work examines the behavior of commercial-type building electrical loads and their capacity for supporting demand side management actions, and is motivated by the need for accurate and dynamic tools to aid in the advancement of demand side operations. A dynamic load model is proposed for capturing the response of controllable building loads. Building-specific load forecasting techniques are developed, with particular focus on the integration of building management system (BMS) information. These approaches are tested using Drexel University building data. The application of building-specific load forecasts and dynamic load modeling to the optimal scheduling of multi-building systems in the energy market is proposed. Sources of potential load uncertainty are introduced in the proposed energy management problem formulation in order to investigate their impact on the resulting load schedule.
Situational awareness for unmanned ground vehicles in semi-structured environments
NASA Astrophysics Data System (ADS)
Goodsell, Thomas G.; Snorrason, Magnus; Stevens, Mark R.
2002-07-01
Situational Awareness (SA) is a critical component of effective autonomous vehicles, reducing operator workload and allowing an operator to command multiple vehicles or simultaneously perform other tasks. Our Scene Estimation & Situational Awareness Mapping Engine (SESAME) provides SA for mobile robots in semi-structured scenes, such as parking lots and city streets. SESAME autonomously builds volumetric models for scene analysis. For example, a SESAME-equipped robot can build a low-resolution 3-D model of a row of cars, then approach a specific car and build a high-resolution model from a few stereo snapshots. The model can be used onboard to determine the type of car and locate its license plate, or the model can be segmented out and sent back to an operator who can view it from different viewpoints. As new views of the scene are obtained, the model is updated and changes are tracked (such as cars arriving or departing). Since the robot's position must be accurately known, SESAME also has automated techniques for determining the position and orientation of the camera (and hence, robot) with respect to existing maps. This paper presents an overview of the SESAME architecture and algorithms, including our model generation algorithm.
Bruce Bagwell, C
2018-01-01
This chapter outlines how to approach the complex tasks associated with designing models for high-dimensional cytometry data. Unlike gating approaches, modeling lends itself to automation and accounts for measurement overlap among cellular populations. Designing these models is now easier because of a new technique called high-definition t-SNE mapping. Nontrivial examples are provided that serve as a guide to create models that are consistent with data.
Phase Diversity and Polarization Augmented Techniques for Active Imaging
2007-03-01
build up a system model for use in algorithm development. (The remainder of this record consists of back-matter index entries: conventional imaging and atmospheric turbulence, Cholesky factorization, coherent and incoherent image models, the EM algorithm, impulse response, turbulence outer scale.)
Modeling the Environmental Impact of Air Traffic Operations
NASA Technical Reports Server (NTRS)
Chen, Neil
2011-01-01
There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse climate effects. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. They have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions, and contrails. To ensure that these new models are beneficial to the larger climate research community, their outputs are compatible with existing global climate modeling tools such as the FAA's Aviation Environmental Design Tool.
A new technique for the characterization of chaff elements
NASA Astrophysics Data System (ADS)
Scholfield, David; Myat, Maung; Dauby, Jason; Fesler, Jonathon; Bright, Jonathan
2011-07-01
A new technique for the experimental characterization of electromagnetic chaff, based on inverse synthetic aperture radar, is presented. The technique can characterize as few as one filament of chaff in a controlled anechoic environment, providing stability and repeatability of experimental results. It supports a deeper understanding of the fundamental phenomena of electromagnetic scattering from chaff through an incremental analysis approach: chaff analysis can now begin with a single element and progress through the build-up of particles into pseudo-cloud structures. This controlled incremental approach is supported by an identical incremental modeling and validation process. Additionally, the technique has the potential to produce considerable savings in financial and schedule cost, and provides a stable and repeatable experiment to aid model validation.
Yang, Hao; Xu, Xiangyang; Neumann, Ingo
2014-11-19
Terrestrial laser scanning (TLS) technology is a new technique for quickly acquiring three-dimensional information. In this paper we study the health assessment of concrete structures with a Finite Element Method (FEM) model based on TLS. The work focuses on the benefits of 3D TLS in the generation and calibration of FEM models, in order to build a convenient, efficient and intelligent model which can be widely used for the detection and assessment of bridges, buildings, subways and other objects. After comparing the finite element simulation with surface-based measurement data from TLS, the FEM model is determined to be acceptable with an error of less than 5%. The benefit of TLS lies mainly in the possibility of a surface-based validation of the results predicted by the FEM model.
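The acceptance test the abstract mentions can be expressed in a few lines. A sketch with synthetic numbers follows; the 5% threshold is from the paper, but the deflection values are invented:

    import numpy as np

    fem = np.array([2.10, 3.85, 5.02, 6.40])   # FEM-predicted deflections, mm
    tls = np.array([2.18, 3.79, 5.21, 6.55])   # TLS surface-based measurements, mm

    rel_err = np.abs(fem - tls) / np.abs(tls)
    print("max relative error: %.1f%%" % (100 * rel_err.max()))
    assert rel_err.max() < 0.05, "FEM model needs recalibration against the scan"

The surface-based nature of TLS is what makes this check stronger than comparing against a handful of point sensors.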
NASA Astrophysics Data System (ADS)
Yongzhi, WANG; Hui, WANG; Lixia, LIAO; Dongsen, LI
2017-02-01
In order to analyse the geological characteristics of salt rock and the stability of salt caverns, rough three-dimensional (3D) models of the salt rock strata and 3D models of the salt caverns in the study areas are built using 3D GIS spatial modeling techniques. During implementation, multi-source data such as basic geographic data, DEMs, geological plane maps, geological section maps, engineering geological data, and sonar data are used. In this study, 3D spatial analysis and calculation methods, such as 3D GIS intersection detection in three-dimensional space, Boolean operations between three-dimensional entities, and three-dimensional grid discretization, are used to build 3D models of the wall rock of the salt caverns. Our methods can provide effective calculation models for the numerical simulation and analysis of the creep characteristics of the wall rock of salt caverns.
Formal Methods for Automated Diagnosis of Autosub 6000
NASA Technical Reports Server (NTRS)
Ernits, Juhan; Dearden, Richard; Pebody, Miles
2009-01-01
This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
ERIC Educational Resources Information Center
Annulis, Heather M.; Gaudet, Cyndi H.
2007-01-01
A shortage of a qualified and skilled workforce exists to meet the demands of the geospatial industry (NASA, 2002). Solving today's workforce issues requires new and innovative methods and techniques for this high growth, high technology industry. One tool to support workforce development is a competency model which can be used to build a…
Policy Capturing with Local Models: The Application of the AID technique in Modeling Judgment
1972-12-01
or coding phases have upon the derived policy model. Particularly important aspects of these subtasks include: 1) initial identification and coding of... (the remainder of this record is illegible in the source; it appears to concern the model building phase and the cross-validation population).
NASA Astrophysics Data System (ADS)
Grazzini, A.; Lacidogna, G.; Valente, S.; Accornero, F.
2018-06-01
Masonry walls of historical buildings are subject to rising damp effects due to capillary action or rain infiltration, which over time produce decay and delamination of historical plasters. In the restoration of masonry buildings, plaster detachment frequently occurs because of mechanical incompatibility of the repair mortar. An innovative laboratory procedure for testing the mechanical adhesion of new repair mortars is described. Compression static tests were carried out on composite stone block-repair mortar specimens, whose specific geometry allows testing the de-bonding process of mortar in adherence with a stone masonry structure. The acoustic emission (AE) technique was employed to estimate the amount of energy released by fracture propagation in the adherence surface between mortar and stone. A numerical simulation based on the cohesive crack model was developed. The evolution of the detachment process of mortar in a coupled stone brick-mortar system was analysed by triangulation of AE signals, which can improve the numerical model and predict the type of failure in the adhesion surface of the repair plaster. Through the cohesive crack model, it was possible to interpret theoretically the de-bonding phenomena occurring at the interface between stone block and mortar, and the mechanical behaviour of the interface is thereby characterized.
NASA Astrophysics Data System (ADS)
Xu, Pengpeng
Hotel buildings are among the most energy-intensive building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions, contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism was introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. Eleven potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs: six KPIs were identified from the 11 selected performance indicators. Through the questionnaire survey, 21 of the 28 success factors were identified as Critical Success Factors (CSFs). Using the factor analysis technique, the 21 identified CSFs were grouped into six clusters to help explain the project success of sustainable BEER. Finally, an AHP/ANP approach was used to develop a model examining the interrelationships among the identified CSFs, KPIs, and the sustainability dimensions of BEER. The findings indicate that the success of sustainable BEER in hotel buildings under the EPC mechanism is mainly decided by the project objectives control mechanism, available technology, the organizing capacity of the team leader, trust among partners, accurate M&V, and team workers' technical skills.
NASA Astrophysics Data System (ADS)
Osman, Ayat E.
Energy use in commercial buildings constitutes a major proportion of the energy consumption and anthropogenic emissions in the USA. Cogeneration systems offer an opportunity to meet a building's electrical and thermal demands from a single energy source. To answer the question of which energy source(s) can meet the energy demands of a building most beneficially and cost-effectively, optimization techniques have been applied in several studies to find the optimum energy system based on reducing costs and maximizing revenues. Given the significant environmental impacts that can result from meeting the energy demands of buildings, building design should incorporate environmental criteria into the decision-making process. The objective of this research is to develop a framework and model to optimize a building's operation by integrating cogeneration systems and utility systems in order to meet the electrical, heating, and cooling demands, considering the potential life cycle environmental impacts of meeting those demands as well as the economic implications. Two LCA optimization models have been developed within a framework that uses hourly building energy data, life cycle assessment (LCA), and mixed-integer linear programming (MILP). The objective functions used in the formulation of the problems include: (1) minimizing life cycle primary energy consumption, (2) minimizing global warming potential, (3) minimizing tropospheric ozone precursor potential, (4) minimizing acidification potential, (5) minimizing NOx, SO2 and CO2, and (6) minimizing life cycle costs, considering a study period of ten years and the lifetime of the equipment. The two LCA optimization models can be used for: (a) long-term planning and operational analysis in buildings, by analyzing the hourly energy use of a building during a day, and (b) design and quick analysis of building operation, based on periodic analysis of the energy use of a building over a year. A Pareto-optimal frontier is also derived, which defines the minimum cost required to achieve any level of environmental emissions or primary energy use, or, inversely, the minimum environmental indicator and primary energy value that can be achieved and the cost required to achieve it.
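The hourly dispatch core of such a model can be sketched as a small mixed-integer program. The toy below uses invented coefficients and omits the life-cycle emission terms of the real models; it uses PuLP to meet electric and heat demand from a CHP unit, the grid, and a boiler at minimum cost:

    import pulp

    hours = range(4)
    elec_demand = [120, 150, 180, 140]   # kWh
    heat_demand = [80, 90, 100, 85]      # kWh(th)

    prob = pulp.LpProblem("dispatch", pulp.LpMinimize)
    chp = [pulp.LpVariable(f"chp_{t}", 0, 150) for t in hours]
    grid = [pulp.LpVariable(f"grid_{t}", 0) for t in hours]
    boiler = [pulp.LpVariable(f"boiler_{t}", 0) for t in hours]
    on = [pulp.LpVariable(f"on_{t}", cat="Binary") for t in hours]

    # Hypothetical unit costs ($/kWh); an LCA variant swaps in emission factors.
    prob += pulp.lpSum(0.09 * chp[t] + 0.15 * grid[t] + 0.06 * boiler[t]
                       for t in hours)
    for t in hours:
        prob += chp[t] + grid[t] >= elec_demand[t]
        prob += 1.2 * chp[t] + boiler[t] >= heat_demand[t]  # heat-to-power 1.2
        prob += chp[t] <= 150 * on[t]    # CHP produces only when committed
        prob += chp[t] >= 30 * on[t]     # minimum stable load when running

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print("CHP schedule:", [v.value() for v in chp])

Swapping the cost coefficients for emission factors gives each of the environmental objectives, and tracing the cost-versus-emissions trade-off yields the Pareto frontier described.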
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding have caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics to classify, model, and estimate the likelihood of extreme events in the eastern United States, particularly on the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
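A minimal functional-PCA sketch shows the mechanics, with synthetic hydrographs standing in for the Potomac record: each year's curve is one observation, and the leading components capture the dominant modes of yearly variation:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    days = np.arange(365)
    base = 100 + 60 * np.exp(-((days - 90) ** 2) / 800)    # spring-peak shape
    years = np.array([base * rng.uniform(0.7, 1.3) + rng.normal(0, 5, 365)
                      for _ in range(30)])                  # 30 synthetic years

    pca = PCA(n_components=3).fit(years)
    print("variance explained:", pca.explained_variance_ratio_.round(2))

    # Years scoring at the extremes of the first component flag unusual flow.
    scores = pca.transform(years)[:, 0]
    print("most extreme year:", int(np.argmax(np.abs(scores))))

In practice the curves are first smoothed onto a basis (splines or Fourier), which is where the functional machinery proper comes in.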
Sha, Chenyuan; Wang, Xuemei; Lin, Yuanyuan; Fan, Yifan; Chen, Xi; Hang, Jian
2018-08-15
Sustainable urban design is an effective way to improve urban ventilation and reduce vehicular pollutant exposure to urban residents. This paper investigated the impacts of urban open space and 'lift-up' building design on vehicular CO (carbon monoxide) exposure in typical three-dimensional (3D) urban canopy layer (UCL) models under neutral atmospheric conditions. The building intake fraction (IF) represents the fraction of total vehicular pollutant emissions inhaled by residents when they stay at home. The building daily CO exposure (E t ) means the extent of human beings' contact with CO within one day indoor at home. Computational fluid dynamics (CFD) simulations integrating with these two concepts were performed to solve turbulent flow and assess vehicular CO exposure to urban residents. CFD technique with the standard k-ε model was successfully validated by wind tunnel data. The initial numerical UCL model consists of 5-row and 5-column (5×5) cubic buildings (building height H=street width W=30m) with four approaching wind directions (θ=0°, 15°, 30°, 45°). In Group I, one of the 25 building models is removed to attain urban open space settings. In Group II, the first floor (Lift-up1), or second floor (Lift-up2), or third floor (Lift-up3) of all buildings is elevated respectively to create wind pathways through buildings. Compared to the initial case, urban open space can slightly or significantly reduce pollutant exposure for urban residents. As θ=30° and 45°, open space settings are more effective to reduce pollutant exposure than θ=0° and 15°.The pollutant dilution near or surrounding open space and in its adjacent downstream regions is usually enhanced. Lift-up1 and Lift-up2 experience much greater pollutant exposure reduction in all wind directions than Lift-up3 and open space. Although further investigations are still required to provide practical guidelines, this study is one of the first attempts for reducing urban pollutant exposure by improving urban design. Copyright © 2018. Published by Elsevier B.V.
Interior thermal insulation systems for historical building envelopes
NASA Astrophysics Data System (ADS)
Jerman, Miloš; Solař, Miloš; Černý, Robert
2017-11-01
The design specifics of interior thermal insulation systems applied to historical building envelopes are described. Vapor-tight systems and systems based on capillary thermal insulation materials are considered as the two basic options, differing in building-physical considerations. The possibilities for hygrothermal analysis of renovated historical envelopes, including laboratory methods, computer simulation techniques, and in-situ tests, are discussed. It is concluded that the application of computational models for the hygrothermal assessment of interior thermal insulation systems should always be performed with particular care. On the one hand, they present a very effective tool for both service life assessment and the planning of subsequent reconstructions. On the other, the hygrothermal analysis of any historical building can involve quite a few potential uncertainties which may negatively affect the accuracy of the obtained results.
Osman, Reham B; Alharbi, Nawal; Wismeijer, Daniel
The aim of this study was to evaluate the effect of the build orientation/build angle on the dimensional accuracy of full-coverage dental restorations manufactured using digital light-processing technology (DLP-AM). A full dental crown was digitally designed and 3D-printed using DLP-AM. Nine build angles were used: 90, 120, 135, 150, 180, 210, 225, 240, and 270 degrees. The specimens were digitally scanned using a high-resolution optical surface scanner (IScan D104i, Imetric). Dimensional accuracy was evaluated using the digital subtraction technique. The 3D digital files of the scanned printed crowns (test model) were exported in standard tessellation language (STL) format and superimposed on the STL file of the designed crown [reference model] using Geomagic Studio 2014 (3D Systems). The root mean square estimate (RMSE) values were evaluated, and the deviation patterns on the color maps were further assessed. The build angle influenced the dimensional accuracy of 3D-printed restorations. The lowest RMSE was recorded for the 135-degree and 210-degree build angles. However, the overall deviation pattern on the color map was more favorable with the 135-degree build angle in contrast with the 210-degree build angle where the deviation was observed around the critical marginal area. Within the limitations of this study, the recommended build angle using the current DLP system was 135 degrees. Among the selected build angles, it offers the highest dimensional accuracy and the most favorable deviation pattern. It also offers a self-supporting crown geometry throughout the building process.
Wastewater quality monitoring system using sensor fusion and machine learning techniques.
Qin, Xusong; Gao, Furong; Chen, Guohua
2012-03-15
A multi-sensor water quality monitoring system incorporating a UV/Vis spectrometer and a turbidimeter was used to monitor the Chemical Oxygen Demand (COD), Total Suspended Solids (TSS) and Oil & Grease (O&G) concentrations of the effluents from the Chinese restaurant on campus and an electrocoagulation-electroflotation (EC-EF) pilot plant. In order to handle the noise and information imbalance in the fused UV/Vis spectra and turbidity measurements during calibration model building, an improved boosting method, Boosting-Iterative Predictor Weighting-Partial Least Squares (Boosting-IPW-PLS), was developed in the present study. The Boosting-IPW-PLS method incorporates IPW into the boosting scheme to suppress quality-irrelevant variables by assigning them small weights, and builds models for wastewater quality prediction based on the weighted variables. The monitoring system was tested in the field with satisfactory results, underlining the potential of this technique for the online monitoring of water quality. Copyright © 2011 Elsevier Ltd. All rights reserved.
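The iterative-predictor-weighting core can be sketched compactly. This is a reduced illustration, not the published algorithm: the boosting loop over residuals and the sensor fusion are omitted, and the data are synthetic:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(60, 20))                          # stand-in fused spectra
    y = 2.0 * X[:, 3] - X[:, 7] + rng.normal(0, 0.1, 60)   # only 2 relevant variables

    w = np.ones(X.shape[1])
    for _ in range(5):                             # IPW iterations
        pls = PLSRegression(n_components=2).fit(X * w, y)
        coef = np.abs(pls.coef_).ravel()
        w = coef / coef.max()                      # irrelevant variables shrink
    print("dominant variables:", np.argsort(w)[-2:])

Down-weighting quality-irrelevant wavelengths this way is what lets the calibration survive the noise imbalance between the fused sensor streams.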
An integrated fuzzy approach for strategic alliance partner selection in third-party logistics.
Erkayman, Burak; Gundogar, Emin; Yilmaz, Aysegul
2012-01-01
Outsourcing some logistics activities has become a useful strategy for companies in recent years. It allows firms to concentrate on their main issues and processes, and offers opportunities to improve logistics performance, reduce costs, and improve quality. Provider selection and evaluation in third-party logistics have therefore become important activities for companies. Making such a strategic decision is both difficult and crucial. In this study we propose a fuzzy multicriteria decision making (MCDM) approach to effectively select the most appropriate provider. First we identify the provider selection criteria and build the hierarchical structure of the decision model. After building the hierarchical structure, we determine the selection criteria weights using the fuzzy analytic hierarchy process (AHP) technique. Then we apply the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) to obtain final rankings for the providers. Finally, an illustrative example is given to demonstrate the effectiveness of the proposed model.
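The closing TOPSIS step is simple enough to show directly. Below is a crisp (non-fuzzy) sketch with an invented decision matrix; in the paper the weights come from fuzzy AHP and the ratings are fuzzy numbers:

    import numpy as np

    D = np.array([[7., 5., 8.],      # provider A rated on 3 criteria
                  [6., 8., 6.],      # provider B
                  [8., 6., 7.]])     # provider C
    w = np.array([0.5, 0.3, 0.2])    # criteria weights (from AHP in the study)

    V = (D / np.linalg.norm(D, axis=0)) * w       # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)    # all criteria treated as benefits
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    print("ranking, best first:", np.argsort(-closeness))

The provider with the greatest relative closeness to the ideal solution is recommended.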
Suchting, Robert; Gowin, Joshua L; Green, Charles E; Walss-Bass, Consuelo; Lane, Scott D
2018-01-01
Rationale: Given datasets with a large or diverse set of predictors of aggression, machine learning (ML) provides efficient tools for identifying the most salient variables and building a parsimonious statistical model. ML techniques permit efficient exploration of data, have not been widely used in aggression research, and may have utility for those seeking prediction of aggressive behavior. Objectives: The present study examined predictors of aggression and constructed an optimized model using ML techniques. Predictors were derived from a dataset that included demographic, psychometric and genetic predictors, specifically FK506 binding protein 5 (FKBP5) polymorphisms, which have been shown to alter response to threatening stimuli, but have not been tested as predictors of aggressive behavior in adults. Methods: The data analysis approach utilized component-wise gradient boosting and model reduction via backward elimination to: (a) select variables from an initial set of 20 to build a model of trait aggression; and then (b) reduce that model to maximize parsimony and generalizability. Results: From a dataset of N = 47 participants, component-wise gradient boosting selected 8 of 20 possible predictors to model Buss-Perry Aggression Questionnaire (BPAQ) total score, with R² = 0.66. This model was simplified using backward elimination, retaining six predictors: smoking status, psychopathy (interpersonal manipulation and callous affect), childhood trauma (physical abuse and neglect), and the FKBP5_13 gene (rs1360780). The six-factor model approximated the initial eight-factor model at 99.4% of the R². Conclusions: Using an inductive data science approach, the gradient boosting model identified predictors consistent with previous experimental work in aggression, specifically psychopathy and trauma exposure. Additionally, allelic variants in FKBP5 were identified for the first time, but the relatively small sample size limits the generality of the results and calls for replication. This approach provides utility for the prediction of aggressive behavior, particularly in the context of large multivariate datasets.
UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg
2013-07-01
It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution, and several researchers are making headway on this problem. However, the inability to easily determine the magnitude of the building's effective thermal mass, and how the heating, ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
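The analogy to structural modal analysis can be made concrete on a lumped thermal network: the eigenvalues of C⁻¹K give the time constants of the building's energy-transfer modes. A sketch with invented capacitances and conductances:

    import numpy as np

    # Hypothetical 3-node network: room air, structural mass, furnishings.
    C = np.diag([5e6, 2e7, 1e6])           # thermal capacitances, J/K
    K = np.array([[-3.0e3,  2.0e3,  1.0e3],  # conductance couplings, W/K
                  [ 2.0e3, -2.5e3,  0.0],
                  [ 1.0e3,  0.0,  -1.2e3]])

    evals, modes = np.linalg.eig(np.linalg.inv(C) @ K)
    tau_hours = np.sort(-1.0 / evals / 3600.0)
    print("modal time constants (h):", tau_hours.round(1))

Slow modes dominated by the structural mass are the ones exploitable for grid-scale storage; fast modes set the comfort constraints.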
NASA Astrophysics Data System (ADS)
Bitelli, G.; Dellapasqua, M.; Girelli, V. A.; Sanchini, E.; Tini, M. A.
2017-05-01
Modern geomatics techniques, such as terrestrial laser scanning (TLS) and multi-view Structure from Motion (SfM), are gaining more and more interest in the Cultural Heritage field. All the data acquired with these technologies can be stored and managed, together with other information, in a Historical Building Information Model (HBIM). This paper presents the case study of the San Michele in Acerboli church, located in Santarcangelo di Romagna, Italy. The church, dated to about the 6th century A.D., is a highly relevant Romanic building of the early Medieval period. The building presents an irregular square plan, with lateral brick walls of different lengths and a consequently oblique wall in correspondence of the apse. Nevertheless, the different lengths of the lateral brick walls are balanced by the irregular spacing between the windows. Various changes occurred over the centuries, such as the closing of the seven main doors and the construction, in the 11th century A.D., of the bell tower, which is nowadays the main entrance of the church. An integrated survey covering the exterior and the interior was carried out. The final 3D model represents a valid support not only for documentation but also for maintaining and managing, in an integrated approach, the available knowledge of this Cultural Heritage site, developing an HBIM system in which all the mentioned historical, geometrical and material matters are collected.
Shin, Yoonseok
2015-01-01
Among the recent data mining techniques available, the boosting approach has attracted a great deal of attention because of its effective learning algorithm and strong boundaries in terms of its generalization performance. However, the boosting approach has yet to be used in regression problems within the construction domain, including cost estimation, although it has been actively utilized in other domains. Therefore, a boosting regression tree (BRT) is applied to cost estimation at the early stage of a construction project to examine the applicability of the boosting approach to a regression problem within the construction domain. To evaluate the performance of the BRT model, it was compared with a neural network (NN) model, which has been proven to perform well in cost estimation domains. The BRT model showed results similar to those of the NN model on 234 actual cost datasets from a building construction project. In addition, the BRT model can provide additional information, such as the importance plot and structure model, which can support estimators in comprehending the decision-making process. Consequently, the boosting approach has potential applicability to preliminary cost estimation in building construction projects.
Numerical Modelling of Connections Between Stones in Foundations of Historical Buildings
NASA Astrophysics Data System (ADS)
Przewlocki, Jaroslaw; Zielinska, Monika; Grebowski, Karol
2017-12-01
The aim of this paper is to analyse the behaviour of old building foundations composed of stones (the main load-bearing elements) and mortar, based on numerical analysis. Some basic aspects of historical foundations are briefly discussed, with an emphasis on their development, techniques, and materials. The behaviour of a foundation subjected to the loads transmitted from the upper parts of the structure is described using the finite element method (FEM). The main problems in analysing the foundations of historical buildings are determining the characteristics of the materials and the degree of degradation of the mortar, which is the weakest part of the foundation. Mortar is modelled using the damaged-plastic model, in which exceeding the bearing capacity occurs due to the degradation of materials. The damaged-plastic model is the most accurate model describing the behaviour and properties of mortar because it shows exactly what happens to this material throughout its full load history. For a uniformly loaded fragment of the foundation, both stresses and strains were analysed. The results of the analysis presented in this paper contribute to further research in the field of understanding both behaviour and modelling in historical buildings' foundations.
NASA Astrophysics Data System (ADS)
Fais, Silvana; Casula, Giuseppe; Cuccuru, Francesco; Ligas, Paola; Bianchi, Maria Giovanna; Marraccini, Alessandro
2017-04-01
The need to integrate different non-invasive geophysical datasets for an effective diagnostic process of the stone materials of cultural heritage buildings arises from the complexity of the intrinsic characteristics of the different types of stones and of their degradation processes. Consequently, integration of different geophysical techniques is required for the characterization of stone building materials. In order to perform the diagnostic process with different non-invasive techniques, and thus interpret the different geophysical parameters in a realistic way, it is necessary to link the petrophysical characteristics of the stones with the geophysical ones. In this study the complementary application of three different non-invasive techniques (terrestrial laser scanning (TLS), infrared thermography, and ultrasonic surface and tomography measurements) was carried out to analyse the conservation state and quality of the carbonate building materials of three inner columns of the old precious church of San Lorenzo in the historical city center of Cagliari (Sardinia). In previous works (Casula et al., 2009; Fais et al., 2015), the integrated application of TLS and ultrasonic techniques in particular was demonstrated to be a powerful tool for evaluating the quality of stone building materials by solving or limiting the uncertainties typical of all indirect methods. Thanks to the TLS technique it was possible to build 3D models of the investigated columns and of their surface geometrical anomalies. The TLS measurements were complemented by several ultrasonic in situ and laboratory tests in the 24-54 kHz range. The ultrasonic parameters, especially longitudinal and transversal velocities, allow information related to the mechanical properties of the materials to be recovered. A good correlation between the TLS surface geometrical anomalies and the ultrasonic velocity anomalies is evident at the surface and in the shallow parts of the investigated architectural elements. To calibrate the geophysical results and provide reliable data for the interpretation, the petrophysical properties (porosity, density, water absorption) and petrographical characteristics (especially texture) of the carbonate building materials under study were examined. By combining petrographical, petrophysical, terrestrial laser scanner and ultrasonic techniques, a consistent diagnostic process of the carbonate building materials can be achieved to detect the presence of defects, fissures, fractures, weathering processes or compositional variations. This diagnostic process is also very useful for evaluating the behavior of the carbonate building materials, facilitating the planning of urgent and long-term conservation programs and in-time monitoring. References: Casula G, Fais S, Ligas P (2009) Experimental application of 3-D laser scanning and acoustic techniques in assessing the quality of stones used in monumental structures. Int J Microstruct Mater Prop 4:45-56. doi: 10.1504/IJMMP.2009.028432. Fais S, Cuccuru F, Ligas P, Casula G, Bianchi MG (2015) Integrated ultrasonic, laser scanning and petrographical characterisation of carbonate building materials on an architectural structure of a historic building. Bull Eng Geol Environ. doi: 10.1007/s10064-015-0815-9. Acknowledgements: This work was supported by Regione Autonoma della Sardegna (RAS), Regional Law 7th August 2007, n. 7. The authors would also like to thank the Archidiocesi di Cagliari and Mons. Mario Ledda for their kind permission to work on the San Lorenzo Church.
NASA Astrophysics Data System (ADS)
Desconnets, Jean-Christophe; Giuliani, Gregory; Guigoz, Yaniss; Lacroix, Pierre; Mlisa, Andiswa; Noort, Mark; Ray, Nicolas; Searby, Nancy D.
2017-02-01
The discovery of and access to capacity building resources are often essential for conducting environmental projects based on Earth Observation (EO) resources, whether these are EO products, methodological tools, techniques, organizations that impart training in these techniques, or even projects that have shown practical achievements. Recognizing this opportunity and need, the European Commission, through two FP7 projects, jointly with the Group on Earth Observations (GEO), teamed up with the Committee on Earth Observation Satellites (CEOS). The Global Earth Observation CApacity Building (GEOCAB) portal aims at compiling all current capacity building efforts on the use of EO data for societal benefits into an easily updateable and user-friendly portal. GEOCAB offers a faceted search to improve the user discovery experience, with a fully interactive world map of all inventoried projects and activities. This paper focuses on the conceptual framework used to implement the underlying platform. An ISO 19115 metadata model associated with a terminological repository forms the core elements that provide a semantic search application and an interoperable discovery service. The organization and contribution of the different user communities that ensure the management and updating of GEOCAB's content are also addressed.
IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa
One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than with building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants' activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the next three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-saving measures, policies, and techniques.
Quality Analysis on 3D Building Models Reconstructed from UAV Imagery
NASA Astrophysics Data System (ADS)
Jarzabek-Rychard, M.; Karpina, M.
2016-06-01
Recent developments in UAV technology and structure-from-motion techniques have led to UAVs becoming standard platforms for 3D data collection. Because of their flexibility and ability to reach inaccessible urban areas, drones appear to be an optimal solution for urban applications. Building reconstruction from data collected with UAVs has important potential to reduce labour costs for the fast updating of already reconstructed 3D cities. However, especially for updating existing scenes derived from different sensors (e.g. airborne laser scanning), a proper quality assessment is necessary. The objective of this paper is thus to evaluate the potential of UAV imagery as an information source for automatic 3D building modeling at LOD2. The investigation is conducted in three ways: (1) comparing the generated SfM point cloud to ALS data; (2) computing internal consistency measures of the reconstruction process; (3) analysing the deviation of check points identified on building roofs and measured with a tacheometer. In order to gain deep insight into the modeling performance, various quality indicators are computed and analysed. The assessment performed against the ground truth shows that the building models acquired with UAV photogrammetry have an accuracy of less than 18 cm in planimetric position and about 15 cm in the height component.
NASA Astrophysics Data System (ADS)
Lin, D.; Jarzabek-Rychard, M.; Schneider, D.; Maas, H.-G.
2018-05-01
An automatic building façade thermal texture mapping approach using uncooled thermal camera data is proposed in this paper. First, a shutter-less radiometric thermal camera calibration method is implemented to remove the large offset deviations caused by the changing ambient environment. Then, a 3D façade model is generated from an RGB image sequence using structure-from-motion (SfM) techniques. Subsequently, for each triangle in the 3D model, the optimal texture is selected by taking into consideration local image scale, object incident angle, image viewing angle, and occlusions. Afterwards, the selected textures can be further corrected using thermal radiant characteristics. Finally, a Gauss filter outperforms the voted-texture strategy in smoothing seams, which helps, for instance, to reduce the false alarm rate in façade thermal leakage detection. Our approach is evaluated on a building row façade located in Dresden, Germany.
Computer-aided design of biological circuits using TinkerCell
Bergmann, Frank T; Sauro, Herbert M
2010-01-01
Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. PMID:21327060
Three-Dimensional Integrated Survey for Building Investigations.
Costantino, Domenica; Angelini, Maria Giuseppa
2015-11-01
The study shows the results of a survey aimed at representing a building collapse and the feasibility of the resulting model as a support for structural analysis. An integrated survey using topographic, photogrammetric, and terrestrial laser techniques was carried out to obtain a three-dimensional (3D) model of the building, plans and elevations, and the particulars of the collapsed area. The authors acquired information about the regular parts of the structure by photogrammetric survey, while using laser scanner data they reconstructed a set of the more interesting architectural details and areas with higher surface curvature. Specifically, the texturing process provided a detailed 3D structure of the areas under investigation. The analysis of the acquired data proved very useful both in identifying the causes of the disaster and in helping the reconstruction of the collapsed corner, showing the contribution that integrated surveys can give to preserving architectural and historic heritage. © 2015 American Academy of Forensic Sciences.
A Deformable Generic 3D Model of Haptoral Anchor of Monogenean
Teo, Bee Guan; Dhillon, Sarinder Kaur; Lim, Lee Hong Susan
2013-01-01
In this paper, a digital 3D model which allows for visualisation in three dimensions and interactive manipulation is explored as a tool to help us understand the structural morphology and elucidate the functions of morphological structures of fragile microorganisms which defy live studies. We developed a deformable generic 3D model of the haptoral anchor of dactylogyridean monogeneans that can subsequently be deformed into different desired anchor shapes by using a direct manipulation deformation technique. We used point primitives to construct the rectangular building blocks to develop our deformable 3D model. Point primitives are manually marked on a 2D illustration of an anchor on Cartesian graph paper, and a set of Cartesian coordinates for each point primitive is manually extracted from the graph paper. A Python script is then written in Blender to construct 3D rectangular building blocks based on the Cartesian coordinates. The rectangular building blocks are stacked on top of or beside each other following the Cartesian coordinates of their respective point primitives. More point primitives are added at the sites in the 3D model where more structural variations are likely to occur, in order to generate complex anchor structures. We used the Catmull-Clark subdivision surface modifier to smooth the surfaces and edges of the generic 3D model, to obtain a smoother and more natural 3D shape, and an antialiasing option to reduce the jagged edges of the 3D model. This deformable generic 3D model can be deformed into different desired 3D anchor shapes through the direct manipulation deformation technique by aligning the vertices (pilot points) of the newly developed deformable generic 3D model onto the 2D illustrations of the desired shapes and moving the vertices until the desired 3D shapes are formed. In this generic 3D model, all the vertices present are deployed for displacement during deformation. PMID:24204903
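Since the abstract describes marking point primitives and stacking cube-like building blocks in Blender via a Python script, a minimal sketch of that construction step may help; the coordinate list and the specific bpy calls are illustrative assumptions, not the authors' script.

```python
# A minimal Blender (bpy) sketch of the construction step described above:
# place a unit-cube "building block" at each manually extracted
# point-primitive coordinate. Run inside Blender's Python console, since
# bpy is only available there. Coordinates are hypothetical.
import bpy

point_primitives = [(0, 0, 0), (1, 0, 0), (1, 0, 1), (2, 0, 1)]  # hypothetical coords

for x, y, z in point_primitives:
    bpy.ops.mesh.primitive_cube_add(size=1.0, location=(x, y, z))

# A Catmull-Clark subdivision surface can then be added as a modifier:
obj = bpy.context.active_object
obj.modifiers.new(name="Subdivision", type='SUBSURF')
```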
Energy and life-cycle cost analysis of a six-story office building
NASA Astrophysics Data System (ADS)
Turiel, I.
1981-10-01
An energy analysis computer program, DOE-2, was used to compute annual energy use for a typical office building as originally designed and with several energy conserving design modifications. The largest energy use reductions were obtained with the incorporation of daylighting techniques, the use of double pane windows, night temperature setback, and the reduction of artificial lighting levels. A life-cycle cost model was developed to assess the cost-effectiveness of the design modifications discussed. The model incorporates such features as inclusion of taxes, depreciation, and financing of conservation investments. The energy conserving strategies are ranked according to economic criteria such as net present benefit, discounted payback period, and benefit to cost ratio.
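The life-cycle cost metrics named above (net present benefit, discounted payback period, benefit-to-cost ratio) are easy to make concrete. Below is a minimal sketch with hypothetical numbers; the study's actual cost model also handled taxes, depreciation, and financing, which are omitted here.

```python
# Illustrative life-cycle cost comparison of an energy-conserving design
# option. All numbers are hypothetical.

def npv(annual_saving, rate, years):
    """Present value of a constant annual energy-cost saving."""
    return sum(annual_saving / (1.0 + rate) ** t for t in range(1, years + 1))

def discounted_payback(investment, annual_saving, rate, max_years=50):
    """First year in which cumulative discounted savings cover the investment."""
    cumulative = 0.0
    for t in range(1, max_years + 1):
        cumulative += annual_saving / (1.0 + rate) ** t
        if cumulative >= investment:
            return t
    return None  # never pays back within the horizon

investment = 12_000.0   # hypothetical cost of double-pane windows
saving = 1_500.0        # hypothetical annual energy-cost saving
rate, life = 0.05, 25

benefit = npv(saving, rate, life)
print("net present benefit :", round(benefit - investment, 2))
print("benefit/cost ratio  :", round(benefit / investment, 2))
print("discounted payback  :", discounted_payback(investment, saving, rate), "years")
```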
Systems analysis techniques for annual cycle thermal energy storage solar systems
NASA Astrophysics Data System (ADS)
Baylin, F.
1980-07-01
Community-scale annual cycle thermal energy storage (ACTES) solar systems are options for building heating and cooling. A variety of approaches are feasible for modeling ACTES solar systems. The key parameter in such efforts, average collector efficiency, is examined first, followed by several approaches to simple and effective modeling. Methods are also examined for modeling building loads for structures based on both conventional and passive architectural designs. Two simulation models for sizing solar heating systems with annual storage are presented. Validation is provided by comparison with the results of a study of seasonal storage systems based on SOLANSIM, an hour-by-hour simulation. These models are presently used to examine the economic trade-off between collector field area and storage capacity. Programs directed toward developing other system components, such as improved tanks and solar ponds, or design tools for ACTES solar systems are also examined.
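To make the collector-area versus storage-capacity trade-off concrete, here is a minimal monthly energy-balance sketch using a constant average collector efficiency, as the abstract suggests; all figures (insolation, loads, efficiency) are hypothetical and far simpler than an hour-by-hour simulation such as SOLANSIM.

```python
# Minimal monthly energy balance for a solar system with annual (seasonal)
# thermal storage. Losses are ignored; inputs are hypothetical.

insolation = [40, 55, 90, 120, 150, 165, 170, 150, 110, 75, 45, 35]  # kWh/m^2/month
load       = [95, 85, 70, 45, 20, 5, 2, 3, 15, 45, 75, 95]           # MWh/month

def solar_fraction(area_m2, storage_mwh, eta=0.4):
    """Fraction of the annual load met for a given collector area and store size."""
    store, solar_served = 0.0, 0.0
    for i_month, q_month in zip(insolation, load):
        store = min(store + eta * area_m2 * i_month / 1000.0, storage_mwh)  # charge, MWh
        served = min(store, q_month)
        store -= served
        solar_served += served
    return solar_served / sum(load)

for area in (500, 1000, 2000):
    print(area, "m^2 ->", round(solar_fraction(area, storage_mwh=300), 2))
```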
Process Mining Online Assessment Data
ERIC Educational Resources Information Center
Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul
2009-01-01
Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…
A Grade 6 Project in the Social Studies: The Wall of Old Jerusalem.
ERIC Educational Resources Information Center
Ediger, Marlow
1993-01-01
Presents a classroom lesson based on the walls of old Jerusalem. Maintains that cooperative-learning techniques used to build a model of the wall helped students understand the meaning of the original wall and the division of modern-day Jerusalem. (CFR)
Modeling software systems by domains
NASA Technical Reports Server (NTRS)
Dippolito, Richard; Lee, Kenneth
1992-01-01
The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.
Energy performance of building fabric - Comparing two types of vernacular residential houses
NASA Astrophysics Data System (ADS)
Draganova, Vanya Y.; Matsumoto, Hiroshi; Tsuzuki, Kazuyo
2017-10-01
Notwithstanding apparent differences, Japanese and Bulgarian traditional residential houses share many common features: building materials, building techniques, even layout design. Despite the similarities, these two types of houses have not been compared so far; this study initiates such a comparison. The focus is on houses in areas with similar climate in both countries. Current legislation requirements are compared, as well as the criteria for the thermal comfort of occupants. High energy performance results from a dynamic system of four main factors: thermal comfort range, heating/cooling source, building envelope, and climatic conditions. A change in any single one of them can affect the final energy performance; however, a combination of changes in more than one factor usually occurs. The aim of this study is to evaluate the correlation between the thermal performance of a building envelope designed under current regulations and a traditional one, bearing in mind the different thermal comfort ranges in the two countries. A sample building model is calculated in Scenario 1, Japanese traditional building fabric; Scenario 2, Bulgarian traditional building fabric; and Scenario 3, meeting the requirements of the more demanding current regulations. The energy modelling is conducted using EnergyPlus through the OpenStudio cross-platform suite of software tools; the 3D geometry for the simulation is created using the OpenStudio SketchUp Plug-in. Equal numbers of inhabitants, equal electricity consumption, and natural ventilation are assumed. The results show that overall low energy consumption can be achieved using traditional building fabric as well, when paired with a wider thermal comfort range. Under these conditions, traditional building design is still viable today. This knowledge can re-establish the use of traditional building fabric in contemporary design and stimulate the preservation of local culture, building traditions, and community identity.
Software Tools For Building Decision-support Models For Flood Emergency Situations
NASA Astrophysics Data System (ADS)
Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.
The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins, Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.
Intelligent demand side management of residential building energy systems
NASA Astrophysics Data System (ADS)
Sinha, Maruti N.
The advent of modern sensing technologies and data processing capabilities, together with the rising cost of energy, is driving the implementation of intelligent systems in buildings and houses, which constitute 41% of total energy consumption. The primary motivation has been to provide a framework for demand-side management and to improve overall reliability. The entire formulation is to be implemented on NILM (Non-Intrusive Load Monitoring System), a smart meter, which is going to play a vital role in the future of demand-side management. Utilities have started deploying smart meters throughout the world, which will essentially help to establish communication between utility and consumers. This research is focused on investigating a suitable thermal model of a residential house, building up a control system, and developing diagnostic and energy usage forecast tools. The present work pursues a measurement-based approach. Identification of building thermal parameters is the very first step towards developing performance measurement and controls. The proposed identification technique is a PEM (prediction error method)-based, discrete state-space model. Two different models have been devised. The first model is aimed at energy usage forecasting and diagnostics; here, a novel idea is investigated that uses the integral of thermal capacity to identify the thermal model of the house. The purpose of the second identification is to build a model for the control strategy. The controller should be able to take into account weather forecast information, deal with the operating point constraints, and at the same time minimize energy consumption. To design an optimal controller, an MPC (Model Predictive Control) scheme, a receding horizon approach, has been implemented instead of the present thermostatic/hysteretic control. The capability of the proposed schemes has also been investigated.
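As an illustration of the MPC scheme described above, the following sketch solves a receding-horizon heating problem for a first-order thermal model of the kind a prediction-error method might identify. The model parameters, comfort band, capacity limit, and weather forecast are hypothetical, and cvxpy is assumed as the solver interface.

```python
# A minimal model predictive control sketch for one thermostatically
# controlled load, assuming the discrete first-order thermal model
#   T[k+1] = a*T[k] + (1-a)*T_out[k] + b*u[k]
# with hypothetical identified parameters. Requires cvxpy.
import cvxpy as cp
import numpy as np

N = 24                                                 # horizon (hours)
a, b = 0.9, 0.5                                        # hypothetical parameters
T_out = 5 + 5 * np.sin(np.linspace(0, 2 * np.pi, N))   # weather forecast, deg C

u = cp.Variable(N, nonneg=True)   # heating input (kW)
T = cp.Variable(N + 1)            # indoor temperature trajectory

constraints = [T[0] == 20]
for k in range(N):
    constraints += [T[k + 1] == a * T[k] + (1 - a) * T_out[k] + b * u[k],
                    u[k] <= 5,                          # capacity limit
                    T[k + 1] >= 19, T[k + 1] <= 23]     # comfort band

problem = cp.Problem(cp.Minimize(cp.sum(u)), constraints)  # minimize energy
problem.solve()
print("optimal energy use:", round(float(problem.value), 2), "kWh")
```

In a receding-horizon implementation, only the first control move would be applied before re-solving with an updated forecast.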
NASA Astrophysics Data System (ADS)
Ghaffarian, Saman; Ghaffarian, Salar
2014-11-01
This paper proposes an improved FastICA model, named Purposive FastICA (PFICA), initialized by a simple color space transformation and a novel masking approach, to automatically detect buildings from high-resolution Google Earth imagery. The ICA and FastICA algorithms are Blind Source Separation (BSS) techniques for unmixing source signals using reference data sets. In order to overcome the limitations of the ICA and FastICA algorithms and make them purposeful, we developed a novel method involving three main steps: (1) improving the FastICA algorithm using the Moore-Penrose pseudo-inverse matrix model; (2) automated seeding of the PFICA algorithm based on the LUV color space and proposed simple rules to split the image into three regions: shadow + vegetation, bare soil + roads, and buildings; and (3) masking out the final building detection results from the PFICA outputs using the K-means clustering algorithm with two clusters and conducting simple morphological operations to remove noise. Evaluation of the results illustrates that buildings detected from dense and suburban districts with diverse characteristics and color combinations using our proposed method achieve 88.6% and 85.5% overall pixel-based and object-based precision, respectively.
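A rough sketch of the unmixing-and-masking stages (steps 2-3) using off-the-shelf components follows; scikit-learn's FastICA and KMeans stand in for the authors' modified PFICA and its seeding rules, which are not reproduced here.

```python
# Sketch of the pipeline's shape: ICA unmixing, K-means with two clusters,
# then simple morphological cleanup. The random image is a stand-in for a
# Google Earth tile; the choice of "building" channel is data-dependent.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans
from scipy import ndimage

rgb = np.random.rand(100, 100, 3)            # stand-in imagery
pixels = rgb.reshape(-1, 3)

sources = FastICA(n_components=3, random_state=0).fit_transform(pixels)
building_channel = sources[:, 0].reshape(100, 100)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    building_channel.reshape(-1, 1)).reshape(100, 100)

mask = ndimage.binary_opening(labels.astype(bool), iterations=2)  # remove noise
print("candidate building pixels:", int(mask.sum()))
```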
Slicing Method for curved façade and window extraction from point clouds
NASA Astrophysics Data System (ADS)
Iman Zolanvari, S. M.; Laefer, Debra F.
2016-09-01
Laser scanning technology is a fast and reliable method for surveying structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of a building are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally efficient method for extracting overall façade and window boundary points and reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through the building. This is done along a façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening area's boundaries, as well as the overall boundary of the façade, in part by using a one-dimensional projection to accelerate processing. Slice counts were optimised at 14.3 slices per vertical metre of building and 25 slices per horizontal metre, irrespective of building configuration or complexity. The proposed procedure was validated by application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings, and nearly 100% on simple ones. Furthermore, computational times were less than 3 s for data sets of up to 2.6 million points, while similar existing approaches required more than 16 h for such datasets.
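The core slicing idea can be sketched in a few lines: cut the façade into thin horizontal slices and use a 1D histogram along the width as the one-dimensional projection, reading empty bins as openings. The synthetic façade and point densities below are hypothetical, and the principal-plane step is skipped.

```python
# Sketch of the slicing idea on a synthetic 10 m x 6 m façade with one
# carved-out window; ~14.3 slices per vertical metre, as in the paper.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform([0, 0], [10, 6], size=(200_000, 2))        # x (width), z (height)
window = (pts[:, 0] > 4) & (pts[:, 0] < 6) & (pts[:, 1] > 2) & (pts[:, 1] < 4)
pts = pts[~window]                                           # carve out one opening

n_slices = int(6 * 14.3)
for z_lo in np.linspace(0, 6, n_slices, endpoint=False):
    sl = pts[(pts[:, 1] >= z_lo) & (pts[:, 1] < z_lo + 6 / n_slices)]
    hist, edges = np.histogram(sl[:, 0], bins=100, range=(0, 10))
    gaps = edges[:-1][hist == 0]                             # empty bins = opening
    if gaps.size:
        print(f"slice at z={z_lo:.2f} m: opening near x={gaps.min():.1f}-{gaps.max():.1f} m")
        break
```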
Additive Manufacturing Techniques for the Reconstruction of 3D Fetal Faces
Citro, Daniela; Padula, Francesco; Motyl, Barbara; Marcolin, Federica; Calì, Michele
2017-01-01
This paper deals with additive manufacturing techniques for the creation of 3D fetal face models starting from routine 3D ultrasound data. In particular, two distinct themes are addressed. First, a method for processing and building 3D models based on the use of medical image processing techniques is proposed. Second, the preliminary results of a questionnaire distributed to future parents consider the use of these reconstructions both from an emotional and an affective point of view. In particular, the study focuses on the enhancement of the perception of maternity or paternity and the improvement in the relationship between parents and physicians in case of fetal malformations, in particular facial or cleft lip diseases. PMID:29410600
Choi, Ickwon; Kattan, Michael W; Wells, Brian J; Yu, Changhong
2012-01-01
In the medical community, prognostic models that use clinicopathologic features to predict prognosis after a certain treatment have been externally validated and used in practice. In recent years, most research has focused on high-dimensional genomic data and small sample sizes. Since clinically similar but molecularly heterogeneous tumors may produce different clinical outcomes, the combination of clinical and genomic information, which may be complementary, is crucial to improve the quality of prognostic predictions. However, there is a lack of an integration scheme for clinico-genomic models due to the P ≥ N problem, in particular for a parsimonious model. We propose a methodology to build a reduced yet accurate integrative model using a hybrid approach based on the Cox regression model, which uses several dimension reduction techniques, L₂ penalized maximum likelihood estimation (PMLE), and resampling methods to tackle the problem. The predictive accuracy of the modeling approach is assessed by several metrics via an independent and thorough scheme to compare competing methods. In studies of breast cancer data on metastasis and death events, we show that the proposed methodology can improve prediction accuracy and build a final model with a hybrid signature that is parsimonious when integrating both types of variables.
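A minimal sketch of an L2-penalized Cox fit on combined clinical and genomic covariates is shown below; lifelines' ridge-penalized CoxPHFitter is assumed as a stand-in for the paper's PMLE-plus-resampling pipeline, and the data are synthetic.

```python
# Ridge (L2) penalized Cox regression on mixed clinical + genomic
# covariates; synthetic survival data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, p_gene = 120, 50                        # many genomic features vs. few samples
df = pd.DataFrame(rng.normal(size=(n, p_gene)),
                  columns=[f"gene_{j}" for j in range(p_gene)])
df["age"] = rng.uniform(40, 80, n)         # clinical covariate
df["T"] = rng.exponential(10, n)           # follow-up time
df["E"] = rng.integers(0, 2, n)            # event indicator

cph = CoxPHFitter(penalizer=0.5, l1_ratio=0.0)   # pure L2 (ridge) penalty
cph.fit(df, duration_col="T", event_col="E")
print(cph.summary[["coef"]].head())
```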
ERIC Educational Resources Information Center
Davis, Jerry L., Ed.; Nelson, Cathy S., Ed.; Gauger, Elizabeth S., Ed.
This book presents useful information on types and levels of aggression, how conflicts escalate, early warning signs of violence, and characteristics of a safe school. Violence prevention techniques, many based on principles that form the foundation of the Boys Town Education Model, show educators and administrators how to calm an angry or…
Building Exposure Maps Of Urban Infrastructure And Crop Fields In The Mekong River Basin
NASA Astrophysics Data System (ADS)
Haas, E.; Weichselbaum, J.; Gangkofner, U.; Miltzer, J.; Wali, A.
2013-12-01
In the frame of the Integrated Water Resources Management (IWRM) initiative for the Mekong river basin, the World Bank is collaborating with the Mekong River Commission and governmental organizations in Cambodia, Lao PDR, Thailand and Vietnam to build national and regional capacities for managing the risks associated with natural disasters, such as floods, flash floods and droughts. Within 'eoworld', a joint initiative set up by ESA and the World Bank to foster the use of Earth Observation (EO) for sustainable development work, a comprehensive database of elements at risk in the Lower Mekong river basin has been established by GeoVille, including urban infrastructure and crops (primarily rice paddies). In the long term, this exposure information shall be fed into an open-source multi-hazard modeling tool for risk assessment along the Mekong River, which shall then be used by national stakeholders as well as insurance and financial institutions for planning, disaster preparedness and emergency management. Earth Observation techniques can provide objective, synoptic and repetitive observations of elements at risk, including buildings, infrastructure and crops. Through the fusion of satellite-based data with in-situ data from field surveys and local knowledge (e.g. on building materials), features at risk can be characterised and mapped with high accuracy. The Earth Observation data utilised comprise bi-weekly Envisat ASAR imagery programmed for a period of 9 months in 2011 to map the development of the rice cultivation area, identify predominant cropping systems (wet-season vs. dry-season cultivation), crop cycles (single/double/triple crop per year), dates of emergence/harvest, and the distinction between rice planted under intensive (SRI) vs. regular cultivation techniques. Very High Resolution (VHR) optical data from SPOT, KOMPSAT and QuickBird were used for mapping of buildings and infrastructure, such as building footprints, residential/commercial areas, industrial buildings, main infrastructure, and other public assets. A key input to this work was data collected by the project team in the field with the purpose of scoping information about buildings, including material, height (number of stories), construction technique, and floor area. A high-resolution satellite-based Digital Elevation Model was additionally generated to provide surface elevations of vegetation and man-made objects with a vertical accuracy of 10 m. By using this methodology, thousands of buildings and infrastructure features were mapped, clearly indicating the location and characteristics of the assets. Exposure maps were complemented with the analysis of historical flood and drought events, using ERS and Envisat ASAR radar data for historical flood mapping alongside vegetation index data from SPOT-VEGETATION and NOAA-AVHRR for drought events.
Designing and benchmarking the MULTICOM protein structure prediction system
2013-01-01
Background Predicting protein structure from sequence is one of the most significant and challenging problems in bioinformatics. Numerous bioinformatics techniques and tools have been developed to tackle almost every aspect of protein structure prediction ranging from structural feature prediction, template identification and query-template alignment to structure sampling, model quality assessment, and model refinement. How to synergistically select, integrate and improve the strengths of the complementary techniques at each prediction stage and build a high-performance system is becoming a critical issue for constructing a successful, competitive protein structure predictor. Results Over the past several years, we have constructed a standalone protein structure prediction system MULTICOM that combines multiple sources of information and complementary methods at all five stages of the protein structure prediction process including template identification, template combination, model generation, model assessment, and model refinement. The system was blindly tested during the ninth Critical Assessment of Techniques for Protein Structure Prediction (CASP9) in 2010 and yielded very good performance. In addition to studying the overall performance on the CASP9 benchmark, we thoroughly investigated the performance and contributions of each component at each stage of prediction. Conclusions Our comprehensive and comparative study not only provides useful and practical insights about how to select, improve, and integrate complementary methods to build a cutting-edge protein structure prediction system but also identifies a few new sources of information that may help improve the design of a protein structure prediction system. Several components used in the MULTICOM system are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:23442819
A multi-criteria model for the comparison of building envelope energy retrofits
NASA Astrophysics Data System (ADS)
Donnarumma, Giuseppe; Fiore, Pierfrancesco
2017-02-01
In light of the current EU guidelines in the energy field, improving building envelope performance cannot be separated from satisfying environmental sustainability requirements, reducing the costs associated with the life cycle of the building, and ensuring economic and financial feasibility. Therefore, identifying the "optimal" energy retrofit solutions requires the simultaneous assessment of several factors and thus becomes a problem of choice between several possible alternatives. To facilitate the work of decision-makers, public or private, adequate decision support tools are of great importance. Starting from this need, a model based on the AHP multi-criteria analysis technique is proposed, along with the definition of three synthetic indices associated with the three requirements of "Energy Performance", "Sustainability Performance" and "Cost". From the weighted aggregation of the three indices, a global index of preference is obtained that makes it possible to quantify the satisfaction level of the i-th alternative from the point of view of a particular group of decision-makers. The model is then applied, by way of example, to the case study of the energy retrofit of a former factory, assuming its functional conversion. Twenty possible alternative interventions on the opaque vertical closures, resulting from the combination of three families of thermal insulation (synthetic, natural and mineral) with four energy retrofitting techniques, are compared, and the results obtained are critically discussed by considering the points of view of three different groups of decision-makers.
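The AHP step can be sketched compactly: derive the weights of the three indices from a pairwise comparison matrix via its principal eigenvector, check the consistency ratio, then aggregate. The comparison judgments and alternative scores below are hypothetical.

```python
# Minimal AHP sketch: priority weights from the principal eigenvector of a
# Saaty-scale pairwise comparison matrix, plus a consistency check.
import numpy as np

# A[i, j] = importance of criterion i relative to criterion j
# (criteria: Energy Performance, Sustainability Performance, Cost)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))

# Global preference index of one alternative under these weights:
scores = np.array([0.7, 0.5, 0.9])        # hypothetical normalized indices
print("global index:", round(float(w @ scores), 3))
```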
Failure mechanism of shear-wall dominant multi-story buildings
Yuksel, S.B.; Kalkan, E.
2008-01-01
The recent trend in the building industry of Turkey, as well as in many European countries, is towards utilizing the tunnel form (shear-wall dominant) construction system for the development of multi-story residential units. Tunnel form buildings diverge from other conventional reinforced concrete (RC) buildings due to the lack of beams and columns in their structural system. The vertical load-carrying members of these buildings are structural walls only, and the floor system is a flat plate. Besides their constructive advantages, tunnel form buildings provide superior seismic performance compared to conventional RC frame and dual systems, as observed during the recent devastating earthquakes in Turkey (1999 Mw 7.4 Kocaeli, Mw 7.2 Duzce, and 2004 Mw 6.5 Bingol). With its proven earthquake performance, the tunnel form system is becoming the primary construction technique in many seismically active regions. In this study, a series of nonlinear analyses were conducted using finite element (FE) models to augment our understanding of their failure mechanism under lateral forces. In order to represent the nonlinear behavior adequately, the FE models were verified against the results of experimental studies performed on three-dimensional (3D) scaled tunnel form building specimens. The results of this study indicate that the structural walls of tunnel form buildings may exhibit brittle flexural failure under lateral loading if they are not properly reinforced. The global tension/compression couple triggers this failure mechanism by creating pure axial tension in the outermost shear walls.
Introduction to a New Approach to Experiential Learning.
ERIC Educational Resources Information Center
Jackson, Lewis; MacIsaac, Doug
1994-01-01
A process model for experiential learning (EL) in adult education begins with the characteristics and needs of adult learners and conceptual foundations of EL. It includes methods and techniques for in-class and field-based experiences, building a folio (point-in-time performance assessment), and portfolio construction (assessing transitional…
Reshaping the Enterprise through an Information Architecture and Process Reengineering.
ERIC Educational Resources Information Center
Laudato, Nicholas C.; DeSantis, Dennis J.
1995-01-01
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
NASA Astrophysics Data System (ADS)
Collier, Richard S.; McKenna, Paul M.; Perala, Rodney A.
1991-08-01
The objective here is to describe the lightning hazards to buildings and their internal environments using advanced formulations of Maxwell's Equations. The method described is the Three Dimensional Finite Difference Time Domain Solution. It can be used to solve for the lightning interaction with such structures in three dimensions with the inclusion of a considerable amount of detail. Special techniques were developed for including wire, plumbing, and rebar into the model. Some buildings have provisions for lightning protection in the form of air terminals connected to a ground counterpoise system. It is shown that fields and currents within these structures can be significantly high during a lightning strike. Time lapse video presentations were made showing the electric and magnetic field distributions on selected cross sections of the buildings during a simulated lightning strike.
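For readers unfamiliar with the finite-difference time-domain method named above, a minimal 1D Yee-scheme sketch follows; the published work is a detailed 3D code with wire, plumbing, and rebar submodels, none of which is reproduced here.

```python
# A minimal 1D FDTD (Yee) leapfrog update for Maxwell's equations in free
# space with a soft Gaussian source, in normalized units (c*dt/dz = 1).
import numpy as np

nz, nt = 400, 800
Ez = np.zeros(nz)
Hy = np.zeros(nz - 1)
courant = 1.0

for t in range(nt):
    Hy += courant * np.diff(Ez)                 # H update (staggered half-step)
    Ez[1:-1] += courant * np.diff(Hy)           # E update
    Ez[50] += np.exp(-((t - 60) / 20.0) ** 2)   # soft Gaussian source

print("peak |Ez| after propagation:", round(float(np.abs(Ez).max()), 3))
```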
Effect of Moisture Content on Thermal Properties of Porous Building Materials
NASA Astrophysics Data System (ADS)
Kočí, Václav; Vejmelková, Eva; Čáchová, Monika; Koňáková, Dana; Keppert, Martin; Maděra, Jiří; Černý, Robert
2017-02-01
The thermal conductivity and specific heat capacity of characteristic types of porous building materials are determined in the whole range of moisture content from dry to fully water-saturated state. A transient pulse technique is used in the experiments, in order to avoid the influence of moisture transport on measured data. The investigated specimens include cement composites, ceramics, plasters, and thermal insulation boards. The effect of moisture-induced changes in thermal conductivity and specific heat capacity on the energy performance of selected building envelopes containing the studied materials is then analyzed using computational modeling of coupled heat and moisture transport. The results show an increased moisture content as a substantial negative factor affecting both thermal properties of materials and energy balance of envelopes, which underlines the necessity to use moisture-dependent thermal parameters of building materials in energy-related calculations.
Hwang, Young Sun; Seo, Minseok; Choi, Hee Jung; Kim, Sang Kyung; Kim, Heebal; Han, Jae Yong
2018-04-01
The chicken is a valuable model organism, especially in evolutionary and embryology research because its embryonic development occurs in the egg. However, despite its scientific importance, no transcriptome data have been generated for deciphering the early developmental stages of the chicken because of practical and technical constraints in accessing pre-oviposited embryos. Here, we determine the entire transcriptome of pre-oviposited avian embryos, including oocyte, zygote, and intrauterine embryos from Eyal-giladi and Kochav stage I (EGK.I) to EGK.X collected using a noninvasive approach for the first time. We also compare RNA-sequencing data obtained using a bulked embryo sequencing and single embryo/cell sequencing technique. The raw sequencing data were preprocessed with two genome builds, Galgal4 and Galgal5, and the expression of 17,108 and 26,102 genes was quantified in the respective builds. There were some differences between the two techniques, as well as between the two genome builds, and these were affected by the emergence of long intergenic noncoding RNA annotations. The first transcriptome datasets of pre-oviposited early chicken embryos based on bulked and single embryo sequencing techniques will serve as a valuable resource for investigating early avian embryogenesis, for comparative studies among vertebrates, and for novel gene annotation in the chicken genome.
Challoner, Avril; Pilla, Francesco; Gill, Laurence
2015-12-01
NO₂ and particulate matter are the air pollutants of most concern in Ireland, with possible links to the higher respiratory and cardiovascular mortality and morbidity rates found in the country compared to the rest of Europe. Currently, air quality limits in Europe only cover outdoor environments, yet the quality of indoor air is an essential determinant of a person's well-being, especially since the average person spends more than 90% of their time indoors. The modelling conducted in this research aims to provide a framework for epidemiological studies by using publicly available data from fixed outdoor monitoring stations to predict indoor air quality more accurately. Predictions are made using two modelling techniques: the Personal-exposure Activity Location Model (PALM), to predict outdoor air quality at a particular building, and Artificial Neural Networks, to model the indoor/outdoor relationship of the building. This joint approach has been used to predict indoor air concentrations for three inner-city commercial buildings in Dublin, where parallel indoor and outdoor diurnal monitoring had been carried out on site. The methodology has been shown to provide reasonable predictions of average indoor NO₂ concentrations compared to the monitored data, but did not perform well in the prediction of indoor PM2.5 concentrations. Hence, this approach could be used to determine more rigorously the NO₂ exposures of those who work and/or live in the city centre, which can then be linked to potential health impacts.
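A small sketch of the second modelling stage, learning a building's indoor/outdoor NO₂ relationship with a neural network, is shown below; scikit-learn's MLPRegressor and the synthetic indoor/outdoor ratio are assumptions, not the authors' exact network or data.

```python
# Learn an indoor/outdoor NO2 relationship from (synthetic) paired
# measurements with a small feed-forward network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
outdoor = rng.uniform(10, 80, size=(500, 1))               # outdoor NO2, ug/m^3
indoor = 0.4 * outdoor[:, 0] + 5 + rng.normal(0, 3, 500)   # hypothetical I/O ratio

X_tr, X_te, y_tr, y_te = train_test_split(outdoor, indoor, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(ann.score(X_te, y_te), 2))
```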
Skrzat, Janusz; Spulber, Alexandru; Walocha, Jerzy
This paper presents the results of building mesh models of the human skull and the cranial bones from a series of CT scans. With the aid of computer software, 3D reconstructions of the whole skull and segmented cranial bones were performed and visualized by surface rendering techniques. The article briefly discusses clinical and educational applications of 3D cranial models created using stereolithographic reproduction.
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language, building a virtual museum and ensuring its effective interaction with the offline museum lie in making full use of 3D panorama, virtual reality, and augmented reality techniques, and in innovatively taking advantage of dynamic environment modeling, real-time 3D graphics generation, system integration, and other key virtual reality techniques in the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is based on static images of reality. Virtual reality is a computer simulation technique that creates an interactive 3D dynamic visual world that users can experience. Augmented reality, also known as mixed reality, simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience directly in reality. These technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-06-01
The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar), considering technical and economic benefits. The objective of the analysis was to develop guidelines for reducing energy requirements, reducing conventional fuel use, and identifying economic alternatives for building owners. The analysis was done with a solar system in place, which makes the study unique in that it determines the interaction of energy conservation with a solar system. The study therefore established guidelines on how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of the energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. The lists of energy-conservation techniques and alternative energy sources were then reduced to manageable size by using technical attributes to select the best candidates for further study. The resulting energy-conservation techniques were described in detail and their installed costs determined; the alternative energy source reduced to solar. Building construction characteristics were defined for each building type for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1987-01-01
A new approach to building geographic data models that is based on the fundamental characteristics of the data is presented. An overall theoretical framework for representing geographic data is proposed. An example of utilizing this framework in a Geographic Information System (GIS) context by combining artificial intelligence techniques with recent developments in spatial data processing techniques is given. Elements of data representation discussed include hierarchical structure, separation of locational and conceptual views, and the ability to store knowledge at variable levels of completeness and precision.
Information loss and reconstruction in diffuse fluorescence tomography
Bonfert-Taylor, Petra; Leblond, Frederic; Holt, Robert W.; Tichauer, Kenneth; Pogue, Brian W.; Taylor, Edward C.
2012-01-01
This paper is a theoretical exploration of spatial resolution in diffuse fluorescence tomography. It is demonstrated that, given a fixed imaging geometry, one cannot—relative to standard techniques such as Tikhonov regularization and truncated singular value decomposition—improve the spatial resolution of the optical reconstructions via increasing the node density of the mesh considered for modeling light transport. Using techniques from linear algebra, it is shown that, as one increases the number of nodes beyond the number of measurements, information is lost by the forward model. It is demonstrated that this information cannot be recovered using various common reconstruction techniques. Evidence is provided showing that this phenomenon is related to the smoothing properties of the elliptic forward model that is used in the diffusion approximation to light transport in tissue. This argues for reconstruction techniques that are sensitive to boundaries, such as L1-reconstruction and the use of priors, as well as the natural approach of building a measurement geometry that reflects the desired image resolution. PMID:22472763
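The core linear-algebra point, that unknowns beyond the number of measurements fall into the forward matrix's null space and are invisible to any reconstruction, can be illustrated directly; the random matrix below is only a stand-in for a diffusion forward model.

```python
# Once the number of unknowns n exceeds the number of measurements m, the
# forward matrix has a nontrivial null space; image components in that
# null space produce zero measurement and cannot be recovered by TSVD,
# Tikhonov, or any other linear reconstruction.
import numpy as np

rng = np.random.default_rng(0)
m, n = 32, 128                         # measurements << unknowns (fine mesh)
J = rng.normal(size=(m, n))            # stand-in forward (sensitivity) matrix

U, s, Vt = np.linalg.svd(J, full_matrices=True)
null_basis = Vt[m:]                    # directions producing zero measurement

x_true = rng.normal(size=n)
x_invisible = null_basis.T @ (null_basis @ x_true)   # null-space component
print("fraction of image energy invisible to the data:",
      round(float(np.sum(x_invisible**2) / np.sum(x_true**2)), 2))
```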
Ergun, Bahadir; Sahin, Cumhur; Baz, Ibrahim; Ustuntas, Taner
2010-06-01
Terrestrial laser scanning is a popular methodology, used frequently in documenting historical buildings and cultural heritage. The historical peninsula sprawls over an area of approximately 1,500 ha and is one of the main areas of aggregation of historical buildings in Istanbul. In this study, terrestrial laser scanning and close-range photogrammetry techniques are integrated to create a 3D city model of this part of Istanbul, including some of the buildings that represent the most brilliant eras of the Byzantine and Ottoman Empires. Several terrestrial laser scanners with different specifications were used to solve various geometric scanning problems for distinct areas of the subject city. The photogrammetric method was used for the documentation of the façades of these historical buildings for architectural purposes. This study differentiates itself from similar ones by an application process that focuses on the geometry, building texture, and density of the study area. Nowadays, the largest-scale 3D modeling studies, in terms of measurement methodology, are urban modeling studies; because of this large scale, 3D urban modeling is executed in a gradual way. In this study, a modeling method based on street façades was used, and the complementary elements of the modeling process were combined in several ways. A street model is presented as a sample of the applied study. In our application, modeling based on close-range photogrammetry was combined, through joint calibration, with terrestrial laser scanner data in a compatible way. The final model was completed with pedestal data for 3D visualization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopez, Tammy Ann
Technical Area-18 (TA-18), also known as Pajarito Site, is located on Los Alamos National Laboratory property and has historic buildings that will be included in the Manhattan Project National Historical Park. Characterization studies of metal contamination were needed in two of the four buildings in this area that are on the historic registry: a "battleship" bunker building (TA-18-0002) and the Pond cabin (TA-18-0029). However, these two buildings have been exposed to the elements, are decades old, and have porous and rough surfaces (wood and concrete). Given these conditions, it was questioned whether standard wipe sampling would be adequate to detect surface dust metal contamination in these buildings. Thus, micro-vacuum and surface wet wipe sampling techniques were performed side-by-side at both buildings and the results were compared statistically. A two-tail paired t-test revealed that the micro-vacuum and wet wipe techniques were statistically different for both buildings. Further mathematical analysis revealed that the wet wipe technique picked up more metals from the surface than the micro-vacuum technique. Wet wipes revealed concentrations of beryllium and lead above internal housekeeping limits; however, an yttrium normalization method, with linear regression analysis between beryllium and yttrium, revealed a correlation indicating that the beryllium levels were likely due to background rather than operational contamination. PPE and administrative controls were implemented for National Park Service (NPS) and Department of Energy (DOE) tours as a result of this study. Overall, this study indicates that the micro-vacuum technique may not be an efficient technique for sampling metal dust contamination.
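The yttrium normalization step amounts to a simple linear regression; a sketch with synthetic concentrations follows (scipy's linregress is assumed, and a strong Be-Y correlation supports the background interpretation).

```python
# Yttrium normalization sketch: regress beryllium on yttrium across wipe
# samples. Points following the regression line are consistent with
# natural background, since both elements co-occur in soil and dust;
# points far above it would suggest operational Be contamination.
# All concentrations are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
yttrium = rng.uniform(0.1, 2.0, 30)                     # ug per sample
beryllium = 0.15 * yttrium + rng.normal(0, 0.01, 30)    # background-like Be

fit = stats.linregress(yttrium, beryllium)
print(f"slope={fit.slope:.3f}, r^2={fit.rvalue**2:.2f}")
# A high r^2 supports the background interpretation.
```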
Vision-based system identification technique for building structures using a motion capture system
NASA Astrophysics Data System (ADS)
Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon
2015-11-01
This paper presents a new vision-based system identification (SI) technique for building structures using a motion capture system (MCS). The MCS, with its outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequencies, mode shapes, and damping ratios) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the MCS displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI applying FDD directly to the MCS-measured displacements was performed and showed results identical to those of the conventional SI method.
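A compact sketch of the FDD step is given below: estimate the cross-power spectral density matrix of the measured responses and track its first singular value across frequency, whose peaks indicate natural frequencies. The three-channel signal is synthetic, not MCS data.

```python
# Frequency domain decomposition sketch on a synthetic 3-channel response
# containing two modes (3 Hz and 8 Hz) plus noise.
import numpy as np
from scipy import signal

fs, T = 200.0, 60.0
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(0)
modes = np.outer([1.0, 0.8, 0.4], np.sin(2 * np.pi * 3 * t)) + \
        np.outer([0.9, -0.2, -0.8], np.sin(2 * np.pi * 8 * t))
acc = modes + 0.5 * rng.normal(size=modes.shape)

n_ch = acc.shape[0]
freqs, _ = signal.csd(acc[0], acc[0], fs=fs, nperseg=1024)
G = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)   # CPSD matrix per frequency
for i in range(n_ch):
    for j in range(n_ch):
        _, G[:, i, j] = signal.csd(acc[i], acc[j], fs=fs, nperseg=1024)

s1 = np.linalg.svd(G, compute_uv=False)[:, 0]           # first singular value
peaks = signal.find_peaks(s1, prominence=s1.max() / 5)[0]
print("identified natural frequencies (Hz):", freqs[peaks])
```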
Development and Application of Agglomerated Multigrid Methods for Complex Geometries
NASA Technical Reports Server (NTRS)
Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.
2010-01-01
We report progress in the development of agglomerated multigrid techniques for fully unstructured grids in three dimensions, building upon two previous studies focused on efficiently solving a model diffusion equation. We demonstrate a robust fully-coarsened agglomerated multigrid technique for 3D complex geometries, incorporating the following key developments: consistent and stable coarse-grid discretizations, a hierarchical agglomeration scheme, and line-agglomeration/relaxation using prismatic-cell discretizations in the highly-stretched grid regions. A significant speed-up in computer time is demonstrated for a model diffusion problem, the Euler equations, and the Reynolds-averaged Navier-Stokes equations for 3D realistic complex geometries.
Techniques for plotting shadow patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bainbridge, D.A.
1982-02-01
Basic approaches for plotting shadow patterns (summer or winter) are discussed, illustrated, and compared. The solar simulator technique uses floodlights or a movable table to mimic the sun's path over a model of the building being studied. The drawback is that, for large developments, very small models would have to be built. Two graphic solutions are described: (1) sun angles are used to calculate shadow patterns trigonometrically, and (2) drawings are made and shadows are calculated. Examples are given for a house on level ground and on sloping ground. Calculations of shade density are also illustrated. 8 references. (MJJ)
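The trigonometric graphic solution reduces to a one-liner: an object of height h casts a shadow of length h / tan(solar altitude) away from the sun's azimuth. The angles below are hypothetical inputs; a real study would derive them from site latitude, date, and hour.

```python
# Shadow length and direction from solar altitude and azimuth.
import math

def shadow(height_m, altitude_deg, azimuth_deg):
    """Return shadow length (m) and the compass direction it falls toward."""
    length = height_m / math.tan(math.radians(altitude_deg))
    direction = (azimuth_deg + 180.0) % 360.0      # opposite the sun
    return length, direction

length, direction = shadow(height_m=6.0, altitude_deg=25.0, azimuth_deg=220.0)
print(f"6 m eave -> {length:.1f} m shadow toward {direction:.0f} deg")
```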
Integration of LIDAR Data Into a Municipal GIS to Study Solar Radiation
NASA Astrophysics Data System (ADS)
Africani, P.; Bitelli, G.; Lambertini, A.; Minghetti, A.; Paselli, E.
2013-04-01
Identifying the right roofs on which to install solar panels in an urban area is crucial for both private citizens and the whole local population. The task is not easy because many considerations must be made: insolation, orientation of the surface, size of the surface, shading due to topography, shading due to taller buildings next to the surface, shading due to taller vegetation, and other problems typical of urban areas, such as the presence of chimneys. The accuracy of the data related to the analyzed surfaces is fundamental, as is the detail of the geometric models used to represent buildings and their roofs. The complexity that these roofs can reach is considerable. This work uses LiDAR data to obtain, with a semi-automatic technique, the full geometry of each roof part, complementing the pre-existing building data in the municipal cartography. With these data it is possible to evaluate the placement of solar panels on the roofs of a whole city, analyzing the solar potential of each building in detail. Other traditional techniques, like photogrammetry, need a strong manual editing effort in order to identify slopes and insert vectors on surfaces at the right height. Regarding LiDAR data, in order to perform accurate modelling, it is necessary to obtain a high-density point cloud. The proposed method can also be used as a fast and linear workflow for an area where LiDAR data are available and a municipal cartography already exists; LiDAR data can furthermore be successfully used to cross-check errors in pre-existing digital cartography that might otherwise remain hidden.
Poulain, Christophe A.; Finlayson, Bruce A.; Bassingthwaighte, James B.
2010-01-01
The analysis of experimental data obtained by the multiple-indicator method requires complex mathematical models for which capillary blood-tissue exchange (BTEX) units are the building blocks. This study presents a new, nonlinear, two-region, axially distributed, single-capillary BTEX model. A facilitated transporter model is used to describe mass transfer between the plasma and intracellular spaces. To provide fast and accurate solutions, numerical techniques suited to nonlinear convection-dominated problems are implemented. These techniques are the random choice method, an explicit Euler-Lagrange scheme, and the MacCormack method with and without flux correction. The accuracy of the numerical techniques is demonstrated, and their efficiencies are compared. The random choice, Euler-Lagrange, and plain MacCormack methods are the best numerical techniques for BTEX modeling. However, the random choice and Euler-Lagrange methods are preferred over the MacCormack method because they allow for the derivation of a heuristic criterion that makes the numerical methods stable without degrading their efficiency. Numerical solutions are also used to illustrate some nonlinear behaviors of the model and to show how the new BTEX model can be used to estimate parameters from experimental data. PMID:9146808
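As a minimal illustration of one of the convection-suited schemes named above, here is the MacCormack predictor-corrector applied to the pure 1D advection part of such a model, with no exchange terms and no flux correction; the pulse and parameters are hypothetical.

```python
# MacCormack scheme for u_t + v*u_x = 0: forward-difference predictor,
# backward-difference corrector, averaged.
import numpy as np

nx, nt = 200, 150
v, dx = 1.0, 1.0 / 200
dt = 0.8 * dx / v                                        # CFL-limited step
x = np.arange(nx) * dx
u = np.exp(-((x - 0.2) / 0.05) ** 2)                     # inlet concentration pulse

for _ in range(nt):
    up = u.copy()
    up[:-1] = u[:-1] - v * dt / dx * (u[1:] - u[:-1])                   # predictor
    u[1:] = 0.5 * (u[1:] + up[1:] - v * dt / dx * (up[1:] - up[:-1]))   # corrector

print("pulse peak now at x =", round(float(np.argmax(u)) * dx, 2))
```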
Chen, Yun; Nielsen, Jens
2013-12-01
Bio-based production of chemical building blocks from renewable resources is an attractive alternative to petroleum-based platform chemicals. Metabolic pathway and strain engineering is the key element in constructing robust microbial chemical factories within the constraints of cost effective production. Here we discuss how the development of computational algorithms, novel modules and methods, omics-based techniques combined with modeling refinement are enabling reduction in development time and thus advance the field of industrial biotechnology. We further discuss how recent technological developments contribute to the development of novel cell factories for the production of the building block chemicals: adipic acid, succinic acid and 3-hydroxypropionic acid. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Arqub, Omar Abu; El-Ajou, Ahmad; Momani, Shaher
2015-07-01
Building fractional mathematical models for specific phenomena and developing numerical or analytical solutions for these models are crucial issues in mathematics, physics, and engineering. In this work, a new analytical technique for constructing and predicting solitary pattern solutions of time-fractional dispersive partial differential equations is proposed, based on the generalized Taylor series formula and a residual error function. The new approach provides solutions in the form of a rapidly convergent series with easily computable components using symbolic computation software. For evaluation and validation, the proposed technique was applied to three different models and compared with some well-known methods. The resulting simulations clearly demonstrate the superiority and potential of the proposed technique, both in the quality and accuracy with which the substructure of the constructed solutions is preserved and in the prediction of solitary pattern solutions for time-fractional dispersive partial differential equations.
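To make the residual-error idea concrete, the following toy sympy sketch applies it to the integer-order model problem y' = y, y(0) = 1: successive derivatives of the residual at t = 0 fix the series coefficients. The fractional case replaces these with Caputo derivatives and fractional powers, which is not shown.

```python
import sympy as sp

t = sp.symbols('t')
n_terms = 6
coeffs = sp.symbols(f'a0:{n_terms}')           # unknown series coefficients
y = sum(c * t**k for k, c in enumerate(coeffs))

# Residual of the model problem y' - y = 0 with y(0) = 1
residual = sp.diff(y, t) - y
sol = {coeffs[0]: 1}                            # initial condition fixes a0
for k in range(n_terms - 1):
    # the k-th derivative of the residual must vanish at t = 0
    eq = sp.diff(residual, t, k).subs(t, 0).subs(sol)
    sol[coeffs[k + 1]] = sp.solve(eq, coeffs[k + 1])[0]

print(sp.expand(y.subs(sol)))   # 1 + t + t**2/2 + ... (the exp(t) series)
```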
NASA Astrophysics Data System (ADS)
Iannitti, Gianluca; Bonora, Nicola; Gentile, Domenico; Ruggiero, Andrew; Testa, Gabriel; Gubbioni, Simone
2017-06-01
In this work, the mechanical behavior of Ti-6Al-4V produced by an additive manufacturing technique was investigated, also considering the build direction. Dog-bone-shaped specimens and Taylor cylinders were machined from rods manufactured by means of an EOSINT M 280 machine, based on the Direct Metal Laser Sintering technique. Tensile tests were performed at strain rates ranging from 5E-4 s-1 to 1000 s-1, using an Instron electromechanical machine for the quasistatic tests and a direct-tension split Hopkinson bar for the dynamic tests. The mechanical strength of the material was described by a Johnson-Cook model modified to account for the stress saturation occurring at high strain. Taylor cylinder tests and corresponding numerical simulations were carried out in order to validate the constitutive model under a complex deformation path, high strain rates, and high temperatures.
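A minimal sketch of the baseline Johnson-Cook flow stress follows. The paper's saturation modification is not specified in the abstract, so the cap applied below, like the material constants, is an illustrative assumption (typical literature values for Ti-6Al-4V).

```python
import numpy as np

def johnson_cook_stress(eps_p, eps_dot, T, A=997.9, B=653.1, n=0.45,
                        C=0.0198, m=0.7, eps_dot0=1.0,
                        T_room=293.0, T_melt=1878.0):
    """Baseline Johnson-Cook flow stress (MPa). Constants are typical
    literature values for Ti-6Al-4V, not the paper's fitted parameters."""
    T_star = (T - T_room) / (T_melt - T_room)
    return (A + B * eps_p**n) * (1 + C * np.log(eps_dot / eps_dot0)) \
        * (1 - T_star**m)

sigma = johnson_cook_stress(eps_p=0.2, eps_dot=1000.0, T=293.0)
sigma_sat = 1400.0        # hypothetical saturation stress, MPa (assumed form)
print(min(sigma, sigma_sat))
```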
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Na; Makhmalbaf, Atefe; Srivastava, Viraj
This paper presents a new technique for, and the results of, normalizing building energy consumption to enable a fair comparison among various types of buildings located near different weather stations across the U.S. The method was developed for the U.S. Building Energy Asset Score, a whole-building energy efficiency rating system focusing on the building envelope, mechanical systems, and lighting systems. The Asset Score is calculated based on simulated energy use under standard operating conditions. Existing weather normalization methods, such as those based on heating and cooling degree days, are not robust enough to adjust for all climatic factors, such as humidity and solar radiation. In this work, over 1000 sets of climate coefficients were developed to separately adjust building heating, cooling, and fan energy use at each weather station in the United States. This paper also presents a robust, standardized weather station mapping based on climate similarity rather than choosing the closest weather station. The proposed simulation-based climate adjustment was validated through testing on several hundred thousand modeled buildings. Results indicated the developed climate coefficients can isolate and adjust for the impacts of local climate for asset rating.
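A minimal sketch of how per-station, per-end-use climate coefficients might be applied is shown below; the multiplicative form, station codes, and coefficient values are our assumptions, not the paper's.

```python
# Hypothetical multiplicative climate adjustment by end use.
# Coefficient structure and values are illustrative assumptions.
CLIMATE_COEFFS = {
    "KBOS": {"heating": 1.18, "cooling": 0.86, "fan": 0.97},  # Boston-like
    "KPHX": {"heating": 0.55, "cooling": 1.42, "fan": 1.08},  # Phoenix-like
}

def normalize_energy(raw_use, station):
    """Scale simulated heating/cooling/fan energy to climate-neutral values."""
    coeffs = CLIMATE_COEFFS[station]
    return {end_use: raw_use[end_use] / coeffs[end_use] for end_use in coeffs}

simulated = {"heating": 410.0, "cooling": 150.0, "fan": 60.0}  # kWh/m2, toy
print(normalize_energy(simulated, "KBOS"))
```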
NASA Astrophysics Data System (ADS)
Seker, D. Z.; Alkan, M.; Kutoglu, S. S.; Akcin, H.
2010-12-01
Documentation of cultural heritage sites is extremely important for monitoring them and preserving them from natural disasters and human activities. Owing to its very rich history, from the first human settlements at Catalhoyuk and Alacahoyuk through civilizations such as the Byzantine, Seljuk, and Ottoman, Turkey contains many cultural heritage sites. 3D modeling and recording of historical buildings using modern tools and techniques has been conducted, and is still continuing, in several locations in Turkey. Nine cultural sites in Turkey are included in the UNESCO cultural heritage protection list; one of them is the township of Safranbolu, one of the most outstanding examples of traditional Turkish architecture and unique in the way its human settlement has been conserved in its authentic environmental setting up to the present day. This study presents in detail the outcomes and further work of a research project on this study area, supported by the Turkish National Research Center (TUBITAK) under project number 106Y157. The basic aim of the study is the development of a GIS-based information and management system for the city of Safranbolu. All registered historical buildings are linked to the database. 3D models of selected registered historical monuments were produced from data of different sources so as to resemble their original construction, and will be distributed via a web-based information system designed during the project. Some of the buildings were also surveyed using close-range photogrammetry to obtain their facade reliefs, which were likewise linked to the database. The database consists of 3D models, locations, historical information, and cadastral and land-register data of the selected buildings, together with other building-related data collected during the project. Using this system, all kinds of spatial and non-spatial analyses were carried out and different thematic maps of the historical city were produced. When the project is finalized, all the historical buildings of Safranbolu, comprising houses, mosques, fountains, and a caravansary, will be permanently recorded, and their architectural features will be integrated into the designed spatial information system. In addition, via the internet, many people will be able to reach the data easily, which should help increase the number of visitors to the town. This project will also serve as guidance for future related studies.
NASA Technical Reports Server (NTRS)
Russell, P. B.; Pueschel, R. F.; Livingston, J. M.; Bergstrom, R.; Lawless, James G. (Technical Monitor)
1994-01-01
This paper brings together the experimental evidence required to build realistic models of the global evolution of the physical, chemical, and optical properties of the aerosol resulting from the 1991 Pinatubo volcanic eruption. Such models are needed to compute the effects of the aerosol on atmospheric chemistry, dynamics, radiation, and temperature. Whereas there is now a large and growing body of post-Pinatubo measurements by a variety of techniques, some results are in conflict, and a self-consistent, unified picture is needed, along with an assessment of the remaining uncertainties. This paper examines data from photometers, radiometers, impactors, optical counters/sizers, and lidars operated on the ground, aircraft, balloons, and spacecraft.
Deeb, Omar; Shaik, Basheerulla; Agrawal, Vijay K
2014-10-01
Quantitative Structure-Activity Relationship (QSAR) models for the binding affinity constants (log Ki) of 78 flavonoid ligands towards the benzodiazepine site of the GABA(A) receptor complex were calculated using the machine learning methods: artificial neural network (ANN) and support vector machine (SVM) techniques. The models obtained were compared with those obtained using multiple linear regression (MLR) analysis. The descriptor selection and model building were performed with 10-fold cross-validation using the training data set. The SVM and MLR coefficient of determination values are 0.944 and 0.879, respectively, for the training set, and are higher than those of the ANN models. Though the SVM model shows improved fitting of the training set, the ANN model was superior to SVM and MLR in predicting the test set. A randomization test was employed to check the suitability of the models.
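The comparison described can be reproduced in outline with scikit-learn; the sketch below uses synthetic placeholder descriptors and activities, not the study's data, and the settings are illustrative.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(78, 12))        # placeholder descriptors, 78 ligands
y = X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.3, 78)  # synthetic log Ki

for name, model in [("SVM", SVR(kernel="rbf", C=10.0)),
                    ("MLR", LinearRegression())]:
    r2 = cross_val_score(model, X, y, cv=10, scoring="r2")
    print(f"{name}: mean 10-fold R^2 = {r2.mean():.3f}")
```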
Animation Strategies for Smooth Transformations Between Discrete LODs of 3D Building Models
NASA Astrophysics Data System (ADS)
Kada, Martin; Wichmann, Andreas; Filippovska, Yevgeniya; Hermes, Tobias
2016-06-01
The cartographic 3D visualization of urban areas has experienced tremendous progress over the last years. An increasing number of applications operate interactively in real-time and thus require advanced techniques to improve the quality and time response of dynamic scenes. The main focus of this article concentrates on the discussion of strategies for smooth transformation between two discrete levels of detail (LOD) of 3D building models that are represented as restricted triangle meshes. Because the operation order determines the geometrical and topological properties of the transformation process as well as its visual perception by a human viewer, three different strategies are proposed and subsequently analyzed. The simplest one orders transformation operations by the length of the edges to be collapsed, while the other two strategies introduce a general transformation direction in the form of a moving plane. This plane either pushes the nodes that need to be removed, e.g. during the transformation of a detailed LOD model to a coarser one, towards the main building body, or triggers the edge collapse operations used as transformation paths for the cartographic generalization.
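A minimal sketch of the simplest strategy, ordering edge-collapse operations by edge length with a priority queue, is given below; the mesh topology update is elided and the data structures are illustrative.

```python
import heapq
import math

def collapse_order(vertices, edges):
    """Yield edges in the order they would be collapsed (shortest first).
    `vertices` maps id -> (x, y, z); `edges` is an iterable of (i, j) pairs."""
    heap = [(math.dist(vertices[i], vertices[j]), i, j) for i, j in edges]
    heapq.heapify(heap)
    removed = set()
    while heap:
        _, i, j = heapq.heappop(heap)
        if i in removed or j in removed:
            continue           # an endpoint was already merged away; stale entry
        removed.add(j)         # collapse j into i (mesh topology update elided)
        yield (i, j)

verts = {0: (0, 0, 0), 1: (1, 0, 0), 2: (1, 0.1, 0), 3: (0, 1, 0)}
print(list(collapse_order(verts, [(0, 1), (1, 2), (2, 3), (0, 3)])))
```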
Detecting and disentangling nonlinear structure from solar flux time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.
1992-01-01
Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques are presented for decoupling additive and multiplicative white noise from deterministic dynamics, and the falloff of the power spectrum at high frequencies is examined as a possible means of distinguishing deterministic chaos from noise that is spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.
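A minimal numpy sketch of the spectral-falloff diagnostic follows: estimate the power spectrum and fit its high-frequency slope on log axes. A slope near zero suggests white noise, while a steep falloff suggests smooth deterministic dynamics. Entirely illustrative, not the authors' implementation.

```python
import numpy as np

def highfreq_slope(x, dt=1.0, frac=0.25):
    """Fit a log-log slope to the top `frac` of the power spectrum of x."""
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    f = np.fft.rfftfreq(len(x), dt)
    hi = f > (1 - frac) * f.max()
    slope, _ = np.polyfit(np.log(f[hi]), np.log(psd[hi]), 1)
    return slope   # ~0 for white noise, strongly negative for smooth dynamics

rng = np.random.default_rng(2)
print(highfreq_slope(rng.normal(size=4096)))              # near zero
t = np.arange(4096) * 0.01
print(highfreq_slope(np.sin(t) + 0.5 * np.sin(2.2 * t)))  # steeply negative
```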
SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations
NASA Astrophysics Data System (ADS)
Baes, M.; Camps, P.
2015-09-01
The Monte Carlo method is the most popular technique for performing radiative transfer simulations in a general 3D geometry. The algorithms behind, and acceleration techniques for, Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used for the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks into more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
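The decorator-based design can be illustrated compactly; the sketch below shows a basic building block exposing random-position sampling and a chainable decorator that alters it. Class names are illustrative, not SKIRT's actual API.

```python
import numpy as np

class UniformSphere:
    """Basic building block: uniform density inside radius R."""
    def __init__(self, R=1.0, rng=None):
        self.R, self.rng = R, rng or np.random.default_rng()
    def random_position(self):
        while True:                      # rejection sampling in the cube
            p = self.rng.uniform(-self.R, self.R, 3)
            if np.dot(p, p) <= self.R ** 2:
                return p

class Shifted:
    """Decorator: translate any component; decorators chain freely."""
    def __init__(self, component, offset):
        self.component, self.offset = component, np.asarray(offset)
    def random_position(self):
        return self.component.random_position() + self.offset

# Chain decorators to build a more complex model from simple blocks
model = Shifted(Shifted(UniformSphere(R=2.0), [5, 0, 0]), [0, 3, 0])
print(model.random_position())
```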
NASA Astrophysics Data System (ADS)
Ahmad, S.; Husain, A.; Ghani, F.; Alam, M. N.
2013-11-01
The conversion of large amounts of solid waste (foundry slag) into an alternative building material contributes not only to solving a growing waste problem but also to conserving the natural resources used in other building materials, thereby reducing the cost of construction. The present work investigates the safe and economic use of recycled mortar (1:6) as a supplementary material. Twelve conventional and recycled prisms were cast with varying percentages of solid waste (foundry slag) (0, 10, 20, 30 %) replacing cement by weight and tested in a compression testing machine. As the replacement increases, the strength decreases: the 10 % replacement curve is very close to the 0 % curve, whereas the 20 % curve is farther and the 30 % curve is farthest. The 20 % replacement was chosen for dynamic testing, as its strength is within the permissible limit per the IS code. A 1:4-scale single-storey brick model with half-size bricks was fabricated on a shake table in the laboratory for dynamic testing using a pure-friction isolation system (coarse sand as friction material, µ = 0.34). The pure-friction isolation technique can be adopted economically in developing countries, where low-rise buildings prevail, owing to its low cost. The superstructure was separated from the foundation at plinth level so as to permit sliding of the superstructure during a severe earthquake. The observed acceleration and displacement responses compare fairly well with the values from the analytical model. It is also concluded that 20 % replacement of cement by solid waste (foundry slag) can be safely adopted without endangering the safety of masonry structures under seismic load. To estimate how much energy is dissipated through this isolation, the same model with a fixed base was tested and the results compared with the isolated, freely sliding model; more than 60 % of the energy was found to be dissipated through the pure-friction isolation technique. With base isolation, no visible cracks were observed up to a table force of 4.25 kN (1,300 rpm), whereas for the fixed base, failure started at 800 rpm. To strengthen the fixed-base model, bamboo reinforcement was used for reasons of economy. Another model of the same dimensions and mortar ratio was fabricated on the shake table with bamboo reinforcement as plinth and lintel bands. In addition, four round bamboo bars of 3 mm diameter were placed at the four corners of the model. This building model was tested, with very encouraging and surprising results: failure started at 1,600 rpm, meaning the model survived double the force compared with the model without bamboo reinforcement.
3D modeling of building indoor spaces and closed doors from imagery and point clouds.
Díaz-Vilariño, Lucía; Khoshelham, Kourosh; Martínez-Sánchez, Joaquín; Arias, Pedro
2015-02-03
3D models of indoor environments are increasingly gaining importance due to the wide range of applications to which they can be put: from redesign and visualization to monitoring and simulation. These models usually exist only for newly constructed buildings; therefore, the development of automatic approaches for reconstructing 3D indoor models from imagery and/or point clouds can make the process easier, faster, and cheaper. Among the constructive elements defining a building interior, doors are very common, and their detection can be very useful for understanding the environment structure, performing efficient navigation, or planning appropriate evacuation routes. The fact that doors are topologically connected to walls by being coplanar, together with the unavoidable presence of clutter and occlusions indoors, increases the inherent complexity of automating the recognition process. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors based on point clouds and images. The methodology analyses the visibility problem of indoor environments and goes in depth into door candidate detection. The presented approach is tested on real data sets, showing its potential, with a high door detection rate and applicability to robust and efficient envelope reconstruction.
NASA Astrophysics Data System (ADS)
Casula, Giuseppe; Fais, Silvana; Giovanna Bianchi, Maria; Cuccuru, Francesco; Ligas, Paola
2015-04-01
The Terrestrial Laser Scanner (TLS) is a modern contactless non-destructive technique (NDT) useful for 3D-modelling complex-shaped objects with a few hours' field survey. A TLS survey produces very dense point clouds made up of point coordinates and radiometric information given by the reflectivity parameter, i.e., the ratio between the amount of energy emitted by the sensor and the energy reflected by the target object. Modern TLSs used in architecture are phase instruments, where the phase difference obtained by comparing the emitted laser pulse with the reflected one is proportional to the sensor-target distance, expressed as an integer multiple of the half laser wavelength. TLS data are processed by registering the point clouds, i.e., by referring them to the same reference frame, and by aggregation after a fine registration procedure. The resulting aggregate point cloud can be compared with graphic primitives such as single or multiple planes, cylinders, or spheres, and the resulting residuals give a morphological map that affords information about the state of conservation of the building materials used in historical or modern buildings, in particular when compared with other NDT techniques. In spite of its great productivity, the TLS technique is limited in that it is unable to penetrate the investigated materials. For this reason, both the 3D residuals map and the reflectivity map need to be correlated with the results of other NDT techniques, such as the ultrasonic method, and a detailed study of the composition of the building materials is also necessary. This study presents a methodology for evaluating the quality of stone building materials and locating altered or damaged zones, based on the integrated application of three independent techniques: two non-destructive, namely TLS and ultrasonic testing in the 24-54 kHz range, and a third that analyzes the petrographical characteristics of the stone materials, mainly the texture, with optical and scanning electron microscopy (SEM). A very interesting case study is presented: a carbonate stone door of great architectural and historical interest, well suited to a high-definition survey. This architectural element is inside the "Palazzo di Città" museum in the historical center of the town of Cagliari, Sardinia (Italy). The integrated application of TLS and in situ and laboratory ultrasonic techniques, enhanced by knowledge of the petrographic characteristics of the rocks, improves the diagnostic process and affords reliable information on the state of conservation of the stones used to build the door. The integrated use of the above non-destructive techniques also provides suitable data for possible restoration and future preservation. Acknowledgments: This work was financially supported by the Sardinian Local Administration (RAS - LR 7, August 2007, n.7, Promotion of Scientific Research and Innovation in Sardinia - Italy, Responsible Scientist: S. Fais).
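A minimal sketch of the phase-based ranging principle described above (distance expressed in half wavelengths of the modulation) follows; the function and values are illustrative.

```python
import math

def phase_shift_distance(delta_phi, wavelength, n_cycles=0):
    """Distance from phase difference for a phase-based scanner:
    d = (n + delta_phi / (2*pi)) * (wavelength / 2). The integer cycle
    count n is resolved by multi-frequency techniques (not shown)."""
    return (n_cycles + delta_phi / (2 * math.pi)) * wavelength / 2

print(phase_shift_distance(math.pi / 2, wavelength=10.0))  # 1.25 m
```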
Preliminary work toward the development of a dimensional tolerance standard for rapid prototyping
NASA Technical Reports Server (NTRS)
Kennedy, W. J.
1996-01-01
Rapid prototyping is a new technology for building parts quickly from CAD models. It works by slicing a CAD model into layers, then by building a model of the part one layer at a time. Since most parts can be sliced, most parts can be modeled using rapid prototyping. The layers themselves are created in a number of different ways - by using a laser to cure a layer of an epoxy or a resin, by depositing a layer of plastic or wax upon a surface, by using a laser to sinter a layer of powder, or by using a laser to cut a layer of paper. Rapid prototyping (RP) is new, and a standard part for use in comparing dimensional tolerances has not yet been chosen and accepted by ASTM (the American Society for Testing Materials). Such a part is needed when RP is used to build parts for investment casting or for direct use. The objective of this project was to start the development of a standard part by using statistical techniques to choose the features of the part which show curl - the vertical deviation of a part from its intended horizontal plane.
NASA Astrophysics Data System (ADS)
Lefebvre, Karine
Reinforced concrete structures with unreinforced masonry infills (BMR) are considered vulnerable to earthquakes. Under seismic action, infills can fail (causing injury or death) and damage columns. In Quebec and Canada, most BMR structures were constructed prior to the introduction of modern seismic design codes, raising questions about the contribution of the infill to the structure's lateral resistance. The aim of this thesis is to improve the modelling of BMR structures built in Quebec between 1915 and 1960. This type of structure is found in hospital and school buildings, which must comply with post-earthquake functionality requirements; they can also be residential or office buildings. Currently, practicing engineers usually calculate the seismic capacity of BMR structures without considering the infill's structural contribution to lateral resistance. Yet this contribution should not be omitted. The first part of the thesis investigates the construction techniques and material properties of the old BMR structures in the Province. The results are the material properties (concrete, reinforcing steel, brick, terra cotta tile, and mortar) and the characteristics of the assemblies (wall section, reinforcement details…). The second part of the thesis presents the results of a series of parametric analyses to identify, among the modelling and geometric parameters, those most influential on the lateral load response (rigidity, fundamental period, normal modes). Linear and modal analyses were performed. The most influential parameters identified are: number of storeys, number of bays, bay width, soft storey, openings, modelling of the upper storeys (instead of replacing them with point loads), and the modelling technique for the infill panels (strut or shell). Nonlinear static analyses were performed to identify the most influential parameters for evaluating the lateral resistance, the capacity (load/displacement), and the yielding sequence (beams versus columns versus infills). The parameters identified are the presence of the infills, the openings, and the geometric characteristics of the models (number of storeys and number of bays). One important contribution of this work is the development of an equivalent strut model to represent the action of the infill, which can easily be implemented in standard analysis software. A central axial hinge reproducing the nonlinear behaviour of the masonry is added to the strut element. This model is a hybridization of existing proposals (FEMA and others) with innovations added by the author; it has been validated against experimental and numerical results from the literature. An important conclusion of this thesis is that the contribution of infills to the lateral load resisting capacity of BMR structures should be considered for structures of more than one storey. Infills can add up to 51 % to the bare-frame capacity. The National Building Code requires that the lateral resistance of existing buildings be at least 60 % of the equivalent static seismic force (V2005). It is concluded that one-storey BMR buildings have sufficient resistance, while three-storey structures exhibit plastic deformations for loads under 0.6 V2005.
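As a sketch of the starting point for such hybrid strut models, the following implements the classic Mainstone-type equivalent-strut width used in FEMA-style provisions; the thesis's added nonlinear hinge law is not reproduced, and the numbers are illustrative.

```python
import math

def strut_width_fema(E_m, t_inf, h_inf, theta, E_c, I_col, h_col, d_inf):
    """Equivalent diagonal strut width per the Mainstone relation used in
    FEMA 356. E_m: masonry modulus, t_inf: infill thickness, h_inf: infill
    height, theta: strut angle (rad), E_c*I_col: column flexural rigidity,
    h_col: column height, d_inf: infill diagonal length."""
    lam1 = (E_m * t_inf * math.sin(2 * theta)
            / (4 * E_c * I_col * h_inf)) ** 0.25
    return 0.175 * (lam1 * h_col) ** -0.4 * d_inf

# Illustrative SI-unit numbers, not from the thesis
w = strut_width_fema(E_m=4.1e9, t_inf=0.19, h_inf=2.7,
                     theta=math.atan(2.7 / 4.0), E_c=25e9,
                     I_col=0.4**4 / 12, h_col=3.0,
                     d_inf=math.hypot(2.7, 4.0))
print(f"strut width ~ {w:.2f} m")
```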
Baglivo, Cristina; Congedo, Paolo Maria
2018-04-01
Several technical combinations have been evaluated in order to design high-energy-performance buildings for a warm climate. The analysis was developed in several steps, avoiding the use of HVAC systems. The methodological approach of this study is based on a sequential search technique and is presented in the paper entitled "Envelope Design Optimization by Thermal Modeling of a Building in a Warm Climate" [1]. The operative air temperature (TOP) trends for each combination have been plotted through a dynamic simulation performed using the software TRNSYS 17 (a transient system simulation program, University of Wisconsin, Solar Energy Laboratory, USA, 2010). Starting from the simplest building configuration, consisting of 9 rooms (equal-sized modules of 5 m × 5 m), the different building components are sequentially evaluated until the envelope design is optimized. The aim of this study is to perform a step-by-step simulation, simplifying the model as much as possible without introducing additional variables that could modify its performance. Walls, slab-on-ground floor, roof, shading, and windows are among the simulated building components. The results are shown for each combination and evaluated for Brindisi, a city in southern Italy with 1083 degree days, belonging to the national climatic zone C. The data show the TOP trends for each measure applied in the case study, for a total of 17 combinations divided into eight steps.
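A minimal sketch of the sequential search idea follows: vary one component at a time, keep the best-performing option, and move on. The component lists and scoring function are placeholders for simulation runs, not the paper's combinations.

```python
# Hypothetical component options; score() stands in for a TRNSYS run
OPTIONS = {
    "wall":    ["W1", "W2", "W3"],
    "roof":    ["R1", "R2"],
    "glazing": ["G1", "G2", "G3"],
}

def score(design):
    """Placeholder metric, e.g. summer discomfort hours (lower is better)."""
    weights = {"W1": 3, "W2": 1, "W3": 2, "R1": 2, "R2": 1,
               "G1": 2, "G2": 3, "G3": 1}
    return sum(weights[v] for v in design.values())

design = {k: v[0] for k, v in OPTIONS.items()}     # start from the base case
for component, choices in OPTIONS.items():          # one component at a time
    best = min(choices, key=lambda c: score({**design, component: c}))
    design[component] = best                        # lock in the winner
print(design, score(design))
```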
The Woodworker's Website: A Project Management Case Study
ERIC Educational Resources Information Center
Jance, Marsha
2014-01-01
A case study that focuses on building a website for a woodworking business is discussed. Project management and linear programming techniques can be used to determine the time required to complete the website project discussed in the case. This case can be assigned to students in an undergraduate or graduate decision modeling or management science…
ERIC Educational Resources Information Center
Robbins, Rockey; Tonemah, Stuart; Robbins, Sharla
2002-01-01
A culturally relevant group therapy model for gifted American Indian students and their parents uses non-didactic facilitation to focus on cultural identity, play, self-disclosure, parental involvement, silence, cognitive processing, emotional expression, and social responsibility. Evaluation results indicate the program builds self-esteem, pride…
ERIC Educational Resources Information Center
Myers, Trina S.; Blackman, Anna; Andersen, Trevor; Hay, Rachel; Lee, Ickjai; Gray, Heather
2014-01-01
Flexible online delivery of tertiary ICT programs is experiencing rapid growth. Creating an online environment that develops team building and interpersonal skills is difficult due to factors such as student isolation and the individual-centric model of online learning that encourages discrete study rather than teamwork. Incorporating teamwork…
ERIC Educational Resources Information Center
Claybrook, Billy G.
A new heuristic factorization scheme uses learning to improve the efficiency of determining the symbolic factorization of multivariable polynomials with integer coefficients and an arbitrary number of variables and terms. The factorization scheme makes extensive use of artificial intelligence techniques (e.g., model-building, learning, and…
Complex Building Detection Through Integrating LIDAR and Aerial Photos
NASA Astrophysics Data System (ADS)
Zhai, R.
2015-02-01
This paper proposes a new approach to digital building detection through the integration of LiDAR data and aerial imagery. It is known that most building rooftops are represented by different regions grown from different seed pixels. Considering the principles of image segmentation, this paper employs a new region-based technique to segment images, combining the advantages of LiDAR and aerial images. First, multiple seed points are selected automatically by taking several constraints into consideration. Then, the region-growing procedure proceeds by combining the elevation attribute from the LiDAR data, the visibility attribute from the DEM (Digital Elevation Model), and the radiometric attribute from the warped images in the segmentation. Through this combination, pixels with similar height, visibility, and spectral attributes are merged into one region, which is believed to represent the whole building area. The proposed methodology was implemented on real data and competitive results were achieved.
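A minimal sketch of multi-attribute region growing in the spirit described is shown below: a neighbor joins the region when its height and spectral values stay within tolerances of the seed. The grids, attributes, and thresholds are placeholders.

```python
import numpy as np
from collections import deque

def region_grow(seed, height, spectral, tol_h=0.5, tol_s=0.1):
    """Grow a region from `seed` over pixels similar in height and spectrum."""
    rows, cols = height.shape
    region = np.zeros_like(height, dtype=bool)
    region[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            n = (r + dr, c + dc)
            if (0 <= n[0] < rows and 0 <= n[1] < cols and not region[n]
                    and abs(height[n] - height[seed]) < tol_h
                    and abs(spectral[n] - spectral[seed]) < tol_s):
                region[n] = True
                queue.append(n)
    return region
```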
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To retain the advantages of both methods, this paper proposes a data-driven simulation framework for the dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be formulated by one or more surrogate elements that replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded in an MB model. The framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
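A minimal sketch of a Legendre-polynomial surrogate fitted to training data, the kind of regression the framework uses for its surrogate elements, follows; the force-deflection data are synthetic.

```python
import numpy as np
from numpy.polynomial import legendre

# Synthetic training data from a "high-fidelity" nonlinear force law
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 300)                   # normalized deflection
f = 2.0 * x + 0.8 * x**3 + rng.normal(0, 0.02, x.size)

coeffs = legendre.legfit(x, f, deg=5)         # fit the surrogate element
surrogate = lambda x_new: legendre.legval(x_new, coeffs)

print(surrogate(np.array([0.0, 0.5, 1.0])))   # fast element evaluation
```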
Correcting for deformation in skin-based marker systems.
Alexander, E J; Andriacchi, T P
2001-03-01
A new technique is described that reduces the error due to skin movement artifact in the opto-electronic measurement of in vivo skeletal motion. This work builds on a previously described point cluster technique marker set and estimation algorithm by extending the transformation equations to the general deformation case using a set of activity-dependent deformation models. Skin deformation during activities of daily living is modeled as consisting of a functional form defined over the observation interval (the deformation model) plus additive noise (modeling error). The method is described as an interval deformation technique. The method was tested using simulation trials with systematic and random components of deformation error introduced into the marker position vectors. The technique was found to substantially outperform methods that require rigid-body assumptions. The method was tested in vivo on a patient fitted with an external fixation device (Ilizarov). Simultaneous measurements from markers placed on the Ilizarov device (fixed to bone) were compared to measurements derived from skin-based markers. The interval deformation technique reduced the errors in the limb segment pose estimate by 33 and 25% compared to the classic rigid-body technique for position and orientation, respectively. This newly developed method demonstrates that, by accounting for the changing shape of the limb segment, a substantial improvement in the estimates of in vivo skeletal movement can be achieved.
NASA Astrophysics Data System (ADS)
Ishizawa, O. A.; Clouteau, D.
2007-12-01
The long duration, amplification, and spatial variability of the seismic records registered in Mexico City during the September 1985 earthquake cannot be explained by the soil velocity model alone. We try to explain these phenomena by studying the extent of the effect of the wave fields diffracted by buildings during an earthquake. The main question is whether the presence of a large number of buildings can significantly modify the seismic wave field. We are interested in the interaction between the incident wave field propagating in a stratified half-space and a large number of structures at the free surface, i.e., the coupled city-site effect. We study and characterize the seismic wave propagation regimes in a city using the theory of wave propagation in random media. In the coupled city-site system, the buildings are modeled as resonant scatterers uniformly distributed on the surface of a deterministic, horizontally layered elastic half-space representing the soil. Based on the mean-field and field-correlation equations, we build a theoretical model which takes into account the multiple scattering of seismic waves and allows us to describe the behavior of the coupled city-site system in a simple and rapid way. The results obtained for the configurationally averaged field quantities are validated by means of 3D results for the seismic response of a deterministic model. The numerical simulations of this model are computed with the MISS3D code, based on classical soil-structure interaction techniques and on a variational coupling between boundary integral equations for a layered soil and a modal finite element approach for the buildings. This work proposes a detailed numerical and theoretical analysis of the city-site interaction (CSI) in the Mexico City area. The principal parameters in the study of the CSI are the distribution of building resonant frequencies, the soil characteristics of the site, the urban density and position of the buildings in the city, and the type of incident wave. The main results of the theoretical and numerical models allow us to characterize the seismic movement in urban areas.
Mocho, Pierre; Desauziers, Valérie
2011-05-01
Solid-phase microextraction (SPME) is a powerful technique, easy to implement for on-site static sampling of indoor VOCs emitted by building materials. However, a major constraint lies in establishing calibration curves, which requires the complex generation of standard atmospheres. The purpose of this paper is therefore to propose a model to predict the adsorption kinetics (i.e., calibration curves) of four model VOCs. The model is based on Fick's laws for the gas phase and on the equilibrium or solid diffusion model for the adsorptive phase. Two samplers (the FLEC® and a home-made cylindrical emission cell), coupled to SPME for static sampling of material emissions, were studied. Good agreement between the modeling and experimental data is observed, and the results show the influence of the sampling rate on the mass transfer mode as a function of sample volume. The equilibrium model is suited to the larger-volume sampler (cylindrical cell), while the solid diffusion model is dedicated to the small-volume sampler (FLEC®). The limiting steps of mass transfer are diffusion in the gas phase for the cylindrical cell and pore-surface diffusion for the FLEC®. In the future, this modeling approach could be a useful, time-saving tool for developing SPME to study building material emissions in static-mode sampling.
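A minimal finite-difference sketch of gas-phase Fickian diffusion toward a perfectly absorbing sampler surface, the limiting step identified for the cylindrical cell, is given below; the geometry, constants, and boundary treatment are illustrative, not the paper's model.

```python
import numpy as np

# 1D diffusion c_t = D*c_xx on [0, L]; absorbing fiber at x=0, sealed far end
D, L, nx = 5.6e-6, 0.05, 101        # m^2/s (VOC in air), m, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                # explicit stability limit is 0.5
c = np.ones(nx)                     # uniform initial concentration (normalized)
uptake = 0.0
for _ in range(20000):
    c[0] = 0.0                                  # absorbing boundary (fiber)
    uptake += D * (c[1] - c[0]) / dx * dt       # Fick's first law at the fiber
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[-1] = c[-2]                               # zero-flux wall
print(f"normalized uptake after {20000 * dt:.0f} s: {uptake:.4f}")
```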
NASA Astrophysics Data System (ADS)
Richardson, P. W.; Karlstrom, L.
2016-12-01
Constructional volcanic processes such as lava flows, cinder cones, and tumuli compete with physical and chemical erosional processes to control the morphology of mafic volcanic landscapes. If volcanic effusion rates are high, these landscapes are primarily constructional, but over the timescales associated with hot spot volcanism (1-10 Myr) and arcs (10-50 Myr), chemical and physical erosional processes are important. For fluvial incision to occur, initially high infiltration rates must be overcome by chemical weathering or by the input of fine-grained sediment. We incorporate lava flow resurfacing, using a new lava flow algorithm that can be calibrated for specific flows and eruption magnitude/frequency relationships, into a landscape evolution model and carry out two modeling experiments on the interplay between volcanic resurfacing and fluvial incision. We use a stochastic spatial vent distribution calibrated from the Hawaiian eruption record to resurface a synthetically produced ocean island. In one experiment, we investigate the consequences of including time-dependent channel incision efficiency, which effectively mimics the transient hydrological development of lava flows. In the second experiment, we explore the competition between channel incision and lava flow resurfacing. The relative magnitudes of channel incision versus lava flow resurfacing are captured in landscape topography. For example, during the shield-building period of ocean islands, effusion rates are high and the signature of lava flow resurfacing dominates; after the shield-building phase, channel incision begins and eventually dominates the topographic signature. We develop a dimensionless ratio of resurfacing rate to erosion rate to characterize the transition between these processes. We use spectral techniques to characterize volcanic features and to pinpoint the transition between constructional and erosional morphology on modeled landscapes and on the Big Island of Hawaii.
NASA Astrophysics Data System (ADS)
Ai, Z. T.; Mak, C. M.
2014-05-01
This study examines the interunit dispersion characteristics in and around multistory buildings under wind-induced single-sided ventilation conditions using the computational fluid dynamics (CFD) method, under the hypothesis that infectious respiratory aerosols exhausted from one unit can reenter another unit in the same building through opened windows. The effect of balconies on the interunit dispersion pattern is considered. The RNG k-ε model and the two-layer near-wall model are employed to establish the coupled indoor and outdoor airflow field, and the tracer gas technique is adopted to simulate pollutant dispersion. Reentry ratios from each unit to the other units under prevailing wind directions are quantified, and the possible interunit dispersion routes are then revealed. It is found that many reentry ratios reach around 10.0%, suggesting that interunit dispersion is an important pollutant transmission route. The interunit dispersion pattern depends strongly on the incident wind direction and on whether the building has protrusive envelope features. On average, the strongest dispersion occurs on the windward wall of the buildings under an oblique wind direction, owing to high ACH (air changes per hour) values and unidirectional spread routes. Except under a normal incident wind, the presence of balconies intensifies interunit dispersion by forming dispersion channels that increase the reentry ratios.
Data mining to support simulation modeling of patient flow in hospitals.
Isken, Mark W; Rajagopalan, Balaji
2002-04-01
Spiraling health care costs in the United States are driving institutions to continually address the challenge of optimizing the use of scarce resources. One of the first steps towards optimizing resources is to utilize capacity effectively. For hospital capacity planning problems, such as the allocation of inpatient beds, computer simulation is often the method of choice. One of the more difficult aspects of using simulation models for such studies is the creation of a manageable set of patient types to include in the model. The objective of this paper is to demonstrate the potential of using data mining techniques, specifically clustering techniques such as K-means, to help guide the development of patient type definitions for purposes of building computer simulation or analytical models of patient flow in hospitals. Using data from a hospital in the Midwest, this study brings forth several important issues that researchers need to address when applying clustering techniques in general, and specifically to hospital data.
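A minimal scikit-learn sketch of the clustering step follows: group patient records into candidate patient types for a simulation model. The features and counts are placeholders, not the hospital's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Placeholder features per patient: [length_of_stay_h, n_transfers, acuity]
patients = np.column_stack([
    rng.gamma(3.0, 12.0, 500),     # length of stay, hours
    rng.poisson(1.5, 500),         # unit-to-unit transfers
    rng.integers(1, 5, 500),       # acuity score
])

X = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):                 # candidate patient-type definitions
    print(k, patients[labels == k].mean(axis=0).round(1))
```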
NASA Astrophysics Data System (ADS)
Gradziński, Piotr
2017-10-01
As the world's climate is changing (inter alia, under the influence of architectural activity), the author attempts to reorient design practice primarily toward using, and adapting to, climatic conditions. Architectural design that applies, in the early stages of the design process, Life Cycle Analysis (LCA) and digital analytical tools such as BIM (Building Information Modelling) defines the overriding requirements the designer/architect should meet. The first part of the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and of the use of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) in the sense of direct negative environmental impact. The second part presents a review of the methods and analytical techniques that prevent these negative influences: first, studying the building by means of a Life Cycle Analysis of the structure (e.g., materials) and of the functioning (e.g., energy consumption) of the architectural object (stages: before use, use, after use); second, using digital analytical tools to determine the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building form. In conclusion, the author's results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping the architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the building design process with respect to broadly understood climatic changes.
Prediction of siRNA potency using sparse logistic regression.
Hu, Wei; Hu, John
2014-06-01
RNA interference (RNAi) can modulate gene expression at post-transcriptional as well as transcriptional levels. Short interfering RNA (siRNA) serves as a trigger for the RNAi gene inhibition mechanism, and therefore is a crucial intermediate step in RNAi. There have been extensive studies to identify the sequence characteristics of potent siRNAs. One such study built a linear model using LASSO (Least Absolute Shrinkage and Selection Operator) to measure the contribution of each siRNA sequence feature. This model is simple and interpretable, but it requires a large number of nonzero weights. We have introduced a novel technique, sparse logistic regression, to build a linear model using single-position specific nucleotide compositions which has the same prediction accuracy of the linear model based on LASSO. The weights in our new model share the same general trend as those in the previous model, but have only 25 nonzero weights out of a total 84 weights, a 54% reduction compared to the previous model. Contrary to the linear model based on LASSO, our model suggests that only a few positions are influential on the efficacy of the siRNA, which are the 5' and 3' ends and the seed region of siRNA sequences. We also employed sparse logistic regression to build a linear model using dual-position specific nucleotide compositions, a task LASSO is not able to accomplish well due to its high dimensional nature. Our results demonstrate the superiority of sparse logistic regression as a technique for both feature selection and regression over LASSO in the context of siRNA design.
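A minimal scikit-learn sketch of sparse (L1-penalized) logistic regression over single-position nucleotide indicator features follows; the encoding and toy data are placeholders, not the study's 84-feature model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
BASES = "ACGU"

def encode(seq):
    """One-hot encode a 21-nt siRNA: 84 single-position indicator features."""
    return np.array([b == base for b in seq for base in BASES], dtype=float)

seqs = ["".join(rng.choice(list(BASES), 21)) for _ in range(400)]
X = np.array([encode(s) for s in seqs])
# Toy potency label driven by two positions (purely synthetic)
y = (X[:, 0] + X[:, 80] + rng.normal(0, 0.3, 400) > 1.0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("nonzero weights:", np.count_nonzero(model.coef_))   # sparsity via L1
```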
Jacquemin, Bénédicte; Lepeule, Johanna; Boudier, Anne; Arnould, Caroline; Benmerad, Meriem; Chappaz, Claire; Ferran, Joane; Kauffmann, Francine; Morelli, Xavier; Pin, Isabelle; Pison, Christophe; Rios, Isabelle; Temam, Sofia; Künzli, Nino; Slama, Rémy
2013-01-01
Background: Errors in address geocodes may affect estimates of the effects of air pollution on health. Objective: We investigated the impact of four geocoding techniques on the association between urban air pollution estimated with a fine-scale (10 m × 10 m) dispersion model and lung function in adults. Methods: We measured forced expiratory volume in 1 sec (FEV1) and forced vital capacity (FVC) in 354 adult residents of Grenoble, France, who were participants in two well-characterized studies, the Epidemiological Study on the Genetics and Environment on Asthma (EGEA) and the European Community Respiratory Health Survey (ECRHS). Home addresses were geocoded using individual building matching as the reference approach and three spatial interpolation approaches. We used a dispersion model to estimate mean PM10 and nitrogen dioxide concentrations at each participant’s address during the 12 months preceding their lung function measurements. Associations between exposures and lung function parameters were adjusted for individual confounders and same-day exposure to air pollutants. The geocoding techniques were compared with regard to geographical distances between coordinates, exposure estimates, and associations between the estimated exposures and health effects. Results: Median distances between coordinates estimated using the building matching and the three interpolation techniques were 26.4, 27.9, and 35.6 m. Compared with exposure estimates based on building matching, PM10 concentrations based on the three interpolation techniques tended to be overestimated. When building matching was used to estimate exposures, a one-interquartile range increase in PM10 (3.0 μg/m3) was associated with a 3.72-point decrease in FVC% predicted (95% CI: –0.56, –6.88) and a 3.86-point decrease in FEV1% predicted (95% CI: –0.14, –3.24). The magnitude of associations decreased when other geocoding approaches were used [e.g., for FVC% predicted –2.81 (95% CI: –0.26, –5.35) using NavTEQ, or 2.08 (95% CI –4.63, 0.47, p = 0.11) using Google Maps]. Conclusions: Our findings suggest that the choice of geocoding technique may influence estimated health effects when air pollution exposures are estimated using a fine-scale exposure model. Citation: Jacquemin B, Lepeule J, Boudier A, Arnould C, Benmerad M, Chappaz C, Ferran J, Kauffmann F, Morelli X, Pin I, Pison C, Rios I, Temam S, Künzli N, Slama R, Siroux V. 2013. Impact of geocoding methods on associations between long-term exposure to urban air pollution and lung function. Environ Health Perspect 121:1054–1060; http://dx.doi.org/10.1289/ehp.1206016 PMID:23823697
Guided-waves technique for inspecting the health of wall-covered building risers
NASA Astrophysics Data System (ADS)
Tse, Peter W.; Chen, J. M.; Wan, X.
2015-03-01
The inspection technique using guided ultrasonic waves (GW) has been proven effective in detecting pipe defects. However, as of today, the technique has not attracted much market attention because of insufficient field tests and a lack of traceable records with proven results in commercial applications. This paper presents the results obtained by using GW to inspect defects occurring in real gas risers of the kind commonly installed in tall buildings. The purpose of risers is to deliver gas from the building's external piping system to each household unit. The risers extend from the external wall of the building and penetrate through the concrete wall into the kitchen or bathroom of each household unit. Like other in-service pipes, risers are prone to corrosion due to water leaking into the concrete wall. However, corrosion occurring in the section of the riser covered by the concrete wall is difficult to inspect with conventional techniques. Hence, the GW technique was employed. The effectiveness of the GW technique was tested by laboratory and on-site experiments using real risers gathered from tall buildings. The experimental results show that GW can partially penetrate through the section of the riser covered by the wall. The integrity of the wall-covered section of a riser can be determined from the reflected wave signals generated by any corroded area that may exist inside the wall-covered section. Based on the reflected wave signal, one can determine the health of the wall-covered riser.
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Siddiqui, F. U.
2017-11-01
In complex urban and residential areas, there are buildings which are not only connected with and/or close to one another but also partially occluded by their surrounding vegetation. Moreover, there may be buildings whose roofs are made of transparent materials. For transparent buildings, there are point returns both from the ground (or materials inside the buildings) and from the rooftop. These issues confuse previously proposed building masks, which are generated either from ground points or from non-ground points. The normalised digital surface model (nDSM) is generated from the non-ground points, and it is usually hard to separate individual buildings and trees using the nDSM alone. The primary building mask, by contrast, is produced using the ground points and thereby misses transparent rooftops. This paper proposes a new building mask based on the non-ground points. The dominant directions of non-ground lines extracted from the multispectral imagery are estimated. A dummy grid at the target mask resolution is rotated to each dominant direction to obtain the corresponding height values from the non-ground points. Three sub-masks are then generated from the height grid by estimating the gradient function. Two of these sub-masks capture planar surfaces whose height remains constant along and across the dominant direction, respectively. The third sub-mask contains only the flat surfaces, where the height (ideally) remains constant in all directions. All the sub-masks generated for all estimated dominant directions are combined to produce the candidate building mask. Although the application of the gradient function helps remove most of the vegetation, the final building mask is obtained by removing planar vegetation, if any, and tiny isolated false candidates. Experimental results on three Australian data sets show that the proposed method can successfully remove vegetation, thereby separating buildings from occluding vegetation, and can detect buildings with transparent roof materials. Compared to existing building detection techniques, the proposed technique offers higher object-based completeness, correctness, and quality, especially in complex scenes with the aforementioned issues. It is capable of detecting not only transparent buildings but also small garden sheds, which are sometimes as small as 5 m2 in area.
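One of the sub-mask ideas can be sketched minimally: flag grid cells whose height gradient is near zero along a given direction (planar along that direction). The rotation to dominant directions and the multi-direction combination are elided; thresholds are illustrative.

```python
import numpy as np

def planar_submask(height, axis, tol=0.05):
    """True where the height grid is locally constant along `axis`."""
    grad = np.gradient(height, axis=axis)
    return np.abs(grad) < tol

height = np.zeros((50, 50))
height[10:30, 10:30] = 6.0      # a flat-roofed "building" on bare ground
# Flat in all directions: candidate building pixels (edges excluded)
mask = planar_submask(height, 0) & planar_submask(height, 1)
```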
Novel transformation-based response prediction of shear building using interval neural network
NASA Astrophysics Data System (ADS)
Chakraverty, S.; Sahoo, Deepti Moyi
2017-04-01
The present paper uses the powerful technique of interval neural networks (INN) to simulate and estimate the structural response of multi-storey shear buildings subject to earthquake motion. The INN is first trained on real earthquake data, viz., the ground acceleration as input and the numerically generated responses of different floors of multi-storey buildings as output. To date, no model exists to handle positive and negative data in the INN. As such, the bipolar data in [-1, 1] are first converted to unipolar form, i.e., to [0, 1], by means of a novel transformation, in order to handle the above training patterns in normalized form. Once the training is done, the unipolar data are converted back to bipolar form using the inverse transformation. The trained INN architecture is then used to simulate and test the structural response of different floors for earthquake data of various intensities, and it is found that the responses predicted by the INN model are good for practical purposes.
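The abstract does not spell out the transformation, so the affine map below is an assumption; it shows the bipolar-unipolar rescaling round trip the training procedure requires.

```python
import numpy as np

def to_unipolar(x):
    """Map bipolar data in [-1, 1] to unipolar [0, 1] (assumed affine form)."""
    return (np.asarray(x) + 1.0) / 2.0

def to_bipolar(u):
    """Inverse map back to [-1, 1] after training/prediction."""
    return 2.0 * np.asarray(u) - 1.0

accel = np.array([-0.82, 0.14, 0.67])   # normalized ground acceleration
assert np.allclose(to_bipolar(to_unipolar(accel)), accel)
```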
Computer-aided design of biological circuits using TinkerCell.
Chandran, Deepak; Bergmann, Frank T; Sauro, Herbert M
2010-01-01
Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze, and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. © 2010 Landes Bioscience
Transforming RNA-Seq data to improve the performance of prognostic gene signatures.
Zwiener, Isabella; Frisch, Barbara; Binder, Harald
2014-01-01
Gene expression measurements have successfully been used for building prognostic signatures, i.e., for identifying a short list of important genes that can predict patient outcome. Mostly microarray measurements have been considered, and there is little advice available for building multivariable risk prediction models from RNA-Seq data. We specifically consider penalized regression techniques, such as the lasso and componentwise boosting, which can simultaneously consider all measurements and provide both multivariable regression models for prediction and automated variable selection. However, they might be affected by the typical skewness, mean-variance dependency, or extreme values of RNA-Seq covariates, and therefore could benefit from transformations of the latter. In an analytical part, we highlight the preferential selection of covariates with large variances, which is problematic due to the mean-variance dependency of RNA-Seq data. In a simulation study, we compare different transformations of RNA-Seq data for potentially improving the detection of important genes. Specifically, we consider standardization, the log transformation, a variance-stabilizing transformation, the Box-Cox transformation, and rank-based transformations. In addition, the prediction performance for real data from patients with kidney cancer and acute myeloid leukemia is considered. We show that signature size, identification performance, and prediction performance critically depend on the choice of a suitable transformation. Rank-based transformations perform well in all scenarios and can even outperform complex variance-stabilizing approaches. Generally, the results illustrate that the distribution and potential transformations of RNA-Seq data need to be considered as a critical step when building risk prediction models by penalized regression techniques. PMID:24416353
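A minimal sketch of the rank-based transformation family that performs well here follows: replace each gene's counts by normal scores before penalized regression. The implementation details are illustrative.

```python
import numpy as np
from scipy.stats import rankdata, norm

def rank_normal_scores(counts):
    """Per-gene rank-based inverse normal transform of an (n_samples, n_genes)
    RNA-Seq count matrix; removes skewness and mean-variance dependency."""
    n = counts.shape[0]
    ranks = np.apply_along_axis(rankdata, 0, counts)
    return norm.ppf((ranks - 0.5) / n)        # normal scores per gene

rng = np.random.default_rng(6)
counts = rng.negative_binomial(2, 0.01, size=(60, 5))   # skewed toy counts
X = rank_normal_scores(counts)
print(X.mean(axis=0).round(2), X.std(axis=0).round(2))  # ~0 mean, similar scale
```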
Collaborative filtering on a family of biological targets.
Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua
2006-01-01
Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family, for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples. The more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target. We also evaluate JRank, a kernel-based method designed for collaborative filtering. We compare their performance on compound prioritization for an HTS campaign and examine the underlying shared representation between targets. JRank outperformed the neural network in both the single- and multi-target models.
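A minimal sketch of the shared-representation idea behind multi-task learning, with a linear shared layer and per-target heads fitted by gradient descent; the paper used a multi-target neural network, so this formulation is a simplified assumption:

```python
import numpy as np

def fit_multitask(X, Y, n_shared=8, lr=1e-3, epochs=2000, seed=0):
    """Gradient-descent fit of a linear shared-representation model.

    Each of the T targets predicts through a common projection W
    (the inductive bias shared across the family) plus a task-specific
    head V[:, t]. Purely illustrative, not the paper's architecture.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    T = Y.shape[1]
    W = rng.normal(scale=0.01, size=(d, n_shared))   # shared layer
    V = rng.normal(scale=0.01, size=(n_shared, T))   # per-task heads
    for _ in range(epochs):
        H = X @ W                    # shared representation
        E = H @ V - Y                # residuals for all tasks at once
        V -= lr * H.T @ E / n
        W -= lr * X.T @ (E @ V.T) / n
    return W, V

# An undersampled target benefits from the gradients of its siblings
X = np.random.randn(100, 20)
Y = np.random.randn(100, 3)          # three related targets
W, V = fit_multitask(X, Y)
```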
Review of optimization techniques of polygeneration systems for building applications
NASA Astrophysics Data System (ADS)
Rong, A. Y.; Su, Y.; Lahdelma, R.
2016-08-01
Polygeneration means the simultaneous production of two or more energy products in a single integrated process. Polygeneration is an energy-efficient technology and plays an important role in the transition to future low-carbon energy systems. It can find wide applications in utilities, different types of industrial sectors, and the building sector. This paper mainly focuses on polygeneration applications in the building sector. The scales of polygeneration systems in the building sector range from the micro level, for a single home, to the large level, for residential districts. The development of polygeneration microgrids is also related to building applications. The paper aims to give a comprehensive review of optimization techniques for designing, synthesizing and operating different types of polygeneration systems for building applications.
Model of slums rejuvenation in Telaga Tujuh village: the case of Langsa city, Aceh, Indonesia
NASA Astrophysics Data System (ADS)
Irwansyah, Mirza; Caisarina, Irin; Solehati, Dini
2018-05-01
Telaga Tujuh village is the only inhabited island among the islands of Langsa City, Aceh. Most of the houses are built on stilts, with very limited infrastructure: a lack of road facilities, local drainage, drinking water, wastewater management, and garbage disposal. Determining an arrangement model for the slum settlements of Telaga Tujuh village requires knowing the characteristics of the slums themselves and the causes of slum settlement. The aim of this study is to determine the slum settlement arrangement model best suited to the location. The method used is qualitative, with sampling techniques and qualitative analysis. Primary data were obtained through observation, questionnaires, and interviews; secondary data were obtained from agencies involved in slum settlement arrangement. The characteristic analysis found that all 365 residential buildings (100%) are irregular, lack a safe drinking water supply, and lack wastewater management. The analysis shows that the appropriate model for Telaga Tujuh village is a rejuvenation model with a land consolidation system, re-arranging the land into two shares: 60% for existing residential development and 40% for commercial development.
Challoner, Avril; Pilla, Francesco; Gill, Laurence
2015-01-01
NO2 and particulate matter are the air pollutants of most concern in Ireland, with possible links to the higher respiratory and cardiovascular mortality and morbidity rates found in the country compared to the rest of Europe. Currently, air quality limits in Europe only cover outdoor environments, yet the quality of indoor air is an essential determinant of a person's well-being, especially since the average person spends more than 90% of their time indoors. The modelling conducted in this research aims to provide a framework for epidemiological studies by using publicly available data from fixed outdoor monitoring stations to predict indoor air quality more accurately. Predictions are made using two modelling techniques: the Personal-exposure Activity Location Model (PALM), to predict outdoor air quality at a particular building, and Artificial Neural Networks, to model the indoor/outdoor relationship of the building. This joint approach has been used to predict indoor air concentrations for three inner-city commercial buildings in Dublin, where parallel indoor and outdoor diurnal monitoring had been carried out on site. This modelling methodology has been shown to provide reasonable predictions of average NO2 indoor air quality compared to the monitored data, but did not perform well in the prediction of indoor PM2.5 concentrations. Hence, this approach could be used to determine more rigorously the NO2 exposures of those who work and/or live in the city centre, which can then be linked to potential health impacts.
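A minimal sketch of the second modelling stage, a small feed-forward ANN mapping outdoor to indoor concentrations; the layer size, activation, and training scheme are assumptions, not the study's configuration:

```python
import numpy as np

def train_io_ann(x_out, y_in, hidden=10, lr=1e-2, epochs=5000, seed=0):
    """One-hidden-layer ANN for the indoor/outdoor relationship.

    Trained by plain gradient descent on squared error; returns a
    predictor mapping outdoor NO2 to indoor NO2.
    """
    rng = np.random.default_rng(seed)
    X = np.atleast_2d(x_out).T           # (n, 1) outdoor concentrations
    y = np.asarray(y_in).reshape(-1, 1)  # (n, 1) indoor concentrations
    W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    n = len(y)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)         # hidden activations
        e = (H @ W2 + b2) - y            # prediction error
        dW2 = H.T @ e / n; db2 = e.mean(0)
        dH = (e @ W2.T) * (1 - H ** 2)   # backprop through tanh
        dW1 = X.T @ dH / n; db1 = dH.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda x: np.tanh(np.atleast_2d(x).T @ W1 + b1) @ W2 + b2
```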
Tsunami Simulators in Physical Modelling Laboratories - From Concept to Proven Technique
NASA Astrophysics Data System (ADS)
Allsop, W.; Chandler, I.; Rossetto, T.; McGovern, D.; Petrone, C.; Robinson, D.
2016-12-01
Before 2004, there was little public awareness around Indian Ocean coasts of the potential size and effects of tsunami. Even in 2011, the scale and extent of devastation by the Japan East Coast Tsunami was unexpected. There were very few engineering tools to assess the onshore impacts of tsunami, and hence no agreement on robust methods to predict forces on coastal defences, buildings or related infrastructure. Modelling generally used substantial simplifications of either solitary waves (far too short in duration) or dam break (unrealistic and/or uncontrolled wave forms). This presentation will describe research from the EPI-centre, HYDRALAB IV, URBANWAVES and CRUST projects over the last 10 years that has developed and refined pneumatic Tsunami Simulators for the hydraulic laboratory. These unique devices have been used to model generic elevated and N-wave tsunamis up to and over simple shorelines, and at example defences. They have reproduced full-duration tsunamis, including the Mercator trace from 2004, at 1:50 scale. Engineering scale models subjected to those tsunamis have measured wave run-up on simple slopes, forces on idealised sea defences and pressures/forces on buildings. This presentation will describe how these pneumatic Tsunami Simulators work, demonstrate how they have generated tsunami waves longer than the facility within which they operate, and will highlight research results from the three generations of Tsunami Simulator. Of direct relevance to engineers and modellers will be measurements of wave run-up levels and comparison with theoretical predictions. Recent measurements of forces on individual buildings have been generalized by separate experiments on buildings (up to 4 rows), which show that the greatest forces can act on the landward (not seaward) buildings. Continuing research in the 70 m long, 4 m wide Fast Flow Facility on tsunami defence structures has also measured forces on buildings in the lee of a failed defence wall.
Sutton, Steven C; Hu, Mingxiu
2006-05-05
Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of five steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated approach over the traditional one. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC generally selected the best model. We believe that the proposed approach may serve as a rapid tool to determine which IVIVC model (if any) is the most applicable.
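A minimal sketch of the screening idea: fit several candidate dissolution models and rank them by AIC. The model forms and the toy release profile below are assumptions for illustration, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate dissolution models (fraction released vs. time)
def weibull(t, a, b):
    return 1.0 - np.exp(-(t / a) ** b)

def hixson_crowell(t, k):
    return 1.0 - np.clip(1.0 - k * t, 0.0, None) ** 3

def linear(t, m):
    return m * t

def aic(y, yhat, n_params):
    """Akaike Information Criterion from the residual sum of squares."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

t = np.array([0.5, 1, 2, 4, 6, 8, 12])
f = np.array([0.22, 0.38, 0.60, 0.82, 0.91, 0.95, 0.99])  # toy profile

for name, model, p0 in [("Weibull", weibull, [2.0, 1.0]),
                        ("Hixson-Crowell", hixson_crowell, [0.05]),
                        ("linear", linear, [0.1])]:
    popt, _ = curve_fit(model, t, f, p0=p0, maxfev=10000)
    print(name, aic(f, model(t, *popt), len(popt)))  # lower is better
```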
NASA Astrophysics Data System (ADS)
Pérez Ramos, A.; Robleda Prieto, G.
2016-06-01
The indoor Gothic apse provides a complex environment for virtualization using imaging techniques, due to its lighting conditions and architecture. Light entering through large windows, in combination with the apse shape, makes it difficult to find proper conditions for photographic capture for reconstruction purposes. Thus, documentation techniques based on images are usually replaced by scanning techniques inside churches. Nevertheless, the need to use Terrestrial Laser Scanning (TLS) for indoor virtualization means a significant increase in the final surveying cost. So, in most cases, scanning techniques are used to generate dense point clouds. However, many Terrestrial Laser Scanner (TLS) internal cameras are not able to provide colour images or cannot reach the image quality that can be obtained using an external camera. Therefore, high-quality external images are often used to build high-resolution textures for these models. This paper aims to solve the problem posed by virtualizing indoor Gothic churches, making the task more affordable using exclusively image-based techniques. It reviews a previously proposed methodology using a DSLR camera with an 18-135 mm lens, commonly used for close-range photogrammetry, and adds another using an HDR 360° camera with four lenses, which makes the task easier and faster in comparison with the previous one. Fieldwork and office work are simplified. The proposed methodology provides photographs in good enough condition for building point clouds and textured meshes. Furthermore, the same imaging resources can be used to generate more deliverables without extra time spent in the field, for instance, immersive virtual tours. In order to verify the usefulness of the method, it was applied to the apse, since the apse is considered one of the most complex elements of Gothic churches, and the approach could be extended to the whole building.
A model-based 3D template matching technique for pose acquisition of an uncooperative space object.
Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele
2015-03-16
This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced, aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the proposed techniques is presented.
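A schematic sketch of the core template-matching step, scoring candidate pose templates by nearest-neighbour distance to the LIDAR cloud; the scoring metric and the on-line template generation details are assumptions, not the paper's exact algorithm:

```python
import numpy as np
from scipy.spatial import cKDTree

def template_score(scan_points, template_points):
    """Mean nearest-neighbour distance from the scan to a pose template.

    The template whose rendered point cloud lies closest to the LIDAR
    scan wins; this metric is an illustrative assumption.
    """
    tree = cKDTree(template_points)
    d, _ = tree.query(scan_points)
    return d.mean()

def acquire_pose(scan, templates):
    """Return the index of the best-matching template (coarse pose)."""
    scores = [template_score(scan, tpl) for tpl in templates]
    return int(np.argmin(scores))

# templates: list of (N_i, 3) arrays rendered on-line from the target
# model at candidate attitudes; scan: (M, 3) LIDAR point cloud.
```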
NASA Astrophysics Data System (ADS)
Shih, Chihhsiong
2005-01-01
Two efficient workflows are developed for the reconstruction of a 3D full-color building model. One uses a point-wise sensing device to sample an unknown object densely and attaches color textures from a digital camera separately. The other uses an image-based approach to reconstruct the model with color texture attached automatically. The point-wise sensing device reconstructs the CAD model using a modified best-view algorithm that collects the maximum number of construction faces in one view. The partial views of the point cloud data are then glued together using a common face between two consecutive views. Typical overlapping-mesh removal and coarsening procedures are adapted to generate a unified 3D mesh shell structure. A post-processing step is then taken to combine the digital image content from a separate camera with the 3D mesh shell surfaces. An indirect uv-mapping procedure first divides the model faces into groups within which every face shares the same normal direction. The corresponding images of the faces in a group are then adjusted using the uv map as guidance. The final assembled image is then glued back to the 3D mesh to present a fully colored building model. The result is a virtual building that reflects the true dimensions and surface material conditions of a real-world campus building. The image-based modeling procedure uses a commercial photogrammetry package to reconstruct the 3D model. A novel view-planning algorithm is developed to guide the photo-taking procedure. This algorithm generates a minimum set of view angles; the set of pictures taken at these view angles guarantees that each model face appears in at least two, and no more than three, of the pictures. The 3D model can then be reconstructed with a minimum amount of labor spent correlating picture pairs. The finished model is compared with the original object in both topological and dimensional aspects. All test cases show the exact same topology and a reasonably low dimensional error ratio, again proving the applicability of the algorithm.
A fuzzy hill-climbing algorithm for the development of a compact associative classifier
NASA Astrophysics Data System (ADS)
Mitra, Soumyaroop; Lam, Sarah S.
2012-02-01
Classification, a data mining technique, has widespread applications including medical diagnosis, targeted marketing, and others. Knowledge discovery from databases in the form of association rules is one of the important data mining tasks. An integrated approach, classification based on association rules, has drawn the attention of the data mining community over the last decade. While attention has mainly been focused on increasing classifier accuracy, little effort has been devoted to building interpretable and less complex models. This paper discusses the development of a compact associative classification model using a hill-climbing approach and fuzzy sets. The proposed methodology builds the rule base by selecting rules which contribute towards increasing training accuracy, thus balancing classification accuracy against the number of classification association rules. The results indicate that the proposed associative classification model can achieve competitive accuracies on benchmark datasets with continuous attributes and lend better interpretability, when compared with other rule-based systems.
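A schematic sketch of the greedy hill-climbing core (the paper's fuzzy-set membership handling is omitted); rules are modeled as callables and added only while training accuracy improves, which is what keeps the rule base compact:

```python
import numpy as np

def hill_climb_rules(candidate_rules, X, y, max_rules=25):
    """Greedy hill-climbing selection of classification association rules.

    Each rule is a callable X -> per-sample predictions (a label, or
    None where the rule does not fire). Rules are added one at a time,
    keeping only those that raise training accuracy.
    """
    selected, best_acc = [], 0.0

    def accuracy(rules):
        votes = [r(X) for r in rules]
        preds = []
        for i in range(len(y)):
            fired = [v[i] for v in votes if v[i] is not None]
            preds.append(max(set(fired), key=fired.count) if fired else -1)
        return np.mean(np.array(preds) == y)

    improved = True
    while improved and len(selected) < max_rules:
        improved = False
        for r in candidate_rules:
            if r in selected:
                continue
            acc = accuracy(selected + [r])
            if acc > best_acc:          # keep only rules that help
                best_acc, best_rule, improved = acc, r, True
        if improved:
            selected.append(best_rule)
    return selected, best_acc
```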
Uav and Computer Vision, Detection of Infrastructure Losses and 3d Modeling
NASA Astrophysics Data System (ADS)
Barrile, V.; Bilotta, G.; Nunnari, A.
2017-11-01
The degradation of buildings, that is, the decline of their initial performance under external agents both natural (freeze-thaw, earthquakes, salt, etc.) and artificial (industrial settings, urban environments, etc.), leads over the years to the need for Non-Destructive Testing (NDT) intended to give useful information for explaining potential deterioration without damaging the state of the buildings. An accurate examination of damage, and of the recurrence of cracks under similar stress conditions, indicates the existence of principles that govern the creation of these events. There is no doubt that a precise visual analysis underlies a correct evaluation of a building. This paper deals with the creation of 3D models based on the capture of digital images, through autopiloted UAV flights, for civil buildings situated in the area of Reggio Calabria. The subsequent processing is done with commercial software based on specific algorithms of the Structure from Motion (SfM) technique. SfM represents an important advance in the aerial and terrestrial survey field, obtaining results, in terms of time and quality, comparable to those achievable through more traditional data capture methodologies.
Surveying for architectural students: as simple as possible - as much as necessary
NASA Astrophysics Data System (ADS)
Mayer, I.; Mitterecker, T.
2017-08-01
More and more, existing buildings - and particularly historic buildings - are becoming part of the daily business of every architect. Planning and designing in the field of architectural heritage requires not only knowledge of contemporary building techniques, design processes and national and international guidelines, but also a deep understanding of architectural heritage, its evolution and genesis, the building techniques that have been applied, materials used, traditions, etc. In many cases, it is indispensable to perform a detailed building survey and building research to achieve an adequate design concept. The Department of History of Architecture and Building Archaeology of TU Wien has an extensive tradition of building research and, over the course of the past 10 years, has developed a teaching workflow to introduce architectural students to building archaeology and surveying methods for building research. A sophisticated, temporally interwoven combination of courses and lectures on different topics related to building archaeology and surveying rapidly gives the architectural students the right tools for this important but often neglected task.
Daemen Alternative Energy/Geothermal Technologies Demonstration Program Erie County
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beiswanger, Jr, Robert C
2010-05-20
The purpose of the Daemen Alternative Energy/Geothermal Technologies Demonstration Project is to demonstrate the use of geothermal technology as a model for energy and environmental efficiency in heating and cooling older, highly inefficient buildings. The former Marian Library building at Daemen College is a 19,000 square foot building located in the center of campus. Through this project, the building was equipped with geothermal technology and results were disseminated. Gold LEED certification for the building was awarded. 1) How the research adds to the understanding of the area investigated: this project is primarily a demonstration project. Information about the installation is available to other companies, organizations, and higher education institutions that may be interested in using geothermal energy for heating and cooling older buildings. 2) The technical effectiveness and economic feasibility of the methods or techniques investigated or demonstrated: according to the modeling and estimates through Stantec, the energy-efficiency cost savings is estimated at 20%, or $24,000 per year. Over 20 years this represents $480,000 in unrestricted revenue available for College operations. See the attached technical assistance report. 3) How the project is otherwise of benefit to the public: the Daemen College Geothermal Technologies Ground Source Heat Pumps project sets a standard for retrofitting older, highly inefficient, energy-wasting and environmentally irresponsible buildings quite typical of many of the buildings on the campuses of regional colleges and universities. As a model, the project serves as an energy-efficient system with significant environmental advantages. Information about the energy-efficiency measures is available to other colleges and universities, organizations and companies, students, and other interested parties. The installation and renovation provided employment for 120 individuals during the award period. Through the new Center, Daemen will continue to host a range of events on campus for the general public. The College does not charge fees for speakers or most other events. This has been a long-standing tradition of the College.
A symbolic/subsymbolic interface protocol for cognitive modeling
Simen, Patrick; Polk, Thad
2009-01-01
Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior.
Probabilistic registration of an unbiased statistical shape model to ultrasound images of the spine
NASA Astrophysics Data System (ADS)
Rasoulian, Abtin; Rohling, Robert N.; Abolmaesumi, Purang
2012-02-01
The placement of an epidural needle is among the most difficult regional anesthetic techniques. Ultrasound has been proposed to improve success of placement. However, it has not become the standard-of-care because of limitations in the depictions and interpretation of the key anatomical features. We propose to augment the ultrasound images with a registered statistical shape model of the spine to aid interpretation. The model is created with a novel deformable group-wise registration method which utilizes a probabilistic approach to register groups of point sets. The method is compared to a volume-based model building technique and it demonstrates better generalization and compactness. We instantiate and register the shape model to a spine surface probability map extracted from the ultrasound images. Validation is performed on human subjects. The achieved registration accuracy (2-4 mm) is sufficient to guide the choice of puncture site and trajectory of an epidural needle.
An Evaluation of the Synergistic Simulation of the Federal Open Market Committee.
ERIC Educational Resources Information Center
Bartlett, Robin Lynn; Amsler, Christine E.
The Federal Open Market Committee (FOMC) simulation employed three techniques: case study, role playing, and model building, in order to acquaint college students studying money and banking with the creation of monetary policy. The specific goals of the FOMC simulation were: (1) to familiarize students with the data used in monetary policy…
The Serious Use of Play and Metaphor: Legos and Labyrinths
ERIC Educational Resources Information Center
James, Alison; Brookfield, Stephen
2013-01-01
In this paper the authors wish to examine kinesthetic forms of learning involving the body and the physical realm. The authors look at two particular techniques: using Legos to build metaphorical models, and living the physical experience of metaphors in the shape of labyrinth-walking and its attendant activities. The authors begin by discussing…
Information Fusion - Methods and Aggregation Operators
NASA Astrophysics Data System (ADS)
Torra, Vicenç
Information fusion techniques are commonly applied in Data Mining and Knowledge Discovery. In this chapter, we give an overview of such applications, considering their three main uses. That is, we consider fusion methods for data preprocessing, model building and information extraction. Some aggregation operators (i.e., particular fusion methods) and their properties are briefly described as well.
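Two aggregation operators commonly discussed in this literature, sketched minimally; the weighted mean attaches weights to sources, while Ordered Weighted Averaging (OWA) attaches them to rank positions:

```python
import numpy as np

def weighted_mean(x, w):
    """Weighted mean: weights express trust in each source."""
    w = np.asarray(w, dtype=float)
    return float(np.dot(x, w / w.sum()))

def owa(x, w):
    """OWA operator: weights attach to the sorted positions of the
    inputs rather than to the sources themselves."""
    x_sorted = np.sort(x)[::-1]          # descending order
    w = np.asarray(w, dtype=float)
    return float(np.dot(x_sorted, w / w.sum()))

# Same inputs, same weights, different semantics:
x = [0.2, 0.9, 0.5]
w = [0.5, 0.3, 0.2]
print(weighted_mean(x, w))   # source 1 is most trusted
print(owa(x, w))             # the largest value is most influential
```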
ERIC Educational Resources Information Center
Marbach-Ad, Gili; Schaefer, Kathryn L.; Kumi, Bryna C.; Friedman, Lee A.; Thompson, Katerina V.; Doyle, Michael P.
2012-01-01
This study describes the development and evaluation of a prep course for chemistry graduate teaching assistants (GTAs). The course was developed around three major goals: (i) building a community for new GTAs and socializing them into the department; (ii) modeling teaching with well-documented, innovative teaching and learning techniques; and…
Building detection in SAR imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinbach, Ryan Matthew
Current techniques for building detection in Synthetic Aperture Radar (SAR) imagery can be computationally expensive and/or enforce stringent requirements for data acquisition. I present two techniques that are effective and efficient at determining an approximate building location. This approximate location can be used to extract a portion of the SAR image on which to perform a more robust detection. The proposed techniques assume that, for the desired image, bright lines and shadows (SAR artifact effects) are approximately labeled. These labels are enhanced and used to locate buildings, but only if the related bright lines and shadows can be grouped. In order to find which of the bright lines and shadows are related, all of the bright lines are connected to all of the shadows. This allows the problem to be solved from a connected-graph viewpoint, where the nodes are the bright lines and shadows and the arcs are the connections between them. For the first technique, constraints based on the angle of depression and the relationship between connected bright lines and shadows are applied to remove unrelated arcs. The second technique calculates weights for the connections and then performs a series of increasingly relaxed hard and soft thresholds. This results in groups with various levels of validity. Once the related bright lines and shadows are grouped, their locations are combined to provide an approximate building location. Experimental results demonstrate the outcome of the two techniques, which are then compared and discussed.
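A schematic sketch of the bipartite-graph formulation: connect every bright line to every shadow, then prune arcs with a geometric consistency test. The concrete test below (shadow roughly down-range of the line, within a lateral gap) is an illustrative assumption, not the thesis's constraint set:

```python
def group_lines_shadows(lines, shadows, max_gap=5.0):
    """Connect every bright line to every shadow, then prune arcs.

    lines, shadows: centroids in image pixels, with down-range
    increasing along y. Surviving (line, shadow) pairs approximate
    building locations when their positions are later combined.
    """
    arcs = []
    for i, (lx, ly) in enumerate(lines):
        for j, (sx, sy) in enumerate(shadows):
            down_range = sy - ly               # shadow falls away from sensor
            lateral = abs(sx - lx)
            if down_range > 0 and lateral < max_gap:
                arcs.append((i, j))
    return arcs

# Grouped pairs are then merged into approximate building locations.
```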
Binder, Harald; Porzelius, Christine; Schumacher, Martin
2011-03-01
Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when artificially transforming the response into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation.
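A minimal sketch of componentwise boosting in its simplest L2-loss form; the paper's time-to-event version boosts a Cox-type likelihood instead, so this is an analogy rather than the reviewed method:

```python
import numpy as np

def componentwise_l2boost(X, y, steps=200, nu=0.1):
    """Componentwise boosting with squared-error loss.

    At each step only the single covariate that best fits the current
    residuals is updated, shrunk by nu, yielding stage-wise variable
    selection in p >> n settings.
    """
    n, p = X.shape
    beta = np.zeros(p)
    offset = y.mean()
    resid = y - offset
    for _ in range(steps):
        # least-squares coefficient of each covariate against residuals
        coefs = X.T @ resid / (X ** 2).sum(axis=0)
        scores = coefs * (X.T @ resid)        # RSS reduction per covariate
        j = int(np.argmax(scores))
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return offset, beta                        # sparse coefficient vector
```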
Seismic hazard, risk, and design for South America
Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison
2018-01-01
We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental-scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground-shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from the effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best available science.
NASA Astrophysics Data System (ADS)
Vdovin, R. A.; Smelov, V. G.
2017-02-01
This work describes the experience of manufacturing a turbine rotor for a micro-engine. It demonstrates design principles for a complex investment casting process combining the ProCast software with rapid prototyping techniques. At the virtual modelling stage, in addition to optimizing the process parameters, the casting structure was improved to obtain a defect-free section. The real production stage demonstrated the performance and suitability of rapid prototyping techniques for the manufacture of geometrically complex engine-building parts.
High-energy synchrotron x-ray techniques for studying irradiated materials
Park, Jun-Sang; Zhang, Xuan; Sharma, Hemant; ...
2015-03-20
High performance materials that can withstand radiation, heat, multiaxial stresses, and corrosive environments are necessary for the deployment of advanced nuclear energy systems. Nondestructive in situ experimental techniques utilizing high-energy x-rays from synchrotron sources can be an attractive set of tools for engineers and scientists to investigate the structure–processing–property relationship systematically at smaller length scales and help build better material models. In this paper, two unique and interconnected experimental techniques, namely, simultaneous small-angle/wide-angle x-ray scattering (SAXS/WAXS) and far-field high-energy diffraction microscopy (FF-HEDM), are presented. Finally, the changes in material state as Fe-based alloys are heated to high temperatures or subjected to irradiation are examined using these techniques.
Towards a 3d Spatial Urban Energy Modelling Approach
NASA Astrophysics Data System (ADS)
Bahu, J.-M.; Koch, A.; Kremers, E.; Murshed, S. M.
2013-09-01
Today's need to reduce the environmental impact of energy use imposes dramatic changes on energy infrastructure and existing demand patterns (e.g. buildings) corresponding to their specific context. In addition, future energy systems are expected to integrate a considerable share of fluctuating power sources and equally a high share of distributed generation of electricity. Energy system models capable of describing such future systems and allowing the simulation of the impact of these developments thus require a spatial representation in order to reflect the local context and the boundary conditions. This paper describes two recent research approaches developed at EIFER in the fields of (a) geo-localised simulation of heat energy demand in cities based on 3D morphological data and (b) spatially explicit Agent-Based Models (ABM) for the simulation of smart grids. 3D city models were used to assess the solar potential and heat energy demand of residential buildings, enabling cities to target building refurbishment potentials. Distributed energy systems require innovative modelling techniques where individual components are represented and can interact. With this approach, several smart grid demonstrators were simulated, in which heterogeneous models are spatially represented. Coupling 3D geodata with energy system ABMs holds different advantages for both approaches. On one hand, energy system models can be enhanced with high-resolution data from 3D city models and their semantic relations. Furthermore, they allow for spatial analysis and visualisation of the results, with emphasis on spatial and structural correlations among the different layers (e.g. infrastructure, buildings, administrative zones) to provide an integrated approach. On the other hand, 3D models can benefit from a more detailed system description of energy infrastructure, representing dynamic phenomena and high-resolution models for energy use at component level. The proposed modelling strategies conceptually and practically integrate urban spatial and energy planning approaches. The combined modelling approach that will be developed based on the described sectorial models holds the potential to represent hybrid energy systems coupling distributed generation of electricity with thermal conversion systems.
NASA Astrophysics Data System (ADS)
Hartley, Christopher Ahlvin
Current building energy auditing techniques are outdated and lack targeted, actionable information. These analyses use only one year's worth of monthly electricity and gas bills to define energy conservation and efficiency measures. These limited data sets cannot provide robust, directed energy reduction recommendations. The need is apparent for an overhaul of existing energy audit protocols to utilize all data that is available from the building's utility provider, installed energy management system (EMS), and sub-metering devices. This thesis analyzed the current state of the art in energy audits, generated a next generation energy audit protocol, and conducted both audit types on four case study buildings to find out what additional information can be obtained from additional data sources and increased data-gathering resolutions. Energy data from each case study building were collected using a variety of means including utility meters, whole-building energy meters, EMS systems, and sub-metering devices. In addition to conducting an energy analysis for each case study building using the current and next generation energy audit protocols, two building energy models were created using the programs eQuest and EnergyPlus. The current and next generation energy audit protocol results were compared to one another upon completion. The results show that using the current audit protocols, only variations in season are apparent. Results from the developed next generation energy audit protocol show that in addition to seasonal variations, building heating, ventilation and air conditioning (HVAC) schedules, occupancy schedules, baseline and peak energy demand levels, and malfunctioning equipment can be found. This new protocol may also be used to quickly generate accurate building models because of the increased resolution that yields scheduling information. The developed next generation energy auditing protocol is scalable and can work for many building types across the United States, and perhaps the world.
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
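A minimal sketch of the hybrid idea: a physics-based efficiency curve supplies the fault-free baseline and metered data are checked against it. The curve form and the residual threshold below are assumed tuning values, not the tool's actual models:

```python
def chiller_fdd(measured_kw, load_tons, expected_kw_per_ton, tol=0.15):
    """Flag a cooling-plant efficiency fault against a physics baseline.

    expected_kw_per_ton(load) plays the role of the engineering model
    (e.g., a manufacturer-style part-load efficiency curve); metered
    power is compared against it and large residuals raise a fault.
    """
    expected_kw = expected_kw_per_ton(load_tons) * load_tons
    residual = (measured_kw - expected_kw) / expected_kw
    return residual > tol, residual

# Example baseline: part-load efficiency curve (assumed quadratic fit)
curve = lambda load: 0.55 + 0.0004 * (load - 300) ** 2 / 300
fault, r = chiller_fdd(measured_kw=260.0, load_tons=350.0,
                       expected_kw_per_ton=curve)
```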
NASA Astrophysics Data System (ADS)
Lang, A. F.; Salvaggio, C.
2016-12-01
Climate change, skyrocketing global population, and increasing urbanization have set the stage for more so-called "mega-disasters." We possess the knowledge to mitigate and predict the scope of these events, and recent advancements in remote sensing can inform these efforts. Satellite and aerial imagery can be obtained anywhere of interest; unmanned aerial systems can be deployed quickly; and improved sensor resolutions and image processing techniques allow close examination of the built environment. Combined, these technologies offer an unprecedented ability for the disaster community to visualize, assess, and communicate risk. Disaster mitigation and response efforts rely on an accurate representation of the built environment, including knowledge of building types, structural characteristics, and juxtapositions to known hazards. The use of remote sensing to extract these inventory data has come far in the last five years. Researchers in the Digital Imaging and Remote Sensing (DIRS) group at the Rochester Institute of Technology are meeting the needs of the disaster community through the development of novel image processing methods capable of extracting detailed information about individual buildings. DIRS researchers have pioneered the ability to generate three-dimensional building models from point cloud imagery (e.g., LiDAR). This method can process an urban area and recreate it in a navigable virtual reality environment such as Google Earth within hours. Detailed geometry is obtained for individual structures (e.g., footprint, elevation). In a recent step forward, these geometric data can now be combined with imagery from other sources, such as high-resolution or multispectral imagery. The latter ascribes a spectral signature to individual pixels, suggesting construction material. Ultimately, these individual building data are amassed over an entire region, facilitating aggregation and risk modeling analyses. The downtown region of Rochester, New York is presented as a case study. High-resolution optical, LiDAR, and multispectral imagery of this region was captured. Using the techniques described, these imagery sources are combined and processed to produce a holistic representation of the built environment, inclusive of individual building characteristics.
Yassin, Mohamed F; Ohba, Masaake
2012-09-01
To assist validation of numerical simulations of urban pollution, air quality in a street canyon was investigated using a wind tunnel as a research tool under neutral atmospheric conditions. We used tracer gas techniques from a line source without buoyancy. Ethylene (C2H4) was used as the tracer gas. The street canyon model was formed of six parallel building rows of the same length. The flow and dispersion field was analyzed and measured using a hot-wire anemometer with split fiber probe and fast flame ionization detector. The diffusion flow field in the boundary layer within the street canyon was examined at different locations, with varying building orientations (θ=90°, 112.5°, 135° and 157.5°) and street canyon aspect ratios (W/H=1/2, 3/4 and 1) downwind of the leeward side of the street canyon model. Results show that velocity increases with aspect ratio, and with θ>90°. Pollutant concentration increases as aspect ratio decreases. This concentration decreases exponentially in the vertical direction, and decreases as θ increases from 90°. Measured pollutant concentration distributions indicate that variability of building orientation and aspect ratio in the street canyon are important for estimating air quality in the canyon. The data presented here can be used as a comprehensive database for validation of numerical models.
Experimental analysis and constitutive modelling of steel of A-IIIN strength class
NASA Astrophysics Data System (ADS)
Kruszka, Leopold; Janiszewski, Jacek
2015-09-01
A better understanding of the behaviour of new building steels under impact loading, including plastic deformation, is fundamentally important. Results of the experimental analysis over a wide range of strain rates in compression at room temperature, as well as constitutive modelling for B500SP structural steel of the new A-IIIN Polish strength class, examined dynamically by the split Hopkinson pressure bar technique at high strain rates, are presented in tabular and graphical form. Dynamic mechanical characteristics of compressive strength for the tested building structural steel are determined, and the dynamic mechanical properties of this material are compared with 18G2-b steel of the A-II strength class, including effects of the shape of the tested specimens, i.e., their slenderness. The paper focuses on these experimental tests, their interpretation, and semi-empirical constitutive modelling of the behaviour of the tested steels based on the Johnson-Cook model. The results presented here are used for the design and numerical simulation of reinforced concrete protective structures.
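For reference, a minimal sketch of the Johnson-Cook flow stress in its isothermal form; the constants below are placeholders, not the fitted values for the tested steels:

```python
import numpy as np

def johnson_cook_stress(eps_p, eps_dot, A, B, n, C, eps_dot0=1.0):
    """Johnson-Cook flow stress (isothermal form, thermal term omitted).

    sigma = (A + B*eps_p**n) * (1 + C*ln(eps_dot/eps_dot0))

    A, B, n, C are material constants fitted to Hopkinson-bar data.
    """
    return (A + B * eps_p ** n) * (1.0 + C * np.log(eps_dot / eps_dot0))

# Flow stress (Pa) at 5% plastic strain and a strain rate of 1000 1/s
sigma = johnson_cook_stress(eps_p=0.05, eps_dot=1e3,
                            A=500e6, B=600e6, n=0.3, C=0.02)
```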
Evolving RBF neural networks for adaptive soft-sensor design.
Alexandridis, Alex
2013-12-01
This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
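A minimal sketch of the second adaptation level: recursive least squares with exponential forgetting applied to the output weights of a fixed-center RBF model. Center placement and deletion (the first level, via the adaptive fuzzy means algorithm) is not reproduced here:

```python
import numpy as np

class RBFSoftSensor:
    """RBF soft-sensor whose output weights adapt by RLS with
    exponential forgetting; centers are supplied and held fixed."""

    def __init__(self, centers, sigma, lam=0.99):
        self.c = np.asarray(centers)          # (m, d) RBF centers
        self.sigma = sigma                    # common RBF width
        self.lam = lam                        # forgetting factor
        m = len(self.c)
        self.w = np.zeros(m)                  # output weights
        self.P = np.eye(m) * 1e3              # inverse correlation matrix

    def _phi(self, x):
        d2 = ((self.c - x) ** 2).sum(axis=1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, x):
        return float(self._phi(x) @ self.w)

    def update(self, x, y):
        """Standard RLS update after each new input-output sample."""
        phi = self._phi(x)
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)    # gain vector
        self.w += k * (y - phi @ self.w)
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
```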
NASA Astrophysics Data System (ADS)
Tian, J.; Krauß, T.; d'Angelo, P.
2017-05-01
Automatic rooftop extraction is one of the most challenging problems in remote sensing image analysis. Classical 2D image processing techniques are expensive due to the high number of features required to locate buildings. This problem can be avoided when 3D information is available. In this paper, we show how to fuse the spectral and height information of stereo imagery to achieve efficient and robust rooftop extraction. In the first step, the digital terrain model (DTM), and in turn the normalized digital surface model (nDSM), is generated using a novel step-edge approach. In the second step, the initial building locations and rooftop boundaries are derived by removing the low-level pixels and the high-level pixels most likely to be trees or shadows. This boundary then serves as the initial level-set function, which is further refined to fit the best possible boundaries through distance-regularized level-set curve evolution. During the fitting procedure, an edge-based active contour model is adopted, implemented using edge indicators extracted from the panchromatic image. The performance of the proposed approach is tested using WorldView-2 satellite data captured over Munich.
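A minimal sketch of the height-plus-spectra fusion that produces the initial building mask; the thresholds are illustrative assumptions, and the paper refines this mask further by level-set evolution:

```python
import numpy as np

def building_candidates(dsm, dtm, min_height=2.5, ndvi=None, ndvi_max=0.3):
    """Initial rooftop mask from stereo-derived height plus spectra.

    nDSM = DSM - DTM isolates above-ground objects; low pixels are
    dropped, and (optionally) high-NDVI pixels are removed as likely
    trees.
    """
    ndsm = dsm - dtm                          # normalized surface heights
    mask = ndsm > min_height                  # keep elevated pixels only
    if ndvi is not None:
        mask &= ndvi < ndvi_max               # suppress vegetation
    return mask
```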
NASA Astrophysics Data System (ADS)
Putra, MIH; Lewis, SA; Kurniasih, EM; Prabuning, D.; Faiqoh, E.
2016-11-01
Geographic information system and remote sensing techniques can be used to assist with distribution modelling, a useful tool for the strategic design and management of MPAs. This study built a pilot model of plankton biomass and distribution in the waters off Solor and Lembata, and is the first study to identify marine megafauna foraging areas in the region. Forty-three zooplankton samples were collected every 4 km according to the range time and station of Aqua MODIS. A generalized additive model (GAM) was used to model the response of zooplankton biomass to environmental properties. Thirty-one samples were used to build an inverse distance weighting (IDW) model (cell size 0.01°) and 12 samples were used as a control to verify the model's accuracy. Furthermore, Getis-Ord Gi was used to identify significant hotspots and cold-spots for foraging. The GAM explained 88.1% of the zooplankton biomass response, with percent to full moon and phytoplankton biomass being strong predictors. The sampling design was essential for building highly accurate models: our models were 96% accurate for phytoplankton and 88% accurate for zooplankton. Foraging behaviour was significantly related to plankton biomass hotspots, which were two times higher compared to plankton cold-spots. In addition, the extremely steep slopes of the Lamakera strait support strong upwelling with highly productive waters that affect the presence of marine megafauna. This study finds that the Lamakera strait provides the planktonic requirements for marine megafauna foraging, helping to explain why this region supports such high diversity and abundance of marine megafauna.
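A minimal sketch of IDW interpolation as used for the gridded biomass surface; the power parameter is a common default, not necessarily the study's setting:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse distance weighting interpolation of sampled biomass.

    Each query point receives a weighted mean of the sampled values,
    with weights falling off as distance**-power.
    """
    xy_known = np.asarray(xy_known, dtype=float)
    z_known = np.asarray(z_known, dtype=float)
    out = np.empty(len(xy_query))
    for i, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.hypot(*(xy_known - q).T)
        if np.any(d == 0):                    # query point hits a sample
            out[i] = z_known[d == 0][0]
            continue
        w = d ** -power
        out[i] = np.dot(w, z_known) / w.sum()
    return out
```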
Building predictive models for MERS-CoV infections using data mining techniques.
Al-Turaiki, Isra; Alshahrani, Mona; Almutairi, Tahani
Recently, the outbreak of MERS-CoV infections drew worldwide attention to Saudi Arabia. The novel virus belongs to the coronavirus family, which is responsible for causing mild to moderate colds. The control and command center of the Saudi Ministry of Health issues a daily report on MERS-CoV infection cases. Infection with MERS-CoV can lead to fatal complications, yet little is known about this novel virus. In this paper, we apply two data mining techniques in order to better understand the stability and the possibility of recovery from MERS-CoV infections. The Naive Bayes classifier and the J48 decision tree algorithm were used to build our models. The dataset used consists of 1082 records of cases reported between 2013 and 2015. In order to build our prediction models, we split the dataset into two groups. The first group combined recovery and death records; a new attribute was created to indicate the record type, such that the dataset can be used to predict recovery from MERS-CoV. The second group contained the new case records, to be used to predict the stability of the infection based on the current status attribute. The resulting recovery models indicate that healthcare workers are more likely to survive. This could be due to the vaccinations that healthcare workers are required to get on a regular basis. As for the stability models using J48, two attributes were found to be important for predicting stability: symptomatic status and age. Old patients are at high risk of developing MERS-CoV complications. Finally, the performance of all the models was evaluated using three measures: accuracy, precision, and recall. In general, the accuracy of the models is between 53.6% and 71.58%. We believe that the performance of the prediction models can be enhanced with the use of more patient data. As future work, we plan to directly contact hospitals in Riyadh in order to collect more information related to patients with MERS-CoV infections.
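A minimal sketch of a Naive Bayes classifier over categorical attributes with Laplace smoothing; the attribute names below are invented for illustration and do not reflect the actual dataset schema:

```python
from collections import Counter, defaultdict
import math

def train_nb(records, label_key):
    """Naive Bayes over categorical attributes, Laplace-smoothed.

    Schematic stand-in for the paper's models; returns a predictor
    mapping an attribute dict to the most probable label.
    """
    priors = Counter(r[label_key] for r in records)
    counts = defaultdict(Counter)     # (label, attr) -> value counts
    for r in records:
        for a, v in r.items():
            if a != label_key:
                counts[(r[label_key], a)][v] += 1

    def predict(x):
        best, best_lp = None, -math.inf
        for label, n in priors.items():
            lp = math.log(n / len(records))
            for a, v in x.items():
                c = counts[(label, a)]
                lp += math.log((c[v] + 1) / (n + len(c) + 1))
            if lp > best_lp:
                best, best_lp = label, lp
        return best
    return predict

# Hypothetical records, purely for illustration
data = [{"healthcare_worker": "yes", "age_group": "<40", "outcome": "recovery"},
        {"healthcare_worker": "no", "age_group": ">=60", "outcome": "death"}]
predict = train_nb(data, "outcome")
print(predict({"healthcare_worker": "yes", "age_group": "<40"}))
```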
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
Modeling Smart Structure of Wind Turbine Blade
NASA Astrophysics Data System (ADS)
Qiao, Yin-hu; Han, Jiang; Zhang, Chun-yan; Chen, Jie-ping
2012-06-01
With the increasing size of wind turbine blades, the need for more sophisticated load control techniques has stimulated interest in aerodynamic control systems with built-in intelligence on the blades. This paper aims to provide a way to model adaptive wind turbine blades and to analyze their ability to suppress vibration. The work consists of modeling adaptive wind turbine blades with wires of piezoelectric material embedded in the blade matrix, together with a smart sandwich structure for the blade. Using this model, an active vibration control method which effectively suppresses the vibrations of the smart blade is designed.
Resonance and streaming of armored microbubbles
NASA Astrophysics Data System (ADS)
Spelman, Tamsin; Bertin, Nicolas; Stephen, Olivier; Marmottant, Philippe; Lauga, Eric
2015-11-01
A new experimental technique involves building a hollow capsule which partially encompasses a microbubble, creating an ``armored microbubble'' with a long lifespan. Under acoustic actuation, such a bubble produces net streaming flows. In order to model the induced flow theoretically, we first extend classical models of free bubbles to describe the streaming flow around a spherical body for any known axisymmetric shape oscillation. A potential flow model is then employed to determine the resonance modes of the armored microbubble. We finally use a more detailed viscous model to calculate the surface shape oscillations at the experimental driving frequency, and from this we predict the generated streaming flows.
Redistribution population data across a regular spatial grid according to buildings characteristics
NASA Astrophysics Data System (ADS)
Calka, Beata; Bielecka, Elzbieta; Zdunkiewicz, Katarzyna
2016-12-01
Population data are generally provided by state census organisations at predefined census enumeration units. However, these datasets are very often required at user-defined spatial units that differ from the census output levels. A number of population estimation techniques have been developed to address this problem. This article is one such attempt, aimed at improving county-level population estimates by using spatial disaggregation models supported by building characteristics derived from the national topographic database and the average area of a flat. The experimental gridded population surface was created for Opatów county, a sparsely populated rural region located in Central Poland. The method relies on geolocating population counts in buildings, taking into account building volume and structural building type, and then aggregating the population totals in a 1 km quadrilateral grid. The overall quality of the population distribution surface, expressed by the RMSE, equals 9 persons, and the MAE equals 0.01. We also discovered that nearly 20% of the total county area is unpopulated and that 80% of people live on 33% of the county territory.
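The core allocation step, distributing a county total to buildings in proportion to type-weighted building volume and summing per grid cell, can be sketched in a few lines; the weights, records, and cell identifiers below are illustrative assumptions, not the paper's calibrated values.

```python
# Illustrative dasymetric disaggregation: distribute a county population
# total over buildings proportionally to (type-weighted) building volume,
# then aggregate to 1 km grid cells. Weights and records are made up.
from collections import defaultdict

county_population = 54_000
type_weight = {"residential": 1.0, "mixed": 0.5}   # assumed weights

# (grid_cell_id, building_type, volume_m3) -- stand-in building records
buildings = [
    ("E34N21", "residential", 1200.0),
    ("E34N21", "mixed",        800.0),
    ("E35N21", "residential",  600.0),
]

weighted = [(cell, type_weight[btype] * vol) for cell, btype, vol in buildings]
total_weight = sum(w for _, w in weighted)

grid = defaultdict(float)
for cell, w in weighted:
    grid[cell] += county_population * w / total_weight  # people per cell

print(dict(grid))
```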
NASA Astrophysics Data System (ADS)
Rossi, D.
2011-09-01
The main focus of this article is to describe a teaching activity. The experience follows research aimed at testing innovative systems for the formal and digital analysis of architectural buildings; in particular, the field of investigation is analytical drawing. An analytical drawing allows the development of interpretative models of reality; these models are built using photomodeling techniques and are designed to re-draw modern and contemporary architecture. The buildings surveyed belong to a cultural period, called the Modern Movement, historically placed between the two world wars. The Modern Movement aimed to renew existing architectural principles and to redefine their function. In Italy these principles arrived during the Fascist period. The heritage of public social buildings (case del Balilla, G.I.L., recreation centers, ...) built during the Fascist period in central Italy is remarkable for its quantity and, in many cases, for its architectural quality. These buildings are composed of pure shapes: large cubes (gyms) alternate with long rectangular blocks containing offices, creating compositions made of big volumes and high towers. These features are perfectly suited to the needs of a surveying process by photomodeling, where the role of photography is central and where there is the need to identify certain, easily distinguishable points in all pictures, lying on the edges of the volumes or on texture discontinuities. The goal is documentation to preserve and enhance buildings and urban complexes of modern architecture, directed toward encouraging their artistic preservation.
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
Prospecting for new physics in the Higgs and flavor sectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bishara, Fady
We explore two directions in beyond the standard model physics: dark matter model building and probing new sources of CP violation. In dark matter model building, we consider two scenarios where the stability of dark matter derives from the flavor symmetries of the standard model. The first model contains a flavor singlet dark matter candidate whose couplings to the visible sector are proportional to the flavor breaking parameters. This leads to a metastable dark matter with TeV scale mediators. In the second model, we consider a fully gauged SU(3)^3 flavor model with a flavor triplet dark matter. Consequently, the dark matter multiplet is charged while the standard model fields are neutral under a remnant Z_3 which ensures dark matter stability. We show that a Dirac fermion dark matter with radiative splitting in the multiplet must have a mass in the range [0.5, 5] TeV in order to satisfy all experimental constraints. We then turn our attention to Higgs portal dark matter and investigate the possibility of obtaining bounds on the up, down, and strange quark Yukawa couplings. If Higgs portal dark matter is discovered, we find that direct detection rates are insensitive to vanishing light quark Yukawa couplings. We then review flavor models and give the expected enhancement or suppression of the Yukawa couplings in those models. Finally, in the last two chapters, we develop techniques for probing CP violation in the Higgs coupling to photons and in rare radiative decays of B mesons. While theoretically clean, we find that these methods are not practical with current and planned detectors. However, these techniques can be useful with a dedicated detector (e.g., a gaseous TPC). In the case of the radiative B meson decay B^0 → (K* → Kππ)γ, the techniques we develop also allow the extraction of the photon polarization fraction, which is sensitive to new physics contributions since, in the standard model, the right(left)-handed polarization fraction is of O(Λ_QCD/m_b) for $\bar{B}^0$ (B^0) meson decays.
From Architectural Photogrammetry Toward Digital Architectural Heritage Education
NASA Astrophysics Data System (ADS)
Baik, A.; Alitany, A.
2018-05-01
This paper considers the potential of using the documentation approach proposed for the heritage buildings in Historic Jeddah, Saudi Arabia (as a case study), employing close-range photogrammetry / Architectural Photogrammetry techniques, as a new academic experiment in digital architectural heritage education. Moreover, unlike most engineering educational techniques related to architecture education, this paper focuses on 3-D data acquisition technology as a tool to document and to teach the principles of digital architectural heritage documentation. The objective of this research is to integrate 3-D modelling and visualisation knowledge for the purposes of identifying, designing and evaluating an effective engineering educational experiment. Furthermore, the students will learn and understand the characteristics of the historical building while learning more advanced 3-D modelling and visualisation techniques. It can be argued that many of these technologies alone do little to improve education; therefore, it is important to integrate them into an educational framework, in line with the educational ethos of the academic discipline. Recently, a number of these technologies and methods have been used effectively in the education sector and for other purposes, such as in virtual museums. However, these methods do not directly coincide with traditional education and the teaching of architecture. This research introduces the proposed approach as a new academic experiment in the architecture education sector. The new teaching approach is based on Architectural Photogrammetry to provide semantically rich models. The academic experiment will require students to have suitable knowledge of Photogrammetry applications to engage with the process.
NASA Astrophysics Data System (ADS)
Du, Kongchang; Zhao, Ying; Lei, Jiaqiang
2017-09-01
In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to largely reduce the prediction error. In the current literature, researchers apply these techniques to the whole observed time series and then obtain a set of reconstructed or decomposed time series as inputs to the ANN or SVM. However, through two comparative experiments and mathematical deduction, we found that this usage of SSA and DWT in building hybrid models is incorrect. Since SSA and DWT adopt 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information about 'future' values. These hybrid models therefore report misleadingly 'high' prediction performance and may cause large errors in practice.
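The leakage mechanism is easy to reproduce: any transform that mixes samples from both sides of a time point, as SSA reconstruction and non-causal DWT filtering do, injects future information into past inputs. The sketch below illustrates this with a centered moving average as a stand-in for SSA/DWT; the correct alternative transforms only the data available at training time.

```python
# Demonstration of the leakage the authors describe, using a centered
# moving average as a stand-in for SSA/DWT preprocessing: smoothing the
# whole series before splitting lets 'future' values leak into the
# training inputs.
import numpy as np

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(500))      # toy hydrological series
split = 400

def centered_smooth(x, k=5):
    # Each output value mixes in k//2 future samples -- the leakage source.
    return np.convolve(x, np.ones(k) / k, mode="same")

# WRONG: preprocess the full series, then split (future leaks into train).
leaky_train = centered_smooth(series)[:split]

# RIGHT: preprocess only data available at training time.
clean_train = centered_smooth(series[:split])

# The two training sets differ near the split boundary, which is exactly
# where the leaked 'future' information sits.
print(np.abs(leaky_train - clean_train)[-10:])
```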
NASA Astrophysics Data System (ADS)
Themistocleous, K.; Agapiou, A.; Hadjimitsis, D.
2016-10-01
The documentation of architectural cultural heritage sites has traditionally been expensive and labor-intensive. New innovative technologies, such as Unmanned Aerial Vehicles (UAVs), provide an affordable, reliable and straightforward method of capturing cultural heritage sites, thereby providing a more efficient and sustainable approach to the documentation of cultural heritage structures. In this study, hundreds of images of the Panagia Chryseleousa church in Foinikaria, Cyprus were taken using a UAV with an attached high resolution camera. The images were processed to generate an accurate digital 3D model using Structure from Motion techniques. A Building Information Model (BIM) was then used to generate drawings of the church. The methodology described in the paper provides an accurate, simple and cost-effective method of documenting cultural heritage sites and generating digital 3D models using novel techniques and innovative methods.
Using Rollback Avoidance to Mitigate Failures in Next-Generation Extreme-Scale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Scott N.
2016-05-01
High-performance computing (HPC) systems enable scientists to numerically model complex phenomena in many important physical systems. The next major milestone in the development of HPC systems is the construction of the first supercomputer capable of executing more than an exaflop, 10^18 floating point operations per second. On systems of this scale, failures will occur much more frequently than on current systems. As a result, resilience is a key obstacle to building next-generation extreme-scale systems. Coordinated checkpointing is currently the most widely-used mechanism for handling failures on HPC systems. Although coordinated checkpointing remains effective on current systems, increasing the scale of today's systems to build next-generation systems will increase the cost of fault tolerance as more and more time is taken away from the application to protect against or recover from failure. Rollback avoidance techniques seek to mitigate the cost of checkpoint/restart by allowing an application to continue its execution rather than rolling back to an earlier checkpoint when failures occur. These techniques include failure prediction and preventive migration, replicated computation, fault-tolerant algorithms, and software-based memory fault correction. In this thesis, we examine how rollback avoidance techniques can be used to address failures on extreme-scale systems. Using a combination of analytic modeling and simulation, we evaluate the potential impact of rollback avoidance on these systems. We then present a novel rollback avoidance technique that exploits similarities in application memory. Finally, we examine the feasibility of using this technique to protect against memory faults in kernel memory.
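The cost argument for rollback avoidance can be made quantitative with a standard first-order checkpoint/restart model; the sketch below uses Young's classic approximation (an assumed textbook result, not a model from this thesis) to show application efficiency collapsing as the mean time between failures shrinks at scale.

```python
# Young's first-order approximation for the near-optimal checkpoint
# interval, tau ~ sqrt(2 * C * MTBF), with application efficiency
# shrinking as the system-level MTBF drops. Assumed textbook model.
import math

def optimal_interval(checkpoint_cost_s: float, mtbf_s: float) -> float:
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

def efficiency(checkpoint_cost_s: float, mtbf_s: float) -> float:
    tau = optimal_interval(checkpoint_cost_s, mtbf_s)
    # Useful-work fraction: checkpoint overhead plus the expected rework
    # lost to failures (crude first-order accounting).
    overhead = checkpoint_cost_s / tau + tau / (2.0 * mtbf_s)
    return max(0.0, 1.0 - overhead)

for mtbf_hours in (24.0, 4.0, 0.5):   # failures get more frequent at scale
    print(mtbf_hours, round(efficiency(600.0, mtbf_hours * 3600.0), 3))
```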
NASA Astrophysics Data System (ADS)
Friedl, L.; Macauley, M.; Bernknopf, R.
2013-12-01
Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.
Multiattribute Decision Modeling Techniques: A Comparative Analysis
1988-08-01
Analytic Hierarchy Process (AHP). It is structurally similar to SMART, but elicitation methods are different and there are several algorithms for ... reconciliation of inconsistent judgments and for consistency checks that are not available in any of the utility procedures. The AHP has been applied ... of commercially available software packages that implement the AHP algorithms. Elicitation Methods. The AHP builds heavily on value trees, which
ERIC Educational Resources Information Center
Addai, Isaac
2015-01-01
This paper, in the field of capacity building and students' affairs, used external survey assessment techniques with a probit model to examine the predicaments of non-resident students of the College of Technology Education, University of Education, Winneba. Considering the very limited residential facilities and the growing demand for tertiary…
Build your own soil: exploring microfluidics to create microbial habitat structures
Aleklett, Kristin; Kiers, E Toby; Ohlsson, Pelle; Shimizu, Thomas S; Caldas, Victor EA; Hammer, Edith C
2018-01-01
Soil is likely the most complex ecosystem on earth. Despite the global importance and extraordinary diversity of soils, they have been notoriously challenging to study. We show how pioneering microfluidic techniques provide new ways of studying soil microbial ecology by allowing simulation and manipulation of chemical conditions and physical structures at the microscale in soil model habitats. PMID:29135971
Using Bayesian Learning to Classify College Algebra Students by Understanding in Real-Time
ERIC Educational Resources Information Center
Cousino, Andrew
2013-01-01
The goal of this work is to provide instructors with detailed information about their classes at each assignment during the term. The information is both on an individual level and at the aggregate level. We used the large number of grades, which are available online these days, along with data-mining techniques to build our models. This enabled…
An Agent-Based Interface to Terrestrial Ecological Forecasting
NASA Technical Reports Server (NTRS)
Golden, Keith; Nemani, Ramakrishna; Pang, Wan-Lin; Votava, Petr; Etzioni, Oren
2004-01-01
This paper describes a flexible agent-based ecological forecasting system that combines multiple distributed data sources and models to provide near-real-time answers to questions about the state of the Earth system. We build on novel techniques in automated constraint-based planning and natural language interfaces to automatically generate data products based on descriptions of the desired data products.
Intensive psychotherapy of schizophrenia.
Keats, C. J.; McGlashan, T. H.
1985-01-01
The literature on strategies of investigative psychotherapy of schizophrenia is selectively reviewed, and a case history is presented. The format is modelled on the authors' research technique of contrasting theory with practice. While long-term observation of single cases does not address cause and effect, descriptions of cases with a variety of known outcomes can help to build a typology of treatment processes. PMID:4049907
NASA Astrophysics Data System (ADS)
Jeong, C.; Om, J.; Hwang, J.; Joo, K.; Heo, J.
2013-12-01
In recent years, the frequency of extreme floods has been increasing due to climate change and global warming. Severe flood damage is mainly caused by the collapse of flood control structures such as dams and dikes. In order to reduce these disasters, disaster management systems (DMS) based on flood forecasting, inundation mapping, and EAPs (Emergency Action Plans) have been studied. The estimation of inundation damage and a practical EAP are especially crucial to a DMS. However, it is difficult to predict inundation and take proper action through a DMS in a real emergency situation because, in Korea, the various techniques for inundation damage estimation are not integrated and the EAP is supplied in the form of a document. In this study, an integrated simulation system including rainfall frequency analysis, rainfall-runoff modeling, inundation prediction, surface runoff analysis, and inland flood analysis was developed. Using this system coupled with standard GIS data, inundation damage can be estimated comprehensively and automatically. A standard EAP based on BIM (Building Information Modeling) was also established in this system. It is therefore expected that inundation damage over the entire area, including buildings, can be predicted and managed through this study.
Feed-Forward Neural Network Prediction of the Mechanical Properties of Sandcrete Materials
Asteris, Panagiotis G.; Roussis, Panayiotis C.; Douvika, Maria G.
2017-01-01
This work presents a soft-sensor approach for estimating critical mechanical properties of sandcrete materials. Feed-forward (FF) artificial neural network (ANN) models are employed for building soft-sensors able to predict the 28-day compressive strength and the modulus of elasticity of sandcrete materials. To this end, a new normalization technique for the pre-processing of data is proposed. The comparison of the derived results with the available experimental data demonstrates the capability of FF ANNs to predict with pinpoint accuracy the mechanical properties of sandcrete materials. Furthermore, the proposed normalization technique has been proven effective and robust compared to other normalization techniques available in the literature. PMID:28598400
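As a rough illustration of the soft-sensor idea (not the authors' exact architecture, and with ordinary min-max scaling standing in for their proposed normalization technique), a feed-forward network predicting compressive strength from mix parameters might be set up as follows.

```python
# Minimal feed-forward ANN soft-sensor sketch: normalize inputs, then
# regress 28-day compressive strength on mix parameters. The data,
# feature meanings, and network size are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = rng.random((200, 4))                      # e.g. cement, sand, water, age
y = 20 + 30 * X[:, 0] - 10 * X[:, 2] + rng.normal(0, 1, 200)  # toy strength (MPa)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scaler = MinMaxScaler()                        # normalization step
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(scaler.fit_transform(X_tr), y_tr)

print("R^2 on held-out mixes:", net.score(scaler.transform(X_te), y_te))
```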
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I .; Smith, A.; Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
RECIPE COMPLETION USING MACHINE LEARNING TECHNIQUES.
De Clercq, M; Stock, M; De Baets, B; Waegeman, W
2015-01-01
Completing a recipe is a non-trivial task, as the success of ingredient combinations depends on a multitude of factors such as taste, smell, texture, etc. The aim of our work is to build a model that adds one or more ingredients to a given set of ingredients. The idea is based on leftover ingredients in a fridge: a person could list the available ingredients in his or her fridge, and the model would suggest some additional ingredients to create a full recipe.
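The abstract does not name the learner used; as one simple baseline consistent with the task, the sketch below scores candidate ingredients by how often they co-occur with the listed ones in a recipe corpus and suggests the top matches. The corpus is a made-up stand-in.

```python
# Simple co-occurrence baseline for recipe completion: given ingredients
# on hand, suggest the ingredients that most often appear alongside them
# in a recipe corpus. The corpus here is a made-up stand-in.
from collections import Counter

recipes = [
    {"pasta", "tomato", "garlic", "basil"},
    {"pasta", "cream", "mushroom", "garlic"},
    {"rice", "chicken", "garlic", "ginger"},
    {"tomato", "mozzarella", "basil"},
]

def suggest(on_hand: set[str], k: int = 3) -> list[str]:
    scores: Counter[str] = Counter()
    for recipe in recipes:
        overlap = len(on_hand & recipe)
        for ingredient in recipe - on_hand:
            scores[ingredient] += overlap     # weight by shared ingredients
    return [ing for ing, _ in scores.most_common(k)]

print(suggest({"pasta", "garlic"}))   # e.g. ['tomato', 'basil', 'cream']
```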
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobson, Ian; Hiskens, Ian; Linderoth, Jeffrey
Building on models of electrical power systems, and on powerful mathematical techniques including optimization, model predictive control, and simulation, this project investigated important issues related to the stable operation of power grids. A topic of particular focus was cascading failures of the power grid: their simulation, quantification, mitigation, and control. We also analyzed the vulnerability of networks to component failures, and the design of networks that are responsive and robust to such failures. Numerous other related topics were investigated, including energy hubs and the cascading stall of induction machines.
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple, fully validated cost models that provide estimation uncertainty along with the cost estimate, based on the COCOMO variable set. Use machine learning techniques to determine: a) the minimum number of cost drivers required for NASA domain-based cost models; b) the minimum number of data records required; and c) estimation uncertainty. Build a repository of software cost estimation information. Coordinate tool development and data collection with: a) tasks funded by PA&E Cost Analysis; b) the IV&V Effort Estimation Task; and c) NASA SEPG activities.
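For readers unfamiliar with the COCOMO variable set mentioned above, the basic model is a power law in delivered source size scaled by cost-driver multipliers; the sketch below uses the classic published 'organic-mode' coefficients for illustration, not NASA-calibrated values.

```python
# Basic COCOMO effort model (classic 'organic-mode' coefficients shown
# for illustration; a NASA-calibrated model would refit a and b and the
# set of cost-driver multipliers on local project data).
def cocomo_effort_pm(kloc: float, eaf: float = 1.0,
                     a: float = 2.4, b: float = 1.05) -> float:
    """Estimated effort in person-months: a * KLOC^b * EAF."""
    return a * kloc ** b * eaf

# 50 KLOC project with a combined cost-driver multiplier of 1.2
# (hypothetical values).
print(round(cocomo_effort_pm(50.0, eaf=1.2), 1), "person-months")
```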
2014-10-01
Elder, Robert M.; Sirk, Timothy W.; ...
[Fragment of a report on model building for atomistic molecular simulations; report-form fields removed:] the angles and dihedrals that are truly unique will be indicated by the user by editing NewAngleTypesDump and NewDihedralTypesDump. The program ... [uses the] Antechamber program in Assisted Model Building with Energy Refinement (AMBER) Tools to assign partial charges (using the Austin Model 1 [AM1]-bond charge ...
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantic extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
Modeling injury rates as a function of industrialized versus on-site construction techniques.
Rubio-Romero, J C; Suárez-Cebador, M; Abad, Jesús
2014-05-01
It is often predicted that the industrialization of building activities will lead to a reduction of accident rates in the construction sector, particularly as a result of switching activities from building sites to factories. However, to date no scientific research has provided objective quantitative results to back up this claim. The aim of this paper is to evaluate how industrialization affects the accident rate in different industrialized building systems in Spain. Our results revealed that the industrialized steel modular system presents the lowest accident rate, while the highest accident rate was recorded in the construction method with cast-in-place concrete. The lightweight construction system also presents a high accident rate. Accordingly, industrialized building systems cannot claim to be safer than traditional ones. The different types of "on-site work" seem to be the main variable which would explain the accident rates recorded in industrialized construction systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
Characterization techniques for incorporating backgrounds into DIRSIG
NASA Astrophysics Data System (ADS)
Brown, Scott D.; Schott, John R.
2000-07-01
The appearance of operational hyperspectral imaging spectrometers in both the solar and thermal regions has led to the development of a variety of spectral detection algorithms. The development and testing of these algorithms requires well-characterized field collection campaigns that can be time and cost prohibitive. Radiometrically robust synthetic image generation (SIG) environments that can generate appropriate images under a variety of atmospheric conditions and with a variety of sensors offer an excellent supplement to reduce the scope of the expensive field collections. In addition, SIG image products provide the algorithm developer with per-pixel truth, allowing for improved characterization of algorithm performance. To meet the needs of the algorithm development community, the image modeling community needs to supply synthetic image products that contain all the spatial and spectral variability present in real-world scenes, and that provide the large area coverage typically acquired with actual sensors. This places a heavy burden on synthetic scene builders to construct well-characterized scenes that span large areas. Several SIG models have demonstrated the ability to accurately model targets (vehicles, buildings, etc.) using well-constructed target geometry (from CAD packages) and robust thermal and radiometry models. However, background objects (vegetation, infrastructure, etc.) dominate the percentage of real-world scene pixels, and applying target-building techniques to them is time and resource prohibitive. This paper discusses new methods that have been integrated into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model to characterize backgrounds. The new suite of scene construct types allows the user to incorporate both terrain and surface properties to obtain wide area coverage. The terrain can be incorporated using a triangular irregular network (TIN) derived from elevation data or digital elevation model (DEM) data from actual sensors, temperature maps, spectral reflectance cubes (possibly derived from actual sensors), and/or material and mixture maps. Descriptions and examples of each new technique are presented, as well as hybrid methods to demonstrate target embedding in real-world imagery.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components stored in the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
NASA Astrophysics Data System (ADS)
Jin, Hao; Xu, Rui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying
2017-10-01
To support the mission of Mars exploration in China, automated mission planning is required to enhance the security and robustness of deep space probes. Deep space mission planning requires the modeling of complex operational constraints and focuses on the temporal state transitions of the subsystems involved. State transitions are ubiquitous in physical systems, but have been elusive to describe in knowledge representations. We introduce a modeling approach that copes with these difficulties by taking state transitions into consideration. The key technique we build on is the notion of extended states and state transition graphs. Furthermore, a heuristic based on state transition graphs is proposed to avoid redundant work. Finally, we run comprehensive experiments on selected domains, and our techniques show excellent performance.
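The abstract leaves the data structures unspecified; as one minimal reading of 'state transition graphs', the sketch below encodes a subsystem's modes as nodes, its allowed transitions as edges, and uses the shortest transition count to a goal mode as a planning heuristic. All names and modes are hypothetical.

```python
# Minimal state-transition-graph sketch: nodes are subsystem modes, edges
# are allowed transitions, and the heuristic is the shortest number of
# transitions to a goal mode (computed by BFS). Names are hypothetical.
from collections import deque

transitions = {
    "off":     ["standby"],
    "standby": ["off", "warmup"],
    "warmup":  ["standby", "imaging"],
    "imaging": ["standby"],
}

def transition_distance(start: str, goal: str) -> int:
    """BFS over the state transition graph; usable as an admissible
    heuristic for a planner choosing among subsystem actions."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        state, depth = frontier.popleft()
        if state == goal:
            return depth
        for nxt in transitions[state]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return -1  # goal unreachable

print(transition_distance("off", "imaging"))   # -> 3
```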
Building Change Detection from Harvey using Unmanned Aerial System (UAS)
NASA Astrophysics Data System (ADS)
Chang, A.; Yeom, J.; Jung, J.; Choi, I.
2017-12-01
Unmanned Aerial Systems (UAS) have become one of the most important remote sensing techniques in recent years, since they provide fine spatial and high temporal resolution data previously unobtainable from traditional remote sensing platforms. Advanced UAS data can provide a great opportunity for disaster monitoring. In particular, building change detection is one of the most important topics for damage assessment and recovery from disasters. This study proposes a method to monitor building change with UAS data for Holiday Beach, Texas, which was directly hit by Harvey on 25 August 2017. The study adopted 3D change detection to monitor building damage and recovery levels using building height as well as natural color information. We used a rotorcraft UAS to collect RGB data twice, on 9 September and 18 October 2017, after the hurricane. The UAS data were processed using Agisoft Photoscan Pro software to generate a super-high-resolution dataset including an orthomosaic, a DSM (Digital Surface Model), and a 3D point cloud. We compared the processed dataset with an airborne image, acquired in January 2016, that can be considered pre-hurricane data. Building damage and recovery levels were determined by height and color change. The results show that UAS data are useful for assessing building damage and recovery in areas affected by natural disasters such as Harvey.
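A core step in this kind of 3D change detection is differencing the pre- and post-event digital surface models; the sketch below flags cells whose height dropped by more than a threshold as candidate building damage (the grids and the 1 m threshold are illustrative assumptions, not the study's values).

```python
# Toy DSM-differencing step for 3D building change detection: cells whose
# surface height dropped more than a threshold between epochs are flagged
# as candidate damage. Grids and threshold are illustrative stand-ins.
import numpy as np

dsm_before = np.array([[8.0, 8.1, 2.0],
                       [7.9, 8.0, 2.1],
                       [2.0, 2.0, 2.0]])   # pre-event heights (m)
dsm_after = np.array([[8.0, 3.2, 2.0],
                      [7.8, 3.1, 2.0],
                      [2.0, 2.0, 2.0]])    # post-event heights (m)

height_drop = dsm_before - dsm_after
damage_mask = height_drop > 1.0            # assumed 1 m damage threshold

print(damage_mask)          # True where a structure likely lost height
print(float(damage_mask.mean()), "fraction of cells flagged")
```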
Machine Learning Techniques for Global Sensitivity Analysis in Climate Models
NASA Astrophysics Data System (ADS)
Safta, C.; Sargsyan, K.; Ricciuto, D. M.
2017-12-01
Climate model studies are challenged not only by the compute-intensive nature of these models but also by the high dimensionality of the input parameter space. In our previous work with the land model components (Sargsyan et al., 2014), we identified subsets of 10 to 20 parameters relevant for each QoI via Bayesian compressive sensing and variance-based decomposition. Nevertheless, the algorithms were challenged by the nonlinear input-output dependencies for some of the relevant QoIs. In this work we explore a combination of techniques to extract relevant parameters for each QoI and subsequently construct surrogate models with the quantified uncertainty necessary for future developments, e.g. model calibration and prediction studies. In the first step, we compare the skill of machine-learning models (e.g. neural networks, support vector machines) to identify the optimal number of classes in selected QoIs and construct robust multi-class classifiers that partition the parameter space into regions with smooth input-output dependencies. These classifiers are coupled with techniques aimed at building sparse and/or low-rank surrogate models tailored to each class. Specifically, we explore and compare sparse learning techniques with low-rank tensor decompositions. These models are used to identify parameters that are important for each QoI. Surrogate accuracy requirements are higher for subsequent model calibration studies, and we ascertain the performance of this workflow for multi-site ALM simulation ensembles.
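A minimal version of the classify-then-fit workflow described above, with an SVM classifier and sparse linear surrogates standing in for the paper's specific choices, might look like the following.

```python
# Sketch of the classify-then-surrogate workflow: partition the parameter
# space with a multi-class classifier, then fit a sparse surrogate per
# class. Toy data; the real QoIs come from land-model ensembles.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (600, 10))              # 10 input parameters
y = np.where(X[:, 0] > 0, X[:, 1], 3 + X[:, 2] ** 2)  # piecewise-smooth QoI

labels = (y > 1.5).astype(int)                 # assumed 2-class partition
classifier = SVC().fit(X, labels)

surrogates = {}
for cls in (0, 1):
    mask = classifier.predict(X) == cls
    surrogates[cls] = Lasso(alpha=0.01).fit(X[mask], y[mask])  # sparse fit

# Sparse coefficients hint at which parameters matter within each regime.
for cls, model in surrogates.items():
    print(cls, np.nonzero(np.abs(model.coef_) > 1e-3)[0])
```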
NASA Astrophysics Data System (ADS)
Barraca, Nuno; Almeida, Miguel; Varum, Humberto; Almeida, Fernando; Matias, Manuel Senos
2016-04-01
Ancient buildings in historical town centers can be protected by Cultural Heritage legislation, thus implying that any rehabilitation must respect their main architectural features. These concerns also apply to Modern and Contemporary buildings, in particular if they are important examples of architectural styles from those periods. These extra problems, or motivations, add to the inherent structural delicacy of ancient building restoration, which requires detailed knowledge of the building foundations, characteristics and materials, modification history, infrastructure mapping, current pathologies, etc., all relevant information for an informed rehabilitation project. Such knowledge is seldom available before the actual rehabilitation works begin, and the usual invasive preliminary surveys are frequently expensive, time-consuming and likely to significantly alter or damage the building's main features or structural integrity. Hence the current demand for indirect, non-invasive, reliable and high resolution imagery techniques able to produce relevant information at the early stages of a rehabilitation project. The present work demonstrates that Ground Penetrating Radar (GPR or Georadar) surveys can provide a priori knowledge on the structure, construction techniques, materials, history and pathologies in a classified Modern Age building. It is also shown that the use of GPR on these projects requires carefully designed surveys, taking into account the known information, spatial constraints, environmental noise, the nature and dimensions of the expected targets, and suitable data processing sequences. Thus, if properly applied, GPR produces high-resolution results crucial for sound engineering/architectural interventions aiming to restore and renovate Modern and Contemporary buildings, with (1) focus on the overall quality of the end result, (2) no damage inflicted to the existing structure, and (3) respect for the building's historical coherence and architectural elements and characteristics, that is, its Cultural Heritage value. Most of the findings and applications discussed in this work can be seen as an approximation to model studies, so that relevant information can be drawn from the different investigated situations. Therefore, owing to the nature and the range of the problems encountered in this case study, it is also expected that the presented GPR data and interpretation will provide important clues and guidance in the planning and investigation of similar projects and problems.
Identification of cost effective energy conservation measures
NASA Technical Reports Server (NTRS)
Bierenbaum, H. S.; Boggs, W. H.
1978-01-01
In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determining these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verification. Use of these methods resulted in the identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.
A synthetic seismicity model for the Middle America Trench
NASA Technical Reports Server (NTRS)
Ward, Steven N.
1991-01-01
A novel iterative technique, based on the concept of fault segmentation and computed using 2D static dislocation theory, for building models of seismicity and fault interaction which are physically acceptable and geometrically and kinematically correct, is presented. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first constructs generic models which randomly draw segment strengths and lengths from a 2D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.
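As a cartoon of the generic-model step (segments drawing random strengths, loading slowly, and transferring stress to neighbors on failure), the sketch below uses a simple nearest-neighbor coupling; the numbers and the coupling rule are illustrative assumptions, not the paper's 2D dislocation-theory calculation.

```python
# Cartoon of a segmented-fault seismicity model: segments load steadily,
# fail when stress exceeds a random strength, and dump part of their
# stress onto neighbors. Parameters and coupling are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n_seg = 20
strength = rng.uniform(1.0, 2.0, n_seg)   # random segment strengths
stress = rng.uniform(0.0, 1.0, n_seg)

for year in range(200):
    stress += 0.01                          # slow tectonic loading
    failed = np.where(stress > strength)[0]
    for i in failed:
        drop = stress[i]
        stress[i] = 0.0                     # segment ruptures
        for j in (i - 1, i + 1):            # crude static stress transfer
            if 0 <= j < n_seg:
                stress[j] += 0.3 * drop     # may trigger multi-segment events
    if len(failed) > 1:
        print(f"year {year}: multi-segment earthquake, {len(failed)} segments")
```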
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.
Improving stability of prediction models based on correlated omics data by using network approaches.
Tissier, Renaud; Houwing-Duistermaat, Jeanine; Rodríguez-Girondo, Mar
2018-01-01
Building prediction models based on complex omics datasets such as transcriptomics, proteomics, and metabolomics remains a challenge in bioinformatics and biostatistics. Regularized regression techniques are typically used to deal with the high dimensionality of these datasets. However, due to the presence of correlation in the datasets, it is difficult to select the best model, and application of these methods yields unstable results. We propose a novel strategy for model selection in which the obtained models also perform well in terms of overall predictability. Several three-step approaches are considered, where the steps are 1) network construction, 2) clustering to empirically derive modules or pathways, and 3) building a prediction model incorporating the information on the modules. For the first step, we use weighted correlation networks and Gaussian graphical modelling. Identification of groups of features is performed by hierarchical clustering. The grouping information is included in the prediction model by using group-based variable selection or group-specific penalization. We compare the performance of our new approaches with standard regularized regression via simulations. Based on these results, we provide recommendations for selecting a strategy for building a prediction model given the specific goal of the analysis and the sizes of the datasets. Finally, we illustrate the advantages of our approach by applying the methodology to two problems, namely prediction of body mass index in the DIetary, Lifestyle, and Genetic determinants of Obesity and Metabolic syndrome study (DILGOM) and prediction of the response of each breast cancer cell line to treatment with specific drugs using a breast cancer cell line pharmacogenomics dataset.
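A stripped-down version of the three-step strategy (correlation network, hierarchical clustering into modules, then a module-aware prediction model) is sketched below; summarizing each module by its mean and fitting a ridge regression is one simple variant of the group-based approaches described, not the authors' exact estimator.

```python
# Stripped-down three-step pipeline: (1) correlation network, (2)
# hierarchical clustering into modules, (3) prediction model on module
# summaries. Ridge-on-module-means is one simple group-aware variant.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))            # samples x omics features
y = X[:, :5].mean(axis=1) + 0.1 * rng.standard_normal(100)

# Steps 1-2: correlation-distance clustering of features into modules.
corr = np.corrcoef(X, rowvar=False)
dist = 1.0 - np.abs(corr)
modules = fcluster(linkage(dist[np.triu_indices(30, k=1)], method="average"),
                   t=5, criterion="maxclust")

# Step 3: summarize each module by its mean and fit a penalized model.
Z = np.column_stack([X[:, modules == m].mean(axis=1)
                     for m in np.unique(modules)])
model = Ridge(alpha=1.0).fit(Z, y)
print("in-sample R^2 on module summaries:", model.score(Z, y))
```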
Development of Composite PCMs by Incorporation of Paraffin into Various Building Materials
Memon, Shazim Ali; Liao, Wenyu; Yang, Shuqing; Cui, Hongzhi; Shah, Syed Farasat Ali
2015-01-01
In this research, we focused on the development of composite phase-change materials (CPCMs) by incorporation of a paraffin through vacuum impregnation in widely used building materials (Kaolin and ground granulated blast-furnace slag (GGBS)). The composite PCMs were characterized using environmental scanning electron microscopy (ESEM), Fourier transform infrared spectroscopy (FT-IR), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) techniques. Moreover, thermal performance of cement paste composite PCM panels was evaluated using a self-designed heating system. Test results showed that the maximum percentage of paraffin retained by Kaolin and GGBS was found to be 18% and 9%, respectively. FT-IR results show that CPCMs are chemically compatible. The phase-change temperatures of CPCMs were in the human comfort zone, and they possessed considerable latent-heat storage capacity. TGA results showed that CPCMs are thermally stable, and they did not show any sign of degradation below 150 °C. From thermal cycling tests, it was revealed that the CPCMs are thermally reliable. Thermal performance tests showed that in comparison to the control room model, the room models prepared with CPCMs reduced both the temperature fluctuations and maximum indoor center temperature. Therefore, the prepared CPCMs have some potential in reducing peak loads in buildings when applied to building facade. PMID:28787953
Numerical simulation of residual stress in laser based additive manufacturing process
NASA Astrophysics Data System (ADS)
Kalyan Panda, Bibhu; Sahoo, Seshadev
2018-03-01
Minimizing residual stress build-up in metal-based additive manufacturing plays a pivotal role in selecting a particular material and technique for making an industrial part. In beam-based additive manufacturing, although a great deal of effort has been made to minimize residual stresses, it is still elusive how to do so by simply optimizing processing parameters such as beam size, beam power, and scan speed. Among the different types of additive manufacturing processes, the Direct Metal Laser Sintering (DMLS) process uses a high-power laser to melt and sinter layers of metal powder. The rapid solidification and heat transfer in the powder bed produce a high cooling rate, which leads to the build-up of residual stresses that affect the mechanical properties of the built parts. In the present work, the authors develop a numerical thermo-mechanical model for the estimation of residual stress in AlSi10Mg build samples using the finite element method. The transient temperature distribution in the powder bed was assessed using a coupled thermal-structural model. Subsequently, the residual stresses were estimated for varying laser power. The simulation results show that the melt pool dimensions increase with increasing laser power, and that the magnitude of the residual stresses in the built part increases as well.
Yu, Hesheng; Thé, Jesse
2017-05-01
The dispersion of gaseous pollutants around buildings is complex due to turbulence features such as flow detachment and zones of high shear. Computational fluid dynamics (CFD) models are among the most promising tools to describe the pollutant distribution in the near field of buildings. Reynolds-averaged Navier-Stokes (RANS) models are the most commonly used CFD techniques to address turbulent transport of the pollutant. This research work studies the use of the k-ω closure model for gas dispersion around a building by fully resolving the viscous sublayer for the first time. The performance of the standard k-ε model is also included for comparison, along with results of an extensively validated Gaussian dispersion model, the U.S. Environmental Protection Agency (EPA) AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model). This study's CFD models apply the standard k-ε and the k-ω turbulence models to obtain the wind flow field. A passive concentration transport equation is then calculated based on the resolved flow field to simulate the distribution of pollutant concentrations. The resulting simulations of both the wind flow and concentration fields are validated rigorously against extensive data using multiple validation metrics. The wind flow field can be acceptably modeled by the k-ε model; however, the k-ε model fails to simulate the gas dispersion. The k-ω model outperforms k-ε in both flow and dispersion simulations, with higher hit rates for dimensionless velocity components and higher "factor of 2" of observations (FAC2) for normalized concentration. All these validation metrics of the k-ω model pass the quality assurance criteria recommended by The Association of German Engineers (Verein Deutscher Ingenieure, VDI) guideline. Furthermore, these metrics are better than or equal to those in the literature. Comparison between the performances of the k-ω model and AERMOD shows that the CFD simulation is superior to the Gaussian-type model for pollutant dispersion in the near wake of obstacles. AERMOD can serve as a screening tool for near-field gas dispersion owing to its expeditious calculation and its ability to handle complicated cases. The use of the k-ω model to simulate gaseous pollutant dispersion around an isolated building is appropriate and is expected to be suitable for complex urban environments. Multiple validation metrics of the k-ω turbulence model in CFD quantitatively indicated that this turbulence model is appropriate for the simulation of gas dispersion around buildings. CFD is, therefore, an attractive alternative to wind tunnel testing for modeling gas dispersion in urban environments due to its excellent performance and lower cost.
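The validation metrics named here, the hit rate and the "factor of 2" of observations (FAC2), are simple to compute; the sketch below follows commonly used definitions (the 25% relative and 0.01 absolute tolerances are assumptions in the spirit of the VDI guideline, not values quoted from the paper).

```python
# Common air-quality validation metrics used in the paper: FAC2 (fraction
# of predictions within a factor of 2 of observations) and hit rate
# (fraction within a relative or absolute tolerance; the 25% / 0.01
# tolerances below are typical guideline-style values, assumed here).
import numpy as np

def fac2(pred: np.ndarray, obs: np.ndarray) -> float:
    ratio = pred / obs
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

def hit_rate(pred: np.ndarray, obs: np.ndarray,
             rel: float = 0.25, abs_tol: float = 0.01) -> float:
    ok = (np.abs(pred - obs) <= rel * np.abs(obs)) | \
         (np.abs(pred - obs) <= abs_tol)
    return float(np.mean(ok))

obs = np.array([1.00, 0.50, 0.20, 0.10, 0.05])    # normalized concentrations
pred = np.array([0.90, 0.80, 0.25, 0.09, 0.02])
print("FAC2:", fac2(pred, obs), "hit rate:", hit_rate(pred, obs))
```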
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
Separation techniques for Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and find regularities in the Chinese medicine system, which is full of complicated data. This paper analyzed the applicability, key technology, basic modes, and common algorithms of computer chemistry applied to the separation of Chinese medicine, introduced the mathematical models and parameter-setting methods of extraction kinetics, investigated several problems based on traditional Chinese medicine membrane processing, and forecast the application prospects.
NASA Astrophysics Data System (ADS)
Singh, Vipul
2011-12-01
The green building movement has been an effective catalyst in reducing the energy demands of buildings, and a large number of 'green' certified buildings have been in operation for several years. Whether these buildings are actually performing as intended, and if not, identifying the specific causes of this discrepancy, falls into the general realm of post-occupancy evaluation (POE). POE involves evaluating building performance in terms of energy use, indoor environmental quality, acoustics and water use; the first aspect, i.e. energy use, is addressed in this thesis. Normally, a full year or more of energy-use and weather data is required to determine the actual post-occupancy energy use of a building. In many cases, either measured building performance data are not available, or the time and cost implications may not make it feasible to invest in monitoring the building for a whole year. Knowledge about the minimum amount of measured data needed to accurately capture the behavior of the building over the entire year can be immensely beneficial. This research identifies simple modeling techniques to determine the best time of the year to begin in-situ monitoring of building energy use, and the least amount of data required for generating acceptable long-term predictions. Four analysis procedures are studied. The short-term monitoring for long-term prediction (SMLP) approach and the dry-bulb temperature analysis (DBTA) approach allow the best time and duration of the year for in-situ monitoring to be determined based only on the ambient temperature data of the location. Multivariate change-point (MCP) modeling uses simulated/monitored data to determine the best monitoring period of the year, and is also used to validate the SMLP and DBTA approaches. Hybrid inverse modeling method-1 predicts energy use by combining a short dataset of monitored internal loads with a year of utility bills, and hybrid inverse method-2 predicts long-term building performance using utility bills only. The results obtained show that often less than three to four months of monitored data is adequate for estimating the annual building energy use, provided that the monitoring is initiated at the right time and that the seasonal as well as daily variations are adequately captured by the short dataset. The predictive accuracy of the short datasets is found to be strongly influenced by the closeness of the dataset's mean temperature to the annual average temperature. The analysis methods studied should be very useful for energy professionals involved in POE.
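In its simplest univariate form, the change-point modeling mentioned above reduces to a piecewise-linear fit of energy use against outdoor temperature; the brute-force sketch below recovers a single balance-point temperature and is a simplified stand-in for the thesis's multivariate MCP models.

```python
# Simplified univariate change-point energy model: daily energy use is
# flat below a balance-point temperature and rises linearly above it
# (cooling regime). Brute-force search over candidate change points.
import numpy as np

rng = np.random.default_rng(5)
temp = rng.uniform(0, 35, 365)                     # daily outdoor temp (C)
energy = 50 + 3.0 * np.maximum(temp - 18, 0) + rng.normal(0, 2, 365)

def fit_change_point(t, e):
    best = (None, np.inf)
    for cp in np.linspace(t.min() + 1, t.max() - 1, 200):
        X = np.column_stack([np.ones_like(t), np.maximum(t - cp, 0)])
        coef, res, *_ = np.linalg.lstsq(X, e, rcond=None)
        sse = res[0] if res.size else np.inf       # guard rank-deficient fits
        if sse < best[1]:
            best = ((cp, coef), sse)
    return best[0]

cp, (base, slope) = fit_change_point(temp, energy)
print(f"balance point ~{cp:.1f} C, base load ~{base:.1f}, slope ~{slope:.2f}")
```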
Building Personal and Social Competence through Cancer-Related Issues
ERIC Educational Resources Information Center
Donovan, Owen M.
2009-01-01
This article presents a teaching technique that aims to demonstrate pedagogy consistent with the characteristics of effective health education curricula that is student-centered, builds personal and social competence, and embeds assessment throughout the learning process. This teaching technique is appropriate for middle and high school students…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winlaw, Manda; De Sterck, Hans; Sanders, Geoffrey
In very simple terms, a network can be defined as a collection of points joined together by lines. Thus, networks can be used to represent connections between entities in a wide variety of fields including engineering, science, medicine, and sociology. Many large real-world networks share a surprising number of properties, leading to a strong interest in model development research, and techniques for building synthetic networks have been developed that capture these similarities and replicate real-world graphs. Modeling these real-world networks serves two purposes. First, building models that mimic the patterns and properties of real networks helps to understand the implications of these patterns and helps determine which patterns are important. If we develop a generative process to synthesize real networks, we can also examine which growth processes are plausible and which are not. Secondly, high-quality, large-scale network data is often not available, because of economic, legal, technological, or other obstacles [7]. Thus, there are many instances where the systems of interest cannot be represented by a single exemplar network. As one example, consider the field of cybersecurity, where systems require testing across diverse threat scenarios and validation across diverse network structures. In these cases, where there is no single exemplar network, the systems must instead be modeled as a collection of networks in which the variation among them may be just as important as their common features. By developing processes to build synthetic models, so-called graph generators, we can build synthetic networks that capture both the essential features of a system and realistic variability. Then we can use such synthetic graphs to perform tasks such as simulations, analysis, and decision making. We can also use synthetic graphs to performance-test graph analysis algorithms, including clustering algorithms and anomaly detection algorithms.
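As one concrete example of a graph generator of the kind discussed, the sketch below implements preferential attachment, a classic generative process that reproduces the heavy-tailed degree distributions seen in many real networks; it is chosen for illustration and is not a model named by the authors.

```python
# A classic graph generator: preferential attachment, in which each new
# node links to existing nodes with probability proportional to their
# degree. One illustrative generative process, not the authors' model.
import random
from collections import Counter

def preferential_attachment(n_nodes: int, m: int = 2, seed: int = 0):
    random.seed(seed)
    edges = []
    degree_bag = list(range(m))          # seed nodes, one 'ticket' each
    for new in range(m, n_nodes):
        chosen = set()
        while len(chosen) < m:           # m distinct attachment targets
            chosen.add(random.choice(degree_bag))
        for old in chosen:
            edges.append((new, old))
            degree_bag += [new, old]     # degree-proportional tickets
    return edges

edges = preferential_attachment(1000)
deg = Counter(u for e in edges for u in e)
print("max degree:", max(deg.values()),
      "median:", sorted(deg.values())[len(deg) // 2])
```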
Smith, Lee; Ucci, Marcella; Marmot, Alexi; Spinney, Richard; Laskowski, Marek; Sawyer, Alexia; Konstantatou, Marina; Hamer, Mark; Ambler, Gareth; Wardle, Jane; Fisher, Abigail
2013-11-12
Health benefits of regular participation in physical activity are well documented, but population levels are low. Office layout, and in particular the number and location of office building destinations (eg, print and meeting rooms), may influence both walking time and characteristics of sitting time. No research to date has focused on the role that the layout of the indoor office environment plays in facilitating or inhibiting step counts and characteristics of sitting time. The primary aim of this study was to investigate associations between office layout and physical activity, as well as sitting time, using objective measures. Active Buildings is a unique collaboration between public health, built environment and computer science researchers. The study involves objective monitoring complemented by a larger questionnaire arm. UK office buildings will be selected based on a variety of features, including office floor area and number of occupants. Questionnaires will include items on standard demographics, well-being, physical activity behaviour and putative socioecological correlates of workplace physical activity. Based on survey responses, approximately 30 participants will be recruited from each building into the objective monitoring arm. Participants will wear accelerometers (to monitor physical activity and sitting inside and outside the office) and a novel tracking device will be placed in the office (to record participant location) for five consecutive days. Data will be analysed using regression analyses, as well as novel agent-based modelling techniques. The results of this study will be disseminated through peer-reviewed publications and scientific presentations. Ethical approval was obtained through the University College London Research Ethics Committee (Reference number 4400/001).
Disposal of Kitchen Waste from High Rise Apartment
NASA Astrophysics Data System (ADS)
Ori, Kirki; Bharti, Ajay; Kumar, Sunil
2017-09-01
A high rise building has a number of floors and rooms housing a variety of users or tenants for residential purposes. Huge quantities of heterogeneous mixtures of domestic food waste are generated on every floor of a high rise residential building. Disposal of wet, biodegradable domestic kitchen waste from high rise buildings is more expensive in terms of collection and vertical transportation. This work addresses a technique to dispose of wet organic food waste from high rise or multistorey buildings at the point of generation, taking advantage of gravity and the vermicomposting technique. This innovative approach to the collection and disposal of wet organic solid waste from high rise apartments is more economical and hygienic than the present system of disposal.
Monitoring Building Deformation with InSAR: Experiments and Validation.
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-12-20
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved with the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
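A small sketch of the kind of comparison described: OLS regression of InSAR-derived deformation against leveling, plus an RMSE index. The deformation values below are placeholders, not the paper's measurements.

```python
import numpy as np

leveling = np.array([-3.1, -2.4, -1.8, -0.9, 0.2, 1.1])   # mm, taken as truth
insar = np.array([-2.9, -2.6, -1.6, -1.1, 0.4, 1.0])      # mm, PS-InSAR result

a, b = np.polyfit(leveling, insar, deg=1)                 # OLS: insar = a*leveling + b
r = np.corrcoef(leveling, insar)[0, 1]
rmse = np.sqrt(np.mean((insar - leveling) ** 2))
print("slope=%.2f intercept=%.2f mm r=%.3f RMSE=%.2f mm" % (a, b, r, rmse))
```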
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perkins, M P; Ong, M M; Crull, E W
2009-07-21
During lightning strikes, buildings and other structures can act as imperfect Faraday cages, enabling electromagnetic fields to develop inside the facilities. Some equipment stored inside these facilities may unfortunately act as antenna systems. It is important to have techniques for analyzing how much voltage, current, or energy dissipation may develop over valuable components. In this discussion we demonstrate the modeling techniques used to accurately analyze a generic missile-type weapons system as it goes through different stages of assembly. As work is performed on weapons systems, detonator cables can become exposed. These cables will form different monopole and loop type antenna systems that must be analyzed to determine the voltages developed over the detonator regions. Due to the low frequencies of lightning pulses, a lumped element circuit model can be developed to help analyze the different antenna configurations. We show an example of how numerical modeling can be used to develop the lumped element circuit models used to calculate voltage, current, or energy dissipated over the detonator region of a generic missile-type weapons system.
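An illustrative lumped-element sketch in the spirit of the analysis described, not the report's actual circuit: an exposed cable loop with assumed inductance and load resistance, inductively coupled (mutual inductance M) to a double-exponential lightning current. All component values are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

M, L, R = 1e-9, 2e-6, 50.0            # mutual inductance (H), loop L (H), load R (ohm); assumed
I0, alpha, beta = 30e3, 1.4e4, 6e6    # 30 kA double-exponential stroke parameters (1/s)

def di_dt(t):
    # Derivative of i(t) = I0 * (exp(-alpha*t) - exp(-beta*t)).
    return I0 * (-alpha * np.exp(-alpha * t) + beta * np.exp(-beta * t))

def loop(t, y):
    # Induced loop current: L di/dt + R i = M dI_lightning/dt.
    return [(M * di_dt(t) - R * y[0]) / L]

t = np.linspace(0.0, 200e-6, 4001)
sol = solve_ivp(loop, (t[0], t[-1]), [0.0], t_eval=t, max_step=5e-8)
v_load = R * sol.y[0]                 # voltage developed over the detonator region
print("peak voltage over load: %.1f V" % np.abs(v_load).max())
```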
Scalable Deployment of Advanced Building Energy Management Systems
2013-06-01
[Acronym list fragment] ... Building Automation and Control Network; BDAS: Building Data Acquisition System; BEM: building energy model; BIM: building information modeling; BMS: ... A prototype toolkit to seamlessly and automatically transfer a Building Information Model (BIM) to a Building Energy Model (BEM) has been ... circumvent the need to manually construct and maintain a detailed building energy simulation model. This detailed ...
A Comparison of Techniques for Optimal Infrastructure Restoration
2014-12-01
... to solve incremental network design problems. Álvarez et al. (2014) use a continuous MILP to solve the supply chain network infrastructure problem ... S. Long, T. Shoberg, S. Corns. 2014. A mathematical model for supply chain network infrastructure restoration. Y. Guan, H. Liao, eds., Proceedings ...
A Novel Solution-Technique Applied to a Novel WAAS Architecture
NASA Technical Reports Server (NTRS)
Bavuso, J.
1998-01-01
The Federal Aviation Administration has embarked on an historic task of modernizing and significantly improving the national air transportation system. One system that uses the Global Positioning System (GPS) to determine aircraft navigational information is called the Wide Area Augmentation System (WAAS). This paper describes a reliability assessment of one candidate system architecture for the WAAS. A unique aspect of this study concerns the modeling and solution of a candidate system that allows a novel cold-sparing scheme. The cold spare is a WAAS communications satellite that is fabricated and launched after a predetermined number of orbiting satellite failures have occurred and after some stochastic fabrication time transpires. Because these satellites are complex systems with redundant components, they exhibit an increasing failure rate with a Weibull time-to-failure distribution. Moreover, the cold spare satellite build-time is Weibull-distributed, and upon launch the spare is considered to be a good-as-new system, again with an increasing failure rate and a Weibull time-to-failure distribution. The reliability model for this system is non-Markovian because three distinct system clocks are required: the time to failure of the orbiting satellites, the build time for the cold spare, and the time to failure for the launched spare satellite. A powerful dynamic fault tree modeling notation and a Monte Carlo simulation technique with importance sampling are shown to arrive at a reliability prediction for a 10 year mission.
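A hedged sketch of the simulation concept (plain Monte Carlo only; the paper's importance sampling and dynamic fault tree notation are not reproduced): separate Weibull clocks for the orbiting satellites, the spare build time, and the launched spare, with illustrative parameters and an assumed two-failure build trigger.

```python
import numpy as np

rng = np.random.default_rng(0)
MISSION = 10.0                  # years
N_SATS, TRIGGER = 4, 2          # constellation size; failures that trigger the spare build
SHAPE, SCALE = 1.8, 12.0        # satellite Weibull TTF (shape > 1: increasing failure rate)
B_SHAPE, B_SCALE = 2.0, 1.5     # Weibull build time of the cold spare (years)

def reliability(n_trials=20_000, required=3):
    """Fraction of trials with at least `required` satellites up at mission end."""
    ok = 0
    for _ in range(n_trials):
        fail = np.sort(SCALE * rng.weibull(SHAPE, N_SATS))
        up = int((fail > MISSION).sum())
        if fail[TRIGGER - 1] < MISSION:            # build triggered in time
            launch = fail[TRIGGER - 1] + B_SCALE * rng.weibull(B_SHAPE)
            # Launched spare is good-as-new: a fresh Weibull clock starts at launch.
            if launch < MISSION and launch + SCALE * rng.weibull(SHAPE) > MISSION:
                up += 1
        ok += up >= required
    return ok / n_trials

print("estimated 10-year mission reliability: %.3f" % reliability())
```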
Tenailleau, Quentin M; Bernard, Nadine; Pujol, Sophie; Houot, Hélène; Joly, Daniel; Mauny, Frédéric
2015-01-01
Environmental epidemiological studies rely on the quantification of the exposure level in a surface defined as the subject's exposure area. For residential exposure, this area is often the subject's neighborhood. However, the variability of the size and nature of neighborhoods makes comparison of findings across studies difficult. This article examines the impact of the neighborhood definition on environmental noise exposure levels obtained from four commonly used sampling techniques: address point, façade, buffers, and official zoning. A high-definition noise model, built for a middle-sized French city, has been used to estimate LAeq,24 h exposure in the vicinity of 10,825 residential buildings. Twelve noise exposure indicators have been used to assess inhabitants' exposure. The influence of urban environmental factors was analyzed using multilevel modeling. When the sampled area increases, the average exposure increases (+3.9 dB), whereas the SD decreases (-1.6 dB) (P<0.01). Most of the indicators differ statistically. When comparing indicators from the 50-m and 400-m radius buffers, the assigned LAeq,24 h level varies across buildings from -9.4 to +22.3 dB. This variation is influenced by urban environmental characteristics (P<0.01). On the basis of this study's findings, sampling technique, neighborhood size, and environmental composition should be carefully considered in further exposure studies.
Global Dynamic Exposure and the OpenBuildingMap - Communicating Risk and Involving Communities
NASA Astrophysics Data System (ADS)
Schorlemmer, Danijel; Beutin, Thomas; Hirata, Naoshi; Hao, Ken; Wyss, Max; Cotton, Fabrice; Prehn, Karsten
2017-04-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find the balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for this task. More than 3.5 billion geographical nodes, more than 200 million building footprints (growing by 100,000 per day), and a plethora of information about schools, hospitals, and other critical facilities allow us to exploit this dataset for risk-related computations. We are combining the strengths of crowd-sourced data collection with the knowledge of experts in extracting the most information from these data. Besides relying on the very active OpenStreetMap community and the Humanitarian OpenStreetMap Team, which are collecting building information at a high pace, we are providing a tailored building capture tool for mobile devices. This tool facilitates simple and fast capture of building properties for OpenStreetMap by any person or interested community. With our OpenBuildingMap system, we are harvesting this dataset by processing every building in near-realtime. We are collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. The expert knowledge is needed to translate the simple building properties as captured by OpenStreetMap users into vulnerability and exposure indicators and subsequently into building classifications as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM) and the European Macroseismic Scale (EMS98). With this approach, we increase the resolution of existing exposure models from aggregated exposure information to building-by-building vulnerability. We report on our method, on the software development for the mobile application and the server-side analysis system, and on the OpenBuildingMap (www.openbuildingmap.org), our global Tile Map Service focusing on building properties. The free/open framework we provide can be used on commodity hardware for local to regional exposure capturing, for stakeholders in disaster management and mitigation for communicating risk, and for communities to understand their risk.
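A minimal sketch of harvesting building tags from OpenStreetMap through the public Overpass API. The endpoint, query language, and tag keys are real OSM conventions; the bounding box and the printed indicators are illustrative choices, not the OpenBuildingMap pipeline itself.

```python
import requests

query = """
[out:json][timeout:60];
way["building"](52.51,13.37,52.52,13.39);  // bbox: south,west,north,east
out geom;
"""
resp = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
resp.raise_for_status()

for way in resp.json()["elements"]:
    tags = way.get("tags", {})
    # building:levels and building:material are implicitly provided indicators
    # of the kind the abstract describes translating into vulnerability classes.
    print(way["id"], tags.get("building"), tags.get("building:levels"),
          tags.get("building:material"))
```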
Technical Aspects for the Creation of a Multi-Dimensional Land Information System
NASA Astrophysics Data System (ADS)
Ioannidis, Charalabos; Potsiou, Chryssy; Soile, Sofia; Verykokou, Styliani; Mourafetis, George; Doulamis, Nikolaos
2016-06-01
The complexity of modern urban environments and civil demands for fast, reliable and affordable decision-making require not only a 3D Land Information System, which tends to replace traditional 2D LIS architectures, but also addressing the time and scale parameters, that is, the 3D geometry of buildings at various time instances (4th dimension) and at various levels of detail (LoDs; 5th dimension). This paper describes and proposes solutions for technical aspects that need to be addressed for the 5D modelling pipeline. Such solutions include the creation of a 3D model, the application of a selective modelling procedure between various time instances and at various LoDs, enriched with cadastral and other spatial data, and a procedural modelling approach for the representation of the inner parts of the buildings. The methodology is based on automatic change detection algorithms for spatial-temporal analysis of the changes that took place in subsequent time periods, using dense image matching and structure from motion algorithms. The selective modelling approach allows detailed modelling only for the areas where spatial changes are detected. The procedural modelling techniques use programming languages for the textual semantic description of a building; they require the modeller to describe its part-to-whole relationships. Finally, a 5D viewer is developed in order to tackle existing limitations that accompany the use of global systems, such as Google Earth or Google Maps, as visualization software. An application of the proposed methodology in an urban area is presented, and it provides satisfactory results.
Copeland, Holly E.; Doherty, Kevin E.; Naugle, David E.; Pocewicz, Amy; Kiesecker, Joseph M.
2009-01-01
Background Many studies have quantified the indirect effect of hydrocarbon-based economies on climate change and biodiversity, concluding that a significant proportion of species will be threatened with extinction. However, few studies have measured the direct effect of new energy production infrastructure on species persistence. Methodology/Principal Findings We propose a systematic way to forecast patterns of future energy development and calculate impacts to species using spatially-explicit predictive modeling techniques to estimate oil and gas potential and create development build-out scenarios by seeding the landscape with oil and gas wells based on underlying potential. We illustrate our approach for the greater sage-grouse (Centrocercus urophasianus) in the western US and translate the build-out scenarios into estimated impacts on sage-grouse. We project that future oil and gas development will cause a 7–19 percent decline from 2007 sage-grouse lek population counts and impact 3.7 million ha of sagebrush shrublands and 1.1 million ha of grasslands in the study area. Conclusions/Significance Maps of where oil and gas development is anticipated in the US Intermountain West can be used by decision-makers intent on minimizing impacts to sage-grouse. This analysis also provides a general framework for using predictive models and build-out scenarios to anticipate impacts to species. These predictive models and build-out scenarios allow tradeoffs to be considered between species conservation and energy development prior to implementation. PMID:19826472
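A conceptual sketch of the build-out idea: seed wells onto a gridded landscape with probability proportional to modeled potential, then tally intersected habitat cells. The grid, potential surface, and counts are hypothetical, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
potential = rng.gamma(shape=0.5, scale=1.0, size=(100, 100))  # modeled O&G potential
habitat = rng.random((100, 100)) < 0.3                        # sagebrush cells (hypothetical)

n_wells = 500
p = (potential / potential.sum()).ravel()
cells = rng.choice(potential.size, size=n_wells, replace=False, p=p)
wells = np.zeros(potential.size, dtype=bool)
wells[cells] = True
wells = wells.reshape(potential.shape)

impacted = int((wells & habitat).sum())
print("%d of %d habitat cells receive a well (%.1f%%)"
      % (impacted, habitat.sum(), 100.0 * impacted / habitat.sum()))
```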
Dolev, Danny; Függer, Matthias; Posch, Markus; Schmid, Ulrich; Steininger, Andreas; Lenzen, Christoph
2014-01-01
We present the first implementation of a distributed clock generation scheme for Systems-on-Chip that recovers from an unbounded number of arbitrary transient faults despite a large number of arbitrary permanent faults. We devise self-stabilizing hardware building blocks and a hybrid synchronous/asynchronous state machine enabling metastability-free transitions of the algorithm's states. We provide a comprehensive modeling approach that permits to prove, given correctness of the constructed low-level building blocks, the high-level properties of the synchronization algorithm (which have been established in a more abstract model). We believe this approach to be of interest in its own right, since this is the first technique permitting to mathematically verify, at manageable complexity, high-level properties of a fault-prone system in terms of its very basic components. We evaluate a prototype implementation, which has been designed in VHDL, using the Petrify tool in conjunction with some extensions, and synthesized for an Altera Cyclone FPGA. PMID:26516290
Object-oriented design and programming in medical decision support.
Heathfield, H; Armstrong, J; Kirkham, N
1991-12-01
The concept of object-oriented design and programming has recently received a great deal of attention from the software engineering community. This paper highlights the realisable benefits of using the object-oriented approach in the design and development of clinical decision support systems. These systems seek to build a computational model of some problem domain and therefore tend to be exploratory in nature. Conventional procedural design techniques do not support either the process of model building or rapid prototyping. The central concepts of the object-oriented paradigm are introduced, namely encapsulation, inheritance and polymorphism, and their use illustrated in a case study, taken from the domain of breast histopathology. In particular, the dual roles of inheritance in object-oriented programming are examined, i.e., inheritance as a conceptual modelling tool and inheritance as a code reuse mechanism. It is argued that the use of the former is not entirely intuitive and may be difficult to incorporate into the design process. However, inheritance as a means of optimising code reuse offers substantial technical benefits.
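A toy sketch of the three concepts in a decision-support flavour, with invented class names: encapsulation (hidden scoring state), inheritance (a shared Finding base class), and polymorphism (one assess() interface over different finding types).

```python
from abc import ABC, abstractmethod

class Finding(ABC):
    """Base class: encapsulates a name and declares a common interface."""
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def assess(self):
        ...

class NuclearPleomorphism(Finding):
    def __init__(self, grade):
        super().__init__("nuclear pleomorphism")   # inheritance: reuse base init
        self._grade = grade                        # encapsulated state

    def assess(self):
        return "suspicious" if self._grade >= 2 else "benign-looking"

class MitoticCount(Finding):
    def __init__(self, per_10_hpf):
        super().__init__("mitotic count")
        self._count = per_10_hpf

    def assess(self):
        return "suspicious" if self._count > 10 else "benign-looking"

# Polymorphism: the caller iterates over findings without knowing their subtypes.
for f in [NuclearPleomorphism(3), MitoticCount(4)]:
    print(f.name, "->", f.assess())
```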
Design of evaporative-cooling roof for decreasing air temperatures in buildings in the humid tropics
NASA Astrophysics Data System (ADS)
Kindangen, Jefrey I.; Umboh, Markus K.
2017-03-01
This study assesses the benefits of an evaporative-cooling roof, particularly for buildings with corrugated zinc roofs. In Manado, many buildings are roofed with corrugated zinc sheets because the material is practical, easy to apply, and economical. In general, to achieve thermal comfort in buildings in a humid tropical climate, people apply cross ventilation to cool the air in the room and avoid overheating. Cross ventilation is a very popular way to achieve thermal comfort; yet there are other techniques that can reduce the problem of excessively high indoor temperatures. This study focuses on applications of the evaporative-cooling roof. Water was sprayed on the surface of the ceiling of a test cell, and the water was reused after being sprayed and cooled again by a heat exchanger. Initial results indicate a reliable design that successfully meets the target of an effective evaporative-cooling roof technique. The automatic water spraying and cooling-water installations worked optimally and can serve as a model for roof cooling as one of the green technologies. The heat exchanger lowered the temperature of the water returning from the heated ceiling surface by an average of 0.77 °C. The mass flow rate of the cooling water is approximately 1.106 kg/h and the rate of heat flow is around 515 W, depending on the site.
NASA Astrophysics Data System (ADS)
Pueyo-Anchuela, Ó.; Casas-Sainz, A. M.; Soriano, M. A.; Pocoví-Juan, A.
Complex geological shallow subsurface environments pose a significant challenge for urban and building projects. The geological features of the Central Ebro Basin, with sharp lateral changes in Quaternary deposits, alluvial karst phenomena and anthropic activity, can preclude the characterization of future urban areas from isolated geomechanical tests or from incorrectly dimensioned geophysical surveys alone. This complexity is analyzed here in two different test fields: (i) one linked to flat-bottomed valleys with an irregular distribution of Quaternary deposits related to sharp lateral facies changes and an irregular preconsolidated substratum position, and (ii) a second with similar complexities in the alluvial deposits and karst activity linked to solution of the underlying evaporite substratum. The results show that different geophysical techniques yield similar geological models in the first case (flat-bottomed valleys), whereas only the application of several geophysical techniques permits correct evaluation of the geological model complexities in the second case (alluvial karst). In this second case, the geological and surface information make it possible to refine the sensitivity of the applied geophysical techniques to different indicators of karst activity. In both cases, 3D models are needed to correctly distinguish alluvial lateral sedimentary changes from superimposed karst activity.
Validation of Computational Models in Biomechanics
Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.
2010-01-01
The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648
Davis, Ben; Grosvenor, Chriss; Johnk, Robert; Novotny, David; Baker-Jarvis, James; Janezic, Michael
2007-01-01
Building materials are often incorporated into complex, multilayer macrostructures that are simply not amenable to measurements using coax or waveguide sample holders. In response to this, we developed an ultra-wideband (UWB) free-field measurement system. This measurement system uses a ground-plane-based setup and two TEM half-horn antennas to transmit and receive the RF signal. The material samples are placed between the antennas, and reflection and transmission measurements are made. Digital signal processing techniques are then applied to minimize environmental and systematic effects. The processed data are compared to a plane-wave model to extract the material properties with optimization software based on genetic algorithms.
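A hedged sketch of the extraction step: fit a slab's relative permittivity so that a plane-wave transmission model matches measured magnitude data, here using SciPy's differential evolution as a stand-in for the genetic-algorithm software mentioned above. The geometry, frequency sweep, and "measured" data are synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution

c, d = 3e8, 0.02                      # speed of light (m/s), 2 cm slab (assumed)
freqs = np.linspace(1e9, 6e9, 60)     # UWB sweep (assumed)

def slab_T(eps_r, f):
    """Plane-wave transmission through a lossless dielectric slab, normal incidence."""
    n = np.sqrt(eps_r)
    r = (1.0 - n) / (1.0 + n)
    phase = np.exp(-1j * 2 * np.pi * f / c * n * d)
    return (1.0 - r**2) * phase / (1.0 - r**2 * phase**2)

true_eps = 4.3
rng = np.random.default_rng(0)
measured = np.abs(slab_T(true_eps, freqs)) + rng.normal(0.0, 0.005, freqs.size)

def misfit(p):
    return np.sum((np.abs(slab_T(p[0], freqs)) - measured) ** 2)

result = differential_evolution(misfit, bounds=[(1.0, 10.0)], seed=1)
print("recovered eps_r = %.2f" % result.x[0])
```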
2000-12-15
NASA is looking to biological techniques that are millions of years old to help it develop new materials and technologies for the 21st century. Sponsored by NASA, Jeffrey Brinker of the University of New Mexico is studying how multiple elements can assemble themselves into a composite material that is clear, tough, and impermeable. His research is based on the model of how an abalone builds the nacre, also called mother-of-pearl, inside its shell. The mollusk layers bricks of calcium carbonate (the main ingredient in classroom chalk) and mortar of biopolymer to form a new material (top and bottom left) that is twice as hard and 1,000 times as tough as either of the original building materials.
NASA Astrophysics Data System (ADS)
Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore
2016-08-01
In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology concerns (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; and (3) the GIS-based mapping of road closure due to seismic-related building collapses, based on the characteristic building height and the width of the road. Experimental results, gathered for the Cosenza urban area, demonstrate the benefits of both the proposed approach and the GIS-based integration of multi-platform remote sensing sensors and techniques for seismic road assessment purposes.
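A minimal sketch of a road-closure criterion of the kind described: assume a collapsed building sheds debris over a strip proportional to its height, and flag the road as blocked when the strip spans the road width. The proportionality factor and sample data are assumptions, not the paper's model.

```python
def road_blocked(building_height_m, road_width_m, debris_factor=0.5):
    """True if the debris strip (factor * height) covers the full road width."""
    return debris_factor * building_height_m >= road_width_m

# (building height from LiDAR, road width from the geo-database); values invented.
segments = [(21.0, 8.0), (9.0, 12.0), (15.0, 6.5)]
for h, w in segments:
    print("h=%.0f m, road=%.1f m -> blocked: %s" % (h, w, road_blocked(h, w)))
```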
Transactive Control of Commercial Building HVAC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corbin, Charles D.; Makhmalbaf, Atefe; Huang, Sen
This document details the development and testing of market-based transactive controls for building heating, ventilating and air conditioning (HVAC) systems. These controls are intended to serve the purposes of reducing electricity use through conservation, reducing peak building electric demand, and providing demand flexibility to assist with power system operations. This report is the summary of the first year of work conducted under Phase 1 of the Clean Energy and Transactive Campus Project. The methods and techniques described here were first investigated in simulation, and then subsequently deployed to a physical testbed on the Pacific Northwest National Laboratory (PNNL) campus for validation. In this report, we describe the models and control algorithms we have developed, testing of the control algorithms in simulation, and deployment to a physical testbed. Results from physical experiments support previous simulation findings, and provide insights for further improvement.
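A hedged sketch of a generic market-based transactive thermostat, in the general spirit of such controllers rather than PNNL's actual algorithm: the device bids a price that rises as the zone warms past its desired temperature, and the cleared market price maps back to a cooling set-point.

```python
def bid_price(t_zone, t_desired, t_max, p_mean, p_dev, k=1.0):
    """Bid rises above the mean market price as the zone warms past the desired temp."""
    return p_mean + k * p_dev * (t_zone - t_desired) / (t_max - t_desired)

def setpoint_from_price(p_clear, t_desired, t_max, p_mean, p_dev, k=1.0):
    """Inverse mapping: a high cleared price relaxes the cooling set-point."""
    t = t_desired + (p_clear - p_mean) * (t_max - t_desired) / (k * p_dev)
    return min(max(t, t_desired), t_max)

p_mean, p_dev = 30.0, 10.0   # recent market price mean and deviation, $/MWh (assumed)
print(bid_price(23.5, 22.0, 25.0, p_mean, p_dev))            # this zone's bid
print(setpoint_from_price(42.0, 22.0, 25.0, p_mean, p_dev))  # set-point after clearing
```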
Fusion of monocular cues to detect man-made structures in aerial imagery
NASA Technical Reports Server (NTRS)
Shufelt, Jefferey; Mckeown, David M.
1991-01-01
The extraction of buildings from aerial imagery is a complex problem for automated computer vision. It requires locating regions in a scene that possess properties distinguishing them as man-made objects as opposed to naturally occurring terrain features. It is reasonable to assume that no single detection method can correctly delineate or verify buildings in every scene. A cooperative-methods paradigm is useful in approaching the building extraction problem. Using this paradigm, each extraction technique provides information which can be added or assimilated into an overall interpretation of the scene. Thus, the main objective is to explore the development of a computer vision system that integrates the results of various scene analysis techniques into an accurate and robust interpretation of the underlying three-dimensional scene. The problem of building hypothesis fusion in aerial imagery is discussed. Building extraction techniques are briefly surveyed, including four building extraction, verification, and clustering systems. A method for fusing the symbolic data generated by these systems is described, and applied to monocular image and stereo image data sets. Evaluation methods for the fusion results are described, and the fusion results are analyzed using these methods.
Layout Slam with Model Based Loop Closure for 3d Indoor Corridor Reconstruction
NASA Astrophysics Data System (ADS)
Baligh Jahromi, A.; Sohn, G.; Jung, J.; Shahbazi, M.; Kang, J.
2018-05-01
In this paper, we extend a recently proposed visual Simultaneous Localization and Mapping (SLAM) technique, known as Layout SLAM, to make it robust against error accumulation, abrupt changes of camera orientation and mis-association of newly visited parts of the scene to previously visited landmarks. To do so, we present a novel technique of loop closing based on layout model matching; i.e., both model information (topology and geometry of reconstructed models) and image information (photometric features) are used to address loop-closure detection. The advantages of using the layout-related information in the proposed loop-closing technique are twofold. First, it imposes a metric constraint on global map consistency and, thus, adjusts mapping scale drifts. Second, it can reduce matching ambiguity in the context of indoor corridors, where the scene is homogeneously textured and extracting a sufficient number of distinguishable point features is challenging. To test the impact of the proposed technique on the performance of Layout SLAM, we performed experiments on wide-angle videos captured by a handheld camera. This dataset was collected from the indoor corridors of a building at York University. The obtained results demonstrate that the proposed method successfully detects the instances of loops while producing very limited trajectory errors.
Size reduction techniques for vital compliant VHDL simulation models
Rich, Marvin J.; Misra, Ashutosh
2006-08-01
A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise-time and the fall-time of that instance. This process is repeated for every delay value in the standard delay file that corresponds to every instance of every logic gate in the logic model. The system then outputs a reduced-size standard delay file containing the super generics for every instance of every logic gate in the logic model.
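A conceptual sketch of the reduction, assuming worst-case aggregation (the description above does not specify the aggregation rule): collect every rise/fall delay recorded for an instance and emit one "super generic" pair per instance. The tuples below stand in for the output of a real SDF parser.

```python
from collections import defaultdict

# (instance, rise_ns, fall_ns) tuples, as a parser might yield them from an SDF file.
delays = [
    ("top/u1/and2_0", 0.12, 0.10),
    ("top/u1/and2_0", 0.15, 0.11),   # another timing arc, same instance
    ("top/u2/inv_3",  0.05, 0.07),
]

super_generics = defaultdict(lambda: [0.0, 0.0])
for inst, rise, fall in delays:
    sg = super_generics[inst]
    sg[0] = max(sg[0], rise)   # one worst-case rise-time per instance
    sg[1] = max(sg[1], fall)   # one worst-case fall-time per instance

# The reduced "SDF" now carries one entry per instance instead of one per arc.
for inst, (rise, fall) in sorted(super_generics.items()):
    print("%s: rise=%.2f ns fall=%.2f ns" % (inst, rise, fall))
```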
Wab-InSAR: a new wavelet based InSAR time series technique applied to volcanic and tectonic areas
NASA Astrophysics Data System (ADS)
Walter, T. R.; Shirzaei, M.; Nankali, H.; Roustaei, M.
2009-12-01
Modern geodetic techniques such as InSAR and GPS provide valuable observations of the deformation field. Because of a variety of environmental interferences (e.g., atmosphere, topography distortion) and the incompleteness of the models (e.g., the assumption of a linear deformation model), these observations are usually tainted by various systematic and random errors. We therefore develop and test new methods to identify and filter unwanted periodic or episodic artifacts in order to obtain accurate and precise deformation measurements. Here we present and implement a new wavelet-based InSAR (Wab-InSAR) time series approach. Because wavelets are excellent tools for identifying hidden patterns and capturing transient signals, we utilize wavelet functions to reduce the effects of atmospheric delay and digital elevation model inaccuracies. Wab-InSAR is a model-free technique, reducing digital elevation model errors in individual interferograms using a 2D spatial Legendre polynomial wavelet filter. Atmospheric delays are reduced using a 3D spatio-temporal wavelet transform algorithm and a novel technique for pixel selection. We apply Wab-InSAR to several targets, including volcano deformation processes on Hawaii Island and mountain building processes in Iran. Both targets are chosen to investigate large and small amplitude signals, variable and complex topography, and atmospheric effects. In this presentation we explain the different steps of the technique, validate the results by comparison to other high resolution processing methods (GPS, PS-InSAR, SBAS), and discuss the geophysical results.
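An illustrative sketch of wavelet-based filtering with PyWavelets: decompose a noisy deformation patch, soft-threshold the detail coefficients, and reconstruct. The thresholding rule and synthetic data are assumptions for illustration; this is not the Wab-InSAR algorithm itself.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
signal = 5.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)   # smooth "deformation"
noisy = signal + rng.normal(0.0, 0.8, signal.shape)                # plus "atmosphere"

coeffs = pywt.wavedec2(noisy, "db4", level=3)
sigma = 0.8                                            # assumed noise level
threshold = sigma * np.sqrt(2.0 * np.log(noisy.size))  # universal threshold
filtered = [coeffs[0]] + [
    tuple(pywt.threshold(c, threshold, mode="soft") for c in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(filtered, "db4")
print("rms error before: %.2f, after: %.2f"
      % (np.std(noisy - signal), np.std(denoised - signal)))
```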
Merrill, Jacqueline A; Deegan, Michael; Wilson, Rosalind V; Kaushal, Rainu; Fredericks, Kimberly
2013-01-01
Objective To evaluate the complex dynamics involved in implementing electronic health information exchange (HIE) for public health reporting at a state health department, and to identify policy implications to inform similar implementations. Materials and methods Qualitative data were collected over 8 months from seven experts at New York State Department of Health who implemented web services and protocols for querying, receipt, and validation of electronic data supplied by regional health information organizations. Extensive project documentation was also collected. During group meetings experts described the implementation process and created reference modes and causal diagrams that the evaluation team used to build a preliminary model. System dynamics modeling techniques were applied iteratively to build causal loop diagrams representing the implementation. The diagrams were validated iteratively by individual experts followed by group review online, and through confirmatory review of documents and artifacts. Results Three causal loop diagrams captured well-recognized system dynamics: Sliding Goals, Project Rework, and Maturity of Resources. The findings were associated with specific policies that address funding, leadership, ensuring expertise, planning for rework, communication, and timeline management. Discussion This evaluation illustrates the value of a qualitative approach to system dynamics modeling. As a tool for strategic thinking on complicated and intense processes, qualitative models can be produced with fewer resources than a full simulation, yet still provide insights that are timely and relevant. Conclusions System dynamics techniques clarified endogenous and exogenous factors at play in a highly complex technology implementation, which may inform other states engaged in implementing HIE supported by federal Health Information Technology for Economic and Clinical Health (HITECH) legislation. PMID:23292910
Casale, M; Oliveri, P; Casolino, C; Sinelli, N; Zunin, P; Armanino, C; Forina, M; Lanteri, S
2012-01-27
An authentication study of the Italian PDO (protected designation of origin) extra virgin olive oil Chianti Classico was performed; UV-visible (UV-vis), Near-Infrared (NIR) and Mid-Infrared (MIR) spectroscopies were applied to a set of samples representative of the whole Chianti Classico production area. The non-selective signals (fingerprints) provided by the three spectroscopic techniques were utilised both individually and jointly, after fusion of the respective profile vectors, in order to build a model for the Chianti Classico PDO olive oil. Moreover, these results were compared with those obtained by the gas chromatographic determination of the fatty acid composition. In order to characterise the olive oils produced in the Chianti Classico PDO area, UNEQ (unequal class models) and SIMCA (soft independent modelling of class analogy) were employed on the MIR, NIR and UV-vis spectra, both individually and jointly, and on the fatty acid composition. Finally, PLS (partial least squares) regression was applied to the UV-vis, NIR and MIR spectra in order to predict the content of oleic and linoleic acids in the extra virgin olive oils. UNEQ, SIMCA and PLS were performed after selection of the relevant predictors, in order to increase the efficiency of both the classification and regression models. The non-selective information obtained from UV-vis, NIR and MIR spectroscopy allowed reliable models to be built for checking the authenticity of the Italian PDO extra virgin olive oil Chianti Classico. Copyright © 2011 Elsevier B.V. All rights reserved.
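A small sketch of the regression step using scikit-learn's PLSRegression on synthetic spectra, as a stand-in for the paper's UV-vis/NIR/MIR data and chemometrics software.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 60, 300
spectra = rng.normal(size=(n_samples, n_wavelengths))   # stand-in spectra
# Synthetic "oleic acid content" driven by two spectral bands plus noise.
oleic = 2.0 * spectra[:, 40] + spectra[:, 150] + rng.normal(0.0, 0.2, n_samples)

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, oleic, cv=5, scoring="r2")
print("cross-validated R^2: %.2f +/- %.2f" % (r2.mean(), r2.std()))
```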
NASA Astrophysics Data System (ADS)
Sharkawi, K.-H.; Abdul-Rahman, A.
2013-09-01
Cities and urban-area entities such as building structures are becoming more complex as modern human civilizations continue to evolve. The ability to plan and manage every territory, especially urban areas, is very important to every government in the world. Planning and managing cities and urban areas based on printed maps and 2D data is becoming insufficient and inefficient to cope with the complexity of new developments in big cities. The emergence of 3D city models has boosted efficiency in analysing and managing urban areas, as 3D data are proven to represent real world objects more accurately. They have since been adopted as the new trend in building and urban management and planning applications. Nowadays, many countries around the world have been generating virtual 3D representations of their major cities. The growing interest in improving the usability of 3D city models has resulted in the development of various tools for analysis based on the 3D city models. Today, 3D city models are generated for various purposes such as tourism, location-based services, disaster management and urban planning. Meanwhile, modelling 3D objects is getting easier with the emergence of user-friendly 3D modelling tools available in the market. Generating 3D buildings with high accuracy has also become easier with the availability of airborne Lidar and terrestrial laser scanning equipment. The availability of and accessibility to this technology make it more sensible to analyse buildings in urban areas using 3D data, as they accurately represent real world objects. The Open Geospatial Consortium (OGC) has accepted the CityGML specification as one of the international standards for representing and exchanging spatial data, making it easier to visualize, store and manage 3D city model data efficiently. CityGML is able to represent the semantics, geometry, topology and appearance of 3D city models in five well-defined Levels-of-Detail (LoD), namely LoD0 to LoD4. The accuracy and structural complexity of the 3D objects increase with the LoD level, where LoD0 is the simplest LoD (2.5D; Digital Terrain Model (DTM) + building or roof print) while LoD4 is the most complex LoD (architectural details with interior structures). Semantic information is one of the main components in CityGML and 3D city models, and provides important information for any analyses. However, more often than not, the semantic information is not available for the 3D city model due to unstandardized modelling processes. One example is where a building is generated as a single object (without specific feature layers such as Roof, Ground floor, Level 1, Level 2, Block A, Block B, etc). This research attempts to develop a method to improve the semantic data updating process by segmenting the 3D building into simpler parts, which will make it easier for users to select and update the semantic information. The methodology is implemented for 3D buildings in LoD2, where the buildings are generated without architectural details but with distinct roof structures. This paper also introduces a hybrid semantic-geometric 3D segmentation method that deals with hierarchical segmentation of a 3D building based on its semantic value and surface characteristics, fitted by one of the predefined primitives.
For future work, the segmentation method will be implemented as part of a change detection module that can detect any changes to the 3D buildings, store and retrieve semantic information of the changed structure, automatically update the 3D models, and visualize the results in a user-friendly graphical user interface (GUI).
Challenges in Laser Sintering of Melt-Processable Thermoset Imide Resin
NASA Technical Reports Server (NTRS)
Chuang, Kathy C.; Gornet, Timothy; Koerner, Hilmar
2016-01-01
Polymer Laser Sintering (LS) is an additive manufacturing technique that builds 3D models layer by layer using a laser to selectively melt cross sections in powdered polymeric materials, following sequential slices of the CAD model. LS generally uses thermoplastic polymeric powders, such as polyamides (i.e. Nylon), and the resultant 3D objects are often weaker in their strength compared to traditionally processed materials, due to the lack of polymer inter-chain connection in the z-direction. The objective of this project is to investigate the possibility of printing a melt-processable RTM370 imide resin powder terminated with reactive phenylethynyl groups by LS, followed by a postcure in order to promote additional crosslinking to achieve higher temperature (250-300 C) capability. A preliminary study to build tensile specimens by LS and the corresponding DSC and rheology study of RTM370 during LS process is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colin, P.; Nicoletis, S.; Froidevaux, R.
1996-12-31
A case study is presented of building a map showing the probability that the concentration of polycyclic aromatic hydrocarbons (PAH) exceeds a critical threshold. This assessment is based on existing PAH sample data (direct information) and on an electrical resistivity survey (indirect information). Simulated annealing is used to build a model of the range of possible values for PAH concentrations and of the bivariate relationship between PAH concentrations and electrical resistivity. The geostatistical technique of simple indicator kriging is then used, together with the probabilistic model, to infer, at each node of a grid, the range of possible values which the PAH concentration can take. The risk map is then extracted from this characterization of the local uncertainty. The difference between this risk map and a traditional iso-concentration map is then discussed in terms of decision-making.
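A conceptual sketch of the indicator idea with PyKrige: code each sample as 1 where PAH exceeds the threshold, krige the indicators, and read the resulting surface as an exceedance-probability map. Sample data and variogram choice are illustrative; the paper's simulated-annealing bivariate model and its use of simple (rather than ordinary) indicator kriging are not reproduced here.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(3)
x, y = rng.uniform(0, 100, 40), rng.uniform(0, 100, 40)   # sample locations (m)
pah = rng.lognormal(mean=1.0, sigma=0.8, size=40)         # synthetic PAH, mg/kg
indicator = (pah > 5.0).astype(float)                     # 1 = above critical threshold

ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
gridx = gridy = np.linspace(0.0, 100.0, 50)
prob, _ = ok.execute("grid", gridx, gridy)                # kriged indicator ~ P(exceedance)
print("max mapped exceedance probability: %.2f" % float(np.clip(prob, 0, 1).max()))
```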
Artificial Immune System Approaches for Aerospace Applications
NASA Technical Reports Server (NTRS)
KrishnaKumar, Kalmanje; Koga, Dennis (Technical Monitor)
2002-01-01
Artificial Immune Systems (AIS) combine a priori knowledge with the adapting capabilities of the biological immune system to provide a powerful alternative to currently available techniques for pattern recognition, modeling, design, and control. Immunology is the science of built-in defense mechanisms that are present in all living beings to protect against external attacks. A biological immune system can be thought of as a robust, adaptive system that is capable of dealing with an enormous variety of disturbances and uncertainties. Biological immune systems use a finite number of discrete "building blocks" to achieve this adaptiveness. These building blocks can be thought of as pieces of a puzzle which must be put together in a specific way to neutralize, remove, or destroy each unique disturbance the system encounters. In this paper, we outline AIS models that are immediately applicable to aerospace problems and identify application areas that need further investigation.
Empirical Evaluation of Hunk Metrics as Bug Predictors
NASA Astrophysics Data System (ADS)
Ferzund, Javed; Ahsan, Syed Nadeem; Wotawa, Franz
Reducing the number of bugs is a crucial issue during software development and maintenance. Software process and product metrics are good indicators of software complexity. These metrics have been used to build bug predictor models that help developers maintain the quality of software. In this paper we empirically evaluate the use of hunk metrics as predictors of bugs. We present a technique for bug prediction that works at the smallest units of code change, called hunks. We build bug prediction models using random forests, an efficient machine learning classifier. Hunk metrics are used to train the classifier, and each hunk metric is evaluated for its bug prediction capabilities. Our classifier can classify individual hunks as buggy or bug-free with 86% accuracy, 83% buggy hunk precision and 77% buggy hunk recall. We find that history-based and change-level hunk metrics are better predictors of bugs than code-level hunk metrics.
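A sketch of the classification setup with scikit-learn: a random forest trained on per-hunk metrics, reporting accuracy and buggy-hunk precision/recall. The feature definitions and labels are synthetic placeholders for the paper's hunk metrics, not its dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
X = np.column_stack([
    rng.poisson(5, n),         # lines added in hunk (code-level metric)
    rng.poisson(3, n),         # lines deleted in hunk (code-level metric)
    rng.integers(1, 50, n),    # prior changes to the file (history metric)
])
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 3, n) > 15).astype(int)  # buggy label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy=%.2f precision=%.2f recall=%.2f"
      % (accuracy_score(y_te, pred), precision_score(y_te, pred),
         recall_score(y_te, pred)))
print("feature importances:", clf.feature_importances_.round(2))
```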
Shaking Table Tests Validating Two Strengthening Interventions on Masonry Buildings
NASA Astrophysics Data System (ADS)
De Canio, Gerardo; Muscolino, Giuseppe; Palmeri, Alessandro; Poggi, Massimo; Clemente, Paolo
2008-07-01
Masonry buildings quite often constitute a precious cultural heritage for our cities. So that future generations can enjoy this heritage, effective protection projects should be developed against all the anthropic and natural actions which may irreparably damage old masonry buildings. However, strengthening interventions on these constructions have to respect their authenticity, without altering the original conception, not only functionally and aesthetically of course, but also statically. These issues are of central interest in the Messina area, where the seismic protection of new and existing constructions is a primary demand. It is well known, in fact, that the city of Messina lies in a highly seismic zone, and has been subjected to two destructive earthquakes in slightly more than one century: the 1783 Calabria earthquake and the more famous 1908 Messina-Reggio Calabria earthquake. It follows that retrofitting projects on buildings which survived these two events should be designed with the aim of saving the lives of occupants while operating with "light" techniques, i.e. respecting the original structural scheme. On the other hand, recent earthquakes, and in particular the 1997 Umbria-Marche sequence, unequivocally demonstrated that some of the most popular retrofitting interventions adopted in the second half of the last century are absolutely ineffective, or even unsafe. Over those years, in fact, a number of "heavy" techniques proliferated, and old masonry buildings consequently suffered, among others, the substitution of existing timber slabs with more ponderous concrete slabs and/or the insertion of RC and steel members coupled with the original masonry elements (walls, arches, vaults). As a result, these buildings have been transformed by unwise engineers into hybrid structures, having a mixed behaviour (which frequently proved to be unpredictable) between those of historic masonry and new members. Starting from these considerations, a numerical and experimental research program has been carried out, aimed at validating two different strengthening interventions on masonry buildings: (i) the substitution of the existing roof with timber-concrete composite slabs, which are able to improve the dynamic behaviour of the structure without excessively increasing the mass, and (ii) the reinforcement of masonry walls with FRP materials, which increases both the stiffness and strength of the construction. The experimental tests have been performed on a 1:2 scale model of a masonry building resembling a special type, the so-called "tipo misto messinese", which is characteristic of the reconstruction of the city of Messina after the 1783 Calabria earthquake. The model, incorporating a novel timber-concrete composite slab, has been tested on the main shaking table available at the ENEA Research Centre "Casaccia," both before and after the reinforcement with FRP materials. Some aspects related to the definition of the model and to the selection of an appropriate seismic input are discussed, and numerical results confirming the effectiveness of the interventions mentioned above are presented.
CAD/CAM silicone simulator for teaching cheiloplasty: description of the technique.
Zheng, Y; Lu, B; Zhang, J; Wu, G
2015-02-01
Techniques of virtual simulation have been used to teach junior surgeons how to do a cheiloplasty, but still do not meet the trainees' demands. We describe a CAD/CAM silicone simulator, which we made using several maxillofacial prosthetic techniques. An optical scanning system was used to collect the data about the cleft lip. Reverse engineering software was then used to build the virtual model, and this was processed in wax by machine. The definitive simulator was made with prosthetic silicone and extrinsic colourants. The surgical trainees practised the basic skills of cheiloplasty on the simulator, and proved its worth. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Temperature Mapping of 3D Printed Polymer Plates: Experimental and Numerical Study
Kousiatza, Charoula; Chatzidai, Nikoleta; Karalekas, Dimitris
2017-01-01
In Fused Deposition Modeling (FDM), a common thermoplastic Additive Manufacturing (AM) method, the polymer model material, in the form of a flexible filament, is heated above its glass transition temperature (Tg) to a semi-molten state in the head's liquefier. The heated material is extruded in a rastering configuration onto the building platform, where it rapidly cools and solidifies with the adjoining material. The heating and rapid cooling cycles that the work materials undergo during the FDM process provoke non-uniform thermal gradients and cause stress build-up, which in turn results in part distortions, dimensional inaccuracy and even possible part fabrication failure. With the aim of optimizing the FDM technique by eliminating such undesirable effects, real-time monitoring is essential for evaluating and controlling the final parts' quality. The present work investigates the temperature distributions developed during the FDM building process of multilayered thin plates, and on this basis a numerical study is also presented. Temperature changes were recorded by embedding temperature measuring sensors at various locations in the middle plane of the printed structures. The experimental results, mapping the temperature variations within the samples, were compared to the corresponding ones obtained by finite element modeling, exhibiting good correlation. PMID:28245557
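A simplified 1D explicit finite-difference sketch of a freshly deposited layer cooling toward the chamber temperature, with generic polymer-like constants; this illustrates the thermal-gradient mechanism described above, not the paper's finite element model.

```python
import numpy as np

alpha = 1.0e-7          # thermal diffusivity, m^2/s (generic polymer, assumed)
dx, dt = 1.0e-4, 0.02   # grid spacing (m), time step (s); dt < dx**2 / (2 * alpha)
n, steps = 50, 2000     # 5 mm column of material, 40 s of cooling

T = np.full(n, 80.0)    # previously printed material near chamber temperature (C)
T[-5:] = 230.0          # newly extruded layer, above Tg
T_chamber = 80.0

for _ in range(steps):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0] = T_chamber    # platform-side boundary held at chamber temperature
    T[-1] = T[-2]       # crude insulated condition at the free surface

print("new-layer mean temperature after %.0f s: %.1f C" % (steps * dt, T[-5:].mean()))
```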
Applying Evidence-Based Medicine in Telehealth: An Interactive Pattern Recognition Approximation
Fernández-Llatas, Carlos; Meneu, Teresa; Traver, Vicente; Benedi, José-Miguel
2013-01-01
Born in the early nineteen-nineties, evidence-based medicine (EBM) is a paradigm intended to promote the integration of biomedical evidence into physicians' daily practice. This paradigm requires the continuous study of diseases to provide the best scientific knowledge for closely supporting physicians in their diagnoses and treatments. Within this paradigm, health experts usually create and publish clinical guidelines, which provide holistic guidance for the care of a certain disease. The creation of these clinical guidelines requires demanding iterative processes, in which each iteration represents scientific progress in the knowledge of the disease. To perform this guidance through telehealth, formal clinical guidelines allow the building of care processes that can be interpreted and executed directly by computers. In addition, the formalization of clinical guidelines makes it possible to build automatic methods, using pattern recognition techniques, to estimate the proper models, as well as the mathematical models for optimizing the iterative cycle for the continuous improvement of the guidelines. However, to ensure the efficiency of the system, it is necessary to build a probabilistic model of the problem. In this paper, an interactive pattern recognition approach to support professionals in evidence-based medicine is formalized. PMID:24185841
SWARM : a scientific workflow for supporting Bayesian approaches to improve metabolic models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, X.; Stevens, R.; Mathematics and Computer Science
2008-01-01
With the exponential growth of complete genome sequences, the analysis of these sequences is becoming a powerful approach to building genome-scale metabolic models. These models can be used to study individual molecular components and their relationships, and eventually to study cells as systems. However, constructing genome-scale metabolic models manually is time-consuming and labor-intensive. As a result, far fewer genome-scale metabolic models are available compared to the hundreds of available genome sequences. To tackle this problem, we designed SWARM, a scientific workflow that can be utilized to improve genome-scale metabolic models in a high-throughput fashion. SWARM deals with a range of issues including the integration of data across distributed resources, data format conversions, data updates, and data provenance. Put together, SWARM streamlines the whole modeling process: extracting data from various resources, deriving training datasets to train a set of predictors, applying Bayesian techniques to assemble the predictors, inferring on the ensemble of predictors to insert missing data, and eventually improving draft metabolic networks automatically. By enhancing metabolic model construction, SWARM enables scientists to generate many genome-scale metabolic models within a short period of time and with less effort.
Inductive System Health Monitoring
NASA Technical Reports Server (NTRS)
Iverson, David L.
2004-01-01
The Inductive Monitoring System (IMS) software was developed to provide a technique for automatically producing health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or that require computer models too complex to use for real-time monitoring. IMS uses nominal data sets, collected either directly from the system or from simulations, to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. IMS is able to monitor the system by comparing real-time operational data with these classes. We present a description of the learning and monitoring methods used by IMS and summarize some recent IMS results.
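The IMS idea, learn classes of nominal behavior from archived data and flag departures from them, can be illustrated with a minimal clustering sketch. The use of k-means and a max-distance threshold below is an assumption for illustration, not the published IMS algorithm, and the two "sensor channels" are synthetic:

```python
import numpy as np
from sklearn.cluster import KMeans

# Train: characterize nominal behavior as cluster regions in parameter space.
rng = np.random.default_rng(0)
nominal = rng.normal(loc=[20.0, 0.5], scale=[1.0, 0.05], size=(500, 2))  # e.g. temp, current
model = KMeans(n_clusters=4, n_init=10, random_state=0).fit(nominal)

# Threshold: the largest distance of any nominal point to its nearest cluster center.
d_nominal = np.min(model.transform(nominal), axis=1)
threshold = d_nominal.max()

def is_anomalous(sample):
    """Flag an observation whose distance to every nominal class exceeds the threshold."""
    d = np.min(model.transform(np.atleast_2d(sample)), axis=1)[0]
    return d > threshold

print(is_anomalous([20.3, 0.52]))  # nominal-like  -> False
print(is_anomalous([35.0, 1.50]))  # off-nominal   -> True
```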
3D printing with polymers: Challenges among expanding options and opportunities.
Stansbury, Jeffrey W; Idacavage, Mike J
2016-01-01
Additive manufacturing, more colloquially referred to as 3D printing, is quickly approaching mainstream adoption as a highly flexible processing technique that can be applied to plastic, metal, ceramic, concrete and other building materials. However, taking advantage of the tremendous versatility associated with in situ photopolymerization as well as the ability to select from a variety of preformed processible polymers, 3D printing predominantly targets the production of polymeric parts and models. The goal of this review is to connect the various additive manufacturing techniques with the monomeric and polymeric materials they use while highlighting emerging material-based developments. Modern additive manufacturing technology was introduced approximately three decades ago; this review compiles recent peer-reviewed literature reports to demonstrate the evolution underway with respect to the various building techniques, which differ significantly in approach, as well as the new variations in polymer-based materials being employed. Recent growth of 3D printing has been dramatic, and the ability of the various platform technologies to expand from rapidly produced prototype models to the higher-volume, readily customizable production of working parts is critical for continued high growth rates. This transition to working part production is highly dependent on adapting materials that deliver not only the requisite design accuracy but also the physical and mechanical properties necessary for the application. With the weighty distinction of being called the next industrial revolution, 3D printing technologies are already altering many industrial and academic operations, including changing models for future healthcare delivery in medicine and dentistry. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gomes, Marcos Sebastião de Paula; Isnard, André Augusto; Pinto, José Maurício do Carmo
The article discusses an experimental investigation of turbulent dispersion processes in a typical three-dimensional urban geometry, in reduced scale, in neutrally stable conditions. Wind tunnel experiments were carried out to characterize the flow and the dispersion of a pollutant around a scaled model (1:400) of a group of eight 10-floor buildings surrounding a square. The situation corresponded to the dispersion of fine inertialess particles released from a line source positioned upstream of the urban geometry. After the sudden interruption of the source generation, the particles persisted in the recirculation cavity between the buildings, with the concentration decaying exponentially with time. This is in accordance with previous works on the dispersion process around bluff bodies of different shapes [e.g., Humphries and Vincent, 1976. An experimental investigation of the detention of airborne smoke in the wake bubble behind a disk. Journal of Fluid Mechanics 73, 453-464; Vincent, 1977. Model experiments on the nature of air pollution transport near buildings. Atmospheric Environment 11, 765-774; Fackrell, 1984. Parameters characterizing dispersion in the near wake of buildings. Journal of Wind Engineering and Industrial Aerodynamics 16, 97-118]. The main parameter in the investigation was the characteristic time constant for the concentration decay. The measurements of the variation in the concentration of the fine particles were performed by means of a photo-detection technique based on the attenuation of light. The velocity fields were evaluated with the particle image velocimetry (PIV) technique. The dimensionless residence time H for the particles (H = τU/L, where τ is the time constant for the concentration decay, U is the free-stream velocity, and L is a characteristic dimension of the urban geometry), as defined by Humphries and Vincent [1976], was determined for various locations in the scaled model, in the range of Reynolds numbers (Re) between 8000 and 64,000. H was found to be 6.5±1.0.
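With those definitions, extracting H from a measured decay is a two-step computation: fit the exponential tail to recover the time constant τ, then form H = τU/L. A minimal sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

# Synthetic decay record after source cut-off: C(t) = C0 * exp(-t / tau)
tau_true, U, L = 0.65, 5.0, 0.5          # s, m/s, m (illustrative values only)
t = np.linspace(0.0, 3.0, 200)
C = 2.0 * np.exp(-t / tau_true)

# tau from a linear fit of ln(C) against t: slope = -1/tau
slope, _ = np.polyfit(t, np.log(C), 1)
tau = -1.0 / slope

H = tau * U / L                           # dimensionless residence time
print(f"tau = {tau:.3f} s, H = {H:.1f}")  # H ~ 6.5, the form reported in the abstract
```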
BUILDING ENVELOPE OPTIMIZATION USING EMERGY ANALYSIS
Energy analysis is an integral component of sustainable building practices. Energy analysis coupled with optimization techniques may offer solutions for greater energy efficiency over the lifetime of the building. However, all such computations employ the energy used for operation...
BEopt - Building Energy Optimization. NREL - National Renewable Energy Laboratory. The BEopt (Building Energy Optimization) software provides capabilities to evaluate residential building designs and identify... The sequential search optimization technique used by BEopt finds minimum-cost building designs at different...
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.; Polly, B.; Collis, J.
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
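Of the four methods, the output ratio approach is the simplest to state: scale the model output so that it matches the utility data in aggregate. The sketch below is a minimal illustration of that idea with made-up monthly bills, not the BEopt/DOE-2.2 implementation:

```python
import numpy as np

# Monthly electricity use: synthetic utility bills vs. uncalibrated model output (kWh).
billed  = np.array([900, 850, 800, 700, 650, 950, 1200, 1250, 1000, 750, 700, 880])
modeled = np.array([800, 780, 730, 640, 600, 870, 1100, 1150,  920, 690, 640, 800])

ratio = billed.sum() / modeled.sum()      # a single scalar calibration factor
calibrated = modeled * ratio

rmse_before = np.sqrt(np.mean((modeled - billed) ** 2))
rmse_after  = np.sqrt(np.mean((calibrated - billed) ** 2))
print(f"ratio={ratio:.3f}, RMSE before={rmse_before:.1f} kWh, after={rmse_after:.1f} kWh")
```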
NASA Astrophysics Data System (ADS)
De Vos, P. J.
2017-08-01
Since the new millennium, living in historic cities has become extremely popular in the Netherlands. As a consequence, historic environments are being adapted to meet modern living standards. Houses are constantly subjected to development, restoration and renovation. Although most projects are carried out with great care and strive to preserve and respect as much historic material as possible, a significant amount of historical fabric nevertheless disappears. This puts enormous pressure on building archaeologists, who struggle to rapidly and accurately capture authentic material and historical evidence in situ in the midst of construction works. In Leiden, a medieval city that flourished during the seventeenth century and today counts over 3,000 listed monuments, a solution to the problem has been found in the implementation of advanced recording techniques. Since 2014, building archaeologists of the city council have experienced first-hand that new recording techniques, such as laser scanning and photogrammetry, dramatically decrease the time spent on site for documentation, time they now use to uncover, analyse and interpret the recovered historical data. Nevertheless, within building archaeology education, a strong case is made for hand drawing as a method for understanding a building, emphasising the importance of close observation and physical contact with the subject. In this paper, the use of advanced recording techniques in building archaeology is advocated, confronting traditional educational theory with practice, and research tradition with the rapid rise of new recording technologies.
NASA Astrophysics Data System (ADS)
Jain, A.
2017-08-01
Computer-based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, thereby saving time as well as cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing and storing models of complex molecular structures that can help to interpret structure-activity relationships. The use of various molecular mechanics and dynamics techniques and software in computer-aided drug design, along with statistical analysis, is a powerful tool for medicinal chemists seeking to synthesize effective therapeutic drugs with minimum side effects.
Updates on measurements and modeling techniques for expendable countermeasures
NASA Astrophysics Data System (ADS)
Gignilliat, Robert; Tepfer, Kathleen; Wilson, Rebekah F.; Taczak, Thomas M.
2016-10-01
The potential threat of recently-advertised anti-ship missiles has instigated research at the United States (US) Naval Research Laboratory (NRL) into the improvement of measurement techniques for visual band countermeasures. The goal of measurements is the collection of radiometric imagery for use in the building and validation of digital models of expendable countermeasures. This paper will present an overview of measurement requirements unique to the visual band and differences between visual band and infrared (IR) band measurements. A review of the metrics used to characterize signatures in the visible band will be presented and contrasted to those commonly used in IR band measurements. For example, the visual band measurements require higher fidelity characterization of the background, including improved high-transmittance measurements and better characterization of solar conditions to correlate results more closely with changes in the environment. The range of relevant engagement angles has also been expanded to include higher altitude measurements of targets and countermeasures. In addition to the discussion of measurement techniques, a top-level qualitative summary of modeling approaches will be presented. No quantitative results or data will be presented.
Monitoring Building Deformation with InSAR: Experiments and Validation
Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng
2016-01-01
Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR for building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403
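The validation reported here, an OLS regression of InSAR-derived deformation against leveling plus an RMSE check, reduces to a few lines. The sketch below uses synthetic millimetre-scale values, not the Tianjin measurements:

```python
import numpy as np

leveling = np.array([-3.1, -2.4, -1.8, -0.9, 0.2, 1.1, 1.9, 2.6])   # mm, "truth"
insar    = np.array([-2.9, -2.6, -1.6, -1.1, 0.1, 1.3, 1.7, 2.8])   # mm, PS-InSAR

# OLS fit insar = a * leveling + b, plus correlation and RMSE against leveling.
a, b = np.polyfit(leveling, insar, 1)
r = np.corrcoef(leveling, insar)[0, 1]
rmse = np.sqrt(np.mean((insar - leveling) ** 2))
print(f"slope={a:.2f}, intercept={b:.2f} mm, r={r:.3f}, RMSE={rmse:.2f} mm")
```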
Active Player Modeling in the Iterated Prisoner's Dilemma
Park, Hyunsoo; Kim, Kyung-Joong
2016-01-01
The iterated prisoner's dilemma (IPD) is well known within the domain of game theory. Although it is relatively simple, it can also elucidate important problems related to cooperation and trust. Generally, players can predict their opponents' actions when they are able to build a precise model of their behavior based on their game playing experience. However, it is difficult to make such predictions based on a limited number of games. The creation of a precise model requires the use of not only an appropriate learning algorithm and framework but also a good dataset. Active learning approaches have recently been introduced to machine learning communities. The approach can usually produce informative datasets with relatively little effort. Therefore, we have proposed an active modeling technique to predict the behavior of IPD players. The proposed method can model the opponent player's behavior while taking advantage of interactive game environments. This experiment used twelve representative types of players as opponents, and an observer used an active modeling algorithm to model these opponents. This observer actively collected data and modeled the opponent's behavior online. Most of our data showed that the observer was able to build, through direct actions, a more accurate model of an opponent's behavior than when the data were collected through random actions. PMID:26989405
The monocular visual imaging technology model applied in the airport surface surveillance
NASA Astrophysics Data System (ADS)
Qin, Zhe; Wang, Jian; Huang, Chao
2013-08-01
At present, civil aviation airports use surface surveillance radar monitoring and positioning systems to monitor aircraft, vehicles and other moving objects. Surface surveillance radars can cover most airport scenes, but because of the geometry of terminals, covered bridges and other buildings, surface surveillance radar systems inevitably have some small blind spots. This paper presents a monocular vision imaging technology model for airport surface surveillance, enabling the perception of moving objects in the scene, such as aircraft, vehicles and personnel, and their locations. This new model provides an important complement to airport surface surveillance, different from traditional surface surveillance radar techniques. Such a technique not only provides a clear picture of object activities for the ATC, but also provides image recognition and positioning of moving targets in the area. It can thereby improve the efficiency of airport operations and help avoid conflicts between aircraft and vehicles. This paper first introduces the monocular visual imaging technology model applied to airport surface surveillance and then presents a measurement accuracy analysis of the model. The monocular visual imaging technology model is simple, low cost, and highly efficient. It is an advanced monitoring technique which can cover the blind-spot areas of surface surveillance radar monitoring and positioning systems.
Simulation of a Petri net-based model of the terpenoid biosynthesis pathway.
Hawari, Aliah Hazmah; Mohamed-Hussein, Zeti-Azura
2010-02-09
The development and simulation of dynamic models of terpenoid biosynthesis has yielded a systems perspective that provides new insights into how the structure of this biochemical pathway affects compound synthesis. These insights may eventually help identify reactions that could be experimentally manipulated to amplify terpenoid production. In this study, a dynamic model of the terpenoid biosynthesis pathway was constructed based on the Hybrid Functional Petri Net (HFPN) technique. This technique is a fusion of three other extended Petri net techniques, namely the Hybrid Petri Net (HPN), the Hybrid Dynamic Net (HDN) and the Functional Petri Net (FPN). The biological data needed to construct the terpenoid metabolic model were gathered from the literature and from biological databases. These data were used as building blocks to create an HFPNe model and to generate parameters that govern the global behaviour of the model. The dynamic model was simulated and validated against known experimental data obtained from extensive literature searches. The model successfully simulated metabolite concentration changes over Petri net time (pt), and the observations correlated with known data. Interactions between the intermediates that affect the production of terpenes could be observed through the introduction of inhibitors that established feedback loops within, and crosstalk between, the pathways. Although this metabolic model is only preliminary, it provides a platform for analysing various high-throughput data, and it should lead to a more holistic understanding of terpenoid biosynthesis.
Zhang, Hui-Rong; Yin, Le-Feng; Liu, Yan-Li; Yan, Li-Yi; Wang, Ning; Liu, Gang; An, Xiao-Li; Liu, Bin
2018-04-01
The aim of this study is to build a digital dental model from cone beam computed tomography (CBCT), to fabricate a physical model via 3D printing, and to determine the accuracy of the 3D-printed dental model by comparing it with a traditional dental cast. CBCT scans of orthodontic patients were obtained to build digital dental models using Mimics 10.01 and Geomagic Studio software. The models were fabricated via the fused deposition modeling (FDM) technique. The 3D-printed models were compared with the traditional cast models using a Vernier caliper. The measurements used for comparison included the width of each tooth, the length and width of the maxillary and mandibular arches, and the length of the posterior dental crest. The 3D-printed models showed high accuracy compared with the traditional cast models: the paired t-test of all data showed no statistically significant difference between the two groups (P>0.05). Digital dental models built from CBCT enable digital storage of patients' dental condition. The virtual dental model fabricated via 3D printing avoids traditional impressions and simplifies the clinical examination process. The 3D-printed dental models produced via FDM show a high degree of accuracy and are thus appropriate for clinical practice.
Optimal Control Inventory Stochastic With Production Deteriorating
NASA Astrophysics Data System (ADS)
Affandi, Pardi
2018-01-01
In this paper, we use an optimal control approach to determine the optimal production rate. Most inventory production models deal with a single item. We first build the stochastic mathematical inventory model; in this model we also assume that the items are kept in the same store. The mathematical model of the inventory problem can be deterministic or stochastic. This research discusses how to formulate the stochastic model and how to solve the inventory model using optimal control techniques. The main tool for deriving the necessary optimality conditions is the Pontryagin maximum principle, which involves the Hamiltonian function. We thereby obtain the optimal production rate in a production inventory system where items are subject to deterioration.
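As a hedged sketch of the machinery invoked here (the notation is assumed, since the abstract gives none): with inventory level I(t), production rate p(t) as the control, demand d(t), deterioration rate θ, and running cost f, a deterministic skeleton of the problem reads

```latex
% Skeleton problem (notation assumed): I = inventory, p = production rate (control),
% d = demand, \theta = deterioration rate, f = running cost.
\dot I(t) = p(t) - d(t) - \theta I(t), \qquad
\min_{p(\cdot)} \; J = \int_0^T f\bigl(I(t), p(t)\bigr)\, dt .
% Hamiltonian and Pontryagin necessary conditions:
H(I, p, \lambda) = f(I, p) + \lambda \bigl(p - d - \theta I\bigr), \qquad
\frac{\partial H}{\partial p} = 0, \qquad
\dot\lambda = -\frac{\partial H}{\partial I}.
```

Solving the stationarity condition for p in terms of the costate λ, and integrating the state and costate equations, yields the optimal production rate; the stochastic version replaces the state equation with a stochastic differential equation.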
Managing Variation in Services in a Software Product Line Context
2010-05-01
Oriented Domain Analysis ( FODA ) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Software Engineering Institute, Carnegie Mellon University, 1990...the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and...systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and
Building a Virtual Model of a Baleen Whale: Phase 2
2012-09-30
the most promising technique for discovering acoustic pathways and assessing potential effects from any particular sound, involves finite element...Service, California State Fish and Game, to film and television companies, and the Monterey Bay Aquarium. He has an excellent reputation and decades of...of him for the BBC. Realistically, we can only put our operations into effect during good weather because we need to accomplish an
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazzolani, Federico M.
2008-07-08
The seismic protection of historical and monumental buildings, dating from the ancient age up to the 20th century, is being looked at with greater and greater interest, above all in the Euro-Mediterranean area, its cultural heritage being strongly susceptible to severe damage or even collapse due to earthquakes. The cultural importance of historical and monumental constructions limits, in many cases, the possibility of upgrading them from the seismic point of view, due to the fear of using intervention techniques which could have detrimental effects on their cultural value. Consequently, great interest is growing in the development of sustainable methodologies for the use of Reversible Mixed Technologies (RMTs) in the seismic protection of existing constructions. RMTs, in fact, are conceived to exploit the peculiarities of innovative materials and special devices, and they allow ease of removal when necessary. This paper deals with the experimental and numerical studies, framed within the EC PROHITECH research project, on the application of RMTs to the historical and monumental constructions mainly belonging to the cultural heritage of the Euro-Mediterranean area. The experimental tests and the numerical analyses are carried out at five different levels, namely full scale models, large scale models, sub-systems, devices, and materials and elements.
Precast concrete unit assessment through GPR survey and FDTD modelling
NASA Astrophysics Data System (ADS)
Campo, Davide
2017-04-01
Precast concrete elements are widely used in United Kingdom house building, offering ease of assembly and added value in structural integrity and sound and thermal insulation; common concrete components include walls, beams, floors, panels, lintels, stairs, etc. Failure to respect the manufacturer's instructions during assembly, however, may induce cracking and short- or long-term loss of bearing capacity. GPR is a well-established non-destructive technique employed in the assessment of structural elements because of its real-time imaging, rapid data collection, and ability to discriminate the finest structural details. In this work, GPR has been used to investigate two different precast elements: precast reinforced concrete planks constituting the roof slab of a school, and precast wood-cement blocks with pre-fitted insulation material, used to build a perimeter wall of a private building. Visible cracks affected both constructions. For the assessment surveys, a GSSI 2.0 GHz GPR antenna was used because of the high resolution required and the small size of the antenna case (155 by 90 by 105 mm), which enables scanning up to 45 mm from any obstruction. Finite Difference Time Domain (FDTD) numerical modelling was also performed to build a scenario of the expected GPR signal response, both for preliminary real-time interpretation and to help resolve uncertainties due to complex reflection patterns: simulated radargrams were built using Reflex Software v. 8.2, reproducing the GPR pulse used for the surveys in terms of wavelet, nominal frequency, sample frequency and time window. Model geometries were derived from the design projects available for both the planks and the blocks; the electromagnetic properties of the materials (concrete, reinforcing bars, air-filled void, insulation and wooden concrete) were inferred both from values reported in the literature and from a preliminary interpretation of radargrams in which internal layer interfaces were clearly recognizable and unambiguously interpretable. Comparison of simulated and real radargrams demonstrated that, in both cases, the manufacturer's instructions had not been fully respected, and confirmed GPR as a fast and effective structural assessment technique, with FDTD modelling serving as a data interpretation and validation method when complex reflection patterns are observed. The GPR findings will then be used to guide the intrusive coring necessary to evaluate the compressive strength of the concrete and, in synergy with the intrusive survey results, to properly plan corrective actions to ensure the stability of the structures and guarantee their usability.
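The role of FDTD here is to predict the reflection pattern a given layered geometry should produce. In one dimension the method reduces to a leapfrog update of electric and magnetic fields; the sketch below is a generic normalized 1D scheme with an assumed dielectric layer standing in for concrete (boundaries are left reflective for brevity), not the Reflex model of the precast units:

```python
import numpy as np

# 1D FDTD: Ez/Hy leapfrog update over a grid containing a concrete-like layer.
nz, nt = 400, 800
eps_r = np.ones(nz)
eps_r[200:260] = 6.0           # assumed relative permittivity of a concrete layer

ez = np.zeros(nz)
hy = np.zeros(nz)
courant = 0.5                  # Courant number (c*dt/dz), < 1 for stability

record = []
for n in range(nt):
    hy[:-1] += courant * (ez[1:] - ez[:-1])               # H half-step update
    ez[1:]  += courant / eps_r[1:] * (hy[1:] - hy[:-1])   # E half-step update
    ez[50]  += np.exp(-((n - 60) / 15.0) ** 2)            # Gaussian pulse source
    record.append(ez[60])      # "antenna" trace: direct pulse plus layer reflection

trace = np.array(record)
print("strongest late reflection at step:", int(np.argmax(np.abs(trace[200:])) + 200))
```

The arrival time of the layer reflection in the recorded trace is what gets compared against the measured radargram.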
NASA Astrophysics Data System (ADS)
Shohei, N.; Nakamura, H.; Fujiwara, H.; Naoichi, M.; Hiromitsu, T.
2017-12-01
It is important for authorities to obtain schematic information about the damage situation immediately after an earthquake, for investigation and decision-making, using photographs shot from airplanes. In the case of the 2016 Kumamoto earthquake, we acquired more than 1,800 orthographic projection photographs covering the damaged areas. These photos were taken between April 16th and 19th from airplanes; we then classified the damage to all buildings into 4 levels and organized the results as approximately 296,000 GIS records corresponding to the fundamental geospatial data published by the Geospatial Information Authority of Japan. These data were organized through the effort of hundreds of engineers. However, human effort alone is not considered practical for more extensive disasters such as the Nankai Trough earthquake. We have therefore been developing an automatic damage identification method utilizing image recognition and machine learning techniques. First, we extracted training data for more than 10,000 buildings with damage levels divided evenly among the 4 grades. With these training data, we raster-scan across the scanning range of each entire image, clipping patch images that represent each damage level. Using these patch images, we have been developing discriminant models in two ways. One is a model using a Support Vector Machine (SVM): extract a feature vector from each patch image; with these vectors, compute histogram densities following the Bag of Visual Words (BoVW) method; then classify the boundaries between damage grades with the SVM. The other is a model using a multi-layered neural network: design the network, input patch images and damage levels based on visual judgement, and optimize the learning parameters with the error backpropagation method. Using both discriminant models, we will discriminate the damage level of each patch and create images that show building damage situations. This should enable prompter and more widespread damage detection than visual judgement. Acknowledgment: This work was supported by CSTI through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural disasters" (funding agency: JST).
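A minimal sketch of the SVM branch described above, with k-means visual words over local patch descriptors and a linear SVM over the BoVW histograms. Random data stands in for the aerial patches, and the crude per-cell descriptor is an assumption for illustration, not the authors' feature extractor:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def local_descriptors(patch, cell=8):
    """Crude stand-in for a local feature: per-cell mean/std over the patch."""
    h, w = patch.shape
    cells = [patch[i:i+cell, j:j+cell] for i in range(0, h, cell) for j in range(0, w, cell)]
    return np.array([[c.mean(), c.std()] for c in cells])

# Synthetic 32x32 patches for 4 damage grades (0-3), brighter with damage.
patches = [rng.normal(g * 0.2, 0.1, (32, 32)) for g in range(4) for _ in range(40)]
labels = np.repeat(np.arange(4), 40)

# Build the visual vocabulary over all local descriptors.
all_desc = np.vstack([local_descriptors(p) for p in patches])
vocab = KMeans(n_clusters=16, n_init=10, random_state=0).fit(all_desc)

def bovw_histogram(patch):
    """Histogram density of visual-word assignments (the BoVW representation)."""
    words = vocab.predict(local_descriptors(patch))
    return np.bincount(words, minlength=16) / len(words)

X = np.array([bovw_histogram(p) for p in patches])
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```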
Results of Outdoor to Indoor Propagation Measurements from 5-32GHz
NASA Technical Reports Server (NTRS)
Houts, Jacquelynne R.; McDonough, Ryan S.
2016-01-01
The demand for wireless services has increased exponentially in the last few years and shows no signs of slowing in the near future. In order for the next generation of wireless systems to provide seamless access, whether the user is indoors or out, a thorough understanding and validation of models describing the impact of building entry loss (BEL) is required. This information is currently lacking and presents a challenge for most system designers. For this reason, empirical data are needed to assess the impact of BEL at frequencies being explored for future mobile broadband applications. This paper presents the results of measurements of outdoor-to-indoor propagation from 5-32 GHz in three different buildings. The first is a newer building similar in construction to a modern residential home. The second is an older commercial office building. The last is a very new commercial office building built using modern green building techniques. These three buildings allow for the measurement of propagation losses through both modern and older materials, such as glass windows and exterior block and siding. Initial results found that at particular spatial locations the BEL could be less than 1 dB or more than 70 dB with free-space losses discounted (this is likely influenced by multipath). Additionally, it was observed that the PDF distributions of a majority of the measurements trended toward log-normal, with means and standard deviations ranging from 8-38 dB and 6-14 dB, respectively.
Teachers' Opinions about Building a Democratic Classroom
ERIC Educational Resources Information Center
Kesici, Sahin
2008-01-01
The purpose of this study is to determine how to build a democratic classroom in terms of teachers' views. In this study, the qualitative research technique is applied. In addition, the semi-structured interview technique is used as a method of data collection. The data obtained are coded into Nvivo2 and then the following themes are established:…
Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer
NASA Astrophysics Data System (ADS)
Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad
2017-04-01
Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges, including feature redundancy, unbalanced data, and small sample sizes, have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving the predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of the prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimal predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
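The configuration the study found optimal (PCA for feature reduction, SMOTE for class imbalance, a Random Forest classifier) maps onto a short scikit-learn/imbalanced-learn pipeline. The sketch below uses synthetic features in place of the CT radiomics and is not the authors' code; the imblearn pipeline applies SMOTE only within training folds, which is the correct way to oversample under cross-validation:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline

# Imbalanced, redundant, high-dimensional features, mimicking a radiomics table.
X, y = make_classification(n_samples=112, n_features=100, n_informative=10,
                           weights=[0.8, 0.2], random_state=0)

pipe = Pipeline([
    ("smote", SMOTE(random_state=0)),      # oversample the minority class (train folds only)
    ("pca", PCA(n_components=10)),         # collapse redundant features
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```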
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Anthonius; Muchtar, M. A.; Hizriadi, A.; Syahputra, M. F.
2018-03-01
Universitas Sumatera Utara is a public university with over 100 buildings and a total floor area of more than 133,141 square meters. Delivering information on the location of the institutional buildings is challenging, since the university grounds cover 93.4 Ha. The information is usually delivered orally, in video presentations, and in two-dimensional forms such as maps, posters, and brochures. These three techniques of information delivery have their advantages and disadvantages. Virtual reality, meanwhile, has come into existence, touching every domain of knowledge. In this paper we study and implement virtual reality as a new approach to distributing this information and covering those deficiencies. The utilization of virtual reality technology combined with 3D modeling aims to introduce and convey the location of USU institutional buildings in interactive and innovative ways. With this application, the campus introduction is expected to be more convenient, so that all USU students will be able to find the exact location of the building they are heading for.
Tools for Evaluating Fault Detection and Diagnostic Methods for HVAC Secondary Systems
NASA Astrophysics Data System (ADS)
Pourarian, Shokouh
Although modern buildings use increasingly sophisticated energy management and control systems with tremendous control and monitoring capabilities, building systems routinely fail to perform as designed. More advanced building control, operation, and automated fault detection and diagnosis (AFDD) technologies are needed to achieve the goal of net-zero energy commercial buildings. Much effort has been devoted to developing such technologies for primary heating, ventilating and air conditioning (HVAC) systems and for some secondary systems. However, secondary systems such as fan coil units and dual duct systems, although widely used in commercial, industrial, and multifamily residential buildings, have received very little attention. This research aims at developing tools that provide simulation capabilities for developing and evaluating advanced control, operation, and AFDD technologies for these less studied secondary systems. In this study, HVACSIM+ is selected as the simulation environment. Besides the development of dynamic models for the above-mentioned secondary systems, two other issues related to the HVACSIM+ environment are also investigated. One issue is the nonlinear equation solver used in HVACSIM+ (Powell's Hybrid method in subroutine SNSQ). Several previous research projects (ASHRAE RP 825 and 1312) found that SNSQ is especially unstable at the beginning of a simulation and sometimes unable to converge to a solution. The other issue is related to the zone model in the HVACSIM+ library of components. Dynamic simulation of secondary HVAC systems unavoidably requires a zone model that interacts systematically and dynamically with the building's surroundings. The accuracy and reliability of the building zone model therefore affect the operational data generated by the developed dynamic tool for predicting the behavior of HVAC secondary systems. The available model does not simulate the impact of direct solar radiation that enters a zone through glazing, and the zone model study is conducted in this direction to modify the existing model. In this research project, the following tasks are completed and summarized in this report: 1. Develop dynamic simulation models in the HVACSIM+ environment for common fan coil unit and dual duct system configurations; the developed models are able to produce both fault-free and faulty operational data under a wide variety of faults and severity levels for advanced control, operation, and AFDD technology development and evaluation purposes. 2. Develop a model structure, including the grouping of blocks and superblocks, treatment of state variables, initial and boundary conditions, and selection of equation solver, that can simulate a dual duct system efficiently with satisfactory stability. 3. Design and conduct a comprehensive and systematic validation procedure using collected experimental data to validate the developed simulation models under both fault-free and faulty operational conditions. 4. Conduct a numerical study to compare two solution techniques, Powell's Hybrid (PH) and Levenberg-Marquardt (LM), in terms of their robustness and accuracy. 5. Modify the thermal state of the existing building zone model in the HVACSIM+ library of components.
This component is revised to treat the heat transmitted through glazing as a heat source for transient building zone load prediction. In this report, the literature, including existing HVAC dynamic modeling environments and models, HVAC model validation methodologies, and fault modeling and validation methodologies, is reviewed. The overall methodologies used for fault-free and fault model development and validation are introduced. Detailed model development and validation results for the two secondary systems, i.e., the fan coil unit and the dual duct system, are summarized. Experimental data, mostly from the Iowa Energy Center Energy Resource Station, are used to validate the models developed in this project. Satisfactory model performance in both fault-free and fault simulation studies is observed for all studied systems.
NASA Astrophysics Data System (ADS)
Aycock, Kenneth I.; Hariharan, Prasanna; Craven, Brent A.
2017-11-01
For decades, the study of biomedical fluid dynamics using optical flow visualization and measurement techniques has been limited by the inability to fabricate transparent physical models that realistically replicate the complex morphology of biological lumens. In this study, we present an approach for producing optically transparent anatomical models that are suitable for particle image velocimetry (PIV) using a common 3D inkjet printing process (PolyJet) and stock resin (VeroClear). By matching the index of refraction of the VeroClear material using a room-temperature mixture of water, sodium iodide, and glycerol, and by printing the part in an orientation such that the flat, optical surfaces are at an approximately 45° angle to the build plane, we overcome the challenges associated with using this 3D printing technique for PIV. Here, we summarize our methodology and demonstrate the process and the resultant PIV measurements of flow in an optically transparent anatomical model of the human inferior vena cava.
Steam-load-forecasting technique for central-heating plants. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, M.C.; Carnahan, J.V.
Because boilers generally are most efficient at full loads, the Army could achieve significant savings by running fewer boilers at high loads rather than more boilers at low loads. A reliable load prediction technique could help ensure that only those boilers required to meet demand are on line. This report presents the results of an investigation into the feasibility of forecasting heat plant steam loads from historical patterns and weather information. Using steam flow data collected at Fort Benjamin Harrison, IN, a Box-Jenkins transfer function model with an acceptably small prediction error was initially identified. The initial investigation of forecast model development appeared successful. Dynamic regression methods using actual ambient temperatures yielded the best results. The Box-Jenkins univariate models' results appeared slightly less accurate. Since temperature information was not needed for model building and forecasting, however, it is recommended that Box-Jenkins models be considered prime candidates for load forecasting due to their simpler mathematics.
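A compact analogue of the two model families compared in the report, a univariate Box-Jenkins model versus a dynamic regression with ambient temperature as an exogenous input, can be put together with statsmodels. The steam-load series below is synthetic and the model orders are illustrative, not those identified from the Fort Benjamin Harrison data:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n = 400
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 96) + rng.normal(0, 1, n)

# Steam load: temperature-driven with AR(1) noise.
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(0, 1)
load = 60 - 1.5 * temp + noise

split = 350
# Univariate Box-Jenkins model (no weather input needed for forecasting)...
uni = ARIMA(load[:split], order=(2, 0, 1)).fit()
f_uni = uni.forecast(steps=n - split)
# ...versus dynamic regression with ambient temperature as an exogenous input.
dyn = ARIMA(load[:split], exog=temp[:split], order=(1, 0, 0)).fit()
f_dyn = dyn.forecast(steps=n - split, exog=temp[split:])

rmse = lambda f: np.sqrt(np.mean((f - load[split:]) ** 2))
print(f"univariate RMSE={rmse(f_uni):.2f}, dynamic regression RMSE={rmse(f_dyn):.2f}")
```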
Large Animal Models of an In Vivo Bioreactor for Engineering Vascularized Bone.
Akar, Banu; Tatara, Alexander M; Sutradhar, Alok; Hsiao, Hui-Yi; Miller, Michael; Cheng, Ming-Huei; Mikos, Antonios G; Brey, Eric M
2018-04-12
Reconstruction of large skeletal defects is challenging due to the requirement for large volumes of donor tissue and the often complex surgical procedures. Tissue engineering has the potential to serve as a new source of tissue for bone reconstruction, but current techniques are often limited in regards to the size and complexity of tissue that can be formed. Building tissue using an in vivo bioreactor approach may enable the production of appropriate amounts of specialized tissue, while reducing issues of donor site morbidity and infection. Large animals are required to screen and optimize new strategies for growing clinically appropriate volumes of tissues in vivo. In this article, we review both ovine and porcine models that serve as models of the technique proposed for clinical engineering of bone tissue in vivo. Recent findings are discussed with these systems, as well as description of next steps required for using these models, to develop clinically applicable tissue engineering applications.
Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha
2016-09-15
Accurate prediction of stormwater quality is essential for developing effective pollution mitigation strategies. The use of models incorporating simplified mathematical replications of pollutant processes is common practice for determining stormwater quality. However, an inherent process uncertainty arises from the intrinsic variability associated with pollutant processes, which has neither been comprehensively understood nor well accounted for in the uncertainty assessment of stormwater quality modelling. This review provides the context for defining and quantifying the uncertainty associated with pollutant build-up and wash-off on urban impervious surfaces, based on the hypothesis that particle size is the predominant influence on process variability. Critical analysis of the published research literature brings scientific evidence together to establish that particle size changes with time and that different sized particles exhibit distinct behaviour during build-up and wash-off, resulting in process variability. Analysis of the different adsorption behaviour of particles confirmed that variations in pollutant load and composition are influenced by particle size. Particle behaviour and variations in pollutant load and composition are related, due to the strong affinity of pollutants such as heavy metals and hydrocarbons for specific particle size ranges. As such, the temporal variation in particle size is identified as the key to establishing a basis for assessing build-up and wash-off process uncertainty. Therefore, accounting for the variability of pollutant build-up and wash-off processes, which is influenced by particle size, would facilitate the assessment of the uncertainty associated with modelling outcomes. Furthermore, the review identified fundamental knowledge gaps where further research is needed in relation to: (1) the aggregation of particles suspended in the atmosphere during build-up; (2) particle re-suspension during wash-off; (3) pollutant re-adsorption by different particle size fractions; (4) the development of evidence-based techniques for assessing uncertainty; and (5) methods for translating the knowledge acquired from the investigation of process mechanisms at small scale to catchment scale for stormwater quality modelling. Copyright © 2016 Elsevier Ltd. All rights reserved.
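For orientation, the build-up and wash-off processes the review discusses are commonly represented by exponential forms of the kind used in SWMM-style models. The sketch below shows those standard forms with placeholder parameters; it is not the review's formulation, and per the review's argument the coefficients would in practice vary with particle size:

```python
import numpy as np

def buildup(t_days, b_max=50.0, k_b=0.4):
    """Pollutant mass per unit area accumulating toward an asymptote (kg/ha)."""
    return b_max * (1.0 - np.exp(-k_b * t_days))

def washoff(b0, rain_mm_per_h, t_hours, k_w=0.18):
    """Mass removed by a storm of given intensity and duration (first-order in load)."""
    return b0 * (1.0 - np.exp(-k_w * rain_mm_per_h * t_hours))

b7 = buildup(7.0)                                    # antecedent dry period of 7 days
removed = washoff(b7, rain_mm_per_h=10.0, t_hours=1.5)
print(f"build-up after 7 days: {b7:.1f} kg/ha, washed off: {removed:.1f} kg/ha")
```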
NASA Astrophysics Data System (ADS)
Mao, Zhiyi; Shan, Ruifeng; Wang, Jiajun; Cai, Wensheng; Shao, Xueguang
2014-07-01
Polyphenols in plant samples have been extensively studied because phenolic compounds are ubiquitous in plants and, as antioxidants, can promote human health. A method for the rapid determination of three phenolic compounds (chlorogenic acid, scopoletin and rutin) in plant samples using near-infrared diffuse reflectance spectroscopy (NIRDRS) is studied in this work. Partial least squares (PLS) regression was used to build the calibration models, and the effects of spectral preprocessing and variable selection on the models were investigated for model optimization. The results show that individual spectral preprocessing or variable selection steps have little influence on the models, but combining the techniques can significantly improve them. The combination of the continuous wavelet transform (CWT) for removing the variant background, multiplicative scatter correction (MSC) for correcting the scattering effect, and a randomization test (RT) for selecting the informative variables was found to be the best way to build the optimal models. For validation, the polyphenol contents in an independent sample set were predicted. The correlation coefficients between the predicted values and the contents determined by high performance liquid chromatography (HPLC) analysis are as high as 0.964, 0.948 and 0.934 for chlorogenic acid, scopoletin and rutin, respectively.
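The modelling chain described, preprocessing followed by PLS calibration, looks roughly like the sketch below. MSC is implemented directly, while the CWT background removal and randomization-test variable selection are omitted for brevity; the spectra are synthetic, not the NIRDRS data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n, p = 80, 300
conc = rng.uniform(0.1, 2.0, n)                       # e.g. chlorogenic acid content
peak = np.exp(-0.5 * ((np.arange(p) - 150) / 12) ** 2)
spectra = (conc[:, None] * peak                        # analyte signal
           + rng.uniform(0.8, 1.2, (n, 1)) * 0.5       # multiplicative scatter
           + rng.normal(0, 0.01, (n, p)))              # noise

def msc(X):
    """Multiplicative scatter correction against the mean spectrum."""
    ref = X.mean(axis=0)
    out = np.empty_like(X)
    for i, row in enumerate(X):
        b, a = np.polyfit(ref, row, 1)                 # row ~ a + b * ref
        out[i] = (row - a) / b
    return out

X = msc(spectra)
pred = cross_val_predict(PLSRegression(n_components=3), X, conc, cv=5).ravel()
print("correlation r =", np.corrcoef(pred, conc)[0, 1].round(3))
```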
Assume-Guarantee Abstraction Refinement Meets Hybrid Systems
NASA Technical Reports Server (NTRS)
Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas
2014-01-01
Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction refinement in the context of hybrid automata.
Microgravity Manufacturing Via Fused Deposition
NASA Technical Reports Server (NTRS)
Cooper, K. G.; Griffin, M. R.
2003-01-01
Manufacturing polymer hardware during space flight is currently outside the state of the art. A process called fused deposition modeling (FDM) can make this approach a reality by producing net-shaped components of polymer materials directly from a CAE model. FDM is a rapid prototyping process developed by Stratasys, Inc., which deposits a fine line of semi-molten polymer onto a substrate while moving via computer control to form the cross-sectional shape of the part it is building. The build platen is then lowered and the process is repeated, building the component directly, layer by layer. This method enables direct net-shaped production of polymer components from a computer file. The layered manufacturing process allows for the manufacture of complex shapes and internal cavities otherwise impossible to machine. This task demonstrated the benefits of the FDM technique for quickly and inexpensively producing replacement components or repairing broken hardware in a Space Shuttle or Space Station environment. The intent of the task was to develop and fabricate an FDM system that was lightweight and compact and required minimal power to fabricate ABS plastic hardware in microgravity. The final product of the shortened task was a ground-based breadboard device demonstrating the miniaturization capability of the system.
Gozalbes, Rafael; Carbajo, Rodrigo J; Pineda-Lucena, Antonio
2010-01-01
In the last decade, fragment-based drug discovery (FBDD) has evolved from a novel approach in the search of new hits to a valuable alternative to the high-throughput screening (HTS) campaigns of many pharmaceutical companies. The increasing relevance of FBDD in the drug discovery universe has been concomitant with an implementation of the biophysical techniques used for the detection of weak inhibitors, e.g. NMR, X-ray crystallography or surface plasmon resonance (SPR). At the same time, computational approaches have also been progressively incorporated into the FBDD process and nowadays several computational tools are available. These stretch from the filtering of huge chemical databases in order to build fragment-focused libraries comprising compounds with adequate physicochemical properties, to more evolved models based on different in silico methods such as docking, pharmacophore modelling, QSAR and virtual screening. In this paper we will review the parallel evolution and complementarities of biophysical techniques and computational methods, providing some representative examples of drug discovery success stories by using FBDD.
A study of two statistical methods as applied to shuttle solid rocket booster expenditures
NASA Technical Reports Server (NTRS)
Perlmutter, M.; Huang, Y.; Graves, M.
1974-01-01
The state probability technique and the Monte Carlo technique are applied to finding shuttle solid rocket booster expenditure statistics. For a given attrition rate per launch, the probable number of boosters needed for a given mission of 440 launches is calculated. Several cases are considered, including the elimination of the booster after a maximum of 20 consecutive launches. Also considered is the case where the booster is composed of replaceable components with independent attrition rates. A simple cost analysis is carried out to indicate the number of boosters to build initially, depending on booster costs. Two statistical methods were applied in the analysis: (1) the state probability method, which consists of defining an appropriate state space for the outcomes of the random trials, and (2) the model simulation method, or Monte Carlo technique. It was found that the model simulation method was easier to formulate, while the state probability method required less computing time and was more accurate.
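The Monte Carlo side of the comparison is easy to sketch: simulate the 440-launch mission many times under a fixed per-launch attrition probability and a 20-launch retirement rule, and count boosters consumed. The attrition probability below is a placeholder, not a value from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def mission_boosters(n_launches=440, p_loss=0.02, max_uses=20):
    """One Monte Carlo trial: number of boosters consumed over the mission."""
    boosters, uses = 1, 0
    for _ in range(n_launches):
        uses += 1
        if rng.random() < p_loss or uses >= max_uses:  # lost, or retired at 20 uses
            boosters += 1
            uses = 0
    return boosters

trials = np.array([mission_boosters() for _ in range(5000)])
print(f"mean={trials.mean():.1f} boosters, 95th percentile={np.percentile(trials, 95):.0f}")
```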
Multivariate Analysis of Seismic Field Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alam, M. Kathleen
1999-06-01
This report includes the details of the model building procedure and the prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built to include only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for the poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present data sets, the analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.
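Principal Components Regression itself is compact: project the (collinear) signals onto their leading principal components and regress the response on the scores. A generic sketch with synthetic data, not the seismic records:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 200, 50
latent = rng.normal(size=(n, 3))                   # a few underlying sources
X = latent @ rng.normal(size=(3, p)) + rng.normal(0, 0.1, (n, p))  # collinear channels
y = latent @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, n)

# PCR = PCA to a few components, then ordinary least squares on the scores.
pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X[:150], y[:150])
print("held-out R^2:", pcr.score(X[150:], y[150:]).round(3))
```

Signals absent from the training data fall outside the spanned component space, which is one way to see the report's observation that extraneous signals degrade predictions until they are included in the model.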
Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun
2007-09-01
Near infrared spectroscopy is widely used as a quantitative method, and the main multivariate techniques are regression methods used to build prediction models; however, the accuracy of the analysis results is affected by many factors. In the present paper, the influence of different sample roughness on the mathematical model for NIR quantitative analysis of wood density was studied. The experiments showed that if the roughness of the predicted samples was consistent with that of the calibration samples, the results were good; otherwise the error was much higher. The roughness-mixed model was more flexible and adaptable to different sample roughness, and its prediction ability was much better than that of the single-roughness model.
REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang
2013-04-30
Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
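The sample-regress-check-iterate loop that a ROM builder like REVEAL automates can be sketched as follows; the "expensive simulation" is faked by a cheap analytic function, and none of the names reflect REVEAL's actual interfaces:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(x):             # stand-in for an HPC run
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

rng = np.random.default_rng(1)
X_test = rng.uniform(0, 1, size=(200, 2))
y_test = expensive_simulation(X_test)

X_train = rng.uniform(0, 1, size=(10, 2))
for iteration in range(5):               # iterative refinement of the ROM
    rom = GaussianProcessRegressor().fit(X_train, expensive_simulation(X_train))
    err = np.abs(rom.predict(X_test) - y_test).max()
    print(f"iter {iteration}: {len(X_train)} samples, max error {err:.3f}")
    if err < 0.05:                        # accuracy target reached
        break
    X_train = np.vstack([X_train, rng.uniform(0, 1, size=(10, 2))])
```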
Computational Process Modeling for Additive Manufacturing
NASA Technical Reports Server (NTRS)
Bagg, Stacey; Zhang, Wei
2014-01-01
Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.
Semantic Image Based Geolocation Given a Map (Author’s Initial Manuscript)
2016-09-01
We present a novel technique for the detection and identification of building facades from geo-tagged query views, using a 2D map of the environment and the geometry of the building facades. Location recognition and building identification are done by matching the query view to a reference set, followed by estimation of the 3D building facades. We evaluate our approach for building identification and geo-localization.
Multi Sensor Data Integration for AN Accurate 3d Model Generation
NASA Astrophysics Data System (ADS)
Chhatkuli, S.; Satoh, T.; Tachibana, K.
2015-05-01
The aim of this paper is to introduce a novel technique for integrating two different data sets, i.e. a laser-scanned RGB point cloud and a 3D model derived from oblique imagery, to create a 3D model with more detail and better accuracy. In general, aerial imagery is used to create 3D city models. Aerial imagery produces an overall decent 3D city model and is generally suited to generating 3D models of building roofs and non-complex terrain. However, the 3D model automatically generated from aerial imagery generally suffers from a lack of accuracy for roads under bridges, details under tree canopy, isolated trees, etc. Moreover, it often suffers from undulated road surfaces, non-conforming building shapes, and the loss of minute details like street furniture. On the other hand, laser-scanned data and images taken from a mobile vehicle platform can produce more detailed 3D models of roads, street furniture, details under bridges, etc. However, laser-scanned data and images from a mobile vehicle are not suitable for acquiring detailed 3D models of tall buildings, roof tops, and so forth. Our proposed approach to integrating multi-sensor data compensates for each sensor's weaknesses and helps create a very detailed 3D model with better accuracy. Moreover, additional details like isolated trees and street furniture, which were missing in the original 3D model derived from aerial imagery, could also be integrated into the final model automatically. During the process, noise in the laser-scanned data, for example people and vehicles on the road, was also automatically removed. Hence, even though the two data sets were acquired in different time periods, the integrated final 3D model was generally noise free and without unnecessary details.
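One possible reading of the integration step, sketched under strong simplifying assumptions: the two clouds are already co-registered, and mobile-mapping points take priority inside the voxels they cover while aerial points fill everything else (roofs, tall facades). Pure numpy; real pipelines (registration, noise removal) are far richer.

```python
import numpy as np

def voxel_keys(points, size=0.5):
    return np.floor(points[:, :3] / size).astype(np.int64)

def fuse(aerial, mobile, size=0.5):
    # mobile data wins inside voxels it covers (more street-level detail);
    # aerial data fills the remaining voxels (roofs, tall facades).
    covered = {tuple(k) for k in voxel_keys(mobile, size)}
    keep = np.array([tuple(k) not in covered for k in voxel_keys(aerial, size)])
    return np.vstack([aerial[keep], mobile])

aerial = np.random.default_rng(0).uniform(0, 10, size=(1000, 3))
mobile = np.random.default_rng(1).uniform(0, 10, size=(1000, 3)) * [1, 1, 0.2]
print(fuse(aerial, mobile).shape)
```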
Possibilities of CT Scanning as Analysis Method in Laser Additive Manufacturing
NASA Astrophysics Data System (ADS)
Karme, Aleksis; Kallonen, Aki; Matilainen, Ville-Pekka; Piili, Heidi; Salminen, Antti
Laser additive manufacturing is an established and constantly developing technique. Structural assessment should be a key component to ensure directed evolution towards a higher level of manufacturing. The macroscopic properties of metallic structures are determined by their internal microscopic features, which are difficult to assess using conventional surface measuring methodologies. X-ray microtomography (CT) is a promising technique for three-dimensional non-destructive probing of the internal composition and build of various materials. The aim of this study is to define the possibilities of using CT scanning as a quality control method for LAM-fabricated parts. Since parts fabricated with LAM are very often used in applications demanding high quality and accuracy in various industries such as medical and aerospace, it is important to be able to define the accuracy of the built parts. The tubular stainless steel test specimens were 3D modelled, manufactured with modified research AM equipment, and imaged after manufacturing with a high-power, high-resolution CT scanner. 3D properties, such as surface texture and the amount and distribution of internal pores, were also evaluated in this study. Surface roughness was higher on the interior wall of the tube, and deviation from the model was systematically directed towards the central axis. The pore distribution showed clear organization and divided into two populations: one following the polygon model seams along both rims, and the other associated with the concentric and equidistant movement path of the laser. Such assessment of samples can enhance fabrication by guiding the improvement of both the modelling and the manufacturing process.
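A hedged sketch of one analysis step, porosity estimation by grey-value thresholding of the reconstructed volume; the threshold and the synthetic volume below are illustrative stand-ins for real scan data:

```python
import numpy as np

rng = np.random.default_rng(0)
volume = rng.normal(loc=200.0, scale=30.0, size=(64, 64, 64))  # grey values
volume[rng.random(volume.shape) < 0.02] = 20.0                 # seeded pores

pore_mask = volume < 100.0            # voxels darker than the bulk material
porosity = pore_mask.mean()
print(f"porosity: {100 * porosity:.2f}% of voxel volume")
```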
World Ocean Circulation Experiment (WOCE) Young Investigator Workshops
NASA Technical Reports Server (NTRS)
Austin, Meg
2004-01-01
The World Ocean Circulation Experiment (WOCE) Young Investigator Workshops goals and objectives are: a) to familiarize Young Investigators with WOCE models, datasets and estimation procedures; b) to offer intensive hands-on exposure to these models and methods; c) to build collaborations among junior scientists and more senior WOCE investigators; and finally, d) to generate ideas and projects leading to fundable WOCE synthesis projects. To achieve these goals and objectives, the Workshop will offer a mixture of tutorial lectures on numerical models and estimation procedures, advanced seminars on current WOCE synthesis activities and related projects, and the opportunity to conduct small projects which put into practice the techniques advanced in the lectures.
NASA Astrophysics Data System (ADS)
Georgiou, E.; Karachaliou, E.; Stylianidis, E.
2017-08-01
The "Tower house", found in the region of Epirus and Western Macedonia, Greece, is a characteristic example of 19th-century Balkan architecture. Nowadays, the only information about these heritage buildings comes from the architectural drawings on hand and from the model Tower displayed as a maquette in the Folklore Museum of the Municipality of Kozani, Greece. The current work generates a scaled 3D digital model of the "Tower house" by applying photogrammetry techniques to the maquette displayed among the Museum exhibits.
From intuition to statistics in building subsurface structural models
Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.
2011-01-01
Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs, and the results are compared to prior manual palinspastic restorations and borehole data. The methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt regions, in addition to extrapolating existing data into seismic data gaps. The technique can be used for rapid reservoir appraisal and potentially has other applications in seismic processing, well planning, and borehole stability analysis.
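The fit-a-forward-model-by-stochastic-global-optimization idea can be illustrated with SciPy's differential evolution; the smooth fold profile below is a toy stand-in for a real trishear forward model:

```python
import numpy as np
from scipy.optimize import differential_evolution

def forward_model(params, x):
    amplitude, width = params                 # toy structural parameters
    return amplitude * np.exp(-(x / width) ** 2)

x = np.linspace(-5, 5, 60)                    # horizon sample locations
observed = forward_model([2.0, 1.5], x) \
    + np.random.default_rng(0).normal(0, 0.05, 60)

def misfit(params):                           # sum-of-squares data misfit
    return np.sum((forward_model(params, x) - observed) ** 2)

result = differential_evolution(misfit, bounds=[(0.1, 5.0), (0.1, 5.0)], seed=0)
print(result.x)   # repeated runs from random seeds map parameter uncertainty
```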
The Mine Locomotive Wireless Network Strategy Based on Successive Interference Cancellation
Wu, Liaoyuan; Han, Jianghong; Wei, Xing; Shi, Lei; Ding, Xu
2015-01-01
We consider a wireless network strategy based on successive interference cancellation (SIC) for mine locomotives. We first build the original mathematical model for the strategy, which is a non-convex model. We then examine this model intensively and identify certain regularities embedded in it. Based on these findings, we are able to reformulate the model into a new form and design a simple algorithm which can assign each locomotive a proper transmitting scheme during the whole scheduling procedure. Simulation results show that the outcomes obtained through this algorithm are improved by around 50% compared with those that do not apply the SIC technique. PMID:26569240
Sanz, Luis; Alonso, Juan Antonio
2017-12-01
In this work we develop approximate aggregation techniques in the context of slow-fast linear population models governed by stochastic differential equations and apply the results to the treatment of populations with spatial heterogeneity. Approximate aggregation techniques allow one to transform a complex system, involving many coupled variables and processes with different time scales, into a simpler reduced model with a smaller number of 'global' variables, in such a way that the dynamics of the former can be approximated by that of the latter. In our model we consider a linear fast deterministic process together with a linear slow process in which the parameters are affected by additive noise, and give conditions for the solutions corresponding to positive initial conditions to remain positive for all times. By letting the fast process reach equilibrium we build a reduced system with fewer variables, and provide results relating the asymptotic behaviour of the first- and second-order moments of the population vector for the original and the reduced system. The general technique is illustrated by analysing a multiregional stochastic system in which dispersal is deterministic and the growth rate of the populations in each patch is affected by additive noise.
Demonstration of reduced-order urban scale building energy models
Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...
2017-09-08
The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies by only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy models systematically in order to report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified data exchange. Altogether, the results of this study have implications for the large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
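For readers unfamiliar with reduced-order building energy models, the simplest instance is a single-zone resistor-capacitor (RC) network stepped forward in time; the sketch below uses illustrative parameter values and is not the paper's OpenStudio-based framework:

```python
import numpy as np

R = 5.0e-3     # envelope thermal resistance [K/W]
C = 1.0e7      # lumped zone thermal capacitance [J/K]
dt = 300.0     # time step [s]

hours = np.arange(0, 24, dt / 3600.0)
T_out = 10.0 + 8.0 * np.sin(2 * np.pi * (hours - 9) / 24)    # outdoor temp [C]
Q_int = np.where((hours > 8) & (hours < 18), 2000.0, 200.0)  # internal gains [W]

T = np.empty_like(hours)
T[0] = 20.0
for k in range(len(hours) - 1):
    dTdt = ((T_out[k] - T[k]) / R + Q_int[k]) / C   # zone energy balance
    T[k + 1] = T[k] + dt * dTdt

print(f"zone temperature range: {T.min():.1f} to {T.max():.1f} C")
```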
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Liu, Youhua
1998-01-01
The use of continuum models for the analysis of discrete built-up complex aerospace structures is an attractive idea, especially at the conceptual and preliminary design stages. But the diversity of available continuum models and their hard-to-use qualities have prevented them from finding wide application. In this regard, Artificial Neural Networks (ANN or NN) may have great potential, as these networks are universal approximators that can realize any continuous mapping and can provide general mechanisms for building models from data whose input-output relationship can be highly nonlinear. The ultimate aim of the present work is to be able to build high fidelity continuum models for complex aerospace structures using ANN. As a first step, the concepts and features of ANN are introduced through the MATLAB NN Toolbox by simulating some representative mapping examples, including some problems in structural engineering. Then some further aspects of and lessons learned about NN training are discussed, including the performance of Feed-Forward and Radial Basis Function NN when dealing with noise-polluted data and the technique of cross-validation. Finally, as an example of using NN in continuum models, a lattice structure with repeating cells is represented by a continuum beam whose properties are provided by neural networks.
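The lattice-to-continuum mapping can be sketched with a small feed-forward network (scikit-learn here rather than the MATLAB NN Toolbox); the closed-form "truth" for the equivalent beam stiffness is a toy assumption:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
cell = rng.uniform(0.5, 2.0, size=(500, 2))        # strut width, cell height
EI_equiv = cell[:, 0] * cell[:, 1] ** 3 / 12.0     # toy equivalent stiffness

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(cell[:400], EI_equiv[:400])                # train on 400 cells
print("held-out R^2:", net.score(cell[400:], EI_equiv[400:]))
```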
NASA Astrophysics Data System (ADS)
Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito
2017-07-01
The Structure From Motion (SFM) technique is applied to a series of photographs of an object and returns a 3D reconstruction made up of points in space (point clouds). This research aims at comparing the results of the SFM approach with the results of 3D laser scanning in terms of the density and accuracy of the model. The experiments were conducted by surveying several architectural elements (walls and portals of historical buildings) both with a 3D laser scanner of the latest generation and with an amateur photographic camera. The point clouds acquired by the laser scanner and those acquired by the photo camera have been systematically compared. In particular, we present the work carried out on the "Don Diego Pappalardo Palace" site in Pedara (Catania, Sicily).
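A common way to make such a comparison quantitative is the cloud-to-cloud nearest-neighbour distance; a sketch with synthetic clouds (SciPy assumed), not the authors' exact procedure:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
laser = rng.uniform(0, 10, size=(50_000, 3))             # reference cloud
sfm = laser[:20_000] + rng.normal(0, 0.02, (20_000, 3))  # noisier SFM cloud

d, _ = cKDTree(laser).query(sfm)        # nearest-neighbour distances
print(f"mean C2C distance: {d.mean():.3f}, 95th pct: {np.percentile(d, 95):.3f}")
```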
Mossotti, Victor G.; Eldeeb, A. Raouf; Fries, Terry L.; Coombs, Mary Jane; Naude, Virginia N.; Soderberg, Lisa; Wheeler, George S.
2002-01-01
This report describes a scientific investigation of the effects of eight different cleaning techniques on the Berkshire Lee marble component of the facade of the East Center Pavilion at Philadelphia City Hall; the study was commissioned by the city of Philadelphia. The eight cleaning techniques evaluated in this study were power wash (proprietary gel detergent followed by water rinse under pressure), misting (treatment with potable, nebulized water for 24-36 hours), gommage (proprietary Thomann-Hanry low-pressure, air-driven, small-particle, dry abrasion), combination (gommage followed by misting), Armax (sodium bicarbonate delivered under pressure in a water wash), JOS (dolomite powder delivered in a low-pressure, rotary-vortex water wash), laser (thermal ablation), and dry ice (powdered-dry-ice abrasion delivered under pressure). In our study approximately 160 cores were removed from the building for laboratory analysis. We developed a computer program to analyze scanning-electron-micrograph images for the microscale surface roughness and other morphologic parameters of the stone surface, including the near-surface fracture density of the stone. An analysis of more than 1,100 samples cut from the cores provided a statistical basis for crafting the essential elements of a reduced-form, mixed-kinetics conceptual model that represents the deterioration of calcareous stone in terms of self-organized soiling and erosion patterns. This model, in turn, provided a basis for identifying the variables that are affected by the cleaning techniques and for evaluating the extent to which such variables influence the stability of the stone. The model recognizes three classes of variables that may influence the soiling load on the stone, including such exogenous environmental variables as airborne moisture, pollutant concentrations, and local aerodynamics, and such endogenous stone variables as surface chemistry and microstructure (fracturing, roughness, and so on). This study showed that morphologic variables on the mesoscale to macroscale are not generally affected by the choice of a cleaning technique. The long-term soiling pattern on the building is independent of the cleaning technique applied. This study also showed that soluble salts do not play a significant role in the deterioration of Berkshire Lee marble. Although salts were evident in cracks and fissures of the heavily soiled stone, such salts did not penetrate the surface to a depth of more than a few hundred micrometers. The criteria used to differentiate the cleaning techniques were ultimately based on the ability of each technique to remove soiling without altering the texture of the stone surface. This study identified both the gommage and JOS techniques as appropriate for cleaning ashlar surfaces and the combination technique as appropriate for cleaning highly carved surfaces at the entablatures, cornices, and column capitals.
Knowledge-based simulation using object-oriented programming
NASA Technical Reports Server (NTRS)
Sidoran, Karen M.
1993-01-01
Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
Application of Machine Learning to Rotorcraft Health Monitoring
NASA Technical Reports Server (NTRS)
Cody, Tyler; Dempsey, Paula J.
2017-01-01
Machine learning is a powerful tool for data exploration and model building with large data sets. This project aimed to use machine learning techniques to explore the inherent structure of data from rotorcraft gear tests and the relationships between features and damage states, and to build a system for predicting gear health for future rotorcraft transmission applications. Classical machine learning techniques are difficult, if not irresponsible, to apply to time series data because many assume independence between samples. To overcome this, Hidden Markov Models were used to create a binary classifier for identifying scuffing transitions, and Recurrent Neural Networks were used to leverage long distance relationships in predicting discrete damage states. When combined in a workflow, where the binary classifier acted as a filter for the fatigue monitor, the system was able to demonstrate accuracy in damage state prediction and scuffing identification. The time dependent nature of the data restricted data exploration to collecting and analyzing data from the model selection process. The limited amount of available data did not yield useful information, and the division of training and testing sets tended to heavily influence the scores of the models across combinations of features and hyper-parameters. This work built a framework for tracking scuffing and fatigue on streaming data and demonstrates that machine learning has much to offer rotorcraft health monitoring by using Bayesian learning and deep learning methods to capture the time dependent nature of the data. Suggested future work is to implement the framework developed in this project using a larger variety of data sets to test the generalization capabilities of the models and allow for data exploration.
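A self-contained two-state Gaussian HMM decoder of the kind usable as a binary scuffing classifier; transition and emission parameters are hand-set here, whereas a real system would learn them from training data (e.g. via Baum-Welch):

```python
import numpy as np

def viterbi(log_lik, log_A, log_pi):
    """log_lik: (T, S) per-sample state log-likelihoods."""
    T, S = log_lik.shape
    dp = np.zeros((T, S)); back = np.zeros((T, S), dtype=int)
    dp[0] = log_pi + log_lik[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_A       # (from_state, to_state)
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_lik[t]
    path = np.zeros(T, dtype=int); path[-1] = dp[-1].argmax()
    for t in range(T - 2, -1, -1):                # trace back the best path
        path[t] = back[t + 1, path[t + 1]]
    return path

# Gaussian emissions: state 0 = healthy (mean 0), state 1 = scuffing (mean 3)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 80), rng.normal(3, 1, 40)])
means, var = np.array([0.0, 3.0]), 1.0
log_lik = -0.5 * (x[:, None] - means) ** 2 / var
A = np.array([[0.98, 0.02], [0.02, 0.98]])        # sticky transitions
states = viterbi(log_lik, np.log(A), np.log([0.99, 0.01]))
print(states)      # should switch from 0s to 1s near sample 80
```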
Air Force Operational Test and Evaluation Center, Volume 2, Number 2
1988-01-01
Alternative Natural Energy Sources in Building Design.
ERIC Educational Resources Information Center
Davis, Albert J.; Schubert, Robert P.
This publication provides a discussion of various energy conserving building systems and design alternatives. The information presented here covers alternative space and water heating systems, and energy conserving building designs incorporating these systems and other energy conserving techniques. Besides water, wind, solar, and bio conversion…
Model building techniques for analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald
2009-09-01
The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.
NASA Astrophysics Data System (ADS)
Jacobs, P. Js; Cnudde, V.
2003-04-01
X-ray computed micro-tomography (μCT) is a promising non-destructive imaging technique for studying building materials. μCT analysis provides information on the internal structure and petrophysical properties of small samples (up to 2 cm diameter and 6 cm height), with a maximum resolution to date of 10 μm for commercial systems (Skyscan 1072). μCT allows complete three-dimensional object structures to be visualised and measured without sample preparation. Possible applications of the μCT technique for the monitoring of natural building stones are multiple: (i) to determine porosity non-destructively based on 3D images, (ii) to visualise weathering phenomena at the μ-scale, (iii) to understand the rationale of weathering processes, (iv) to visualise the presence of water repellents and consolidation products, (v) to monitor the protective effects of these products during weathering in order to understand the underlying weathering mechanisms, and (vi) to provide advice on the suitability of products for the treatment of a particular rock type. Determining the penetration depth of restoration products, used to consolidate or to protect natural building stones from weathering, is crucial if the application of conservation products is planned. Every type of natural building stone has its own petrophysical characteristics, and each rock type reacts differently to the various restoration products available on the market. To assess the penetration depth and the effectiveness of a certain restoration product, μCT technology in combination with micro-Raman spectroscopy could be applied. Due to its non-destructive character and its resolution down to the porosity scale, μCT offers a large potential for application. The μCT technique in combination with micro-Raman spectroscopy could prove to be a powerful tool for the future, as the combination of 3D visualisation and 2D chemical determination of inorganic as well as organic components could provide new insights to optimise conservation and restoration techniques for building materials. These principles will be demonstrated for Maastricht limestone and Bray sandstone, which were selected for this study because of their high porosity and very pure composition.
Multiscale Materials Modeling in an Industrial Environment.
Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard
2016-06-07
In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.
Structural Equation Model Trees
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2015-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree structures that separate a data set recursively into subsets with significantly different parameter estimates in a SEM. SEM Trees provide means for finding covariates and covariate interactions that predict differences in structural parameters in observed as well as in latent space and facilitate theory-guided exploration of empirical data. We describe the methodology, discuss theoretical and practical implications, and demonstrate applications to a factor model and a linear growth curve model. PMID:22984789
Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D
2008-01-01
The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.
Barriers to Building Energy Efficiency (BEE) promotion: A transaction costs perspective
NASA Astrophysics Data System (ADS)
Qian Kun, Queena
Worldwide, buildings account for a surprisingly high 40% of global energy consumption, and the resulting carbon footprint significantly exceeds that of all forms of transportation combined. Large and attractive opportunities exist to reduce buildings' energy use at lower costs and higher returns than in other sectors. This thesis analyzes the concerns of the market stakeholders, mainly real estate developers and end-users, in terms of transaction costs as they make decisions about investing in Building Energy Efficiency (BEE). It provides a detailed analysis of the current situation and future prospects for BEE adoption by the market's stakeholders. It delineates the market and lays out the economic and institutional barriers to the large-scale deployment of energy-efficient building techniques. The aim of this research is to investigate the barriers raised by transaction costs that hinder market stakeholders from investing in BEE. It explains interactions among stakeholders in general and in the specific case of Hong Kong as they consider transaction costs. It focuses on the influence of transaction costs on the decision-making of the stakeholders during the entire process of real estate development. The objectives are: 1) To establish an analytical framework for understanding the barriers to BEE investment with consideration of transaction costs; 2) To build a theoretical game model of decision making among the BEE market stakeholders; 3) To study the empirical data from questionnaire surveys of building designers and from focused interviews with real estate developers in Hong Kong; 4) To triangulate the study's empirical findings with those of the theoretical model and analytical framework. The study shows that a coherent institutional framework needs to be established to ensure that the design and implementation of BEE policies acknowledge the concerns of market stakeholders by taking transaction costs into consideration. Regulatory and incentive options should be integrated into BEE policies to minimize efficiency gaps and to realize a sizeable increase in the number of energy-efficient buildings in the next decades. Specifically, the analysis shows that a thorough understanding of the transaction costs borne by particular stakeholders could improve the energy efficiency of buildings, even without improvements in currently available technology.
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly more common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
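A generic numpy sketch of a user-defined classification function over point attributes (color, intensity, normal); the decision rules and labels below are illustrative heuristics, not the framework's actual functions:

```python
import numpy as np

def classify(points):
    """points: structured array with rgb (0-255), intensity (0-1), normal."""
    r, g, b = points["rgb"].T.astype(float)
    upward = points["normal"][:, 2] > 0.9          # near-horizontal surfaces
    bright = points["intensity"] > 0.6             # strong laser return
    reddish = (r > 1.2 * g) & (r > 1.2 * b)        # crude brick/terracotta cue
    labels = np.full(len(points), "unknown", dtype=object)
    labels[reddish] = "brick"
    labels[~reddish & bright] = "stone"
    labels[upward] = "floor/roof"                  # geometry overrides color
    return labels

pts = np.zeros(4, dtype=[("rgb", "3u1"), ("intensity", "f4"), ("normal", "3f4")])
pts["rgb"] = [[180, 90, 70], [120, 120, 120], [200, 200, 200], [90, 90, 90]]
pts["intensity"] = [0.4, 0.7, 0.8, 0.2]
pts["normal"] = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [0.7, 0.7, 0]]
print(classify(pts))   # ['brick' 'stone' 'floor/roof' 'unknown']
```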
Probabilistic topic modeling for the analysis and classification of genomic sequences
2015-01-01
Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
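The k-mer/topic-model pipeline in miniature, assuming scikit-learn: character k-mer counts feed an LDA model whose per-sequence topic mixtures can then serve as classification features. Toy sequences stand in for the 16S barcode data:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

seqs = ["ACGTACGTGGCC", "ACGTACGTGGCA", "TTTTAAGGCCTT", "TTTTAAGGCCTA"]
k = 4
vec = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
X = vec.fit_transform(seqs)                      # k-mer count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)                     # per-sequence topic mixtures
print(theta.round(2))    # sequences 0-1 and 2-3 should share dominant topics
```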
Measure Guideline. Air Sealing Attics in Multifamily Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Otis, Casey; Maxwell, Sean
2012-06-01
This Building America Measure Guideline is intended for owners, builders, contractors, homeowners, and other stakeholders in the multifamily building industry, and focuses on challenges found in existing buildings for a variety of housing types. It explains why air sealing is desirable, explores related health and safety issues, and identifies common air leakage points in multifamily building attics. In addition, it also gives an overview of materials and techniques typically used to perform air sealing work.
Inverse and Predictive Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syracuse, Ellen Marie
The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.
Mesh-To from Segmented Mesh Elements to Bim Model with Limited Parameters
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2018-05-01
Building Information Modelling (BIM) techniques have been widely utilized in heritage documentation under the general term Historic/Heritage BIM (HBIM). Current HBIM projects mostly employ a scan-to-BIM process to manually create the geometric model from the point cloud. This paper explains how the model can be shaped from the mesh geometry with reduced human involvement during the modelling process. Aiming at unbuilt heritage, two case studies are handled in this study: a ruined Roman stone structure and a severely damaged abbey. The pipeline consists of solid element modelling based on documentation data using Autodesk Revit, a common BIM platform, and successive modelling from these geometric primitives using Autodesk Dynamo, a visual programming plugin built into Revit. The BIM-based reconstruction enriches the classic visual model from computer graphics approaches with measurement, semantic, and additional information. Dynamo is used to develop a semi-automated function that reduces the manual process and builds the final BIM model directly from segmented parametric elements. The level of detail (LoD) of the final models depends strongly on the manual involvement in element creation. The proposed outline also presents two potential issues for the ongoing work: combining ontology semantics with the parametric BIM model, and introducing the proposed pipeline into the as-built HBIM process.
Energy management study: A proposed case of government building
NASA Astrophysics Data System (ADS)
Tahir, Mohamad Zamhari; Nawi, Mohd Nasrun Mohd; Baharum, Mohd Faizal
2015-05-01
In line with the current needs for sustainable and green technology in the Malaysian construction industry, this research is conducted to seek and identify opportunities to better manage energy use, including the process of understanding when, where, and how energy is used in a building. The purpose of this research is to provide a best practice guideline as a practical tool to assist the construction industry in Malaysia to improve the energy efficiency of office buildings during the post-production stage, by reviewing the current practice of building operation and maintenance in order to optimize usage and reduce the amount of energy input into the building. Therefore, this paper will review the concept of maintenance management, current issues in energy management, and how the research process will be conducted. Several processes are involved, focusing on technical and management techniques such as energy metering, tracing, harvesting, and auditing based on the case study that will be accomplished soon. Accordingly, a case study is appropriate as a strategic research approach, as it involves an empirical investigation of a particular contemporary phenomenon within its real life context using multiple sources of evidence for the data collection process. A government office building will be selected as an appropriate case study for this research. At the end of this research, a strategic approach or model will be recommended in a specific guideline for enabling energy-efficient operation and maintenance in office buildings.
Space Partitioning for Privacy Enabled 3D City Models
NASA Astrophysics Data System (ADS)
Filippovska, Y.; Wichmann, A.; Kada, M.
2016-10-01
Due to recent technological progress, the capture and processing of highly detailed (3D) data has become extensive. Despite all the prospects of potential uses, data that includes personal living spaces and public buildings can also be considered a serious intrusion into people's privacy and a threat to security. This becomes especially critical if the data is visible to the general public. Thus, a compromise is needed between open access to data and privacy requirements, which can be very different for each application. As privacy is a complex and versatile topic, the focus of this work lies particularly on the visualization of 3D urban data sets. For the purpose of privacy enabled visualizations of 3D city models, we propose to partition the (living) spaces into privacy regions, each featuring its own level of anonymity. Within each region, the depicted 2D and 3D geometry and imagery is anonymized with cartographic generalization techniques. The underlying spatial partitioning is realized as a 2D map generated as a straight skeleton of the open space between buildings. The resulting privacy cells are then merged according to the privacy requirements associated with each building to form larger regions, their borderlines are smoothed, and transition zones are established between privacy regions to give a harmonious visual appearance. It is demonstrated by example how the proposed method generates privacy enabled 3D city models.
The Creation of Space Vector Models of Buildings From RPAS Photogrammetry Data
NASA Astrophysics Data System (ADS)
Trhan, Ondrej
2017-06-01
The results of Remote Piloted Aircraft System (RPAS) photogrammetry are digital surface models and orthophotos. The main problem with the digital surface models obtained is that building walls are not perpendicular and the shapes of roofs are deformed. The task of this paper is to obtain a more accurate digital surface model using building reconstruction. The paper discusses the problem of obtaining and approximating building footprints, reconstructing the final spatial vector digital building model, and modifying the buildings on the digital surface model.
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. The fidelity, accuracy, and validity of simulation models must therefore be monitored in context all along the design phases to build confidence in the achievement of the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, originally developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in the aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
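A hedged sketch of aggregating per-factor scores into a dashboard indicator. The eight factor names follow the NASA credibility assessment standard, but the worst-factor aggregation rule is a simple illustrative choice, not necessarily the project's actual method:

```python
CAS_FACTORS = ["verification", "validation", "input_pedigree",
               "results_uncertainty", "results_robustness",
               "use_history", "M&S_management", "people_qualifications"]

def credibility_level(scores):
    """scores: dict factor -> level 0..4; overall level is the weakest link."""
    missing = [f for f in CAS_FACTORS if f not in scores]
    if missing:
        raise ValueError(f"unscored factors: {missing}")
    return min(scores.values())     # conservative: worst factor dominates

scores = dict.fromkeys(CAS_FACTORS, 3)
scores["results_uncertainty"] = 1   # weak uncertainty quantification
print(credibility_level(scores))    # -> 1: flags the trade-off risk
```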
Toward Model Building for Visual Aesthetic Perception
Lughofer, Edwin; Zeng, Xianyi
2017-01-01
Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to be simulated or quantified using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, quartet, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and brain activity involved in aesthetic appreciation. PMID:29270194
NASA Astrophysics Data System (ADS)
Houng, S.; Hong, T.
2013-12-01
The nature and excitation mechanisms of incidents or non-natural events have been widely investigated using seismological techniques. With the introduction of dense seismic networks, small non-natural events such as building collapses and chemical explosions are well recorded. Two representative non-natural seismic sources are investigated. A 5-story building in South Korea, the Sampoong department store, collapsed on June 25, 1995, causing 1445 casualties. This accident is known to be the second deadliest non-terror-related building collapse in the world. The event was well recorded by a local station ~9 km away. P and S waves were weak, while monotonic Rayleigh waves were observed clearly. The origin time is determined using the surface-wave arrival time. The magnitude of the event is determined to be 1.2, which coincides with a theoretical estimate based on the mass and volume of the building. Synthetic waveforms are modeled for various combinations of velocity structures and source time functions, which allows us to constrain the process of the building collapse. It appears that the building collapsed once within a couple of seconds. We also investigate a M2.1 chemical explosion at a fertilizer plant in Texas on April 18, 2013. It was reported that more than one hundred people were killed or injured by the explosion. Seismic waveforms for nearby stations were collected from the Incorporated Research Institutions for Seismology (IRIS). The event was well recorded at stations up to ~500 km from the source. Strong acoustic signals were observed at stations in a certain great-circle direction. This observation suggests preferential propagation of acoustic waves depending on the atmospheric environment. Waveform cross-correlation, spectral analysis, and waveform modeling are applied to understand the source physics. We discuss the nature of the sources and the source excitation mechanisms.
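Waveform cross-correlation, one of the listed analyses, reduces to finding the lag that maximizes the correlation between two traces; synthetic traces are used here for illustration:

```python
import numpy as np

fs = 100.0                                  # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
pulse = np.exp(-((t - 3.0) ** 2) / 0.01) * np.sin(2 * np.pi * 5 * t)
rng = np.random.default_rng(0)
trace_a = pulse + 0.05 * rng.normal(size=t.size)
trace_b = np.roll(pulse, 120) + 0.05 * rng.normal(size=t.size)  # 1.2 s later

cc = np.correlate(trace_a - trace_a.mean(), trace_b - trace_b.mean(), "full")
lag = (cc.argmax() - (t.size - 1)) / fs     # negative of trace_b's delay
print(f"estimated delay of trace_b relative to trace_a: {-lag:.2f} s")
```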
78 FR 21602 - Amendment to an Approved Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-11
Notice of an amendment to an approved agency information collection: OMB Control No. 1910-5141, Information Collection Request Title: Department of Energy Better Buildings Challenge, including collection through the use of automated collection techniques or other forms of information technology. Issued by the Director, Better Buildings Challenge, Buildings Technology Program. [FR Doc. 2013-08484 Filed 4-10-13]
[A cold/heat property classification strategy based on bio-effects of herbal medicines].
Jiang, Miao; Lv, Ai-Ping
2014-06-01
The property theory of Chinese herbal medicine (CHM) is regarded as the core and basis of Chinese medical theory; however, the underlying mechanism of the properties of CHMs remains unclear, which poses a barrier to the modernization of Chinese herbal medicine. The properties of CHM are often categorized into cold and heat according to the theory of Chinese medicine, and these categories are essential for guiding the clinical application of CHMs. There is an urgent demand to build a cold/heat property classification model to support the property theory of Chinese herbal medicine, as well as to clarify the controversial properties of some herbs. Based on previous studies on the cold/heat properties of CHM, in this paper we describe a novel strategy for building a cold/heat property classification model based on herbal bio-effects. The interdisciplinary cooperation of systems biology, pharmacological networks, and pattern recognition techniques might shed light on the study of cold/heat property theory, provide a scientific model for determining the cold/heat property of herbal medicines, and offer a new strategy for expanding Chinese herbal medicine resources.
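A minimal version of the proposed classification strategy, assuming scikit-learn: a pattern-recognition classifier trained on bio-effect features. The feature matrix below is synthetic; real inputs would come from the systems-biology and pharmacological-network profiling the paper outlines:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_herbs, n_features = 120, 20            # e.g. pathway/target activity scores
X = rng.normal(size=(n_herbs, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # 1 = "heat", 0 = "cold"

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # classification accuracy
```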
ERIC Educational Resources Information Center
Allen, Walter C.
1976-01-01
Examines a century of library architecture in relation to the changing perceptions of library functions, the development of building techniques and materials, fluctuating esthetic fashions and sometimes wildly erratic economic climates. (Author)
A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets
NASA Astrophysics Data System (ADS)
Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.
2009-12-01
The effort to constrain regional scale carbon budgets benefits from assimilating as many high-quality data sources as possible in order to reduce uncertainties. The two most common approaches in this field, bottom-up and top-down techniques, each have their strengths and weaknesses, and partly build on very different sources of information to train, drive, and validate the models. Within the context of the ORCA2 project, we follow both bottom-up and top-down modeling strategies with the ultimate objective of reconciling their surface flux estimates. The ORCA2 top-down component builds on a coupled WRF-STILT transport module that resolves the footprint function of a CO2 concentration measurement at high temporal and spatial resolution. Datasets involved in the current setup comprise GDAS meteorology, remote sensing products, VULCAN fossil fuel inventories, boundary conditions from CarbonTracker, and high-accuracy time series of atmospheric CO2 concentrations. Surface fluxes of CO2 are normally provided by a simple diagnostic model that is optimized against atmospheric observations. For the present study, we replaced the simple model with fluxes generated by an advanced bottom-up process model, Biome-BGC, which uses state-of-the-art algorithms to resolve plant-physiological processes and 'grow' a biosphere based on biogeochemical conditions and climate history. This approach provides a more realistic description of biomass and nutrient pools than the simple model. The process model ingests various remote sensing data sources as well as high-resolution reanalysis meteorology, and can be trained against biometric inventories and eddy-covariance data. Linking the bottom-up flux fields to the atmospheric CO2 concentrations through the transport module allows us to evaluate the spatial representativeness of the BGC flux fields, and in that way assimilates more of the available information than either of the individual modeling techniques alone. Bayesian inversion is then applied to assign scaling factors that align the surface fluxes with the CO2 time series. Our project demonstrates how bottom-up and top-down techniques can be reconciled to arrive at a more robust and balanced spatial carbon budget. We show how to evaluate existing flux products against regionally representative atmospheric observations, i.e., how well the underlying model assumptions represent processes on the regional scale. Adapting process model parameterization sets for, e.g., sub-regions, disturbance regimes, or land cover classes in order to optimize the agreement between surface fluxes and atmospheric observations can lead to improved understanding of the underlying flux mechanisms and reduce uncertainties in the regional carbon budgets.
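The Bayesian-inversion step the abstract describes has a standard linear-Gaussian form. The sketch below (illustrative placeholders, not ORCA2 code or data) solves for posterior flux scaling factors given a transport Jacobian H, prior statistics, and observed concentration anomalies:

```python
# Hedged sketch of a linear-Gaussian Bayesian inversion for flux scaling factors.
import numpy as np

def bayesian_scaling(H, y, s_prior, S_prior, R):
    """Posterior mean and covariance of scaling factors s.
    H: (n_obs, n_regions) footprint/transport Jacobian
    y: (n_obs,) observed-minus-background CO2 concentrations
    s_prior, S_prior: prior mean vector and covariance of scaling factors
    R: (n_obs, n_obs) model-data mismatch covariance
    """
    K = S_prior @ H.T @ np.linalg.inv(H @ S_prior @ H.T + R)  # gain matrix
    s_post = s_prior + K @ (y - H @ s_prior)                  # posterior mean
    S_post = S_prior - K @ H @ S_prior                        # posterior covariance
    return s_post, S_post
```

Scaling factors near 1 with tight posterior variance would indicate that the Biome-BGC flux fields already match the atmospheric constraint in that region.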
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
Defines a model view for exchanging building information models (BIM) at the coordinated design stage of building construction. The approach builds on a standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software. Keywords: specifications, Construction Operations Building information exchange (COBie), Building Information Modeling (BIM).
Accuracy of impression scanning compared with stone casts of implant impressions.
Matta, Ragai Edward; Adler, Werner; Wichmann, Manfred; Heckmann, Siegfried Martin
2017-04-01
Accurate virtual implant models are a necessity for the fabrication of precisely fitting superstructures. The purpose of this in vitro study was to evaluate different methods with which to build an accurate virtual model of a 3-dimensional implant situation in the oral cavity; this model would then be used for computer-aided design and computer-aided manufacturing (CAD-CAM) procedures. A titanium master model with 3 rigidly connected implants was manufactured and digitized with a noncontact industrial scanner to obtain a virtual master model. Impressions of the master model with the implant position locators (IPLs) were made using vinyl siloxanether material, and the impressions were scanned (impression scanning group). For the transfer technique and pick-up technique groups (each n=20), implant analogs were inserted into the impression copings, impressions were made using polyether, and casts were poured in Type 4 gypsum; the IPLs were then screwed into the analogs and scanned. To compare the virtual master model with each virtual test model, the interactive CAD software ATOS Professional was used. The Kruskal-Wallis test was used to determine the overall difference among groups, with the Mann-Whitney U test used for pairwise comparisons; with Bonferroni correction, the α-level was set to .017. The outcome revealed a significant difference among the 3 groups (P<.01) in terms of accuracy. With regard to total deviation across all axes, the transfer technique generated the greatest divergence from the master model, 0.078 mm (±0.022). Deviation with the pick-up technique was 0.041 mm (±0.009), and impression scanning generated the most accurate models, with a deviation of 0.022 mm (±0.007). The impression scanning method improved the precision of CAD-CAM-fabricated superstructures. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
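The reported statistical workflow (Kruskal-Wallis, then pairwise Mann-Whitney U with Bonferroni correction, α = 0.05/3 ≈ .017) can be reproduced with standard tools; the deviation values below are placeholders, not the study's raw data:

```python
# Hedged sketch of the three-group nonparametric comparison.
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

groups = {
    "transfer": [0.078, 0.081, 0.075],         # placeholder deviations (mm)
    "pick_up": [0.041, 0.043, 0.040],
    "impression_scan": [0.022, 0.023, 0.021],
}
print(kruskal(*groups.values()))               # overall difference among groups
alpha = 0.05 / 3                               # Bonferroni-corrected level
for a, b in combinations(groups, 2):
    stat, p = mannwhitneyu(groups[a], groups[b])
    print(a, b, "significant:", p < alpha)     # pairwise comparisons
```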
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic, high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account, achieving natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm produces realistic-looking objects without any noteworthy increase in computational cost.
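For reference, the classical Phong model the authors integrate combines ambient, diffuse, and specular terms per point; this minimal sketch illustrates the model itself, not the authors' CGH pipeline:

```python
# Minimal Phong illumination sketch: radiance = ambient + diffuse + specular.
import numpy as np

def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=16):
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    r = 2.0 * np.dot(n, l) * n - l              # reflected light direction
    diffuse = max(np.dot(n, l), 0.0)            # Lambertian term
    specular = max(np.dot(r, v), 0.0) ** shininess
    return ka + kd * diffuse + ks * specular    # scalar radiance per channel
```

In a colour CGH the coefficients ka, kd, ks would be evaluated per wavelength (red, green, blue) before computing each point's contribution to the hologram plane.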
NASA Technical Reports Server (NTRS)
Foster, T. F.; Lockman, W. K.
1975-01-01
Heat-transfer data for the 0.0175-scale Space Shuttle Vehicle 3 are presented. Interference heating effects were investigated by a model build-up technique covering orbiter-alone, tank-alone, second-stage, and first-stage configurations. The test program was conducted in the NASA-Ames 3.5-Foot Hypersonic Wind Tunnel at Mach 5.3 for nominal free-stream Reynolds numbers per foot of 1.5 x 10^6 and 5.0 x 10^6.
Critical analysis of procurement techniques in construction management sectors
NASA Astrophysics Data System (ADS)
Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad
2018-04-01
Over the last three decades, procurement techniques have been a highlight of construction management (CM) for projects, management contracting, project management, and design-and-build. Owing to the development and use of these techniques, various researchers have explored the criteria for their selection and their performance in terms of time, cost, and quality. Nevertheless, little attention has been paid to the relationship between procurement techniques and related emerging issues such as supply chains, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM), and e-procurement. Through papers chosen from reputable CM-related academic journals, the specified scopes of these issues are methodically assessed with the objective of exploring the status and trends in procurement-related research. The result of this paper contributes both theoretically and practically, helping researchers and industry practitioners to recognize and appreciate the development of procurement techniques.
NASA Astrophysics Data System (ADS)
Cantarino, I.; Torrijo, F. J.; Palencia, S.; Gielen, E.
2014-11-01
This paper proposes a method for valuing the stock of residential buildings in Spain as the first step in assessing possible damage caused to them by natural hazards. For the purposes of the study we had access to the SIOSE (the Spanish Land Use and Cover Information System), a high-resolution land-use model, as well as to a report on the financial valuations of this type of building throughout Spain. Using dasymetric disaggregation processes and GIS techniques, we developed a geolocalized method of obtaining this information, which serves as the exposure variable in the general risk assessment formula. Then, by overlaying a hazard map, the risk value can be easily obtained. An example of its application is given in a case study assessing landslide risk across the entire 23,200 km2 of the Valencia Autonomous Community (NUTS-2), the results of which are analysed by municipal areas (LAU-2) for the years 2005 and 2009.
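The core of dasymetric disaggregation is apportioning an aggregate value over finer spatial units using an ancillary variable. This illustrative sketch (hypothetical field names, not the authors' implementation) distributes a municipality's total residential valuation over its land-use polygons in proportion to residential floor area:

```python
# Illustrative dasymetric disaggregation: split a municipal total across
# land-use polygons weighted by their residential area.
def disaggregate_value(total_value, polygons):
    """polygons: list of dicts, each with a 'residential_area' key (m^2).
    Adds a 'value' key holding that polygon's share of total_value."""
    total_area = sum(p["residential_area"] for p in polygons)
    for p in polygons:
        p["value"] = total_value * p["residential_area"] / total_area
    return polygons

# Example: 10 M EUR spread over two polygons of 6000 and 4000 m^2.
print(disaggregate_value(10e6, [{"residential_area": 6000},
                                {"residential_area": 4000}]))
```

Overlaying the resulting exposure layer with a hazard map then yields the risk value per polygon.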
Lightning and surge protection of large ground facilities
NASA Astrophysics Data System (ADS)
Stringfellow, Michael F.
1988-04-01
The vulnerability of large ground facilities to direct lightning strikes and to lightning-induced overvoltages on power distribution, telephone, and data communication lines is discussed. Advanced electrogeometric modeling is used for the calculation of direct strikes to overhead power lines, buildings, vehicles, and objects within the facility. Possible modes of damage, injury, and loss are discussed. Appropriate protection methods for overhead power lines, structures, vehicles, and aircraft are suggested. Methods to mitigate the effects of transients on overhead and underground power systems, as well as within buildings and other structures, are recommended. The specification and location of low-voltage surge suppressors for the protection of vulnerable hardware such as computers, telecommunication equipment, and radar installations are considered. The advantages and disadvantages of commonly used grounding techniques, such as single-point, multiple, and isolated grounds, are compared. An example is given of the expected distribution of lightning flashes to a large airport, its buildings, structures, and facilities, as well as to vehicles on the ground.
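Electrogeometric models rest on the empirical relation that striking distance grows with the prospective stroke current; a commonly cited form is r = k·I^a with k ≈ 10 and a ≈ 0.65 (r in metres, I in kA). A minimal sketch, assuming those conventional constants rather than the paper's specific parameterization:

```python
# Hedged sketch of the electrogeometric striking-distance relation r = k * I**a.
def striking_distance_m(peak_current_kA, k=10.0, a=0.65):
    """Striking distance in metres for a stroke of given peak current (kA);
    k and a are empirical constants, commonly taken as 10 and 0.65."""
    return k * peak_current_kA ** a

print(striking_distance_m(20.0))  # ~70 m for a 20 kA stroke
```

In a facility-scale study, this radius is swept along conductors and over structures to decide which object each prospective stroke attaches to, yielding the expected distribution of flashes mentioned in the abstract.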