Sample records for "build computer models"

  1. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  2. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  3. Automatic computation for optimum height planning of apartment buildings to improve solar access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong, Yoon-Bok; Kim, Yong-Yee; Seok, Ho-Tae

    2011-01-15

    The objective of this study is to suggest a mathematical model and an optimal algorithm for determining the height of apartment buildings to satisfy the solar rights of survey buildings or survey housing units. The objective is also to develop an automatic computation model for the optimum height of apartment buildings and then to clarify the performance and expected effects. To accomplish the objective of this study, the following procedures were followed: (1) The necessity of the height planning of obstruction buildings to satisfy the solar rights of survey buildings or survey housing units is demonstrated by analyzing, through a literature review, the recent trend of disputes related to solar rights and by examining the social requirements in terms of solar rights. In addition, the necessity of the automatic computation system for height planning of apartment buildings is demonstrated and a suitable analysis method for this system is chosen by investigating the characteristics of analysis methods for solar rights assessment. (2) A case study on the process of height planning of apartment buildings is briefly described and the problems occurring in this process are then examined carefully. (3) To develop an automatic computation model for height planning of apartment buildings, geometrical elements forming apartment buildings are defined by analyzing the geometrical characteristics of apartment buildings. In addition, design factors and regulations required in height planning of apartment buildings are investigated. Based on this knowledge, the methodology and mathematical algorithm to adjust the height of apartment buildings by automatic computation are suggested, and probable problems and the ways to resolve them are discussed. Finally, the methodology and algorithm for the optimization are suggested. (4) Based on the suggested methodology and mathematical algorithm, the automatic computation model for optimum height of apartment buildings is developed and the developed system is verified through the application of some cases. The effects of the suggested model are then demonstrated quantitatively and qualitatively.
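The geometric constraint at the heart of such solar-rights height planning can be sketched with a back-of-envelope computation; the formula, function name, and all numeric values below are illustrative assumptions, not taken from the record above.

```python
import math

# Back-of-envelope form of a solar-rights height constraint: an
# obstruction building at distance d may rise at most
# d * tan(solar_altitude) above a survey window before shading it.
def max_obstruction_height(window_height_m, distance_m, solar_altitude_deg):
    return window_height_m + distance_m * math.tan(math.radians(solar_altitude_deg))

# Obstruction 30 m away, winter-noon solar altitude of 29 degrees,
# survey window sill 1 m above grade:
h_limit = max_obstruction_height(1.0, 30.0, 29.0)
```

An optimizer like the one the record describes would adjust each obstruction building's height against many such window constraints at once.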

  4. Building machines that adapt and compute like brains.

    PubMed

    Kriegeskorte, Nikolaus; Mok, Robert M

    2017-01-01

    Building machines that learn and think like humans is essential not only for cognitive science, but also for computational neuroscience, whose ultimate goal is to understand how cognition is implemented in biological brains. A new cognitive computational neuroscience should build cognitive-level and neural-level models, understand their relationships, and test both types of models with both brain and behavioral data.

  5. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy-to-use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  6. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and the increasing time needed to analyse them are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computing clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared them with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
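The speed-versus-economy trade-off described in this record can be made concrete with a toy cloud-sizing calculation; the instance price, the fixed overhead, and the perfect-parallelism assumption are illustrative, not figures from the study.

```python
# Toy sizing calculation for an embarrassingly parallel model-building
# job: wall-clock time shrinks with instance count while total cost is
# nearly flat, so the choice is between speed and (slightly) more money.
def cloud_run(total_cpu_hours, n_instances, hourly_price, overhead_hours=0.1):
    """Return (wall-clock hours, total cost in dollars) for one job."""
    wall = total_cpu_hours / n_instances + overhead_hours
    cost = wall * n_instances * hourly_price
    return wall, cost

# 100 CPU-hours of model building at an assumed $0.40 per instance-hour:
serial_wall, serial_cost = cloud_run(100, 1, 0.40)
burst_wall, burst_cost = cloud_run(100, 50, 0.40)
```

With fifty instances, wall-clock time drops by roughly a factor of fifty while total cost rises only by the per-instance overhead, which is the kind of quantified trade-off the abstract highlights.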

  7. Toward a Computational Model of Tutoring.

    ERIC Educational Resources Information Center

    Woolf, Beverly Park

    1992-01-01

    Discusses the integration of instructional science and computer science. Topics addressed include motivation for building knowledge-based systems; instructional design issues, including cognitive models, representing student intentions, and student models and error diagnosis; representing tutoring knowledge; building a tutoring system, including…

  8. Computational modeling of peripheral pain: a commentary.

    PubMed

    Argüello, Erick J; Silva, Ricardo J; Huerta, Mónica K; Avila, René S

    2015-06-11

    This commentary is intended to find possible explanations for the low impact of computational modeling on pain research. We discuss the main strategies that have been used in building computational models for the study of pain. The analysis suggests that traditional models lack biological plausibility at some levels, they do not provide clinically relevant results, and they cannot capture the stochastic character of neural dynamics. On this basis, we provide some suggestions that may be useful in building computational models of pain with a wider range of applications.

  9. Evaluation of a micro-scale wind model's performance over realistic building clusters using wind tunnel experiments

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Du, Yunsong; Miao, Shiguang; Fang, Xiaoyi

    2016-08-01

    The simulation performance over complex building clusters of a wind simulation model (Wind Information Field Fast Analysis model, WIFFA) in a micro-scale air pollutant dispersion model system (Urban Microscale Air Pollution dispersion Simulation model, UMAPS) is evaluated using various wind tunnel experimental data, including the CEDVAL (Compilation of Experimental Data for Validation of Micro-Scale Dispersion Models) wind tunnel experiment data and the NJU-FZ experiment data (Nanjing University-Fang Zhuang neighborhood wind tunnel experiment data). The results show that the wind model can reproduce the vortexes triggered by urban buildings well, and that the flow patterns in urban street canyons and building clusters can also be represented. Due to the complex shapes of buildings and their distributions, discrepancies between the simulations and the measurements are usually caused by the simplification of the building shapes and the determination of the key zone sizes. The computational efficiencies of different cases are also discussed in this paper. The model has a high computational efficiency compared to traditional numerical models that solve the Navier-Stokes equations, and can produce very high-resolution (1-5 m) wind fields for a complex neighborhood-scale urban building canopy (~1 km × 1 km) in less than 3 min when run on a personal computer.

  10. ADDRESSING HUMAN EXPOSURES TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS MODELS

    EPA Science Inventory

    This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address challenges for modeling human exposures to air pollutants around urban building microenvironments. There are challenges for more detailed understanding of air pollutant sour...

  11. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    ERIC Educational Resources Information Center

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-01-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In…

  12. Building Cognition: The Construction of Computational Representations for Scientific Discovery.

    PubMed

    Chandrasekharan, Sanjay; Nersessian, Nancy J

    2015-11-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.

  13. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  14. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  15. Rapid Energy Modeling Workflow Demonstration Project

    DTIC Science & Technology

    2014-01-01

    Conditioning Engineers; BIM: Building Information Model; BLCC: building life cycle costs; BPA: Building Performance Analysis; CAD: computer assisted…invited to enroll in the Autodesk Building Performance Analysis (BPA) Certificate Program under a group specifically for DoD installations

  16. A Systems Approach to High Performance Buildings: A Computational Systems Engineering R&D Program to Increase DoD Energy Efficiency

    DTIC Science & Technology

    2012-02-01

    …for Low Energy Building Ventilation and Space Conditioning Systems… Building Energy Models… APPENDIX D: Reduced-Order Modeling and Control Design for Low Energy Building Systems… This section focuses on the modeling and control of airflow in buildings

  17. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Computer Program was developed to simulate building heating and cooling loads and to compute thermal and electric energy consumption and cost. This article reports on the new additional algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.

  18. A Seminar in Mathematical Model-Building.

    ERIC Educational Resources Information Center

    Smith, David A.

    1979-01-01

    A course in mathematical model-building is described. Suggested modeling projects include: urban problems, biology and ecology, economics, psychology, games and gaming, cosmology, medicine, history, computer science, energy, and music. (MK)

  19. Near-Source Modeling Updates: Building Downwash & Near-Road

    EPA Science Inventory

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  20. Evaluation of the Revised Lagrangian Particle Model GRAL Against Wind-Tunnel and Field Observations in the Presence of Obstacles

    NASA Astrophysics Data System (ADS)

    Oettl, Dietmar

    2015-05-01

    A revised microscale flow field model has been implemented in the Lagrangian particle model Graz Lagrangian Model (GRAL) for computing flows around obstacles. It is based on the Reynolds-averaged Navier-Stokes equations in three dimensions and the widely used standard k-ε turbulence model. Here we focus on evaluating the model regarding computed concentrations by use of a comprehensive wind-tunnel experiment with numerous combinations of building geometries, stack positions, and locations. In addition, two field experiments carried out in Denmark and in the U.S. were used to evaluate the model. Further, two different formulations of the standard deviation of wind component fluctuations have also been investigated, but no clear picture could be drawn in this respect. Overall the model is able to capture several of the main features of pollutant dispersion around obstacles, but at least one future model improvement was identified for stack releases within the recirculation zone of buildings. Regulatory applications are the bread-and-butter of most GRAL users nowadays, requiring fast and robust modelling algorithms. Thus, a few simplifications have been introduced to decrease the computational time required. Although predicted concentrations for the two field experiments were found to be in good agreement with observations, shortcomings were identified regarding the extent of computed recirculation zones for the idealized wind-tunnel building geometries, with approaching flows perpendicular to building faces.

  1. The Five Key Questions of Human Performance Modeling.

    PubMed

    Wu, Changxu

    2018-01-01

    By building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance and human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking on HPM's motivations, expectations, procedures, system integration and future directions.

  2. Slicing Method for curved façade and window extraction from point clouds

    NASA Astrophysics Data System (ADS)

    Iman Zolanvari, S. M.; Laefer, Debra F.

    2016-09-01

    Laser scanning technology is a fast and reliable method to survey structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of buildings are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally-efficient method for extracting overall façade and window boundary points for reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through a building. This is done along a façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening area's boundaries, as well as the overall boundary of the façade, in part, by using a one-dimensional projection to accelerate processing. Slices were optimised as 14.3 slices per vertical metre of building and 25 slices per horizontal metre of building, irrespective of building configuration or complexity. The proposed procedure was validated by its application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings and nearly 100% on simple ones. Furthermore, computational times were less than 3 sec for data sets up to 2.6 million points, while similar existing approaches required more than 16 hr for such data sets.
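The one-dimensional projection step described above can be sketched as follows, assuming a synthetic rectilinear façade slice; the function name, bin count, and point cloud are illustrative, not taken from the paper. Points in one horizontal slice are binned along the façade axis, and runs of empty bins mark candidate window openings.

```python
# Bin the points of one horizontal slice along the facade axis; runs of
# empty bins in the 1-D occupancy histogram are candidate openings.
def find_gaps(xs, x_min, x_max, n_bins):
    """Return (start, end) extents of empty-bin runs in a 1-D projection."""
    width = (x_max - x_min) / n_bins
    occupied = [False] * n_bins
    for x in xs:
        i = min(int((x - x_min) / width), n_bins - 1)
        occupied[i] = True
    gaps, start = [], None
    for i, occ in enumerate(occupied):
        if not occ and start is None:
            start = i
        elif occ and start is not None:
            gaps.append((x_min + start * width, x_min + i * width))
            start = None
    if start is not None:
        gaps.append((x_min + start * width, x_max))
    return gaps

# Synthetic slice through a 10 m wall with a ~2 m opening around 4-6 m:
points = [i * 0.05 for i in range(81)] + [6.0 + i * 0.05 for i in range(81)]
gaps = find_gaps(points, 0.0, 10.0, 50)
```

Repeating this per slice and intersecting the gaps across slices is what yields the rectangular opening boundaries the abstract describes.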

  3. Testing the World with Simulations.

    ERIC Educational Resources Information Center

    Roberts, Nancy

    1983-01-01

    Discusses steps involved in model building and simulation: understanding a problem, building a model, and simulation. Includes a mathematical model (focusing on a problem dealing with influenza) written in the DYNAMO computer language, developed specifically for writing simulation models. (Author/JN)

  4. Cyberpsychology: a human-interaction perspective based on cognitive modeling.

    PubMed

    Emond, Bruno; West, Robert L

    2003-10-01

    This paper argues for the relevance of cognitive modeling and cognitive architectures to cyberpsychology. From a human-computer interaction point of view, cognitive modeling can have benefits both for theory and model building, and for the design and usability evaluation of sociotechnical systems. Cognitive modeling research applied to human-computer interaction has two complementary objectives: (1) to develop theories and computational models of human interactive behavior with information and collaborative technologies, and (2) to use the computational models as building blocks for the design, implementation, and evaluation of interactive technologies. From the perspective of building theories and models, cognitive modeling offers the possibility to anchor cyberpsychology theories and models in cognitive architectures. From the perspective of the design and evaluation of sociotechnical systems, cognitive models can provide the basis for simulated users, which can play an important role in usability testing. As an example of the application of cognitive modeling to technology design, the paper presents a simulation of interactive behavior with five different adaptive menu algorithms: random, fixed, stacked, frequency based, and activation based. Results of the simulation indicate that fixed menu positions seem to offer the best support for classification-like tasks such as filing e-mails. This research is part of the Human-Computer Interaction and Broadband Visual Communication research programs at the National Research Council of Canada, in collaboration with the Carleton Cognitive Modeling Lab at Carleton University.
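The contrast between two of the menu policies named in this record, fixed versus frequency based, can be illustrated with a toy scan-cost simulation. This is a sketch under simple assumptions (each selection costs the item's 1-based menu position), not the paper's cognitive-architecture model, and all names and values are illustrative.

```python
import random

# Toy comparison: a fixed menu versus one re-sorted after every
# selection by observed selection frequency.
def mean_cost(policy, choices, n_items):
    order = list(range(n_items))   # current menu order, top to bottom
    counts = [0] * n_items         # observed selection frequencies
    total = 0
    for item in choices:
        total += order.index(item) + 1   # scan cost = 1-based position
        counts[item] += 1
        if policy == "frequency":
            order.sort(key=lambda i: -counts[i])
    return total / len(choices)

random.seed(0)
# Skewed usage where the most popular items start at the BOTTOM of the
# fixed menu, so adaptation has something to gain:
weights = [2 ** i for i in range(10)]
choices = random.choices(range(10), weights=weights, k=2000)
fixed_cost = mean_cost("fixed", choices, 10)
adaptive_cost = mean_cost("frequency", choices, 10)
```

Note the simulation's limitation, which mirrors the paper's finding: if the fixed ordering already matches usage (as in classification-like tasks with stable categories), the adaptive policy has little left to win and its reordering only disrupts users.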

  5. Representing, Running, and Revising Mental Models: A Computational Model

    ERIC Educational Resources Information Center

    Friedman, Scott; Forbus, Kenneth; Sherin, Bruce

    2018-01-01

    People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will…

  6. Hybrid architecture for encoded measurement-based quantum computation

    PubMed Central

    Zwerger, M.; Briegel, H. J.; Dür, W.

    2014-01-01

    We present a hybrid scheme for quantum computation that combines the modular structure of elementary building blocks used in the circuit model with the advantages of a measurement-based approach to quantum computation. We show how to construct optimal resource states of minimal size to implement elementary building blocks for encoded quantum computation in a measurement-based way, including states for error correction and encoded gates. The performance of the scheme is determined by the quality of the resource states, where, within the considered error model, we find a threshold of the order of 10% local noise per particle for fault-tolerant quantum computation and quantum communication. PMID:24946906

  7. Mobile Modelling for Crowdsourcing Building Interior Data

    NASA Astrophysics Data System (ADS)

    Rosser, J.; Morley, J.; Jackson, M.

    2012-06-01

    Indoor spatial data forms an important foundation for many ubiquitous computing applications. It gives context to users operating location-based applications, provides an important source of documentation of buildings and can be of value to computer systems where an understanding of the environment is required. Unlike external geographic spaces, no centralised body or agency is charged with collecting or maintaining such information. The widespread deployment of mobile devices provides a potential tool that would allow rapid model capture and update by a building's users. Here we introduce some of the issues involved in volunteering building interior data and outline a simple mobile tool for the capture of indoor models. Indoor data is inherently private; however, an in-depth analysis of this issue and of the legal considerations is not presented in detail here.

  8. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2200 times faster solution.

  9. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE PAGES

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    2016-04-01

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2200 times faster solution.

  10. From Years of Work in Psychology and Computer Science, Scientists Build Theories of Thinking and Learning.

    ERIC Educational Resources Information Center

    Wheeler, David L.

    1988-01-01

    Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)

  11. Energy consumption program: A computer model simulating energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.

    1978-01-01

    The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. The accuracy of computations is not sacrificed, however, since the results lie within a ±10 percent margin of those read from energy meters. The program is carefully structured to reduce both the user's time and running cost by requesting minimal information from the user and by reducing many internal time-consuming computational loops. Many unique features, not found in any other program, were added to handle two-level electronics control rooms.

  12. Theoretical basis of the DOE-2 building energy use analysis program

    NASA Astrophysics Data System (ADS)

    Curtis, R. B.

    1981-04-01

    A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting-factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.

  13. Assessment of Life Cycle Information Exchanges (LCie): Understanding the Value-Added Benefit of a COBie Process

    DTIC Science & Technology

    2013-10-01

    exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management...equipment. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a...interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class

  14. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    Fine-scale Computational Fluid Dynamics (CFD) simulation of pollutant concentrations within roadway and building microenvironments is feasible using high performance computing. Unlike currently used regulatory air quality models, fine-scale CFD simulations are able to account rig...

  15. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of computational tools that are complex and robust, yet sufficiently simple and inexpensive, to support design and the optimization of energy consumption. This paper demonstrates how such seemingly contradictory requirements can be reconciled, using a simplified non-stationary thermal model of a building motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
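The electric-circuit analogy mentioned above can be sketched as a single-node RC network: thermal resistance R [K/W] and capacitance C [J/K] give dT/dt = (T_out - T)/(R·C) + P/C. The code below is a generic lumped-parameter sketch with assumed values, not the authors' model (which uses semi-analytical method-of-lines solutions rather than explicit time stepping).

```python
# Single-node RC building thermal model, stepped with forward Euler.
# All parameter values are hypothetical.

def simulate_indoor_temperature(t0, t_out, r, c, power_w, dt, steps):
    """Return the indoor-temperature trajectory [t0, t1, ...]."""
    t = t0
    history = [t]
    for _ in range(steps):
        t += dt * ((t_out - t) / (r * c) + power_w / c)
        history.append(t)
    return history

# With no heating, the interior relaxes toward the outdoor temperature
# with time constant R*C (here 1e5 s, about 28 h).
temps = simulate_indoor_temperature(t0=20.0, t_out=0.0, r=0.01, c=1e7,
                                    power_w=0.0, dt=60.0, steps=10000)
```

With heating power P, the same model settles at T_out + P·R, which is the DC solution of the analogous electric circuit.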

  16. Community United Methodist Church passive solar classroom addition: comparison of predicted and actual energy use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, W.H.; Peckham, N.

    1984-01-01

    The Community United Methodist Church of Columbia, Missouri, has recently built a passive solar addition. The building was partially funded by the Department of Energy Passive Solar Commercial Building Demonstration Program (1) and by a grant from the Board of Global Ministries of the United Methodist Church. As part of the design phase, the PASOLE computer code was used to model the thermal characteristics of the building. The building was completed in September 1981, and one and one-half years of end-use energy data had been collected as of March 1983. This paper presents (1) a description of the new building and the computer model used to analyze it, (2) a comparison of predicted and actual energy use, (3) a comparison between the new solar building and conventional portions of the church complex, and (4) a summary of other operational experiences.

  17. Building a Foreign Military Sales Construction Delivery Strategy Decision Support System

    DTIC Science & Technology

    1991-09-01

    DSS, formulates it into a computer model and produces solutions using information and expert heuristics. Using the Expert System Process to Build a DSS...computer model. There are five stages in the development of an expert system. They are: 1) Identify and characterize the important aspects of the problem...and Steven A. Hildreth. U.S. Security Assistance: The Political Process. Massachusetts: Heath and Company, 1985. 19. Guirguis, Amir A., Program

  18. Determining crystal structures through crowdsourcing and coursework

    NASA Astrophysics Data System (ADS)

    Horowitz, Scott; Koepnick, Brian; Martin, Raoul; Tymieniecki, Agnes; Winburn, Amanda A.; Cooper, Seth; Flatten, Jeff; Rogawski, David S.; Koropatkin, Nicole M.; Hailu, Tsinatkeab T.; Jain, Neha; Koldewey, Philipp; Ahlstrom, Logan S.; Chapman, Matthew R.; Sikkema, Andrew P.; Skiba, Meredith A.; Maloney, Finn P.; Beinlich, Felix R. M.; Caglar, Ahmet; Coral, Alan; Jensen, Alice Elizabeth; Lubow, Allen; Boitano, Amanda; Lisle, Amy Elizabeth; Maxwell, Andrew T.; Failer, Barb; Kaszubowski, Bartosz; Hrytsiv, Bohdan; Vincenzo, Brancaccio; de Melo Cruz, Breno Renan; McManus, Brian Joseph; Kestemont, Bruno; Vardeman, Carl; Comisky, Casey; Neilson, Catherine; Landers, Catherine R.; Ince, Christopher; Buske, Daniel Jon; Totonjian, Daniel; Copeland, David Marshall; Murray, David; Jagieła, Dawid; Janz, Dietmar; Wheeler, Douglas C.; Cali, Elie; Croze, Emmanuel; Rezae, Farah; Martin, Floyd Orville; Beecher, Gil; de Jong, Guido Alexander; Ykman, Guy; Feldmann, Harald; Chan, Hugo Paul Perez; Kovanecz, Istvan; Vasilchenko, Ivan; Connellan, James C.; Borman, Jami Lynne; Norrgard, Jane; Kanfer, Jebbie; Canfield, Jeffrey M.; Slone, Jesse David; Oh, Jimmy; Mitchell, Joanne; Bishop, John; Kroeger, John Douglas; Schinkler, Jonas; McLaughlin, Joseph; Brownlee, June M.; Bell, Justin; Fellbaum, Karl Willem; Harper, Kathleen; Abbey, Kirk J.; Isaksson, Lennart E.; Wei, Linda; Cummins, Lisa N.; Miller, Lori Anne; Bain, Lyn; Carpenter, Lynn; Desnouck, Maarten; Sharma, Manasa G.; Belcastro, Marcus; Szew, Martin; Szew, Martin; Britton, Matthew; Gaebel, Matthias; Power, Max; Cassidy, Michael; Pfützenreuter, Michael; Minett, Michele; Wesselingh, Michiel; Yi, Minjune; Cameron, Neil Haydn Tormey; Bolibruch, Nicholas I.; Benevides, Noah; Kathleen Kerr, Norah; Barlow, Nova; Crevits, Nykole Krystyne; Dunn, Paul; Silveira Belo Nascimento Roque, Paulo Sergio; Riber, Peter; Pikkanen, Petri; Shehzad, Raafay; Viosca, Randy; James Fraser, Robert; Leduc, Robert; Madala, Roman; Shnider, Scott; de Boisblanc, 
Sharon; Butkovich, Slava; Bliven, Spencer; Hettler, Stephen; Telehany, Stephen; Schwegmann, Steven A.; Parkes, Steven; Kleinfelter, Susan C.; Michael Holst, Sven; van der Laan, T. J. A.; Bausewein, Thomas; Simon, Vera; Pulley, Warwick; Hull, William; Kim, Annes Yukyung; Lawton, Alexis; Ruesch, Amanda; Sundar, Anjali; Lawrence, Anna-Lisa; Afrin, Antara; Maheshwer, Bhargavi; Turfe, Bilal; Huebner, Christian; Killeen, Courtney Elizabeth; Antebi-Lerrman, Dalia; Luan, Danny; Wolfe, Derek; Pham, Duc; Michewicz, Elaina; Hull, Elizabeth; Pardington, Emily; Galal, Galal Osama; Sun, Grace; Chen, Grace; Anderson, Halie E.; Chang, Jane; Hewlett, Jeffrey Thomas; Sterbenz, Jennifer; Lim, Jiho; Morof, Joshua; Lee, Junho; Inn, Juyoung Samuel; Hahm, Kaitlin; Roth, Kaitlin; Nair, Karun; Markin, Katherine; Schramm, Katie; Toni Eid, Kevin; Gam, Kristina; Murphy, Lisha; Yuan, Lucy; Kana, Lulia; Daboul, Lynn; Shammas, Mario Karam; Chason, Max; Sinan, Moaz; Andrew Tooley, Nicholas; Korakavi, Nisha; Comer, Patrick; Magur, Pragya; Savliwala, Quresh; Davison, Reid Michael; Sankaran, Roshun Rajiv; Lewe, Sam; Tamkus, Saule; Chen, Shirley; Harvey, Sho; Hwang, Sin Ye; Vatsia, Sohrab; Withrow, Stefan; Luther, Tahra K.; Manett, Taylor; Johnson, Thomas James; Ryan Brash, Timothy; Kuhlman, Wyatt; Park, Yeonjung; Popović, Zoran; Baker, David; Khatib, Firas; Bardwell, James C. A.

    2016-09-01

    We show here that computer game players can build high-quality crystal structures. Introduction of a new feature into the computer game Foldit allows players to build and real-space refine structures into electron density maps. To assess the usefulness of this feature, we held a crystallographic model-building competition between trained crystallographers, undergraduate students, Foldit players and automatic model-building algorithms. After removal of disordered residues, a team of Foldit players achieved the most accurate structure. Analysing the target protein of the competition, YPL067C, uncovered a new family of histidine triad proteins apparently involved in the prevention of amyloid toxicity. From this study, we conclude that crystallographers can utilize crowdsourcing to interpret electron density information and to produce structure solutions of the highest quality.

  19. Determining crystal structures through crowdsourcing and coursework.

    PubMed

    Horowitz, Scott; Koepnick, Brian; Martin, Raoul; Tymieniecki, Agnes; Winburn, Amanda A; Cooper, Seth; Flatten, Jeff; Rogawski, David S; Koropatkin, Nicole M; Hailu, Tsinatkeab T; Jain, Neha; Koldewey, Philipp; Ahlstrom, Logan S; Chapman, Matthew R; Sikkema, Andrew P; Skiba, Meredith A; Maloney, Finn P; Beinlich, Felix R M; Popović, Zoran; Baker, David; Khatib, Firas; Bardwell, James C A

    2016-09-16

    We show here that computer game players can build high-quality crystal structures. Introduction of a new feature into the computer game Foldit allows players to build and real-space refine structures into electron density maps. To assess the usefulness of this feature, we held a crystallographic model-building competition between trained crystallographers, undergraduate students, Foldit players and automatic model-building algorithms. After removal of disordered residues, a team of Foldit players achieved the most accurate structure. Analysing the target protein of the competition, YPL067C, uncovered a new family of histidine triad proteins apparently involved in the prevention of amyloid toxicity. From this study, we conclude that crystallographers can utilize crowdsourcing to interpret electron density information and to produce structure solutions of the highest quality.

  20. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  1. Investigation of wind behaviour around high-rise buildings

    NASA Astrophysics Data System (ADS)

    Mat Isa, Norasikin; Fitriah Nasir, Nurul; Sadikin, Azmahani; Ariff Hairul Bahara, Jamil

    2017-09-01

    A study of wind behaviour around high-rise buildings was conducted using a wind-tunnel experiment and computational fluid dynamics. High-rise buildings are buildings or structures with more than 12 floors. Because wind is invisible to the naked eye, its flow around and over buildings is hard to observe and analyse without proper methods, such as a wind tunnel and computational fluid dynamics software. The study was conducted on scaled models of buildings located in Presint 4, Putrajaya, Malaysia: the Ministry of Rural and Regional Development; the Ministry of Information, Communications and Culture; the Ministry of Urban Wellbeing, Housing and Local Government; and the Ministry of Women, Family, and Community. The study parameters were wind direction and four wind velocities chosen to reflect the seasonal monsoons. ANSYS Fluent software was used to compute the simulations, and the computational fluid dynamics results were validated against the wind-tunnel experiment. From these results, the study identifies the characteristics of wind around buildings, including the boundary layer, flow separation, and the wake region, and analyses the phenomena produced as the wind passes the buildings, based on the difference in velocity before and after the buildings.

  2. Computing Systems | High-Performance Computing | NREL

    Science.gov Websites

    investigate, build, and test models of complex phenomena or entire integrated systems that cannot be directly observed or manipulated in the lab, or that would be too expensive or time-consuming to study. Models and visualizations

  3. Coupling fast fluid dynamics and multizone airflow models in Modelica Buildings library to simulate the dynamics of HVAC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Wei; Sevilla, Thomas Alonso; Zuo, Wangda

    Historically, multizone models are widely used in building airflow and energy performance simulations due to their fast computing speed. However, multizone models assume that the air in a room is well mixed, consequently limiting their application. In specific rooms where this assumption fails, the use of computational fluid dynamics (CFD) models may be an alternative option. Previous research has mainly focused on coupling CFD models and multizone models to study airflow in large spaces. While significant, most of these analyses did not consider the coupled simulation of the building airflow with the building's Heating, Ventilation, and Air-Conditioning (HVAC) systems. This paper tries to fill the gap by integrating the models for HVAC systems with coupled multizone and CFD simulations for airflows, using the Modelica simulation platform. To improve the computational efficiency, we incorporated a simplified CFD model named fast fluid dynamics (FFD). We first introduce the data synchronization strategy and implementation in Modelica. Then, we verify the implementation using two case studies involving an isothermal and a non-isothermal flow by comparing model simulations to experimental data. Afterward, we study another three cases that are deemed more realistic, by attaching a variable air volume (VAV) terminal box and a VAV system to the previous flows to assess the capability of the models in studying the dynamic control of HVAC systems. Finally, we discuss further research needs for coupled simulation using these models.

  4. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

    DTIC Science & Technology

    2011-03-21

    throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about...Crowd modeling and simulation technologies. Transactions on Modeling and Computer Simulation, 20(4). Spielberger, C. D. (1983

  5. CulTO: An Ontology-Based Annotation Tool for Data Curation in Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Garozzo, R.; Murabito, F.; Santagati, C.; Pino, C.; Spampinato, C.

    2017-08-01

    This paper proposes CulTO, a software tool relying on a computational ontology for Cultural Heritage domain modelling, with a specific focus on religious historical buildings, to support cultural heritage experts in their investigations. It is specifically designed to support annotation, automatic indexing, classification and curation of photographic data and text documents of historical buildings. CulTO also serves as a useful tool for Historical Building Information Modeling (H-BIM) by enabling semantic 3D data modeling and further enrichment with non-geometrical information of historical buildings through the inclusion of new concepts about historical documents, images, decay or deformation evidence, and decorative elements into BIM platforms. CulTO is the result of a joint research effort between the Laboratory of Surveying and Architectural Photogrammetry "Luigi Andreozzi" and the PeRCeiVe Lab (Pattern Recognition and Computer Vision Lab) of the University of Catania.

  6. Real longitudinal data analysis for real people: building a good enough mixed model.

    PubMed

    Cheng, Jing; Edwards, Lloyd J; Maldonado-Molina, Mildred M; Komro, Kelli A; Muller, Keith E

    2010-02-20

    Mixed effects models have become very popular, especially for the analysis of longitudinal data. One challenge is how to build a good enough mixed effects model. In this paper, we suggest a systematic strategy for addressing this challenge and introduce easily implemented practical advice for building mixed effects models. A general discussion of the scientific strategies motivates the recommended five-step procedure for model fitting. The need to model both the mean structure (the fixed effects) and the covariance structure (the random effects and residual error) creates the fundamental flexibility and complexity, and some very practical recommendations help to conquer that complexity. Centering, scaling, and full-rank coding of all the predictor variables radically improve the chances of convergence, computing speed, and numerical accuracy, and applying computational and assumption diagnostics from univariate linear models to mixed model data greatly helps to detect and solve the related computational problems. The approach helps to fit more general covariance models, a crucial step in selecting a credible covariance model needed for defensible inference. A detailed demonstration of the recommended strategy is based on data from a published study of a randomized trial of a multicomponent intervention to prevent young adolescents' alcohol use. The discussion highlights a need for additional covariance and inference tools for mixed models, and for improving how scientists and statisticians teach and review the process of finding a good enough mixed model. (c) 2009 John Wiley & Sons, Ltd.
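The recommendation to center and scale predictors can be illustrated with the condition number of the fixed-effects design matrix: a raw predictor such as calendar year makes the [1, x] basis nearly collinear, while the centered and scaled version is well conditioned. This is a generic numerical illustration with made-up values, not the paper's data or code.

```python
import numpy as np

# Design matrix with a raw calendar-year predictor: badly conditioned.
year = np.arange(2000, 2011, dtype=float)
x_raw = np.column_stack([np.ones_like(year), year])

# Same predictor, centered and scaled: well conditioned.
z = (year - year.mean()) / year.std()
x_std = np.column_stack([np.ones_like(z), z])

print(np.linalg.cond(x_raw) > 1e3)   # True: intercept and year nearly collinear
print(np.linalg.cond(x_std) < 10)    # True: columns now orthogonal
```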

  7. Knowledge-based model building of proteins: concepts and examples.

    PubMed Central

    Bajorath, J.; Stenkamp, R.; Aruffo, A.

    1993-01-01

    We describe how to build protein models from structural templates. Methods to identify structural similarities between proteins in cases of significant, moderate to low, or virtually absent sequence similarity are discussed. The detection and evaluation of structural relationships is emphasized as a central aspect of protein modeling, distinct from the more technical aspects of model building. Computational techniques to generate and complement comparative protein models are also reviewed. Two examples, P-selectin and gp39, are presented to illustrate the derivation of protein model structures and their use in experimental studies. PMID:7505680

  8. CAL--ERDA program manual. [Building Design Language; LOADS, SYSTEMS, PLANT, ECONOMICS, REPORT, EXECUTIVE, CAL-ERDA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunn, B. D.; Diamond, S. C.; Bennett, G. A.

    1977-10-01

    A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand the Cal-ERDA set of computer programs in detail. The programs described include: an EXECUTIVE processor to create computer-system control commands; a BDL processor to analyze input instructions, execute computer-system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effects of ambient weather conditions; internal occupancy, lighting, and equipment within the building; and variations in the size, location, orientation, construction, walls, roofs, floors, fenestration, attachments (awnings, balconies), and shape of a building; a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components (fans, coils, economizers, humidifiers, etc.) in 16 standard configurations, operated according to various temperature and humidity control schedules; a PLANT equipment program that models the operation of boilers, chillers, electrical generation equipment (diesel or turbine), heat-storage apparatus (chilled or heated water), and solar heating and/or cooling systems; an ECONOMICS analysis program that calculates life-cycle costs; and a REPORT program that produces tables of user-selected variables arranged according to user-specified formats. A set of WEATHER ANALYSIS programs manipulates, summarizes, and plots weather data. Libraries of weather data, schedule data, and building data were prepared.

  9. A 3D puzzle approach to building protein-DNA structures.

    PubMed

    Hinton, Deborah M

    2017-03-15

    Despite recent advances in structural analysis, it is still challenging to obtain a high-resolution structure for a complex of RNA polymerase, transcriptional factors, and DNA. However, using biochemical constraints, 3D printed models of available structures, and computer modeling, one can build biologically relevant models of such supramolecular complexes.

  10. Determining crystal structures through crowdsourcing and coursework

    PubMed Central

    Horowitz, Scott; Koepnick, Brian; Martin, Raoul; Tymieniecki, Agnes; Winburn, Amanda A.; Cooper, Seth; Flatten, Jeff; Rogawski, David S.; Koropatkin, Nicole M.; Hailu, Tsinatkeab T.; Jain, Neha; Koldewey, Philipp; Ahlstrom, Logan S.; Chapman, Matthew R.; Sikkema, Andrew P.; Skiba, Meredith A.; Maloney, Finn P.; Beinlich, Felix R. M.; Caglar, Ahmet; Coral, Alan; Jensen, Alice Elizabeth; Lubow, Allen; Boitano, Amanda; Lisle, Amy Elizabeth; Maxwell, Andrew T.; Failer, Barb; Kaszubowski, Bartosz; Hrytsiv, Bohdan; Vincenzo, Brancaccio; de Melo Cruz, Breno Renan; McManus, Brian Joseph; Kestemont, Bruno; Vardeman, Carl; Comisky, Casey; Neilson, Catherine; Landers, Catherine R.; Ince, Christopher; Buske, Daniel Jon; Totonjian, Daniel; Copeland, David Marshall; Murray, David; Jagieła, Dawid; Janz, Dietmar; Wheeler, Douglas C.; Cali, Elie; Croze, Emmanuel; Rezae, Farah; Martin, Floyd Orville; Beecher, Gil; de Jong, Guido Alexander; Ykman, Guy; Feldmann, Harald; Chan, Hugo Paul Perez; Kovanecz, Istvan; Vasilchenko, Ivan; Connellan, James C.; Borman, Jami Lynne; Norrgard, Jane; Kanfer, Jebbie; Canfield, Jeffrey M.; Slone, Jesse David; Oh, Jimmy; Mitchell, Joanne; Bishop, John; Kroeger, John Douglas; Schinkler, Jonas; McLaughlin, Joseph; Brownlee, June M.; Bell, Justin; Fellbaum, Karl Willem; Harper, Kathleen; Abbey, Kirk J.; Isaksson, Lennart E.; Wei, Linda; Cummins, Lisa N.; Miller, Lori Anne; Bain, Lyn; Carpenter, Lynn; Desnouck, Maarten; Sharma, Manasa G.; Belcastro, Marcus; Szew, Martin; Szew, Martin; Britton, Matthew; Gaebel, Matthias; Power, Max; Cassidy, Michael; Pfützenreuter, Michael; Minett, Michele; Wesselingh, Michiel; Yi, Minjune; Cameron, Neil Haydn Tormey; Bolibruch, Nicholas I.; Benevides, Noah; Kathleen Kerr, Norah; Barlow, Nova; Crevits, Nykole Krystyne; Dunn, Paul; Roque, Paulo Sergio Silveira Belo Nascimento; Riber, Peter; Pikkanen, Petri; Shehzad, Raafay; Viosca, Randy; James Fraser, Robert; Leduc, Robert; Madala, Roman; Shnider, Scott; de Boisblanc, 
Sharon; Butkovich, Slava; Bliven, Spencer; Hettler, Stephen; Telehany, Stephen; Schwegmann, Steven A.; Parkes, Steven; Kleinfelter, Susan C.; Michael Holst, Sven; van der Laan, T. J. A.; Bausewein, Thomas; Simon, Vera; Pulley, Warwick; Hull, William; Kim, Annes Yukyung; Lawton, Alexis; Ruesch, Amanda; Sundar, Anjali; Lawrence, Anna-Lisa; Afrin, Antara; Maheshwer, Bhargavi; Turfe, Bilal; Huebner, Christian; Killeen, Courtney Elizabeth; Antebi-Lerrman, Dalia; Luan, Danny; Wolfe, Derek; Pham, Duc; Michewicz, Elaina; Hull, Elizabeth; Pardington, Emily; Galal, Galal Osama; Sun, Grace; Chen, Grace; Anderson, Halie E.; Chang, Jane; Hewlett, Jeffrey Thomas; Sterbenz, Jennifer; Lim, Jiho; Morof, Joshua; Lee, Junho; Inn, Juyoung Samuel; Hahm, Kaitlin; Roth, Kaitlin; Nair, Karun; Markin, Katherine; Schramm, Katie; Toni Eid, Kevin; Gam, Kristina; Murphy, Lisha; Yuan, Lucy; Kana, Lulia; Daboul, Lynn; Shammas, Mario Karam; Chason, Max; Sinan, Moaz; Andrew Tooley, Nicholas; Korakavi, Nisha; Comer, Patrick; Magur, Pragya; Savliwala, Quresh; Davison, Reid Michael; Sankaran, Roshun Rajiv; Lewe, Sam; Tamkus, Saule; Chen, Shirley; Harvey, Sho; Hwang, Sin Ye; Vatsia, Sohrab; Withrow, Stefan; Luther, Tahra K; Manett, Taylor; Johnson, Thomas James; Ryan Brash, Timothy; Kuhlman, Wyatt; Park, Yeonjung; Popović, Zoran; Baker, David; Khatib, Firas; Bardwell, James C. A.

    2016-01-01

    We show here that computer game players can build high-quality crystal structures. Introduction of a new feature into the computer game Foldit allows players to build and real-space refine structures into electron density maps. To assess the usefulness of this feature, we held a crystallographic model-building competition between trained crystallographers, undergraduate students, Foldit players and automatic model-building algorithms. After removal of disordered residues, a team of Foldit players achieved the most accurate structure. Analysing the target protein of the competition, YPL067C, uncovered a new family of histidine triad proteins apparently involved in the prevention of amyloid toxicity. From this study, we conclude that crystallographers can utilize crowdsourcing to interpret electron density information and to produce structure solutions of the highest quality. PMID:27633552

  11. A comprehensive pipeline for multi-resolution modeling of the mitral valve: Validation, computational efficiency, and predictive capability.

    PubMed

    Drach, Andrew; Khalighi, Amir H; Sacks, Michael S

    2018-02-01

    Multiple studies have demonstrated that the pathological geometries unique to each patient can affect the durability of mitral valve (MV) repairs. While computational modeling of the MV is a promising approach to improve the surgical outcomes, the complex MV geometry precludes use of simplified models. Moreover, the lack of complete in vivo geometric information presents significant challenges in the development of patient-specific computational models. There is thus a need to determine the level of detail necessary for predictive MV models. To address this issue, we have developed a novel pipeline for building attribute-rich computational models of MV with varying fidelity directly from the in vitro imaging data. The approach combines high-resolution geometric information from loaded and unloaded states to achieve a high level of anatomic detail, followed by mapping and parametric embedding of tissue attributes to build a high-resolution, attribute-rich computational models. Subsequent lower resolution models were then developed and evaluated by comparing the displacements and surface strains to those extracted from the imaging data. We then identified the critical levels of fidelity for building predictive MV models in the dilated and repaired states. We demonstrated that a model with a feature size of about 5 mm and mesh size of about 1 mm was sufficient to predict the overall MV shape, stress, and strain distributions with high accuracy. However, we also noted that more detailed models were found to be needed to simulate microstructural events. We conclude that the developed pipeline enables sufficiently complex models for biomechanical simulations of MV in normal, dilated, repaired states. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Proceedings: Conference on Computers in Chemical Education and Research, Dekalb, Illinois, 19-23 July 1971.

    ERIC Educational Resources Information Center

    1971

    Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…

  13. Modeling Poroelastic Wave Propagation in a Real 2-D Complex Geological Structure Obtained via Self-Organizing Maps

    NASA Astrophysics Data System (ADS)

    Itzá Balam, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.

    2018-03-01

    Two main stages of seismic modeling are geological model building and numerical computation of seismic response for the model. The quality of the computed seismic response is partly related to the type of model that is built. Therefore, the model building approaches become as important as seismic forward numerical methods. For this purpose, three petrophysical facies (sands, shales and limestones) are extracted from reflection seismic data and some seismic attributes via the clustering method called Self-Organizing Maps (SOM), which, in this context, serves as a geological model building tool. This model with all its properties is the input to the Optimal Implicit Staggered Finite Difference (OISFD) algorithm to create synthetic seismograms for poroelastic, poroacoustic and elastic media. The results show a good agreement between observed and 2-D synthetic seismograms. This demonstrates that the SOM classification method enables us to extract facies from seismic data and allows us to integrate the lithology at the borehole scale with the 2-D seismic data.
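The Self-Organizing Map clustering step described above can be sketched in a few lines: samples are mapped onto a small 1-D grid of units, with the best-matching unit and its grid neighbours pulled toward each sample. The data here are synthetic two-attribute clusters, not the paper's seismic attributes, and all parameters are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D SOM: three units, two input features, synthetic data with
# three well-separated clusters standing in for three "facies".
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(loc, 0.05, size=(50, 2))
                  for loc in ([0.0, 0.0], [0.5, 0.5], [1.0, 1.0])])

n_units, n_epochs = 3, 30
w = rng.random((n_units, 2))                       # unit weight vectors
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)              # decaying learning rate
    sigma = 1.0 * 0.1 ** (epoch / (n_epochs - 1))  # shrinking neighbourhood
    for x in rng.permutation(data):
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))          # best-matching unit
        h = np.exp(-(np.arange(n_units) - bmu) ** 2 / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)             # pull units toward sample

# Assign each sample to its nearest unit (its "facies" label).
labels = np.array([np.argmin(((w - x) ** 2).sum(axis=1)) for x in data])
```

The shrinking neighbourhood is what distinguishes a SOM from plain online k-means: early epochs order the units along the data, late epochs fine-tune each unit on its own cluster.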

  14. CFD: A Castle in the Sand?

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Wood, William A.

    2004-01-01

    The computational simulation community is not routinely publishing independently verifiable tests to accompany new models or algorithms. A survey reveals that only 22% of new models published are accompanied by tests suitable for independently verifying the new model. As the community develops larger codes with increased functionality, and hence increased complexity in terms of the number of building block components and their interactions, it becomes prohibitively expensive for each development group to derive the appropriate tests for each component. Therefore, the computational simulation community is building its collective castle on a very shaky foundation of components with unpublished and unrepeatable verification tests. The computational simulation community needs to begin publishing component level verification tests before the tide of complexity undermines its foundation.
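An independently verifiable component-level test of the kind the survey above finds missing can be very small. As a hypothetical example (not taken from the paper), a finite-difference building block can ship with a test confirming that its observed order of accuracy matches its design order on a known solution:

```python
import math

# Component under test: 2nd-order central-difference first derivative.
def dfdx_central(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Verification: halve h and measure the observed convergence order
# from the error ratio against an exact derivative.
def observed_order(f, dfdx_exact, x):
    e1 = abs(dfdx_central(f, x, 1e-2) - dfdx_exact(x))
    e2 = abs(dfdx_central(f, x, 5e-3) - dfdx_exact(x))
    return math.log(e1 / e2) / math.log(2.0)

p = observed_order(math.sin, math.cos, x=0.7)
print(abs(p - 2.0) < 0.05)  # True: matches the scheme's design order
```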

  15. Artificial intelligence support for scientific model-building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  16. Stress Behaviour in Compression of Contact-Monolithic Joint of Self-Supporting Wall of Large Panel Multi-Storey Building

    NASA Astrophysics Data System (ADS)

    Derbentsev, I.; Karyakin, A. A.; Volodin, A.

    2017-11-01

    The article deals with the behaviour of a contact-monolithic joint of large-panel buildings under compression. It gives a detailed analysis and description of the stages of failure of such joints, based on test results and computational modelling. The article is of interest to specialists who deal with computational modelling or with research on large-panel multi-storey buildings. It gives valuable information on bearing capacity and flexibility, the eccentricity of load transfer from the upper panel to the lower, and the value of thrust passed to a ceiling panel. Recommendations are given for estimating all of the above parameters.

  17. Public Databases Supporting Computational Toxicology

    EPA Science Inventory

    A major goal of the emerging field of computational toxicology is the development of screening-level models that predict potential toxicity of chemicals from a combination of mechanistic in vitro assay data and chemical structure descriptors. In order to build these models, resea...

  18. Cloud immersion building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-12-01

    This paper presents validated building shielding factors designed for contemporary US housing stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level-three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without a basement, as well as for single-wide manufactured housing units.
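
    The weighted-average combination of floor-specific factors described above can be illustrated with a small sketch; the function name, factor values, and occupancy weights below are hypothetical, not taken from the paper:

```python
def representative_shielding_factor(floor_factors, occupancy_weights):
    """Occupancy-weighted average of floor-specific shielding factors.

    A simplified stand-in for the paper's weighted-average procedure:
    floors where occupants spend more time contribute more to the
    representative factor for the whole housing unit.
    """
    total = sum(occupancy_weights)
    return sum(f * w for f, w in zip(floor_factors, occupancy_weights)) / total

# Hypothetical two-story home with basement: [basement, first, second]
sf = representative_shielding_factor([0.05, 0.40, 0.60], [0.2, 0.5, 0.3])
```

    A lower factor means more shielding, so in this made-up example the well-shielded basement pulls the weighted factor down relative to the upper floors.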

  19. Modeling User Behavior in Computer Learning Tasks.

    ERIC Educational Resources Information Center

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  20. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  1. Modeling Education on the Real World.

    ERIC Educational Resources Information Center

    Hunter, Beverly

    1983-01-01

    Discusses educational applications of computer simulation and model building for grades K to 8, with emphasis on the usefulness of the computer simulation language, micro-DYNAMO, for programing and understanding the models which help to explain social and natural phenomena. A new textbook for junior-senior high school students is noted. (EAO)

  2. Toward a virtual building laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klems, J.H.; Finlayson, E.U.; Olsen, T.H.

    1999-03-01

    In order to achieve in a timely manner the large energy and dollar savings technically possible through improvements in building energy efficiency, it will be necessary to solve the problem of design failure risk. The most economical method of doing this would be to learn to calculate building performance with sufficient detail, accuracy and reliability to avoid design failure. Existing building simulation models (BSMs) are a large step in this direction, but are still not capable of this level of modeling. Developments in computational fluid dynamics (CFD) techniques now allow one to construct a road map from present BSMs to a complete building physical model. The most useful first step is a building interior model (BIM) that would allow prediction of local conditions affecting occupant health and comfort. To provide reliable prediction a BIM must incorporate the correct physical boundary conditions on a building interior. Doing so raises a number of specific technical problems and research questions. The solution of these within a context useful for building research and design is not likely to result from other research on CFD, which is directed toward the solution of different types of problems. A six-step plan for incorporating the correct boundary conditions within the context of the model problem of a large atrium has been outlined. A promising strategy for constructing a BIM is the overset grid technique for representing a building space in a CFD calculation. This technique promises to adapt well to building design and allows a step-by-step approach. A state-of-the-art CFD computer code using this technique has been adapted to the problem and can form the departure point for this research.

  3. Simulation Speed Analysis and Improvements of Modelica Models for Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jorissen, Filip; Wetter, Michael; Helsen, Lieve

    This paper presents an approach for speeding up Modelica models. Insight is provided into how Modelica models are solved and what determines the tool’s computational speed. Aspects such as algebraic loops, code efficiency and integrator choice are discussed. This is illustrated using simple building simulation examples and Dymola. The generality of the work is in some cases verified using OpenModelica. Using this approach, a medium sized office building including building envelope, heating ventilation and air conditioning (HVAC) systems and control strategy can be simulated at a speed five hundred times faster than real time.

  4. Using Wikis for Learning and Knowledge Building: Results of an Experimental Study

    ERIC Educational Resources Information Center

    Kimmerle, Joachim; Moskaliuk, Johannes; Cress, Ulrike

    2011-01-01

    Computer-supported learning and knowledge building play an increasing role in online collaboration. This paper outlines some theories concerning the interplay between individual processes of learning and collaborative processes of knowledge building. In particular, it describes the co-evolution model that attempts to examine processes of learning…

  5. Quantum simulations with noisy quantum computers

    NASA Astrophysics Data System (ADS)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  6. A Computing Infrastructure for Supporting Climate Studies

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team

    2011-12-01

    Climate change is one of the major challenges facing our planet in the 21st century. Scientists build many models to simulate the past and predict climate change for the next decades or century. Most of the models are at a low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA effort on the Climate@Home project to build a virtual supercomputer based on advanced computing technologies, such as cloud computing, grid computing, and others. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching model runs and collecting results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations, and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project to provide management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow Twitter and Facebook to get the latest news about the project. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU Fall Meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences. It will share how the challenges in computation and software integration were solved.

  7. Calibrating Building Energy Models Using Supercomputer Trained Machine Learning Agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyal, Jibonananda; New, Joshua Ryan; Edwards, Richard

    2014-01-01

    Building Energy Modeling (BEM) is an approach to model the energy usage in buildings for design and retrofit purposes. EnergyPlus is the flagship Department of Energy software that performs BEM for different types of buildings. The input to EnergyPlus can often extend to a few thousand parameters, which have to be calibrated manually by an expert for realistic energy modeling. This makes calibration challenging and expensive, rendering building energy modeling infeasible for smaller projects. In this paper, we describe the Autotune research effort, which employs machine learning algorithms to generate agents for the different kinds of standard reference buildings in the U.S. building stock. The parametric space and the variety of building locations and types make this a challenging computational problem necessitating the use of supercomputers. Millions of EnergyPlus simulations are run on supercomputers and subsequently used to train machine learning algorithms to generate agents. These agents, once created, can run in a fraction of the time, thereby allowing cost-effective calibration of building models.
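
    The agent idea, replacing expensive simulation runs with a cheap trained model that can then be inverted for calibration, can be sketched in miniature. The one-parameter linear surrogate and all names below are illustrative assumptions, not the Autotune implementation:

```python
def fit_linear_surrogate(params, outputs):
    """Fit y ~ a*x + b to (input parameter, simulated energy use) pairs,
    a toy stand-in for the supercomputer-trained machine learning agents."""
    n = len(params)
    mx = sum(params) / n
    my = sum(outputs) / n
    a = sum((x - mx) * (y - my) for x, y in zip(params, outputs)) \
        / sum((x - mx) ** 2 for x in params)
    return a, my - a * mx

def calibrate(a, b, measured):
    """Invert the surrogate to find the parameter value that reproduces
    the measured energy use -- fast, since no new simulation is needed."""
    return (measured - b) / a

# Hypothetical campaign: parameter value vs. simulated annual energy use
a, b = fit_linear_surrogate([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
estimate = calibrate(a, b, 9.0)
```

    The real problem is of course high-dimensional and nonlinear, which is why machine-learned agents rather than a line fit are needed, but the workflow (train once on simulations, then query cheaply) is the same.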

  8. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.
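
    The idea of simulating a cumulative distribution function of wind pressure can be sketched as follows; the Gaussian wind-speed model, pressure coefficient, and parameter values are illustrative assumptions, not the MFIE/CLS formulation:

```python
import random

def wind_pressure(v, cp=0.8, rho=1.225):
    """Dynamic wind pressure (Pa) on a building face: q = 0.5 * rho * cp * v^2."""
    return 0.5 * rho * cp * v ** 2

def simulate_pressure_cdf(n=10_000, mean_v=20.0, sigma_v=5.0, seed=42):
    """Monte Carlo empirical CDF of wind pressure for a normally
    distributed wind speed (truncated at zero)."""
    rng = random.Random(seed)
    samples = sorted(wind_pressure(max(rng.gauss(mean_v, sigma_v), 0.0))
                     for _ in range(n))
    cdf = [(i + 1) / n for i in range(n)]   # P(pressure <= samples[i])
    return samples, cdf

samples, cdf = simulate_pressure_cdf()
```

    A design-life load would then be read off the upper tail of such a CDF; the CLS program does this per building face with a far richer load model.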

  9. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr.; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early-career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  10. ADDRESSING HUMAN EXPOSURE TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS (CFD) MODELS

    EPA Science Inventory

    Computational Fluid Dynamics (CFD) simulations provide a number of unique opportunities for expanding and improving capabilities for modeling exposures to environmental pollutants. The US Environmental Protection Agency's National Exposure Research Laboratory (NERL) has been c...

  11. Multicriteria decision model for retrofitting existing buildings

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, B.

    2003-04-01

    In this paper, a model for deciding which buildings in an urban area should be retrofitted is presented. The model was positioned among existing ones by choosing the decision rule, criterion weighting, and decision-support-system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multi-attributive and multi-objective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers, and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case, site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, using the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities feed into the expert information contained in the system.

  12. Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations

    ERIC Educational Resources Information Center

    Sung, Christopher Teh Boon

    2011-01-01

    Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…

  13. Building a three-dimensional model of CYP2C9 inhibition using the Autocorrelator: an autonomous model generator.

    PubMed

    Lardy, Matthew A; Lebrun, Laurie; Bullard, Drew; Kissinger, Charles; Gobbi, Alberto

    2012-05-25

    In modern-day drug discovery campaigns, computational chemists have to be concerned not only with improving the potency of molecules but also with reducing any off-target ADMET activity. There is a plethora of antitargets that computational chemists may have to consider. Fortunately, many antitargets have crystal structures deposited in the PDB. These structures are immediately useful to our Autocorrelator: an automated model generator that optimizes variables for building computational models. This paper describes the use of the Autocorrelator to construct high-quality docking models for cytochrome P450 2C9 (CYP2C9) from two publicly available crystal structures. Both models result in strong correlation coefficients (R² > 0.66) between the predicted and experimentally determined log(IC₅₀) values. Results from the two models overlap well with each other, converging on the same scoring function, deprotonated charge state, and predicted binding orientation for our collection of molecules.
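
    The reported agreement between predicted and experimentally determined log(IC₅₀) values is a squared Pearson correlation, which can be computed as below; the data values are made up for illustration:

```python
def pearson_r2(predicted, observed):
    """Squared Pearson correlation coefficient between two equal-length
    sequences, the usual score for predicted vs. measured log(IC50)."""
    n = len(predicted)
    mp = sum(predicted) / n
    mo = sum(observed) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
    var_p = sum((p - mp) ** 2 for p in predicted)
    var_o = sum((o - mo) ** 2 for o in observed)
    return cov * cov / (var_p * var_o)
```

    A perfectly linear relationship gives R² = 1.0; the paper's R² > 0.66 threshold corresponds to noisier but still strong agreement.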

  14. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler.

    PubMed

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O'Connor, Mary; Shapiro, Bruce A

    2008-10-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes.
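
    A toy version of step (1), assembling building blocks so that a ring closes, can be written as a combinatorial search. The motif names and bend angles below are invented for illustration and are not NanoTiler's actual library:

```python
from itertools import combinations_with_replacement

# Hypothetical motif library: each block bends the ring by a fixed angle
MOTIFS = {"kissing_loop": 120.0, "three_way_junction": 90.0, "bulge": 60.0}

def closed_rings(max_blocks=6, tol=1e-6):
    """Enumerate multisets of motifs whose bend angles sum to 360 degrees,
    i.e. candidate combinations that close into a planar ring."""
    rings = []
    for k in range(2, max_blocks + 1):
        for combo in combinations_with_replacement(MOTIFS, k):
            if abs(sum(MOTIFS[m] for m in combo) - 360.0) < tol:
                rings.append(combo)
    return rings

rings = closed_rings()
```

    Three kissing loops (3 × 120°) or four junctions (4 × 90°) both close; a real pipeline would then optimize the 3D geometry and sequences of each candidate, as the abstract describes.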

  15. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler☆

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  16. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
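
    The data assimilation step, a Sequential Monte Carlo (particle) filter over a graph of rooms, can be sketched minimally; the floor plan, sensor model, and weights below are invented for illustration:

```python
import random

# Hypothetical floor plan as an adjacency graph of rooms
GRAPH = {"lobby": ["hall"], "hall": ["lobby", "office"], "office": ["hall"]}

def estimate_room(observations, n_particles=500, seed=1):
    """Minimal Sequential Monte Carlo sketch: each particle guesses one
    occupant's room; a noisy sensor reports a room at each time step."""
    rng = random.Random(seed)
    particles = [rng.choice(list(GRAPH)) for _ in range(n_particles)]
    for observed_room in observations:
        # Predict: the occupant stays put or moves to an adjacent room
        particles = [rng.choice([p] + GRAPH[p]) for p in particles]
        # Update: particles agreeing with the sensor get higher weight
        weights = [0.8 if p == observed_room else 0.1 for p in particles]
        # Resample in proportion to weight
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return max(set(particles), key=particles.count)   # most likely room

room = estimate_room(["office"] * 5)
```

    The paper's framework does this for many occupants at once over a much larger graph, which is exactly where the efficiency of the graph-based agent-oriented model matters.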

  17. Models@Home: distributed computing in bioinformatics using a screensaver based approach.

    PubMed

    Krieger, Elmar; Vriend, Gert

    2002-02-01

    Due to the steadily growing computational demands in bioinformatics and related scientific disciplines, one is forced to make optimal use of the available resources. A straightforward solution is to build a network of idle computers and let each of them work on a small piece of a scientific challenge, as done by Seti@Home (http://setiathome.berkeley.edu), the world's largest distributed computing project. We developed a generally applicable distributed computing solution that uses a screensaver system similar to Seti@Home. The software exploits the coarse-grained nature of typical bioinformatics projects. Three major considerations for the design were: (1) often, many different programs are needed, while the time is lacking to parallelize them. Models@Home can run any program in parallel without modifications to the source code; (2) in contrast to the Seti project, bioinformatics applications are normally more sensitive to lost jobs. Models@Home therefore includes stringent control over job scheduling; (3) to allow use in heterogeneous environments, Linux and Windows based workstations can be combined with dedicated PCs to build a homogeneous cluster. We present three practical applications of Models@Home, running the modeling programs WHAT IF and YASARA on 30 PCs: force field parameterization, molecular dynamics docking, and database maintenance.
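
    The "stringent control over job scheduling", reclaiming work units whose volunteer PC went silent, can be sketched as follows; the class, method names, and timeout are our own illustration, not the Models@Home API:

```python
import time
from collections import deque

class JobScheduler:
    """Toy dispatcher in the spirit of Models@Home: jobs handed to
    screensaver nodes are re-queued if no result arrives before a
    deadline, so a volunteer switching off a PC cannot lose a job."""

    def __init__(self, jobs, timeout=3600.0):
        self.pending = deque(jobs)
        self.in_flight = {}        # job -> dispatch timestamp
        self.done = []
        self.timeout = timeout

    def dispatch(self, now=None):
        now = time.time() if now is None else now
        # Reclaim jobs whose worker has gone silent (lost-job control)
        for job, started in list(self.in_flight.items()):
            if now - started > self.timeout:
                del self.in_flight[job]
                self.pending.append(job)
        if not self.pending:
            return None
        job = self.pending.popleft()
        self.in_flight[job] = now
        return job

    def complete(self, job):
        if self.in_flight.pop(job, None) is not None:
            self.done.append(job)
```

    The coarse-grained nature of the workloads is what makes such a simple pull-and-requeue scheme sufficient: jobs are independent, so a duplicated or re-run job wastes only one node's time.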

  18. Multi-objective optimization of GENIE Earth system models.

    PubMed

    Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J

    2009-07-13

    The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
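
    The core of any multi-objective step, keeping only parameter sets that are not dominated on every objective, can be sketched independently of the GENIE infrastructure; the objective values below are invented:

```python
def pareto_front(points):
    """Return the non-dominated points for minimization: p survives unless
    some other point q is at least as good on every objective (and differs)."""
    return [p for p in points
            if not any(all(qi <= pi for qi, pi in zip(q, p)) and q != p
                       for q in points)]

# Hypothetical (model error, runtime cost) pairs for candidate parameter sets
candidates = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (4.0, 4.0)]
front = pareto_front(candidates)
```

    In the paper's workflow, the surviving front guides where the response surface models are refined and which expensive Earth system simulations are launched next.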

  19. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, Elijah; Hamby, David; Eckerman, Keith

    2017-10-10

    This paper presents validated building shielding factors designed for contemporary US housing stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level-three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without a basement, as well as for single-wide manufactured housing units. © 2017 IOP Publishing Ltd.

  20. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M; Eckerman, K F

    2015-06-01

    This paper presents validated building shielding factors designed for contemporary US housing stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level-three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without a basement, as well as for single-wide manufactured housing units.

  1. Computational fluid dynamic modeling of the summit of Mt. Hopkins for the MMT Observatory

    NASA Astrophysics Data System (ADS)

    Callahan, S.

    2010-07-01

    Over the past three decades, the staff of the MMT Observatory used a variety of techniques to predict the summit wind characteristics, including wind tunnel modeling and the release of smoke bombs. With the planned addition of a new instrument repair facility to be constructed on the summit of Mt. Hopkins, new computational fluid dynamic (CFD) models were made to determine the building's influence on the thermal environment around the telescope. The models compared the wind profiles and density contours above the telescope enclosure with and without the new building. The results show that the steeply sided Mount Hopkins dominates the summit wind profiles. In typical winds, the height of the telescope remains above the ground layer and is sufficiently separated from the new facility to ensure the heat from the new building does not interfere with the telescope. The results also confirmed that the observatory's waste-heat exhaust duct needs to be relocated to prevent heat from being trapped in the wind shadow of the new building and lofting above the telescope. These models provide many insights into the thermal environment of the summit.

  2. A new security model for collaborative environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Deborah; Lorch, Markus; Thompson, Mary

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity, and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans build trust naturally, the model can support a wide variety of usage scenarios. Its particular strength lies in the support for ad-hoc and dynamic collaborations and the ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.
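
    The endorsement mechanic can be sketched as a simple score update; the initial values and the transfer fraction below are arbitrary illustrations of the idea, not the paper's actual formula:

```python
class TrustManager:
    """Incremental trust sketch: newcomers start with very little trust
    and gain it through endorsements from already-trusted collaborators."""

    def __init__(self):
        self.trust = {}                       # user -> score in [0, 1]

    def add_user(self, user, initial=0.1):
        self.trust.setdefault(user, initial)

    def endorse(self, endorser, endorsee, weight=0.5):
        # An endorsement closes a fraction of the endorsee's trust gap,
        # scaled by how trusted the endorser is.
        boost = weight * self.trust.get(endorser, 0.0)
        current = self.trust.get(endorsee, 0.0)
        self.trust[endorsee] = min(1.0, current + (1.0 - current) * boost)

tm = TrustManager()
tm.add_user("founder", initial=1.0)   # established collaborator
tm.add_user("newcomer")               # joins with minimal trust
tm.endorse("founder", "newcomer")
```

    Because the boost is scaled by the endorser's own trust, a chain of endorsements from weakly trusted users cannot inflate a newcomer's score the way one endorsement from an established collaborator can.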

  3. The Use of High Performance Computing (HPC) to Strengthen the Development of Army Systems

    DTIC Science & Technology

    2011-11-01

    accurately predicting the supersonic Magnus effect about spinning cones, ogive-cylinders, and boat-tailed afterbodies. This work led to the successful… …successful computer model of the proposed product or system, one can then build prototypes on the computer and study the effects on the performance of… …needed. The NRC report discusses the requirements for effective use of such computing power. One needs “models, algorithms, software, hardware…

  4. Methods of mathematical modeling using polynomials of algebra of sets

    NASA Astrophysics Data System (ADS)

    Kazanskiy, Alexandr; Kochetkov, Ivan

    2018-03-01

    The article deals with the construction of discrete mathematical models for solving applied problems arising from the operation of building structures. Security issues in modern high-rise buildings are extremely serious and relevant, and there is no doubt that interest in them will only increase. The territory of the building is divided into zones that must be kept under observation. Zones can overlap and have different priorities. Such situations can be described using formulas of the algebra of sets. The formulas can be programmed, which makes it possible to work with them using computer models.
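
    The zone formalism described above maps directly onto set operations; the grid cells, zones, and priorities below are invented for illustration:

```python
# Hypothetical observation zones as sets of grid cells, each with a priority
zone_a = {(0, 0), (0, 1), (1, 0)}
zone_b = {(1, 0), (1, 1), (2, 1)}
zones = [(zone_a, 2), (zone_b, 1)]   # higher number = higher priority

overlap = zone_a & zone_b            # cells watched by both zones
covered = zone_a | zone_b            # every observed cell
only_a = zone_a - zone_b             # cells unique to zone A

def priority_of(cell, zone_list):
    """Highest priority among the zones containing the cell (0 if none)."""
    return max((p for z, p in zone_list if cell in z), default=0)
```

    Intersection, union, and difference are exactly the algebra-of-sets formulas the article refers to, and, as it notes, they program directly.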

  5. Indoor Multi-Sensor Acquisition System for Projects on Energy Renovation of Buildings.

    PubMed

    Armesto, Julia; Sánchez-Villanueva, Claudio; Patiño-Cambeiro, Faustino; Patiño-Barbeito, Faustino

    2016-05-28

    Energy rehabilitation actions in buildings have become a great economic opportunity for the construction sector. They also constitute a strategic goal in the European Union (EU), given the energy dependence and climate-change commitments of its member states. About 75% of existing buildings in the EU were built before energy efficiency codes had been developed. Approximately 75% to 90% of those standing buildings are expected to remain in use in 2050. Significant advances have been achieved in energy analysis, simulation tools, and computational fluid dynamics for building energy evaluation. However, the gap between predicted and actual savings can still be narrowed. The geomatics and computer science disciplines can substantially help in modelling, inspection, and diagnosis procedures. This paper presents a multi-sensor acquisition system capable of automatically and simultaneously capturing three-dimensional geometric information; thermographic, optical, and panoramic images; an ambient temperature map; a relative humidity map; and a light level map. The system integrates a navigation system based on a Simultaneous Localization and Mapping (SLAM) approach that allows georeferencing every measurement to its position in the building. The described equipment optimizes the energy inspection and diagnosis steps and facilitates the energy modelling of the building.

  6. Indoor Multi-Sensor Acquisition System for Projects on Energy Renovation of Buildings

    PubMed Central

    Armesto, Julia; Sánchez-Villanueva, Claudio; Patiño-Cambeiro, Faustino; Patiño-Barbeito, Faustino

    2016-01-01

    Energy rehabilitation actions in buildings have become a great economic opportunity for the construction sector. They also constitute a strategic goal in the European Union (EU), given the energy dependence and climate-change commitments of its member states. About 75% of existing buildings in the EU were built before energy efficiency codes had been developed. Approximately 75% to 90% of those standing buildings are expected to remain in use in 2050. Significant advances have been achieved in energy analysis, simulation tools, and computational fluid dynamics for building energy evaluation. However, the gap between predicted and actual savings can still be narrowed. The geomatics and computer science disciplines can substantially help in modelling, inspection, and diagnosis procedures. This paper presents a multi-sensor acquisition system capable of automatically and simultaneously capturing three-dimensional geometric information; thermographic, optical, and panoramic images; an ambient temperature map; a relative humidity map; and a light level map. The system integrates a navigation system based on a Simultaneous Localization and Mapping (SLAM) approach that allows georeferencing every measurement to its position in the building. The described equipment optimizes the energy inspection and diagnosis steps and facilitates the energy modelling of the building. PMID:27240379

  7. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  8. Response Surface Model Building Using Orthogonal Arrays for Computer Experiments

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Braun, Robert D.; Moore, Arlene A.; Lepsch, Roger A.

    1997-01-01

    This study investigates response surface methods for computer experiments and discusses some of the approaches available. Orthogonal arrays constructed for computer experiments are studied and an example application to a technology selection and optimization study for a reusable launch vehicle is presented.

  9. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    NASA Astrophysics Data System (ADS)

    Cheng, Tian

    Venetian blinds are widely used in buildings to control the amount of incoming daylight, improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems can result in significant energy savings in both lighting and cooling. However, no convenient computer tool currently allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for this purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the two former tools give unacceptable accuracy due to unrealistic assumptions, while the last may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient, and it is particularly unsuitable for optimally designing a building at the initial stage, when the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulations and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to simulate the daylighting behaviour of venetian blinds. Indoor illuminance at any reference point can be computed directly and efficiently. The models have been validated both with experiments and with simulations in Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and that their accuracy is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus.
Two new methods are developed for the thermal simulation of buildings. A fast Fourier transform (FFT) method is presented to avoid the root-searching process in the inverse Laplace transform of multilayered walls. Generalized explicit FFT formulae for calculating the discrete Fourier transform (DFT) are developed for the first time. They can greatly facilitate the implementation of the FFT. The new method also provides a basis for generating the symbolic response factors. Validation simulations show that it can generate response factors as accurately as the analytical solutions. The second method is for the direct estimation of annual or seasonal cooling loads without the need for tedious hourly energy simulations; it is validated against hourly simulation results from DOE2. A symbolic long-term cooling load can then be created by combining the two methods with thermal network analysis. The symbolic long-term cooling load keeps the design parameters of interest as symbols, which is particularly useful for optimal design and sensitivity analysis. The methodology is applied to an office building in Hong Kong for the optimal design of the building envelope. Design variables such as window-to-wall ratio, building orientation, and glazing optical and thermal properties are included in the study. Results show that the selected design values can significantly affect the energy performance of windows, and that the optimal design of side-lit buildings can greatly enhance energy savings. The application example also demonstrates that the developed methodology significantly facilitates optimal building design and sensitivity analysis, and leads to high computational efficiency.
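    The DFT at the core of the first method is easy to sanity-check in a few lines of code. The sketch below is illustrative only (it is not the paper's generalized explicit FFT formulae): a direct O(N²) DFT is compared against a recursive radix-2 FFT, which must agree term by term.

```python
import cmath

def dft(x):
    """Direct O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    N = len(x)
    if N == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0.0] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t            # butterfly: first half
        out[k + N // 2] = even[k] - t   # butterfly: second half
    return out

# A short sample sequence (e.g. a discretized wall response signal).
x = [0.0, 1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0]
```

The O(N log N) cost of the FFT over the O(N²) direct sum is what makes the transform-based response-factor computation attractive for long series.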

  10. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples and a robust stopping criterion to terminate the sample search, guaranteeing that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions of different dimensionality and complexity, in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
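    A one-dimensional sketch of such a hybrid score is given below. The scoring rule here is a simplification for illustration, not the published TEAD formulation: it combines the distance to the closest training sample (exploration) with the mismatch between a piecewise-linear surrogate and a first-order Taylor prediction from that sample (exploitation, large where the response is strongly nonlinear).

```python
import numpy as np

def hybrid_score(x, xs, f, df, surrogate, w=0.5):
    """Simplified TEAD-style score at candidate x given samples xs,
    values f, derivatives df, and a surrogate predictor."""
    d = np.abs(xs - x)
    i = int(np.argmin(d))
    taylor = f[i] + df[i] * (x - xs[i])   # first-order Taylor expansion
    return w * d[i] + (1.0 - w) * abs(surrogate(x) - taylor)

# Toy target f(x) = x**2 on [0, 2] with three training samples.
xs = np.array([0.0, 1.0, 2.0])
f, df = xs**2, 2 * xs
surrogate = lambda x: np.interp(x, xs, f)   # piecewise-linear surrogate

candidates = np.linspace(0.0, 2.0, 21)
best = candidates[np.argmax([hybrid_score(c, xs, f, df, surrogate)
                             for c in candidates])]
```

On this toy problem the score peaks midway between training samples, where both the distance term and the curvature-induced Taylor mismatch are largest, which is exactly where a new model run is most informative.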

  11. Modeling of Heat Transfer in Rooms in the Modelica "Buildings" Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Zuo, Wangda; Nouidui, Thierry Stephane

    This paper describes the implementation of the room heat transfer model in the free open-source Modelica "Buildings" library. The model can be used as a single room or to compose a multizone building model. We discuss how the model is decomposed into submodels for the individual heat transfer phenomena. We also discuss the main physical assumptions. The room model can be parameterized to use different modeling assumptions, leading to linear or non-linear differential algebraic systems of equations. We present numerical experiments that show how these assumptions affect computing time and accuracy for selected cases of the ANSI/ASHRAE Standard 140-2007 envelope validation tests.
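    As a toy illustration of the kind of submodel such a library composes (this is not the Modelica implementation itself, and all parameter values are invented), a single-capacitance room with one envelope resistance and an internal gain can be integrated explicitly:

```python
def simulate_room(T0=20.0, T_amb=0.0, C=5e5, R=0.01, Q=1000.0,
                  dt=60.0, steps=60):
    """Explicit-Euler integration of the lumped room energy balance
    C * dT/dt = (T_amb - T) / R + Q, returning the final air temperature.
    C [J/K] is the air capacitance, R [K/W] the envelope resistance,
    Q [W] an internal heat gain; values here are purely illustrative."""
    T = T0
    for _ in range(steps):
        T += dt / C * ((T_amb - T) / R + Q)
    return T
```

The steady state of this balance is T_amb + Q*R (10 °C with the defaults), which the simulated temperature approaches from the 20 °C initial condition; a full room model replaces this single lump with coupled submodels for convection, conduction, and radiation.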

  12. Experimental shielding evaluation of the radiation protection provided by the structurally significant components of residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-03-01

    The human health and environmental effects following a postulated accidental release of radioactive material to the environment have been a public and regulatory concern since the early development of nuclear technology. These postulated releases have been researched extensively to better understand the potential risks for accident mitigation and emergency planning purposes. The objective of this investigation is to provide an updated technical basis for contemporary building shielding factors for the US housing stock. Building shielding factors quantify the protection from ionising radiation provided by a certain building type. Much of the current data used to determine the quality of shielding around nuclear facilities and urban environments is based on simplistic point-kernel calculations for 1950s era suburbia and is no longer applicable to the densely populated urban environments realised today. To analyse a building's radiation shielding properties, the ideal approach would be to subject a variety of building types to various radioactive sources and measure the radiation levels in and around the building. While this is not entirely practicable, this research analyses the shielding effectiveness of ten structurally significant US housing-stock models (walls and roofs) important for shielding against ionising radiation. The experimental data are used to benchmark computational models to calculate the shielding effectiveness of various building configurations under investigation from two types of realistic environmental source terms. Various combinations of these ten shielding models can be used to develop full-scale computational housing-unit models for building shielding factor calculations representing 69.6 million housing units (61.3%) in the United States. Results produced in this investigation provide a comparison between theory and experiment behind building shielding factor methodology.
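    A building shielding factor of the kind discussed here is, at its simplest, a transmission ratio: dose rate inside the structure over the unshielded dose rate. The sketch below uses simple exponential attenuation per wall layer with made-up attenuation coefficients, not benchmarked values from the study:

```python
import math

def transmission(mu, thickness_cm):
    """Fraction of photons transmitted through one homogeneous layer,
    using narrow-beam exponential attenuation exp(-mu * t)."""
    return math.exp(-mu * thickness_cm)

def shielding_factor(layers):
    """Combined transmission of a stack of (mu per cm, thickness cm) layers.
    A value of 1.0 means no shielding; smaller means more protection."""
    t = 1.0
    for mu, thickness in layers:
        t *= transmission(mu, thickness)
    return t

# Hypothetical light-frame wall: gypsum board plus wood sheathing
# (the coefficients below are illustrative placeholders).
sf = shielding_factor([(0.05, 1.6), (0.04, 1.2)])
```

Real shielding-factor calculations of the kind benchmarked in this work additionally account for buildup, source geometry, and scattered radiation, which is why Monte Carlo models are calibrated against experiment rather than relying on this narrow-beam form.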

  13. Climate Ocean Modeling on a Beowulf Class System

    NASA Technical Reports Server (NTRS)

    Cheng, B. N.; Chao, Y.; Wang, P.; Bondarenko, M.

    2000-01-01

    With the growing power and shrinking cost of personal computers, the availability of fast Ethernet interconnections, and public domain software packages, it is now possible to combine them to build desktop parallel computers (named Beowulf or PC clusters) at a fraction of what it would cost to buy systems of comparable power from supercomputer companies. This led us to build and assemble our own system, specifically for climate ocean modeling. In this article, we present our experience with such a system, discuss its network performance, and provide some performance comparison data with both the HP SPP2000 and the Cray T3E for an ocean model used in present-day oceanographic research.

  14. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters.

    PubMed

    Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe

    2017-01-01

    Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  15. The Building Game: From Enumerative Combinatorics to Conformational Diffusion

    NASA Astrophysics Data System (ADS)

    Johnson-Chyzhykov, Daniel; Menon, Govind

    2016-08-01

    We study a discrete attachment model for the self-assembly of polyhedra called the building game. We investigate two distinct aspects of the model: (i) the enumerative combinatorics of the intermediate states and (ii) a notion of Brownian motion for the polyhedral linkage defined by each intermediate, which we term conformational diffusion. The combinatorial configuration space of the model is computed for the Platonic, Archimedean, and Catalan solids of up to 30 faces, and several novel enumerative results are generated. These represent the most exhaustive computations of this nature to date. We further extend the building game to include geometric information. The combinatorial structure of each intermediate yields a system of constraints specifying a polyhedral linkage and its moduli space. We use a random walk to simulate a reflected Brownian motion in each moduli space. Empirical statistics of the random walk may be used to define the rates of transition for a Markov process modeling the process of self-assembly.

  16. Consensus Modeling in Support of a Semi-Automated Read-Across Application (SOT)

    EPA Science Inventory

    Read-across is a widely used technique to help fill data gaps in a risk assessment. With the increasing availability of large amounts of computable in vitro and in vivo data on chemicals, it should be possible to build a variety of computer models to help guide a risk assessor in...

  17. Automatic 3D Building Detection and Modeling from Airborne LiDAR Point Clouds

    ERIC Educational Resources Information Center

    Sun, Shaohui

    2013-01-01

    Urban reconstruction, with an emphasis on man-made structure modeling, is an active research area with broad impact on several potential applications. Urban reconstruction combines photogrammetry, remote sensing, computer vision, and computer graphics. Even though there is a huge volume of work that has been done, many problems still remain…

  18. Decision making based on analysis of benefit versus costs of preventive retrofit versus costs of repair after earthquake hazards

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2012-04-01

    This presentation considers interventions on seismically vulnerable early reinforced concrete skeleton buildings from the interwar period, at different performance levels, from avoiding collapse up to assuring immediate post-earthquake functionality. Between these two poles lie degrees of damage depending on the performance aim set. The costs of retrofit and of post-earthquake repair differ depending on the targeted performance. Not only an earthquake has an impact on a heritage building, but also the retrofit measure, for example on its appearance or its functional layout. Criteria of the structural engineer, the investor, the architect/conservator/urban planner, and the owner/inhabitants of the neighbourhood are therefore considered in reaching a benefit-cost decision. A decision based on benefit-cost analysis is one element of a risk management process. A solution must be found as to how much change to accept in a retrofit and how much repairable damage to take into account. Two impact studies were carried out. Numerical simulation was run for the building typology considered, for successive earthquakes selected in a deterministic way (1977, 1986 and two events of 1991 from Vrancea, Romania, and 1978 Thessaloniki, Greece), including the case in which retrofit is carried out between two earthquakes. The building typology itself was studied not only for Greece and Romania but for numerous European countries, including Italy. The typology was compared to earlier reinforced concrete buildings with the Hennebique system, in order to see to what extent these can be counted as structural heritage and to shape the criteria of the architect/conservator.
Based on the typology study, two model buildings were designed. For one of these, different retrofit measures (side walls, structural walls, steel braces, steel jacketing) were considered, while for the other, one of these retrofit techniques (diagonal braces, which also permits adding active measures such as energy dissipaters) was considered in different amounts and locations in the building. Device computations, a civil engineering method for building economics (which, before statistics existed, was also the method for computing the costs of general building upgrades), were carried out for the retrofit and for the repair measures; the method can be applied in different countries, including those with no database of existing seismic retrofit projects. The building elements for which the device computations were done are named "retrofit elements"; they can be new elements, modified elements, or replaced elements of the initial building. The addition of the devices is simple, as the row in project management was, but, for the sake of comparison, complex project management computed in other works was also compared for innovative measures such as FRP (with glass and fibre). The theoretical costs for the model measures were compared to the way the costs of real retrofits of this building type (with reinforced concrete jacketing and FRP) are computed in Greece. The theoretically proposed measures were also compared in general to those applied in practice in Romania and Italy. A further study will include these, as diagonal braces with dissipation have been used in Italy. The typology of braces is also relevant for the local seismic culture, possibly carrying over to another type of skeleton structure whose distribution has been studied: the timber skeleton. A subtype of Romanian reinforced concrete skeleton buildings includes diagonal braces.
In order to assess the costs of rebuilding or of a general upgrade without retrofit, architectural methods for building economics based on floor surface are used. Diagrams have been built to show how the total cost varies as the sum of the preventive retrofit and the post-earthquake repair, and tables compare it to the cost of rebuilding, following a model of the addition of day-lighting in atria of buildings. The moment at which a repair measure has to be applied, as a function of the recurrence period of earthquakes, is analogous to the depth of the atria. Depending on how strong the expected earthquake is, a more extensive retrofit is required in order to decrease repair costs. A further study would allow converting the device computations into floor surface costs, so as not only to implement them in an ICT environment by means of ontologies and BIM, but also to convert them to the urban scale. For the latter, studies of the probabilistic application of structural mechanics models, instead of observation-based statistics, can be considered. First, however, the socio-economic models of construction management games will be considered, both computer games and hard-copy board games, starting with SimCity, which initially included the San Francisco 1906 earthquake, in order to see how the resources needed can be modeled. All criteria build the taxonomy of the decision. Among them, different ways to perform the cost-benefit analysis exist, from weighted trees to pair-wise comparison. The taxonomy was modeled as a decision tree, which builds the basis for an ontology.
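The core benefit-cost trade-off can be sketched in a few lines. All figures below are hypothetical placeholders, not costs from the study; the expected total cost is the up-front retrofit cost plus the probability-weighted post-earthquake repair cost:

```python
def expected_total_cost(retrofit_cost, repair_if_quake, p_quake):
    """Expected cost over the planning horizon: pay the retrofit up front,
    and the repair cost weighted by the chance a damaging quake occurs."""
    return retrofit_cost + p_quake * repair_if_quake

# Hypothetical options: a stronger retrofit lowers the expected damage.
p = 0.3  # assumed probability of a damaging earthquake in the horizon
options = {
    "no retrofit":      expected_total_cost(0.0,       500_000.0, p),
    "braces":           expected_total_cost(80_000.0,  150_000.0, p),
    "structural walls": expected_total_cost(150_000.0,  50_000.0, p),
}
best = min(options, key=options.get)
```

With these invented numbers, the intermediate option wins: heavier retrofit than needed wastes up-front cost, while no retrofit leaves too much expected repair. The decision-tree and pair-wise-comparison criteria described above would extend this scalar comparison with the architect's and owner's non-monetary criteria.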

  19. Building Cognition: The Construction of Computational Representations for Scientific Discovery

    ERIC Educational Resources Information Center

    Chandrasekharan, Sanjay; Nersessian, Nancy J.

    2015-01-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a…

  20. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  1. COMPUTATIONAL CHALLENGES IN BUILDING MULTI-SCALE AND MULTI-PHYSICS MODELS OF CARDIAC ELECTRO-MECHANICS

    PubMed Central

    Plank, G; Prassl, AJ; Augustin, C

    2014-01-01

    Despite the evident multiphysics nature of the heart – it is an electrically controlled mechanical pump – most modeling studies have considered electrophysiology and mechanics in isolation. In no small part, this is due to the formidable modeling challenges involved in building strongly coupled, anatomically accurate, and biophysically detailed multi-scale multi-physics models of cardiac electro-mechanics. Among the main challenges are the selection of model components and their adjustment to achieve integration into a consistent organ-scale model; dealing with technical difficulties such as the exchange of data between the electrophysiological and mechanical models, particularly when different spatio-temporal grids are used for discretization; and, finally, the implementation of advanced numerical techniques to deal with the substantial computational load. In this study we report on progress made in developing a novel modeling framework suited to tackle these challenges. PMID:24043050

  2. Yahoo! Compute Coop (YCC). A Next-Generation Passive Cooling Design for Data Centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robison, AD; Page, Christina; Lytle, Bob

    The purpose of the Yahoo! Compute Coop (YCC) project is to research, design, build and implement a greenfield "efficient data factory" and to specifically demonstrate that the YCC concept is feasible for large facilities housing tens of thousands of heat-producing computing servers. The project scope for the Yahoo! Compute Coop technology includes: - Analyzing and implementing ways in which to drastically decrease energy consumption and waste output. - Analyzing the laws of thermodynamics and exploiting naturally occurring environmental effects in order to maximize the "free cooling" of large data center facilities. "Free cooling" is the direct use of outside air to cool the servers, versus traditional "mechanical cooling" supplied by chillers or other DX units. - Redesigning and simplifying building materials and methods. - Shortening and simplifying build-to-operate schedules while at the same time reducing initial build and operating costs. Selected for its favorable climate, the greenfield project site is located in Lockport, NY. Construction on the 9.0 MW critical load data center facility began in May 2009, and the fully operational facility was deployed in September 2010. The relatively low initial build cost, compatibility with current server and network models, and the efficient use of power and water are all key features that make it a highly compatible and globally implementable design innovation for the data center industry. Yahoo! Compute Coop technology is designed to achieve 99.98% uptime availability. The integrated building design allows for free cooling 99% of the year via the building's unique shape and orientation, as well as the servers' physical configuration.

  3. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    A building fire is a dangerous event that can lead to disaster and massive destruction. The management and disposal of building fires has long attracted interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions: a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene are analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) are implemented, and the relationships between the elements are discussed. Finally, with the theory and framework of VGE, the technology of a building fire scene system with VGE is designed within the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  4. Image-Based Modeling Techniques for Architectural Heritage 3d Digitalization: Limits and Potentialities

    NASA Astrophysics Data System (ADS)

    Santagati, C.; Inzerillo, L.; Di Paola, F.

    2013-07-01

    3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from a data set collection to rapidly build detailed 3D models. The simultaneous application of different multi-view stereo (MVS) algorithms and of different techniques for image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing to carry out semi-automatic data processing, allowing the user to carry on with other tasks on their own computer, whereas desktop systems demand long processing times and heavyweight local computation. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but approaches that verify metric accuracy are few, and none addresses Autodesk 123D Catch as applied to Architectural Heritage documentation. Our approach to this challenging problem is to compare 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LiDAR, considering objects of different size, from details (capitals, mouldings, bases) to large-scale buildings, for practitioner purposes.

  5. Learning Universal Computations with Spikes

    PubMed Central

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require the previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this makes it possible to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  6. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horsey, Henry; Fleming, Katherine; Ball, Brian

    Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path to reducing the time and difficulty. Many use cases remain, however, where information on whole-building energy usage is valuable but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option for accurately estimating building energy consumption in real time is to pre-compute large datasets of potential building energy models and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but can also be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.
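    The pre-computation idea can be illustrated with a toy lookup-based metamodel. The inputs, outputs, and parameter grid below are hypothetical; a real application would interpolate over thousands of OpenStudio simulation results rather than four:

```python
# Pre-computed simulation results keyed by design parameters:
# (window-to-wall ratio, insulation R-value) -> annual energy use
# intensity (kWh/m^2/yr). All values are invented placeholders.
precomputed = {
    (0.2, 13): 95.0,
    (0.2, 19): 88.0,
    (0.4, 13): 102.0,
    (0.4, 19): 94.0,
}

def estimate_eui(wwr, r_value):
    """Return the EUI of the nearest pre-computed case (the R-value axis is
    scaled so both parameters contribute comparably to the distance)."""
    key = min(precomputed,
              key=lambda k: (k[0] - wwr) ** 2 + ((k[1] - r_value) / 10) ** 2)
    return precomputed[key]
```

A query such as `estimate_eui(0.25, 14)` returns in microseconds what a fresh whole-building simulation would take minutes to produce, which is the trade-off that makes near-instantaneous preview tools feasible.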

  7. Comparative analysis of economic models in selected solar energy computer programs

    NASA Astrophysics Data System (ADS)

    Powell, J. W.; Barnes, K. A.

    1982-01-01

    The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program, developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements, serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single-family residence and a low-rise office building.
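    The core life-cycle-cost comparison that such programs perform can be illustrated with a minimal sketch (all costs and rates below are invented; 10 CFR 436 prescribes specific discount rates, escalation rates, and study periods that are not modeled here):

```python
def life_cycle_cost(capital, annual_energy_cost, discount_rate,
                    escalation_rate, years):
    """First cost plus the present value of annually escalating energy
    costs over the study period."""
    pv_energy = sum(
        annual_energy_cost * (1 + escalation_rate) ** t / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )
    return capital + pv_energy

# Hypothetical comparison: high first cost / low energy use (solar) versus
# low first cost / high energy use (conventional).
solar = life_cycle_cost(capital=12000, annual_energy_cost=400,
                        discount_rate=0.07, escalation_rate=0.05, years=25)
conventional = life_cycle_cost(capital=2000, annual_energy_cost=1500,
                               discount_rate=0.07, escalation_rate=0.05, years=25)
```

    With these invented numbers the solar system has the lower life cycle cost; the programs compared above differ mainly in which rates, periods, and cost categories they assume, which is exactly what drives differences in predicted economic performance.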

  8. Analysis of field test data on residential heating and cooling

    NASA Astrophysics Data System (ADS)

    Talbert, S. G.

    1980-12-01

    A computer program that uses field site data collected on 48 homes located in six cities in different climatic regions of the United States is discussed. In addition, a User's Guide was prepared for the computer program, which is contained in a separate two-volume document entitled User's Guide for REAP: Residential Energy Analysis Program. Feasibility studies were conducted pertaining to potential improvements for REAP, including: the addition of an oil-furnace model; improving the infiltration subroutine; adding active and/or passive solar subroutines; incorporating a thermal energy storage model; and providing dual HVAC systems (e.g., heat pump-gas furnace). The purpose of REAP is to enable building designers and energy analysts to evaluate how such factors as building design, weather conditions, internal heat loads, and HVAC equipment performance influence the energy requirements of residential buildings.

  9. The computational future for climate and Earth system models: on the path to petaflop and beyond.

    PubMed

    Washington, Warren M; Buja, Lawrence; Craig, Anthony

    2009-03-13

    The development of climate and Earth system models has a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. Early researchers were well aware of the long-term goal of building Earth system models that would go beyond what is usually included in climate models by adding interactive biogeochemical interactions. In the early days, progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.

  10. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
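    The combinatorial argument can be made concrete with a toy example: a molecule with n independent modification sites needs 2**n explicit species, while a single rule generates the same reactions on demand (illustrative Python, not the syntax of any particular rule-based tool):

```python
from itertools import product

# Explicit-species approach: a molecule with n independent phosphorylation
# sites needs 2**n distinct species, and the reaction list grows with it.
def explicit_species(n_sites):
    return list(product("up", repeat=n_sites))  # 'u' = unmodified, 'p' = phospho

# Rule-based approach: one rule ("phosphorylate any unmodified site")
# generates the same reactions on demand instead of listing them by hand.
def apply_rule(state):
    """States reachable from `state` by phosphorylating one 'u' site."""
    return [state[:i] + ("p",) + state[i + 1:]
            for i, s in enumerate(state) if s == "u"]

species = explicit_species(3)                                 # 8 explicit states
reactions = [(s, t) for s in species for t in apply_rule(s)]  # 12 reactions
```

    For 3 sites the explicit model already needs 8 species and 12 phosphorylation reactions; the rule stays one line no matter how many sites there are, which is the scaling advantage the rule-based tools exploit.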

  12. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  13. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
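    A TEAD-style adaptive loop can be sketched as follows. This is a simplified 1-D illustration (piecewise-linear surrogate, finite-difference gradients, and a fixed sample budget instead of the paper's error-based stopping criterion), not the authors' implementation:

```python
import numpy as np

def model(x):
    """Stand-in for an expensive model run (1-D for illustration)."""
    return np.sin(3 * x) + 0.5 * x

def adaptive_design(candidates, n_init=3, budget=10):
    """Greedy TEAD-like loop: each new sample maximises a hybrid score that
    combines distance to existing samples with the disagreement between the
    surrogate and a local first-order Taylor approximation."""
    X = list(np.linspace(candidates.min(), candidates.max(), n_init))
    Y = [model(x) for x in X]
    while len(X) < budget:
        order = np.argsort(X)
        Xa, Ya = np.array(X)[order], np.array(Y)[order]
        grads = np.gradient(Ya, Xa)           # finite-difference gradients
        scores = []
        for c in candidates:
            i = int(np.abs(Xa - c).argmin())  # nearest existing sample
            d = abs(Xa[i] - c)
            taylor = Ya[i] + grads[i] * (c - Xa[i])
            interp = np.interp(c, Xa, Ya)     # piecewise-linear surrogate
            scores.append(d * (1.0 + abs(interp - taylor)))
        best = candidates[int(np.argmax(scores))]
        X.append(float(best))
        Y.append(model(best))
    return np.array(X), np.array(Y)

X, Y = adaptive_design(np.linspace(0.0, 4.0, 81))
```

    The distance term keeps samples spread out; the Taylor-disagreement term pulls samples toward regions where the surrogate is locally nonlinear, which is the intuition behind TEAD's hybrid score.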

  15. Computer Simulations of Coronary Blood Flow Through a Constriction

    DTIC Science & Technology

    2014-03-01

    interventional procedures (e.g., stent deployment). Building off previous models that have been partially validated with experimental data, this thesis continues to develop the... the artery and increase blood flow. Generally a stent, or a mesh wire tube, is permanently inserted in order to scaffold open the artery wall

  16. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  17. The Virtual Solar System Project: Developing Conceptual Understanding of Astronomical Concepts through Building Three-Dimensional Computational Models.

    ERIC Educational Resources Information Center

    Keating, Thomas; Barnett, Michael; Barab, Sasha A.; Hay, Kenneth E.

    2002-01-01

    Describes the Virtual Solar System (VSS) course which is one of the first attempts to integrate three-dimensional (3-D) computer modeling as a central component of introductory undergraduate education. Assesses changes in student understanding of astronomy concepts as a result of participating in an experimental introductory astronomy course in…

  18. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    PubMed Central

    Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe

    2017-01-01

    Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and its consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future. PMID:28861026

  19. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models on a big scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware, such as multi-core CPUs, GPUs or supercomputers, potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs to have built-in capabilities to make full usage of the available hardware. Developing such a framework that provides understandable code for domain scientists while being runtime efficient poses several challenges for its developers. For example, optimisations can be performed on individual operations or the whole model, or tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. 
We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks and 2) the parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between programming and domain language) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without having to make any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving remaining algorithms such as flow operations over digital elevation maps and further potential improvements like enhancing disk I/O.
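    The parallelisation of a focal operation can be sketched as follows: the raster is padded once, split into row blocks, and each block's 3x3 focal mean is computed by a separate worker (illustrative Python with threads, not the actual Fern implementation):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def focal_mean_block(padded, row0, row1):
    """3x3 focal mean for rows [row0, row1) of the original raster, read
    from a 1-cell padded copy so block edges need no extra communication."""
    ncols = padded.shape[1] - 2
    out = np.zeros((row1 - row0, ncols))
    for dr in range(3):
        for dc in range(3):
            out += padded[row0 + dr:row1 + dr, dc:dc + ncols]
    return out / 9.0

def focal_mean_parallel(raster, n_workers=4):
    """Split the raster into row blocks and compute each block in parallel."""
    padded = np.pad(raster, 1, mode="edge")
    bounds = np.linspace(0, raster.shape[0], n_workers + 1).astype(int)
    with ThreadPoolExecutor(n_workers) as ex:
        blocks = ex.map(lambda b: focal_mean_block(padded, *b),
                        zip(bounds[:-1], bounds[1:]))
        return np.vstack(list(blocks))

result = focal_mean_parallel(np.arange(36, dtype=float).reshape(6, 6))
```

    Local and focal operations parallelise this cleanly because each output cell depends only on a fixed neighbourhood; flow operations over elevation models do not, which is why the abstract singles them out as a remaining challenge.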

  20. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data become available. Data-driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data-driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework that can drive several use cases related to building energy management.
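    The forward pass of such a recurrent model can be sketched in a few lines (a toy Elman RNN with random, untrained weights and invented input features; the paper's actual architecture and data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyRNN:
    """Minimal Elman RNN mapping a sequence of building inputs (e.g. outdoor
    temperature, occupancy) to one predicted energy value per time step."""
    def __init__(self, n_in, n_hidden):
        self.Wx = rng.normal(0, 0.3, (n_hidden, n_in))      # input weights
        self.Wh = rng.normal(0, 0.3, (n_hidden, n_hidden))  # recurrent weights
        self.Wo = rng.normal(0, 0.3, (1, n_hidden))         # output weights

    def forward(self, seq):
        h = np.zeros(self.Wh.shape[0])  # hidden state carries the history
        preds = []
        for x in seq:
            h = np.tanh(self.Wx @ x + self.Wh @ h)
            preds.append((self.Wo @ h).item())
        return preds

# 24 hourly steps of invented features: normalised outdoor temperature
# and an occupancy fraction.
day = np.column_stack([np.sin(np.linspace(0, np.pi, 24)),
                       np.clip(np.linspace(-0.2, 1.0, 24), 0, 1)])
predictions = TinyRNN(n_in=2, n_hidden=8).forward(day)
```

    The recurrent state is what makes such models control oriented: the prediction at each step depends on the history of inputs, not only on the current conditions.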

  1. APPLICATION OF CFD SIMULATIONS FOR SHORT-RANGE ATMOSPHERIC DISPERSION OVER OPEN FIELDS AND WITHIN ARRAYS OF BUILDINGS

    EPA Science Inventory

    Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...

  2. Footwear Physics.

    ERIC Educational Resources Information Center

    Blaser, Mark; Larsen, Jamie

    1996-01-01

    Presents five interactive, computer-based activities that mimic scientific tests used by sport researchers to help companies design high-performance athletic shoes, including impact tests, flexion tests, friction tests, video analysis, and computer modeling. Provides a platform for teachers to build connections between chemistry (polymer science),…

  3. 3D Topological Indoor Building Modeling Integrated with Open Street Map

    NASA Astrophysics Data System (ADS)

    Jamali, A.; Rahman, A. Abdul; Boguslawski, P.

    2016-09-01

    Considering various fields of applications for building surveying and various demands, geometry representation of a building is the most crucial aspect of a building survey. The interiors of buildings need to be described along with the relative locations of the rooms, corridors, doors and exits in many kinds of emergency response, such as fire, bombs, smoke, and pollution. Topological representation is a challenging task within the Geographic Information Science (GIS) environment, as the data structures required to express these relationships are particularly difficult to develop. Even within the Computer Aided Design (CAD) community, the structures for expressing the relationships between adjacent building parts are complex and often incomplete. In this paper, an integration of 3D topological indoor building modeling in the Dual Half Edge (DHE) data structure and an outdoor navigation network from Open Street Map (OSM) is presented.
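    The topological connectivity such a model provides for emergency response can be illustrated with a minimal sketch (a plain room/door adjacency graph with breadth-first routing over an invented floor plan, far simpler than the DHE data structure itself):

```python
from collections import deque

# Each door links two spaces; together they give the topological
# connectivity needed for routing during an emergency (invented floor plan).
doors = [("room101", "corridor"), ("room102", "corridor"),
         ("corridor", "stairwell"), ("stairwell", "exit")]

graph = {}
for a, b in doors:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def escape_route(start, goal="exit"):
    """Breadth-first search: fewest doors from `start` to the exit."""
    prev, queue = {start: None}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:              # reconstruct the path backwards
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None
```

    A geometry-only model (footprints and coordinates) cannot answer this query; it is the explicit adjacency relationships that make indoor routing possible.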

  4. Global Dynamic Exposure and the OpenBuildingMap

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Beutin, T.; Hirata, N.; Hao, K. X.; Wyss, M.; Cotton, F.; Prehn, K.

    2015-12-01

    Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find the balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for us. More than 2.5 billion geographical nodes, more than 150 million building footprints (growing by ~100'000 per day), and a plethora of information about school, hospital, and other critical facility locations allow us to exploit this dataset for risk-related computations. We will harvest this dataset by collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. With this approach, we can increase the resolution of existing exposure models from fragility classes distribution via block-by-block specifications to building-by-building vulnerability. To increase coverage, we will provide a framework for collecting building data by any person or community. We will implement a double crowd-sourced approach to bring together the interest and enthusiasm of communities with the knowledge of earthquake and engineering experts. The first crowd-sourced approach aims at collecting building properties in a community by local people and activists. This will be supported by tailored building capture tools for mobile devices for simple and fast building property capturing. 
The second crowd-sourced approach involves local experts in estimating building vulnerability, who will provide building classification rules that translate building properties into vulnerability and exposure indicators as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM). These indicators will then be combined with a hazard model using the GEM OpenQuake engine to compute a risk model. The free/open framework we will provide can be used on commodity hardware for local to regional exposure capturing and for communities to understand their earthquake risk.
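    The semantic-derivation step can be illustrated with a toy rule set (the tag keys follow OSM conventions, but the classification thresholds are invented and are not GEM's actual Building Taxonomy 2.0 mapping):

```python
def vulnerability_class(tags):
    """Derive a coarse vulnerability class from OSM-style building tags.
    The thresholds here are invented for illustration; GEM's Building
    Taxonomy 2.0 rules are far more detailed."""
    material = tags.get("building:material", "unknown")
    levels = int(tags.get("building:levels", 1))
    if material == "reinforced_concrete" and levels <= 8:
        return "low"
    if material in ("brick", "masonry") and levels <= 3:
        return "moderate"
    if material == "unknown":
        return "unclassified"
    return "high"

buildings = [
    {"building:material": "brick", "building:levels": "2"},
    {"building:material": "reinforced_concrete", "building:levels": "5"},
    {"building": "yes"},          # footprint only: nothing to classify
]
classes = [vulnerability_class(b) for b in buildings]
```

    The "unclassified" case is the common one in practice, which is why the crowd-sourced property capture described above matters: rules can only be as good as the tags they run on.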

  5. A Modelica-based Model Library for Building Energy and Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    2009-04-07

    This paper describes an open-source library with component models for building energy and control systems that is based on Modelica, an equation-based object-oriented language that is well positioned to become the standard for modeling of dynamic systems in various industrial sectors. The library is currently developed to support computational science and engineering for innovative building energy and control systems. Early applications will include controls design and analysis, rapid prototyping to support innovation of new building systems and the use of models during operation for controls, fault detection and diagnostics. This paper discusses the motivation for selecting an equation-based object-oriented language. It presents the architecture of the library and explains how base models can be used to rapidly implement new models. To demonstrate the capability of analyzing novel energy and control systems, the paper closes with an example where we compare the dynamic performance of a conventional hydronic heating system with thermostatic radiator valves to an innovative heating system. In the new system, instead of a centralized circulation pump, each of the 18 radiators has a pump whose speed is controlled using a room temperature feedback loop, and the temperature of the boiler is controlled based on the speed of the radiator pump. All flows are computed by solving for the pressure distribution in the piping network, and the controls include continuous and discrete time controls.
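    The radiator feedback idea can be caricatured in a few lines of Python: one room, one pump whose speed is a proportional feedback on room temperature (all coefficients invented; the Modelica library expresses such systems declaratively and solves the full pressure network instead):

```python
def simulate_room(hours=8.0, dt=0.01, setpoint=21.0, t_out=0.0):
    """Forward-Euler simulation of one room heated by a radiator whose pump
    speed is a proportional feedback on room temperature (all coefficients
    invented for illustration)."""
    t_room, history = 15.0, []
    for _ in range(int(hours / dt)):
        pump = min(1.0, max(0.0, 0.5 * (setpoint - t_room)))  # pump speed, 0..1
        q_heat = 3.0 * pump                     # kW delivered by the radiator
        q_loss = 0.1 * (t_room - t_out)         # kW lost through the envelope
        t_room += dt * (q_heat - q_loss) / 1.5  # 1.5 kWh/K room heat capacity
        history.append(t_room)
    return history

trace = simulate_room()
```

    Even this toy shows the closed-loop behaviour: the pump saturates while the room is cold, then throttles back as the temperature approaches the setpoint.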

  6. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  7. Computational modeling of latent-heat-storage in PCM modified interior plaster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fořt, Jan; Maděra, Jiří; Trník, Anton

    2016-06-08

    Latent heat storage systems represent a promising way to decrease the energy consumption of buildings with respect to the sustainable development principles of the building industry. The presented paper is focused on the evaluation of the effect of PCM incorporation on the thermal performance of cement-lime plasters. For basic characterization of the developed materials, matrix density, bulk density, and total open porosity are measured. Thermal conductivity is assessed by the transient impulse method. DSC analysis is used for the identification of the phase change temperature during the heating and cooling process. Using DSC data, the temperature-dependent specific heat capacity is calculated. On the basis of the experiments performed, the supposed improvement of the energy efficiency of a characteristic building envelope system where the designed plasters are likely to be used is evaluated by a computational analysis. The obtained experimental and computational results show the potential of PCM-modified plasters for improving the thermal stability of buildings and moderating the interior climate.

  8. The Earth System Model

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol

    2003-01-01

    The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from components - atmosphere, ocean, ice, land, chemistry, solid earth, etc. models - and merged together through a coupling program which is responsible for the exchange of data among the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in OSSE experiments to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the Japanese ES theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.
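    The coupler pattern described above can be sketched minimally: each component advances one step, and the coupler hands the boundary fields between them (toy components, dynamics, and field names, not the ESMF API):

```python
class Atmosphere:
    """Toy atmosphere: surface wind relaxes toward the sea surface temp."""
    def __init__(self):
        self.surface_wind = 5.0
    def step(self, sst):
        self.surface_wind = 0.9 * self.surface_wind + 0.1 * sst
        return self.surface_wind

class Ocean:
    """Toy ocean: sea surface temperature nudged by the surface wind."""
    def __init__(self):
        self.sst = 15.0
    def step(self, wind):
        self.sst += 0.01 * (wind - 5.0)
        return self.sst

def couple(steps=10):
    """The coupler owns the exchange: it hands each component the other's
    boundary field once per step."""
    atm, ocn = Atmosphere(), Ocean()
    wind, sst = atm.surface_wind, ocn.sst
    for _ in range(steps):
        wind = atm.step(sst)   # SST handed to the atmosphere component
        sst = ocn.step(wind)   # wind handed back to the ocean component
    return wind, sst

wind, sst = couple()
```

    Keeping the exchange in a separate coupler is what lets component models be developed, swapped, and standardized independently, which is the point of efforts such as ESMF.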

  9. Interior thermal insulation systems for historical building envelopes

    NASA Astrophysics Data System (ADS)

    Jerman, Miloš; Solař, Miloš; Černý, Robert

    2017-11-01

    The design specifics of interior thermal insulation systems applied to historical building envelopes are described. Vapor-tight systems and systems based on capillary thermal insulation materials are taken into account as two basic options differing in building-physical considerations. The possibilities of hygrothermal analysis of renovated historical envelopes, including laboratory methods, computer simulation techniques, and in-situ tests, are discussed. It is concluded that the application of computational models for hygrothermal assessment of interior thermal insulation systems should always be performed with particular care. On the one hand, they present a very effective tool for both service life assessment and possible planning of subsequent reconstructions. On the other, the hygrothermal analysis of any historical building can involve quite a few potential uncertainties which may negatively affect the accuracy of the obtained results.

  10. Computer modeling of the neurotoxin binding site of acetylcholine receptor spanning residues 185 through 196

    NASA Technical Reports Server (NTRS)

    Garduno-Juarez, R.; Shibata, M.; Zielinski, T. J.; Rein, R.

    1987-01-01

    A model of the complex between the acetylcholine receptor and the snake neurotoxin, cobratoxin, was built by molecular model building and energy optimization techniques. The experimentally identified functionally important residues of cobratoxin and the dodecapeptide corresponding to the residues 185-196 of acetylcholine receptor alpha subunit were used to build the model. Both cis and trans conformers of cyclic L-cystine portion of the dodecapeptide were examined. Binding residues independently identified on cobratoxin are shown to interact with the dodecapeptide AChR model.

  11. 20. SITE BUILDING 002 SCANNER BUILDING IN COMPUTER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. SITE BUILDING 002 - SCANNER BUILDING - IN COMPUTER ROOM LOOKING AT "CONSOLIDATED MAINTENANCE OPERATIONS CENTER" JOB AREA AND OPERATION WORK CENTER. TASKS INCLUDE RADAR MAINTENANCE, COMPUTER MAINTENANCE, CYBER COMPUTER MAINTENANCE AND RELATED ACTIVITIES. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA

  12. DEVELOPMENT AND APPLICATIONS OF CFD SIMULATIONS IN SUPPORT OF AIR QUALITY STUDIES INVOLVING BUILDINGS

    EPA Science Inventory

    There is a need to properly develop the application of Computational Fluid Dynamics (CFD) methods in support of air quality studies involving pollution sources near buildings at industrial sites. CFD models are emerging as a promising technology for such assessments, in part due ...

  13. Revision history aware repositories of computational models of biological systems.

    PubMed

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. 
We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users. Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.
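
    The branching-and-merging problem described above can be illustrated with a toy revision graph in plain Python (all names here are illustrative, not the Physiome Model Repository or Mercurial API): because each revision records all of its parents, a merge remains visible in the history and the set of already-merged changes can be recomputed at any time.

```python
# Toy directed-acyclic-graph (DAG) revision history, illustrating why a
# DAG, as used by DVCSs such as Mercurial, preserves merge information
# that a linear list of versions loses. All names here are illustrative.

class Revision:
    def __init__(self, rev_id, files, parents=()):
        self.rev_id = rev_id
        self.files = dict(files)      # a model may be a group of files
        self.parents = list(parents)  # more than one parent marks a merge

def ancestors(rev):
    """Return the ids of all revisions reachable from rev, including rev."""
    seen, stack = set(), [rev]
    while stack:
        r = stack.pop()
        if r.rev_id not in seen:
            seen.add(r.rev_id)
            stack.extend(r.parents)
    return seen

# An 'ancestral' model is modified independently by two groups...
base = Revision("base", {"model.cellml": "v0"})
group_a = Revision("a1", {"model.cellml": "v0+A"}, [base])
group_b = Revision("b1", {"model.cellml": "v0+B"}, [base])

# ...and later merged. Both parents stay on record, so it is always
# possible to compute which changes have already been merged.
merged = Revision("m1", {"model.cellml": "v0+A+B"}, [group_a, group_b])

print(sorted(ancestors(merged)))  # both branches appear in the history
```

    A linear version series would collapse "m1" into a single version with one predecessor, losing the fact that the changes from "b1" are already included.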

  14. Analysis of the impact of simulation model simplifications on the quality of low-energy buildings simulation results

    NASA Astrophysics Data System (ADS)

    Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr

    2017-11-01

    The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing traditional calculation methods, which follow from a static heat exchange model, are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of its HVAC systems. However, such software is usually complex and difficult to use. In addition, developing a simulation model sufficiently faithful to the real building requires considerable designer involvement and is time-consuming and laborious. Simplifying the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of a simulation model developed in Design Builder on the quality of the final results. The objective of this analysis is to find simplifications which yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.

  15. Computer-aided design of biological circuits using TinkerCell

    PubMed Central

    Bergmann, Frank T; Sauro, Herbert M

    2010-01-01

    Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. PMID:21327060

  16. Quality Analysis on 3D Building Models Reconstructed from UAV Imagery

    NASA Astrophysics Data System (ADS)

    Jarzabek-Rychard, M.; Karpina, M.

    2016-06-01

    Recent developments in UAV technology and structure-from-motion techniques have made UAVs standard platforms for 3D data collection. Because of their flexibility and their ability to reach inaccessible urban areas, drones appear to be an optimal solution for urban applications. Building reconstruction from data collected with UAVs has important potential to reduce the labour cost of quickly updating already reconstructed 3D cities. However, especially for updating existing scenes derived from different sensors (e.g. airborne laser scanning), a proper quality assessment is necessary. The objective of this paper is thus to evaluate the potential of UAV imagery as an information source for automatic 3D building modeling at LOD2. The investigation is conducted in three steps: (1) comparing the generated SfM point cloud to ALS data; (2) computing internal consistency measures of the reconstruction process; (3) analysing the deviation of check points identified on building roofs and measured with a tacheometer. In order to gain deep insight into the modeling performance, various quality indicators are computed and analysed. The assessment performed against the ground truth shows that the building models acquired with UAV photogrammetry have an accuracy of less than 18 cm in planimetric position and about 15 cm in the height component.
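
    The accuracy assessment described above boils down to computing deviations between reconstructed model points and surveyed check points. A minimal sketch of such a computation, using made-up coordinates rather than the paper's data:

```python
import math

def rmse_by_component(modeled, surveyed):
    """Root-mean-square error of the (x, y, z) deviations between
    reconstructed model points and tacheometer-measured check points."""
    n = len(modeled)
    sq = [0.0, 0.0, 0.0]
    for m, s in zip(modeled, surveyed):
        for i in range(3):
            sq[i] += (m[i] - s[i]) ** 2
    return tuple(math.sqrt(v / n) for v in sq)

# Illustrative coordinates in metres (not the paper's data):
model_pts  = [(10.12, 20.05, 5.10), (15.00, 22.10, 5.02)]
survey_pts = [(10.00, 20.00, 5.00), (15.10, 22.00, 4.90)]

rx, ry, rz = rmse_by_component(model_pts, survey_pts)
planimetric = math.hypot(rx, ry)   # horizontal accuracy
print(planimetric, rz)             # rz is the height-component accuracy
```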

  17. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  18. Energy and life-cycle cost analysis of a six-story office building

    NASA Astrophysics Data System (ADS)

    Turiel, I.

    1981-10-01

    An energy analysis computer program, DOE-2, was used to compute annual energy use for a typical office building as originally designed and with several energy conserving design modifications. The largest energy use reductions were obtained with the incorporation of daylighting techniques, the use of double pane windows, night temperature setback, and the reduction of artificial lighting levels. A life-cycle cost model was developed to assess the cost-effectiveness of the design modifications discussed. The model incorporates such features as inclusion of taxes, depreciation, and financing of conservation investments. The energy conserving strategies are ranked according to economic criteria such as net present benefit, discounted payback period, and benefit to cost ratio.
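
    The economic criteria named above can be sketched in a few lines of Python. This is a simplified illustration with invented numbers, omitting the tax, depreciation, and financing terms that the actual life-cycle cost model includes:

```python
def net_present_benefit(annual_savings, investment, rate, years):
    """Discounted sum of annual energy-cost savings minus the first cost."""
    pv = sum(annual_savings / (1 + rate) ** t for t in range(1, years + 1))
    return pv - investment

def discounted_payback(annual_savings, investment, rate, max_years=100):
    """First year in which cumulative discounted savings cover the investment."""
    cum = 0.0
    for t in range(1, max_years + 1):
        cum += annual_savings / (1 + rate) ** t
        if cum >= investment:
            return t
    return None  # never pays back within max_years

# Illustrative numbers only: $1,200/yr saved, $6,000 invested, 5% discount
npb = net_present_benefit(1200, 6000, 0.05, 20)
bcr = (npb + 6000) / 6000      # benefit-to-cost ratio
print(npb, bcr, discounted_payback(1200, 6000, 0.05))
```

    Ranking candidate conservation measures by these three numbers reproduces the kind of comparison the abstract describes.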

  19. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real-time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178
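
    Models of this equation-free kind are typically logical (Boolean) networks, in which each component is on or off and updates according to a rule over its regulators. A minimal synchronous-update sketch (a made-up four-node signalling cascade, not an actual Cell Collective model):

```python
# Minimal synchronous Boolean network, the formalism behind
# equation-free logical models. The network here is invented.
rules = {
    "ligand":   lambda s: s["ligand"],                   # external input, held fixed
    "receptor": lambda s: s["ligand"],                   # active iff ligand present
    "kinase":   lambda s: s["receptor"] and not s["tf"], # negative feedback from tf
    "tf":       lambda s: s["kinase"],                   # transcription factor
}

def step(state):
    """Apply every rule simultaneously to the current state."""
    return {node: bool(f(state)) for node, f in rules.items()}

def simulate(state, n):
    for _ in range(n):
        state = step(state)
    return state

# Simulating a what-if scenario is just choosing the starting state;
# a loss-of-function test would clamp a node's rule to False.
start = {"ligand": True, "receptor": False, "kinase": False, "tf": False}
print(simulate(start, 10))
```

    The negative feedback in this toy network produces a sustained oscillation, the kind of dynamic behaviour such simulations are used to explore.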

  20. Greek Pre-Service Teachers' Intentions to Use Computers as In-Service Teachers

    ERIC Educational Resources Information Center

    Fokides, Emmanuel

    2017-01-01

    The study examines the factors affecting Greek pre-service teachers' intention to use computers when they become practicing teachers. Four variables (perceived usefulness, perceived ease of use, self-efficacy, and attitude toward use) as well as behavioral intention to use computers were used so as to build a research model that extended the…

  1. Virtual reality technique to assist measurement of degree of shaking of two minarets of an ancient building

    NASA Astrophysics Data System (ADS)

    Homainejad, Amir S.; Satari, Mehran

    2000-05-01

    VR brings users into a computer-generated reality, while a VE is a simulated world that can take users to any point and viewing direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique able to collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and the methods of data collection vary with the method of image acquisition. VR encompasses real-time graphics, 3D models, and display, and it has applications in the entertainment industry, flight simulators, and industrial design.

  2. Modelling of Rail Vehicles and Track for Calculation of Ground-Vibration Transmission Into Buildings

    NASA Astrophysics Data System (ADS)

    Hunt, H. E. M.

    1996-05-01

    A methodology for the calculation of vibration transmission from railways into buildings is presented. The method permits existing models of railway vehicles and track to be incorporated, and it is applicable to any model of vibration transmission through the ground. Special attention is paid to the relative phasing between adjacent axle-force inputs to the rail, so that vibration transmission may be calculated as a random process. The vehicle-track model is used in conjunction with a building model of infinite length. The track and building are infinite and parallel to each other, and the applied forces are statistically stationary in space, so that vibration levels at any two points along the building are the same. The methodology is two-dimensional for the purpose of applying random process theory, but fully three-dimensional for calculating vibration transmission from the track and through the ground into the foundations of the building. The computational efficiency of the method will interest engineers faced with the task of reducing vibration levels in buildings. It is possible to assess the relative merits of using rail pads, under-sleeper pads, ballast mats, floating-slab track or base isolation for particular applications.

  3. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the capabilities of the end-to-end application service and virtualized computing framework in HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  4. GillesPy: A Python Package for Stochastic Model Building and Simulation.

    PubMed

    Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R

    2016-09-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
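
    The Gillespie direct-method SSA mentioned above can be sketched in plain Python. This illustrates the algorithm itself, not GillesPy's actual API; the representation of reactions as (propensity function, state change) pairs is an assumption made for brevity:

```python
import random

def ssa_direct(x, reactions, t_end, seed=1):
    """Gillespie direct method. x: dict of species counts; reactions: list
    of (propensity_fn, state_change) pairs; returns the state at t_end."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        props = [rate(x) for rate, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            return x                      # no reaction can fire
        t += rng.expovariate(a0)          # exponential time to next event
        if t > t_end:
            return x
        # choose a reaction with probability proportional to its propensity
        r, acc = rng.uniform(0.0, a0), 0.0
        for p, (_, change) in zip(props, reactions):
            acc += p
            if r <= acc:
                for species, delta in change.items():
                    x[species] += delta
                break

# Simple decay reaction A -> 0 with propensity c*A (illustrative numbers):
decay = [(lambda x: 0.5 * x["A"], {"A": -1})]
print(ssa_direct({"A": 100}, decay, t_end=20.0))
```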

  5. GillesPy: A Python Package for Stochastic Model Building and Simulation

    PubMed Central

    Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.

    2017-01-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888

  6. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  7. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data

    PubMed Central

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-01-01

    With rapid urbanization, highly accurate and semantically rich 3D virtualization of building assets becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of the building boundary in a progressive manner. This study covers a full chain of 3D building modeling, from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method.
The performance of the proposed method is tested over the International Society for Photogrammetry and Remote Sensing (ISPRS) benchmark datasets. The results show that the proposed method can robustly produce accurate regularized 3D building rooftop models. PMID:28335486
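
    The MDL framework mentioned above selects the model that minimizes total description length, trading model complexity against goodness of fit. A generic two-part MDL sketch on an invented 1-D profile, not the authors' formulation:

```python
import math
import random

# Generic two-part MDL-style model selection: total cost = bits needed to
# describe the model + bits needed to describe the data given the model.
# Here we choose the number of segments in a piecewise-constant fit of a
# noisy 1-D height profile. This is a generic illustration of the MDL
# trade-off, not the paper's exact criterion.

def fit_cost(ys, k):
    """Description length (in bits, up to a constant) of a k-segment
    piecewise-constant model of ys. Assumes k divides len(ys)."""
    n = len(ys)
    seg = n // k
    rss = 0.0
    for s in range(k):
        chunk = ys[s * seg:(s + 1) * seg]
        mean = sum(chunk) / len(chunk)
        rss += sum((y - mean) ** 2 for y in chunk)
    var = max(rss / n, 1e-12)
    data_bits = 0.5 * n * math.log2(2 * math.pi * math.e * var)
    model_bits = 0.5 * k * math.log2(n)   # parameter cost grows with k
    return model_bits + data_bits         # only relative values matter

rng = random.Random(42)
# A profile with one true step, plus Gaussian noise:
profile = [h + rng.gauss(0, 0.1) for h in [1.0] * 30 + [3.0] * 30]

# Too few segments underfit the data; too many pay a complexity penalty.
best_k = min((1, 2, 3, 5, 6), key=lambda k: fit_cost(profile, k))
print(best_k)
```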

  8. Implicit Regularization for Reconstructing 3D Building Rooftop Models Using Airborne LiDAR Data.

    PubMed

    Jung, Jaewook; Jwa, Yoonseok; Sohn, Gunho

    2017-03-19

    With rapid urbanization, highly accurate and semantically rich 3D virtualization of building assets becomes more critical for supporting various applications, including urban planning, emergency response and location-based services. Many research efforts have been conducted to automatically reconstruct building models at city scale from remotely sensed data. However, developing a fully automated photogrammetric computer vision system enabling the massive generation of highly accurate building models still remains a challenging task. One of the most challenging tasks in 3D building model reconstruction is to regularize the noise introduced in the boundary of a building object retrieved from raw data without knowledge of its true shape. This paper proposes a data-driven modeling approach to reconstruct 3D rooftop models at city scale from airborne laser scanning (ALS) data. The focus of the proposed method is to implicitly derive the shape regularity of 3D building rooftops from given noisy information of the building boundary in a progressive manner. This study covers a full chain of 3D building modeling, from low-level processing to realistic 3D building rooftop modeling. In the element clustering step, building-labeled point clouds are clustered into homogeneous groups by applying height similarity and plane similarity. Based on the segmented clusters, linear modeling cues including outer boundaries, intersection lines, and step lines are extracted. Topology elements among the modeling cues are recovered by the Binary Space Partitioning (BSP) technique. The regularity of the building rooftop model is achieved by an implicit regularization process in the framework of Minimum Description Length (MDL) combined with Hypothesize and Test (HAT). The parameters governing the MDL optimization are automatically estimated based on Min-Max optimization and an entropy-based weighting method.
The performance of the proposed method is tested over the International Society for Photogrammetry and Remote Sensing (ISPRS) benchmark datasets. The results show that the proposed method can robustly produce accurate regularized 3D building rooftop models.

  9. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  10. Integrative structure modeling with the Integrative Modeling Platform.

    PubMed

    Webb, Benjamin; Viswanath, Shruthi; Bonomi, Massimiliano; Pellarin, Riccardo; Greenberg, Charles H; Saltzberg, Daniel; Sali, Andrej

    2018-01-01

    Building models of a biological system that are consistent with the myriad data available is one of the key challenges in biology. Modeling the structure and dynamics of macromolecular assemblies, for example, can give insights into how biological systems work, evolved, might be controlled, and even designed. Integrative structure modeling casts the building of structural models as a computational optimization problem, for which information about the assembly is encoded into a scoring function that evaluates candidate models. Here, we describe our open source software suite for integrative structure modeling, Integrative Modeling Platform (https://integrativemodeling.org), and demonstrate its use. © 2017 The Protein Society.
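
    The idea of encoding information about an assembly into a scoring function that evaluates candidate models can be shown with a toy example (illustrative names and data, not the IMP API): candidate configurations are sampled, scored against distance restraints, and the best-scoring model is kept.

```python
import random

# Toy integrative scoring: a candidate model (here, three 1-D particle
# positions) is scored against distance restraints standing in for
# experimental data. Names and numbers are invented for illustration.

restraints = [(0, 1, 2.0), (1, 2, 2.0), (0, 2, 4.0)]  # (i, j, target distance)

def score(model):
    """Sum of squared restraint violations; lower is better, 0 is perfect."""
    return sum((abs(model[i] - model[j]) - d) ** 2 for i, j, d in restraints)

def optimize(n_particles=3, n_trials=2000, seed=0):
    """Crude random search standing in for a real sampling protocol."""
    rng = random.Random(seed)
    best, best_s = None, float("inf")
    for _ in range(n_trials):
        m = [rng.uniform(0, 6) for _ in range(n_particles)]
        s = score(m)
        if s < best_s:
            best, best_s = m, s
    return best, best_s

model, s = optimize()
print(s)  # a well-satisfied candidate has a score near 0
```

    Real integrative modeling replaces the random search with far more efficient samplers, but the structure of the problem, a scoring function over candidate models, is the same.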

  11. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1988-01-01

    The Clouds project is well under way toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept for structuring software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.

  12. Impacts: NIST Building and Fire Research Laboratory (technical and societal)

    NASA Astrophysics Data System (ADS)

    Raufaste, N. J.

    1993-08-01

    The Building and Fire Research Laboratory (BFRL) of the National Institute of Standards and Technology (NIST) is dedicated to the life cycle quality of constructed facilities. The report describes major effects of BFRL's program on building and fire research. Contents of the document include: structural reliability; nondestructive testing of concrete; structural failure investigations; seismic design and construction standards; rehabilitation codes and standards; alternative refrigerants research; HVAC simulation models; thermal insulation; residential equipment energy efficiency; residential plumbing standards; computer image evaluation of building materials; corrosion-protection for reinforcing steel; prediction of the service lives of building materials; quality of construction materials laboratory testing; roofing standards; simulating fires with computers; fire safety evaluation system; fire investigations; soot formation and evolution; cone calorimeter development; smoke detector standards; standard for the flammability of children's sleepwear; smoldering insulation fires; wood heating safety research; in-place testing of concrete; communication protocols for building automation and control systems; computer simulation of the properties of concrete and other porous materials; cigarette-induced furniture fires; carbon monoxide formation in enclosure fires; halon alternative fire extinguishing agents; turbulent mixing research; materials fire research; furniture flammability testing; standard for the cigarette ignition resistance of mattresses; support of navy firefighter trainer program; and using fire to clean up oil spills.

  13. The Impact and Promise of Open-Source Computational Material for Physics Teaching

    NASA Astrophysics Data System (ADS)

    Christian, Wolfgang

    2017-01-01

    A computer-based modeling approach to teaching must be flexible because students and teachers have different skills and varying levels of preparation. Learning how to run the "software du jour" is not the objective for integrating computational physics material into the curriculum. Learning computational thinking, how to use computation and computer-based visualization to communicate ideas, how to design and build models, and how to use ready-to-run models to foster critical thinking is the objective. Our computational modeling approach to teaching is a research-proven pedagogy that predates computers. It attempts to enhance student achievement through the Modeling Cycle. This approach was pioneered by Robert Karplus and the SCIS Project in the 1960s and 70s and later extended by the Modeling Instruction Program led by Jane Jackson and David Hestenes at Arizona State University. This talk describes a no-cost open-source computational approach aligned with a Modeling Cycle pedagogy. Our tools, curricular material, and ready-to-run examples are freely available from the Open Source Physics Collection hosted on the AAPT-ComPADRE digital library. Examples will be presented.

  14. A cognitive computational model inspired by the immune system response.

    PubMed

    Abdo Abd Al-Hady, Mohamed; Badr, Amr Ahmed; Mostafa, Mostafa Abd Al-Azim

    2014-01-01

    The immune system has a cognitive ability to differentiate between healthy and unhealthy cells. The immune system response (ISR) is stimulated by a disorder in a temporary fuzzy state oscillating between the healthy and unhealthy states. Modeling the immune system, however, is an enormous challenge; the paper introduces an extensive summary of how the immune system response functions, as an overview of a complex topic, in order to present the immune system as a cognitive intelligent agent. The homogeneity and perfection of the natural immune system have always stood out as the sought-after model we attempted to imitate while building our proposed model of cognitive architecture. The paper divides the ISR into four logical phases, setting a computational architectural diagram for each phase and proceeding from functional perspectives (input, process, and output) and their consequences. The proposed architecture components are defined by matching biological operations with computational functions, and hence with the framework of the paper. The architecture also focuses on the interoperability of the main theoretical immunological perspectives (classic, cognitive, and danger theory) as related to computer science terminology. The paper presents a descriptive model of the immune system in order to characterize the nature of the response, deemed intrinsic for building a hybrid computational model based on a cognitive intelligent agent perspective and inspired by natural biology. To that end, the paper highlights the ISR phases as applied to a case study on hepatitis C virus, while illustrating our proposed architecture perspective.

  15. A Cognitive Computational Model Inspired by the Immune System Response

    PubMed Central

    Abdo Abd Al-Hady, Mohamed; Badr, Amr Ahmed; Mostafa, Mostafa Abd Al-Azim

    2014-01-01

    The immune system has a cognitive ability to differentiate between healthy and unhealthy cells. The immune system response (ISR) is stimulated by a disorder in a temporary fuzzy state oscillating between the healthy and unhealthy states. Modeling the immune system, however, is an enormous challenge; the paper introduces an extensive summary of how the immune system response functions, as an overview of a complex topic, in order to present the immune system as a cognitive intelligent agent. The homogeneity and perfection of the natural immune system have always stood out as the sought-after model we attempted to imitate while building our proposed model of cognitive architecture. The paper divides the ISR into four logical phases, setting a computational architectural diagram for each phase and proceeding from functional perspectives (input, process, and output) and their consequences. The proposed architecture components are defined by matching biological operations with computational functions, and hence with the framework of the paper. The architecture also focuses on the interoperability of the main theoretical immunological perspectives (classic, cognitive, and danger theory) as related to computer science terminology. The paper presents a descriptive model of the immune system in order to characterize the nature of the response, deemed intrinsic for building a hybrid computational model based on a cognitive intelligent agent perspective and inspired by natural biology. To that end, the paper highlights the ISR phases as applied to a case study on hepatitis C virus, while illustrating our proposed architecture perspective. PMID:25003131

  16. Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program Simulations and Measured Energy Use for Army Buildings.

    DTIC Science & Technology

    1980-05-01

    This report compares energy use simulated by the Building Loads Analysis and System Thermodynamics (BLAST) computer program with measured energy use for Army buildings. A dental clinic and a battalion headquarters and classroom building were examined. Contents include: building and HVAC system data; computer simulation; comparison of actual and simulated results; analysis and findings.

  17. ENERGY COSTS OF IAQ CONTROL THROUGH INCREASED VENTILATION IN A SMALL OFFICE IN A WARM, HUMID CLIMATE: PARAMETRIC ANALYSIS USING THE DOE-2 COMPUTER MODEL

    EPA Science Inventory

    The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...

  18. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    NASA Astrophysics Data System (ADS)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-08-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies can represent a very rich conceptual model of a given domain in the computer. This model can be used later for a number of purposes in different software applications. In this study, a domain ontology of lead-lag compensator design has been built and used for automatic exercise generation, graphical user interface population, and interaction with the user at any level of detail, including explanations of why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology for implementing a learning environment that can be used for self and lifelong learning purposes. The experience has shown that the use of knowledge models as the basis for educational software applications is capable of showing students the whole complexity of the analysis and design processes at any level of detail. A practical experience with postgraduate students confirmed the benefits and possibilities of the approach.

  19. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  20. Developing 3D morphologies for simulating building energy demand in urban microclimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Omitaomu, Olufemi A.; Allen, Melissa R.

    In order to simulate the effect of interactions between urban morphology and microclimate on demand for heating and cooling in buildings, we utilize source elevation data to create 3D building geometries at the neighborhood and city scale. Additionally, we use urban morphology concepts to design virtual morphologies for simulation scenarios in an undeveloped land parcel. Using these morphologies, we compute building-energy parameters such as the density for each surface and the frontal area index for each of the buildings to be able to effectively model the microclimate for the urban area.

  1. Rise of Buoyant Emissions from Low-Level Sources in the Presence of Upstream and Downstream Obstacles

    NASA Astrophysics Data System (ADS)

    Pournazeri, Sam; Princevac, Marko; Venkatram, Akula

    2012-08-01

    Field and laboratory studies have been conducted to investigate the effect of surrounding buildings on the plume rise from low-level buoyant sources, such as distributed power generators. The field experiments were conducted in Palm Springs, California, USA in November 2010 and plume rise from a 9.3 m stack was measured. In addition to the field study, a laboratory study was conducted in a water channel to investigate the effects of surrounding buildings on plume rise under relatively high wind-speed conditions. Different building geometries and source conditions were tested. The experiments revealed that plume rise from low-level buoyant sources is highly affected by the complex flows induced by buildings stationed upstream and downstream of the source. The laboratory results were compared with predictions from a newly developed numerical plume-rise model. Using the flow measurements associated with each building configuration, the numerical model accurately predicted plume rise from low-level buoyant sources that are influenced by buildings. This numerical plume rise model can be used as a part of a computational fluid dynamics model.

  2. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  3. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
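    The modeling step described in this record, fitting a logistic regression to features extracted from tutor log files, can be sketched in plain Python. The features and data below are invented for illustration; the record does not specify which Andes log variables were used.

```python
import math

# Hypothetical per-attempt log features (num_hints, minutes_on_task) with a
# synthetic success label; the real Andes features are not given in the record.
data = [((h, t), 1 if 2.0 - 0.8 * h - 0.1 * t > 0 else 0)
        for h in range(4) for t in (5, 15, 25)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit weights and bias by plain gradient descent on the logistic log-loss.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    gw, gb = [0.0, 0.0], 0.0
    for x, y in data:
        err = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    n = len(data)
    w[0] -= lr * gw[0] / n
    w[1] -= lr * gw[1] / n
    b -= lr * gb / n

def predict(x):
    """Predicted probability that the student solves the problem."""
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b)
```

    In practice one would use a library fitter and evaluate on held-out attempts; the sketch only shows the shape of the model.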

  4. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data; and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  5. Interactive computation of coverage regions for indoor wireless communication

    NASA Astrophysics Data System (ADS)

    Abbott, A. Lynn; Bhat, Nitin; Rappaport, Theodore S.

    1995-12-01

    This paper describes a system which assists in the strategic placement of rf base stations within buildings. Known as the site modeling tool (SMT), this system allows the user to display graphical floor plans and to select base station transceiver parameters, including location and orientation, interactively. The system then computes and highlights estimated coverage regions for each transceiver, enabling the user to assess the total coverage within the building. For single-floor operation, the user can choose between distance-dependent and partition- dependent path-loss models. Similar path-loss models are also available for the case of multiple floors. This paper describes the method used by the system to estimate coverage for both directional and omnidirectional antennas. The site modeling tool is intended to be simple to use by individuals who are not experts at wireless communication system design, and is expected to be very useful in the specification of indoor wireless systems.
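    The distance-dependent and partition-dependent path-loss models mentioned here are typically of the log-distance form PL(d) = PL(d0) + 10·n·log10(d/d0), optionally plus a fixed loss per intervening partition. A minimal sketch follows; all parameter values are illustrative and are not taken from the SMT system.

```python
import math

def path_loss_db(d, n=3.0, pl_d0=31.7, d0=1.0, partitions=0, partition_loss_db=3.0):
    """Log-distance path loss in dB, plus a fixed loss per wall/partition.
    All default values are illustrative, not SMT's."""
    return pl_d0 + 10.0 * n * math.log10(d / d0) + partitions * partition_loss_db

def coverage_radius(tx_power_dbm, rx_sensitivity_dbm, n=3.0, pl_d0=31.7,
                    d0=1.0, partitions=0, partition_loss_db=3.0):
    """Largest distance (same units as d0) at which the link budget closes."""
    max_loss = tx_power_dbm - rx_sensitivity_dbm
    exponent = (max_loss - pl_d0 - partitions * partition_loss_db) / (10.0 * n)
    return d0 * 10.0 ** exponent
```

    A coverage region is then the set of floor-plan points whose distance and wall count keep the loss within budget, which is what such a tool highlights graphically.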

  6. Shock wave interaction with L-shaped structures

    NASA Astrophysics Data System (ADS)

    Miller, Richard C.

    1993-12-01

    This study investigated the interaction of shock waves with L-shaped structures using the CTH hydrodynamics code developed by Sandia National Laboratories. Computer models of shock waves traveling through air were developed using techniques similar to shock tube experiments. Models of L-shaped buildings were used to determine overpressures achieved by the reflecting shock versus angle of incidence of the shock front. An L-shaped building model rotated 45 degrees to the planar shock front produced the highest reflected overpressure of 9.73 atmospheres in the corner joining the two wings, a value 9.5 times the incident overpressure of 1.02 atmospheres. The same L-shaped building was modeled with the two wings separated by 4.24 meters to simulate an open courtyard. This open area provided a relief path for the incident shock wave, creating a peak overpressure of only 4.86 atmospheres on the building's wall surfaces from the same 1.02 atmosphere overpressure incident shock wave.
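    For context, the corner value reported above can be compared with the textbook normal-incidence reflection relation for an ideal gas with γ = 1.4. This is a standard relation, not the CTH computation from the study.

```python
def reflected_overpressure(dp_i, p0=1.0):
    """Normally reflected overpressure for an ideal gas (gamma = 1.4),
    in the same units as the ambient pressure p0."""
    return 2.0 * dp_i * (7.0 * p0 + 4.0 * dp_i) / (7.0 * p0 + dp_i)

# A 1.02 atm incident overpressure reflects to about 2.8 atm off a flat wall,
# so the 9.73 atm value reported in the corner shows the extra amplification
# produced by the re-entrant L-shaped geometry.
dp_r = reflected_overpressure(1.02)
```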

  7. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.

    2012-12-01

    We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms, one that tessellates the surface so as to fuse data from a large region around Pasadena and the other, which uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that directly connects to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events.
Visualization models of instrumented buildings ranging between five and 22 stories tall have been constructed using Google SketchUp. Ambient vibration records are used to identify the first set of horizontal vibrational modal frequencies of the buildings. These frequencies are used to compute the response on every floor of the building, given either observed data or scenario ground motion input at the buildings' base.

  8. Computational models of airway branching morphogenesis.

    PubMed

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.
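    Of the three approaches named in this review, reaction-diffusion modeling is the simplest to sketch. Below is a minimal 1-D Gray-Scott integrator with illustrative pattern-forming parameters; it is a generic reaction-diffusion system, not any published lung model.

```python
# Minimal 1-D Gray-Scott reaction-diffusion system, explicit Euler stepping.
# Parameter values are illustrative, not lung-specific.
N, Du, Dv, F, k, dt = 100, 0.16, 0.08, 0.035, 0.060, 1.0
u, v = [1.0] * N, [0.0] * N
for i in range(45, 55):          # seed a perturbation mid-domain
    u[i], v[i] = 0.5, 0.25

def lap(a, i):                   # periodic 1-D Laplacian, unit grid spacing
    return a[(i - 1) % N] - 2.0 * a[i] + a[(i + 1) % N]

for _ in range(500):
    un, vn = u[:], v[:]
    for i in range(N):
        uvv = u[i] * v[i] * v[i]
        un[i] = u[i] + dt * (Du * lap(u, i) - uvv + F * (1.0 - u[i]))
        vn[i] = v[i] + dt * (Dv * lap(v, i) + uvv - (F + k) * v[i])
    u, v = un, vn
```

    In two or three dimensions, with geometry-dependent boundary conditions, the same mechanism can select branch points, which is the role reaction-diffusion models play in this literature.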

  9. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  10. Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing.

    PubMed

    Kriegeskorte, Nikolaus

    2015-11-24

    Recent advances in neural network modeling have enabled major strides in computer vision and other artificial intelligence applications. Human-level visual recognition abilities are coming within reach of artificial systems. Artificial neural networks are inspired by the brain, and their computations could be implemented in biological neurons. Convolutional feedforward networks, which now dominate computer vision, take further inspiration from the architecture of the primate visual hierarchy. However, the current models are designed with engineering goals, not to model brain computations. Nevertheless, initial studies comparing internal representations between these models and primate brains find surprisingly similar representational spaces. With human-level performance no longer out of reach, we are entering an exciting new era, in which we will be able to build biologically faithful feedforward and recurrent computational models of how biological brains perform high-level feats of intelligence, including vision.

  11. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
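    The workflow in this record, Monte Carlo perturbation of uncertain inputs followed by rank comparison of modelled and measured values, can be sketched with a toy exposure model. The model form, noise levels, and sites below are invented; the actual RF-EMF model is a far more elaborate geospatial computation.

```python
import math, random

random.seed(1)

# Toy exposure model (illustrative): field strength ~ sqrt(power) / distance.
def model(power_w, dist_m):
    return math.sqrt(power_w) / dist_m

sites = [(10.0, d) for d in (20, 40, 60, 80, 100, 150, 200, 300)]
truth = [model(p, d) for p, d in sites]   # stands in for measurements

# Monte Carlo: perturb the inputs to mimic errors in the input database.
def one_draw():
    return [model(p * random.lognormvariate(0.0, 0.2),
                  d * random.lognormvariate(0.0, 0.1)) for p, d in sites]

draws = [one_draw() for _ in range(500)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    def ranks(a):
        order = sorted(range(len(a)), key=a.__getitem__)
        r = [0.0] * len(a)
        for pos, idx in enumerate(order):
            r[idx] = float(pos)
        return r
    return pearson(ranks(x), ranks(y))

# How well is the exposure *ranking* preserved under input uncertainty?
rhos = [spearman(d, truth) for d in draws]
mean_rho = sum(rhos) / len(rhos)
```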

  12. Colour computer-generated holography for point clouds utilizing the Phong illumination model.

    PubMed

    Symeonidou, Athanasia; Blinder, David; Schelkens, Peter

    2018-04-16

    A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
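    The Phong model referred to above combines ambient, diffuse, and specular terms per light source. A minimal scalar-intensity sketch follows; the coefficient values are illustrative and are not taken from the cited method.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, to_light, to_viewer, ka=0.1, kd=0.6, ks=0.3, shininess=16):
    """Phong intensity for one white light: ambient + diffuse + specular."""
    n, l, v = normalize(normal), normalize(to_light), normalize(to_viewer)
    diff = max(dot(n, l), 0.0)
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))  # reflect l about n
    spec = max(dot(r, v), 0.0) ** shininess if diff > 0.0 else 0.0
    return ka + kd * diff + ks * spec
```

    In a colour CGH pipeline, a per-point intensity like this would modulate each point's contribution to the hologram in each colour channel.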

  13. Parametric Analysis of Energy Consumption in Army Buildings by the Building Loads Analysis and System Thermodynamics (BLAST) Computer Program.

    DTIC Science & Technology

    1980-08-01

    orientation, and HVAC systems have on three Army buildings in five different climatic regions. Optimization of Energy Usage in Military Facilities... The clinic’s environment is maintained by a multizone air-handling unit served by its own boiler and chiller. The building was modeled with 30... setpoints for the space temperature. This type of throttling range allows the heating system to control around a range of 67 to 69°F (19 to 20°C

  14. Advanced Technology for Portable Personal Visualization

    DTIC Science & Technology

    1991-03-01

    walking. The building model is created using AutoCAD. Realism is enhanced by calculating a radiosity solution for the lighting model. This has an added...lighting, color combinations and decor. Due to the computationally intensive nature of the radiosity solution, modeling changes cannot be made on-line

  15. Human Resource Management, Computers, and Organization Theory.

    ERIC Educational Resources Information Center

    Garson, G. David

    In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were applied to hypotheses arising from a recent model by Bozeman and Bretschneider, attempting to relate organization theory to management…

  16. Answer Set Programming and Other Computing Paradigms

    ERIC Educational Resources Information Center

    Meng, Yunsong

    2013-01-01

    Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…

  17. Mobile Building Energy Audit and Modeling Tools: Cooperative Research and Development Final Report, CRADA Number CRD-11-00441

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brackney, L.

    Broadly accessible, low cost, accurate, and easy-to-use energy auditing tools remain out of reach for managers of the aging U.S. building population (over 80% of U.S. commercial buildings are more than 10 years old*). concept3D and NREL's commercial buildings group will work to translate and extend NREL's existing spreadsheet-based energy auditing tool for a browser-friendly and mobile-computing platform. NREL will also work with concept3D to further develop a prototype geometry capture and materials inference tool operable on a smart phone/pad platform. These tools will be developed to interoperate with NREL's Building Component Library and OpenStudio energy modeling platforms, and will be marketed by concept3D to commercial developers, academic institutions and governmental agencies. concept3D is NREL's lead developer and subcontractor of the Building Component Library.

  18. Systems biology by the rules: hybrid intelligent systems for pathway modeling and discovery.

    PubMed

    Bosl, William J

    2007-02-15

    Expert knowledge in journal articles is an important source of data for reconstructing biological pathways and creating new hypotheses. An important need for medical research is to integrate this data with high throughput sources to build useful models that span several scales. Researchers traditionally use mental models of pathways to integrate information and develop new hypotheses. Unfortunately, the amount of information is often overwhelming and such mental models are inadequate for predicting the dynamic response of complex pathways. Hierarchical computational models that allow exploration of semi-quantitative dynamics are useful systems biology tools for theoreticians, experimentalists and clinicians and may provide a means for cross-communication. A novel approach for biological pathway modeling based on hybrid intelligent systems or soft computing technologies is presented here. Intelligent hybrid systems, which refer to several related computing methods such as fuzzy logic, neural nets, genetic algorithms, and statistical analysis, have become ubiquitous in engineering applications for complex control system modeling and design. Biological pathways may be considered to be complex control systems, which medicine tries to manipulate to achieve desired results. Thus, hybrid intelligent systems may provide a useful tool for modeling biological system dynamics and computational exploration of new drug targets. A new modeling approach based on these methods is presented in the context of hedgehog regulation of the cell cycle in granule cells. Code and input files can be found at the Bionet website: www.chip.ord/~wbosl/Software/Bionet. This paper presents the algorithmic methods needed for modeling complicated biochemical dynamics using rule-based models to represent expert knowledge in the context of cell cycle regulation and tumor growth. 
A notable feature of this modeling approach is that it allows biologists to build complex models from their knowledge base without the need to translate that knowledge into mathematical form. Dynamics on several levels, from molecular pathways to tissue growth, are seamlessly integrated. A number of common network motifs are examined and used to build a model of hedgehog regulation of the cell cycle in cerebellar neurons, which is believed to play a key role in the etiology of medulloblastoma, a devastating childhood brain cancer.

  19. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by freeing users from redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also allows users to receive indirect expert assistance through prefabricated compute environments, potentially reducing study “ramp-up” times.

  20. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Treesearch

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  1. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Foudriat, E. C.

    1991-01-01

    A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. While the probability of deadlock is small, its effects on performance can be significant.

  2. Sarment: Python modules for HMM analysis and partitioning of sequences.

    PubMed

    Guéguen, Laurent

    2005-08-15

    Sarment is a package of Python modules for easy building and manipulation of sequence segmentations. It provides efficient implementations of the usual algorithms for hidden Markov model computation, as well as for maximal predictive partitioning. Owing to its very large variety of criteria for computing segmentations, Sarment can handle many kinds of models. Because of its object-oriented design, the results of the segmentation are very easy to manipulate.
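    The record does not reproduce Sarment's API, but the kind of HMM computation such a package performs can be illustrated with a generic two-state Viterbi segmentation. The state names, symbols, and probabilities below are invented for the example.

```python
import math

states = ("AT-rich", "GC-rich")
start = {"AT-rich": 0.5, "GC-rich": 0.5}
trans = {"AT-rich": {"AT-rich": 0.9, "GC-rich": 0.1},
         "GC-rich": {"AT-rich": 0.1, "GC-rich": 0.9}}
emit = {"AT-rich": {"A": 0.35, "T": 0.35, "C": 0.15, "G": 0.15},
        "GC-rich": {"A": 0.15, "T": 0.15, "C": 0.35, "G": 0.35}}

def viterbi(seq):
    """Most probable state path (the segmentation), computed in log space."""
    V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for sym in seq[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = V[-1][prev] + math.log(trans[prev][s]) + math.log(emit[s][sym])
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

path = viterbi("ATATATATGCGCGCGCGC")
```

    The sticky self-transitions (0.9) are what turn a per-symbol classification into a segmentation with few breakpoints.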

  3. Distribution Locational Real-Time Pricing Based Smart Building Control and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, Jun; Dai, Xiaoxiao; Zhang, Yingchen

    This paper proposes a real-virtual parallel computing scheme for smart building operations aiming at augmenting overall social welfare. The University of Denver's campus power grid and Ritchie fitness center are used to demonstrate the proposed approach. An artificial virtual system is built in parallel to the real physical system to evaluate the overall social cost of the building operation based on a social-science-based working productivity model, a numerical-experiment-based building energy consumption model and a power-system-based real-time pricing mechanism. Through interactive feedback exchanged between the real and virtual systems, enlarged social welfare, including monetary cost reduction and energy savings, as well as working productivity improvements, can be achieved.

  4. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differentiation equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
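    A drastically simplified relative of such rule-based excitable-media models is the Greenberg-Hastings cellular automaton (resting, excited, refractory). The sketch below only illustrates the modeling style; it bears no relation to the authors' whole-heart code.

```python
# Greenberg-Hastings excitable-medium CA: 0 = resting, 1 = excited,
# 2 = refractory. An excitation wave spreads outward from a stimulus and
# cannot immediately re-enter refractory tissue.
N = 20
grid = [[0] * N for _ in range(N)]
grid[10][10] = 1                       # point stimulus

def step(g):
    new = [[0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            if g[i][j] == 1:
                new[i][j] = 2          # excited -> refractory
            elif g[i][j] == 2:
                new[i][j] = 0          # refractory -> resting
            else:                      # resting -> excited next to excitation
                nbrs = ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if any(0 <= a < N and 0 <= b < N and g[a][b] == 1
                       for a, b in nbrs):
                    new[i][j] = 1
    return new

for _ in range(5):
    grid = step(grid)
excited = sum(row.count(1) for row in grid)   # an expanding wavefront ring
```

    The refractory state is what makes the wave move outward only; quantitative whole-heart models replace these three states with calibrated ionic dynamics.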

  5. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
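    The surrogate idea described here, fitting a cheap statistical approximation to a handful of expensive simulator runs and then interrogating the approximation, can be sketched with ordinary least squares standing in for Bayesian adaptive splines. Everything below (the "simulator", design points, and quadratic form) is illustrative.

```python
import math

# Stand-in for an expensive simulator run (illustrative only).
def expensive_model(x):
    return math.exp(-(x - 2.0) ** 2) + 0.1 * x

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]   # design points
ys = [expensive_model(x) for x in xs]                # "expensive" evaluations

def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 linear system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * p for a, p in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ c0 + c1*x + c2*x^2 via the normal equations."""
    S = lambda k: sum(x ** k for x in xs)
    Sy = lambda k: sum(y * x ** k for x, y in zip(xs, ys))
    A = [[S(0), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    return solve3(A, [Sy(0), Sy(1), Sy(2)])

c0, c1, c2 = fit_quadratic(xs, ys)
surrogate = lambda x: c0 + c1 * x + c2 * x * x       # cheap to evaluate
```

    Once fitted, the surrogate can be evaluated millions of times for uncertainty and sensitivity analysis at negligible cost; adaptive splines improve on the fixed quadratic by choosing basis functions from the data.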

  6. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell’s Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907

  7. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell's Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+.

  8. Statistical molecular design of balanced compound libraries for QSAR modeling.

    PubMed

    Linusson, A; Elofsson, M; Andersson, I E; Dahlgren, M K

    2010-01-01

    A fundamental step in preclinical drug development is the computation of quantitative structure-activity relationship (QSAR) models, i.e. models that link chemical features of compounds with activities towards a target macromolecule associated with the initiation or progression of a disease. QSAR models are computed by combining information on the physicochemical and structural features of a library of congeneric compounds, typically assembled from two or more building blocks, and biological data from one or more in vitro assays. Since the models provide information on features affecting the compounds' biological activity they can be used as guides for further optimization. However, in order for a QSAR model to be relevant to the targeted disease, and drug development in general, the compound library used must contain molecules with balanced variation of the features spanning the chemical space believed to be important for interaction with the biological target. In addition, the assays used must be robust and deliver high quality data that are directly related to the function of the biological target and the associated disease state. In this review, we discuss and exemplify the concept of statistical molecular design (SMD) in the selection of building blocks and final synthetic targets (i.e. compounds to synthesize) to generate information-rich, balanced libraries for biological testing and computation of QSAR models.

  9. Applied Distributed Model Predictive Control for Energy Efficient Buildings and Ramp Metering

    NASA Astrophysics Data System (ADS)

    Koehler, Sarah Muraoka

    Industrial large-scale control problems present an interesting algorithmic design challenge. A number of controllers must cooperate in real-time on a network of embedded hardware with limited computing power in order to maximize system efficiency while respecting constraints and despite communication delays. Model predictive control (MPC) can automatically synthesize a centralized controller which optimizes an objective function subject to a system model, constraints, and predictions of disturbance. Unfortunately, the computations required by model predictive controllers for large-scale systems often limit its industrial implementation only to medium-scale slow processes. Distributed model predictive control (DMPC) enters the picture as a way to decentralize a large-scale model predictive control problem. The main idea of DMPC is to split the computations required by the MPC problem amongst distributed processors that can compute in parallel and communicate iteratively to find a solution. Some popularly proposed solutions are distributed optimization algorithms such as dual decomposition and the alternating direction method of multipliers (ADMM). However, these algorithms ignore two practical challenges: substantial communication delays present in control systems and also problem non-convexity. This thesis presents two novel and practically effective DMPC algorithms. The first DMPC algorithm is based on a primal-dual active-set method which achieves fast convergence, making it suitable for large-scale control applications which have a large communication delay across its communication network. In particular, this algorithm is suited for MPC problems with a quadratic cost, linear dynamics, forecasted demand, and box constraints. We measure the performance of this algorithm and show that it significantly outperforms both dual decomposition and ADMM in the presence of communication delay. 
The second DMPC algorithm is based on an inexact interior point method which is suited for nonlinear optimization problems. The parallel computation of the algorithm exploits iterative linear algebra methods for the main linear algebra computations in the algorithm. We show that the splitting of the algorithm is flexible and can thus be applied to various distributed platform configurations. The two proposed algorithms are applied to two main energy and transportation control problems. The first application is energy efficient building control. Buildings represent 40% of energy consumption in the United States. Thus, it is significant to improve the energy efficiency of buildings. The goal is to minimize energy consumption subject to the physics of the building (e.g. heat transfer laws), the constraints of the actuators as well as the desired operating constraints (thermal comfort of the occupants), and heat load on the system. In this thesis, we describe the control systems of forced air building systems in practice. We discuss the "Trim and Respond" algorithm which is a distributed control algorithm that is used in practice, and show that it performs similarly to a one-step explicit DMPC algorithm. Then, we apply the novel distributed primal-dual active-set method and provide extensive numerical results for the building MPC problem. The second main application is the control of ramp metering signals to optimize traffic flow through a freeway system. This application is particularly important since urban congestion has more than doubled in the past few decades. The ramp metering problem is to maximize freeway throughput subject to freeway dynamics (derived from mass conservation), actuation constraints, freeway capacity constraints, and predicted traffic demand. In this thesis, we develop a hybrid model predictive controller for ramp metering that is guaranteed to be persistently feasible and stable. 
This contrasts with previous work on MPC for ramp metering, where such guarantees are absent. We then apply a smoothing method to the hybrid model predictive controller and solve the resulting nonlinear non-convex ramp metering problem with the inexact interior point method.

  10. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    PubMed

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.

  11. Enhanced air dispersion modelling at a typical Chinese nuclear power plant site: Coupling RIMPUFF with two advanced diagnostic wind models.

    PubMed

    Liu, Yun; Li, Hong; Sun, Sida; Fang, Sheng

    2017-09-01

    An enhanced air dispersion modelling scheme is proposed to cope with the building layout and complex terrain of a typical Chinese nuclear power plant (NPP) site. In this modelling, the California Meteorological Model (CALMET) and the Stationary Wind Fit and Turbulence (SWIFT) are coupled with the Risø Mesoscale PUFF model (RIMPUFF) for refined wind field calculation. The near-field diffusion coefficient correction scheme of the Atmospheric Relative Concentrations in the Building Wakes Computer Code (ARCON96) is adopted to characterize dispersion in building arrays. The proposed method is evaluated by a wind tunnel experiment that replicates the typical Chinese NPP site. For both wind speed/direction and air concentration, the enhanced modelling predictions agree well with the observations. The fraction of the predictions within a factor of 2 and 5 of observations exceeds 55% and 82% respectively in the building area and the complex terrain area. This demonstrates the feasibility of the new enhanced modelling for typical Chinese NPP sites. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Geospatial Modelling Approach for 3D Urban Densification Developments

    NASA Astrophysics Data System (ADS)

    Koziatek, O.; Dragićević, S.; Li, S.

    2016-06-01

    With growing populations, economic pressures, and the need for sustainable practices, many urban regions are rapidly densifying developments in the vertical built dimension with mid- and high-rise buildings. The location of these buildings can be projected based on key factors that are attractive to urban planners, developers, and potential buyers. Current research in this area includes various modelling approaches, such as cellular automata and agent-based modelling, but the results are mostly linked to raster grids as the smallest spatial units that operate in two spatial dimensions. Therefore, the objective of this research is to develop a geospatial model that operates on irregular spatial tessellations to model mid- and high-rise buildings in three spatial dimensions (3D). The proposed model is based on the integration of GIS, fuzzy multi-criteria evaluation (MCE), and 3D GIS-based procedural modelling. Part of the City of Surrey, within the Metro Vancouver Region, Canada, has been used to present the simulations of the generated 3D building objects. The proposed 3D modelling approach was developed using ESRI's CityEngine software and the Computer Generated Architecture (CGA) language.

  13. Evaluation of an urban vegetative canopy scheme and impact on plume dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Matthew A; Williams, Michael D; Zajic, Dragan

    2009-01-01

    The Quick Urban and Industrial Complex (QUIC) atmospheric dispersion modeling system attempts to fill an important gap between the fast but non-building-aware Gaussian plume models and the building-aware but slow computational fluid dynamics (CFD) models. While Gaussian models can give answers quickly to emergency responders, they are unlikely to account adequately for the effects of building-induced complex flow patterns on the near-source dispersion of contaminants. QUIC uses a diagnostic mass-consistent empirical wind model called QUIC-URB that is based on the methodology of Rockle (1990) (see also Kaplan and Dinar 1996). In this approach, the recirculation zones that form around and between buildings are inserted into the flow using empirical parameterizations, and the wind field is then forced to be mass consistent. Although not as accurate as CFD codes, this approach is several orders of magnitude faster and accounts for the bulk effects of buildings.

  14. Computer-aided design of biological circuits using TinkerCell.

    PubMed

    Chandran, Deepak; Bergmann, Frank T; Sauro, Herbert M

    2010-01-01

    Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze, and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. © 2010 Landes Bioscience

  15. Overhead Crane Computer Model

    NASA Astrophysics Data System (ADS)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The modeled overhead crane consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. Using the differential equations of motion for these mechanisms, derived from the Lagrange equation of the second kind, an overhead crane computer model can be built. The model was implemented in Matlab. Transients of coordinate, linear speed and motor torque were simulated for the trolley and crane mechanism systems. In addition, transients of payload sway about the vertical axis were obtained. The paper also presents a trajectory of the trolley mechanism operating simultaneously with the crane mechanism, as well as a two-axis payload trajectory. The resulting computer model of an overhead crane is an effective means of studying positioning control and anti-sway control systems.

  16. Simulation and evaluation of latent heat thermal energy storage

    NASA Technical Reports Server (NTRS)

    Sigmon, T. W.

    1980-01-01

    The relative value of thermal energy storage (TES) for heat pump applications (heating and cooling) was derived as a function of storage temperature, mode of storage (hot-side or cold-side), geographic location, and utility time-of-use rate structure. Computer models used to simulate the performance of a number of TES/heat pump configurations are described. The models are based on existing performance data for heat pump components, available building thermal load computational procedures, and generalized TES subsystem designs. Life-cycle costs computed for each site, configuration, and rate structure are discussed.

  17. Computer aided drug design

    NASA Astrophysics Data System (ADS)

    Jain, A.

    2017-08-01

    Computer-based methods can help in the discovery of lead compounds and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, saving both time and cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing and storing models of complex molecular structures, which can help interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, together with statistical analysis, is a powerful tool for medicinal chemists to design effective therapeutic drugs with minimal side effects.

  18. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  19. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.

  20. ISAMBARD: an open-source computational environment for biomolecular analysis, modelling and design.

    PubMed

    Wood, Christopher W; Heal, Jack W; Thomson, Andrew R; Bartlett, Gail J; Ibarra, Amaurys Á; Brady, R Leo; Sessions, Richard B; Woolfson, Derek N

    2017-10-01

    The rational design of biomolecules is becoming a reality. However, further computational tools are needed to facilitate and accelerate this, and to make it accessible to more users. Here we introduce ISAMBARD, a tool for structural analysis, model building and rational design of biomolecules. ISAMBARD is open-source, modular, computationally scalable and intuitive to use. These features allow non-experts to explore biomolecular design in silico. ISAMBARD addresses a standing issue in protein design, namely, how to introduce backbone variability in a controlled manner. This is achieved through the generalization of tools for parametric modelling, describing the overall shape of proteins geometrically, and without input from experimentally determined structures. This will allow backbone conformations for entire folds and assemblies not observed in nature to be generated de novo, that is, to access the 'dark matter of protein-fold space'. We anticipate that ISAMBARD will find broad applications in biomolecular design, biotechnology and synthetic biology. A current stable build can be downloaded from the python package index (https://pypi.python.org/pypi/isambard/) with development builds available on GitHub (https://github.com/woolfson-group/) along with documentation, tutorial material and all the scripts used to generate the data described in this paper. d.n.woolfson@bristol.ac.uk or chris.wood@bristol.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  1. Modelling of DNA-protein recognition

    NASA Technical Reports Server (NTRS)

    Rein, R.; Garduno, R.; Colombano, S.; Nir, S.; Haydock, K.; Macelroy, R. D.

    1980-01-01

    Computer model-building procedures using stereochemical principles together with theoretical energy calculations appear to be, at this stage, the most promising route toward the elucidation of DNA-protein binding schemes and recognition principles. A review of models and bonding principles is conducted and approaches to modeling are considered, taking into account possible di-hydrogen-bonding schemes between a peptide and a base (or a base pair) of a double-stranded nucleic acid in the major groove, aspects of computer graphic modeling, and a search for isogeometric helices. The energetics of recognition complexes is discussed and several models for peptide DNA recognition are presented.

  2. Building energy modeling for green architecture and intelligent dashboard applications

    NASA Astrophysics Data System (ADS)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied the passive technique called the roof solar chimney for reducing the cooling load in homes architecturally. Three models of the chimney were created: a zonal building energy model, computational fluid dynamics model, and numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it for building simulation reliably. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high and low efficiency constructions, and three ventilation strategies. The results showed that in four US climates, the roof solar chimney results in significant cooling load energy savings of up to 90%. After developing this new method for the small scale representation of a passive architecture technique in BEM, the study expanded the scope to address a fundamental issue in modeling - the implementation of the uncertainty from and improvement of occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration to the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. 
A stochastic study examined the impact of the representation of unpredictable occupancy patterns on model results. Combined, these studies inform modelers and researchers on frameworks for simulating holistically designed architecture and improving the interaction between models and building occupants, in residential and commercial settings.

  3. U.S. Geological Survey Groundwater Modeling Software: Making Sense of a Complex Natural Resource

    USGS Publications Warehouse

    Provost, Alden M.; Reilly, Thomas E.; Harbaugh, Arlen W.; Pollock, David W.

    2009-01-01

    Computer models of groundwater systems simulate the flow of groundwater, including water levels, and the transport of chemical constituents and thermal energy. Groundwater models afford hydrologists a framework on which to organize their knowledge and understanding of groundwater systems, and they provide insights water-resources managers need to plan effectively for future water demands. Building on decades of experience, the U.S. Geological Survey (USGS) continues to lead in the development and application of computer software that allows groundwater models to address scientific and management questions of increasing complexity.

  4. Introduction to Financial Projection Models. Business Management Instructional Software.

    ERIC Educational Resources Information Center

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  5. Meet EPA Scientist Valerie Zartarian, Ph.D.

    EPA Pesticide Factsheets

    Senior exposure scientist and research environmental engineer Valerie Zartarian, Ph.D. helps build computer models and other tools that advance our understanding of how people interact with chemicals.

  6. 2D hybrid analysis: Approach for building three-dimensional atomic model by electron microscopy image matching.

    PubMed

    Matsumoto, Atsushi; Miyazaki, Naoyuki; Takagi, Junichi; Iwasaki, Kenji

    2017-03-23

    In this study, we develop an approach termed "2D hybrid analysis" for building atomic models by image matching from electron microscopy (EM) images of biological molecules. The key advantage is that it is applicable to flexible molecules, which are difficult to analyze by 3D EM approaches. In the proposed approach, a large number of atomic models with different conformations are first built by computer simulation; simulated EM images are then generated from each atomic model and compared with the experimental EM image. Two kinds of models are used to simulate the EM images: the negative stain model and the simple projection model. Although the former is more realistic, the latter is adopted to perform faster computations. The use of the negative stain model enables decomposition of the averaged EM images into multiple projection images, each of which originates from a different conformation or orientation. We apply this approach to EM images of integrin to obtain the distribution of conformations, from which the pathway of the conformational change of the protein is deduced.

  7. Participatory modeling of recreation and tourism

    Treesearch

    Lisa C. Chase; Roelof M.J. Boumans; Stephanie Morse

    2007-01-01

    Communities involved in recreation and tourism planning need to understand the broad range of benefits and challenges--economic, social, and ecological--in order to make informed decisions. Participatory computer modeling is a methodology that involves a community in the process of collectively building a model about a particular situation that affects participants...

  8. Voluminator 2.0 - Speeding up the Approximation of the Volume of Defective 3D Building Models

    NASA Astrophysics Data System (ADS)

    Sindram, M.; Machl, T.; Steuer, H.; Pültz, M.; Kolbe, T. H.

    2016-06-01

    Semantic 3D city models are increasingly used as a data source in the planning and analysis processes of cities. They represent a virtual copy of reality and serve as a common information base for examining urban questions. A significant advantage of virtual city models is that important indicators such as the volume of buildings, topological relationships between objects, and other geometric as well as thematic information can be derived. Knowledge of the exact building volume is an essential basis for estimating building energy demand. Conventional algorithms and tools can only determine the volume of buildings that are free of topological and geometrical errors. In reality, however, city models very often contain errors such as missing surfaces, duplicated faces and misclosures. To overcome these errors, Steuer et al. (2015) presented a robust method for approximating the volume of building models. For this purpose, a bounding box of the building is divided into a regular grid of voxels and it is determined which voxels lie inside the building. The regular arrangement of the voxels leads to a high number of topological tests and prevents the application of this method at very high resolutions. In this paper we present an extension of the algorithm using an octree approach, limiting the subdivision of space to regions around the surfaces of the building models and to regions where, in the case of defective models, the topological tests are inconclusive. We show that the computation time can be significantly reduced while preserving robustness against geometrical and topological errors.

  9. Expanding HPC and Research Computing--The Sustainable Way

    ERIC Educational Resources Information Center

    Grush, Mary

    2009-01-01

    Increased demands for research and high-performance computing (HPC)--along with growing expectations for cost and environmental savings--are putting new strains on the campus data center. More and more, CIOs like the University of Notre Dame's (Indiana) Gordon Wishon are seeking creative ways to build more sustainable models for data center and…

  10. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

    ERIC Educational Resources Information Center

    McGowan, Ian S.

    2016-01-01

    Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

  11. Computer Assisted Vocational Math. Written for TRS-80, Model I, Level II, 16K.

    ERIC Educational Resources Information Center

    Daly, Judith; And Others

    This computer-assisted curriculum is intended to be used to enhance a vocational mathematics/applied mathematics course. A total of 32 packets were produced to increase the basic mathematics skills of students in the following vocational programs: automotive trades, beauty culture, building trades, climate control, electrical trades,…

  12. Toward a New Model of Usability: Guidelines for Selecting Reading Fluency Apps Suitable for Instruction of Struggling Readers

    ERIC Educational Resources Information Center

    Rinehart, Steven D.; Ahern, Terence C.

    2016-01-01

    Computer applications related to reading instruction have become commonplace in schools and link with established components of the reading process: emergent skills, decoding, comprehension, vocabulary, and fluency. This article focuses on computer technology in conjunction with durable methods for building oral reading fluency when readers…

  13. Building Professionalism and Employability Skills: Embedding Employer Engagement within First-Year Computing Modules

    ERIC Educational Resources Information Center

    Hanna, Philip; Allen, Angela; Kane, Russell; Anderson, Neil; McGowan, Aidan; Collins, Matthew; Hutchison, Malcolm

    2015-01-01

    This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management skills, can be contextualised in a manner…

  14. Using experimental data to reduce the single-building sigma of fragility curves: case study of the BRD tower in Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Perrault, Matthieu; Gueguen, Philippe; Aldea, Alexandru; Demetriu, Sorin

    2013-12-01

    The lack of knowledge involved in modelling existing buildings leads to significant variability in fragility curves for single or grouped buildings. This study investigates the uncertainties of fragility curves, with special consideration of the single-building sigma. Experimental data and simplified models are applied to the BRD tower in Bucharest, Romania, an RC building with permanent instrumentation. A three-step methodology is applied: (1) adjustment of a linear MDOF model to experimental modal analysis, using a Timoshenko beam model and Anderson's criteria, (2) computation of the structure's response to a large set of accelerograms simulated with the SIMQKE software, considering twelve ground motion parameters as intensity measures (IM), and (3) construction of the fragility curves by comparing numerical interstory drift with the threshold criteria provided by the Hazus methodology for the slight damage state. By introducing experimental data into the model, the uncertainty is reduced to 0.02 when Sd(f1) is used as the IM, and the uncertainty related to the model is assessed at 0.03. These values must be compared with the total uncertainty value of around 0.7 provided by the Hazus methodology.
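    The lognormal fragility-curve form underlying this comparison can be sketched in a few lines. This is a generic sketch, not the authors' code; the median and beta values below are illustrative, not the study's fitted values.

```python
from math import erf, log, sqrt

def fragility(im, median, beta):
    """Lognormal fragility curve: probability of reaching or exceeding a
    damage state given intensity measure `im`, with median capacity
    `median` and lognormal standard deviation (sigma) `beta`."""
    return 0.5 * (1.0 + erf(log(im / median) / (beta * sqrt(2.0))))

# At the median intensity the exceedance probability is 0.5 for any beta;
# a smaller beta (as obtained from instrumented data) steepens the curve,
# so above the median the refined curve predicts higher probabilities.
p_generic = fragility(1.5, 1.0, 0.7)    # Hazus-like total uncertainty
p_refined = fragility(1.5, 1.0, 0.07)   # single-building, instrumented
```

    The practical effect of shrinking sigma from ~0.7 to a few hundredths is exactly this steepening: the damage-state transition becomes nearly deterministic around the median.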

  15. Relative significance of heat transfer processes to quantify tradeoffs between complexity and accuracy of energy simulations with a building energy use patterns classification

    NASA Astrophysics Data System (ADS)

    Heidarinejad, Mohammad

    This dissertation develops rapid and accurate building energy simulations based on a building classification that identifies and focuses modeling efforts on the most significant heat transfer processes. The classification identifies energy use patterns and their contributing parameters for a portfolio of buildings. The dissertation hypothesis is: "Building classification can provide the minimal required inputs for rapid and accurate energy simulations for a large number of buildings". The critical literature review indicated a lack of studies that (1) take a synoptic point of view rather than a case-study approach, (2) analyze the influence of different granularities of energy use, (3) identify key variables based on the heat transfer processes, and (4) automate the procedure to quantify model complexity together with accuracy. Therefore, three objectives are designed to test the hypothesis: (1) develop different classes of buildings based on their energy use patterns, (2) develop building energy simulation approaches for the identified classes to quantify tradeoffs between model accuracy and complexity, and (3) demonstrate the simulation approaches on case studies. Penn State's and Harvard's campus buildings, as well as high performance LEED NC office buildings, are the test beds used to develop the building classes. The campus buildings include detailed chilled water, electricity, and steam data, enabling buildings to be classified as externally-load, internally-load, or mixed-load dominated. The energy use of internally-load dominated buildings is primarily a function of the internal loads and their schedules. Externally-load dominated buildings tend to have an energy use pattern that is a function of building construction materials and outdoor weather conditions. 
However, most commercial medium-sized office buildings have a mixed-load pattern, meaning the HVAC system and operation schedule dictate the indoor condition regardless of the relative contribution of internal and external loads. To deploy the methodology to another portfolio of buildings, simulated LEED NC office buildings are selected. The advantage of this approach is that it isolates energy performance due to inherent building characteristics and location, rather than operational and maintenance factors that can contribute to significant variation in building energy use. A framework for detailed building energy databases with annual energy end-uses is developed to select variables and omit outliers. The results show that the high performance office buildings are internally-load dominated, with three clusters of low-intensity, medium-intensity, and high-intensity energy use patterns among the reviewed office buildings. Low-intensity cluster buildings benefit from small building area, while the medium- and high-intensity clusters have a similar range of floor areas but different energy use intensities. Half of the energy use in the low-intensity buildings is associated with internal loads, such as lighting and plug loads, indicating that there are opportunities to save energy by using lighting or plug load management systems. A comparison between the frameworks developed for the campus buildings and the LEED NC office buildings indicates that the two frameworks are complementary. The differing availability of information has yielded two different procedures, suggesting that future studies of building portfolios, such as city benchmarking and disclosure ordinances, should collect and disclose the minimal required inputs suggested by this study, with monthly energy consumption as the minimum granularity. 
This dissertation developed automated methods using the OpenStudio API (Application Programming Interface) to create energy models based on the building class. ASHRAE Guideline 14 defines well-accepted criteria to measure the accuracy of energy simulations; however, there is no well-accepted methodology to quantify model complexity independently of the energy modeler's judgment. This study developed a novel method using two weighting factors, based on (1) computational time and (2) ease of on-site data collection, to measure the complexity of energy models. This enables measurement of both model complexity and accuracy, as well as assessment of the inherent tradeoffs between them. The results suggest that for most internal load contributors, such as operation schedules, on-site data collection adds more complexity to the model than computational time does. Overall, this study provides specific data on tradeoffs between accuracy and model complexity that points to critical inputs for different building classes, rather than an increase in the volume and detail of model inputs as current research and consulting practice indicates. (Abstract shortened by UMI.)
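    The two-factor complexity score described above might be sketched as follows. This is a hypothetical illustration only: the input list and both sets of weights are invented, not the dissertation's calibrated values.

```python
# Hypothetical sketch of a two-factor model-complexity score: each model
# input carries a computational-time weight and an on-site data-collection
# weight (both normalized to 0..1 here); complexity is their sum.

def model_complexity(inputs):
    """Total complexity score for a set of energy-model inputs."""
    return sum(w_time + w_collect for _, w_time, w_collect in inputs)

inputs = [
    # (input name, computational-time weight, data-collection weight)
    ("geometry",            0.6, 0.2),
    ("operation schedules", 0.1, 0.8),  # cheap to simulate, costly to collect
    ("plug loads",          0.1, 0.7),
]
score = model_complexity(inputs)
```

    The schedule and plug-load rows illustrate the abstract's finding: for internal load contributors, the data-collection term dominates the computational-time term.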

  16. Neuronize: a tool for building realistic neuronal cell morphologies

    PubMed Central

    Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis; DeFelipe, Javier; Benavides-Piccione, Ruth

    2013-01-01

    This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments. PMID:23761740

  17. Wind tunnel measurements of three-dimensional wakes of buildings. [for aircraft safety applications

    NASA Technical Reports Server (NTRS)

    Logan, E., Jr.; Lin, S. H.

    1982-01-01

    Measurements relevant to the effect of buildings on the low level atmospheric boundary layer are presented. A wind tunnel experiment was undertaken to determine the nature of the flow downstream from a gap between two transversely aligned, equal sized models of rectangular cross section. These building models were immersed in an equilibrium turbulent boundary layer which was developed on a smooth floor in a zero longitudinal pressure gradient. Measurements with an inclined (45 degree) hot-wire were made at key positions downstream of models arranged with a large, small, and no gap between them. Hot-wire theory is presented which enables computation of the three mean velocity components, U, V and W, as well as Reynolds stresses. These measurements permit understanding of the character of the wake downstream of laterally spaced buildings. Surface streamline patterns obtained by the oil film method were used to delineate the separation region to the rear of the buildings for a variety of spacings.

  18. Neuronize: a tool for building realistic neuronal cell morphologies.

    PubMed

    Brito, Juan P; Mata, Susana; Bayona, Sofia; Pastor, Luis; Defelipe, Javier; Benavides-Piccione, Ruth

    2013-01-01

    This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments.

  19. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    PubMed

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis-computational, algorithmic, and implementation-have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  20. Progress in building a cognitive vision system

    NASA Astrophysics Data System (ADS)

    Benjamin, D. Paul; Lyons, Damian; Yue, Hong

    2016-05-01

    We are building a cognitive vision system for mobile robots that works in a manner similar to the human vision system, using saccadic, vergence and pursuit movements to extract information from visual input. At each fixation, the system builds a 3D model of a small region, combining information about distance, shape, texture and motion to create a local dynamic spatial model. These local 3D models are composed to create an overall 3D model of the robot and its environment. This approach turns the computer vision problem into a search problem whose goal is the acquisition of sufficient spatial understanding for the robot to succeed at its tasks. The research hypothesis of this work is that the movements of the robot's cameras are only those that are necessary to build a sufficiently accurate world model for the robot's current goals. For example, if the goal is to navigate through a room, the model needs to contain any obstacles that would be encountered, giving their approximate positions and sizes. Other information does not need to be rendered into the virtual world, so this approach trades model accuracy for speed.

  1. Computational Design of Self-Assembling Protein Nanomaterials with Atomic Level Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Neil P.; Sheffler, William; Sawaya, Michael R.

    2015-09-17

    We describe a general computational method for designing proteins that self-assemble to a desired symmetric architecture. Protein building blocks are docked together symmetrically to identify complementary packing arrangements, and low-energy protein-protein interfaces are then designed between the building blocks in order to drive self-assembly. We used trimeric protein building blocks to design a 24-subunit, 13-nm diameter complex with octahedral symmetry and a 12-subunit, 11-nm diameter complex with tetrahedral symmetry. The designed proteins assembled to the desired oligomeric states in solution, and the crystal structures of the complexes revealed that the resulting materials closely match the design models. The method can be used to design a wide variety of self-assembling protein nanomaterials.

  2. Building adaptive connectionist-based controllers: review of experiments in human-robot interaction, collective robotics, and computational neuroscience

    NASA Astrophysics Data System (ADS)

    Billard, Aude

    2000-10-01

    This paper summarizes a number of experiments in biologically inspired robotics. The feature common to all experiments is the use of artificial neural networks as the building blocks of the controllers. The experiments speak in favor of using a connectionist approach for designing adaptive and flexible robot controllers, and for modeling neurological processes. I present: 1) DRAMA, a novel connectionist architecture with general properties for learning time series and extracting spatio-temporal regularities in multi-modal and highly noisy data; 2) Robota, a doll-shaped robot which imitates and learns a proto-language; 3) an experiment in collective robotics, in which a group of 4 to 15 Khepera robots dynamically learns the topography of an environment whose features change frequently; 4) an abstract computational model of the primate ability to learn by imitation; and 5) a model for the control of locomotor gaits in a quadruped legged robot.

  3. Restoring Fort Frontenac in 3D: Effective Usage of 3D Technology for Heritage Visualization

    NASA Astrophysics Data System (ADS)

    Yabe, M.; Goins, E.; Jackson, C.; Halbstein, D.; Foster, S.; Bazely, S.

    2015-02-01

    This paper is composed of three elements: 3D modeling, web design, and heritage visualization. The aim is to use computer graphics design to inform and create interest in historical visualization by rebuilding Fort Frontenac using 3D modeling and interactive design. The final model will be integrated into an interactive website where visitors can learn more about the fort's historic importance. Using computer graphics can save time and money in historical visualization: visitors do not have to travel to the actual archaeological buildings, but can simply use the Web in their own homes to explore this information virtually. Meticulously following historical records to create a sophisticated restoration of archaeological buildings will draw viewers into visualizations such as the historical world of Fort Frontenac, allowing them to effectively understand the fort's social system, habits, and historical events.

  4. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
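    A toy example of the discrete agent-based modeling approach that frameworks like Biocellion accelerate (this sketch is not Biocellion's API): two cell types on a one-dimensional lattice sort themselves by rejecting moves that increase the number of unlike neighbours, a minimal differential-adhesion rule.

```python
import random

# Minimal agent-based cell-sorting toy: two cell types on a 1-D ring.
# A randomly chosen adjacent pair is swapped; swaps that increase the
# number of unlike-neighbour contacts ("mixing energy") are rejected.

def mismatches(grid):
    """Number of unlike adjacent pairs on the ring."""
    return sum(grid[i] != grid[(i + 1) % len(grid)] for i in range(len(grid)))

def sort_cells(grid, steps, seed=0):
    rng = random.Random(seed)
    grid = list(grid)
    for _ in range(steps):
        i = rng.randrange(len(grid))
        j = (i + 1) % len(grid)
        before = mismatches(grid)
        grid[i], grid[j] = grid[j], grid[i]
        if mismatches(grid) > before:   # reject swaps that increase mixing
            grid[i], grid[j] = grid[j], grid[i]
    return grid

mixed = [0, 1] * 10                     # fully interleaved initial state
sorted_grid = sort_cells(mixed, steps=2000)
```

    Real simulations of this kind couple thousands of such agents to diffusing chemical fields on high-resolution grids, which is where the parallel-computing cost the abstract describes comes from.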

  5. Experimental Shielding Evaluation of the Radiation Protection Provided by Residential Structures

    NASA Astrophysics Data System (ADS)

    Dickson, Elijah D.

    The human health and environmental effects following a postulated accidental release of radioactive material to the environment have been a public and regulatory concern since the early development of nuclear technology, and have been researched extensively to better understand the potential risks for accident mitigation and emergency planning purposes. The objective of this investigation is to research and develop the technical basis for contemporary building shielding factors for the U.S. housing stock. Building shielding factors quantify the protection a certain building-type provides from ionizing radiation. Much of the current data used to determine the quality of shielding around nuclear facilities and urban environments is based on simplistic point-kernel calculations for 1950's era suburbia and is no longer applicable to the densely populated urban environments seen today. To analyze a building's radiation shielding properties, the ideal approach would be to subject a variety of building-types to various radioactive materials and measure the radiation levels in and around the building. While this is not entirely practicable, this research uniquely analyzes the shielding effectiveness of a variety of likely U.S. residential buildings from a realistic source term in a laboratory setting. Results produced in the investigation provide a comparison between the theory and experiment behind building shielding factor methodology by applying laboratory measurements to detailed computational models. These models are used to develop a series of validated building shielding factors for generic residential housing units using the computational code MCNP5. 
For these building shielding factors to be useful in radiological consequence assessments and emergency response planning, two types of shielding factors have been developed: (1) the shielding effectiveness of each structure within a semi-infinite cloud of radioactive material, and (2) the shielding effectiveness of each structure against contaminant deposition on the roof and surrounding surfaces. For example, results from this investigation estimate the building shielding factors from a semi-infinite plume to be 0.36 and 0.65 for comparable two-story models with a basement whose exterior walls are constructed with brick-and-mortar or vinyl siding, respectively, and 0.82 for a typical single-wide manufactured home with vinyl siding.

  6. Predicting Cost/Performance Trade-Offs for Whitney: A Commodity Computing Cluster

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.; Nitzberg, Bill; VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. We develop closed-form expressions detailing the number and size of messages sent by each benchmark. Coupling these with measured single-processor performance, network latency, and network bandwidth, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.
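    Models of this family generally have the shape sketched below: per-process time is local computation plus a latency term per message plus a bandwidth term per byte. The constants here are illustrative, not the paper's measured values.

```python
# Simple latency/bandwidth performance model (illustrative constants,
# not the paper's measurements): predicted time is local compute time
# plus per-message latency plus per-byte bandwidth cost.

def predicted_time(flops, flop_rate, n_msgs, bytes_sent, latency, bandwidth):
    """Predicted wall-clock time (seconds) for one benchmark iteration."""
    return flops / flop_rate + n_msgs * latency + bytes_sent / bandwidth

t = predicted_time(
    flops=2.0e9, flop_rate=1.0e8,     # 20 s of local computation
    n_msgs=1000, bytes_sent=5.0e7,    # communication volume per iteration
    latency=1.0e-4, bandwidth=1.0e7,  # rough 100baseT-era network figures
)
```

    Plugging a benchmark's closed-form message counts and sizes into such an expression, together with measured machine constants, is what lets the model trade off processor cost against network cost for a fixed budget.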

  7. Characterization of pollutant dispersion near elongated ...

    EPA Pesticide Factsheets

    This paper presents a wind tunnel study of the effects of elongated rectangular buildings on the dispersion of pollutants from nearby stacks. The study examines the influence of source location, building aspect ratio, and wind direction on pollutant dispersion with the goal of developing improved algorithms within dispersion models. The paper also examines the current AERMOD/PRIME modeling capabilities compared to wind tunnel observations. Differences in the amount of plume material entrained in the wake region downwind of a building for various source locations and source heights are illustrated with vertical and lateral concentration profiles. These profiles were parameterized using the Gaussian equation and show the influence of building/source configurations on those parameters. When the building is oriented at 45° to the approach flow, for example, the effective plume height descends more rapidly than it does for a perpendicular building, enhancing the resulting surface concentrations in the wake region. Buildings at angles to the wind cause a cross-wind shift in the location of the plume resulting from a lateral mean flow established in the building wake. These and other effects that are not well represented in many dispersion models are important considerations when developing improved algorithms to estimate the location and magnitude of concentrations downwind of elongated buildings. The National Exposure Research Laboratory (NERL) Computational Exposur
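    The Gaussian parameterization referred to above has the standard reflected-plume form; a minimal sketch follows, with purely illustrative dispersion parameters (real sigma-y and sigma-z values depend on downwind distance, stability, and, as this study shows, on building wake effects).

```python
from math import exp, pi

# Gaussian plume concentration with ground reflection (standard form).
# The parameter values below are illustrative only.

def concentration(q, u, y, z, h, sy, sz):
    """Concentration at crosswind offset y and height z for a plume of
    effective height h, emission rate q, wind speed u, and dispersion
    parameters sy (lateral) and sz (vertical) at this downwind distance."""
    lateral = exp(-y * y / (2.0 * sy * sy))
    vertical = (exp(-((z - h) ** 2) / (2.0 * sz * sz))
                + exp(-((z + h) ** 2) / (2.0 * sz * sz)))  # ground reflection
    return q / (2.0 * pi * u * sy * sz) * lateral * vertical

c_center = concentration(q=1.0, u=5.0, y=0.0, z=0.0, h=20.0, sy=30.0, sz=15.0)
c_offset = concentration(q=1.0, u=5.0, y=50.0, z=0.0, h=20.0, sy=30.0, sz=15.0)
```

    The building effects described in the abstract enter this form through the fitted parameters: an angled building lowers the effective h and shifts the plume centerline in y, which is why surface concentrations in the wake are enhanced.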

  8. COED Transactions, Vol. IX, No. 10 & No. 11, October/November 1977. Teaching Professional Use of the Computer While Teaching the Major. Computer Applications in Design Instruction.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    Presented are two papers on computer applications in engineering education coursework. The first paper suggests that since most engineering graduates use only "canned programs" and rarely write their own programs, educational emphasis should include model building and the use of existing software as well as program writing. The second paper deals…

  9. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  10. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
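    One concrete analytical model of this kind is the Keystroke-Level Model, the simplest member of the GOMS family. A minimal sketch follows, using commonly published nominal operator times; this is not the tooling discussed in the paper, and real analyses calibrate these constants per user population.

```python
# Keystroke-Level Model (KLM) sketch. The operator times are the commonly
# published nominal values (Card, Moran & Newell); treat them as defaults,
# not calibrated measurements.

KLM_TIMES = {
    "K": 0.2,   # keystroke or button press (average skilled typist)
    "P": 1.1,   # point at a target with a mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(ops):
    """Estimated error-free execution time (s) for a KLM operator string."""
    return sum(KLM_TIMES[op] for op in ops)

# e.g. "MPK": think, point at a menu item, click it.
t = klm_estimate("MPK")
```

    Comparing such estimates for two candidate interaction sequences is exactly the kind of early, pre-usability-testing design analysis the abstract describes.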

  11. Aggregation of LoD 1 building models as an optimization problem

    NASA Astrophysics Data System (ADS)

    Guercke, R.; Götzelmann, T.; Brenner, C.; Sester, M.

    3D city models offered by digital map providers typically consist of several thousands or even millions of individual buildings. These buildings are usually generated in an automated fashion from high resolution cadastral and remote sensing data and can be very detailed. However, such a high degree of detail is not desirable in every application. One way to remove complexity is to aggregate individual buildings, simplify the ground plan and assign an appropriate average building height. This task is computationally complex because it includes the combinatorial optimization problem of determining which subset of the original set of buildings should best be aggregated to meet the demands of an application. In this article, we introduce approaches to express different aspects of the aggregation of LoD 1 building models as Mixed Integer Programming (MIP) problems. The advantage of this approach is that for linear (and some quadratic) MIP problems, sophisticated software exists to find exact solutions (global optima) with reasonable effort. We also propose two different heuristic approaches based on the region-growing strategy and evaluate their potential for optimization by comparing their performance to a MIP-based approach.
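    A minimal version of a region-growing heuristic for this task might look like the sketch below (an assumed formulation for illustration only; the paper's MIP variant requires a solver and is not shown). Adjacent buildings join a seed region while their height difference to the seed stays within a tolerance, and each merged region is assigned an area-weighted mean height.

```python
# Illustrative region-growing aggregation of LoD 1 buildings (not the
# paper's formulation). Comparing each candidate against the *seed* height
# avoids gradual drift as the region grows.

def aggregate(buildings, adjacency, tol):
    """buildings: {id: (footprint_area, height)};
    adjacency: {id: set of neighbour ids}; tol: max height difference.
    Returns a list of (member ids, total area, area-weighted height)."""
    region_of = {}
    regions = []
    for seed in sorted(buildings):
        if seed in region_of:
            continue
        region, frontier = [seed], [seed]
        region_of[seed] = len(regions)
        while frontier:
            b = frontier.pop()
            for n in adjacency.get(b, ()):
                if n in region_of:
                    continue
                if abs(buildings[n][1] - buildings[seed][1]) <= tol:
                    region_of[n] = len(regions)
                    region.append(n)
                    frontier.append(n)
        regions.append(region)
    merged = []
    for region in regions:
        area = sum(buildings[b][0] for b in region)
        height = sum(buildings[b][0] * buildings[b][1] for b in region) / area
        merged.append((sorted(region), area, height))
    return merged

# Three buildings in a row; only the first two are similar in height.
buildings = {1: (100, 10.0), 2: (50, 11.0), 3: (80, 25.0)}
adjacency = {1: {2}, 2: {1, 3}, 3: {2}}
result = aggregate(buildings, adjacency, tol=2.0)
```

    Unlike the MIP formulation, this greedy pass gives no optimality guarantee, which is precisely the trade-off the article evaluates.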

  12. cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.

    PubMed

    Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E

    2018-06-01

    Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") incorporate optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement jobs to AWS. These software tools dramatically lower the barrier to entry for new users of cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.

  13. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    PubMed

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  14. Integration of a CAS/DGS as a CAD system in the mathematics curriculum for architecture students

    NASA Astrophysics Data System (ADS)

    Falcón, R. M.

    2011-09-01

    Students of Architecture and Building Engineering Degrees work daily with Computer Aided Design systems in order to design and model architectonic constructions. Since this kind of software is based on the creation and transformation of geometrical objects, it seems to be a useful tool in Maths classes for capturing the attention of the students. However, users of these systems cannot display the set of formulas and equations which constitute the basis of their work. Moreover, if they want to represent curves or surfaces starting from their corresponding equations, they have to define specific macros, which requires knowledge of some computer language, or they have to create a table of points in order to convert a set of nodes into polylines, polysolids or splines. More specific concepts, like, for instance, those related to differential geometry, are not implemented in this kind of software, although they are taught in our Maths classes. In a very similar virtual environment, Computer Algebra and Dynamic Geometry Systems offer the possibility of implementing several concepts which can be found in the usual mathematics curriculum for Building Engineering: curves, surfaces and calculus. Specifically, the use of sliders related to the Euler angles, and the generation of tools which project 3D into 2D, facilitate the design and modelling of curves and rigid objects in space, starting from their parametric equations. In this article, we describe an experiment carried out with an experimental group and a control group in the context of the Maths classes of the Building Engineering Degree of the University of Seville, where students created their own building models by understanding and testing the usefulness of the mathematical concepts.

  15. Fault-tolerant computer study. [logic designs for building block circuits

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.; Avizienis, A. A.; Ercegovac, M. D.

    1981-01-01

    A set of building block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communications buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology which led to the definition of the building block circuits are discussed.

  16. Critical review of the building downwash algorithms in AERMOD.

    PubMed

    Petersen, Ron L; Guerra, Sergio A; Bova, Anthony S

    2017-08-01

    The only documentation on the building downwash algorithm in AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model), referred to as PRIME (Plume Rise Model Enhancements), is found in the 2000 A&WMA journal article by Schulman, Strimaitis and Scire. Recent field and wind tunnel studies have shown that AERMOD can overpredict concentrations by factors of 2 to 8 for certain building configurations. While a wind tunnel equivalent building dimension (EBD) study can be conducted to approximately correct the overprediction bias, past field and wind tunnel studies indicate that there are notable flaws in the PRIME building downwash theory. A detailed review of the theory, supported by CFD (Computational Fluid Dynamics) and wind tunnel simulations of flow over simple rectangular buildings, revealed the following serious theoretical flaws: enhanced turbulence in the building wake starting at the wrong longitudinal location; constant enhanced turbulence extending up to the wake height; constant initial enhanced turbulence in the building wake (it does not vary with roughness or stability); discontinuities in the streamline calculations; and no method to account for streamlined or porous structures. This paper documents theoretical and other problems in PRIME, along with CFD simulations and wind tunnel observations that support these findings. Although AERMOD/PRIME may provide accurate and unbiased estimates (within a factor of 2) for some building configurations, a major review and update is needed so that accurate estimates can be obtained for other building configurations where significant overpredictions or underpredictions are common due to downwash effects. This will ensure that regulatory evaluations subject to dispersion modeling requirements can be based on an accurate model. Thus, it is imperative that the downwash theory in PRIME be corrected to improve model performance and ensure that the model better represents reality.

  17. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    EPA Science Inventory

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  18. Building a computer program to support children, parents, and distraction during healthcare procedures.

    PubMed

    Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L

    2012-10-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. The data came from a study that identified factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to the characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.

  19. Can the Computer Design a School Building?

    ERIC Educational Resources Information Center

    Roberts, Charles

    The implications of computer technology and architecture are discussed with reference to school building design. A brief introduction is given of computer applications in other fields leading to the conclusions that computers alone cannot design school buildings but may serve as a useful tool in the overall design process. Specific examples are…

  20. Relevancy in Problem Solving: A Computational Framework

    ERIC Educational Resources Information Center

    Kwisthout, Johan

    2012-01-01

    When computer scientists discuss the computational complexity of, for example, finding the shortest path from building A to building B in some town or city, their starting point typically is a formal description of the problem at hand, e.g., a graph with weights on every edge where buildings correspond to vertices, routes between buildings to…

  1. Center for Building Science: Annual report, FY 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cairns, E.J.; Rosenfeld, A.H.

    1987-05-01

    The Center for Building Science consists of four programs in the Applied Science Division: energy analysis, buildings energy systems, windows and lighting, and indoor environment. It was established to provide an umbrella so that groups in different programs but with similar interests could combine to perform joint research, develop new research areas, share resources, and produce joint publications. As detailed below, potential savings for U.S. society from energy-efficient buildings are enormous. But these savings can only be realized through an expanding federal R&D program that develops expertise in this new area. The Center for Building Science develops efficient new building components, computer models, data and information systems, and trains needed building scientists. 135 refs., 72 figs., 18 tabs.

  2. Building-Scale Atmospheric Modeling for Understanding and Anticipating Environmental Risks to Urban Populations

    NASA Astrophysics Data System (ADS)

    Warner, T. T.; Swerdlin, S. P.; Chen, F.; Hayden, M.

    2009-05-01

    The innovative use of Computational Fluid-Dynamics (CFD) models to define the building- and street-scale atmospheric environment in urban areas can benefit society in a number of ways. Design criteria used by architectural climatologists, who help plan the livable cities of the future, require information about air movement within street canyons for different seasons and weather regimes. Understanding indoor urban air-quality problems and their mitigation, especially for older buildings, requires data on air movement and associated dynamic pressures near buildings. Learning how heat waves and anthropogenic forcing in cities collectively affect the health of vulnerable residents is a problem in building thermodynamics, human behavior, and neighborhood-scale and street-canyon-scale atmospheric sciences. And, predicting the movement of plumes of hazardous material released in urban industrial or transportation accidents requires detailed information about vertical and horizontal air motions in the street canyons. These challenges are closer to being addressed because of advances in CFD modeling, the coupling of CFD models with models of indoor air motion and air quality, and the coupling of CFD models with mesoscale weather-prediction models. This paper will review some of the new knowledge and technologies that are being developed to meet these atmospheric-environment needs of our growing urban populations.

  3. Making Ordered DNA and Protein Structures from Computer-Printed Transparency Film Cut-Outs

    ERIC Educational Resources Information Center

    Jittivadhna, Karnyupha; Ruenwongsa, Pintip; Panijpan, Bhinyo

    2009-01-01

    Instructions are given for building physical scale models of ordered structures of B-form DNA, protein [alpha]-helix, and parallel and antiparallel protein [beta]-pleated sheets made from colored computer printouts designed for transparency film sheets. Cut-outs from these sheets are easily assembled. Conventional color coding for atoms are used…

  4. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    ERIC Educational Resources Information Center

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  5. A Model for Critical Games Literacy

    ERIC Educational Resources Information Center

    Apperley, Tom; Beavis, Catherine

    2013-01-01

    This article outlines a model for teaching both computer games and videogames in the classroom for teachers. The model illustrates the connections between in-game actions and youth gaming culture. The article explains how the out-of-school knowledge building, creation and collaboration that occurs in gaming and gaming culture has an impact on…

  6. Toward Accessing Spatial Structure from Building Information Models

    NASA Astrophysics Data System (ADS)

    Schultz, C.; Bhatt, M.

    2011-08-01

    Data about building designs and layouts is becoming increasingly more readily available. In the near future, service personnel (such as maintenance staff or emergency rescue workers) arriving at a building site will have immediate real-time access to enormous amounts of data relating to structural properties, utilities, materials, temperature, and so on. The critical problem for users is the taxing and error prone task of interpreting such a large body of facts in order to extract salient information. This is necessary for comprehending a situation and deciding on a plan of action, and is a particularly serious issue in time-critical and safety-critical activities such as firefighting. Current unifying building models such as the Industry Foundation Classes (IFC), while being comprehensive, do not directly provide data structures that focus on spatial reasoning and spatial modalities that are required for high-level analytical tasks. The aim of the research presented in this paper is to provide computational tools for higher level querying and reasoning that shift the cognitive burden of dealing with enormous amounts of data away from the user. The user can then spend more energy and time in planning and decision making in order to accomplish the tasks at hand. We present an overview of our framework that provides users with an enhanced model of "built-up space". In order to test our approach using realistic design data (in terms of both scale and the nature of the building models), we describe how our system interfaces with IFC, and we conduct timing experiments to determine the practicality of our approach. We discuss general computational approaches for deriving higher-level spatial modalities by focusing on the example of route graphs. Finally, we present a firefighting scenario with alternative route graphs to motivate the application of our framework.
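The route-graph modality mentioned in this abstract can be illustrated with a toy example. The rooms and door adjacencies below are invented, not derived from an actual IFC model; a shortest escape route falls out of a breadth-first search over a door-adjacency graph:

```python
# Toy route graph (spaces and doors invented, not from IFC): find the
# shortest door-to-door route between two spaces by breadth-first search.
from collections import deque

doors = {  # adjacency: which spaces share a door
    "lobby": ["corridor"],
    "corridor": ["lobby", "office", "stairwell"],
    "office": ["corridor"],
    "stairwell": ["corridor", "exit"],
    "exit": ["stairwell"],
}

def route(graph, start, goal):
    """Shortest route (fewest doors crossed) via BFS; None if unreachable."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

escape = route(doors, "office", "exit")
# ["office", "corridor", "stairwell", "exit"]
```

In the framework described above, such a graph would be derived automatically from the building model rather than written by hand.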

  7. Comparisons of four computer models with experimental data from test buildings in northern New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, D.K.; Christian, J.E.

    1985-01-01

    Eight one-room test buildings, 20 ft (6.1 m) square and 7.5 ft (2.3 m) high, were constructed on a high desert site near Tesuque Pueblo, New Mexico, to study the influence of wall dynamic heat transfer characteristics on building heating requirements (the "thermal mass effect"). The buildings are nominally identical except for the walls (adobe, concrete and masonry unit, wood-frame, and log) and are constructed so as to isolate the effects of the walls. The amount of mass in the walls varies from 240 lb/ft² (1171 kg/m²) for the 2 ft (0.61 m) thick adobe wall to 4.3 lb/ft² (21 kg/m²) for the insulated wood-frame wall. The roof, floor, and stem walls are all well insulated and the buildings were constructed with infiltration rates less than 0.4 air changes per hour. The site is instrumented to record building component temperatures and heat fluxes, outside weather conditions, and heating energy use. Data were collected for two heating seasons from midwinter to late spring with the buildings in two configurations, with and without windows. Four computer codes were used to simulate the performance of the test buildings without windows, using site weather data. The codes used were DOE-2.1A, DOE-2.1C, BLAST, and DEROB. Each code was run by a different analyst. Simulations were done for midwinter, late winter, and spring. Two of the test cell comparisons are discussed: the insulated frame and an 11-in. (0.28 m) adobe. This work presents a quantitative and qualitative critical comparison of the modeling and experimental results. Cumulative heating loads, wall heat fluxes, and air surface temperatures are compared, as well as input assumptions to the models. Explanations of differences and difficulties encountered are reported. The principal findings were that cumulative heating loads and the characteristic influences of wall thermal mass on hourly behavior were reproduced by the models.

  8. Solar thermal heating and cooling. A bibliography with abstracts

    NASA Technical Reports Server (NTRS)

    Arenson, M.

    1979-01-01

    This bibliographic series cites and abstracts the literature and technical papers on the heating and cooling of buildings with solar thermal energy. Over 650 citations are arranged in the following categories: space heating and cooling systems; space heating and cooling models; building energy conservation; architectural considerations; thermal load computations; thermal load measurements; domestic hot water; solar and atmospheric radiation; swimming pools; and economics.

  9. CAD/CAM/AM applications in the manufacture of dental appliances.

    PubMed

    Al Mortadi, Noor; Eggbeer, Dominic; Lewis, Jeffrey; Williams, Robert J

    2012-11-01

    The purposes of this study were to apply the latest developments in additive manufacturing (AM) construction and to evaluate the effectiveness of these computer-aided design and computer-aided manufacturing (CAD/CAM) techniques in the production of dental appliances. In addition, a new method of incorporating wire into a single build was developed. A scanner was used to capture 3-dimensional images of Class II Division 1 dental models that were translated onto a 2-dimensional computer screen. Andresen and sleep-apnea devices were designed in 3 dimensions by using FreeForm software (version 11; Geo Magics SensAble Group, Wilmington, Mass) and a phantom arm. The design was then exported and transferred to an AM machine for building. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  10. Laboratory Sequence in Computational Methods for Introductory Chemistry

    NASA Astrophysics Data System (ADS)

    Cody, Jason A.; Wiser, Dawn C.

    2003-07-01

    A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.

  11. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    NASA Technical Reports Server (NTRS)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted energy savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
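The simulate-then-compare idea behind this kind of verification can be sketched with a deliberately simple load model. The loss coefficient, degree-day totals, and meter readings below are invented placeholders, not values from ECPVER:

```python
# Hedged sketch of the validation step: simulate monthly consumption with a
# plain degree-day model, then compare against metered kWh. The coefficient
# and all readings are illustrative only.

UA_KWH_PER_DEGDAY = 12.0  # assumed whole-building loss coefficient

def simulated_kwh(heating_degree_days):
    """Simulated monthly consumption from heating degree-days alone."""
    return UA_KWH_PER_DEGDAY * heating_degree_days

def percent_error(simulated, metered):
    """Signed model-vs-meter discrepancy, in percent of the meter reading."""
    return 100.0 * (simulated - metered) / metered

hdd_by_month = {"Jan": 900, "Feb": 760, "Mar": 540}      # heating degree-days
meter_kwh = {"Jan": 11250.0, "Feb": 8890.0, "Mar": 6600.0}

# A model that tracks the meter within a few percent across months is a
# credible baseline for predicting savings from conservation measures.
errors = {m: percent_error(simulated_kwh(h), meter_kwh[m])
          for m, h in hdd_by_month.items()}
```

Once the simulated baseline is validated this way, the predicted effect of a conservation measure can be checked against the post-retrofit meter readings.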

  12. Exploiting the On-Campus Boiler House.

    ERIC Educational Resources Information Center

    Woods, Donald R.; And Others

    1986-01-01

    Shows how a university utility building ("boiler house") is used in a chemical engineering course for computer simulations, mathematical modeling and process problem exercises. Student projects involving the facility are also discussed. (JN)

  13. Pressure Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    FluiDyne Engineering Corporation, Minneapolis, MN, is one of the world's leading companies in design and construction of wind tunnels. In its designing work, FluiDyne uses a computer program called GTRAN. With GTRAN, engineers create a design and test its performance on the computer before actually building a model; should the design fail to meet criteria, the system or any component part can be redesigned and retested on the computer, saving a great deal of time and money.

  14. A regression-based approach to estimating retrofit savings using the Building Performance Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Travis; Sohn, Michael D.

    Retrofitting building systems is known to provide cost-effective energy savings. This article addresses how the Building Performance Database is used to help identify potential savings. Currently, prioritizing retrofits and computing their expected energy savings and cost/benefits can be a complicated, costly, and uncertain effort. Prioritizing retrofits for a portfolio of buildings can be even more difficult if the owner must determine different investment strategies for each of the buildings. Meanwhile, we are seeing greater availability of data on building energy use, characteristics, and equipment. These data provide opportunities for the development of algorithms that link building characteristics and retrofits empirically. In this paper we explore the potential of using such data for predicting the expected energy savings from equipment retrofits for a large number of buildings. We show that building data with statistical algorithms can provide savings estimates when detailed energy audits and physics-based simulations are not cost- or time-feasible. We develop a multivariate linear regression model with numerical predictors (e.g., operating hours, occupant density) and categorical indicator variables (e.g., climate zone, heating system type) to predict energy use intensity. The model quantifies the contribution of building characteristics and systems to energy use, and we use it to infer the expected savings when modifying particular equipment. We verify the model using residual analysis and cross-validation. We demonstrate the retrofit analysis by providing a probabilistic estimate of energy savings for several hypothetical building retrofits. We discuss the ways understanding the risk associated with retrofit investments can inform decision making. The contributions of this work are the development of a statistical model for estimating energy savings, its application to a large empirical building dataset, and a discussion of its use in informing building retrofit decisions.
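The regression approach described in this record can be sketched as follows. The predictors, EUI values, and equipment indicator below are fabricated for illustration and are far simpler than the Building Performance Database model:

```python
# Illustrative sketch (all numbers fabricated): regress energy-use intensity
# (EUI) on building characteristics plus an equipment indicator, then read
# the expected retrofit savings off the model as the effect of flipping
# that indicator.
import numpy as np

# Design matrix columns: intercept, operating hours/week, old-equipment flag.
X = np.array([
    [1, 40, 1], [1, 60, 1], [1, 80, 1],
    [1, 40, 0], [1, 60, 0], [1, 80, 0],
], dtype=float)
eui = np.array([82.0, 98.0, 114.0, 70.0, 86.0, 102.0])  # kWh/m^2/yr

beta, *_ = np.linalg.lstsq(X, eui, rcond=None)  # ordinary least squares fit

def predicted_eui(hours, old_equipment):
    """Model prediction for a building with the given characteristics."""
    return beta @ np.array([1.0, hours, float(old_equipment)])

# Expected savings from retrofitting old equipment in a 60 h/week building:
savings = predicted_eui(60, 1) - predicted_eui(60, 0)
```

The real model adds many more predictors and categorical indicators, and wraps the point estimate in a probabilistic treatment of uncertainty, but the savings inference is the same indicator-difference idea.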

  15. Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Mah, S. B.; Cryderman, C. S.

    2015-08-01

    Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.

  16. Modal mass estimation from ambient vibrations measurement: A method for civil buildings

    NASA Astrophysics Data System (ADS)

    Acunzo, G.; Fiorini, N.; Mori, F.; Spina, D.

    2018-01-01

    A new method for estimating the modal mass ratios of buildings from unscaled mode shapes identified from ambient vibrations is presented. The method is based on the Multi Rigid Polygons (MRP) model, in which each floor of the building is ideally divided into several non-deformable polygons that move independently of each other. The whole mass of the building is concentrated in the centroids of the polygons, and the experimental mode shapes are expressed in terms of rigid translations and rotations. In this way, the mass matrix of the building can be easily computed on the basis of simple information about the geometry and the materials of the structure. The modal mass ratios can then be obtained through the classical equations of structural dynamics. Ambient vibration measurements must be performed according to this MRP model, using at least two biaxial accelerometers per polygon. After a brief illustration of the theoretical background of the method, numerical validations are presented, analysing the method's sensitivity to different possible sources of error. Quality indexes are defined for evaluating the approximation of the modal mass ratios obtained from a given MRP model. The capability of the proposed model to be applied to real buildings is illustrated through two experimental applications. In the first, a geometrically irregular reinforced concrete building is considered, using a calibrated finite element model to validate the results of the method. The second application refers to a historical monumental masonry building, with a more complex geometry and less information available.
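A minimal numeric sketch of the classical relation such methods rest on (the floor masses and mode shape below are invented; this is not the MRP formulation itself): for a mode shape phi, lumped mass matrix M, and influence vector r of ones, the effective modal mass ratio is (phi^T M r)^2 / ((phi^T M phi)(r^T M r)), which is invariant to the scaling of phi — exactly why unscaled mode shapes suffice:

```python
# Toy computation (floor masses and mode shape invented): effective modal
# mass ratio from an unscaled mode shape and a lumped-mass matrix.
import numpy as np

M = np.diag([200e3, 200e3, 150e3])   # lumped floor masses, kg (assumed)
phi = np.array([0.35, 0.70, 1.00])   # unscaled first-mode shape
r = np.ones(3)                       # influence vector: unit rigid translation

def modal_mass_ratio(phi, M, r):
    """(phi^T M r)^2 / ((phi^T M phi)(r^T M r)); scale-invariant in phi."""
    return (phi @ M @ r) ** 2 / ((phi @ M @ phi) * (r @ M @ r))

ratio = modal_mass_ratio(phi, M, r)  # fraction of total mass in this mode
```

The MRP contribution is in assembling M cheaply from floor geometry and materials, and in expressing the measured mode shapes in the rigid-polygon coordinates this formula expects.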

  17. Computer display and manipulation of biological molecules

    NASA Technical Reports Server (NTRS)

    Coeckelenbergh, Y.; Macelroy, R. D.; Hart, J.; Rein, R.

    1978-01-01

    This paper describes a computer model that was designed to investigate the conformation of molecules, macromolecules and subsequent complexes. Utilizing an advanced 3-D dynamic computer display system, the model is sufficiently versatile to accommodate a large variety of molecular input and to generate data for multiple purposes such as visual representation of conformational changes, and calculation of conformation and interaction energy. Molecules can be built on the basis of several levels of information. These include the specification of atomic coordinates and connectivities and the grouping of building blocks and duplicated substructures using symmetry rules found in crystals and polymers such as proteins and nucleic acids. Called AIMS (Ames Interactive Molecular modeling System), the model is now being used to study pre-biotic molecular evolution toward life.

  18. Biocellion: accelerating computer simulation of multicellular biological system models.

    PubMed

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. Object-oriented regression for building predictive models with high dimensional omics data from translational studies.

    PubMed

    Zhao, Lue Ping; Bolouri, Hamid

    2016-04-01

    Maturing omics technologies enable researchers to generate high dimension omics data (HDOD) routinely in translational clinical studies. In the field of oncology, The Cancer Genome Atlas (TCGA) provided funding support to researchers to generate different types of omics data on a common set of biospecimens with accompanying clinical data and has made the data available for the research community to mine. One important application, and the focus of this manuscript, is to build predictive models for prognostic outcomes based on HDOD. To complement prevailing regression-based approaches, we propose to use an object-oriented regression (OOR) methodology to identify exemplars specified by HDOD patterns and to assess their associations with prognostic outcome. Through computing patient's similarities to these exemplars, the OOR-based predictive model produces a risk estimate using a patient's HDOD. The primary advantages of OOR are twofold: reducing the penalty of high dimensionality and retaining the interpretability to clinical practitioners. To illustrate its utility, we apply OOR to gene expression data from non-small cell lung cancer patients in TCGA and build a predictive model for prognostic survivorship among stage I patients, i.e., we stratify these patients by their prognostic survival risks beyond histological classifications. Identification of these high-risk patients helps oncologists to develop effective treatment protocols and post-treatment disease management plans. Using the TCGA data, the total sample is divided into training and validation data sets. After building up a predictive model in the training set, we compute risk scores from the predictive model, and validate associations of risk scores with prognostic outcome in the validation data (P-value=0.015). Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Object-Oriented Regression for Building Predictive Models with High Dimensional Omics Data from Translational Studies

    PubMed Central

    Zhao, Lue Ping; Bolouri, Hamid

    2016-01-01

    Maturing omics technologies enable researchers to generate high dimension omics data (HDOD) routinely in translational clinical studies. In the field of oncology, The Cancer Genome Atlas (TCGA) provided funding support to researchers to generate different types of omics data on a common set of biospecimens with accompanying clinical data and to make the data available for the research community to mine. One important application, and the focus of this manuscript, is to build predictive models for prognostic outcomes based on HDOD. To complement prevailing regression-based approaches, we propose to use an object-oriented regression (OOR) methodology to identify exemplars specified by HDOD patterns and to assess their associations with prognostic outcome. Through computing patient’s similarities to these exemplars, the OOR-based predictive model produces a risk estimate using a patient’s HDOD. The primary advantages of OOR are twofold: reducing the penalty of high dimensionality and retaining the interpretability to clinical practitioners. To illustrate its utility, we apply OOR to gene expression data from non-small cell lung cancer patients in TCGA and build a predictive model for prognostic survivorship among stage I patients, i.e., we stratify these patients by their prognostic survival risks beyond histological classifications. Identification of these high-risk patients helps oncologists to develop effective treatment protocols and post-treatment disease management plans. Using the TCGA data, the total sample is divided into training and validation data sets. After building up a predictive model in the training set, we compute risk scores from the predictive model, and validate associations of risk scores with prognostic outcome in the validation data (p=0.015). PMID:26972839
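The scoring step at the heart of OOR, computing a patient's similarity to a set of exemplar HDOD patterns and combining those similarities into a risk estimate, can be sketched as below. The correlation-based similarity and the per-exemplar weights are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def exemplar_risk_scores(X, exemplars, weights):
    """Score each patient (row of X) by weighted similarity to exemplars.

    X         : (n_patients, n_genes) expression matrix
    exemplars : (n_exemplars, n_genes) representative HDOD patterns
    weights   : (n_exemplars,) per-exemplar risk weights (illustrative)
    """
    # Centered cosine (correlation-style) similarity patient-vs-exemplar
    Xc = X - X.mean(axis=1, keepdims=True)
    Ec = exemplars - exemplars.mean(axis=1, keepdims=True)
    sim = (Xc @ Ec.T) / (
        np.linalg.norm(Xc, axis=1, keepdims=True) * np.linalg.norm(Ec, axis=1)
    )
    # Risk estimate: similarity-weighted sum over exemplars
    return sim @ weights

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 20))          # 5 toy patients, 20 genes
exemplars = rng.normal(size=(3, 20))  # 3 toy exemplars
scores = exemplar_risk_scores(X, exemplars, np.array([1.0, 0.5, -0.5]))
```

Because each similarity is bounded in [-1, 1], the resulting score stays on an interpretable scale, which is one of the interpretability advantages the abstract emphasizes.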

  1. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  2. Contam airflow models of three large buildings: Model descriptions and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, Douglas R.; Price, Phillip N.

    2009-09-30

Airflow and pollutant transport models are useful for several reasons, including protection from or response to biological terrorism. In recent years they have been used for deciding how many biological agent samplers are needed in a given building to detect the release of an agent; to figure out where those samplers should be located; to predict the number of people at risk in the event of a release of a given size and location; to devise response strategies in the event of a release; to determine optimal trade-offs between sampler characteristics (such as detection limit and response time); and so on. For some of these purposes it is necessary to model a specific building of interest: if you are trying to determine optimal sampling locations, you must have a model of your building and not some different building. But for many purposes generic or 'prototypical' building models would suffice. For example, for determining trade-offs between sampler characteristics, results from one building will carry over to other, similar buildings. Prototypical building models are also useful for comparing or testing different algorithms or computational approaches: different researchers can use the same models, thus allowing direct comparison of results in a way that is not otherwise possible. This document discusses prototypical building models developed by the Airflow and Pollutant Transport Group at Lawrence Berkeley National Laboratory. The models are implemented in the Contam v2.4c modeling program, available from the National Institute of Standards and Technology. We present Contam airflow models of three virtual buildings: a convention center, an airport terminal, and a multi-story office building. All of the models are based to some extent on specific real buildings. Our goal is to produce models that are realistic in terms of approximate magnitudes, directions, and speeds of airflow and pollutant transport. The three models vary substantially in detail. The airport model is the simplest; the convention center model is more detailed; and the large office building model is quite complicated. We give several simplified floor plans in this document to explain basic features of the buildings. The actual models are somewhat more complicated; for instance, spaces that are represented as rectangles in this document sometimes have more complicated shapes in the models. (However, note that the shape of a zone is irrelevant in Contam.) Consult the Contam models themselves for detailed floor plans. Each building model is provided with three ventilation conditions, representing mechanical systems in which 20%, 50%, or 80% of the building air is recirculated and the rest is provided from outdoors. Please see the section on 'Use of the models' for important information about issues to consider if you wish to modify the models to provide no mechanical ventilation or eliminate provision of outdoor air.

  3. Super-resolution using a light inception layer in convolutional neural network

    NASA Astrophysics Data System (ADS)

    Mou, Qinyang; Guo, Jun

    2018-04-01

Recently, several models based on CNN architectures have achieved great results on the Single Image Super-Resolution (SISR) problem. In this paper, we propose an image super-resolution (SR) method using a light inception layer in a convolutional network (LICN). Due to the strong representation ability of our well-designed inception layer, which can learn richer representations with fewer parameters, we can build our model with a shallow architecture that reduces the effect of the vanishing gradients problem and saves computational costs. Our model strikes a balance between computational speed and the quality of the result. Compared with state-of-the-art results, we produce comparable or better results with faster computational speed.

  4. Boxes of Model Building and Visualization.

    PubMed

    Turk, Dušan

    2017-01-01

Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now better than ever determine an average single structure. The tools work better, requirements for engagement of the human brain are lowered, and the frontier of intellectual and scientific challenges has moved on. The quest for resolution of new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.

  5. Mathematical modeling of HIV-like particle assembly in vitro.

    PubMed

    Liu, Yuewu; Zou, Xiufen

    2017-06-01

    In vitro, the recombinant HIV-1 Gag protein can generate spherical particles with a diameter of 25-30 nm in a fully defined system. It has approximately 80 building blocks, and its intermediates for assembly are abundant in geometry. Accordingly, there are a large number of nonlinear equations in the classical model. Therefore, it is difficult to compute values of geometry parameters for intermediates and make the mathematical analysis using the model. In this work, we develop a new model of HIV-like particle assembly in vitro by using six-fold symmetry of HIV-like particle assembly to decrease the number of geometry parameters. This method will greatly reduce computational costs and facilitate the application of the model. Then, we prove the existence and uniqueness of the positive equilibrium solution for this model with 79 nonlinear equations. Based on this model, we derive the interesting result that concentrations of all intermediates at equilibrium are independent of three important parameters, including two microscopic on-rate constants and the size of nucleating structure. Before equilibrium, these three parameters influence the concentration variation rates of all intermediates. We also analyze the relationship between the initial concentration of building blocks and concentrations of all intermediates. Furthermore, the bounds of concentrations of free building blocks and HIV-like particles are estimated. These results will be helpful to guide HIV-like particle assembly experiments and improve our understanding of the assembly dynamics of HIV-like particles in vitro. Copyright © 2017 Elsevier Inc. All rights reserved.
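The flavor of such an assembly model, a chain of intermediates coupled by on- and off-rates, can be conveyed with a much smaller toy system. The single growth pathway, rate constants, and maximum size below are illustrative simplifications, not the paper's 79-equation model with six-fold symmetry.

```python
import numpy as np

def assemble(n_max=10, k_on=1.0, k_off=0.1, c0=1.0, dt=1e-3, steps=5000):
    """Forward-Euler integration of a toy stepwise-assembly model:
    a size-i intermediate grows by adding one free building block
    (rate k_on) and shrinks by losing one (rate k_off)."""
    c = np.zeros(n_max + 1)      # c[i] = concentration of size-i species
    c[1] = c0                    # start from free building blocks only
    for _ in range(steps):
        dc = np.zeros_like(c)
        for i in range(1, n_max):
            # net flux from size i to size i+1
            flux = k_on * c[1] * c[i] - k_off * c[i + 1]
            dc[i] -= flux
            dc[i + 1] += flux
            dc[1] -= flux        # one free building block consumed per event
        c += dt * dc
    return c

c = assemble()
mass = sum(i * c[i] for i in range(len(c)))  # total building blocks in the system
```

Mass conservation (the total count of building blocks across all intermediates) is exactly preserved by the antisymmetric flux bookkeeping, and is a useful invariant for checking such integrations.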

  6. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

The automated and cost-effective detection of buildings in ultra-high spatial resolution imagery is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach, through advanced structure from motion, bundle adjustment and dense image matching, computes a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction procedure, and the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating detection rates at object level of 88% regarding correctness and above 75% regarding detection completeness.
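The DSM/DTM step can be illustrated with a minimal normalised-DSM threshold. The threshold value and toy arrays below are assumptions for illustration; the paper's pipeline layers segmentation, classification and grammar-based reconstruction on top of this basic idea.

```python
import numpy as np

def building_candidate_mask(dsm, dtm, min_height=2.5):
    """Crude first cut at DSM/DTM-based detection: keep cells whose
    surface rises at least min_height above the bare-earth terrain."""
    ndsm = dsm - dtm              # normalised DSM: above-ground height
    return ndsm >= min_height

dsm = np.array([[10.0, 18.0], [10.5, 11.0]])   # toy surface heights (m)
dtm = np.array([[10.0, 10.0], [10.0, 10.0]])   # toy bare-earth heights (m)
mask = building_candidate_mask(dsm, dtm)       # only the 18.0 m cell qualifies
```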

  7. Building Development Monitoring in Multitemporal Remotely Sensed Image Pairs with Stochastic Birth-Death Dynamics.

    PubMed

    Benedek, C; Descombes, X; Zerubia, J

    2012-01-01

    In this paper, we introduce a new probabilistic method which integrates building extraction with change detection in remotely sensed image pairs. A global optimization process attempts to find the optimal configuration of buildings, considering the observed data, prior knowledge, and interactions between the neighboring building parts. We present methodological contributions in three key issues: 1) We implement a novel object-change modeling approach based on Multitemporal Marked Point Processes, which simultaneously exploits low-level change information between the time layers and object-level building description to recognize and separate changed and unaltered buildings. 2) To answer the challenges of data heterogeneity in aerial and satellite image repositories, we construct a flexible hierarchical framework which can create various building appearance models from different elementary feature-based modules. 3) To simultaneously ensure the convergence, optimality, and computation complexity constraints raised by the increased data quantity, we adopt the quick Multiple Birth and Death optimization technique for change detection purposes, and propose a novel nonuniform stochastic object birth process which generates relevant objects with higher probability based on low-level image features.

  8. Brickworx builds recurrent RNA and DNA structural motifs into medium- and low-resolution electron-density maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chojnowski, Grzegorz, E-mail: gchojnowski@genesilico.pl; Waleń, Tomasz; University of Warsaw, Banacha 2, 02-097 Warsaw

    2015-03-01

A computer program that builds crystal structure models of nucleic acid molecules is presented. Brickworx is a computer program that builds crystal structure models of nucleic acid molecules using recurrent motifs including double-stranded helices. In a first step, the program searches for electron-density peaks that may correspond to phosphate groups; it may also take into account phosphate-group positions provided by the user. Subsequently, comparing the three-dimensional patterns of the P atoms with a database of nucleic acid fragments, it finds the matching positions of the double-stranded helical motifs (A-RNA or B-DNA) in the unit cell. If the target structure is RNA, the helical fragments are further extended with recurrent RNA motifs from a fragment library that contains single-stranded segments. Finally, the matched motifs are merged and refined in real space to find the most likely conformations, including a fit of the sequence to the electron-density map. The Brickworx program is available for download and as a web server at http://iimcb.genesilico.pl/brickworx.

  9. Building a semantic web-based metadata repository for facilitating detailed clinical modeling in cancer genome studies.

    PubMed

    Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian

    2017-06-05

    Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both ISO11179 metadata standard and Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate the detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.

  10. Parametric Estimation of Load for Air Force Data Centers

    DTIC Science & Technology

    2015-03-27

R. Nelson, L. Orsenigo and S. Winter, "'History-friendly' models of industry evolution: the computer industry," Industrial and Corporate Change, vol. 8, no. 1, pp. 3-40, 1999. [7] VMWare... NAME(S) AND ADDRESS(ES): Vinh Phung, 38ES/ENOC, 5813 Arnold St, Building 4064, Tinker AFB OK 73145-8120, COM: 405-734-7461, vinh.phung@us.af.mil

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dress, W.B.

Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudo-science and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by analogy with the entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.

  12. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) through which users can extend OpenStudio without the need to compile the open-source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDKs), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, along with how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  13. Hydrological analysis in R: Topmodel and beyond

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Reusser, D.

    2011-12-01

R is quickly gaining popularity in the hydrological sciences community. Its wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and postprocessing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high-performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological - and, by extension, other environmental - models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will need to be developed.
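Topmodel's central quantity is the topographic wetness index ln(a / tan β). A minimal sketch of that computation is given below (in Python rather than R, purely for illustration); the contour width, guard value, and toy inputs are assumptions.

```python
import numpy as np

def topographic_index(upslope_area, slope_rad, contour_width=30.0):
    """TOPMODEL topographic wetness index ln(a / tan(beta)), where a is the
    specific upslope area (upslope area per unit contour width) and beta is
    the local slope angle; in practice both come from DEM preprocessing."""
    a = np.asarray(upslope_area, dtype=float) / contour_width
    tan_beta = np.maximum(np.tan(slope_rad), 1e-6)  # guard flats against division by zero
    return np.log(a / tan_beta)

valley = topographic_index(90000.0, 0.02)  # large drainage area, gentle slope: wetter
ridge = topographic_index(900.0, 0.35)     # small area, steep slope: drier
```

Cells with large contributing area and gentle slope receive a high index, marking where the model predicts saturation to develop first.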

  14. Perceptual category learning and visual processing: An exercise in computational cognitive neuroscience.

    PubMed

    Cantwell, George; Riesenhuber, Maximilian; Roeder, Jessica L; Ashby, F Gregory

    2017-05-01

    The field of computational cognitive neuroscience (CCN) builds and tests neurobiologically detailed computational models that account for both behavioral and neuroscience data. This article leverages a key advantage of CCN-namely, that it should be possible to interface different CCN models in a plug-and-play fashion-to produce a new and biologically detailed model of perceptual category learning. The new model was created from two existing CCN models: the HMAX model of visual object processing and the COVIS model of category learning. Using bitmap images as inputs and by adjusting only a couple of learning-rate parameters, the new HMAX/COVIS model provides impressively good fits to human category-learning data from two qualitatively different experiments that used different types of category structures and different types of visual stimuli. Overall, the model provides a comprehensive neural and behavioral account of basal ganglia-mediated learning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  16. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  17. Urban Terrain Modeling for Augmented Reality Applications

    DTIC Science & Technology

    2001-01-01

pointing (Maybank-92). Almost all such systems are designed to extract the geometry of buildings and to texture these to provide models that can be... Maybank, S. and Faugeras, O. (1992). A Theory of Self-Calibration of a Moving Camera, International Journal of Computer Vision, 8(2):123-151.

  18. Building a Model for Distance Collaboration in the Computer-Assisted Business Communication Classroom.

    ERIC Educational Resources Information Center

    Lopez, Elizabeth Sanders; Nagelhout, Edwin

    1995-01-01

    Outlines a model for distance collaboration between business writing classrooms using network technology. Discusses ways to teach national and international audience awareness, problem solving, and the contextual nature of cases. Discusses goals for distance collaboration, sample assignments, and the pros and cons of network technologies. (SR)

  19. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  20. The variants of an LOD of a 3D building model and their influence on spatial analyses

    NASA Astrophysics Data System (ADS)

    Biljecki, Filip; Ledoux, Hugo; Stoter, Jantien; Vosselman, George

    2016-06-01

    The level of detail (LOD) of a 3D city model indicates the model's grade and usability. However, there exist multiple valid variants of each LOD. As a consequence, the LOD concept is inconclusive as an instruction for the acquisition of 3D city models. For instance, the top surface of an LOD1 block model may be modelled at the eaves of a building or at its ridge height. Such variants, which we term geometric references, are often overlooked and are usually not documented in the metadata. Furthermore, the influence of a particular geometric reference on the performance of a spatial analysis is not known. In response to this research gap, we investigate a variety of LOD1 and LOD2 geometric references that are commonly employed, and perform numerical experiments to investigate their relative difference when used as input for different spatial analyses. We consider three use cases (estimation of the area of the building envelope, building volume, and shadows cast by buildings), and compute the deviations in a Monte Carlo simulation. The experiments, carried out with procedurally generated models, indicate that two 3D models representing the same building at the same LOD, but modelled according to different geometric references, may yield substantially different results when used in a spatial analysis. The outcome of our experiments also suggests that the geometric reference may have a bigger influence than the LOD, since an LOD1 with a specific geometric reference may yield a more accurate result than when using LOD2 models.
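The eaves-versus-ridge example can be made concrete with a toy gabled building; the dimensions below are invented for illustration.

```python
def block_model_volumes(footprint_area, eaves_h, ridge_h):
    """Volume of an LOD1 block model of a gabled building under two
    geometric references (top surface at eaves height vs at ridge height),
    compared with the true volume. Simplification: the gable cross-section
    is a triangle spanning the whole footprint."""
    true_v = footprint_area * (eaves_h + (ridge_h - eaves_h) / 2.0)
    at_eaves = footprint_area * eaves_h     # underestimates the true volume
    at_ridge = footprint_area * ridge_h     # overestimates it
    return true_v, at_eaves, at_ridge

true_v, v_eaves, v_ridge = block_model_volumes(100.0, 6.0, 10.0)
# for this building the reference choice alone shifts the volume by 25% either way
```

This is exactly the kind of deviation the paper's Monte Carlo experiments quantify across many procedurally generated buildings.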

  1. A compressed sensing method with analytical results for lidar feature classification

    NASA Astrophysics Data System (ADS)

    Allen, Josef D.; Yuan, Jiangbo; Liu, Xiuwen; Rahmes, Mark

    2011-04-01

We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using compressed sensing methods via Orthogonal Matching Pursuit algorithms, utilizing a generalized K-Means clustering algorithm to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost-effective for large-scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristic (ROC) curves to show Probability of Detection and False Alarm rates for building vs. vegetation classification. Histograms are shown with sample-size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDEs) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and the edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits, such as Terrain Inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts.
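The sparse-coding engine named in the abstract, Orthogonal Matching Pursuit, can be sketched in a few lines; the random dictionary, sparsity level, and toy signal below are assumptions for illustration, unrelated to the paper's LiDAR features.

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedily pick the dictionary atom
    (column of D) most correlated with the residual, then re-fit all
    selected coefficients by least squares and update the residual."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

# Toy check: recover a 2-sparse code from an overcomplete random dictionary
rng = np.random.default_rng(1)
D = rng.normal(size=(30, 50))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x_true = np.zeros(50)
x_true[3], x_true[17] = 2.0, -1.5
y = D @ x_true
x_hat = omp(D, y, n_nonzero=2)
```

The least-squares re-fit over the whole selected support is what distinguishes OMP from plain matching pursuit, and is why a few iterations suffice for very sparse signals.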

  2. Bigger data for big data: from Twitter to brain-computer interfaces.

    PubMed

    Roesch, Etienne B; Stahl, Frederic; Gaber, Mohamed Medhat

    2014-02-01

    We are sympathetic with Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that a successful attempt at using big data will include more sensitive measurements, more varied sources of information, and will also build from the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.

  3. Predictors of Interpersonal Trust in Virtual Distributed Teams

    DTIC Science & Technology

    2008-09-01

understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators' mental models of the...a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer...and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed

  4. Estimation of aortic valve leaflets from 3D CT images using local shape dictionaries and linear coding

    NASA Astrophysics Data System (ADS)

    Liang, Liang; Martin, Caitlin; Wang, Qian; Sun, Wei; Duncan, James

    2016-03-01

    Aortic valve (AV) disease is a significant cause of morbidity and mortality. The preferred treatment for severe AV disease is surgical resection and replacement of the native valve with either a mechanical or tissue prosthetic. In order to develop effective and long-lasting treatment methods, computational analyses, e.g., structural finite element (FE) and computational fluid dynamic simulations, are very effective for studying valve biomechanics. These analyses are based on mesh models of the aortic valve, which are usually constructed from 3D CT images through many hours of manual annotation; an automatic valve shape reconstruction method is therefore desired. In this paper, we present a method for estimating the aortic valve shape, represented by triangle meshes, from 3D cardiac CT images. We propose a pipeline for aortic valve shape estimation which includes novel algorithms for building local shape dictionaries and for building landmark detectors and curve detectors using those dictionaries. The method is evaluated on a real patient image dataset using a leave-one-out approach and achieves an average accuracy of 0.69 mm. The work will facilitate automatic patient-specific computational modeling of the aortic valve.
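    The "linear coding" over local shape dictionaries used above is a form of sparse coding, for which Orthogonal Matching Pursuit (OMP) is a standard solver. The following is a minimal, self-contained OMP sketch over an arbitrary dictionary, a generic illustration under that assumption rather than the paper's actual landmark or curve detectors.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily pick k atoms (columns of D)
    most correlated with the residual, refitting by least squares."""
    residual = y.astype(float).copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit of y on the selected atoms.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

    For an orthonormal dictionary, OMP recovers an exactly k-sparse signal in k steps.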

  5. Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo.

    ERIC Educational Resources Information Center

    Colella, Vanessa Stevens; Klopfer, Eric; Resnick, Mitchel

    For thousands of years, people (from da Vinci to Einstein) have created models to help them better understand patterns and processes in the world around them. Computers make it easier for novices to build and explore their own models and learn new scientific ideas in the process. This book introduces teachers and students to designing, creating, and…

  6. Visual Modeling as a Motivation for Studying Mathematics and Art

    ERIC Educational Resources Information Center

    Sendova, Evgenia; Grkovska, Slavica

    2005-01-01

    The paper deals with the possibility of enriching the curriculum in mathematics, informatics and art by means of visual modeling of abstract paintings. The authors share their belief that in building a computer model of a construct, one gains deeper insight into the construct, and is motivated to elaborate one's knowledge in mathematics and…

  7. Validation of Point Clouds Segmentation Algorithms Through Their Application to Several Case Studies for Indoor Building Modelling

    NASA Astrophysics Data System (ADS)

    Macher, H.; Landes, T.; Grussenmeyer, P.

    2016-06-01

    Laser scanners are widely used for the modelling of existing buildings and particularly in the creation process of as-built BIM (Building Information Modelling). However, the generation of as-built BIM from point clouds involves mainly manual steps and is consequently time-consuming and error-prone. Along the path to automation, a three-step segmentation approach has been developed, composed of two phases: a segmentation into sub-spaces, namely floors and rooms, and a plane segmentation combined with the identification of building elements. In order to assess and validate the developed approach, different case studies are considered. Indeed, it is essential to apply algorithms to several datasets rather than to develop them against a single dataset whose particularities could bias the development. Indoor point clouds of different types of buildings are used as input for the developed algorithms, ranging from an individual house of almost one hundred square meters to larger buildings of several thousand square meters. The datasets provide various space configurations and contain numerous occluding objects such as desks, computer equipment, home furnishings and even wine barrels. For each dataset, the results are illustrated, and their analysis provides insight into the transferability of the developed approach for the indoor modelling of several types of buildings.
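    Plane segmentation of the kind mentioned above is commonly implemented with RANSAC; the sketch below is an illustrative stand-in (an assumption, not the authors' algorithm) that extracts the dominant plane, e.g. a floor, from a toy indoor point cloud.

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.02, seed=1):
    """Fit a dominant plane n.x = d to an (N,3) point cloud by RANSAC:
    repeatedly fit a plane to 3 random points and keep the one with
    the most inliers within distance tol."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        n = n / norm
        d = n @ p0
        inliers = np.abs(points @ n - d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers
```

    Repeatedly removing the inliers of the best plane yields the set of candidate planes from which walls, floors and ceilings are identified.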

  8. A seismic optimization procedure for reinforced concrete framed buildings based on eigenfrequency optimization

    NASA Astrophysics Data System (ADS)

    Arroyo, Orlando; Gutiérrez, Sergio

    2017-07-01

    Several seismic optimization methods have been proposed to improve the performance of reinforced concrete framed (RCF) buildings; however, they have not been widely adopted among practising engineers because they require complex nonlinear models and are computationally expensive. This article presents a procedure to improve the seismic performance of RCF buildings based on eigenfrequency optimization, which is effective, simple to implement and efficient. The method is used to optimize a 10-storey regular building, and its effectiveness is demonstrated by nonlinear time history analyses, which show important reductions in storey drifts and lateral displacements compared to a non-optimized building. A second example for an irregular six-storey building demonstrates that the method provides benefits to a wide range of RCF structures and supports the applicability of the proposed method.

  9. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. Here we focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. Here we focus on the contribution that computer modeling and theory make to this long-term effort.

  10. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts; the proposed solution is to build efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is to develop a computational framework that allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly desirable. Their design was facilitated by the development of a methodology for KBS construction, which views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture such that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  11. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  12. Automatic Generation of Building Models with Levels of Detail 1-3

    NASA Astrophysics Data System (ADS)

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2016-06-01

    We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start with orienting unsorted image sets employing (Mayer et al., 2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  13. Institutional Transformation Version 2.5 Modeling and Planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villa, Daniel; Mizner, Jack H.; Passell, Howard D.

    Reducing the resource consumption and emissions of large institutions is an important step toward a sustainable future. Sandia National Laboratories' (SNL) Institutional Transformation (IX) project vision is to provide tools that enable planners to make well-informed decisions concerning sustainability, resource conservation, and emissions reduction across multiple sectors. The building sector has been the primary focus so far because it is the largest consumer of resources for SNL. The IX building module allows users to define the evolution of many buildings over time. The module has been created so that it can be generally applied to any set of DOE-2 (http://doe2.com) building models that have been altered to include parameters and expressions required by energy conservation measures (ECM). Once building models have been appropriately prepared, they are checked into a Microsoft Access® database. Each building can be represented by many models. This makes it possible to keep a continuous record of past models, which are replaced with different models as changes occur to the building. In addition, the building module can apply climate scenarios by assigning different weather files to each simulation year. Once the database has been configured, a user interface in Microsoft Excel® is used to create scenarios with one or more ECMs. The capability to include central utility buildings (CUBs) that service more than one building with chilled water has been developed. A utility has been created that joins multiple building models into a single model; after using the utility, several manual steps are required to complete the process. Once this CUB model has been created, the individual contributions of each building are still tracked through meters. Currently, 120 building models from SNL's New Mexico and California campuses have been created. This includes all buildings at SNL greater than 10,000 sq. ft., representing 80% of the energy consumption at SNL. SNL has been able to leverage this model to estimate the energy savings potential of many competing ECMs. The results helped high-level decision makers to create energy reduction goals for SNL. These resources also have multiple applications for use of the models as individual buildings. In addition to the building module, a solar module built in Powersim Studio® allows planners to evaluate the potential photovoltaic (PV) energy generation for flat-plate PV, concentrating solar PV, and concentrating solar thermal technologies at multiple sites across SNL's New Mexico campus. Development of the IX modeling framework was a unique collaborative effort among planners and engineers in SNL's facilities division; scientists and computer modelers in SNL's research and development division; faculty from Arizona State University; and energy modelers from Bridger and Paxton Consulting Engineers Incorporated.

  14. Airside HVAC BESTEST. Adaptation of ASHRAE RP 865 Airside HVAC Equipment Modeling Test Cases for ASHRAE Standard 140. Volume 1, Cases AE101-AE445

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neymark, J.; Kennedy, M.; Judkoff, R.

    This report documents a set of diagnostic analytical verification cases for testing the ability of whole building simulation software to model the air distribution side of typical heating, ventilating and air conditioning (HVAC) equipment. These cases complement the unitary equipment cases included in American National Standards Institute (ANSI)/American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, which test the ability to model the heat-transfer fluid side of HVAC equipment.

  15. Dust in the wind: challenges for urban aerodynamics

    NASA Astrophysics Data System (ADS)

    Boris, Jay P.

    2007-04-01

    The fluid dynamics of airflow through a city controls the transport and dispersion of airborne contaminants. This is urban aerodynamics, not meteorology. The average flow, large-scale fluctuations and turbulence are closely coupled to the building geometry. Buildings create large "rooster-tail" wakes; there are systematic fountain flows up the backs of tall buildings; and dust in the wind can move perpendicular to or even against the locally prevailing wind. Requirements for better prediction accuracy demand time-dependent, three-dimensional CFD computations that include solar heating and buoyancy, complete landscape and building geometry specification (including foliage), and realistic wind fluctuations. This fundamental prediction capability is necessary to assess urban visibility and line-of-sight sensor performance in street canyons and rugged terrain. Computing urban aerodynamics accurately is clearly a time-dependent High Performance Computing (HPC) problem. In an emergency, on the other hand, prediction technology to assess crisis information, sensor performance, and obscured line-of-sight propagation in the face of industrial spills, transportation accidents, or terrorist attacks has very tight time requirements that suggest simple approximations, which tend to produce inaccurate results. In the past we have had to choose one or the other: a fast, inaccurate model or a slow, accurate one. Using new fluid-dynamic principles, an urban-oriented emergency assessment system called CT-Analyst® was invented that solves this dilemma. It produces HPC-quality results for airborne contaminant scenarios nearly instantly and has unique new capabilities suited to sensor optimization. This presentation treats the design and use of CT-Analyst and discusses the developments needed for widespread use with advanced sensor and communication systems.

  16. Probabilistic Modeling and Visualization of the Flexibility in Morphable Models

    NASA Astrophysics Data System (ADS)

    Lüthi, M.; Albrecht, T.; Vetter, T.

    Statistical shape models, and in particular morphable models, have gained widespread use in computer vision, computer graphics and medical imaging. Researchers have started to build models of almost any anatomical structure in the human body. While these models provide a useful prior for many image analysis tasks, relatively little information about the shape represented by the morphable model is exploited. We propose a method for computing and visualizing the remaining flexibility when a part of the shape is fixed. Our method, which is based on Probabilistic PCA, not only leads to an approach for reconstructing the full shape from partial information, but also allows us to investigate and visualize the uncertainty of a reconstruction. To show the feasibility of our approach we performed experiments on a statistical model of the human face and the femur bone. The visualization of the remaining flexibility allows for greater insight into the statistical properties of the shape.
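    In a Gaussian shape model, the "remaining flexibility" when part of the shape is fixed is exactly the conditional covariance of the free coordinates given the fixed ones, and the reconstruction is the conditional mean. The sketch below shows that standard Gaussian-conditioning identity in isolation, as an illustration rather than the paper's Probabilistic PCA implementation.

```python
import numpy as np

def conditional_gaussian(mu, cov, idx_fixed, x_fixed):
    """Condition N(mu, cov) on x[idx_fixed] = x_fixed; returns the mean
    (reconstruction) and covariance (remaining flexibility) of the free
    coordinates."""
    n = len(mu)
    idx_free = np.setdiff1d(np.arange(n), idx_fixed)
    S_ff = cov[np.ix_(idx_free, idx_free)]
    S_fx = cov[np.ix_(idx_free, idx_fixed)]
    S_xx = cov[np.ix_(idx_fixed, idx_fixed)]
    K = S_fx @ np.linalg.inv(S_xx)          # regression coefficients
    mu_cond = mu[idx_free] + K @ (x_fixed - mu[idx_fixed])
    cov_cond = S_ff - K @ S_fx.T            # Schur complement
    return mu_cond, cov_cond
```

    For a morphable model, mu and cov would come from the learned shape distribution; the diagonal of cov_cond can then be visualized as per-vertex flexibility.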

  17. Modeling study of deposition locations in the 291-Z plenum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, L.A.; Glissmeyer, J.A.

    The TEMPEST (Trent and Eyler 1991) and PART5 computer codes were used to predict the probable locations of particle deposition in the suction-side plenum of the 291-Z building, the exhaust fan building for the 234-5Z, 236-Z, and 232-Z buildings in the 200 Area of the Hanford Site. The TEMPEST code provided velocity fields for the airflow through the plenum. These velocity fields were then used with TEMPEST to model near-floor particle concentrations without particle sticking (100% resuspension). The same velocity fields were also used with PART5 to model particle deposition with sticking (0% resuspension). Among the parameters whose importance was tested were particle size, point of injection, and exhaust fan configuration.

  18. Large-Eddy Simulation on Plume Dispersion within Regular Arrays of Cubic Buildings

    NASA Astrophysics Data System (ADS)

    Nakayama, H.; Jurcakova, K.; Nagai, H.

    2010-09-01

    Hazardous and flammable materials may be accidentally or intentionally released into the atmosphere, either within or close to populated urban areas. For the assessment of human health hazards from toxic substances, the existence of high concentration peaks in a plume should be considered; for the safety analysis of flammable gas, certain critical threshold levels should be evaluated. In such situations, therefore, not only average levels but also instantaneous magnitudes of concentration should be accurately predicted. However, plume dispersion is an extremely complicated process strongly influenced by the presence of buildings. In complex turbulent flows, such as the impinging, separated and recirculating flows around buildings, plume behavior can no longer be accurately predicted using an empirical Gaussian-type plume model. Therefore, we perform Large-Eddy Simulations (LES) of turbulent flow and plume dispersion within and over regular arrays of cubic buildings with various roughness densities and investigate the influence of the building arrangement pattern on the characteristics of mean and fluctuating concentrations. The basic equations for the LES model are the spatially filtered continuity equation, Navier-Stokes equation and transport equation of concentration. The standard Smagorinsky model (Smagorinsky, 1963), which is adequate for environmental flows, is used, with its constant set to 0.12 for estimating the eddy viscosity. The turbulent Schmidt number is 0.5. In our LES model, two computational regions are set up: a driver region for generation of inflow turbulence and a main region for LES of plume dispersion within a regular array of cubic buildings. First, inflow turbulence is generated using Kataoka's method (2002) in the driver region; its data are then imposed at the inlet of the main computational region at each time step. In this study, cubic building arrays with λf = 0.16, 0.25 and 0.33 are set up (λf: the building frontal area index). These surface geometries consist of 20×6, 25×7 and 28×9 arrays in the streamwise and spanwise directions, respectively. Three cases of a plume source located at the ground surface behind a building in the 6th, 7th and 8th row of the building array are tested. It is found that the dependence of the dispersion behavior on roughness density is successfully simulated, and that the spatial distributions of mean and fluctuating concentrations within and over the building arrays are well captured in comparison with the wind tunnel experiments conducted by Bezpalcová (2008).
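    In the LES formulation above, the Smagorinsky closure sets the subgrid eddy viscosity from the resolved strain rate: nu_t = (Cs * Delta)^2 * |S|, with |S| = sqrt(2 S_ij S_ij) and Cs = 0.12 as stated in the abstract. A minimal sketch of that single closure term (an illustration, not the authors' solver):

```python
import numpy as np

def smagorinsky_nu_t(grad_u, delta, cs=0.12):
    """Smagorinsky eddy viscosity nu_t = (cs * delta)**2 * |S|, where
    S = (grad_u + grad_u.T)/2 is the resolved strain-rate tensor and
    |S| = sqrt(2 * S_ij * S_ij)."""
    S = 0.5 * (grad_u + grad_u.T)
    S_mag = np.sqrt(2.0 * np.sum(S * S))
    return (cs * delta) ** 2 * S_mag
```

    For a pure shear du/dy = 2 s^-1 and filter width delta = 0.1 m, |S| = 2 s^-1 and nu_t = (0.12 * 0.1)^2 * 2.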

  19. DOE/ NREL Build One of the World's Most Energy Efficient Office Spaces

    ScienceCinema

    Radocy, Rachel; Livingston, Brian; von Luhrte, Rich

    2018-05-18

    Technology — from sophisticated computer modeling to advanced windows that actually open — will help the newest building at the U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) be one of the world's most energy efficient offices. Scheduled to open this summer, the 222,000-square-foot Research Support Facility (RSF) will house more than 800 staff and an energy-efficient information technology data center. Because 19 percent of the country's energy is used by commercial buildings, DOE plans to make this facility a showcase for energy efficiency. DOE hopes the design of the RSF will be replicated by the building industry and help reduce the nation's energy consumption by changing the way commercial buildings are designed and built.

  20. QUIC Transport and Dispersion Modeling of Vehicle Emissions in Cities for Better Public Health Assessments.

    PubMed

    Brown, Michael J; Williams, Michael D; Nelson, Matthew A; Werley, Kenneth A

    2015-01-01

    The Quick Urban and Industrial Complex (QUIC) plume modeling system is used to explore how the transport and dispersion of vehicle emissions in cities are impacted by the presence of buildings. Using downtown Philadelphia as a test case, notional vehicle emissions of gases and particles are specified as line-source releases on a subset of the east-west and north-south streets. Cases were run in flat terrain and with 3D buildings present in order to show the differences in the model-computed outdoor concentration fields with and without buildings. The QUIC calculations show that buildings produce regions with much higher concentrations and other areas with much lower concentrations when compared to the flat-earth case. On the roads with vehicle emissions, street-level concentrations were up to a factor of 10 higher when buildings were on either side of the street, due to trapping of pollutants between buildings. However, on roads without vehicle emissions and in other open areas, the concentrations were up to a factor of 100 smaller than in the flat-earth case, because of vertical mixing of the vehicle emissions up to building height in the cavity circulation that develops on the downwind side of unsheltered buildings. QUIC was also used to calculate infiltration of the contaminant into buildings. Indoor concentration levels were found to be much lower than outdoor concentrations because of deposition onto indoor surfaces and particulate capture for buildings with filtration systems. Large differences in indoor concentrations from building to building resulted from differences in leakiness, air-handling-unit volume exchange rates, and filter type, and, for naturally ventilated buildings, from whether the building was sheltered from the prevailing wind by a building immediately upwind.

  1. Ndarts

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan

    2011-01-01

    Ndarts software provides algorithms for computing quantities associated with the dynamics of articulated, rigid-link, multibody systems. It is designed as a general-purpose dynamics library that can be used for the modeling of robotic platforms, space vehicles, molecular dynamics, and other such applications. The architecture and algorithms in Ndarts are based on the Spatial Operator Algebra (SOA) theory for computational multibody and robot dynamics developed at JPL. It uses minimal, internal coordinate models. The algorithms are low-order, recursive scatter/gather algorithms. In comparison with the earlier Darts++ software, this version has a more general and cleaner design needed to support a larger class of computational dynamics needs. It includes a frames infrastructure, allows algorithms to operate on subgraphs of the system, and implements lazy and deferred computation for better efficiency. Dynamics modeling modules such as Ndarts are core building blocks of control and simulation software for space, robotic, mechanism, bio-molecular, and material systems modeling.

  2. Effects of building roof greening on air quality in street canyons

    NASA Astrophysics Data System (ADS)

    Baik, Jong-Jin; Kwak, Kyung-Hwan; Park, Seung-Bu; Ryu, Young-Hee

    2012-12-01

    Building roof greening is a successful strategy for improving urban thermal environment. It is of theoretical interest and practical importance to study the effects of building roof greening on urban air quality in a systematic and quantitative way. In this study, we examine the effects of building roof greening on air quality in street canyons using a computational fluid dynamics (CFD) model that includes the thermodynamic energy equation and the transport equation of passive, non-reactive pollutants. For simplicity, building roof greening is represented by specified cooling. Results for a simple building configuration with a street canyon aspect ratio of one show that the cool air produced due to building roof greening flows into the street canyon, giving rise to strengthened street canyon flow. The strengthened street canyon flow enhances pollutant dispersion near the road, which decreases pollutant concentration there. Thus, building roof greening improves air quality near the road. The degree of air quality improvement near the road increases as the cooling intensity increases. In the middle region of the street canyon, the air quality can worsen when the cooling intensity is not too strong. Results for a real urban morphology also show that building roof greening improves air quality near roads. The degree of air quality improvement near roads due to building roof greening depends on the ambient wind direction. These findings provide a theoretical foundation for constructing green roofs for the purpose of improving air quality near roads or at a pedestrian level as well as urban thermal environment. Further studies using a CFD model coupled with a photochemistry model and a surface energy balance model are required to evaluate the effects of building roof greening on air quality in street canyons in a more realistic framework.

  3. Computational Phenotyping in Psychiatry: A Worked Example

    PubMed Central

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology—structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry. PMID:27517087

  4. Computational Phenotyping in Psychiatry: A Worked Example.

    PubMed

    Schwartenbeck, Philipp; Friston, Karl

    2016-01-01

    Computational psychiatry is a rapidly emerging field that uses model-based quantities to infer the behavioral and neuronal abnormalities that underlie psychopathology. If successful, this approach promises key insights into (pathological) brain function as well as a more mechanistic and quantitative approach to psychiatric nosology, structuring therapeutic interventions and predicting response and relapse. The basic procedure in computational psychiatry is to build a computational model that formalizes a behavioral or neuronal process. Measured behavioral (or neuronal) responses are then used to infer the model parameters of a single subject or a group of subjects. Here, we provide an illustrative overview of this process, starting from the modeling of choice behavior in a specific task, simulating data, and then inverting that model to estimate group effects. Finally, we illustrate cross-validation to assess whether between-subject variables (e.g., diagnosis) can be recovered successfully. Our worked example uses a simple two-step maze task and a model of choice behavior based on (active) inference and Markov decision processes. The procedural steps and routines we illustrate are not restricted to a specific field of research or particular computational model but can, in principle, be applied in many domains of computational psychiatry.
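    The simulate-then-invert loop described in this abstract can be illustrated with the simplest possible choice model: a softmax over option values with an inverse temperature β, fit back by maximum likelihood. This is a generic sketch, not the paper's active-inference/Markov-decision-process model; the grid-search fit is an assumption made to keep the example self-contained.

```python
import numpy as np

def simulate_choices(values, beta, n_trials, rng):
    """Softmax choice model: p(a) proportional to exp(beta * values[a])."""
    p = np.exp(beta * values)
    p /= p.sum()
    return rng.choice(len(values), size=n_trials, p=p)

def fit_beta(values, choices, grid=np.linspace(0.01, 10, 1000)):
    """Invert the model: maximum-likelihood estimate of beta over a grid."""
    logls = []
    for b in grid:
        p = np.exp(b * values)
        p /= p.sum()
        logls.append(np.log(p[choices]).sum())
    return grid[int(np.argmax(logls))]
```

    Recovering a known β from simulated choices is the basic sanity check before fitting real subjects; cross-validation then asks whether fitted parameters predict between-subject variables.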

  5. Building Blocks for Reliable Complex Nonlinear Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi N. (Technical Monitor)

    2002-01-01

    This talk describes some of the building blocks to ensure a higher level of confidence in the predictability and reliability (PAR) of numerical simulation of multiscale complex nonlinear problems. The focus is on relating PAR of numerical simulations with complex nonlinear phenomena of numerics. To isolate sources of numerical uncertainties, the possible discrepancy between the chosen partial differential equation (PDE) model and the real physics and/or experimental data is set aside. The discussion is restricted to how well numerical schemes can mimic the solution behavior of the underlying PDE model for finite time steps and grid spacings. The situation is complicated by the fact that the available theory for the understanding of nonlinear behavior of numerics is not at a stage to fully analyze the nonlinear Euler and Navier-Stokes equations. The discussion is based on the knowledge gained for nonlinear model problems with known analytical solutions to identify and explain the possible sources and remedies of numerical uncertainties in practical computations. Examples relevant to turbulent flow computations are included.

  6. Exploring similarities among many species distributions

    USGS Publications Warehouse

    Simmerman, Scott; Wang, Jingyuan; Osborne, James; Shook, Kimberly; Huang, Jian; Godsoe, William; Simons, Theodore R.

    2012-01-01

    Collecting species presence data and then building models to predict species distribution has been long practiced in the field of ecology for the purpose of improving our understanding of species relationships with each other and with the environment. Due to limitations of computing power as well as limited means of using modeling software on HPC facilities, past species distribution studies have been unable to fully explore diverse data sets. We build a system that can, for the first time to our knowledge, leverage HPC to support effective exploration of species similarities in distribution as well as their dependencies on common environmental conditions. Our system can also compute and reveal uncertainties in the modeling results enabling domain experts to make informed judgments about the data. Our work was motivated by and centered around data collection efforts within the Great Smoky Mountains National Park that date back to the 1940s. Our findings present new research opportunities in ecology and produce actionable field-work items for biodiversity management personnel to include in their planning of daily management activities.

  7. Building Blocks for Reliable Complex Nonlinear Numerical Simulations. Chapter 2

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    This chapter describes some of the building blocks to ensure a higher level of confidence in the predictability and reliability (PAR) of numerical simulation of multiscale complex nonlinear problems. The focus is on relating PAR of numerical simulations with complex nonlinear phenomena of numerics. To isolate sources of numerical uncertainties, the possible discrepancy between the chosen partial differential equation (PDE) model and the real physics and/or experimental data is set aside. The discussion is restricted to how well numerical schemes can mimic the solution behavior of the underlying PDE model for finite time steps and grid spacings. The situation is complicated by the fact that the available theory for the understanding of nonlinear behavior of numerics is not at a stage to fully analyze the nonlinear Euler and Navier-Stokes equations. The discussion is based on the knowledge gained for nonlinear model problems with known analytical solutions to identify and explain the possible sources and remedies of numerical uncertainties in practical computations. Examples relevant to turbulent flow computations are included.

  8. Accuracy Assessment of a Complex Building 3d Model Reconstructed from Images Acquired with a Low-Cost Uas

    NASA Astrophysics Data System (ADS)

    Oniga, E.; Chirilă, C.; Stătescu, F.

    2017-02-01

    Nowadays, Unmanned Aerial Systems (UASs) are widely used for data acquisition to create 3D building models, capturing a large number of very high resolution images or video sequences in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created with these platforms must be evaluated. As a test case, the dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen: a complex-shaped building whose roof is formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three of them being used as ground control points (GCPs) and the remaining four as check points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing the building's characteristic points were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud automatically generated from the digital images acquired with the low-cost UAS, using image matching algorithms in different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds automatically generated by the above-mentioned software and on GNSS data, respectively, the parameters of the east-side hyperbolic paraboloid were calculated using the least squares method and a statistical blunder detection. Then, in order to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof against reference data considered to have minimal errors: a TLS mesh for the facades and a GNSS mesh for the roof.
    Finally, the front facade of the building was modeled in 3D from its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results showed the high potential of low-cost UASs for creating building 3D models; moreover, if the 3D model is built from the building's characteristic points, the accuracy is significantly improved.
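The surface-fitting step can be illustrated with a minimal sketch: fitting the parameters of a hyperbolic paraboloid of the form z = a·x² + b·y² + c to noisy points by linear least squares, with a simple 3-sigma residual test standing in for the statistical blunder detection. All coordinates and coefficients below are synthetic, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "roof" points sampled from z = a*x^2 + b*y^2 + c plus noise
a_true, b_true, c_true = 0.05, -0.08, 12.0
x = rng.uniform(-10, 10, 400)
y = rng.uniform(-6, 6, 400)
z = a_true * x**2 + b_true * y**2 + c_true + rng.normal(0.0, 0.02, x.size)

# Linear least squares with design matrix [x^2, y^2, 1]
A = np.column_stack([x**2, y**2, np.ones_like(x)])
params, *_ = np.linalg.lstsq(A, z, rcond=None)
a_hat, b_hat, c_hat = params

# Simple 3-sigma residual test as a stand-in for blunder detection
residuals = z - A @ params
outliers = np.abs(residuals) > 3.0 * residuals.std()
```

Because the model is linear in its parameters, no iteration is needed; a real survey adjustment would add point weights and iterate the blunder rejection.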

  9. Study of Wind Effects on Unique Buildings

    NASA Astrophysics Data System (ADS)

    Olenkov, V.; Puzyrev, P.

    2017-11-01

    The article deals with a numerical simulation of wind effects on the building of the Church of the Intercession of the Holy Virgin in the village of Bulzi in the Chelyabinsk region. We present a calculation algorithm and the resulting pressure fields, velocity fields and fields of kinetic energy of the wind stream, as well as streamlines. Computational fluid dynamics (CFD) emerged three decades ago at the interface of computational mathematics and theoretical hydromechanics and has become a separate branch of science whose subject is the numerical simulation of different fluid and gas flows, and the solution of the associated problems with methods that rely on computer systems. This scientific field, which is of great practical value, is developing intensively. The growth of CFD calculations is driven by the improvement of computer technologies and the creation of multipurpose, easy-to-use CFD packages that are available to a wide group of researchers and cope with various tasks. Such programs are not only competitive with physical experiments, but sometimes provide the only opportunity to answer the research questions. The following advantages of computer simulation can be pointed out: a) reduction in the time spent on design and development of a model in comparison with a real experiment (variation of boundary conditions); b) a numerical experiment allows the simulation of conditions that are not reproducible in environmental tests (use of an ideal gas as the medium); c) computational gas dynamics methods provide a researcher with the complete and ample information necessary to fully describe the different processes of the experiment; d) the economic efficiency of computer calculations is more attractive than that of an experiment; e) the possibility to modify a computational model, which ensures efficient timing (change of the sizes of wall-layer cells in accordance with the chosen turbulence model).

  10. Slope tomography based on eikonal solvers and the adjoint-state method

    NASA Astrophysics Data System (ADS)

    Tavakoli F., B.; Operto, S.; Ribodetti, A.; Virieux, J.

    2017-06-01

    Velocity macromodel building is a crucial step in the seismic imaging workflow as it provides the necessary background model for migration or full waveform inversion. In this study, we present a new formulation of stereotomography that can handle more efficiently long-offset acquisition, complex geological structures and large-scale data sets. Stereotomography is a slope tomographic method based upon a semi-automatic picking of local coherent events. Each local coherent event, characterized by its two-way traveltime and two slopes in common-shot and common-receiver gathers, is tied to a scatterer or a reflector segment in the subsurface. Ray tracing provides a natural forward engine to compute traveltimes and slopes but can suffer from non-uniform ray sampling in the presence of complex media and long-offset acquisitions. Moreover, most implementations of stereotomography explicitly build a sensitivity matrix, leading to the resolution of large systems of linear equations, which can be cumbersome when large-scale data sets are considered. Overcoming these issues comes with a new matrix-free formulation of stereotomography: a factored eikonal solver based on the fast sweeping method to compute first-arrival traveltimes and an adjoint-state formulation to compute the gradient of the misfit function. By solving the eikonal equation from sources and receivers, we make the computational cost proportional to the number of sources and receivers, while it is independent of the density of picked events in each shot and receiver gather. The model space involves the subsurface velocities and the scatterer coordinates, while the dips of the reflector segments are implicitly represented by the spatial support of the adjoint sources and are updated through the joint localization of nearby scatterers. We present an application on the complex Marmousi model for a towed-streamer acquisition and a realistic distribution of local events.
    We show that the estimated model, built without any prior knowledge of the velocities, provides a reliable initial model for frequency-domain FWI of long-offset data for a starting frequency of 4 Hz, although some artefacts at the reservoir level result from a deficit of illumination. This formulation of slope tomography provides a computationally efficient alternative to waveform inversion methods such as reflection waveform inversion or differential-semblance optimization for building an initial model for pre-stack depth migration and conventional FWI.
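A minimal sketch of the forward engine, a first-order fast sweeping solver for the eikonal equation |∇T| = s on a regular grid, is given below. The grid size, source location and homogeneous slowness are illustrative; a production solver would use the factored form mentioned in the abstract to handle the source singularity.

```python
import numpy as np

def fast_sweep_eikonal(slowness, src, h=1.0, n_iter=4):
    """First-arrival traveltimes T solving |grad T| = slowness on a regular
    grid via Godunov upwind updates and four alternating Gauss-Seidel sweeps."""
    n, m = slowness.shape
    T = np.full((n, m), np.inf)
    T[src] = 0.0
    orders = [(range(n), range(m)),
              (range(n), range(m - 1, -1, -1)),
              (range(n - 1, -1, -1), range(m)),
              (range(n - 1, -1, -1), range(m - 1, -1, -1))]
    for _ in range(n_iter):
        for rows, cols in orders:
            for i in rows:
                for j in cols:
                    if (i, j) == src:
                        continue
                    tx = min(T[i - 1, j] if i > 0 else np.inf,
                             T[i + 1, j] if i < n - 1 else np.inf)
                    ty = min(T[i, j - 1] if j > 0 else np.inf,
                             T[i, j + 1] if j < m - 1 else np.inf)
                    a, b = sorted((tx, ty))
                    if not np.isfinite(a):
                        continue  # no upwind neighbour reached yet
                    f = slowness[i, j] * h
                    if b - a >= f:   # causal update from one direction only
                        t_new = a + f
                    else:            # two-directional quadratic update
                        t_new = 0.5 * (a + b + np.sqrt(2.0 * f * f - (a - b) ** 2))
                    if t_new < T[i, j]:
                        T[i, j] = t_new
    return T

# Homogeneous medium: traveltimes approximate slowness * Euclidean distance
T = fast_sweep_eikonal(np.ones((31, 31)), src=(15, 15))
```

In a constant medium the computed traveltimes are exact along grid axes and overestimate slightly along diagonals, which is the expected first-order behaviour of this scheme.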

  11. How does the brain solve visual object recognition?

    PubMed Central

    Zoccolan, Davide; Rust, Nicole C.

    2012-01-01

    Mounting evidence suggests that “core object recognition,” the ability to rapidly recognize objects despite substantial appearance variation, is solved in the brain via a cascade of reflexive, largely feedforward computations that culminate in a powerful neuronal representation in the inferior temporal cortex. However, the algorithm that produces this solution remains little-understood. Here we review evidence ranging from individual neurons, to neuronal populations, to behavior, to computational models. We propose that understanding this algorithm will require using neuronal and psychophysical data to sift through many computational models, each based on building blocks of small, canonical sub-networks with a common functional goal. PMID:22325196

  12. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Kristan D.; Faraj, Daniel

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  13. The thinking of Cloud computing in the digital construction of the oil companies

    NASA Astrophysics Data System (ADS)

    CaoLei, Qizhilin; Dengsheng, Lei

    To speed up the digital construction of oil companies and to enhance productivity and decision-support capabilities, while avoiding the waste and duplicated development effort of the original approach to digital construction, this paper presents a cloud-based model for the digital construction of oil companies: national oil companies join their cloud data and service-center equipment into a single integrated cloud system over a private network, and individual departments then provision their own virtual service centers as needed, providing strong services and computing power for the oil companies.

  14. The European computer model for optronic system performance prediction (ECOMOS)

    NASA Astrophysics Data System (ADS)

    Repasi, Endre; Bijl, Piet; Labarre, Luc; Wittenstein, Wolfgang; Bürsing, Helge

    2017-05-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. This includes two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is exposed. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given as well as a short outlook on validation tests and the future potential of simulation for sensor assessment.

  15. A modeling investigation of the impact of street and building configurations on personal air pollutant exposure in isolated deep urban canyons.

    PubMed

    Ng, Wai-Yin; Chau, Chi-Kwan

    2014-01-15

    This study evaluated the effectiveness of different configurations for two building design elements, namely building permeability and setback, proposed for mitigating air pollutant exposure problems in isolated deep canyons by using an indirect exposure approach. The indirect approach predicted the exposures of three different population subgroups (i.e. pedestrians, shop vendors and residents) by multiplying the pollutant concentrations with the duration of exposure within a specific micro-environment. In this study, the pollutant concentrations for different configurations were predicted using a computational fluid dynamics model. The model was constructed based on the Reynolds-Averaged Navier-Stokes (RANS) equations with the standard k-ε turbulence model. Fifty-one canyon configurations with aspect ratios of 2, 4, 6 and different building permeability values (ratio of building spacing to the building façade length) or different types of building setback (recess of a high building from the road) were examined. The findings indicated that personal exposures of shop vendors were extremely high if they were present inside a canyon without any setback or separation between buildings and when the prevailing wind was perpendicular to the canyon axis. Building separation and building setbacks were effective in reducing personal air exposures in canyons with perpendicular wind, although their effectiveness varied with different configurations. Increasing the permeability value from 0 to 10% significantly lowered the personal exposures on the different population subgroups. Likewise, the personal exposures could also be reduced by the introduction of building setbacks despite their effects being strongly influenced by the aspect ratio of a canyon. Equivalent findings were observed if the reduction in the total development floor area (the total floor area permitted to be developed within a particular site area) was also considered. 
    These findings were employed to formulate a hierarchical decision-making model to guide the planning of deep canyons in high-density urban cities.
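The indirect exposure computation described above (pollutant concentration multiplied by the time spent in each micro-environment, summed per subgroup) can be sketched as follows; the concentrations and activity durations are hypothetical placeholders, not values from the study.

```python
# Indirect exposure: time-integrated concentration per population subgroup.
def personal_exposure(conc_by_env, hours_by_env):
    """Sum of concentration x duration over the micro-environments visited."""
    return sum(conc_by_env[env] * t for env, t in hours_by_env.items())

# Hypothetical pollutant concentrations (ug/m^3) per micro-environment
conc = {"street": 80.0, "shopfront": 60.0, "indoor": 25.0}

# Illustrative hours spent per day by each subgroup (not values from the study)
activity = {
    "pedestrian": {"street": 1.0},
    "shop_vendor": {"street": 0.5, "shopfront": 10.0},
    "resident": {"street": 0.5, "indoor": 14.0},
}

exposure = {name: personal_exposure(conc, hours) for name, hours in activity.items()}
```

Even with identical concentrations, the long dwell time of shop vendors at the shopfront dominates their exposure, which mirrors the study's finding that vendors are the most affected subgroup.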

  16. Determining position inside building via laser rangefinder and handheld computer

    DOEpatents

    Ramsey, Jr James L. [Albuquerque, NM; Finley, Patrick [Albuquerque, NM; Melton, Brad [Albuquerque, NM

    2010-01-12

    An apparatus, computer software, and a method of determining position inside a building comprising selecting on a PDA at least two walls of a room in a digitized map of a building or a portion of a building, pointing and firing a laser rangefinder at corresponding physical walls, transmitting collected range information to the PDA, and computing on the PDA a position of the laser rangefinder within the room.
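With two non-parallel walls selected on the digitized map, the position computation reduces to intersecting two wall lines offset by the measured ranges; a minimal 2D sketch, in which the wall representation and the room geometry are assumptions for illustration:

```python
import numpy as np

def locate_from_two_walls(walls, ranges):
    """Intersect two range-offset wall lines.
    Each wall is (unit_normal, offset) describing the line n . p = c,
    with the normal pointing into the room; a reading of r metres from a
    wall places the rangefinder on the line n . p = c + r."""
    (n1, c1), (n2, c2) = walls
    A = np.array([n1, n2], dtype=float)   # rows are wall normals; walls must not be parallel
    b = np.array([c1 + ranges[0], c2 + ranges[1]], dtype=float)
    return np.linalg.solve(A, b)

# Hypothetical room: west wall on x = 0 (normal +x), south wall on y = 0 (normal +y)
walls = [((1.0, 0.0), 0.0), ((0.0, 1.0), 0.0)]
pos = locate_from_two_walls(walls, ranges=(3.2, 4.5))
```

For perpendicular walls the solve is trivial (the ranges are the coordinates); the same linear system handles obliquely oriented walls taken from the digitized map.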

  17. Contextuality as a Resource for Models of Quantum Computation with Qubits

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E.; Okay, Cihan; Raussendorf, Robert

    2017-09-01

    A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.

  18. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
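The sample-regress-validate loop of a ROM builder can be sketched in a few lines. The stand-in "simulator", the polynomial regression, and the accuracy target below are illustrative and far simpler than REVEAL's pluggable sampling and regression methods.

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_simulation(x):
    """Stand-in for a high-fidelity HPC simulation run (hypothetical)."""
    return np.sin(3.0 * x) + 0.5 * x

def build_rom(n_samples, degree=7):
    """Sample the simulator, fit a polynomial ROM, and quantify its
    accuracy on an independent validation set."""
    x_train = rng.uniform(0.0, 2.0, n_samples)
    rom = np.poly1d(np.polyfit(x_train, expensive_simulation(x_train), degree))
    x_val = rng.uniform(0.0, 2.0, 200)
    rmse = np.sqrt(np.mean((rom(x_val) - expensive_simulation(x_val)) ** 2))
    return rom, rmse

# Iterative refinement: add samples until the ROM meets an accuracy target
for n in (16, 32, 64, 128):
    rom, rmse = build_rom(n)
    if rmse < 1e-2:
        break
```

Once built, `rom(x)` evaluates in microseconds where the real simulator would take hours, which is the entire point of the ROM; the held-out RMSE is the automatic accuracy quantification.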

  19. Computational Design of DNA-Binding Proteins.

    PubMed

    Thyme, Summer; Song, Yifan

    2016-01-01

    Predicting the outcome of engineered and naturally occurring sequence perturbations to protein-DNA interfaces requires accurate computational modeling technologies. It has been well established that computational design to accommodate small numbers of DNA target site substitutions is possible. This chapter details the basic method of design used in the Rosetta macromolecular modeling program that has been successfully used to modulate the specificity of DNA-binding proteins. More recently, combining computational design and directed evolution has become a common approach for increasing the success rate of protein engineering projects. The power of such high-throughput screening depends on computational methods producing multiple potential solutions. Therefore, this chapter describes several protocols for increasing the diversity of designed output. Lastly, we describe an approach for building comparative models of protein-DNA complexes in order to utilize information from homologous sequences. These models can be used to explore how nature modulates specificity of protein-DNA interfaces and potentially can even be used as starting templates for further engineering.

  20. Beyond standard model calculations with Sherpa

    DOE PAGES

    Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; ...

    2015-03-24

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.

  1. Beyond standard model calculations with Sherpa.

    PubMed

    Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; Siegert, Frank

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in Beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.

  2. Estimate Tsunami Flow Conditions and Large-Debris Tracks for the Design of Coastal Infrastructures along Coastlines of the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.

    2017-12-01

    The increasing potential tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR using high-resolution grids and computes the tsunami inundation depth and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the Cascadia Subduction Zone. Three Cascadia scenarios, two deterministic (XXL1 and L1) and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard, are simulated using a combination of a depth-averaged shallow-water model for offshore propagation and a Boussinesq-type model for onshore inundation. We discuss the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of the model results, particularly the inundation depth and flow speed for a new building at Newport, Oregon, which will also be designated as a tsunami vertical evacuation shelter. We show that the ASCE 7-16 consistent hazards lie between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we use the inundation model results to numerically compute the tracks of large vessels in the vicinity of the building site and estimate whether these vessels would impact the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios.
    A two-step study is carried out: first the tracks of massless particles are computed, and then those of large vessels with assigned mass, taking into account drag and inertial forces, ship grounding and mooring. The simulation results show that none of the large vessels would impact the building site in any of the tested scenarios.

  3. Fitting mechanistic epidemic models to data: A comparison of simple Markov chain Monte Carlo approaches.

    PubMed

    Li, Michael; Dushoff, Jonathan; Bolker, Benjamin M

    2018-07-01

    Simple mechanistic epidemic models are widely used for forecasting and parameter estimation of infectious diseases based on noisy case reporting data. Despite the widespread application of models to emerging infectious diseases, we know little about the comparative performance of standard computational-statistical frameworks in these contexts. Here we build a simple stochastic, discrete-time, discrete-state epidemic model with both process and observation error and use it to characterize the effectiveness of different flavours of Bayesian Markov chain Monte Carlo (MCMC) techniques. We use fits to simulated data, where parameters (and future behaviour) are known, to explore the limitations of different platforms and quantify parameter estimation accuracy, forecasting accuracy, and computational efficiency across combinations of modeling decisions (e.g. discrete vs. continuous latent states, levels of stochasticity) and computational platforms (JAGS, NIMBLE, Stan).
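The workflow described (simulate data from a known epidemic model, then fit with MCMC and check parameter recovery) can be sketched with a deliberately simplified setup: a deterministic discrete-time SIR mean with Poisson observation error and a hand-rolled random-walk Metropolis sampler, rather than the process-plus-observation-error model and the JAGS/NIMBLE/Stan platforms compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def expected_incidence(beta, gamma=0.3, n=1000.0, i0=5.0, steps=40):
    """Deterministic discrete-time SIR; returns expected new cases per step."""
    s, i, out = n - i0, i0, []
    for _ in range(steps):
        new_inf = beta * s * i / n
        out.append(new_inf)
        s -= new_inf
        i += new_inf - gamma * i
    return np.array(out)

def log_lik(beta, cases):
    """Poisson observation model (log-likelihood up to a constant)."""
    lam = np.clip(expected_incidence(beta), 1e-9, None)
    return float(np.sum(cases * np.log(lam) - lam))

# Simulate data with known parameters, then try to recover them
true_beta = 0.6
cases = rng.poisson(expected_incidence(true_beta))

# Random-walk Metropolis over beta, flat prior on (0, 2)
beta, samples = 0.4, []
ll = log_lik(beta, cases)
for _ in range(5000):
    prop = beta + rng.normal(0.0, 0.02)
    if 0.0 < prop < 2.0:
        ll_prop = log_lik(prop, cases)
        if np.log(rng.uniform()) < ll_prop - ll:
            beta, ll = prop, ll_prop
    samples.append(beta)

posterior_mean = float(np.mean(samples[2000:]))
```

Because the generating parameter is known, the distance between the posterior mean and `true_beta` directly measures estimation accuracy, the same diagnostic the paper applies across MCMC platforms.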

  4. Automatic discovery of the communication network topology for building a supercomputer model

    NASA Astrophysics Data System (ADS)

    Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim

    2016-10-01

    The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
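The discovery step (collect neighbor information per switch, then derive the nodes, switches and their interconnections as a graph) can be sketched with a toy neighbor table; the device names and the table format are hypothetical, not Octotron's actual data model.

```python
# Hypothetical neighbor reports, e.g. as collected by polling each switch;
# the names and table layout are illustrative, not Octotron's data model.
neighbor_tables = {
    "switch-a": ["node-1", "node-2", "switch-b"],
    "switch-b": ["node-3", "switch-a"],
}

def build_topology(tables):
    """Derive the device classification and undirected edge set:
    anything that reports a neighbor table is a switch; every other
    device seen on a port is treated as a compute node."""
    switches = set(tables)
    edges = {tuple(sorted((sw, dev))) for sw, seen in tables.items() for dev in seen}
    nodes = {d for e in edges for d in e} - switches
    return switches, nodes, edges

switches, nodes, edges = build_topology(neighbor_tables)
```

Storing edges as sorted tuples deduplicates the switch-to-switch link reported from both ends, so the resulting graph can be fed directly into a model like Octotron's component graph.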

  5. Simulation-based Testing of Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J.; Sanyal, Jibonananda

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the system monitored. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model based testing environment; specifically, we show that a complete software stack, including operating system and application software, can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the MODELICA programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.

  6. Performance Modeling of Experimental Laser Lightcrafts

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.; Turner, Jim (Technical Monitor)

    2001-01-01

    A computational plasma aerodynamics model is developed to study the performance of a laser propelled Lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels, and the simulated physics are discussed and compared with test data and the literature. The predicted coupling coefficients for the Lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.

  7. Cool pool development. Quarterly technical report No. 2, June-December 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowther, K.

    1980-01-05

    The Cool Pool is a variation of the evaporating roof pond idea. The pool is isolated from the living space and the cooled pond water thermosiphons into the water columns located within the building. A computer model of the Cool Pool and the various heat and mass transfer mechanisms involved in the system are discussed. Theory will be compared to experimental data collected from a Cool Pool test building.

  8. Oxford International Conference on the Mechanical Properties of Materials at High Rates of Strain (4th) Held in Oxford, United Kingdom on 19-22 March 1989

    DTIC Science & Technology

    1989-03-22

    models are used in the computer program EPIC2 to describe the structural response in the cylinder impact test are compared and the differences are...Inc. 8600 Le Salle Road Suite 614, Oxford Building Towson, Maryland 21204 This paper describes the development and application of a computer program ...performed using a dynamic viscoplastic finite element computer program . The resolution of the procedure has been investigated by obtaining replicate

  9. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack , Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud ven- dors • Independent IT consortium OpenStack http://www.openstack.org • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  10. Statistical Field Estimation for Complex Coastal Regions and Archipelagos (PREPRINT)

    DTIC Science & Technology

    2011-04-09

    and study the computational properties of these schemes. Specifically, we extend a multiscale Objective Analysis (OA) approach to complex coastal...computational properties of these schemes. Specifically, we extend a multiscale Objective Analysis (OA) approach to complex coastal regions and... multiscale free-surface code builds on the primitive-equation model of the Harvard Ocean Predic- tion System (HOPS, Haley et al. (2009)). Additionally

  11. Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2009-02-01

    physicists as well as practitioners in evolutionary computation. The project was later extended to the one-dimensional SK spin glass with power-law... Brazil) 10. Yuji Sato (Hosei University, Japan) 11. Shunsuke Saruwatari (Tokyo University, Japan) 12. Jian-Hung Chen (Feng Chia University, Taiwan...scalability. In A. Tiwari, J. Knowles, E. Avineri, K. Dahal, and R. Roy (Eds.) Applications of Soft Computing: Recent Trends. Berlin: Springer (2006

  12. A case study of the Weather Research and Forecasting model applied to the Joint Urban 2003 tracer field experiment. Part 2: Gas tracer dispersion

    DOE PAGES

    Nelson, Matthew A.; Brown, Michael J.; Halverson, Scot A.; ...

    2016-07-28

    Here, the Quick Urban & Industrial Complex (QUIC) atmospheric transport and dispersion modelling system was evaluated against the Joint Urban 2003 tracer-gas measurements. This was done using the wind and turbulence fields computed by the Weather Research and Forecasting (WRF) model. We compare the simulated and observed plume transport when using WRF-model-simulated wind fields and local on-site wind measurements. Degradation of the WRF-model-based plume simulations was caused by errors in the simulated wind direction and limitations in reproducing the small-scale wind-field variability. We explore two methods for importing turbulence from the WRF model simulations into the QUIC system. The first method uses parametrized turbulence profiles computed from WRF-model-computed boundary-layer similarity parameters; the second method directly imports turbulent kinetic energy from the WRF model. Using the WRF model's Mellor-Yamada-Janjic boundary-layer scheme, the parametrized turbulence profiles and the direct import of turbulent kinetic energy were found to overpredict and underpredict the observed turbulence quantities, respectively. Near-source building effects were found to propagate several km downwind. These building effects and the temporal/spatial variations in the observed wind field were often found to have a stronger influence on the lateral and vertical plume spread than the intensity of turbulence. Correcting the WRF model wind directions using a single observational location improved the performance of the WRF-model-based simulations, but using the spatially varying flow fields generated from multiple observation profiles generally provided the best performance.

  13. A case study of the Weather Research and Forecasting model applied to the Joint Urban 2003 tracer field experiment. Part 2: Gas tracer dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Matthew A.; Brown, Michael J.; Halverson, Scot A.

    Here, the Quick Urban & Industrial Complex (QUIC) atmospheric transport and dispersion modelling system was evaluated against the Joint Urban 2003 tracer-gas measurements. This was done using the wind and turbulence fields computed by the Weather Research and Forecasting (WRF) model. We compare the simulated and observed plume transport when using WRF-model-simulated wind fields and local on-site wind measurements. Degradation of the WRF-model-based plume simulations was caused by errors in the simulated wind direction and limitations in reproducing the small-scale wind-field variability. We explore two methods for importing turbulence from the WRF model simulations into the QUIC system. The first method uses parametrized turbulence profiles computed from WRF-model-computed boundary-layer similarity parameters; the second method directly imports turbulent kinetic energy from the WRF model. Using the WRF model's Mellor-Yamada-Janjic boundary-layer scheme, the parametrized turbulence profiles and the direct import of turbulent kinetic energy were found to overpredict and underpredict the observed turbulence quantities, respectively. Near-source building effects were found to propagate several km downwind. These building effects and the temporal/spatial variations in the observed wind field were often found to have a stronger influence on the lateral and vertical plume spread than the intensity of turbulence. Correcting the WRF model wind directions using a single observational location improved the performance of the WRF-model-based simulations, but using the spatially varying flow fields generated from multiple observation profiles generally provided the best performance.

  14. Comparison between a typical and a simplified model for blast load-induced structural response

    NASA Astrophysics Data System (ADS)

    Abd-Elhamed, A.; Mahmoud, S.

    2017-02-01

    Explosive blasts continue to cause severe damage and casualties in both civil and military environments, so there is a pressing need to understand the behavior of structural elements under such extremely short-duration dynamic loads. Owing to the complexity of the typical blast pressure profile model, and in order to reduce the modelling and computational effort, the simplified triangular model of the blast load profile is used to analyze structural response. This simplified model considers only the positive phase and ignores the suction phase that characterizes the typical model of blast loads. The closed-form solution of the equation of motion, with the blast load as a forcing term modelled by either the typical or the simplified model, has been derived. The two approaches considered herein have been compared using results from a simulated response analysis of a building structure under an applied blast load. The error in the simulated response using the simplified model with respect to the typical one has been computed. In general, both the simplified and the typical model can reproduce the dynamic blast-load-induced response of building structures. However, the simplified model shows remarkably different response behavior compared to the typical one, owing to its simplicity and its use of only the positive phase to represent the explosive load. The prediction of the dynamic system response using the simplified model is not satisfactory because of the larger errors obtained compared with the responses computed using the typical model.
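    The simplified triangular blast load described above can be sketched numerically for a single-degree-of-freedom system; the mass, stiffness, damping, and pulse parameters below are illustrative assumptions, not values from the paper:

    ```python
    # Sketch: SDOF structural response to a simplified triangular blast pulse
    # (positive phase only, suction phase ignored). All parameter values are
    # illustrative assumptions.
    import math

    def triangular_pulse(t, f0, td):
        """Peak force f0 decaying linearly to zero at time td."""
        return f0 * (1.0 - t / td) if t < td else 0.0

    def sdof_peak_response(m, c, k, f0, td, dt=1e-5, t_end=0.5):
        """Integrate m*x'' + c*x' + k*x = F(t) with semi-implicit Euler."""
        x, v, peak, t = 0.0, 0.0, 0.0, 0.0
        while t < t_end:
            a = (triangular_pulse(t, f0, td) - c * v - k * x) / m
            v += a * dt
            x += v * dt
            peak = max(peak, abs(x))
            t += dt
        return peak

    m, k = 1000.0, 4.0e6             # mass [kg], stiffness [N/m]
    c = 2 * 0.05 * math.sqrt(k * m)  # 5% of critical damping
    peak = sdof_peak_response(m, c, k, f0=50e3, td=0.02)
    print(peak)  # peak displacement [m]
    ```

    With these values the pulse duration is short relative to the natural period, so the response is impulse-dominated and the peak displacement falls below the static deflection F0/k.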

  15. Exploitation of Semantic Building Model in Indoor Navigation Systems

    NASA Astrophysics Data System (ADS)

    Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min

    2009-04-01

    There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but in large-scale building environments such as hospitals, governmental offices, etc., the end-user needs more detailed information about the surrounding context, which is especially important for people with special needs. This paper presents a smart indoor navigation solution based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android concepts to ease the realization of the results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. 1 Introduction. The built environment is a central factor in our daily life, and a large portion of human life is spent inside buildings. Traditionally, buildings are documented using maps and plans produced with IT tools such as computer-aided design (CAD) applications. Documenting maps electronically is already pervasive, but CAD drawings do not satisfy the requirements for effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve interaction with building inhabitants and visitors.
Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication. The available solutions for location tagging are mostly based on proximity sensors, and the information is bound to sensor references. In the solution proposed in this paper, the sensors play a role similar to annotations in the Semantic Web world. Hence, in the ontology sense, the sensor data bridge the gap between sensed information and the building model. By combining the two and applying the proper inference rules, building visitors will be able to reach their destinations with the instant support of communication devices such as handhelds, wearable computers, mobiles, etc. In a typical scenario of this kind, the user's profile is delivered to the smart building (via the building's ad-hoc services), and the appropriate route is calculated and delivered to the user's end-device, taking into account all constraints and requirements of the end user. For example, if the user is using a wheelchair, the calculated route should not contain stairs or corridors too narrow for the wheelchair to pass through. The user then navigates through the building by following the instructions of the end-device, which are in turn generated from the calculated route. During the navigation process, the end-device should also interact with the smart building to sense locations by reading the surrounding tags. For example, when a visually impaired person arrives at an unknown space, the tags are sensed and the relevant information is delivered to the user in an appropriate mode of communication: the building model can be used to generate a voice message telling a blind person that "the space has 3 doors; choose the door on the left, which needs to be pushed open". 
In this paper we mainly focus on the automatic generation of semantic building information models (Semantic BIM) and the delivery of results to the end user. Combining the building information model with environment and user constraints using Semantic Web technologies makes many scenarios conceivable. The generated IFC ontology, which is based on the commonly accepted IFC (Industry Foundation Classes) standard, can be used as the basis for information sharing between buildings, people, and applications. The proposed solution aims to facilitate building navigation in an intuitive and extendable way that is easy for end-users to use and, at the same time, easy for building administrators to maintain and manage.

  16. COMIS -- an international multizone air-flow and contaminant transport model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feustel, H.E.

    1998-08-01

    A number of interzonal models have been developed to calculate air flows and pollutant transport mechanisms in both single-zone and multizone buildings. A recent development in multizone air-flow modeling, the COMIS model, has a number of capabilities that go beyond previous models: COMIS can be used either as a stand-alone air-flow model with its own input and output features or as an infiltration module for thermal building simulation programs. COMIS was designed during a 12-month workshop at Lawrence Berkeley National Laboratory (LBNL) in 1988-89. In 1990, the Executive Committee of the International Energy Agency's Energy Conservation in Buildings and Community Systems program created a working group on multizone air-flow modeling, which continued work on COMIS. The group's objectives were to study the physical phenomena causing air flow and pollutant (e.g., moisture) transport in multizone buildings, develop numerical modules to be integrated into the previously designed multizone air-flow modeling system, and evaluate the computer code. The working group, supported by nine nations, officially finished in late 1997 with the release of IISiBat/COMIS 3.0, which contains the documented simulation program COMIS, the user interface IISiBat, and reports describing the evaluation exercise.

  17. Mechanical Modeling and Computer Simulation of Protein Folding

    ERIC Educational Resources Information Center

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  18. Intelligent tutoring systems for systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Meyer, Richard J.; Toland, Joel; Decker, Louis

    1991-01-01

    The general goal is to provide the technology required to build systems that can provide intelligent tutoring in IDEF (Integrated Computer Aided Manufacturing Definition Method) modeling. The following subject areas are covered: intelligent tutoring systems for systems analysis methodologies; IDEF tutor architecture and components; developing cognitive skills for IDEF modeling; experimental software; and PC based prototype.

  19. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    ERIC Educational Resources Information Center

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  20. Building a Computer Program to Support Children, Parents, and Distraction during Healthcare Procedures

    PubMed Central

    McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W. Nick; Zimmerman, M. Bridget; Ersig, Anne L.

    2012-01-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children’s responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, the Children, Parents and Distraction (CPaD), is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure. PMID:22805121
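    The rule-based layer described above, which maps a model's risk score to an overall risk level and then tailors the distraction intervention with a decision tree, can be sketched as follows. The thresholds, risk bands, and strategy names are illustrative assumptions, not the actual CPaD rules:

    ```python
    # Hypothetical sketch of a rule-based risk band plus a small decision tree
    # for tailoring distraction. Thresholds and strategies are assumed, not CPaD's.

    def risk_level(score):
        """Map a continuous model output (e.g., SVM regression) to a risk band."""
        if score < 0.33:
            return "low"
        if score < 0.66:
            return "moderate"
        return "high"

    def recommend_distraction(score, child_age, prefers_interactive):
        """Tiny decision tree tailoring the intervention to the child."""
        level = risk_level(score)
        if level == "low":
            return "parent-led conversation"
        if child_age < 7:
            if prefers_interactive:
                return "interactive toy with coached parent"
            return "picture book with coached parent"
        if level == "high":
            return "video game distraction"
        return "guided breathing with video"

    print(recommend_distraction(0.8, 10, True))   # prints "video game distraction"
    print(recommend_distraction(0.2, 5, False))   # prints "parent-led conversation"
    ```

    In the real application the score would come from the validated support vector machine regression models, and the tree branches would encode the evidence-based tailoring instructions.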

  1. EnergyPlus Run Time Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Buhl, Fred; Haves, Philip

    2008-09-20

    EnergyPlus is a new-generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations that integrate building components in sub-hourly time steps. However, EnergyPlus runs much slower than current-generation simulation programs, which has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify the key issues and challenges in speeding up EnergyPlus: studying the historical trends of EnergyPlus run time in light of advances in computers and improvements to the EnergyPlus code, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most run time. The paper provides recommendations for improving EnergyPlus run time from the modeler's perspective and for choosing adequate computing platforms. Suggestions for software code and architecture changes to improve EnergyPlus run time, based on the code profiling results, are also discussed.

  2. Experimental Demonstration of Frequency Regulation by Commercial Buildings – Part II: Results and Performance Evaluation

    DOE PAGES

    Vrettos, Evangelos; Kara, Emre Can; MacDonald, Jason; ...

    2016-11-15

    This paper is the second part of a two-part series presenting the results from an experimental demonstration of frequency regulation in a commercial building test facility. We developed relevant building models and designed a hierarchical controller for reserve scheduling, building climate control, and frequency regulation in Part I. In Part II, we introduce the communication architecture and experiment settings, and present extensive experimental results under frequency regulation. More specifically, we compute the day-ahead reserve capacity of the test facility under different assumptions and conditions. Furthermore, we demonstrate the ability of model predictive control to satisfy comfort constraints under frequency regulation, and show that fan speed control can track the fast-moving RegD signal of the Pennsylvania-New Jersey-Maryland (PJM) power market very accurately. In addition, we discuss potential effects of frequency regulation on building operation (e.g., increased energy consumption, oscillations in supply air temperature, and effects on chiller cycling), and provide suggestions for real-world implementation projects. Our results show that hierarchical control is appropriate for frequency regulation from commercial buildings.

  3. Post Occupancy energy evaluation of Ronald Tutor Hall using eQUEST; Computer based simulation of existing building and comparison of data

    NASA Astrophysics Data System (ADS)

    Dulom, Duyum

    Buildings account for about 40 percent of total U.S. energy consumption. It is therefore important to focus on measures that can make buildings more energy efficient. With the number of buildings rising day by day and resources dwindling, retrofitting buildings is the key to an energy-efficient future. Post-occupancy evaluation (POE) is an important tool and is ideal for the retrofitting process. POE helps to identify the problem areas in a building and enables researchers and designers to come up with solutions addressing inefficient energy usage as well as the overall wellbeing of the building's users. The post-occupancy energy evaluation of Ronald Tutor Hall (RTH), located at the University of Southern California, is one small step in that direction. RTH was chosen for study because (a) the building data were relatively easy to access, (b) it was built in compliance with Title 24 (2001), and (c) it was old enough to have post-occupancy data. The energy modeling tool eQUEST was used to simulate the RTH building using background information such as the internal thermal comfort profile, occupancy profile, building envelope profile, and internal heat gain profile. The simulation results from eQUEST were then compared with the building's actual recorded data to verify that the simulated model behaved like the actual building. Once the simulated model reproduced the actual building's behavior, changes were made to it, such as installing occupancy sensors in the classrooms and laboratories, changing the thermostat set points, and introducing solar shades on the northwest and southwest facades. The combined savings of the proposed interventions amounted to 6% of the overall energy usage.

  4. A highly detailed FEM volume conductor model based on the ICBM152 average head template for EEG source imaging and TCS targeting.

    PubMed

    Haufe, Stefan; Huang, Yu; Parra, Lucas C

    2015-08-01

    In electroencephalographic (EEG) source imaging as well as in transcranial current stimulation (TCS), it is common to model the head using either three-shell boundary element (BEM) or more accurate finite element (FEM) volume conductor models. Since building FEMs is computationally demanding and labor intensive, they are often extensively reused as templates, even for subjects with mismatching anatomies. BEMs can in principle be used to efficiently build individual volume conductor models; however, the limiting factor for such individualization is the high acquisition cost of structural magnetic resonance images. Here, we build a highly detailed (0.5 mm³ resolution, 6-tissue-type segmentation, 231 electrodes) FEM based on the ICBM152 template, a nonlinear average of 152 adult human heads, which we call ICBM-NY. We show that, through more realistic electrical modeling, our model is similarly accurate to individual BEMs. Moreover, by using an unbiased population average, our model is also more accurate than FEMs built from mismatching individual anatomies. Our model is made available in Matlab format.

  5. On the application of a new thermal diagnostic model: the passive elements equivalent in terms of ventilation inside a room

    NASA Astrophysics Data System (ADS)

    El Khattabi, El Mehdi; Mharzi, Mohamed; Raefat, Saad; Meghari, Zouhair

    2018-05-01

    In this paper, the thermal equivalence of the passive elements of a room in a building located in Fez, Morocco, has been studied, along with the possibility of replacing them with a semi-passive element such as ventilation. For this purpose, software written in Fortran was developed that takes into account the external meteorological conditions along with different parameters of the building envelope. A new computational approach is adopted to determine the temperature distribution throughout the building's multilayer walls. A novel equation relating the internal temperature to the external conditions and the building envelope has been derived in the transient state.
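    The temperature distribution through a multilayer wall can be sketched with an explicit 1-D transient conduction scheme. This is an illustrative stand-in, not the paper's Fortran code; the wall thickness, material diffusivities, and boundary temperatures are assumed values:

    ```python
    # Minimal sketch: explicit 1-D transient heat conduction through a two-layer
    # wall with fixed outdoor/indoor surface temperatures. All values assumed.

    def wall_temperatures(n=21, steps=20000, dt=0.05):
        dx = 0.20 / (n - 1)                      # 0.20 m wall, uniform grid
        # thermal diffusivity [m^2/s]: brick-like outer half, insulation inner
        alpha = [5e-7 if i < n // 2 else 3e-7 for i in range(n)]
        T = [20.0] * n                           # initial uniform temperature [C]
        T[0], T[-1] = 35.0, 20.0                 # outdoor / indoor surfaces
        for _ in range(steps):
            Tn = T[:]
            for i in range(1, n - 1):
                # FTCS update; stable here since alpha*dt/dx^2 << 0.5
                Tn[i] = T[i] + alpha[i] * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
            Tn[0], Tn[-1] = 35.0, 20.0
            T = Tn
        return T

    T = wall_temperatures()
    print(round(T[len(T) // 2], 2))  # mid-wall temperature, between 20 and 35 C
    ```

    After the simulated interval the heat has penetrated only part of the wall, so nodes near the hot outdoor surface are markedly warmer than the mid-wall node.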

  6. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  7. Twenty Years On!: Updating the IEA BESTEST Building Thermal Fabric Test Cases for ASHRAE Standard 140: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.

    2013-07-01

    ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs applies the IEA BESTEST building thermal fabric test cases and example simulation results originally published in 1995. These software accuracy test cases and their example simulation results, which comprise the first test suite adapted for the initial 2001 version of Standard 140, are approaching their 20th anniversary. In response to the evolution of the state of the art in building thermal fabric modeling since the test cases and example simulation results were developed, work is commencing to update the normative test specification and the informative example results.

  8. A Computational Framework for Realistic Retina Modeling.

    PubMed

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.
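    The idea of reusing a small set of microcircuits as building blocks can be illustrated with a toy sketch: simple stages (a first-order low-pass filter and a rectifier) are chained to form different "pathways". The stages and time constants below are assumed stand-ins, not the platform's actual components:

    ```python
    # Illustrative sketch of composing reusable processing stages into pathways.
    # Stage choices and time constants are assumptions for demonstration only.

    def lowpass(tau, dt=0.001):
        """First-order low-pass filter stage (temporal integration)."""
        a = dt / (tau + dt)
        state = {"y": 0.0}
        def step(x):
            state["y"] += a * (x - state["y"])
            return state["y"]
        return step

    def rectifier(x):
        """Static nonlinearity stage (half-wave rectification)."""
        return max(0.0, x)

    def compose(*stages):
        """Chain stages into a pathway that maps an input signal to an output."""
        def run(signal):
            out = []
            for x in signal:
                for stage in stages:
                    x = stage(x)
                out.append(x)
            return out
        return run

    # Two pathways built from the same blocks, differing only in time constant
    fast = compose(lowpass(0.005), rectifier)
    slow = compose(lowpass(0.050), rectifier)
    step_input = [1.0] * 100
    print(fast(step_input)[-1] > slow(step_input)[-1])  # prints True
    ```

    The point of the design is that the same blocks, reparameterized and recombined, reproduce qualitatively different response dynamics, mirroring how the paper assembles retina models from shared microcircuits.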

  9. HVAC modifications and computerized energy analysis for the Operations Support Building at the Mars Deep Space Station at Goldstone

    NASA Technical Reports Server (NTRS)

    Halperin, A.; Stelzmuller, P.

    1986-01-01

    The key heating, ventilation, and air-conditioning (HVAC) modifications implemented at the Mars Deep Space Station's Operations Support Building, managed by the Jet Propulsion Laboratory (JPL), to reduce energy consumption and operating costs are described. An energy analysis comparing the computer-simulated model of the building with the actual meter data is presented. The measured performance data showed cumulative energy savings of about 21% for the period 1979 to 1981, and the deviation between simulated and measured performance data was only about 3%.

  10. Integrated analysis of numerical weather prediction and computational fluid dynamics for estimating cross-ventilation effects on inhaled air quality inside a factory

    NASA Astrophysics Data System (ADS)

    Murga, Alicia; Sano, Yusuke; Kawamoto, Yoichi; Ito, Kazuhide

    2017-10-01

    Mechanical and passive ventilation strategies directly impact indoor air quality. Passive ventilation, such as natural or cross ventilation, has recently become widespread owing to its ability to reduce energy demand in buildings. To understand the effect of natural ventilation on indoor environmental quality, outdoor-indoor flow paths need to be analyzed as functions of urban atmospheric conditions, the topology of the built environment, and indoor conditions. Wind-driven natural ventilation (e.g., cross ventilation) can be calculated from the wind pressure coefficient distributions over a building's outer wall surfaces and openings, allowing the study of indoor air parameters and airborne contaminant concentrations. Variations in outside parameters directly impact indoor air quality and residents' health. Numerical modeling can help in understanding these various parameters because it allows full control of boundary conditions and sampling points. In this study, numerical weather prediction modeling was used to calculate wind profiles and distributions at the atmospheric scale, and computational fluid dynamics was used to model detailed urban and indoor flows; the two were then integrated in a dynamic downscaling analysis to predict specific urban wind parameters from the atmospheric scale down to the built-environment scale. Wind velocity and contaminant concentration distributions inside a factory building were analyzed to assess the quality of the human working environment using a computer-simulated person. The impact of cross-ventilation flows and their variations on the local average contaminant concentration around a factory worker, and on the inhaled contaminant dose, is then discussed.

  11. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis, and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand the functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research, and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) a lack of analytic functions and computing resources (e.g., analysis software, computing models, and high-performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web services, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove these two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community by 1) enabling seamless discovery, access, and retrieval of distributed data; 2) federating and enhancing data discovery with a catalogue federation service and a semantically augmented catalogue service; 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services; 4) automating or semi-automating multi-source geospatial data integration; 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities; 6) enabling online geospatial process modeling and execution; and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. 
Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as, distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons-learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet wide-range needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  12. Multi-scale modelling to evaluate building energy consumption at the neighbourhood scale.

    PubMed

    Mauree, Dasaraden; Coccolo, Silvia; Kaempf, Jérôme; Scartezzini, Jean-Louis

    2017-01-01

    A new methodology is proposed to couple a meteorological model with a building energy use model. The aim of such a coupling is to improve the boundary conditions of both models with no significant increase in computational time. In the present case, the Canopy Interface Model (CIM) is coupled with CitySim. CitySim provides the geometrical characteristics to CIM, which then calculates a high-resolution profile of the meteorological variables. These are in turn used by CitySim to calculate the energy flows in an urban district. We conducted a series of experiments on the EPFL campus in Lausanne, Switzerland, to show the effectiveness of the coupling strategy. First, measured data from the campus for the year 2015 are used to force CIM and to evaluate its ability to reproduce high-resolution vertical profiles. Second, we compare the use of local climatic data with data from a meteorological station located outside the urban area in an evaluation of energy use. In both experiments, we demonstrate the importance of using, in building energy software, meteorological variables that account for the urban microclimate. Furthermore, we show that some building and urban forms are more sensitive than others to the local environment.
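The coupling loop described above (geometry → microclimate profile → energy use) can be sketched in a few lines. Everything here is a hypothetical stand-in, not the actual CIM or CitySim interfaces: the exponential canopy temperature profile, the UA-value heat-loss model, and all numbers are illustrative assumptions.

```python
import math

def canopy_profile(t_ref, heights, building_height):
    """Toy stand-in for CIM: air near the ground is warmer than the
    reference because heat is trapped between buildings (assumed
    exponential decay with height; hypothetical form)."""
    return [t_ref + 2.0 * math.exp(-h / building_height) for h in heights]

def heating_demand(t_air, t_set=20.0, ua=250.0):
    """Toy stand-in for CitySim: steady-state heat loss (W) of one
    building for a given outdoor air temperature, using an assumed
    overall heat-loss coefficient UA (W/K)."""
    return max(0.0, ua * (t_set - t_air))

# One coupling step: geometry -> microclimate profile -> energy use.
heights = [2.0, 10.0, 30.0]          # m above ground
profile = canopy_profile(t_ref=5.0, heights=heights, building_height=15.0)
demand_urban = heating_demand(profile[0])   # urban canopy temperature
demand_rural = heating_demand(5.0)          # remote weather station
# demand_urban < demand_rural: the microclimate is warmer than the station,
# which is the effect the paper argues building energy software must capture.
```

The gap between `demand_urban` and `demand_rural` is the kind of bias the paper attributes to using out-of-town station data instead of local climatic data.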

  13. Multi-scale modelling to evaluate building energy consumption at the neighbourhood scale

    PubMed Central

    Coccolo, Silvia; Kaempf, Jérôme; Scartezzini, Jean-Louis

    2017-01-01

    A new methodology is proposed to couple a meteorological model with a building energy use model. The aim of such a coupling is to improve the boundary conditions of both models with no significant increase in computational time. In the present case, the Canopy Interface Model (CIM) is coupled with CitySim. CitySim provides the geometrical characteristics to CIM, which then calculates a high-resolution profile of the meteorological variables. These are in turn used by CitySim to calculate the energy flows in an urban district. We conducted a series of experiments on the EPFL campus in Lausanne, Switzerland, to show the effectiveness of the coupling strategy. First, measured data from the campus for the year 2015 are used to force CIM and to evaluate its ability to reproduce high-resolution vertical profiles. Second, we compare the use of local climatic data with data from a meteorological station located outside the urban area in an evaluation of energy use. In both experiments, we demonstrate the importance of using, in building energy software, meteorological variables that account for the urban microclimate. Furthermore, we show that some building and urban forms are more sensitive than others to the local environment. PMID:28880883

  14. Designing an Agent-Based Model Using Group Model Building: Application to Food Insecurity Patterns in a U.S. Midwestern Metropolitan City.

    PubMed

    Koh, Keumseok; Reno, Rebecca; Hyder, Ayaz

    2018-04-01

    Recent advances in computing resources have increased interest in systems modeling and population health. While group model building (GMB) has been effectively applied in developing system dynamics (SD) models, few studies have used GMB to develop an agent-based model (ABM). This article explores the use of a GMB approach to develop an ABM focused on food insecurity. In our GMB workshops, we modified a set of the standard GMB scripts to develop and validate an ABM in collaboration with local experts and stakeholders. Based on this experience, we learned that GMB is a useful collaborative modeling platform for modelers and community experts to address local population health issues. We also provide suggestions for increasing the use of the GMB approach to develop rigorous, useful, and validated ABMs.

  15. [Fabrication and accuracy research on 3D printing dental model based on cone beam computed tomography digital modeling].

    PubMed

    Zhang, Hui-Rong; Yin, Le-Feng; Liu, Yan-Li; Yan, Li-Yi; Wang, Ning; Liu, Gang; An, Xiao-Li; Liu, Bin

    2018-04-01

    The aim of this study is to build a digital dental model from cone beam computed tomography (CBCT), to fabricate a physical model via 3D printing, and to determine the accuracy of the 3D-printed dental model by comparing it with a traditional dental cast. CBCT scans of orthodontic patients were obtained to build digital dental models using Mimics 10.01 and Geomagic Studio software. The physical models were fabricated via the fused deposition modeling (FDM) technique and compared with the traditional cast models using a Vernier caliper. The measurements used for comparison included the width of each tooth, the length and width of the maxillary and mandibular arches, and the length of the posterior dental crest. The 3D-printed models showed high accuracy compared with the traditional cast models: a paired t-test of all data showed no statistically significant difference between the two groups (P>0.05). Digital dental models built from CBCT enable digital storage of patients' dental condition. The physical dental model fabricated via 3D printing avoids traditional impression taking and simplifies the clinical examination process. The 3D-printed dental models produced via FDM show a high degree of accuracy and are thus appropriate for clinical practice.
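The accuracy claim rests on a paired t-test over matched measurements (printed vs. cast). A minimal sketch of that test, with hypothetical tooth-width values standing in for the caliper data; the critical value 2.571 is the standard two-sided 5% threshold for 5 degrees of freedom:

```python
import math
from statistics import mean, stdev

def paired_t_statistic(a, b):
    """t statistic and degrees of freedom for paired samples
    (here: 3D-printed vs. plaster-cast measurements of the same teeth)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# Hypothetical tooth widths (mm); not the study's actual data.
printed = [8.42, 7.10, 9.95, 6.88, 7.53, 10.21]
cast    = [8.40, 7.15, 9.90, 6.90, 7.50, 10.25]
t, dof = paired_t_statistic(printed, cast)
# |t| below the two-sided 5% critical value (2.571 for dof = 5) means the
# difference between the two model types is not statistically significant.
significant = abs(t) > 2.571
```

With differences this small relative to their spread, `significant` is `False`, which mirrors the study's P>0.05 result.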

  16. Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model

    NASA Astrophysics Data System (ADS)

    Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr

    2017-10-01

    Issues concerning the advanced numerical analysis of concrete building structures in sophisticated computing systems currently require the involvement of nonlinear mechanics tools. Efforts to design safer, more durable, and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly presents another step towards approximating the real behaviour of concrete building structures in computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models in computer simulations often becomes problematic because these models frequently contain parameters (material constants) whose values are difficult to obtain, yet correct parameter values are essential to ensure that the material model functions properly. One approach that can solve this problem is the use of optimization algorithms for optimization-based inverse identification of material parameters. Parameter identification goes hand in hand with experimental investigation: it seeks the parameter values of the material model for which the results of the computer simulation best approximate the experimental data. This paper focuses on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model. 
Within this paper, the material parameters of the model are identified through the interaction of nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter taking the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading, which may be further used in research involving dynamic and high-speed tensile loading. Based on the obtained results, it can be concluded that this goal was achieved.
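The identification loop pairs a forward simulation with an optimizer that minimizes the misfit to the measured load-extension curve. A minimal sketch of that loop, under loud assumptions: a toy bilinear curve stands in for the Continuous Surface Cap Model simulation, a coarse grid search stands in for the gradient-based and nature-inspired algorithms, and the "experimental" data are generated synthetically from known parameters.

```python
def simulated_curve(extension, stiffness, peak_load):
    """Toy stand-in for the nonlinear forward simulation: a bilinear
    load-extension response (linear elastic rise, then linear softening)."""
    loads = []
    x_peak = peak_load / stiffness
    for x in extension:
        if x <= x_peak:
            loads.append(stiffness * x)
        else:
            loads.append(max(0.0, peak_load * (1 - (x - x_peak) / (4 * x_peak))))
    return loads

def misfit(params, extension, measured):
    """Sum of squared deviations between simulation and experiment."""
    sim = simulated_curve(extension, *params)
    return sum((s - m) ** 2 for s, m in zip(sim, measured))

# Synthetic "experimental" curve generated from known parameters.
extension = [i * 0.01 for i in range(20)]
measured = simulated_curve(extension, stiffness=120.0, peak_load=3.0)

# Crude grid search standing in for the paper's optimizers: pick the
# parameter pair whose simulated curve best matches the measured one.
best = min(
    ((k, p) for k in range(80, 161, 10) for p in [2.0, 2.5, 3.0, 3.5]),
    key=lambda params: misfit(params, extension, measured),
)
```

Because the synthetic data were generated from (120, 3.0), the search recovers exactly those values, which is the logic of inverse identification: the best parameters are the ones whose simulation reproduces the experiment.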

  17. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  18. Automated method for structural segmentation of nasal airways based on cone beam computed tomography

    NASA Astrophysics Data System (ADS)

    Tymkovych, Maksym Yu.; Avrunin, Oleg G.; Paliy, Victor G.; Filzow, Maksim; Gryshkov, Oleksandr; Glasmacher, Birgit; Omiotek, Zbigniew; DzierŻak, RóŻa; Smailova, Saule; Kozbekova, Ainur

    2017-08-01

    This work addresses the segmentation of human nasal airways using cone beam computed tomography. We propose a specialized approach to the structured segmentation of nasal airways that uses spatial information and symmetrization of the structures. The proposed stages can be used to construct a virtual three-dimensional model of the nasal airways and to produce full-scale personalized atlases. We built such a virtual model, which can be used for constructing specialized medical atlases and for aerodynamics research.
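The thresholding-plus-labeling core of such a pipeline can be sketched on a toy 2-D slice. The Hounsfield-like values and the -400 air threshold below are illustrative assumptions, not the paper's actual parameters, and the flood fill stands in for whatever connected-component method the full method uses:

```python
from collections import deque

def segment_air_regions(hu, air_threshold=-400):
    """Label connected low-density (air) regions in a toy 2-D "CT slice":
    threshold on Hounsfield-like units, then 4-connected flood fill.
    A deliberate simplification of the CBCT-based airway pipeline."""
    rows, cols = len(hu), len(hu[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if hu[r][c] < air_threshold and labels[r][c] == 0:
                current += 1                      # start a new region
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and hu[ny][nx] < air_threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

# Two air cavities (very low HU) separated by tissue (higher HU).
slice_hu = [
    [-800, -800,  40, -700],
    [  30, -790,  50, -750],
    [  20,   10,  60, -760],
]
labels, n_regions = segment_air_regions(slice_hu)
```

On this toy slice the two cavities receive distinct labels, the kind of per-region structure a 3-D pipeline would then stack into an airway model.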

  19. From OSS CAD to BIM for Cultural Heritage Digital Representation

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Stylianidis, E.

    2017-02-01

    The paper illustrates the use of open-source computer-aided design (CAD) environments to develop Building Information Modelling (BIM) tools able to manage 3D models in the field of cultural heritage. Nowadays, the development of Free and Open Source Software (FOSS) has been growing rapidly, and its use is becoming consolidated. Although BIM technology is widely known and used, there is a lack of integrated open-source platforms able to support all stages of Historic Building Information Modelling (HBIM) processes. The present research aims to use a FOSS CAD environment to develop BIM plug-ins able to import and edit digital representations of cultural heritage models derived by photogrammetric methods.

  20. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome this burden, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment with a synthetic groundwater model. Because BMA predictions depend on model posterior weights (or marginal likelihoods), this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate and highly stable in estimating conceptual models' marginal likelihoods: repeated estimates by TIE show significantly less variability than those of the other estimators, and BMA-TIE has better predictive performance than the other BMA predictions. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
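As an illustration of what one of these estimators computes, here is a sketch of the harmonic mean estimator (HME) on hypothetical posterior log-likelihood samples; the groundwater model, the surrogates, and the TIE workflow are not reproduced, and the well-known instability of HME is exactly why the paper prefers TIE.

```python
import math
import random

def harmonic_mean_estimator(log_likelihoods):
    """Log marginal likelihood via the harmonic mean of likelihoods over
    posterior samples: p(y) ~= [ (1/n) * sum(1/L_i) ]^-1.
    Stabilized with the log-sum-exp trick to avoid overflow."""
    n = len(log_likelihoods)
    neg = [-ll for ll in log_likelihoods]          # log(1/L_i)
    m = max(neg)
    log_mean_inv = m + math.log(sum(math.exp(v - m) for v in neg) / n)
    return -log_mean_inv

random.seed(0)
# Hypothetical log-likelihoods of 1000 posterior samples (illustrative only).
samples = [random.gauss(-50.0, 1.0) for _ in range(1000)]
log_ml = harmonic_mean_estimator(samples)
```

The estimate lands near the bulk of the sampled log-likelihoods here; in practice HME is dominated by the smallest likelihood values, which is the variability the paper reports TIE avoiding.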

  1. Influence of urban pattern on inundation flow in floodplains of lowland rivers.

    PubMed

    Bruwier, M; Mustafa, A; Aliaga, D G; Archambeau, P; Erpicum, S; Nishida, G; Zhang, X; Pirotton, M; Teller, J; Dewals, B

    2018-05-01

    The objective of this paper is to investigate the respective influence of various urban pattern characteristics on inundation flow. A set of 2000 synthetic urban patterns was generated using an urban procedural model providing the locations and shapes of streets and buildings over a square domain of 1 × 1 km². Steady two-dimensional hydraulic computations were performed over the 2000 urban patterns with identical hydraulic boundary conditions. To run such a large number of simulations, the computational efficiency of the hydraulic model was improved by using an anisotropic porosity model, which computes on relatively coarse computational cells but preserves information from the detailed topographic data through porosity parameters. Relationships between urban characteristics and the computed inundation water depths were established through multiple linear regressions. Finally, a simple mechanistic model based on two district-scale porosity parameters, combining several urban characteristics, is shown to satisfactorily capture the influence of urban characteristics on inundation water depths. The findings of this study give guidelines for more flood-resilient urban planning. Copyright © 2017 Elsevier B.V. All rights reserved.
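The regression step, relating urban characteristics to computed water depths, amounts to ordinary least squares. A self-contained sketch via the normal equations; the predictors (street width, building coverage) and the exactly-generated depths are hypothetical, not the paper's 2000-pattern dataset:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved with Gaussian elimination and partial pivoting."""
    n, p = len(X), len(X[0])
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
           for a in range(p)]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for col in range(p):                          # forward elimination
        piv = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * p                              # back substitution
    for r in reversed(range(p)):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, p))) / xtx[r][r]
    return beta

# Hypothetical samples: [1, street_width_m, building_coverage] -> depth_m,
# generated exactly as depth = 0.6 - 0.01*width - 0.1*coverage.
X = [[1, 8, 0.3], [1, 12, 0.5], [1, 6, 0.25], [1, 10, 0.35], [1, 14, 0.6]]
y = [0.49, 0.43, 0.515, 0.465, 0.40]
beta = ols(X, y)   # [intercept, width coefficient, coverage coefficient]
```

The recovered coefficients (0.6, -0.01, -0.1) show the pattern the paper's regressions quantify: wider streets and, here, higher coverage both reduce the synthetic depth.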

  2. Curricular Influences on Female Afterschool Facilitators' Computer Science Interests and Career Choices

    NASA Astrophysics Data System (ADS)

    Koch, Melissa; Gorges, Torie

    2016-10-01

    Underrepresented populations such as women, African-Americans, and Latinos/as often come to STEM (science, technology, engineering, and mathematics) careers by less traditional paths than White and Asian males. To better understand how and why women might shift toward STEM, particularly computer science, careers, we investigated the education and career direction of afterschool facilitators, primarily women of color in their twenties and thirties, who taught Build IT, an afterschool computer science curriculum for middle school girls. Many of these women indicated that implementing Build IT had influenced their own interest in technology and computer science and in some cases had resulted in their intent to pursue technology and computer science education. We wanted to explore the role that teaching Build IT may have played in activating or reactivating interest in careers in computer science and to see whether in the years following implementation of Build IT, these women pursued STEM education and/or careers. We reached nine facilitators who implemented the program in 2011-12 or shortly after. Many indicated that while facilitating Build IT, they learned along with the participants, increasing their interest in and confidence with technology and computer science. Seven of the nine participants pursued further STEM or computer science learning or modified their career paths to include more of a STEM or computer science focus. Through interviews, we explored what aspects of Build IT influenced these facilitators' interest and confidence in STEM and when relevant their pursuit of technology and computer science education and careers.

  3. Computational Models for Calcium-Mediated Astrocyte Functions.

    PubMed

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes.

  4. Computational Models for Calcium-Mediated Astrocyte Functions

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes. PMID:29670517

  5. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  6. ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing

    PubMed Central

    Rusakov, Dmitri A.; Savtchenko, Leonid P.

    2017-01-01

    Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at the GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877

  7. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation covering the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access to and use of the program, and a detailed description of all the analytic and logical expressions and flow charts used in the computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user can expect a CPU-time cost of approximately $5.00 per building zone, excluding printing costs. The accuracy, measured by the deviation of simulated consumption from watt-hour meter readings, was found by many simulation tests not to exceed a ±10 percent margin.

  8. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features that are apparent to the simulator pilot. The technical approach begins with the specification of the features to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  9. Local-aggregate modeling for big data via distributed optimization: Applications to neuroimaging.

    PubMed

    Hu, Yue; Allen, Genevera I

    2015-12-01

    Technological advances have led to a proliferation of structured big data that have matrix-valued covariates. We are specifically motivated to build predictive models for multi-subject neuroimaging data based on each subject's brain imaging scans. This is an ultra-high-dimensional problem that consists of a matrix of covariates (brain locations by time points) for each subject; few methods currently exist to fit supervised models directly to this tensor data. We propose a novel modeling and algorithmic strategy to apply generalized linear models (GLMs) to this massive tensor data in which one set of variables is associated with locations. Our method begins by fitting GLMs to each location separately, and then builds an ensemble by blending information across locations through regularization with what we term an aggregating penalty. Our so-called Local-Aggregate Model can be fit in a completely distributed manner over the locations using an Alternating Direction Method of Multipliers (ADMM) strategy, and thus greatly reduces the computational burden. Furthermore, we propose to select the appropriate model through a novel sequence of faster algorithmic solutions that is similar to regularization paths. We demonstrate both the computational and predictive modeling advantages of our methods via simulations and an EEG classification problem. © 2015, The International Biometric Society.
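The two-stage idea (fit each location separately, then blend information across locations) can be sketched without the full ADMM machinery. The blending step below is a crude shrinkage-toward-the-mean analogue of the aggregating penalty, not the paper's actual algorithm, and the per-location data are hypothetical:

```python
def local_fit(x, y):
    """Least-squares slope for one location's data: a one-feature
    stand-in for fitting a GLM at a single brain location."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

def aggregate(betas, lam=0.5):
    """Shrink per-location coefficients toward their common mean: a crude
    analogue of the aggregating penalty that couples locations in ADMM."""
    mean_b = sum(betas) / len(betas)
    return [(b + lam * mean_b) / (1 + lam) for b in betas]

# Hypothetical data for three locations (e.g., EEG channels).
data = {
    "loc1": ([0, 1, 2, 3], [0.1, 1.2, 1.9, 3.1]),
    "loc2": ([0, 1, 2, 3], [0.0, 0.8, 2.1, 2.9]),
    "loc3": ([0, 1, 2, 3], [0.2, 1.1, 2.0, 3.0]),
}
betas = [local_fit(x, y) for x, y in data.values()]  # step 1: fit locally
blended = aggregate(betas)                           # step 2: blend across locations
```

The blended coefficients have a smaller spread than the raw local fits, which is the regularizing effect the aggregating penalty provides; because step 1 touches each location independently, it is what makes the real method distributable.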

  10. On directionality of phrase structure building.

    PubMed

    Chesi, Cristiano

    2015-02-01

    Minimalism in grammatical theorizing (Chomsky in The minimalist program. MIT Press, Cambridge, 1995) led to simpler linguistic devices and a sharper focus on the core properties of the structure-building engine: a lexicon and a free (recursive) phrase formation operation, dubbed Merge, are the basic components that serve in building syntactic structures. Here I suggest that by looking at the elementary restrictions that apply to Merge (i.e., selection and licensing of functional features), we could conclude that a re-orientation of the syntactic derivation (from bottom-up/right-left to top-down/left-right) is necessary to make the theory simpler, especially for long-distance (filler-gap) dependencies, and is also empirically more adequate. If the structure-building operations assembled lexical items in the order they are pronounced (Phillips in Order and structure. PhD thesis, MIT, 1996; Chesi in Phases and cartography in linguistic computation: Toward a cognitively motivated computational model of linguistic competence. PhD thesis, Università di Siena, 2004; Chesi in Competence and computation: Toward a processing friendly minimalist grammar. Unipress, Padova, 2012), on-line performance data could better fit the grammatical model, without resorting to external "performance factors." The phase-based, top-down (and, as a consequence, left-right) Minimalist Grammar discussed here goes in this direction, ultimately showing how strong Islands (Huang in Logical relations in Chinese and the theory of grammar. PhD thesis, MIT, 1982) and intervention effects (Gordon et al. in J Exp Psychol Learn Mem Cogn 27:1411-1423, 2001; Gordon et al. in J Mem Lang 51:97-114, 2004) could be better explained in structural terms assuming this unconventional derivational direction.

  11. Surface temperatures in New York City: Geospatial data enables the accurate prediction of radiative heat transfer.

    PubMed

    Ghandehari, Masoud; Emig, Thorsten; Aghamohamadnia, Milad

    2018-02-02

    Despite decades of research seeking to derive the urban energy budget, the dynamics of thermal exchange in the densely constructed environment are not yet well understood. Using New York City as a study site, we present a novel hybrid experimental-computational approach for a better understanding of radiative heat transfer in complex urban environments. The aim of this work is to contribute to the calculation of the urban energy budget, particularly the stored energy, with our attention focused on surface thermal radiation. Improved understanding of urban thermodynamics incorporating the interaction of various bodies, particularly in high-rise cities, has implications for energy conservation at the building scale and for human health and comfort at the urban scale. The platform presented is based on longwave hyperspectral imaging of nearly 100 blocks of Manhattan, in addition to a geospatial radiosity model that describes the collective radiative heat exchange between multiple buildings. Despite assumptions about the surface emissivity and thermal conductivity of building walls, the close agreement between measured and computed temperatures is promising. The results imply that the presented geospatial thermodynamic model of urban structures can enable accurate, high-resolution analysis of instantaneous urban surface temperatures.
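The building block of such a radiosity model is the pairwise gray-body exchange between surfaces, scaled by a view factor. A minimal single-bounce sketch; the emissivity, view factor, areas, and facade temperatures are assumed values for illustration, not measurements from the study:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_exchange(t1_k, t2_k, emissivity, view_factor, area_m2):
    """Net longwave flux (W) from surface 1 to surface 2:
    gray-body, single-bounce simplification of a radiosity exchange,
    q = eps * sigma * A * F12 * (T1^4 - T2^4)."""
    return emissivity * SIGMA * area_m2 * view_factor * (t1_k**4 - t2_k**4)

# Hypothetical facade at 35 C facing a cooler canyon wall at 20 C.
q = net_radiative_exchange(t1_k=308.15, t2_k=293.15, emissivity=0.92,
                           view_factor=0.4, area_m2=100.0)
# q > 0: the warmer facade loses heat to the cooler wall.
```

A full radiosity model sums such terms over every visible surface pair (with view factors from the 3-D geometry) and iterates on reflected radiation, which is what the geospatial model does at city-block scale.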

  12. Time Analysis of Building Dynamic Response Under Seismic Action. Part 2: Example of Calculation

    NASA Astrophysics Data System (ADS)

    Ufimtcev, E. M.

    2017-11-01

    The second part of the article illustrates the use of the time analysis method (TAM) with the calculation of a 3-storey building, whose design dynamic model (DDM) is adopted in the form of a flat vertical cantilever rod with 3 horizontal degrees of freedom associated with the floor and roof levels. The parameters of natural oscillations (frequencies and modes) are presented, together with the results of the calculation of the elastic forced oscillations of the building’s DDM: oscillograms of the response parameters on the time interval t ∈ [0; 131.25] s. The obtained results are analyzed on the basis of the computed residual of the DDM equation of motion and a comparison of the results calculated with the numerical approach (FEM) and with the normative method set out in SP 14.13330.2014 “Construction in Seismic Regions”. The analysis testifies to the correctness of the computational model as well as the high accuracy of the obtained results. In conclusion, it is noted that the use of the TAM in design will improve the strength of buildings and structures subject to seismic influences.
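    For such a cantilever DDM with one horizontal degree of freedom per floor, the natural frequencies follow from the generalized eigenproblem K φ = ω² M φ. A minimal sketch (an illustrative shear-building reconstruction, not the paper's TAM implementation; the example masses and stiffnesses are assumed):

```python
import numpy as np

def natural_frequencies(masses, stiffnesses):
    """Natural frequencies (Hz) of a shear-building model.

    masses      : floor masses m_1..m_n (kg), bottom to top
    stiffnesses : storey stiffnesses k_1..k_n (N/m)
    """
    n = len(masses)
    M = np.diag(masses)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += stiffnesses[i]            # storey below floor i
        if i + 1 < n:                        # storey above floor i
            K[i, i] += stiffnesses[i + 1]
            K[i, i + 1] = K[i + 1, i] = -stiffnesses[i + 1]
    # generalized eigenproblem K x = w^2 M x, solved via eig(M^-1 K)
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(w2.real)) / (2.0 * np.pi)

# Three storeys with assumed (illustrative) masses and stiffnesses:
freqs = natural_frequencies([4.0e5, 4.0e5, 3.0e5], [8.0e8, 6.0e8, 4.0e8])
```

    The three sorted frequencies correspond to the three horizontal modes of the cantilever model.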

  13. Microgravity Manufacturing Via Fused Deposition

    NASA Technical Reports Server (NTRS)

    Cooper, K. G.; Griffin, M. R.

    2003-01-01

    Manufacturing polymer hardware during space flight is currently outside the state of the art. A process called fused deposition modeling (FDM) can make this approach a reality by producing net-shaped components of polymer materials directly from a CAD model. FDM is a rapid prototyping process developed by Stratasys, Inc., which deposits a fine line of semi-molten polymer onto a substrate while moving under computer control to form the cross-sectional shape of the part it is building. The build platen is then lowered and the process is repeated, building a component layer by layer. This method enables net-shaped production of polymer components directly from a computer file. The layered manufacturing process allows for the manufacture of complex shapes and internal cavities otherwise impossible to machine. This task demonstrated the benefits of the FDM technique to quickly and inexpensively produce replacement components or repair broken hardware in a Space Shuttle or Space Station environment. The intent of the task was to develop and fabricate an FDM system that was lightweight, compact, and required minimal power consumption to fabricate ABS plastic hardware in microgravity. The final product of the shortened task was a ground-based breadboard device demonstrating the miniaturization capability of the system.

  14. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    PubMed

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
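    Once the ultimate capacity of the system under each sudden-column-loss scenario is known, the proposed robustness metric is a simple normalized minimum; a sketch with illustrative numbers (not the paper's data):

```python
def robustness_index(ultimate_capacities, service_load):
    """Robustness metric from the abstract: normalize the ultimate
    capacity under each sudden column-loss scenario by the applicable
    service-level gravity load, then take the minimum over all
    column-removal scenarios.  A value above 1.0 means the system
    survives every scenario."""
    return min(c / service_load for c in ultimate_capacities)

# Hypothetical capacities (kN) for three column-removal scenarios,
# against a 60 kN service-level gravity load:
r = robustness_index([120.0, 90.0, 150.0], 60.0)  # governed by 90/60
```

    The governing scenario is simply the one with the smallest normalized capacity, which is what makes the metric a lower bound over all removals.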

  15. Dynamic Rate Dependent Elastoplastic Damage Modeling of Concrete Subject to Blast Loading: Formulation and Computational Aspects

    DTIC Science & Technology

    1990-10-31

    Department of Civil Engineering & Operations Research, Princeton University, Princeton, NJ 08544. Final Technical Report, 31 October 1990. Prepared for the Air Force Office of Scientific Research, Building 410, Bolling AFB, D.C. 20332-6448. Approved for public release; distribution is unlimited.

  16. DEEP: Database of Energy Efficiency Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    A database of energy efficiency performance (DEEP) is a pre-simulated database that enables quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 10 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance, based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations were conducted on the supercomputers at the National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory. The pre-simulated database is part of a CEC PIER project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period, based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and the associated comprehensive measure analysis enhance the ability to assess the performance of retrofits to reduce energy use in small and medium-sized buildings, whose owners typically do not have the resources to conduct a costly building energy audit.

  17. The Amber Biomolecular Simulation Programs

    PubMed Central

    Case, David A.; Cheatham, Thomas E.; Darden, Tom; Gohlke, Holger; Luo, Ray; Merz, Kenneth M.; Onufriev, Alexey; Simmerling, Carlos; Wang, Bing; Woods, Robert J.

    2006-01-01

    We describe the development, current features, and some directions for future development of the Amber package of computer programs. This package evolved from a program that was constructed in the late 1970s to do Assisted Model Building with Energy Refinement, and now contains a group of programs embodying a number of powerful tools of modern computational chemistry, focused on molecular dynamics and free energy calculations of proteins, nucleic acids, and carbohydrates. PMID:16200636

  18. Computer Architecture for Energy Efficient SFQ

    DTIC Science & Technology

    2014-08-27

    IBM Corporation (T.J. Watson Research Laboratory), 1101 Kitchawan Road, Yorktown Heights, NY 10598. The abstract summarizes work accomplished during this ARO-sponsored project at IBM Research to identify and model an energy efficient SFQ-based computer architecture, IBM Windsor Blue (WB), illustrated schematically in Figure 2. The basic building block of WB is a "tile" comprised of a 64-bit arithmetic logic unit…

  19. The Power of Flexibility: Autonomous Agents That Conserve Energy in Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kwak, Jun-young

    Agent-based systems for energy conservation are now a growing area of research in multiagent systems, with applications ranging from energy management and control on the smart grid, to energy conservation in residential buildings, to energy generation and dynamic negotiations in distributed rural communities. Contributing to this area, my thesis presents new agent-based models and algorithms aiming to conserve energy in commercial buildings. More specifically, my thesis provides three sets of algorithmic contributions. First, I provide online predictive scheduling algorithms to handle massive numbers of meeting/event scheduling requests considering flexibility, a novel concept for capturing generic user constraints while optimizing the desired objective. Second, I present a novel BM-MDP (Bounded-parameter Multi-objective Markov Decision Problem) model and robust algorithms for multi-objective optimization under uncertainty at both planning and execution time. The BM-MDP model and its robust algorithms are useful in (re)scheduling events to achieve energy efficiency in the presence of uncertainty over users' preferences. Third, when multiple users contribute to energy savings, the fair division of credit for such savings, to incentivize users for their energy saving activities, arises as an important question. I appeal to cooperative game theory, and specifically to the concept of the Shapley value, for this fair division. Unfortunately, scaling up the Shapley value computation is a major hindrance in practice. Therefore, I present novel approximation algorithms to efficiently compute the Shapley value based on sampling and partitions, and to speed up the characteristic function computation. These new models have not only advanced the state of the art in multiagent algorithms, but have actually been successfully integrated within agents dedicated to energy efficiency: SAVES, TESLA and THINC.
SAVES focuses on the day-to-day energy consumption of individuals and groups in commercial buildings by reactively suggesting energy conserving alternatives. TESLA takes a long-range planning perspective and optimizes overall energy consumption of a large number of group events or meetings together. THINC provides an end-to-end integration within a single agent of energy efficient scheduling, rescheduling and credit allocation. While SAVES, TESLA and THINC thus differ in their scope and applicability, they demonstrate the utility of agent-based systems in actually reducing energy consumption in commercial buildings. I evaluate my algorithms and agents using extensive analysis on data from over 110,000 real meetings/events at multiple educational buildings including the main libraries at the University of Southern California. I also provide results on simulations and real-world experiments, clearly demonstrating the power of agent technology to assist human users in saving energy in commercial buildings.
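    The sampling-based Shapley approximation mentioned above can be sketched as a Monte Carlo average of marginal contributions over random orderings of the users; this is a generic sketch under assumed names, not the thesis's partition-based algorithm:

```python
import random

def shapley_sampling(players, char_fn, n_samples=2000, seed=0):
    """Estimate Shapley values by averaging each player's marginal
    contribution over randomly sampled orderings of the players.

    char_fn maps a frozenset of players to the coalition's value,
    e.g. the energy savings achieved by that group together."""
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_samples):
        order = list(players)
        rng.shuffle(order)
        coalition, prev = set(), 0.0
        for p in order:
            coalition.add(p)
            v = char_fn(frozenset(coalition))
            phi[p] += v - prev              # marginal contribution of p
            prev = v
    return {p: phi[p] / n_samples for p in players}
```

    For an additive game the estimate is exact; for general games the sampling error shrinks as `n_samples` grows.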

  20. Computerized Biomechanical Man-Model

    DTIC Science & Technology

    1976-07-01

    Air Force Systems Command, Wright-Patterson AFB, Ohio. The COMputerized BIomechanical MAN-Model (called COMBIMAN) is a computer interactive graphics system. The use of mock-ups for biomechanical evaluation has long been a design tool: the concept was to build a mock-up which permitted the designer to visualize the design, but a mock-up can become an obstacle to design change. At the Aerospace Medical Research Laboratory, we are developing a computerized biomechanical man-model…

  1. Collaborative modelling: the future of computational neuroscience?

    PubMed

    Davison, Andrew P

    2012-01-01

    Given the complexity of biological neural circuits and of their component cells and synapses, building and simulating robust, well-validated, detailed models increasingly surpasses the resources of an individual researcher or small research group. In this article, I will briefly review possible solutions to this problem, argue for open, collaborative modelling as the optimal solution for advancing neuroscience knowledge, and identify potential bottlenecks and possible solutions.

  2. Improved Air Combat Awareness with AESA and Next-Generation Signal Processing

    DTIC Science & Technology

    2002-09-01

    [Report front-matter and figure residue; recoverable keywords: competence network; building techniques; software development environment; communication; computer architecture; modeling; real-time programming; radar; skewed load and store memory access, 3.2 GB/s bandwidth; performance: 400 MFLOPS; custom runtime routines; driver routines; hardware.]

  3. Tutoring at a Distance: Modelling as a Tool to Control Chaos

    ERIC Educational Resources Information Center

    Bertin, Jean-Claude; Narcy-Combes, Jean-Paul

    2012-01-01

    This article builds on a previous article published in 2007, which aimed at clarifying the concept of tutoring. Based on a new epistemological stance (emergentism) the authors will here show how the various components of the computer-assisted language learning situation form a complex chaotic system. They advocate that modelling is a way of…

  4. A Model Based Framework for Semantic Interpretation of Architectural Construction Drawings

    ERIC Educational Resources Information Center

    Babalola, Olubi Oluyomi

    2011-01-01

    The study addresses the automated translation of architectural drawings from 2D Computer Aided Drafting (CAD) data into a Building Information Model (BIM), with emphasis on the nature, possible role, and limitations of a drafting language Knowledge Representation (KR) on the problem and process. The central idea is that CAD to BIM translation is a…

  5. Computer-Mediated Counter-Arguments and Individual Learning

    ERIC Educational Resources Information Center

    Hsu, Jack Shih-Chieh; Huang, Hsieh-Hong; Linden, Lars P.

    2011-01-01

    This study explores a de-bias function for a decision support system (DSS) that is designed to help a user avoid confirmation bias by increasing the user's learning opportunities. Grounded upon the theory of mental models, the use of a DSS is viewed as involving a learning process, whereby a user is directed to build mental models so as to reduce…

  6. Modeling and simulation of dense cloud dispersion in urban areas by means of computational fluid dynamics.

    PubMed

    Scargiali, F; Grisafi, F; Busciglio, A; Brucato, A

    2011-12-15

    The formation of toxic heavy clouds as a result of sudden accidental releases from mobile containers, such as road tankers or railway tank cars, may occur inside urban areas, so the problem of evaluating their consequences arises. Due to the semi-confined nature of the dispersion site, simplified models may often be inappropriate. As an alternative, computational fluid dynamics (CFD) has the potential to provide realistic simulations even for geometrically complex scenarios, since the heavy-gas dispersion process is described by basic conservation equations with a reduced number of approximations. In the present work a commercial general-purpose CFD code (CFX 4.4 by Ansys®) is employed for the simulation of dense cloud dispersion in urban areas. The simulation strategy proposed involves a stationary pre-release flow-field simulation followed by dynamic after-release flow and concentration field simulations. In order to generalize the results, the computational domain is modeled as a simple network of straight roads with regularly distributed blocks mimicking the buildings. Results show that the presence of buildings lowers concentration maxima and enlarges the side spread of the cloud. Dispersion dynamics is also found to be strongly affected by the quantity of heavy gas released.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Kristan D.; Faraj, Daniel A.

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  8. Infill Walls Contribution on the Progressive Collapse Resistance of a Typical Mid-rise RC Framed Building

    NASA Astrophysics Data System (ADS)

    Besoiu, Teodora; Popa, Anca

    2017-10-01

    This study investigates the effect of autoclaved aerated concrete infill walls on the progressive collapse resistance of a typical RC framed structure. The 13-storey building located in Brăila (a zone with high seismic risk in Romania) was designed according to the former Romanian seismic code P13-70 (1970). Two models of the structure are generated in the Extreme Loading® for Structures computer software: a model with infill walls and a model without infill walls. Following the GSA (2003) Guidelines, a nonlinear dynamic procedure is used to determine the progressive collapse risk of the building when a first-storey corner column is suddenly removed. It was found that the structure is not expected to fail under the standard GSA loading: DL+0.25LL. Moreover, if the infill walls are introduced in the model, the maximum vertical displacement of the node above the removed column is reduced by about 48%.

  9. New Directions for Hardware-assisted Trusted Computing Policies (Position Paper)

    NASA Astrophysics Data System (ADS)

    Bratus, Sergey; Locasto, Michael E.; Ramaswamy, Ashwin; Smith, Sean W.

    The basic technological building blocks of the TCG architecture seem to be stabilizing. As a result, we believe that the focus of the Trusted Computing (TC) discipline must naturally shift from the design and implementation of the hardware root of trust (and the subsequent trust chain) to the higher-level application policies. Such policies must build on these primitives to express new sets of security goals. We highlight the relationship between enforcing these types of policies and debugging, since both activities establish the link between expected and actual application behavior. We argue that this new class of policies better fits developers' mental models of expected application behaviors, and we suggest a hardware design direction for enabling the efficient interpretation of such policies.

  10. Building Fossils in the Elementary School and Writing about Them Using Computers.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.; Yoshida, Sarah

    This material describes a fossil-building activity using sea shells, chicken bones, and plaster for grade one through three students. Related process skills, vocabulary, computer principles, time requirements, and materials are listed. Two methods of building the fossils are discussed. After building the fossils, classes may be divided into pairs…

  11. Direct-coupled microcomputer-based building emulator for building energy management and control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lam, H.N.

    1999-07-01

    In this paper, the development and implementation of a direct-coupled building emulator for a building energy management and control system (EMCS) is presented. The building emulator consists of a microcomputer and a computer model of an air-conditioning system implemented in a modular dynamic simulation software package for direct coupling to an EMCS, without using analog-to-digital and digital-to-analog converters. The building emulator can be used to simulate in real time the behavior of the air-conditioning system under a given operating environment and subject to a given usage pattern. Software modules for data communication, graphical display, dynamic data exchange, and synchronization of simulation outputs with real time have been developed to achieve direct digital data transfer between the building emulator and a commercial EMCS. Based on the tests conducted, the validity of the building emulator has been established and the proportional-plus-integral control function of the EMCS assessed.

  12. Performance Modeling of an Experimental Laser Propelled Lightcraft

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.

    2000-01-01

    A computational plasma aerodynamics model is developed to study the performance of an experimental laser propelled lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured-grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels, and the simulated physics are discussed and compared with those of tests and the literature. The predicted coupling coefficients for the lightcraft compare reasonably well with those of tests conducted on a pendulum apparatus.

  13. Modelling Complex Fenestration Systems using physical and virtual models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanachareonkit, Anothai; Scartezzini, Jean-Louis

    2010-04-15

    Physical or virtual models are commonly used to visualize the conceptual ideas of architects, lighting designers and researchers; they are also employed to assess the daylighting performance of buildings, particularly in cases where Complex Fenestration Systems (CFS) are considered. Recent studies have however revealed a general tendency of physical models to over-estimate this performance compared to that of real buildings; these discrepancies can be attributed to several causes. In order to identify the main error sources, a series of comparisons between a real building (a single office room within a test module) and the corresponding physical and virtual models was undertaken. The physical model was placed in outdoor conditions, which were strictly identical to those of the real building, as well as underneath a scanning sky simulator. The virtual model simulations were carried out with the Radiance program using the GenSky function; an alternative evaluation method, named the Partial Daylight Factor method (PDF method), was also employed with the physical model, together with sky luminance distributions acquired by a digital sky scanner during the monitoring of the real building. The overall daylighting performance of the physical and virtual models was assessed and compared, and the causes of discrepancies between the daylighting performance of the real building and the models were analysed. The main identified sources of errors are the reproduction of building details, the CFS modelling, and the mocking-up of the geometrical and photometrical properties. To study the impact of these errors on daylighting performance assessment, computer simulation models created with the Radiance program were also used to carry out a sensitivity analysis of modelling errors. The study of the models showed that large discrepancies can occur in daylighting performance assessment. In the case of improper mocking-up of the glazing, for instance, relative divergences of 25-40% can be found in different room locations, suggesting that more light is entering than actually monitored in the real building. All these discrepancies can however be reduced by making an effort to carefully mock up the geometry and photometry of the real building. A synthesis is presented in this article which can be used as guidelines for daylighting designers to avoid or estimate errors during CFS daylighting performance assessment.

  14. Design and implementation of space physics multi-model application integration based on web

    NASA Astrophysics Data System (ADS)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, building an online network computing environment for space weather, space environment and space physics models has become increasingly important for the Chinese scientific community in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The traditional, stand-alone C/S mode demands that a team or workshop from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models for computation, and it makes data access inconvenient. Therefore, it is necessary to create a shared network resource environment that helps users quickly access the computing resources of space physics models through a terminal, for conducting space science research and forecasting the space environment. The SPMAIS is developed in B/S mode, based on high-performance, first-principles computational models of the space environment, and uses these models to predict "space weather," to understand space mission data, and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. To date, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, etc., developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for online high-speed model computing.
    In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of the physical independence of multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the system, and JSP + servlet + JavaBean technology is used to integrate the web application programs of the space physics models. This solves the problem of multiple users requesting the same model-computing job and effectively balances the computing tasks across servers. In addition, we also complete the following tasks: establishing a standard graphical user interface based on a Java applet; designing the interface between model computation and the visualization of model results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve interaction with a three-dimensional network scene; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering of three-dimensional graphics and control of fonts and color. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-model applications. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.

  15. A Spiking Neural Network Model of the Lateral Geniculate Nucleus on the SpiNNaker Machine

    PubMed Central

    Sen-Bhattacharya, Basabdatta; Serrano-Gotarredona, Teresa; Balassa, Lorinc; Bhattacharya, Akash; Stokes, Alan B.; Rowley, Andrew; Sugiarto, Indar; Furber, Steve

    2017-01-01

    We present a spiking neural network model of the thalamic Lateral Geniculate Nucleus (LGN) developed on SpiNNaker, which is a state-of-the-art digital neuromorphic hardware built with very-low-power ARM processors. The parallel, event-based data processing in SpiNNaker makes it viable for building massively parallel neuro-computational frameworks. The LGN model has 140 neurons representing a “basic building block” for larger modular architectures. The motivation of this work is to simulate biologically plausible LGN dynamics on SpiNNaker. Synaptic layout of the model is consistent with biology. The model response is validated with existing literature reporting entrainment in steady state visually evoked potentials (SSVEP)—brain oscillations corresponding to periodic visual stimuli recorded via electroencephalography (EEG). Periodic stimulus to the model is provided by: a synthetic spike-train with inter-spike-intervals in the range 10–50 Hz at a resolution of 1 Hz; and spike-train output from a state-of-the-art electronic retina subjected to a light emitting diode flashing at 10, 20, and 40 Hz, simulating real-world visual stimulus to the model. The resolution of simulation is 0.1 ms to ensure solution accuracy for the underlying differential equations defining Izhikevich's neuron model. Under this constraint, 1 s of model simulation time is executed in 10 s real time on SpiNNaker; this is because simulations on SpiNNaker work in real time for time-steps dt ⩾ 1 ms. The model output shows entrainment with both sets of input and contains harmonic components of the fundamental frequency. However, suppressing the feed-forward inhibition in the circuit produces subharmonics within the gamma band (>30 Hz) implying a reduced information transmission fidelity. These model predictions agree with recent lumped-parameter computational model-based predictions, using conventional computers. 
Scalability of the framework is demonstrated by a multi-node architecture consisting of three “nodes,” where each node is the “basic building block” LGN model. This 420 neuron model is tested with synthetic periodic stimulus at 10 Hz to all the nodes. The model output is the average of the outputs from all nodes, and conforms to the above-mentioned predictions of each node. Power consumption for model simulation on SpiNNaker is ≪1 W. PMID:28848380

  16. A Spiking Neural Network Model of the Lateral Geniculate Nucleus on the SpiNNaker Machine.

    PubMed

    Sen-Bhattacharya, Basabdatta; Serrano-Gotarredona, Teresa; Balassa, Lorinc; Bhattacharya, Akash; Stokes, Alan B; Rowley, Andrew; Sugiarto, Indar; Furber, Steve

    2017-01-01

    We present a spiking neural network model of the thalamic Lateral Geniculate Nucleus (LGN) developed on SpiNNaker, which is a state-of-the-art digital neuromorphic hardware built with very-low-power ARM processors. The parallel, event-based data processing in SpiNNaker makes it viable for building massively parallel neuro-computational frameworks. The LGN model has 140 neurons representing a "basic building block" for larger modular architectures. The motivation of this work is to simulate biologically plausible LGN dynamics on SpiNNaker. Synaptic layout of the model is consistent with biology. The model response is validated with existing literature reporting entrainment in steady state visually evoked potentials (SSVEP)-brain oscillations corresponding to periodic visual stimuli recorded via electroencephalography (EEG). Periodic stimulus to the model is provided by: a synthetic spike-train with inter-spike-intervals in the range 10-50 Hz at a resolution of 1 Hz; and spike-train output from a state-of-the-art electronic retina subjected to a light emitting diode flashing at 10, 20, and 40 Hz, simulating real-world visual stimulus to the model. The resolution of simulation is 0.1 ms to ensure solution accuracy for the underlying differential equations defining Izhikevich's neuron model. Under this constraint, 1 s of model simulation time is executed in 10 s real time on SpiNNaker; this is because simulations on SpiNNaker work in real time for time-steps dt ⩾ 1 ms. The model output shows entrainment with both sets of input and contains harmonic components of the fundamental frequency. However, suppressing the feed-forward inhibition in the circuit produces subharmonics within the gamma band (>30 Hz) implying a reduced information transmission fidelity. These model predictions agree with recent lumped-parameter computational model-based predictions, using conventional computers. 
    Scalability of the framework is demonstrated by a multi-node architecture consisting of three "nodes," where each node is the "basic building block" LGN model. This 420-neuron model is tested with synthetic periodic stimulus at 10 Hz to all the nodes. The model output is the average of the outputs from all nodes and conforms to the above-mentioned predictions for each node. Power consumption for model simulation on SpiNNaker is ≪1 W.
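
    The neuron dynamics underlying the abstract above can be sketched with a simple forward-Euler integration of the Izhikevich model at the 0.1 ms time-step mentioned; this is a minimal illustration, not the SpiNNaker implementation, and the input current and the standard regular-spiking parameters a, b, c, d are assumptions rather than the paper's LGN parameters.

```python
def izhikevich(I=10.0, dt=0.1, t_max=1000.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron for t_max ms; return spike times in ms."""
    v, u = c, b * c                       # membrane potential and recovery variable
    spikes = []
    for step in range(int(t_max / dt)):
        # Izhikevich (2003): dv/dt = 0.04 v^2 + 5 v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                     # spike threshold, then reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spike_times = izhikevich()
print(len(spike_times))  # spike count over 1 s of simulated time
```

    With the 0.1 ms step the integration stays well-behaved for this parameter set; a 1 ms step can visibly distort the spike shape, which is the accuracy concern the abstract alludes to.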

  17. Agent-based model to rural urban migration analysis

    NASA Astrophysics Data System (ADS)

    Silveira, Jaylson J.; Espíndola, Aquino L.; Penna, T. J. P.

    2006-05-01

    In this paper, we analyze the rural-urban migration phenomenon as it is usually observed in economies in the early stages of industrialization. The analysis is conducted by means of a statistical mechanics approach that builds a computational agent-based model. Agents are placed on a lattice and the connections among them are described via an Ising-like model. Simulations of this computational model show emergent properties that are common in developing economies, such as transitional dynamics characterized by continuous growth of urban population, followed by the equalization of expected wages between rural and urban sectors (the Harris-Todaro equilibrium condition), urban concentration, and increasing per capita income.
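
    An Ising-like agent update of the kind described can be sketched as follows. The lattice size, the wage-bias term, and the noise model here are our own illustrative assumptions, not the authors' specification: each agent holds a sector choice s = +1 (urban) or -1 (rural), imitates its neighbours, and is biased by an urban-rural wage differential.

```python
import random

def step(lattice, wage_gap=0.5, noise=0.1):
    """Synchronous update: neighbour imitation plus a wage-differential bias."""
    n = len(lattice)
    new = [row[:] for row in lattice]
    for i in range(n):
        for j in range(n):
            # local field: average of the four neighbours (periodic boundary)
            field = (lattice[(i - 1) % n][j] + lattice[(i + 1) % n][j] +
                     lattice[i][(j - 1) % n] + lattice[i][(j + 1) % n]) / 4.0
            field += wage_gap            # expected urban wage premium
            new[i][j] = 1 if field + random.uniform(-noise, noise) > 0 else -1
    return new

random.seed(0)
n = 20
lattice = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
for _ in range(30):
    lattice = step(lattice)
urban_share = sum(s == 1 for row in lattice for s in row) / n ** 2
print(urban_share)  # urban share grows from ~0.5 toward an equilibrium
```

    A positive wage gap drives the transitional dynamics described in the abstract: the urban share rises until rural clusters that remain are locally stable.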

  18. Exact computation of the maximum-entropy potential of spiking neural-network models.

    PubMed

    Cofré, R; Cessac, B

    2014-05-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.
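
    The pairwise maximum-entropy form referenced above can be made concrete for a toy network. This is an illustrative brute-force enumeration of P(s) proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j), not the paper's analytical mapping; the field and coupling values are arbitrary assumptions.

```python
from itertools import product
from math import exp

def maxent_probs(h, J):
    """Pairwise maximum-entropy distribution over binary spike patterns."""
    n = len(h)
    weights = {}
    for s in product([0, 1], repeat=n):
        # energy of pattern s under fields h_i and couplings J_ij (i < j)
        energy = sum(h[i] * s[i] for i in range(n))
        energy += sum(J[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(i + 1, n))
        weights[s] = exp(energy)
    Z = sum(weights.values())            # partition function
    return {s: w / Z for s, w in weights.items()}

probs = maxent_probs(h=[0.2, -0.1, 0.0],
                     J=[[0, 0.5, 0.0], [0, 0, -0.3], [0, 0, 0]])
print(sum(probs.values()))  # probabilities over all 2^3 patterns sum to 1
```

    Exhaustive enumeration only scales to a handful of neurons; the point of the paper's analytical mapping is precisely to relate these fitted parameters to mechanistic (neuromimetic) quantities rather than to estimate them this way.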

  19. The algorithmic anatomy of model-based evaluation

    PubMed Central

    Daw, Nathaniel D.; Dayan, Peter

    2014-01-01

    Despite many debates in the first half of the twentieth century, it is now largely a truism that humans and other animals build models of their environments and use them for prediction and control. However, model-based (MB) reasoning presents severe computational challenges. Alternative, computationally simpler, model-free (MF) schemes have been suggested in the reinforcement learning literature, and have afforded influential accounts of behavioural and neural data. Here, we study the realization of MB calculations, and the ways that this might be woven together with MF values and evaluation methods. There are as yet mostly only hints in the literature as to the resulting tapestry, so we offer more preview than review. PMID:25267820

  20. BIPV: a real-time building performance study for a roof-integrated facility

    NASA Astrophysics Data System (ADS)

    Aaditya, Gayathri; Mani, Monto

    2018-03-01

    Building integrated photovoltaic system (BIPV) is a photovoltaic (PV) integration that generates energy and serves as a building envelope. A building element (e.g. roof and wall) is based on its functional performance, which could include structure, durability, maintenance, weathering, thermal insulation, acoustics, and so on. The present paper discusses the suitability of PV as a building element in terms of thermal performance based on a case study of a 5.25 kWp roof-integrated BIPV system in tropical regions. Performance of PV has been compared with conventional construction materials and various scenarios have been simulated to understand the impact on occupant comfort levels. In the current case study, PV as a roofing material has been shown to cause significant thermal discomfort to the occupants. The study has been based on real-time data monitoring supported by computer-based building simulation model.

  1. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for the storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  2. Economic analysis of wind-powered farmhouse and farm building heating systems

    NASA Astrophysics Data System (ADS)

    Stafford, R. W.; Greeb, F. J.; Smith, M. H.; Deschenes, C.; Weaver, N. L.

    1981-01-01

    The break-even values of wind energy for selected farmhouses and farm buildings were evaluated, focusing on the effects of thermal storage on the use of WECS production. Farmhouse structural models include three types derived from a national survey: an older structure, a more modern structure, and a passive solar structure. The eight farm building applications include: (1) poultry layers; (2) poultry brooding/layers; (3) poultry broilers; (4) poultry turkeys; (5) swine farrowing; (6) swine growing/finishing; (7) dairy; and (8) lambing. The farm buildings represent the spectrum of animal types, heating energy use, and major contributions to national agricultural economic values. All energy analyses are based on hour-by-hour computations which allow for growth of animals, sensible and latent heat production, and ventilation requirements.
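
    In the spirit of the hour-by-hour computations described, here is a minimal hourly heating-demand sketch. The UA value, setpoint, and temperature series are illustrative assumptions; the study's models additionally account for animal heat gains and ventilation.

```python
def heating_demand(temps_c, setpoint=18.0, ua_w_per_k=250.0):
    """Return heating energy in kWh for an hourly outdoor-temperature series.

    Each hour contributes UA * (setpoint - T_out) watts when T_out is below
    the setpoint, i.e. a simple conductive-loss (degree-hour) model.
    """
    watts_hours = sum(max(setpoint - t, 0.0) * ua_w_per_k for t in temps_c)
    return watts_hours / 1000.0          # W-h -> kWh

# one 24-hour outdoor temperature profile, degrees C (illustrative)
day = [2, 1, 0, 0, 1, 3, 5, 8, 11, 13, 15, 16,
       17, 17, 16, 14, 12, 10, 8, 6, 5, 4, 3, 2]
print(heating_demand(day))  # kWh of heating needed for this day
```

    Summing such a model over a year, with storage shifting when wind energy can serve the load, is the kind of calculation a break-even analysis builds on.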

  3. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    PubMed

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven by not only geospatial problems in numerous fields, but also emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such an enormous amount of data with fast response, which is faced with two major challenges: the "big data" challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index-building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
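
    The partition-merge strategy can be sketched in miniature. The grid size, the halo-replication scheme, and the toy coordinates below are our own assumptions, not the paper's MapReduce implementation: points are hashed to grid tiles (the "map" side), a distance join runs inside each tile, and duplicate pairs produced by tiles whose halos overlap are merged away (the "reduce" side).

```python
from collections import defaultdict
from itertools import combinations

def grid_join(points, dist, tile=10.0):
    """All pairs of points within `dist`, computed tile-by-tile."""
    # map phase: replicate each point into every tile its dist-halo touches
    tiles = defaultdict(list)
    for (x, y) in points:
        for tx in range(int((x - dist) // tile), int((x + dist) // tile) + 1):
            for ty in range(int((y - dist) // tile), int((y + dist) // tile) + 1):
                tiles[(tx, ty)].append((x, y))
    # reduce phase: join within each tile; the set() merges duplicates
    result = set()
    for pts in tiles.values():
        for a, b in combinations(sorted(pts), 2):
            if (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= dist ** 2:
                result.add((a, b))
    return result

pairs = grid_join([(1, 1), (2, 2), (25, 25), (26, 25)], dist=2.0)
print(pairs)  # only the two nearby pairs survive the join
```

    Tiling keeps each join local (and hence parallelizable); the cost-based optimization the paper proposes addresses what this sketch ignores, namely tiles with heavily skewed object counts.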

  4. Building a virtual network in a community health research training program.

    PubMed

    Lau, F; Hayward, R

    2000-01-01

    To describe the experiences, lessons, and implications of building a virtual network as part of a two-year community health research training program in a Canadian province. An action research field study in which 25 health professionals from 17 health regions participated in a seven-week training course on health policy, management, economics, research methods, data analysis, and computer technology. The participants then returned to their regions to apply the knowledge in different community health research projects. Ongoing faculty consultations and support were provided as needed. Each participant was given a notebook computer with the necessary software, Internet access, and technical support for two years, to access information resources, engage in group problem solving, share ideas and knowledge, and collaborate on projects. Data collected over two years consisted of program documents, records of interviews with participants and staff, meeting notes, computer usage statistics, automated online surveys, computer conference postings, program Web site, and course feedback. The analysis consisted of detailed review and comparison of the data from different sources. NUD*IST was then used to validate earlier study findings. The ten key lessons are that role clarity, technology vision, implementation staging, protected time, just-in-time training, ongoing facilitation, work integration, participatory design, relationship building, and the demonstration of results are essential ingredients for building a successful network. This study provides a descriptive model of the processes involved in developing, in the community health setting, virtual networks that can be used as the basis for future research and as a practical guide for managers.

  5. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data

    PubMed Central

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H.

    2013-01-01

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven by not only geospatial problems in numerous fields, but also emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such an enormous amount of data with fast response, which is faced with two major challenges: the “big data” challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index-building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce. PMID:24501719

  6. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  7. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models.
    The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  8. Modeling resident error-making patterns in detection of mammographic masses using computer-extracted image features: preliminary experiments

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora

    2014-03-01

    Providing high quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error. We proposed that these models be used to identify the most difficult and therefore the most educationally useful cases for each trainee. In this study, as a next step in our research, we propose to build trainee models that utilize features that are automatically extracted from images using computer vision algorithms. To predict error, we used a logistic regression which accepts imaging features as input and returns error as output. Reader data from 3 experts and 3 trainees were used. Receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems since computer-extracted features will allow for faster and more extensive search of imaging databases in order to identify the most educationally beneficial cases.
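
    A minimal version of the modeling step above can be sketched as follows. The toy features, error outcomes, and gradient-descent fitting are illustrative assumptions; the study's actual feature set and fitting procedure are not reproduced here. A logistic model maps image features to an error probability, and the ranking quality is scored with the area under the ROC curve.

```python
from math import exp

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights [bias, w1, ...] for P(error) = sigmoid(w . x) by SGD."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + exp(-z))
            w[0] += lr * (yi - p)                 # gradient ascent on likelihood
            for j, xj in enumerate(xi):
                w[j + 1] += lr * (yi - p) * xj
    return w

def auc(scores, labels):
    """Probability a random error case outscores a random non-error case."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy computer-extracted features and error outcomes (hypothetical)
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7], [0.4, 0.5], [0.7, 0.8]]
y = [0, 0, 1, 1, 0, 1]
w = fit_logistic(X, y)
scores = [w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)) for xi in X]
print(auc(scores, y))  # AUC above 0.5 means better-than-chance error prediction
```

    "Better than chance" in the abstract corresponds to an AUC reliably above 0.5 on held-out reader data.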

  9. Using regression equations built from summary data in the psychological assessment of the individual case: extension to multiple regression.

    PubMed

    Crawford, John R; Garthwaite, Paul H; Denham, Annie K; Chelune, Gordon J

    2012-12-01

    Regression equations have many useful roles in psychological assessment. Moreover, there is a large reservoir of published data that could be used to build regression equations; these equations could then be employed to test a wide variety of hypotheses concerning the functioning of individual cases. This resource is currently underused because (a) not all psychologists are aware that regression equations can be built not only from raw data but also using only basic summary data for a sample, and (b) the computations involved are tedious and prone to error. In an attempt to overcome these barriers, Crawford and Garthwaite (2007) provided methods to build and apply simple linear regression models using summary statistics as data. In the present study, we extend this work to set out the steps required to build multiple regression models from sample summary statistics and the further steps required to compute the associated statistics for drawing inferences concerning an individual case. We also develop, describe, and make available a computer program that implements these methods. Although there are caveats associated with the use of the methods, these need to be balanced against pragmatic considerations and against the alternative of either entirely ignoring a pertinent data set or using it informally to provide a clinical "guesstimate." Upgraded versions of earlier programs for regression in the single case are also provided; these add the point and interval estimates of effect size developed in the present article.
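
    For the simple one-predictor case covered by Crawford and Garthwaite (2007), the core idea can be sketched directly; the summary statistics below are invented for illustration. A regression equation is recoverable from published means, standard deviations, and the predictor-criterion correlation alone, with no raw data.

```python
def regression_from_summary(mean_x, sd_x, mean_y, sd_y, r_xy):
    """Return (intercept, slope) of the least-squares line y = a + b*x,
    built only from sample summary statistics."""
    b = r_xy * sd_y / sd_x           # slope from the correlation and SD ratio
    a = mean_y - b * mean_x          # the line passes through (mean_x, mean_y)
    return a, b

# e.g. predict a hypothetical test score from age using only summary data
a, b = regression_from_summary(mean_x=45.0, sd_x=12.0,
                               mean_y=100.0, sd_y=15.0, r_xy=-0.4)
predicted = a + b * 60.0             # predicted score for a 60-year-old case
print(round(predicted, 2))           # -> 92.5
```

    The multiple-predictor extension the article develops additionally needs the predictor intercorrelations, and the accompanying program also computes the interval estimates required for single-case inference.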

  10. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  11. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  12. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  13. Testing of SWMM Model’s LID Modules

    EPA Science Inventory

    EPA’s Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance to design and build multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970’s, EPA a...

  14. 49 CFR 229.211 - Processing of petitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to the U.S. Department of Transportation Docket Operations (M-30), West Building Ground Floor, Room... requires additional information to appropriately consider the petition, FRA will conduct a hearing on the..., or both which may include validated computer modeling, structural crush analysis, component testing...

  15. 49 CFR 229.211 - Processing of petitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to the U.S. Department of Transportation Docket Operations (M-30), West Building Ground Floor, Room... requires additional information to appropriately consider the petition, FRA will conduct a hearing on the..., or both which may include validated computer modeling, structural crush analysis, component testing...

  16. 49 CFR 229.211 - Processing of petitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to the U.S. Department of Transportation Docket Operations (M-30), West Building Ground Floor, Room... requires additional information to appropriately consider the petition, FRA will conduct a hearing on the..., or both which may include validated computer modeling, structural crush analysis, component testing...

  17. Thermal Destruction Of CB Contaminants Bound On Building ...

    EPA Pesticide Factsheets

    Symposium Paper. An experimental and theoretical program has been initiated by the U.S. EPA to investigate issues of chemical/biological agent destruction in incineration systems when the agent in question is bound on common porous building interior materials. This program includes 3-dimensional computational fluid dynamics modeling with matrix-bound agent destruction kinetics, bench-scale experiments to determine agent destruction kinetics while bound on various matrices, and pilot-scale experiments to scale-up the bench-scale experiments to a more practical scale. Finally, model predictions are made to predict agent destruction and combustion conditions in two full-scale incineration systems that are typical of modern combustor design.

  18. Electric Conduction in Solids: a Pedagogical Approach Supported by Laboratory Measurements and Computer Modelling Environments

    NASA Astrophysics Data System (ADS)

    Bonura, A.; Capizzo, M. C.; Fazio, C.; Guastella, I.

    2008-05-01

    In this paper we present a pedagogic approach aimed at modeling electric conduction in semiconductors, built using NetLogo, a programmable modeling environment for building and exploring multi-agent systems. "Virtual experiments" are implemented to confront the predictions of different microscopic models with real measurements of the electric properties of matter, such as resistivity. The relations between these electric properties and other physical variables, like temperature, are then analyzed.
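
    One microscopic model that such virtual experiments can confront with measurement is the free-electron (Drude) picture. This is a minimal calculation with textbook parameter values for copper taken as assumptions; it is not drawn from the paper's NetLogo models.

```python
def drude_resistivity(n, tau, m=9.109e-31, e=1.602e-19):
    """Drude resistivity rho = m / (n e^2 tau), in ohm-metres.

    n   -- carrier density (electrons per m^3)
    tau -- mean time between collisions (s)
    """
    return m / (n * e * e * tau)

# approximate textbook values for copper at room temperature
rho_cu = drude_resistivity(n=8.5e28, tau=2.5e-14)
print(rho_cu)  # on the order of 1.7e-8 ohm-m, near copper's measured value
```

    Because tau falls as temperature rises (more phonon scattering), the same formula also reproduces the qualitative resistivity-temperature relation the abstract mentions.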

  19. QUIC Transport and Dispersion Modeling of Vehicle Emissions in Cities for Better Public Health Assessments

    PubMed Central

    Brown, Michael J.; Williams, Michael D.; Nelson, Matthew A.; Werley, Kenneth A.

    2015-01-01

    The Quick Urban and Industrial Complex (QUIC) plume modeling system is used to explore how the transport and dispersion of vehicle emissions in cities are impacted by the presence of buildings. Using downtown Philadelphia as a test case, notional vehicle emissions of gases and particles are specified as line source releases on a subset of the east–west and north–south streets. Cases were run in flat terrain and with 3D buildings present in order to show the differences in the model-computed outdoor concentration fields with and without buildings present. The QUIC calculations show that buildings result in regions with much higher concentrations and other areas with much lower concentrations when compared to the flat-earth case. On the roads with vehicle emissions, street-level concentrations were up to a factor of 10 higher when buildings were on either side of the street as compared to the flat-earth case due to trapping of pollutants between buildings. However, on roads without vehicle emissions and in other open areas, the concentrations were up to a factor of 100 smaller as compared to the flat-earth case because of vertical mixing of the vehicle emissions to building height in the cavity circulation that develops on the downwind side of unsheltered buildings. QUIC was also used to calculate infiltration of the contaminant into the buildings. Indoor concentration levels were found to be much lower than outdoor concentrations because of deposition onto indoor surfaces and particulate capture for buildings with filtration systems. Large differences in indoor concentrations from building to building resulted from differences in leakiness, air handling unit volume exchange rates, and filter type, and, for naturally ventilated buildings, whether or not the building was sheltered from the prevailing wind by a building immediately upwind. PMID:27867300

  20. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    Fast computation is critical for immersive engagement with and learning from energy simulations, and would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
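
    The reduced-form idea can be sketched as follows. The stand-in "expensive" model and the quadratic least-squares surrogate are our own assumptions, not NREL's machine-learning method: a handful of full-model runs trains a cheap approximation that can then be queried in real time.

```python
def expensive_simulation(x):
    """Stand-in for a minutes-long full energy-model run."""
    return 3.0 + 2.0 * x + 0.5 * x * x

def polyfit2(xs, ys):
    """Least-squares quadratic fit c0 + c1*x + c2*x^2 via the normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]            # moments of x
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    M = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    for i in range(3):                                         # Gauss-Jordan solve
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]
        for j in range(3):
            if j != i:
                M[j] = [vj - M[j][i] * vi for vj, vi in zip(M[j], M[i])]
    return [row[3] for row in M]

xs = [float(i) for i in range(11)]           # 11 full-model training runs
ys = [expensive_simulation(x) for x in xs]
c0, c1, c2 = polyfit2(xs, ys)

def surrogate(x):                            # near-instant reduced-form model
    return c0 + c1 * x + c2 * x * x

print(abs(surrogate(4.5) - expensive_simulation(4.5)))  # fit error is tiny
```

    Real surrogates add the uncertainty quantification the abstract describes, so that each prediction carries an error estimate and a stated domain of validity.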

  1. A walk through the planned CS building. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Khorramabadi, Delnaz

    1991-01-01

    Using the architectural plan views of our future computer science building as test objects, we have completed the first stage of a building walkthrough system. The inputs to our system are AutoCAD files. An AutoCAD converter translates the geometrical information in these files into a format suitable for 3D rendering. Major model errors, such as incorrect polygon intersections and random face orientations, are detected and fixed automatically. Interactive viewing and editing tools are provided to view the results, to modify and clean the model, and to change surface attributes. Our display system provides a simple-to-use interface for interactive exploration of buildings. Using only the mouse buttons, the user can move inside and outside the building and change floors. Several viewing and rendering options are provided, such as restricting the viewing frustum, avoiding wall collisions, and selecting different rendering algorithms. A plan view of the current floor, with the position of the eye point and viewing direction on it, is displayed at all times. The scene illumination can be manipulated by interactively controlling intensity values for 5 light sources.

  2. 3-D Object Recognition from Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Smith, W.; Walker, A. S.; Zhang, B.

    2011-09-01

    The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. 
Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.
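
    The early steps of the pipeline described above (subtract the DEM from the DSM, threshold the height difference, group above-ground cells into connected regions, and filter them by building-size limits) can be sketched for a gridded surface. This is an illustrative sketch under assumed thresholds and 4-connectivity, not the production algorithm from the abstract:

```python
def extract_object_regions(dsm, dem, min_height=2.5, min_area=20.0, cell_area=1.0):
    """Group above-ground grid cells into candidate object regions.

    Illustrative sketch only: thresholds and 4-connectivity are assumptions,
    and the real pipeline goes on to separate trees from buildings, trace
    boundaries, and regularize polygons.
    """
    rows, cols = len(dsm), len(dsm[0])
    # normalized height above bare earth: nDSM = DSM - DEM
    mask = [[dsm[r][c] - dem[r][c] > min_height for c in range(cols)]
            for r in range(rows)]
    labels = [[0] * cols for _ in range(rows)]
    regions, label = {}, 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                label += 1
                stack, cells = [(r, c)], []
                labels[r][c] = label
                while stack:                      # flood-fill one region
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = label
                            stack.append((ny, nx))
                # keep only regions large enough to be buildings
                if len(cells) * cell_area >= min_area:
                    regions[label] = cells
    return regions
```

    A region that passes the size filter would then be handed to the boundary-tracing and squaring stages the abstract mentions.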

  3. Additional and revised thermochemical data and computer code for WATEQ2: a computerized chemical model for trace and major element speciation and mineral equilibria of natural waters

    USGS Publications Warehouse

    Ball, James W.; Nordstrom, D. Kirk; Jenne, Everett A.

    1980-01-01

    A computerized chemical model, WATEQ2, has resulted from extensive additions to and revision of the WATEQ model of Truesdell and Jones (Truesdell, A. H., and Jones, B. F., 1974, WATEQ, a computer program for calculating chemical equilibria of natural waters: J. Res. U.S. Geol. Survey, v. 2, p. 233-274). The model-building effort necessitated searching the literature and selecting thermochemical data pertinent to the reactions added to the model. This supplementary report makes available the details of the reactions added to the model together with the selected thermochemical data and their sources. Also listed are details of program operation and a brief description of the output of the model. Appendices contain a glossary of identifiers used in the PL/1 computer code, the complete PL/1 listing, and sample output from three water analyses used as test cases.
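
    The kind of speciation calculation WATEQ2 performs can be illustrated on the carbonate system alone. The sketch below uses approximate 25 degC equilibrium constants from the general literature; WATEQ2 itself handles many more elements, complexes, and activity corrections from its own compiled thermochemical database:

```python
# Approximate mass-action constants at 25 degC (illustrative values, not
# the WATEQ2 database):
K1 = 10 ** -6.35   # H2CO3* = H+ + HCO3-
K2 = 10 ** -10.33  # HCO3-  = H+ + CO3--

def carbonate_speciation(total_C, pH):
    """Distribute total dissolved carbonate among H2CO3*, HCO3- and CO3--
    using the two mass-action equations; concentrations in mol/L."""
    H = 10 ** -pH
    # ionization fractions follow from substituting the mass-action laws
    # into the mass balance total_C = [H2CO3*] + [HCO3-] + [CO3--]
    denom = 1 + K1 / H + K1 * K2 / (H * H)
    h2co3 = total_C / denom
    hco3 = h2co3 * K1 / H
    co3 = hco3 * K2 / H
    return h2co3, hco3, co3
```

    At pH = pK1 the first two species are equal, a quick sanity check on any speciation code.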

  4. The European computer model for optronic system performance prediction (ECOMOS)

    NASA Astrophysics Data System (ADS)

    Keßler, Stefan; Bijl, Piet; Labarre, Luc; Repasi, Endre; Wittenstein, Wolfgang; Bürsing, Helge

    2017-10-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the visible or thermal infrared (IR). The project involves close co-operation of defence and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses and combines well-accepted existing European tools to build up a strong competitive position. These include two TA models: the analytical TRM4 model and the image-based TOD model. In addition, it uses the atmosphere model MATISSE. In this paper, the central idea of ECOMOS is presented. The overall software structure and the underlying models are shown and elucidated. The status of the project development is given, as well as a short discussion of validation tests and an outlook on the future potential of simulation for sensor assessment.

  5. Why Things Are So Bad for the Computer-Naive User

    DTIC Science & Technology

    1975-03-01

    knowledge of human communication that we need. Many of the things that people do in communication, including the entire list indicated above, are not...making direct use of computers feasible and comfortable for broad classes of people, research in modeling human communication processes deserves a far...higher national priority than it currently has. There are a few active research projects that are building the right kinds of models of human

  6. Metering Best Practices Applied in the National Renewable Energy Laboratory's Research Support Facility: A Primer to the 2011 Measured and Modeled Energy Consumption Datasets

    DOE Data Explorer

    Sheppy, Michael; Beach, A.; Pless, Shanti

    2016-08-09

    Modern buildings are complex energy systems that must be controlled for energy efficiency. The Research Support Facility (RSF) at the National Renewable Energy Laboratory (NREL) has hundreds of controllers -- computers that communicate with the building's various control systems -- to control the building based on tens of thousands of variables and sensor points. These control strategies were designed for the RSF's systems to efficiently support research activities. Many events that affect energy use cannot be reliably predicted, but certain decisions (such as control strategies) must be made ahead of time. NREL researchers modeled the RSF systems to predict how they might perform. They then monitor these systems to understand how they are actually performing and reacting to the dynamic conditions of weather, occupancy, and maintenance.

  7. BIM-Based Timber Structures Refurbishment of the Immovable Heritage Listed Buildings

    NASA Astrophysics Data System (ADS)

    Henek, Vladan; Venkrbec, Václav

    2017-12-01

    The use of Building Information Model (BIM) design tools is no longer an exception but common practice. The benefits of using BIM to design new buildings or complex renovations have already been repeatedly published. The essence of BIM is to create a multidimensional geometric model of a planned building electronically on a computer, supplemented with the necessary information, in advance of the construction process. Refurbishment is a specific process that combines both new structures and demolished structures, or structures that need to be dismantled, repaired, and then returned to their original position. Often these are historically valuable parts of the building. BIM-based repairs and refurbishments of constructions, especially complicated repairs of the roof trusses of immovable heritage listed buildings, have not yet been credibly presented. However, the use of BIM tools may be advantageous in this area, because the user can respond quickly to changes that may become necessary during refurbishment, and can also rapidly assess and estimate the cost of any unexpected additional works. The paper deals with the use of BIM in the field of repairs and refurbishment of buildings in general, with priority given to heritage-protected elements. The advantage of the proposed approach is demonstrated in a case study of the refurbishment of an immovable heritage listed truss roof; the refurbishment examined in this study was carried out in the Czech Republic. The case study consists of 3D-modelled truss parts and the connected technological workflow base. The project work was carried out in one common model environment.

  8. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness in building intelligent systems that are flexible and robust. Although recent research has shown that certain classes of neuro-fuzzy controllers can be proven bounded and stable, these results are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundations as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results of its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have helped conventional control design and validation.

  9. The Development of a Virtual McKenna Military Operations in Urban Terrain (MOUT) Site for Command, Control, Communication, Computing, Intelligence, Surveillance, and Reconnaissance (C4ISR) Studies

    DTIC Science & Technology

    2007-06-01

    4.2 Creating the Skybox and Terrain Model ... 4.3 Creating New Textures ... Skybox and Terrain Model: The next step was to build a sky box. Since it already resided in Raven Shield, the creation of the sky box was limited to

  10. The Future Is 3D

    ERIC Educational Resources Information Center

    Carter, Luke

    2015-01-01

    3D printers are a way of producing a 3D model of an item from a digital file. The model is built up in successive layers of material placed by the printer, which is controlled by the information in the computer file. In this article the author argues that 3D printers are one of the greatest technological advances of recent times. He discusses practical uses…

  11. Integrative Approach for Computationally Inferring Interactions between the Alpha and Beta Subunits of the Calcium-Activated Potassium Channel (BK): a Docking Study

    PubMed Central

    González, Janneth; Gálvez, Angela; Morales, Ludis; Barreto, George E.; Capani, Francisco; Sierra, Omar; Torres, Yolima

    2013-01-01

    Three-dimensional models of the alpha and beta-1 subunits of the calcium-activated potassium channel (BK) were predicted by threading modeling. A recursive approach comprising sequence alignment and model building based on three templates was used to build these models, with the refinement of non-conserved regions carried out using threading techniques. The complex formed by the subunits was studied by means of docking techniques, using the 3D models of the two subunits and an approach based on rigid-body structures. Structural effects of the complex were analyzed with respect to hydrogen-bond interactions and binding-energy calculations. Potential interaction sites of the complex were determined with reference to a study of the difference accessible surface area (DASA) of the protein subunits in the complex. PMID:23492851

  12. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. 
Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.

  13. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. 
Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315

  14. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  15. Towards a Framework for Evolvable Network Design

    NASA Astrophysics Data System (ADS)

    Hassan, Hoda; Eltarras, Ramy; Eltoweissy, Mohamed

    The layered Internet architecture that had long guided network design and protocol engineering was an “interconnection architecture” defining a framework for interconnecting networks rather than a model for generic network structuring and engineering. We claim that the approach of abstracting the network in terms of an internetwork hinders a thorough understanding of the network's salient characteristics and emergent behavior, impeding the design evolution required to address extreme scale, heterogeneity, and complexity. This paper reports on our work in progress that aims to: 1) investigate the problem space in terms of the factors and decisions that influenced the design and development of computer networks; 2) sketch the core principles for designing complex computer networks; and 3) propose a model and related framework for building evolvable, adaptable and self-organizing networks. We will adopt a bottom-up strategy primarily focusing on the building unit of the network model, which we call the “network cell”. The model is inspired by natural complex systems. A network cell is intrinsically capable of specialization, adaptation and evolution. Subsequently, we propose CellNet, a framework for evolvable network design. We outline scenarios for using the CellNet framework to enhance the legacy Internet protocol stack.

  16. Electron-Ion Dynamics with Time-Dependent Density Functional Theory: Towards Predictive Solar Cell Modeling: Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maitra, Neepa

    2016-07-14

    This project investigates the accuracy of currently-used functionals in time-dependent density functional theory, which is today routinely used to predict and design materials and computationally model processes in solar energy conversion. The rigorously-based electron-ion dynamics method developed here sheds light on traditional methods and overcomes challenges those methods have. The fundamental research undertaken here is important for building reliable and practical methods for materials discovery. The ultimate goal is to use these tools for the computational design of new materials for solar cell devices of high efficiency.

  17. Estimating average growth trajectories in shape-space using kernel smoothing.

    PubMed

    Hutton, Tim J; Buxton, Bernard F; Hammond, Peter; Potts, Henry W W

    2003-06-01

    In this paper, we show how a dense surface point distribution model of the human face can be computed and demonstrate the usefulness of the high-dimensional shape-space for expressing the shape changes associated with growth and aging. We show how average growth trajectories for the human face can be computed in the absence of longitudinal data by using kernel smoothing across a population. A training set of three-dimensional surface scans of 199 male and 201 female subjects of between 0 and 50 years of age is used to build the model.
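
    The kernel smoothing described above is, at its core, a weighted average of shape-space vectors with weights set by a kernel centred on the target age. A minimal sketch, assuming a Gaussian kernel and an arbitrary bandwidth (the paper's actual kernel and bandwidth choices are not reproduced here):

```python
import math

def average_shape_at_age(ages, shapes, t, bandwidth=2.0):
    """Nadaraya-Watson kernel estimate of the mean shape vector at age t
    from cross-sectional data: each subject's shape is weighted by a
    Gaussian kernel on the gap between its age and t."""
    weights = [math.exp(-0.5 * ((a - t) / bandwidth) ** 2) for a in ages]
    total = sum(weights)
    dim = len(shapes[0])
    return [sum(w * s[i] for w, s in zip(weights, shapes)) / total
            for i in range(dim)]
```

    Sweeping t from 0 to 50 traces an average growth trajectory through shape-space without any longitudinal data, which is the point of the method.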

  18. Research on application of intelligent computation based LUCC model in urbanization process

    NASA Astrophysics Data System (ADS)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity with international cooperation that arose in the 1980s and has the largest scope. The interaction between land use and cover change (LUCC), a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as its frontier and focus. It is necessary to research land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategy. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and spatial structure of urban space, and the LUCC model of the urbanization process has become an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the author systematically analyzes research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; and expands the Markov model, the traditional CA model and the Agent model, introducing complexity science and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for analog research on land use and cover change in urbanization, together with a case study. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process.
The urbanization process is analyzed in combination with the contents of complexity science research and the conception of complexity features to reveal the complexity of LUCC research in the urbanization process. The urban space system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy and culture, and a complex spatial system formed by society, economy and nature. It has dissipative-structure characteristics such as openness, dynamics, self-organization and non-equilibrium. Traditional models cannot simulate these social, economic and natural driving forces of LUCC, including the main feedback relations from LUCC back to the driving forces. 2. Establishment of an extended Markov model for LUCC analog research in the urbanization process. First, the traditional LUCC research model is used to compute the change speed of regional land use by calculating the dynamic degree, exploitation degree and consumption degree of land use; the theory of fuzzy sets is then used to rewrite the traditional Markov model, establish the structure transfer matrix of land use, and forecast and analyze the dynamic change and development trend of land use; noticeable problems and corresponding measures in the urbanization process are presented according to the research results. 3. Application of intelligent computation and complexity science research methods to the LUCC analog model of the urbanization process. On the basis of a detailed elaboration of the theory and models of LUCC research in the urbanization process, the problems of existing models used in LUCC research (namely, the difficulty of resolving many complexity phenomena in a complex urban space system) are analyzed, and possible structural realization forms of LUCC analog research are discussed in combination with the theories of intelligent computation and complexity science.
An application analysis is performed on the BP artificial neural network and genetic algorithms of intelligent computation and on the CA model and MAS technology of complexity science research; their theoretical origins and characteristics are discussed in detail, their feasibility for LUCC analog research is elaborated, and improvement methods and measures for the existing problems of this kind of model are brought forward. 4. Establishment of a LUCC analog model of the urbanization process based on the theories of intelligent computation and complexity science. Based on the research on the abovementioned BP artificial neural network, genetic algorithms, CA model and multi-agent technology, improvement methods and application assumptions for their extension to geography are put forward; a LUCC analog model of the urbanization process based on the CA model and the Agent model is built; the learning mechanism of the BP artificial neural network is combined with fuzzy logic reasoning, the rules are expressed with explicit formulas, and the initial rules are amended through self-learning; and the network structure of the LUCC analog model and the methods and procedures for its parameters are optimized with genetic algorithms. In this paper, I introduce the research theory and methods of complexity science into LUCC analog research and present a LUCC analog model based upon the CA model and MAS theory. Meanwhile, I carry out a corresponding expansion of the traditional Markov model and introduce the theory of fuzzy sets into the data screening and parameter amendment of the improved model, to improve the accuracy and feasibility of the Markov model in research on land use/cover change.
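
    The Markov component described above forecasts the land-use structure by repeatedly applying a transition matrix to the vector of class area shares. A minimal sketch with a hypothetical, uncalibrated matrix (the paper's fuzzy-set adjustments and CA/Agent coupling are omitted):

```python
def forecast_land_use(state, transition, steps=1):
    """Markov-chain land-use forecast: state[i] is the area share of class i,
    transition[i][j] the per-period probability that class i converts to
    class j (rows must sum to 1)."""
    for _ in range(steps):
        n = len(state)
        state = [sum(state[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    return state

# classes: [cropland, built-up, forest]; illustrative matrix, not calibrated
# to any real city
P = [[0.90, 0.08, 0.02],
     [0.00, 1.00, 0.00],   # built-up land treated as absorbing
     [0.05, 0.02, 0.93]]
shares = forecast_land_use([0.5, 0.3, 0.2], P, steps=10)
```

    Because built-up land is absorbing in this toy matrix, its share grows monotonically, the qualitative behavior such models are used to quantify in urbanization studies.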

  19. Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California

    NASA Astrophysics Data System (ADS)

    Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

    2004-12-01

    In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2-8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2001 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5-5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction.
In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and individual owners, so that they can make well-informed financial decisions.

  20. Modeling Atmospheric Transport for Greenhouse Gas Observations within the Urban Dome

    NASA Astrophysics Data System (ADS)

    Nehrkorn, T.; Sargent, M. R.; Wofsy, S. C.

    2016-12-01

    Observations of CO2, CH4, and other greenhouse gases (GHGs) within the urban dome of major cities generally show large enhancements over background values, and large sensitivity to surface fluxes (as measured by the footprints computed by Lagrangian Particle Dispersion Models, LPDMs) within the urban dome. However, their use in top-down inversion studies to constrain urban emission estimates is complicated by difficulties in proper modeling of the atmospheric transport. We are conducting experiments with the Weather Research and Forecast model (WRF) coupled to the STILT LPDM to improve model simulation of atmospheric transport on spatial scales of a few km in urban domains, because errors in transport on short time/space scales are amplified by the patchiness of GHG emissions and may engender systematic errors of simulated concentrations. We are evaluating the quality of the meteorological simulations from model configurations with different resolutions and PBL packages, using both standard and non-standard (Lidar PBL height and ACARS aircraft profile) observations. To take into account the effect of building-scale eddies for observations located on top of buildings, we are modifying the basic STILT algorithm for the computation of footprints by replacing the nominal receptor height by an effective sampling height. In addition, the footprint computations for near-field emissions make use of the vertical particle spread within the LPDM to arrive at a more appropriate estimate of mixing heights in the immediate vicinity of receptors. We present the effect of these and similar modifications on simulated concentrations and their level of agreement with observed values.
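
    The role of the effective sampling height can be caricatured with a toy one-dimensional backward dispersion model: particles random-walk vertically from the receptor height, and surface sensitivity accumulates while a particle resides in the lower part of the mixed layer. Every number here, the simplified reflection scheme, and the dilution-depth convention are illustrative assumptions, not the WRF-STILT configuration:

```python
import random

def toy_footprint(release_height, pbl_height, n_particles=2000,
                  n_steps=100, dt=60.0, sigma_w=0.5, seed=1):
    """Toy STILT-like surface footprint: vertical random walk with
    reflection at the ground and at the PBL top; sensitivity accumulates
    while a particle is below half the PBL depth (all values illustrative)."""
    rng = random.Random(seed)
    dilution_depth = pbl_height / 2.0
    footprint = 0.0
    for _ in range(n_particles):
        z = release_height
        for _ in range(n_steps):
            z += rng.gauss(0.0, sigma_w) * dt ** 0.5  # turbulent step
            z = abs(z)                                # reflect at the ground
            if z > pbl_height:                        # reflect at the PBL top
                z = 2.0 * pbl_height - z
            if z < dilution_depth:
                footprint += dt / dilution_depth
    return footprint / n_particles

# a lower effective sampling height (rooftop intakes drawing on
# building-scale eddies from below) yields larger near-field sensitivity
low = toy_footprint(30.0, 1000.0)
high = toy_footprint(600.0, 1000.0)
```

    The qualitative point survives the caricature: moving the effective release height down increases the modeled near-field surface sensitivity, which is why the receptor-height substitution matters for rooftop GHG observations.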

  1. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The EASY5 macro component models developed for the spacecraft power system simulation are described. A brief explanation of how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional groups: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.

  2. Experiences in Automated Calibration of a Nickel Equation of State

    NASA Astrophysics Data System (ADS)

    Carpenter, John H.

    2017-06-01

    Wide availability of large computers has led to increasing incorporation of computational data, such as from density functional theory molecular dynamics, in the development of equation of state (EOS) models. Once a grid of computational data is available, it is usually left to an expert modeler to model the EOS using traditional techniques. One can envision the possibility of using the increasing computing resources to perform black-box calibration of EOS models, with the goal of reducing the workload on the modeler or enabling non-experts to generate good EOSs with such a tool. Progress towards building such a black-box calibration tool will be explored in the context of developing a new, wide-range EOS for nickel. While some details of the model and data will be shared, the focus will be on what was learned by automatically calibrating the model in a black-box method. Model choices and ensuring physicality will also be discussed. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
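
    The black-box calibration idea can be illustrated with a least-squares fit of a simple analytic EOS, here the Murnaghan form, to synthetic pressure-volume points. This is a stand-in sketch under assumed parameter values: the nickel EOS in the talk is wide-range and far richer, and the brute-force grid search merely stands in for a real optimizer:

```python
def murnaghan_pressure(V, V0, B0, B0p):
    """Murnaghan equation of state: P(V) for bulk modulus B0 and its
    pressure derivative B0p at reference volume V0."""
    return (B0 / B0p) * ((V0 / V) ** B0p - 1.0)

def calibrate(data, V0, B0_grid, B0p_grid):
    """Black-box least-squares calibration by exhaustive grid search over
    candidate (B0, B0p) pairs; data is a list of (V, P) points."""
    best = None
    for B0 in B0_grid:
        for B0p in B0p_grid:
            err = sum((murnaghan_pressure(V, V0, B0, B0p) - P) ** 2
                      for V, P in data)
            if best is None or err < best[0]:
                best = (err, B0, B0p)
    return best[1], best[2]

# recover hypothetical parameters from synthetic "DFT-like" P-V points
data = [(V, murnaghan_pressure(V, 1.0, 180.0, 4.5))
        for V in (0.8, 0.9, 1.0, 1.1)]
fit = calibrate(data, 1.0, [160.0, 180.0, 200.0], [4.0, 4.5, 5.0])
```

    Replacing the grid search with a gradient-based or Bayesian optimizer, and the Murnaghan form with a wide-range model, turns this toy into the kind of automated calibration loop the talk describes, including the hard part it emphasizes: constraining the fit to stay physical.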

  3. Assessment and prediction of urban air pollution caused by motor transport exhaust gases using computer simulation methods

    NASA Astrophysics Data System (ADS)

    Boyarshinov, Michael G.; Vaismana, Yakov I.

    2016-10-01

    The following methods were used to identify the urban air pollution fields caused by motor transport exhaust gases: a mathematical model that makes it possible to consider the influence of the main factors determining the formation of pollution fields in a complex spatial domain; authoring software designed for computational modeling of the gas flow generated by numerous mobile point sources; and the results of computing experiments on pollutant spread and the evolution of concentration fields. The computational model of exhaust gas distribution and dispersion in a spatial domain that includes urban buildings, structures and main traffic arteries treats the appearance of cars at the borders of the examined territory as a stochastic Poisson process. The model also accounts for traffic-light switching and yields the fields of velocity, pressure and temperature of the discharged gases in urban air. Verification of the mathematical model and software confirmed a satisfactory fit to in-situ measurement data and the possibility of using the computed results for assessment and prediction of urban air pollution caused by motor transport exhaust gases.
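    The stochastic appearance of cars at a domain boundary, modeled as a Poisson process, amounts to sampling exponential inter-arrival times. A minimal sketch (the arrival rate and horizon are illustrative, not values from the study):

```python
import random

def arrival_times(rate_per_min, horizon_min, seed=42):
    """Sample car-appearance times at one boundary as a Poisson process:
    inter-arrival gaps are exponential with mean 1/rate."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_min)
        if t > horizon_min:
            return times
        times.append(t)

# With a rate of 3 cars/min over 60 min we expect ~180 arrivals on average.
times = arrival_times(rate_per_min=3.0, horizon_min=60.0)
print(len(times))
```

    Each sampled time would seed a new mobile point source at the border of the computational domain.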

  4. Synthetic analog computation in living cells.

    PubMed

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
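    The logarithmically linear sensing and ratiometric computation described above can be sketched numerically: a sensor whose output is linear in the logarithm of its input produces equal output steps for equal fold-changes of input (Weber's-law-like behaviour), and a ratio becomes a subtraction in the log domain. The gains a and b below are illustrative, not fitted circuit parameters.

```python
import math

def log_sensor(x, a=1.0, b=0.5):
    """Logarithmically linear transfer function: output linear in log10(input)."""
    return a + b * math.log10(x)

def ratiometric(x1, x2):
    """Ratio of two inputs computed in the log domain: a subtraction."""
    return log_sensor(x1) - log_sensor(x2)

# Every 10-fold input change yields the same output step across four decades:
steps = [log_sensor(10 ** (k + 1)) - log_sensor(10 ** k) for k in range(4)]
print(steps)
```

    The four-decade dynamic range in the abstract corresponds to the four equal steps here: output resolution is spent on fold-changes, not absolute levels.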

  5. Wallboard with Latent Heat Storage for Passive Solar Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kedl, R.J.

    2001-05-31

    Conventional wallboard impregnated with octadecane paraffin [melting point 23 °C (73.5 °F)] is being developed as a building material with latent heat storage for passive solar and other applications. Impregnation was accomplished simply by soaking the wallboard in molten wax. Concentrations of wax in the combined product as high as 35% by weight can be achieved. Scale-up of the soaking process, from small laboratory samples to full-sized 4- by 8-ft sheets, has been successfully accomplished. The required construction properties of wallboard are maintained after impregnation; that is, it can be painted and spackled. Long-term high-temperature exposure tests and thermal cycling tests showed no tendency of the paraffin to migrate within the wallboard, and there was no deterioration of thermal energy storage capacity. In support of this concept, a computer model was developed to handle thermal transport and storage by a phase change material (PCM) dispersed in a porous medium. The computer model was confirmed by comparison with known analytical solutions and with temperatures measured in wallboard during an experimentally generated thermal transient. Agreement between the model and the known solutions was excellent. Agreement between the model and the thermal transient was good only after the model was modified to allow the PCM to melt over a temperature range, rather than at a specific melting point. When the melting characteristics of the PCM (melting point, melting range, and heat of fusion), as determined from a differential scanning calorimeter plot, were used in the model, agreement between the model and transient data was very good. The confirmed computer model may now be used in conjunction with a building heating and cooling code to evaluate design parameters and operational characteristics of latent-heat-storage wallboard for passive solar applications.

  6. High-resolution modeling of tsunami run-up flooding: a case study of flooding in Kamaishi city, Japan, induced by the 2011 Tohoku tsunami

    NASA Astrophysics Data System (ADS)

    Akoh, Ryosuke; Ishikawa, Tadaharu; Kojima, Takashi; Tomaru, Mahito; Maeno, Shiro

    2017-11-01

    Run-up processes of the 2011 Tohoku tsunami into the city of Kamaishi, Japan, were simulated numerically using 2-D shallow water equations with a new treatment of building footprints. The model imposes an internal hydraulic condition of permeable and impermeable walls at the building footprint outline on unstructured triangular meshes. Digital data of the building footprints, approximated by polygons, were overlaid on a 1.0 m resolution terrain model. The hydraulic boundary conditions were ascertained using a conventional tsunami propagation calculation from the seismic center to nearshore areas. Run-up flow calculations were conducted under the same hydraulic conditions for several cases having different building permeabilities. Comparison of the computational results with field data suggests that the case with a small amount of wall permeability gives better agreement than the impermeable case. Spatial mapping of an indicator of run-up flow intensity (IF = (hU²)max, where h and U denote the inundation depth and flow velocity during the flood, respectively) shows fairly good correlation with the distribution of houses destroyed by flooding. As a possible mitigation measure, the influence of the buildings on the flow was assessed using a numerical experiment with solid buildings arrayed alternately in two lines along the coast. Results show that the buildings can prevent seawater from flowing straight to the city center while maintaining access to the sea.
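    The flow-intensity indicator IF = (hU²)max is simply the maximum of h·U² over the flood time series at a location. A minimal sketch (the depth and velocity series below are illustrative, not Kamaishi data):

```python
def flow_intensity(depths, velocities):
    """IF = (h * U^2)_max over a flood time series, with h the inundation
    depth [m] and U the flow velocity [m/s] at one location."""
    return max(h * u * u for h, u in zip(depths, velocities))

# Illustrative time series at one building footprint:
h = [0.2, 1.1, 1.8, 1.5]   # m
u = [0.5, 2.0, 1.2, 0.8]   # m/s
print(flow_intensity(h, u))  # → 4.4, attained at the second time step
```

    Note that the maximum of the product need not coincide with the maximum depth or the maximum velocity individually, which is why the indicator is evaluated over the whole time series.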

  7. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates the parallel execution of multiple simulation runs and the management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  8. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates the parallel execution of multiple simulation runs and the management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  9. Interactive computer aided technology, evolution in the design/manufacturing process

    NASA Technical Reports Server (NTRS)

    English, C. H.

    1975-01-01

    A powerful computer-operated three-dimensional graphic system and associated auxiliary computer equipment used in advanced design, production design, and manufacturing is described. This system has made these activities more productive than older, more conventional methods of designing and building aerospace vehicles. With this graphic system, designers are able to define parts using a wide variety of geometric entities, and to define parts as fully surfaced 3-dimensional models as well as "wire-frame" models. Once a part is geometrically defined, the designer can take section cuts of the surfaced model and automatically determine all of the section properties of the planar cut, or light-pen detect all of the surface patches and automatically determine the volume and weight of the part. Furthermore, designs are defined mathematically at a degree of accuracy never before achievable.

  10. Saliency image of feature building for image quality assessment

    NASA Astrophysics Data System (ADS)

    Ju, Xinuo; Sun, Jiyin; Wang, Peng

    2011-11-01

    The purpose and methods of image quality assessment are quite different for automatic target recognition (ATR) and traditional applications. Local invariant feature detectors, mainly including corner detectors, blob detectors and region detectors, are widely applied for ATR. A saliency model of features is proposed in this paper to evaluate feasibility for ATR. The first step consists of computing the first-order derivatives in the horizontal and vertical orientations and computing DoG maps at different scales. Next, feature saliency images are built from the auto-correlation matrix at each scale. Then, the feature saliency images of the different scales are amalgamated. Experiments were performed on a large test set, including infrared and optical images, and the results showed that the salient regions computed by this model were consistent with the real feature regions computed by most local invariant feature extraction algorithms.
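    The difference-of-Gaussians (DoG) computation at the heart of the pipeline above can be sketched in 1-D: blur the signal at two scales and subtract, so flat regions cancel to zero while edges and blobs produce a strong response. This is a generic DoG sketch, not the paper's exact multi-scale fusion; the sigmas are illustrative.

```python
import math

def gauss_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(signal, sigma):
    """Convolve with a Gaussian, clamping indices at the borders."""
    radius = int(3 * sigma)
    kernel = gauss_kernel(sigma, radius)
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), n - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

def dog(signal, sigma1, sigma2):
    """Difference-of-Gaussians response: strong near edges, ~0 in flat regions."""
    return [a - b for a, b in zip(blur(signal, sigma1), blur(signal, sigma2))]

step = [0.0] * 10 + [1.0] * 10       # an ideal edge at index 9/10
response = dog(step, 1.0, 2.0)
print(max(abs(r) for r in response))  # nonzero: the edge survives, flat parts cancel
```

    In 2-D the same subtraction is applied per pixel per scale, and the resulting maps are then fused across scales as described in the abstract.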

  11. An innovative time-cost-quality tradeoff modeling of building construction project based on resource allocation.

    PubMed

    Hu, Wenfa; He, Xinhua

    2014-01-01

    Time, quality, and cost are three important but conflicting objectives in a building construction project, and it is a tough challenge for project managers to optimize them jointly since they are different parameters. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model follows the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated based on the correlations between construction activities. A genetic algorithm tool is applied in the model to solve the comprehensive nonlinear time-cost-quality problems. The building of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make winning decisions in construction practice. The computational time-cost-quality curves, presented as visual graphics in the case study, confirm traditional time-cost assumptions and demonstrate the sophistication of this time-cost-quality trade-off model.
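    The trade-off the genetic algorithm searches can be illustrated at toy scale by brute force: enumerate every combination of per-activity resource allocations, score each by total time, total cost, and quality, and keep the non-dominated (Pareto) points. The two-activity project and all (time, cost, quality) numbers below are invented for illustration.

```python
from itertools import product

# Each activity offers alternative resource allocations: (time, cost, quality).
activities = [
    [(5, 100, 0.90), (3, 160, 0.85), (7, 70, 0.95)],
    [(4, 120, 0.92), (6, 80, 0.97)],
]

def evaluate(choice):
    """Project-level objectives for one allocation per activity."""
    time = sum(o[0] for o in choice)
    cost = sum(o[1] for o in choice)
    quality = min(o[2] for o in choice)   # the weakest activity bounds project quality
    return time, cost, quality

def dominates(a, b):
    """a dominates b: no worse on time and cost, no worse on quality, not equal."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a != b

points = [evaluate(c) for c in product(*activities)]
pareto = [p for p in points if not any(dominates(q, p) for q in points)]
print(sorted(pareto))
```

    A GA replaces the exhaustive `product` with evolutionary search, which is what makes realistic project sizes tractable; the dominance test is the same.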

  12. Computational Systems Biology in Cancer: Modeling Methods and Applications

    PubMed Central

    Materi, Wayne; Wishart, David S.

    2007-01-01

    In recent years it has become clear that carcinogenesis is a complex process, at both the molecular and cellular levels. Understanding the origins, growth and spread of cancer therefore requires an integrated or system-wide approach. Computational systems biology is an emerging sub-discipline of systems biology that utilizes the wealth of data from genomic, proteomic and metabolomic studies to build computer simulations of intra- and intercellular processes. Several useful descriptive and predictive models of the origin, growth and spread of cancers have been developed in an effort to better understand the disease and potential therapeutic approaches. In this review we describe and assess the practical and theoretical underpinnings of commonly used modeling approaches, including ordinary and partial differential equations, Petri nets, cellular automata, agent-based models and hybrid systems. A number of computer-based formalisms have been implemented to improve the accessibility of the various approaches to researchers whose primary interest lies outside of model development. We discuss several of these and describe how they have led to novel insights into tumor genesis, growth, apoptosis, vascularization and therapy. PMID:19936081

  13. Consensus models to predict endocrine disruption for all ...

    EPA Pesticide Factsheets

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). The Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to demonstrate the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32464 unique structures for one specific molecular target, the estrogen receptor. CERAPP combined multiple computational models for the prediction of estrogen receptor activity and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an exte

  14. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings

    PubMed Central

    Bao, Yihai; Main, Joseph A.; Noh, Sam-Young

    2017-01-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness. PMID:28890599
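    The proposed robustness metric normalizes each scenario's ultimate capacity under sudden column loss by the service-level gravity load and takes the minimum over all column-removal scenarios, so the worst scenario governs. A minimal sketch with illustrative (not paper) capacity values:

```python
def robustness_index(ultimate_capacities, service_load):
    """Min over column-removal scenarios of ultimate capacity under sudden
    column loss, normalized by the service-level gravity load."""
    return min(c / service_load for c in ultimate_capacities)

# Illustrative normalized pushdown capacities per removal scenario (same units
# as the service load, e.g. kN/m^2):
capacities = {"corner": 9.8, "edge": 12.4, "interior": 15.1}
r = robustness_index(capacities.values(), service_load=7.0)
print(round(r, 3))  # → 1.4; the corner-column scenario governs
```

    A value above 1.0 means every scenario retains capacity beyond the applicable service loading; the margin of the governing (corner) scenario is what the metric reports.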

  15. Transforming parts of a differential equations system to difference equations as a method for run-time savings in NONMEM.

    PubMed

    Petersson, K J F; Friberg, L E; Karlsson, M O

    2010-10-01

    Computer models of biological systems grow more complex as computing power increases. Often these models are defined as differential equations for which no analytical solutions exist. Numerical integration is used to approximate the solution; this can be computationally intensive and time-consuming, and it can account for a large proportion of the total computer runtime. The performance of different integration methods depends on the mathematical properties of the differential equation system at hand. In this paper we investigate the possibility of runtime gains by calculating parts of, or the whole of, the differential equation system at given time intervals, outside of the differential equation solver. This approach was tested on nine models defined as differential equations, with the goal of reducing runtime while maintaining model fit, based on the objective function value. The software used was NONMEM. In four models the computational runtime was successfully reduced (by 59-96%). The differences in parameter estimates, compared to using only the differential equation solver, were less than 12% for all fixed-effects parameters. For the variance parameters, estimates were within 10% for the majority of the parameters. Population and individual predictions were similar, and the differences in OFV were between 1 and -14 units. When computational runtime seriously affects the usefulness of a model, we suggest evaluating this approach for repetitive elements of model building and evaluation, such as covariate inclusions or bootstraps.
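    The idea of moving part of the system outside the solver can be sketched with a toy one-compartment model: a slowly varying input term is recomputed only at coarse intervals (the "difference equation" part) instead of inside every integration step. The equation, rate constants, and intervals below are illustrative, not one of the nine NONMEM models.

```python
import math

def simulate(dt=0.01, t_end=10.0, update_every=1.0):
    """Euler integration of dA/dt = -k*A + inp(t), where the slowly varying
    input term is frozen between updates spaced `update_every` time units
    apart, mimicking evaluation outside the ODE solver."""
    k, A, t = 0.5, 1.0, 0.0
    inp = math.exp(-0.1 * t)      # the "expensive" term, frozen between updates
    next_update = update_every
    while t < t_end:
        if t >= next_update:      # coarse-interval recomputation
            inp = math.exp(-0.1 * t)
            next_update += update_every
        A += dt * (-k * A + inp)  # fine-grained solver step
        t += dt
    return A

coarse = simulate(update_every=1.0)    # input updated 10 times
fine = simulate(update_every=0.01)     # input updated every step
print(coarse, fine)
```

    Because the input varies slowly relative to the update interval, the coarse variant stays close to the fully coupled one while evaluating the expensive term two orders of magnitude less often, which is the source of the runtime savings reported above.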

  16. A computational model of the human visual cortex

    NASA Astrophysics Data System (ADS)

    Albus, James S.

    2008-04-01

    The brain is first and foremost a control system that is capable of building an internal representation of the external world, and using this representation to make decisions, set goals and priorities, formulate plans, and control behavior with intent to achieve its goals. The computational model proposed here assumes that this internal representation resides in arrays of cortical columns. More specifically, it models each cortical hypercolumn together with its underlying thalamic nuclei as a Fundamental Computational Unit (FCU) consisting of a frame-like data structure (containing attributes and pointers) plus the computational processes and mechanisms required to maintain it. In sensory-processing areas of the brain, FCUs enable segmentation, grouping, and classification. Pointers stored in FCU frames link pixels and signals to objects and events in situations and episodes that are overlaid with meaning and emotional values. In behavior-generating areas of the brain, FCUs make decisions, set goals and priorities, generate plans, and control behavior. Pointers are used to define rules, grammars, procedures, plans, and behaviors. It is suggested that it may be possible to reverse engineer the human brain at the FCU level of fidelity using next-generation massively parallel computer hardware and software. Key words: computational modeling, human cortex, brain modeling, reverse engineering the brain, image processing, perception, segmentation, knowledge representation

  17. DEEP: A Database of Energy Efficiency Performance to Accelerate Energy Retrofitting of Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof

    The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to capture the interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulation database is part of an ongoing project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to perform assessments of retrofits to reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct a costly building energy audit. DEEP will be migrated into DEnCity, DOE's Energy City, which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.

  18. Roof Type Selection Based on Patch-Based Classification Using Deep Learning for High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Partovi, T.; Fraundorfer, F.; Azimi, S.; Marmanis, D.; Reinartz, P.

    2017-05-01

    3D building reconstruction from satellite remote sensing imagery is still an active research topic and is very valuable for 3D city modelling. The roof model is the most important component in reconstructing the Level of Detail 2 (LoD2) representation of a building in 3D modelling. While the general solution for roof modelling relies on detailed cues (such as lines, corners and planes) extracted from a Digital Surface Model (DSM), correct detection of the roof type and its modelling can fail due to the low quality of DSMs generated by dense stereo matching. To reduce the dependency of roof modelling on DSMs, pansharpened satellite images are used in addition as a rich source of information. In this paper, two strategies are employed for roof type classification. In the first, building roof types are classified with a state-of-the-art supervised pre-trained convolutional neural network (CNN) framework. In the second, deep features are extracted from deep layers of different pre-trained CNN models, and an SVM with an RBF kernel is then employed to classify the building roof type. Based on the roof complexity of the scene, a roof library including seven roof types is defined. A new semi-automatic method is proposed to generate training and test patches for each roof type in the library. Using the pre-trained CNN model not only decreases the training computation time significantly but also increases the classification accuracy.

  19. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  20. Dynamic virtual machine allocation policy in cloud computing complying with service level agreement using CloudSim

    NASA Astrophysics Data System (ADS)

    Aneri, Parikh; Sumathy, S.

    2017-11-01

    Cloud computing provides services over the internet, supplying application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the provider supplies resources that consumers access in order to build their applications as needed. A cloud data center is a pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other hand the system has to serve a huge number of requests effectively. Therefore, the resource allocation and scheduling policies play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy based on the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component, which helps to increase cloud resource utilization by monitoring the algorithm's state and altering it using artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
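    The core of such a policy is an assignment problem: given a cost (load) matrix over tasks and virtual machines, find the one-to-one assignment with minimum total cost. The sketch below solves a tiny instance by brute force over permutations; the Hungarian algorithm computes the same optimum in O(n³), which is what makes it practical at data-center scale. The cost values are illustrative.

```python
from itertools import permutations

def best_assignment(cost):
    """Minimum-cost one-to-one assignment of tasks (rows) to VMs (columns).
    Brute force for illustration; the Hungarian algorithm finds the same
    optimum in polynomial time."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return best, sum(cost[i][best[i]] for i in range(n))

# cost[i][j]: estimated load of running task i on VM j (illustrative numbers)
cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
assign, total = best_assignment(cost)
print(assign, total)  # total optimal cost is 12
```

    In the proposed policy the monitor component would rebuild the cost matrix from observed VM states and re-run the assignment periodically, making the balancing dynamic.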

  1. Design of a Reliable Computing System for the Petite Amateur Navy Satellite (PANSAT)

    DTIC Science & Technology

    1989-03-01

    [Fragmentary excerpt; the list of tables includes Table 1, the ISO seven-layer model, and Table 2, a satellite processor summary.] ...the ORION project with the additional groundwork and data to serve as a baseline on which to build. 2. Mission: The primary mission of PANSAT is to... Of the seven-layer ISO model for computer communication (see Table 1), which layers are handled in software and which in hardware? The physical layer, which...

  2. A Model Program in a Remodeled Building.

    ERIC Educational Resources Information Center

    Muir, Maya

    2001-01-01

    Renovations contributed to academic improvement at an Issaquah (Washington) elementary school. Enclosing an open-air corridor enabled it to be used for educational activities. Double doors connected classrooms for team teaching, and carpet improved acoustics. A music room, library, and computer lab were also added. Student and community…

  3. DEVELOPMENTS AND APPLICATIONS OF CFD SIMULATIONS OF MICROMETEOROLOGY AND POLLUTION TRANSPORT IN SUPPORT OF AIR QUALITY MODELING

    EPA Science Inventory

    Development and application of computational fluid dynamics (CFD) simulations are being advanced through case studies for simulating air pollutant concentrations from sources within open fields and within complex urban building environments. CFD applications have been under deve...

  4. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  5. Angular selective window systems: Assessment of technical potential for energy savings

    DOE PAGES

    Fernandes, Luis L.; Lee, Eleanor S.; McNeil, Andrew; ...

    2014-10-16

    Static angular selective shading systems block direct sunlight and admit daylight within a specific range of incident solar angles. The objective of this study is to quantify their potential to reduce energy use and peak demand in commercial buildings using state-of-the-art whole-building computer simulation software that allows accurate modeling of the behavior of optically complex fenestration systems such as angular selective systems. Three commercial systems were evaluated: a micro-perforated screen, a tubular shading structure, and an expanded metal mesh. This evaluation was performed through computer simulation for multiple climates (Chicago, Illinois and Houston, Texas), window-to-wall ratios (0.15-0.60), building codes (ASHRAE 90.1-2004 and 2010) and lighting control configurations (with and without). The modeling of the optical complexity of the systems took advantage of state-of-the-art versions of the EnergyPlus, Radiance and Window simulation tools. Results show significant reductions in perimeter zone energy use; the best system reached 28% and 47% savings, respectively, without and with daylighting controls (ASHRAE 90.1-2004, south facade, Chicago, WWR=0.45). Angular selectivity and thermal conductance of the angle-selective layer, as well as spectral selectivity of low-emissivity coatings, were identified as factors with significant impact on performance.

  6. Computational modeling of neural plasticity for self-organization of neural networks.

    PubMed

    Chrol-Cannon, Joseph; Jin, Yaochu

    2014-11-01

    Self-organization in biological nervous systems during the lifetime is known to occur largely through a process of plasticity that depends on the spike-timing activity of connected neurons. In the field of computational neuroscience, much effort has been dedicated to building computational models of neural plasticity to replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks for accomplishing machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models of neural plasticity and to discuss various ideas about its role. Finally, we suggest a few promising research directions, in particular those that combine findings in computational neuroscience and systems biology and their synergetic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
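    One widely used functional form in the spike-timing-dependent models the review covers is the STDP learning window: the synaptic weight change depends on the time difference between pre- and postsynaptic spikes. This is a generic textbook-style sketch with illustrative parameter values, not a rule from any specific model in the review.

```python
import math

def stdp_update(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Weight change as a function of spike-time difference dt = t_post - t_pre:
    potentiation when the presynaptic spike precedes the postsynaptic one
    (dt > 0), exponentially decaying depression otherwise."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    return -a_minus * math.exp(dt_ms / tau_ms)

# Causal pairings strengthen the synapse, anti-causal ones weaken it:
print(stdp_update(10.0) > 0, stdp_update(-10.0) < 0)  # → True True
```

    The asymmetry (a_minus slightly larger than a_plus) is a common stabilizing choice: uncorrelated spiking then depresses weights on average, which matters for the self-organization dynamics discussed above.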

  7. Hydrodynamic modeling of urban flooding taking into account detailed data about city infrastructure

    NASA Astrophysics Data System (ADS)

    Belikov, Vitaly; Norin, Sergey; Aleksyuk, Andrey; Krylenko, Inna; Borisova, Natalya; Rumyantsev, Alexey

    2017-04-01

    Flood waves moving across urban areas have specific features. Linear infrastructure objects (such as embankments, roads, and dams) can change the direction of flow or block the water movement. On the contrary, paved avenues and wide streets in cities contribute to the concentration of flood waters. Buildings create an additional resistance to the movement of water, which depends on the urban density and the type of construction; this effect cannot be completely described by Manning's resistance law. In addition, the part of the earth surface occupied by buildings is excluded from the flooded area, which results in a substantial (relative to undeveloped areas) increase of the depth of flooding, especially under unsteady flow conditions. An approach to numerical simulation of urban area flooding is proposed that consists in directly allocating all buildings and structures on the computational grid. This can be done almost fully automatically with modern software. The real geometry of all infrastructure objects can be taken into account on the basis of highly detailed digital maps and satellite images. The calculations are based on the two-dimensional Saint-Venant equations on irregular adaptive computational meshes, which can contain millions of cells and take into account tens of thousands of buildings and other infrastructure objects. Flood maps produced by the modeling are the basis for damage and risk assessment for urban areas. The main advantages of the developed method are high-precision calculations, realistic modeling results and appropriate graphical display of the flood dynamics and dam-break wave propagation in urban areas. Verification of this method has been performed against experimental data and simulations of real events, including the catastrophic flooding of the city of Krymsk in 2012.

  8. Compact and Hybrid Feature Description for Building Extraction

    NASA Astrophysics Data System (ADS)

    Li, Z.; Liu, Y.; Hu, Y.; Li, P.; Ding, Y.

    2017-05-01

    Building extraction in aerial orthophotos is crucial for various applications. Currently, deep learning has been shown to be successful in addressing building extraction with high accuracy and high robustness. However, a large number of samples is required to train a classifier when using a deep learning model. In order to realize accurate and semi-interactive labelling, the performance of the feature description is crucial, as it has a significant effect on classification accuracy. In this paper, we propose a compact and hybrid feature description method that guarantees desirable classification accuracy for the corners on building roof contours. The proposed descriptor is a hybrid description of an image patch constructed from 4 sets of binary intensity tests. Experiments show that, benefiting from binary description and full use of the color channels, this descriptor is not only computationally frugal but also more accurate than SURF for building extraction.

  9. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block approach to validation in which cold, non-reacting test data were used first, followed by more complex reacting base flow validation.

  10. Performance evaluation of radiant cooling system application on a university building in Indonesia

    NASA Astrophysics Data System (ADS)

    Satrio, Pujo; Sholahudin, S.; Nasruddin

    2017-03-01

    The paper describes a study developed to estimate the energy savings potential of a radiant cooling system installed in an institutional building in Indonesia. The simulations were carried out using IESVE to evaluate thermal performance and energy consumption. The building model was calibrated using the measured data for the installed radiant system. This calibrated model was then used to simulate the energy consumption and temperature distribution to determine the proportional energy savings and occupant comfort under different systems. The results show that radiant cooling integrated with a Dedicated Outside Air System (DOAS) could achieve 41.84% energy savings compared to the installed cooling system. The Computational Fluid Dynamics (CFD) simulation showed that a radiant system integrated with DOAS provides better human comfort than a radiant system integrated with Variable Air Volume (VAV); the Percentage of People Dissatisfied was kept below 10% with the proposed system.
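The 10% threshold above refers to the Predicted Percentage Dissatisfied (PPD) comfort index. As a hedged aside (the formula below is the standard ISO 7730 relation between PPD and the Predicted Mean Vote, not the paper's own CFD post-processing), it can be computed as:

```python
import math

def ppd(pmv: float) -> float:
    """Predicted Percentage Dissatisfied from Predicted Mean Vote (ISO 7730)."""
    return 100.0 - 95.0 * math.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

# A thermally neutral environment (PMV = 0) still leaves 5% dissatisfied by definition.
print(round(ppd(0.0), 1))  # 5.0
# |PMV| <= 0.5 keeps PPD at roughly the 10% target mentioned above.
print(round(ppd(0.5), 1))  # 10.2
```

Keeping |PMV| within ±0.5 is the usual design target corresponding to the "below 10%" comfort criterion cited in the abstract.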

  11. Scalable tuning of building models to hourly data

    DOE PAGES

    Garrett, Aaron; New, Joshua Ryan

    2015-03-31

    Energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Manual tuning requires a skilled professional, is prohibitively expensive for small projects, imperfect, non-repeatable, non-transferable, and not scalable to the dozens of sensor channels that smart meters, smart appliances, and cheap/ubiquitous sensors are beginning to make available today. A scalable, automated methodology is needed to quickly and intelligently calibrate building energy models to all available data, increase the usefulness of those models, and facilitate speed-and-scale penetration of simulation-based capabilities into the marketplace for actualized energy savings. The "Autotune" project is a novel, model-agnostic methodology which leverages supercomputing, large simulation ensembles, and big data mining with multiple machine learning algorithms to allow automatic calibration of simulations that match measured experimental data in a way that is deployable on commodity hardware. This paper shares several methodologies employed to reduce the combinatorial complexity to a computationally tractable search problem for hundreds of input parameters. Furthermore, accuracy metrics are provided which quantify model error to measured data for either monthly or hourly electrical usage from a highly-instrumented, emulated-occupancy research home.
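As an illustrative sketch only (not the Autotune methodology itself, which uses supercomputing ensembles and multiple machine learning algorithms), the core calibration loop — searching input parameters to minimize an error metric against measured hourly data — might look like this, with a hypothetical one-parameter stand-in for the building simulation:

```python
import random

def cvrmse(measured, simulated):
    """Coefficient of Variation of RMSE, a common calibration accuracy metric (%)."""
    n = len(measured)
    rmse = (sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n) ** 0.5
    return 100.0 * rmse / (sum(measured) / n)

def calibrate(simulate, bounds, measured, n_trials=200, seed=0):
    """Random search over input parameters to minimize CV(RMSE) vs. measured data."""
    rng = random.Random(seed)
    best_params, best_err = None, float("inf")
    for _ in range(n_trials):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        err = cvrmse(measured, simulate(params))
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err

# Toy "simulation": hourly load linear in one envelope parameter (hypothetical).
truth = {"u_value": 0.35}
hours = list(range(24))
measured = [10 + truth["u_value"] * h for h in hours]
sim = lambda p: [10 + p["u_value"] * h for h in hours]
params, err = calibrate(sim, {"u_value": (0.1, 1.0)}, measured)
```

Real calibration replaces the toy simulator with a whole-building engine run and searches hundreds of parameters, which is exactly the combinatorial-complexity problem the paper addresses.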

  12. Downscaling modelling system for multi-scale air quality forecasting

    NASA Astrophysics Data System (ADS)

    Nuterman, R.; Baklanov, A.; Mahura, A.; Amstrup, B.; Weismann, J.

    2010-09-01

    Urban modelling for real meteorological situations, in general, considers only a small part of the urban area in a micro-meteorological model, and urban heterogeneities outside the modelling domain affect micro-scale processes. Therefore, it is important to build a chain of models of different scales, with nesting of higher resolution models into larger scale, lower resolution models. Usually, the up-scaled city- or meso-scale models consider parameterisations of urban effects or statistical descriptions of the urban morphology, whereas the micro-scale (street canyon) models are obstacle-resolved and consider a detailed geometry of the buildings and the urban canopy. The developed system consists of meso-, urban- and street-scale models. The first is a Numerical Weather Prediction model (High Resolution Limited Area Model) combined with an Atmospheric Chemistry Transport model (the Comprehensive Air quality Model with extensions). Several levels of urban parameterisation are considered, chosen depending on the selected scales and resolutions. For the regional scale, the urban parameterisation is based on the roughness and flux corrections approach; for the urban scale, on a building effects parameterisation. Modern methods of computational fluid dynamics allow solving environmental problems connected with atmospheric transport of pollutants within the urban canopy in the presence of penetrable (vegetation) and impenetrable (buildings) obstacles. For local- and micro-scale nesting, the Micro-scale Model for Urban Environment is applied. This is a comprehensive obstacle-resolved urban wind-flow and dispersion model based on the Reynolds-averaged Navier-Stokes approach and several turbulence closures, i.e. the k-ε linear eddy-viscosity model, the k-ε non-linear eddy-viscosity model and the Reynolds stress model. Boundary and initial conditions for the micro-scale model are taken from the up-scaled models with corresponding mass-conserving interpolation. For the boundaries, a kind of Dirichlet condition is chosen to provide the values based on interpolation from the coarse to the fine grid. When the roughness approach is changed to the obstacle-resolved one in the nested model, the interpolation procedure will increase the computational time (due to additional iterations) for meteorological/chemical fields inside the urban sub-layer. In such situations, as a possible alternative, the perturbation approach can be applied. Here, the effects of the main meteorological variables and chemical species are considered as a sum of two components: background (large-scale) values, described by the coarse-resolution model, and perturbation (micro-scale) features, obtained from the nested fine-resolution model.
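The coarse-to-fine boundary interpolation and the perturbation decomposition described above can be sketched in a minimal 1-D form. Everything here is a hypothetical illustration, not the coupled model chain itself:

```python
def interp_coarse_to_fine(coarse, refine=4):
    """Linearly interpolate a 1-D coarse-grid profile onto a finer grid,
    e.g. to supply Dirichlet boundary values for a nested model."""
    fine = []
    for i in range(len(coarse) - 1):
        for j in range(refine):
            t = j / refine
            fine.append((1 - t) * coarse[i] + t * coarse[i + 1])
    fine.append(coarse[-1])
    return fine

# Perturbation approach: total field = large-scale background + micro-scale deviation.
background = interp_coarse_to_fine([2.0, 4.0, 6.0])                 # from the coarse model
perturbation = [0.1 * ((-1) ** k) for k in range(len(background))]  # from nested model (toy)
total = [b + p for b, p in zip(background, perturbation)]
```

The decomposition lets the fine model carry only the micro-scale deviation while the coarse model supplies the slowly varying background, which is why it can avoid the extra iterations mentioned above.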

  13. Computational prediction of chemical reactions: current status and outlook.

    PubMed

    Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A

    2018-06-01

    Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. [Advancements of computer chemistry in separation of Chinese medicine].

    PubMed

    Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei

    2011-12-01

    Separation techniques for Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and uncover regularities from the Chinese medicine system, which is full of complicated data. This paper analyzed the applicability, key technologies, basic modes and common algorithms of computer chemistry applied in the separation of Chinese medicine, introduced the mathematical models and parameter-setting methods of extraction kinetics, investigated several problems based on traditional Chinese medicine membrane processes, and forecast the application prospects.

  15. Structure, function, and behaviour of computational models in systems biology

    PubMed Central

    2013-01-01

    Background Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such “bio-models” necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. Results We present a conceptual framework – the meaning facets – which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model’s components (structure), the meaning of the model’s intended use (function), and the meaning of the model’s dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. Conclusions The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research. PMID:23721297

  16. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Thus, reconstruction is often supported by general rules for the perpendicularity and parallelism which are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of e.g. room sizes and corridor widths. In the context of reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which originally were developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
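A minimal sketch of the parallel rewriting at the heart of a Lindenmayer system, using a hypothetical corridor/room alphabet (the paper's actual grammars for indoor layouts are not reproduced here):

```python
def lsystem(axiom, rules, iterations):
    """Apply Lindenmayer rewriting rules in parallel for a number of iterations."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical grammar: C = corridor segment, R = room, [ ] = branch off the corridor.
# Each segment spawns a room branch and extends - the repetition of room sizes
# along a corridor is exactly the kind of regularity L-systems capture.
rules = {"C": "C[R]C"}
print(lsystem("C", rules, 2))  # C[R]C[R]C[R]C
```

Each iteration rewrites every symbol simultaneously, so regular repetition (rooms along a corridor) emerges from a single rule.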

  17. The challenge of forecasting impacts of flash floods: test of a simplified hydraulic approach and validation based on insurance claim data

    NASA Astrophysics Data System (ADS)

    Le Bihan, Guillaume; Payrastre, Olivier; Gaume, Eric; Moncoulon, David; Pons, Frédéric

    2017-11-01

    Up to now, flash flood monitoring and forecasting systems, based on rainfall radar measurements and distributed rainfall-runoff models, generally aimed at estimating flood magnitudes - typically discharges or return periods - at selected river cross sections. The approach presented here goes one step further by proposing an integrated forecasting chain for the direct assessment of flash flood possible impacts on inhabited areas (number of buildings at risk in the presented case studies). The proposed approach includes, in addition to a distributed rainfall-runoff model, an automatic hydraulic method suited for the computation of flood extent maps on a dense river network and over large territories. The resulting catalogue of flood extent maps is then combined with land use data to build a flood impact curve for each considered river reach, i.e. the number of inundated buildings versus discharge. These curves are finally used to compute estimated impacts based on forecasted discharges. The approach has been extensively tested in the regions of Alès and Draguignan, located in the south of France, where well-documented major flash floods recently occurred. The article presents two types of validation results. First, the automatically computed flood extent maps and corresponding water levels are tested against rating curves at available river gauging stations as well as against local reference or observed flood extent maps. Second, a rich and comprehensive insurance claim database is used to evaluate the relevance of the estimated impacts for some recent major floods.
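The flood impact curve described above — number of inundated buildings as a function of discharge — lends itself to a simple sketch. The catalogue values and the linear interpolation scheme below are illustrative assumptions, not the authors' data:

```python
import bisect

def make_impact_curve(flood_maps):
    """flood_maps: list of (discharge, n_inundated_buildings) pairs derived from
    the catalogue of flood extent maps combined with land use data.
    Returns a function estimating impact for a forecasted discharge."""
    flood_maps = sorted(flood_maps)
    qs = [q for q, _ in flood_maps]
    ns = [n for _, n in flood_maps]

    def impact(q_forecast):
        i = bisect.bisect_right(qs, q_forecast)
        if i == 0:
            return ns[0]
        if i == len(qs):
            return ns[-1]
        # Linear interpolation between catalogued flood extents.
        t = (q_forecast - qs[i - 1]) / (qs[i] - qs[i - 1])
        return ns[i - 1] + t * (ns[i] - ns[i - 1])

    return impact

# Hypothetical catalogue for one river reach: discharge (m^3/s) -> buildings at risk.
curve = make_impact_curve([(50, 0), (100, 12), (200, 80), (400, 260)])
print(curve(150))  # halfway between 12 and 80 buildings -> 46.0
```

Precomputing one such curve per reach is what lets the forecasting chain turn a forecasted discharge directly into an estimated impact in real time.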

  18. [Computer aided diagnosis model for lung tumor based on ensemble convolutional neural network].

    PubMed

    Wang, Yuanyuan; Zhou, Tao; Lu, Huiling; Wu, Cuiying; Yang, Pengfei

    2017-08-01

    The convolutional neural network (CNN) can be used for computer-aided diagnosis of lung tumors with positron emission tomography (PET)/computed tomography (CT), providing accurate quantitative analysis to compensate for visual inertia and defects in gray-scale sensitivity, and helping doctors diagnose accurately. Firstly, a parameter migration method is used to build three CNNs (CT-CNN, PET-CNN, and PET/CT-CNN) for lung tumor recognition in CT, PET, and PET/CT images, respectively. Then, CT-CNN was used to obtain appropriate model parameters for CNN training by analyzing the influence of parameters such as epochs, batch size and image scale on recognition rate and training time. Finally, the three single CNNs are used to construct an ensemble CNN, lung tumor PET/CT recognition is completed through a relative majority vote, and the performance of the ensemble CNN and the single CNNs is compared. The experimental results show that the ensemble CNN is better than a single CNN for computer-aided diagnosis of lung tumors.
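The relative majority (plurality) vote used to combine the three CNNs can be sketched as follows; the label names and per-model predictions are hypothetical:

```python
from collections import Counter

def majority_vote(predictions):
    """Relative majority (plurality) vote across classifier outputs for one sample."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(per_model_predictions):
    """per_model_predictions: one list of labels per single model
    (e.g. CT-CNN, PET-CNN, PET/CT-CNN), aligned by sample."""
    return [majority_vote(votes) for votes in zip(*per_model_predictions)]

# Hypothetical predictions from the three single CNNs on three samples.
ct = ["tumor", "normal", "tumor"]
pet = ["tumor", "tumor", "normal"]
petct = ["normal", "tumor", "tumor"]
print(ensemble_predict([ct, pet, petct]))  # ['tumor', 'tumor', 'tumor']
```

With an odd number of models and two classes, the plurality vote always produces a decision without ties, which is one practical reason for combining exactly three CNNs.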

  19. Computational substrates of social value in interpersonal collaboration.

    PubMed

    Fareri, Dominic S; Chang, Luke J; Delgado, Mauricio R

    2015-05-27

    Decisions to engage in collaborative interactions require enduring considerable risk, yet provide the foundation for building and maintaining relationships. Here, we investigate the mechanisms underlying this process and test a computational model of social value to predict collaborative decision making. Twenty-six participants played an iterated trust game and chose to invest more frequently with their friends compared with a confederate or computer despite equal reinforcement rates. This behavior was predicted by our model, which posits that people receive a social value reward signal from reciprocation of collaborative decisions conditional on the closeness of the relationship. This social value signal was associated with increased activity in the ventral striatum and medial prefrontal cortex, which significantly predicted the reward parameters from the social value model. Therefore, we demonstrate that the computation of social value drives collaborative behavior in repeated interactions and provide a mechanistic account of reward circuit function instantiating this process. Copyright © 2015 the authors 0270-6474/15/358170-11$15.00/0.
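As a loose illustration of the idea that the reciprocation reward signal is modulated by relationship closeness (the functional form, update rule, and parameters here are hypothetical stand-ins, not the authors' fitted model):

```python
def simulate_social_value(outcomes, closeness, alpha=0.2):
    """Toy sketch of a social-value learning signal: the reward from a
    reciprocated (or defected) trial is scaled by relationship closeness,
    and expected value is updated with a Rescorla-Wagner-style rule."""
    v = 0.0
    values = []
    for reciprocated in outcomes:
        reward = closeness * (1.0 if reciprocated else -1.0)
        v += alpha * (reward - v)  # prediction-error update
        values.append(v)
    return values

# Identical reinforcement histories, different closeness: the friend partner
# accrues a larger social value signal than the computer partner.
friend = simulate_social_value([True, True, False, True], closeness=1.0)
computer = simulate_social_value([True, True, False, True], closeness=0.25)
```

This captures the paper's central claim qualitatively: with equal reinforcement rates, a closer partner generates a larger value signal, biasing investment toward friends.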

  20. Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric

    2014-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.

  1. IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa

    One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants’ activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the next three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-savings measures, policies, and techniques.

  2. IEA EBC annex 53: Total energy use in buildings—Analysis and evaluation methods

    DOE PAGES

    Yoshino, Hiroshi; Hong, Tianzhen; Nord, Natasa

    2017-07-18

    One of the most significant barriers to achieving deep building energy efficiency is a lack of knowledge about the factors determining energy use. In fact, there is often a significant discrepancy between designed and real energy use in buildings, which is poorly understood but is believed to have more to do with the role of human behavior than building design. Building energy use is mainly influenced by six factors: climate, building envelope, building services and energy systems, building operation and maintenance, occupants’ activities and behavior, and indoor environmental quality. In the past, much research focused on the first three factors. However, the next three human-related factors can have an influence as significant as the first three. Annex 53 employed an interdisciplinary approach, integrating building science, architectural engineering, computer modeling and simulation, and social and behavioral science to develop and apply methods to analyze and evaluate the real energy use in buildings considering the six influencing factors. Finally, outcomes from Annex 53 improved understanding and strengthened knowledge regarding the robust prediction of total energy use in buildings, enabling reliable quantitative assessment of energy-savings measures, policies, and techniques.

  3. On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things

    NASA Astrophysics Data System (ADS)

    Huang, Chao

    2017-12-01

    Cloud computing and the internet of things are two new concepts of the information era; although they are defined differently, they are closely related. Building a smart campus by virtue of cloud computing, the internet of things and other internet technologies is a new way to realize leap-forward development of a campus. Centering on the construction of a smart campus, this paper analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the internet of things.

  4. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  5. The impact of urban open space and 'lift-up' building design on building intake fraction and daily pollutant exposure in idealized urban models.

    PubMed

    Sha, Chenyuan; Wang, Xuemei; Lin, Yuanyuan; Fan, Yifan; Chen, Xi; Hang, Jian

    2018-08-15

    Sustainable urban design is an effective way to improve urban ventilation and reduce urban residents' exposure to vehicular pollutants. This paper investigated the impacts of urban open space and 'lift-up' building design on vehicular CO (carbon monoxide) exposure in typical three-dimensional (3D) urban canopy layer (UCL) models under neutral atmospheric conditions. The building intake fraction (IF) represents the fraction of total vehicular pollutant emissions inhaled by residents when they stay at home. The building daily CO exposure (E_t) represents the extent of residents' contact with CO over one day indoors at home. Computational fluid dynamics (CFD) simulations integrating these two concepts were performed to solve the turbulent flow and assess vehicular CO exposure for urban residents. The CFD technique with the standard k-ε model was successfully validated against wind tunnel data. The initial numerical UCL model consists of 5-row and 5-column (5×5) cubic buildings (building height H = street width W = 30 m) with four approaching wind directions (θ=0°, 15°, 30°, 45°). In Group I, one of the 25 building models is removed to attain urban open space settings. In Group II, the first floor (Lift-up1), second floor (Lift-up2), or third floor (Lift-up3) of all buildings is elevated to create wind pathways through the buildings. Compared to the initial case, urban open space can slightly or significantly reduce pollutant exposure for urban residents. At θ=30° and 45°, open space settings are more effective at reducing pollutant exposure than at θ=0° and 15°. The pollutant dilution near or surrounding the open space and in its adjacent downstream regions is usually enhanced. Lift-up1 and Lift-up2 achieve much greater pollutant exposure reduction in all wind directions than Lift-up3 and open space.
Although further investigations are still required to provide practical guidelines, this study is one of the first attempts for reducing urban pollutant exposure by improving urban design. Copyright © 2018. Published by Elsevier B.V.
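The building intake fraction — mass of pollutant inhaled by residents per mass emitted — can be sketched directly from its definition. All numbers below are toy, hypothetical values, not the study's CFD results:

```python
def intake_fraction(concentrations, occupants, breathing_rate, emission_rate):
    """Building intake fraction: mass inhaled by residents per mass emitted.
    concentrations: time-averaged indoor CO concentration per floor (kg/m^3);
    occupants: number of people per floor;
    breathing_rate: volumetric breathing rate per person (m^3/s);
    emission_rate: total vehicular emission rate (kg/s)."""
    inhaled = sum(c * n * breathing_rate for c, n in zip(concentrations, occupants))
    return inhaled / emission_rate

# Toy three-floor building: elevating the first floor (lift-up design) is assumed
# here to dilute street-level air reaching the lowest floor.
base = intake_fraction([8e-6, 5e-6, 3e-6], [40, 40, 40], 1.0e-4, 0.1)
lift_up1 = intake_fraction([4e-6, 4e-6, 3e-6], [40, 40, 40], 1.0e-4, 0.1)
```

Because IF is a ratio of inhaled to emitted mass, it lets different urban designs be compared on exposure per unit of traffic emission rather than on raw concentrations.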

  6. DSB Task Force on Cyber Supply Chain

    DTIC Science & Technology

    2017-02-06

    3.4 Cybersecurity for Commercial and Open Source Components ... Communications and Intelligence; ASD(L&MR): Assistant Secretary of Defense for Logistics and Materiel Readiness; ASD(R&E): Assistant Secretary of Defense ... BSIMM: Building Security in Maturity Model; C4ISR: command, control, communications, computers, intelligence, surveillance and

  7. A Systematic Approach to Simulating Metabolism in Computational Toxicology. I. The Times Heuristic Modeling Framework

    EPA Science Inventory

    This paper presents a new system for automated 2D-3D migration of chemicals in large databases with conformer multiplication. The main advantages of this system are its straightforward performance, reasonable execution time, simplicity, and applicability to building large 3D che...

  8. EPA Project Updates: DSSTox and ToxCast Generating New Data and Data Linkages for Use in Predictive Modeling

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...

  9. Model-Based Building Verification in Aerial Photographs.

    DTIC Science & Technology

    1987-09-01

    In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and

  10. Evaluating Outdoor Experiential Training for Leadership and Team Building.

    ERIC Educational Resources Information Center

    Williams, Scott D.; Graham, T. Scott; Baker, Bud

    2003-01-01

    Presents a model for calculating the return on investment in outdoor experiential training that focuses on pre- and posttraining behavior and business performance. Includes a method for converting data on turnover, absenteeism, productivity, quality, and job performance into monetary values to compute return. (Contains 54 references.) (SK)

  11. EXAMPLE APPLICATION OF CFD SIMULATIONS FOR SHORT-RANGE ATMOSPHERIC DISPERSION OVER THE OPEN FIELDS OF PROJECT PRAIRIE GRASS

    EPA Science Inventory

    Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...

  12. Building Virtual Models by Postprocessing Radiology Images: A Guide for Anatomy Faculty

    ERIC Educational Resources Information Center

    Tam, Matthew D. B. S.

    2010-01-01

    Radiology and radiologists are recognized as increasingly valuable resources for the teaching and learning of anatomy. State-of-the-art radiology department workstations with industry-standard software applications can provide exquisite demonstrations of anatomy, pathology, and more recently, physiology. Similar advances in personal computers and…

  13. Comparison of different approaches of modelling in a masonry building

    NASA Astrophysics Data System (ADS)

    Saba, M.; Meloni, D.

    2017-12-01

    The present work aims to model a simple masonry building through two different modelling methods in order to assess their validity in terms of the evaluation of static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri by S.T.A. Data S.r.l. and Sismicad12 by Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements (FME) method, which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results at a greater computational cost. Remarkable differences in the static stresses between the two approaches were found for such a simple structure, and a comparison and analysis of the reasons is proposed.

  14. Nonparametric Bayesian Modeling for Automated Database Schema Matching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferragut, Erik M; Laska, Jason A

    2015-01-01

    The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
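    The comparison the abstract describes can be sketched as a Bayes-factor test: compare the marginal likelihood that a single model generated both fields against the likelihood of two separate models. The sketch below uses a Dirichlet-multinomial model over field values as one simple stand-in for the paper's (unspecified) nonparametric Bayesian models; the function names `log_marginal` and `match_score` are illustrative, not from the paper.

    ```python
    from collections import Counter
    from math import lgamma

    def log_marginal(counts, alpha=1.0):
        """Log marginal likelihood of categorical data under a symmetric
        Dirichlet prior (Dirichlet-multinomial), given per-category counts."""
        n = sum(counts.values())
        k = len(counts)
        ll = lgamma(k * alpha) - lgamma(k * alpha + n)
        for c in counts.values():
            ll += lgamma(alpha + c) - lgamma(alpha)
        return ll

    def match_score(field_a, field_b, alpha=1.0):
        """Log Bayes factor: one shared model generating both fields
        vs. two independent models (higher means better match)."""
        ca, cb = Counter(field_a), Counter(field_b)
        combined = ca + cb
        vocab = set(combined)  # align categories across both fields
        def pad(c):
            return {v: c.get(v, 0) for v in vocab}
        joint = log_marginal(pad(combined), alpha)
        separate = log_marginal(pad(ca), alpha) + log_marginal(pad(cb), alpha)
        return joint - separate
    ```

    Fields drawn from the same distribution score higher than fields with disjoint vocabularies, which is the basis for ranking candidate field pairings.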

  15. BIM based virtual environment for fire emergency evacuation.

    PubMed

    Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider, working with virtual reality technologies to build an adaptable immersive serious game environment that provides real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious-game-based virtual reality (VR) environment, aimed at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.
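    As one illustration of the kind of real-time guidance such a system could compute (not the paper's actual implementation), a minimal sketch can treat BIM spaces as a room-adjacency graph and search for the shortest evacuation route to an exit while avoiding rooms flagged as on fire:

    ```python
    from collections import deque

    def evacuation_route(adjacency, start, exits, blocked):
        """Breadth-first search over a room-adjacency graph (e.g. spaces
        extracted from a BIM model) to the nearest exit, avoiding rooms
        flagged as on fire. Returns the route as a list of rooms, or None."""
        if start in blocked:
            return None
        prev = {start: None}  # room -> predecessor on the shortest route
        queue = deque([start])
        while queue:
            room = queue.popleft()
            if room in exits:
                route = []
                while room is not None:  # walk predecessors back to start
                    route.append(room)
                    room = prev[room]
                return route[::-1]
            for neighbour in adjacency.get(room, ()):
                if neighbour not in prev and neighbour not in blocked:
                    prev[neighbour] = room
                    queue.append(neighbour)
        return None  # no safe route exists
    ```

    Re-running the search whenever the fire-sensor state changes is one simple way to keep guidance current as conditions evolve.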

  16. A Selective Bibliography of Building Environment and Service Systems with Particular Reference to Computer Applications. Computer Report CR20.

    ERIC Educational Resources Information Center

    Forwood, Bruce S.

    This bibliography has been produced as part of a research program attempting to develop a new approach to building environment and service systems design using computer-aided design techniques. As such it not only classifies available literature on the service systems themselves, but also contains sections on the application of computers and…

  17. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states that assumes all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an affirmative answer was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admits efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation, which is crucial for this model. We include a detailed analysis of the arithmetic complexity of solving the equations arising in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  18. Quantifying Earthquake Collapse Risk of Tall Steel Braced Frame Buildings Using Rupture-to-Rafters Simulations

    NASA Astrophysics Data System (ADS)

    Mourhatch, Ramses

    This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas earthquakes. Two key advancements in this work are (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an artificial-correction-free approach to broadband ground motion simulation. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis. As a first step, kinematic source inversions of past earthquakes in the magnitude range 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake, a 30-year occurrence probability is calculated, and a rational method is presented to redistribute the forecast earthquake probabilities from UCERF to the simulated scenario earthquakes; its inner workings are illustrated through an example involving earthquakes on the San Andreas fault in southern California. Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on long-period (> 2.0 s) synthetics computed from the kinematic source models using the spectral element method. Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced frame building, designed for three soil types using the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. The results are combined with the 30-year occurrence probabilities of the San Andreas scenario earthquakes, using the PEER performance-based earthquake engineering framework, to determine the probability of exceedance of these limit states over the next 30 years.
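    The final combination step, turning per-scenario 30-year occurrence probabilities and simulated limit-state outcomes into a 30-year exceedance probability, can be sketched as follows. This assumes independent scenario occurrences and is only a simplified stand-in for the thesis's full PEER-framework computation:

    ```python
    def exceedance_probability(scenarios):
        """scenarios: list of (p_occurrence_30yr, exceeds_limit_state).
        Treating scenario occurrences as independent, returns the probability
        that at least one scenario driving the building past the limit state
        occurs within the 30-year window."""
        p_none = 1.0
        for p_occ, exceeds in scenarios:
            if exceeds:
                p_none *= (1.0 - p_occ)
        return 1.0 - p_none
    ```

    For example, two exceeding scenarios with 30-year probabilities 0.1 and 0.2 give an exceedance probability of 1 − 0.9 · 0.8 = 0.28, regardless of scenarios that leave the building below the limit state.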

  19. A toolbox to visualise benefits resulting from flood hazard mitigation

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Heiser, Micha

    2017-04-01

    In order to visualise the benefits resulting from technical mitigation, a toolbox was developed in an open-source environment that allows an assessment of gains and losses for buildings exposed to flood hazards. Starting from different scenarios showing the changes in flood magnitude under the considered management options, the computation is based on the number and value of the buildings exposed as well as their vulnerability, following the general concept of risk assessment. As a result, beneficiaries of risk reduction may be identified and, more generally, different mitigation options may be strategically evaluated with respect to the amount of risk reduction for the different elements exposed. As such, multiple management options can be ranked according to their costs and benefits, and in order of priority. A relational database composed of different modules was created to mirror the requirements of an open-source application and to accommodate future dynamics in data availability as well as the spatiotemporal dynamics of these data (Fuchs et al. 2013). An economic module computes the monetary value of the buildings exposed using (a) the building footprint, (b) information from the building cadastre such as building type, number of storeys and utilisation, and (c) regionally averaged construction costs. An exposition module connects the spatial GIS information (X and Y coordinates) of the elements at risk to the hazard information in order to derive exposure. An impact module links this information to vulnerability functions (Totschnig and Fuchs 2013; Papathoma-Köhle et al. 2015) to obtain the monetary level of risk for every building exposed. These values are finally computed before and after the implementation of a mitigation measure to show gains and losses, and are visualised. The results can be exported as spreadsheets for further computation.
    References
    Fuchs S, Keiler M, Sokratov SA, Shnyparkov A (2013) Spatiotemporal dynamics: the need for an innovative approach in mountain hazard risk management. Natural Hazards 68(3):1217-1241
    Papathoma-Köhle M, Zischg A, Fuchs S, Glade T, Keiler M (2015) Loss estimation for landslides in mountain areas - An integrated toolbox for vulnerability assessment and damage documentation. Environmental Modelling and Software 63:156-169
    Totschnig R, Fuchs S (2013) Mountain torrents: quantifying vulnerability and assessing uncertainties. Engineering Geology 155:31-44
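    The economic and impact modules described above can be sketched as below. The unit cost, the linear vulnerability curve, and the water depths are purely illustrative assumptions, not the toolbox's actual functions or data:

    ```python
    def building_value(footprint_m2, storeys, cost_per_m2):
        """Economic module: monetary value from footprint, number of
        storeys, and a regionally averaged construction cost."""
        return footprint_m2 * storeys * cost_per_m2

    def vulnerability(depth_m):
        """Toy degree-of-loss curve (0..1) rising linearly with water
        depth and saturating at 3 m (illustrative only)."""
        return min(1.0, max(0.0, depth_m / 3.0))

    def risk(value, depth_m):
        """Impact module: monetary risk = value x degree of loss."""
        return value * vulnerability(depth_m)

    def mitigation_benefit(footprint_m2, storeys, cost_per_m2,
                           depth_before, depth_after):
        """Gain from mitigation: risk before minus risk after."""
        v = building_value(footprint_m2, storeys, cost_per_m2)
        return risk(v, depth_before) - risk(v, depth_after)
    ```

    A two-storey building with a 100 m² footprint at 1500 per m² is worth 300,000; lowering the design-flood depth from 1.5 m to 0.3 m reduces its risk from 150,000 to 30,000, a benefit of 120,000 attributable to the mitigation measure.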

  20. Optimizing ion channel models using a parallel genetic algorithm on graphical processors.

    PubMed

    Ben-Shalom, Roy; Aviv, Amit; Razon, Benjamin; Korngreen, Alon

    2012-01-01

    We have recently shown that models of voltage-gated ion channels can be semi-automatically constrained by combining a stochastic search algorithm with ionic currents measured using multiple voltage-clamp protocols. Although numerically successful, this approach is computationally highly demanding, with optimization on a high-performance Linux cluster typically lasting several days. To remove this computational bottleneck we ported our optimization algorithm to a graphics processing unit (GPU) using NVIDIA's CUDA. Parallelizing the process on a Fermi GPU from NVIDIA increased the speed ∼180 times over an application running on an 80-node Linux cluster, considerably reducing simulation times. This application allows users to optimize models of ion channel kinetics on a single, inexpensive desktop "supercomputer," greatly reducing the time and cost of building models relevant to neuronal physiology. We also demonstrate that the point at which the algorithm is parallelized is crucial to its performance: we substantially reduced computing time by solving the ODEs (ordinary differential equations) on the GPU, massively reducing memory transfers to and from it. This approach may be applied to speed up other data-intensive applications requiring iterative solutions of ODEs. Copyright © 2012 Elsevier B.V. All rights reserved.
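    The inner loop the paper parallelizes, repeated ODE integration of channel kinetics inside a stochastic search, can be illustrated with a serial toy version that fits the rate constants of a single two-state gate, dm/dt = α(1 − m) − βm, with a simple genetic algorithm. The model, parameter ranges, and GA settings here are illustrative only and are not the paper's optimizer:

    ```python
    import random

    def simulate(alpha, beta, t_end=1.0, dt=0.01):
        """Euler integration of a two-state gate: dm/dt = alpha*(1-m) - beta*m."""
        m, trace = 0.0, []
        for _ in range(int(t_end / dt)):
            m += dt * (alpha * (1.0 - m) - beta * m)
            trace.append(m)
        return trace

    def cost(params, target):
        """Sum-of-squares mismatch between a simulated and a target trace."""
        return sum((a - b) ** 2 for a, b in zip(simulate(*params), target))

    def optimize(target, pop_size=40, generations=60, seed=0):
        """Toy genetic algorithm: truncation selection plus Gaussian mutation.
        Each fitness evaluation requires a full ODE integration, which is why
        moving this loop onto the GPU pays off."""
        rng = random.Random(seed)
        pop = [(rng.uniform(0.1, 10.0), rng.uniform(0.1, 10.0))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: cost(p, target))
            parents = pop[: pop_size // 4]  # keep the best quarter (elitism)
            pop = parents + [
                (max(0.01, p[0] + rng.gauss(0, 0.3)),
                 max(0.01, p[1] + rng.gauss(0, 0.3)))
                for p in rng.choices(parents, k=pop_size - len(parents))
            ]
        return min(pop, key=lambda p: cost(p, target))
    ```

    Because every candidate evaluation is an independent ODE integration, the population can be scored in parallel; keeping the integration itself on the GPU avoids shipping state vectors back and forth on every step, which is the memory-transfer point the abstract emphasizes.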

Top