Science.gov

Sample records for building stock modelling

  1. U.S. Department of Energy Commercial Reference Building Models of the National Building Stock

    SciTech Connect

    Deru, M.; Field, K.; Studer, D.; Benne, K.; Griffith, B.; Torcellini, P.; Liu, B.; Halverson, M.; Winiarski, D.; Rosenberg, M.; Yazdanian, M.; Huang, J.; Crawley, D.

    2011-02-01

    The U.S. Department of Energy (DOE) Building Technologies Program has set the aggressive goal of producing marketable net-zero energy buildings by 2025. This goal will require collaboration between the DOE laboratories and the building industry. We developed standard or reference energy models for the most common commercial buildings to serve as starting points for energy efficiency research. These models represent fairly realistic buildings and typical construction practices. Fifteen commercial building types and one multifamily residential building were determined by consensus between DOE, the National Renewable Energy Laboratory, Pacific Northwest National Laboratory, and Lawrence Berkeley National Laboratory, and represent approximately two-thirds of the commercial building stock.

  2. Towards a Very Low Energy Building Stock: Modeling the U.S. Commercial Building Sector to Support Policy and Innovation Planning

    SciTech Connect

    Coffey, Brian; Borgeson, Sam; Selkowitz, Stephen; Apte, Josh; Mathew, Paul; Haves, Philip

    2009-07-01

    This paper describes the origin, structure and continuing development of a model of time varying energy consumption in the US commercial building stock. The model is based on a flexible structure that disaggregates the stock into various categories (e.g. by building type, climate, vintage and life-cycle stage) and assigns attributes to each of these (e.g. floor area and energy use intensity by fuel type and end use), based on historical data and user-defined scenarios for future projections. In addition to supporting the interactive exploration of building stock dynamics, the model has been used to study the likely outcomes of specific policy and innovation scenarios targeting very low future energy consumption in the building stock. Model use has highlighted the scale of the challenge of meeting targets stated by various government and professional bodies, and the importance of considering both new construction and existing buildings.
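
    The disaggregation idea can be sketched in a few lines: total stock energy is the sum over segments of floor area times energy use intensity, and scenarios scale the intensities of selected segments. The segment names and numbers below are illustrative placeholders, not values from the model described above.

```python
# Minimal building-stock disaggregation sketch. Segment keys, floor areas
# (million m2) and energy use intensities (kWh/m2/yr) are illustrative only.
stock = {
    ("office", "pre-1980"):  (900.0, 250.0),
    ("office", "post-2004"): (300.0, 140.0),
    ("retail", "pre-1980"):  (600.0, 300.0),
    ("retail", "post-2004"): (200.0, 180.0),
}

def stock_energy_gwh(stock, eui_scaling=None):
    """Annual energy (GWh) = sum of floor area x EUI, with optional per-segment scaling."""
    eui_scaling = eui_scaling or {}
    return sum(area * eui * eui_scaling.get(seg, 1.0) for seg, (area, eui) in stock.items())

baseline = stock_energy_gwh(stock)
deep_retrofit = stock_energy_gwh(stock, {("office", "pre-1980"): 0.6})  # 40% EUI cut
print(baseline, deep_retrofit)
```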

  3. Maintenance and Expansion: Modeling Material Stocks and Flows for Residential Buildings and Transportation Networks in the EU25.

    PubMed

    Wiedenhofer, Dominik; Steinberger, Julia K; Eisenmenger, Nina; Haas, Willi

    2015-08-01

    Material stocks are an important part of the social metabolism. Owing to long service lifetimes of stocks, they not only shape resource flows during construction, but also during use, maintenance, and at the end of their useful lifetime. This makes them an important topic for sustainable development. In this work, a model of stocks and flows for nonmetallic minerals in residential buildings, roads, and railways in the EU25, from 2004 to 2009 is presented. The changing material composition of the stock is modeled using a typology of 72 residential buildings, four road and two railway types, throughout the EU25. This allows for estimating the amounts of materials in in-use stocks of residential buildings and transportation networks, as well as input and output flows. We compare the magnitude of material demands for expansion versus those for maintenance of existing stock. Then, recycling potentials are quantitatively explored by comparing the magnitude of estimated input, waste, and recycling flows from 2004 to 2009 and in a business-as-usual scenario for 2020. Thereby, we assess the potential impacts of the European Waste Framework Directive, which strives for a significant increase in recycling. We find that in the EU25, consisting of highly industrialized countries, a large share of material inputs are directed at maintaining existing stocks. Proper management of existing transportation networks and residential buildings is therefore crucial for the future size of flows of nonmetallic minerals.
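
    A reduced stock-and-flow sketch of the maintenance-versus-expansion split described above, assuming a single material, a fixed service lifetime, and an exogenous stock scenario (all numbers are illustrative, not the EU25 data):

```python
import numpy as np

# Inflow-driven stock/flow sketch: outflow is the inflow of `lifetime` years ago,
# and the material input each year is stock expansion plus replacement of outflow.
lifetime = 50                                      # assumed fixed service life (years)
years = np.arange(2004, 2021)
stock_target = 300.0 + 2.0 * (years - years[0])    # Mt, assumed stock scenario

inflow_by_year = {y: 4.0 for y in range(2004 - lifetime, 2004)}  # assumed historic inflows (Mt/yr)
outflow_by_year = {}
for i, y in enumerate(years):
    outflow_by_year[y] = inflow_by_year.get(y - lifetime, 0.0)   # cohort reaching end of life
    expansion = 0.0 if i == 0 else stock_target[i] - stock_target[i - 1]
    inflow_by_year[y] = expansion + outflow_by_year[y]           # expansion + maintenance

maintenance_share = sum(outflow_by_year.values()) / sum(inflow_by_year[y] for y in years)
print(f"share of inputs needed just to maintain the existing stock: {maintenance_share:.0%}")
```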

  4. Maintenance and Expansion: Modeling Material Stocks and Flows for Residential Buildings and Transportation Networks in the EU25

    PubMed Central

    Steinberger, Julia K.; Eisenmenger, Nina; Haas, Willi

    2015-01-01

    Material stocks are an important part of the social metabolism. Owing to long service lifetimes of stocks, they not only shape resource flows during construction, but also during use, maintenance, and at the end of their useful lifetime. This makes them an important topic for sustainable development. In this work, a model of stocks and flows for nonmetallic minerals in residential buildings, roads, and railways in the EU25, from 2004 to 2009 is presented. The changing material composition of the stock is modeled using a typology of 72 residential buildings, four road and two railway types, throughout the EU25. This allows for estimating the amounts of materials in in-use stocks of residential buildings and transportation networks, as well as input and output flows. We compare the magnitude of material demands for expansion versus those for maintenance of existing stock. Then, recycling potentials are quantitatively explored by comparing the magnitude of estimated input, waste, and recycling flows from 2004 to 2009 and in a business-as-usual scenario for 2020. Thereby, we assess the potential impacts of the European Waste Framework Directive, which strives for a significant increase in recycling. We find that in the EU25, consisting of highly industrialized countries, a large share of material inputs are directed at maintaining existing stocks. Proper management of existing transportation networks and residential buildings is therefore crucial for the future size of flows of nonmetallic minerals. PMID:27524878

  5. A High-Granularity Approach to Modeling Energy Consumption and Savings Potential in the U.S. Residential Building Stock

    SciTech Connect

    2016-08-12

    Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance and qualification for incentives. Beyond individual building applications, larger scale applications (across the stock of buildings at various scales: national, regional and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings is typically simulated to predict the performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.
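
    The key point, preserving interdependencies, can be illustrated with a two-stage draw: sample one characteristic from its marginal distribution, then sample a dependent characteristic from the conditional distribution given the first. The probabilities below are invented for illustration, not taken from the survey data the paper combines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative marginal and conditional probabilities (not actual survey data).
p_vintage = {"pre-1960": 0.35, "1960-1999": 0.45, "post-2000": 0.20}
p_insulation_given_vintage = {
    "pre-1960":  {"poor": 0.60, "average": 0.30, "good": 0.10},
    "1960-1999": {"poor": 0.30, "average": 0.50, "good": 0.20},
    "post-2000": {"poor": 0.05, "average": 0.35, "good": 0.60},
}

def sample_building():
    """Draw one building, keeping the vintage-insulation dependency intact."""
    vintages, pv = zip(*p_vintage.items())
    vintage = rng.choice(vintages, p=pv)
    levels, pl = zip(*p_insulation_given_vintage[vintage].items())
    insulation = rng.choice(levels, p=pl)
    return {"vintage": vintage, "insulation": insulation}

buildings = [sample_building() for _ in range(100_000)]
print(buildings[:3])
```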

  6. 8. Detail of viaduct, livestock exchange building to left, stock ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Detail of viaduct, livestock exchange building to left, stock yards autopark right. View to north. - South Omaha Union Stock Yards, Buckingham Road Viaduct, Twenty-ninth Street spanning Stockyard Cattle Pens, Omaha, Douglas County, NE

  7. FOUNDRY BUILDING SCAPE SOUTH-SOUTHWEST FROM MALLEABLE STOCK YARD CRANE SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FOUNDRY BUILDING SCAPE SOUTH-SOUTHWEST FROM MALLEABLE STOCK YARD CRANE SHOWING VALVE ASSEMBLY BUILDINGS AND DISTANT ROOF OF THE SHIPPING AND STORAGE BUILDING. - Stockham Pipe & Fittings Company, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL

  8. Building generalized tree mass/volume component models for improved estimation of forest stocks and utilization potential

    Treesearch

    David W. MacFarlane

    2015-01-01

    Accurately assessing forest biomass potential is contingent upon having accurate tree biomass models to translate data from forest inventories. Building generality into these models is especially important when they are to be applied over large spatial domains, such as regional, national and international scales. Here, new, generalized whole-tree mass / volume...

  9. 37. Exterior view of main yard. Stock room building (left), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. Exterior view of main yard. Stock room building (left), old machine shop (center), steel fabrication building (right), and traveling yard crane (middle fore). - Barbour Boat Works, Tryon Palace Drive, New Bern, Craven County, NC

  10. Evolutionary model of stock markets

    NASA Astrophysics Data System (ADS)

    Kaldasch, Joachim

    2014-12-01

    The paper presents an evolutionary economic model for the price evolution of stocks. Treating a stock market as a self-organized system governed by a fast purchase process and slow variations of demand and supply, the model suggests that the short term price distribution has the form of a logistic (Laplace) distribution. The long term return can be described by Laplace-Gaussian mixture distributions. The long term mean price evolution is governed by a Walrus equation, which can be transformed into a replicator equation. This allows quantifying the evolutionary price competition between stocks. The theory suggests that stock prices scaled by the price over all stocks can be used to investigate long-term trends in a Fisher-Pry plot. The price competition that follows from the model is illustrated by examining the empirical long-term price trends of two stocks.
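
    The replicator-equation idea can be illustrated with two competing stocks: if x is the scaled price share of stock 1 and dx/dt = x(1 - x)(f1 - f2), then log(x / (1 - x)) is linear in time, which is exactly what a Fisher-Pry plot displays. The fitness values below are invented for illustration and are not derived from the paper's model.

```python
import numpy as np

def replicator_share(x0=0.3, f1=0.06, f2=0.04, steps=200, dt=1.0):
    """Euler integration of dx/dt = x(1-x)(f1 - f2) for the share of stock 1."""
    x = np.empty(steps)
    x[0] = x0
    for t in range(1, steps):
        x[t] = x[t - 1] + dt * x[t - 1] * (1.0 - x[t - 1]) * (f1 - f2)
    return x

share = replicator_share()
# In a Fisher-Pry plot the logit of the share is a straight line whose slope
# is the fitness difference f1 - f2 (0.02 here, up to discretization error).
slope = np.polyfit(np.arange(share.size), np.log(share / (1.0 - share)), 1)[0]
print(round(slope, 4))
```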

  11. Stochastic dynamical model for stock-stock correlations.

    PubMed

    Ma, Wen-Jong; Hu, Chin-Kun; Amritkar, Ravindra E

    2004-08-01

    We propose a model of coupled random walks for stock-stock correlations. The walks in the model are coupled via a mechanism in which the displacement (price change) of each walk (stock) is activated by the price gradients over some underlying network. We assume that the network has two underlying structures, describing the correlations among the stocks of the whole market and among those within individual groups, respectively, each with a coupling parameter controlling the degree of correlation. The model provides an interpretation of the features displayed in the distribution of the eigenvalues of the correlation matrix of a real market on the level of time sequences. We verify that such modeling indeed gives a good fit to the market data of US stocks.
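
    A toy version of the coupled-walk mechanism, assuming a fully connected market with one stronger-coupled group structure (coupling strengths and sizes are illustrative, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Each stock's price change is its own noise plus price gradients to the other
# stocks; within-group couplings are stronger than market-wide ones.
n, steps = 20, 5000
g_market, g_group = 0.005, 0.05
group = np.repeat([0, 1], n // 2)

A = np.full((n, n), g_market)
A[np.equal.outer(group, group)] += g_group
np.fill_diagonal(A, 0.0)

x = np.zeros(n)
changes = np.empty((steps, n))
for t in range(steps):
    gradient = A @ x - A.sum(axis=1) * x          # sum_j A_ij * (x_j - x_i)
    dx = gradient + rng.normal(0.0, 1.0, size=n)
    x += dx
    changes[t] = dx

# The largest eigenvalues of the correlation matrix reflect the market mode
# and the group structure, qualitatively as discussed above.
eigvals = np.linalg.eigvalsh(np.corrcoef(changes.T))
print(np.round(eigvals[-3:], 2))
```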

  12. Build and Stock the Essential Tool Box

    ERIC Educational Resources Information Center

    Vickers, Ron

    2013-01-01

    Last summer, the author finally acted on an idea that had been banging around in his head for the last couple of years. He decided to build his son a toolbox and equip it with the basic set of tools he'd need as a future homeowner. Thinking that other technology educators and their students might be interested in the project, the author describes…

  13. LETTER: Synchronization model for stock market asymmetry

    NASA Astrophysics Data System (ADS)

    Donangelo, Raul; Jensen, Mogens H.; Simonsen, Ingve; Sneppen, Kim

    2006-11-01

    The waiting time needed for a stock market index to undergo a given percentage change in its value is found to have an up-down asymmetry, which, surprisingly, is not observed for the individual stocks composing that index. To explain this, we introduce a market model consisting of randomly fluctuating stocks that occasionally synchronize their short term draw-downs. These synchronous events are parametrized by a 'fear factor' that reflects the occurrence of dramatic external events which affect the financial market.

  14. Power law models of stock indices

    NASA Astrophysics Data System (ADS)

    Tse, Man Kit

    Viewing the stock market as a self-organized system, Sornette and Johansen introduced physics-based models to study the dynamics of stock market crashes from the perspective of complex systems. This involved modeling stock market indices using a mathematical power law exhibiting log-periodicity as the system approaches a market crash, which acts like a critical point in a thermodynamic system. In this dissertation, I aim to investigate stock indices to determine whether or not they exhibit log-periodic oscillations, according to the models proposed by Sornette, as they approach a crash. In addition to analyzing stock market crashes in the frequency domain using the discrete Fourier transform and the Lomb-Scargle periodogram, I perform a detailed analysis of the stock market crash models through parameter estimation and model testing. I find that the probability landscapes have a complex topography and that there is very little evidence that these phase transition-based models accurately describe stock market crashes.

  15. Technology Prioritization: Transforming the U.S. Building Stock to Embrace Energy Efficiency

    SciTech Connect

    Abdelaziz, Omar; Farese, Philip; Abramson, Alexis; Phelan, Patrick

    2013-01-01

    The U.S. buildings sector is responsible for about 40% of the national energy expenditures. This is due in part to wasteful use of resources and limited consideration of energy efficiency during the design and retrofit phases. Recent studies have indicated the potential for up to 30-50% energy savings in the U.S. buildings sector using currently available technologies. This paper discusses efforts to accelerate the transformation of the U.S. building energy efficiency sector using a new technology prioritization framework. The underlying analysis examines building energy use micro-segments using the Energy Information Administration Annual Energy Outlook and other publicly available information. The tool includes a stock-and-flow model to track stock vintage and efficiency levels over time. The tool can be used to investigate energy efficiency measures under a variety of scenarios and has a built-in energy accounting framework to prevent double counting of energy savings within any given portfolio. This tool is developed to inform decision making and estimate long term potential energy savings for different market adoption scenarios.

  16. An autocatalytic network model for stock markets

    NASA Astrophysics Data System (ADS)

    Caetano, Marco Antonio Leonel; Yoneyama, Takashi

    2015-02-01

    The stock prices of companies with businesses that are closely related within a specific sector of the economy might exhibit movement patterns and correlations in their dynamics. The idea in this work is to use the concept of autocatalytic network to model such correlations and patterns in the trends exhibited by the expected returns. The trends are expressed in terms of positive or negative returns within each fixed time interval. The time series derived from these trends is then used to represent the movement patterns by a probabilistic Boolean network with transitions modeled as an autocatalytic network. The proposed method might be of value in short term forecasting and identification of dependencies. The method is illustrated with a case study based on four stocks of companies in the field of natural resources and technology.

  17. Building the Stock of College-Educated Labor Revisited

    ERIC Educational Resources Information Center

    Sjoquist, David L.; Winters, John V.

    2012-01-01

    In a recent paper in the "Journal of Human Resources," Dynarski (2008) used data from the 1 percent 2000 Census Public Use Microdata Sample (PUMS) files to demonstrate that merit scholarship programs in Georgia and Arkansas increased the stock of college-educated individuals in those states. This paper replicates the results in Dynarski…

  18. SUSY GUT Model Building

    SciTech Connect

    Raby, Stuart

    2008-11-23

    In this talk I discuss the evolution of SUSY GUT model building as I see it. Starting with 4 dimensional model building, I then consider orbifold GUTs in 5 dimensions and finally orbifold GUTs embedded into the E8 x E8 heterotic string.

  19. Stochastic model for market stocks with floors

    NASA Astrophysics Data System (ADS)

    Villarroel, Javier

    2007-08-01

    We present a model to describe the stochastic evolution of stocks that show a strong resistance at some level and generalize to this situation the evolution based upon geometric Brownian motion. If volatility and drift are related in a certain way we show that our model can be integrated in an exact way. The related problem of how to prize general securities that pay dividends at a continuous rate and earn a terminal payoff at maturity T is solved via the martingale probability approach.

  20. Statistical pairwise interaction model of stock market

    NASA Astrophysics Data System (ADS)

    Bury, Thomas

    2013-03-01

    Financial markets are a classical example of complex systems, as they are composed of many interacting stocks. As such, we can obtain a surprisingly good description of their structure by making the rough simplification of binary daily returns. Spin glass models have been applied and gave some valuable results, but at the price of restrictive assumptions on the market dynamics, or they are agent-based models with rules designed in order to recover some empirical behaviors. Here we show that the pairwise model is actually a statistically consistent model with the observed first and second moments of the stocks' orientation, without making such restrictive assumptions. This is done with an approach based only on empirical data of price returns. Our data analysis of six major indices suggests that the actual interaction structure may be thought of as an Ising model on a complex network with interaction strengths scaling as the inverse of the system size. This has potentially important implications since many properties of such a model are already known and some techniques of the spin glass theory can be straightforwardly applied. Typical behaviors, such as multiple equilibria or metastable states, different characteristic time scales, spatial patterns, or order-disorder phenomena, could find an explanation in this picture.

  1. Rational GARCH model: An empirical test for stock returns

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2017-05-01

    We propose a new ARCH-type model that uses a rational function to capture the asymmetric response of volatility to returns, known as the "leverage effect". Using 10 individual stocks on the Tokyo Stock Exchange and two stock indices, we compare the new model with several other asymmetric ARCH-type models. We find that according to the deviance information criterion, the new model ranks first for several stocks. Results show that the proposed new model can be used as an alternative asymmetric ARCH-type model in empirical applications.

  2. Multivariate Markov chain modeling for stock markets

    NASA Astrophysics Data System (ADS)

    Maskawa, Jun-ichi

    2003-06-01

    We study a multivariate Markov chain model as a stochastic model of the price changes of portfolios in the framework of the mean field approximation. The time series of price changes are coded into sequences of up and down spins according to their signs. We start with the discussion for small portfolios consisting of two stock issues. The generalization of our model to a portfolio of arbitrary size is constructed by a recurrence relation. The resultant form of the joint probability of the stationary state coincides with the Gibbs measure assigned to each configuration of a spin glass model. Through the analysis of actual portfolios, it has been shown that the synchronization of the direction of the price changes is well described by the model.

  3. Quantum Brownian motion model for the stock market

    NASA Astrophysics Data System (ADS)

    Meng, Xiangyi; Zhang, Jian-Wei; Guo, Hong

    2016-06-01

    It is believed by the majority today that the efficient market hypothesis is imperfect because of market irrationality. Using the physical concepts and mathematical structures of quantum mechanics, we construct an econophysical framework for the stock market, based on which we analogously map massive numbers of single stocks into a reservoir consisting of many quantum harmonic oscillators and their stock index into a typical quantum open system: a quantum Brownian particle. In particular, the irrationality of stock transactions is quantitatively considered as the Planck constant within Heisenberg's uncertainty relationship of quantum mechanics in an analogous manner. We analyze real stock data from the Shanghai Stock Exchange of China and investigate fat-tail phenomena and non-Markovian behaviors of the stock index with the assistance of the quantum Brownian motion model, thereby interpreting and studying the limitations of the classical Brownian motion model for the efficient market hypothesis from a new perspective of quantum open system dynamics.

  4. Mitigation of CO2 emissions from the EU-15 building stock: beyond the EU Directive on the Energy Performance of Buildings.

    PubMed

    Petersdorff, Carsten; Boermans, Thomas; Harnisch, Jochen

    2006-09-01

    GOAL, SCOPE AND BACKGROUND: The European Directive on the Energy Performance of Buildings, which came into force on 16 December 2002, will be implemented in the legislation of Member States by 4 January 2006. In addition to the aim of improving the overall energy efficiency of new buildings, large existing buildings will become a target for improvement as soon as they undergo significant renovation. The building sector is responsible for about 40% of Europe's total end energy consumption, and hence this Directive is an important step for the European Union in order that it should reach the level of saving required by the Kyoto Agreement. In this the EU is committed to reduce CO2 emissions relative to the base year of 1990 by 8 per cent by 2010. But what will be the impact of the new Directive, and how large could the impacts be of extending the obligation for energy efficiency retrofitting towards smaller buildings? Can improvement of the insulation offset or reduce the growing energy consumption from the increasing installation of cooling installations? EURIMA, the European Insulation Manufacturers Association, and EuroACE, the European Alliance of Companies for Energy Efficiency in Buildings, asked Ecofys to address these questions. The effect of the EPB Directive on the emissions associated with the heating energy consumption of the total EU-15 building stock has been examined in a model calculation, using the Built Environment Analysis Model (BEAM), which was developed by Ecofys to investigate energy saving measures in the building stock. The great complexity of the EU-15 building stock had to be simplified by examining five standard buildings with eight insulation standards, which are assigned to building age and renovation status. Furthermore, three climatic regions (cold, moderate, warm) were distinguished for the calculation of the heating energy demand. This gave a basic set of 210 building types for which the heating energy demand and CO2 emissions from heating were

  5. Model for non-Gaussian intraday stock returns.

    PubMed

    Gerig, Austin; Vicente, Javier; Fuentes, Miguel A

    2009-12-01

    Stock prices are known to exhibit non-Gaussian dynamics, and there is much interest in understanding the origin of this behavior. Here, we present a model that explains the shape and scaling of the distribution of intraday stock price fluctuations (called intraday returns) and verify the model using a large database for several stocks traded on the London Stock Exchange. We provide evidence that the return distribution for these stocks is non-Gaussian and similar in shape and that the distribution appears stable over intraday time scales. We explain these results by assuming the volatility of returns is constant intraday but varies over longer periods such that its inverse square follows a gamma distribution. This produces returns that are Student distributed for intraday time scales. The predicted results show excellent agreement with the data for all stocks in our study and over all regions of the return distribution.
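
    The mixture mechanism in the abstract is easy to check by simulation: draw the inverse squared volatility from a gamma distribution for each day, generate Gaussian intraday returns with that day's volatility, and the pooled returns follow a Student's t distribution with degrees of freedom equal to twice the gamma shape parameter. The parameter values below are illustrative, not fits to the London Stock Exchange data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

n_days, n_intraday = 2000, 100
shape, scale = 2.5, 1.0                      # gamma parameters for 1/sigma^2 (illustrative)

inv_var = rng.gamma(shape, scale, size=n_days)
sigma = 1.0 / np.sqrt(inv_var)
returns = rng.normal(0.0, sigma[:, None], size=(n_days, n_intraday)).ravel()

# Standardized returns should follow Student's t with df = 2 * shape.
df = 2.0 * shape
ks = stats.kstest(returns * np.sqrt(shape * scale), stats.t(df).cdf)
print(df, round(ks.statistic, 4), round(ks.pvalue, 3))
```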

  6. Model for non-Gaussian intraday stock returns

    NASA Astrophysics Data System (ADS)

    Gerig, Austin; Vicente, Javier; Fuentes, Miguel A.

    2009-12-01

    Stock prices are known to exhibit non-Gaussian dynamics, and there is much interest in understanding the origin of this behavior. Here, we present a model that explains the shape and scaling of the distribution of intraday stock price fluctuations (called intraday returns) and verify the model using a large database for several stocks traded on the London Stock Exchange. We provide evidence that the return distribution for these stocks is non-Gaussian and similar in shape and that the distribution appears stable over intraday time scales. We explain these results by assuming the volatility of returns is constant intraday but varies over longer periods such that its inverse square follows a gamma distribution. This produces returns that are Student distributed for intraday time scales. The predicted results show excellent agreement with the data for all stocks in our study and over all regions of the return distribution.

  7. Statistical Analysis by Statistical Physics Model for the STOCK Markets

    NASA Astrophysics Data System (ADS)

    Wang, Tiansong; Wang, Jun; Fan, Bingli

    A new stochastic stock price model of stock markets based on the contact process of statistical physics systems is presented in this paper, where the contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and the data of the SZSE Component Index are analyzed, and the corresponding simulation is made by computer computation. Further, we investigate the statistical properties, fat-tail phenomena, the power-law distributions, and the long memory of returns for these indices. The techniques of skewness-kurtosis test, Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characters of the stock price returns.

  8. Models for U.S. Fish Stock Assessment

    NASA Astrophysics Data System (ADS)

    Bynes, K.

    2016-02-01

    A stock assessment is a statistical summary of a fish population that provides past and current information about the population and forecasts the status of the stock. The NOAA Fisheries' Species Information System (SIS) collects data pertaining to the stock assessments that are completed throughout the U.S. The system was standardized by categorizing the statistical models used in stock assessments across the U.S. The categories of statistical models were Statistical Catch-at-Age (SCAA), Statistical Catch-at-Length (SCAL), Biomass Dynamics, Index, Virtual Population Analysis (VPA), Equilibrium, and Unknown. An analytical summary was performed on the categories of approaches used to evaluate stock assessments. It was hypothesized that size- and age-based models would become more frequent over time and that approaches would differ by science center. By conducting linear regressions, it was determined that the frequency of size- and age-based models did increase with time; thus, the alternative hypothesis was supported. The frequency at which age-based models are conducted has a direct positive relationship with time. The frequency at which size-based models are used to assess stocks has an inverse relationship with time, which fails to reject the null hypothesis. Also, the approaches differed by science center, rejecting the null hypothesis. SCAA was overall the most frequent approach used across science centers to evaluate stock assessments. The northern regions rely heavily on SCAA and Index approaches to evaluate stocks, while the southern regions are more likely to use a variety of approaches.

  9. An explicit solution for calculating optimum spawning stock size from Ricker's stock recruitment model.

    PubMed

    Scheuerell, Mark D

    2016-01-01

    Stock-recruitment models have been used for decades in fisheries management as a means of formalizing the expected number of offspring that recruit to a fishery based on the number of parents. In particular, Ricker's stock recruitment model is widely used due to its flexibility and ease with which the parameters can be estimated. After model fitting, the spawning stock size that produces the maximum sustainable yield (S_MSY) to a fishery, and the harvest corresponding to it (U_MSY), are two of the most common biological reference points of interest to fisheries managers. However, to date there has been no explicit solution for either reference point because of the transcendental nature of the equation needed to solve for them. Therefore, numerical or statistical approximations have been used for more than 30 years. Here I provide explicit formulae for calculating both S_MSY and U_MSY in terms of the productivity and density-dependent parameters of Ricker's model.
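
    The explicit solution can be written with the Lambert W function. The sketch below assumes the common parameterization R = S * exp(a - b*S), where a is the log of productivity and b the density-dependence parameter; check the paper's own notation before reusing the formulae.

```python
import numpy as np
from scipy.special import lambertw

def ricker_reference_points(a, b):
    """S_MSY and U_MSY for R = S * exp(a - b*S), via the Lambert W function."""
    w = float(np.real(lambertw(np.exp(1.0 - a))))
    return (1.0 - w) / b, 1.0 - w            # S_MSY, U_MSY

a, b = 1.8, 0.001                            # illustrative parameter values
s_msy, u_msy = ricker_reference_points(a, b)

# Numerical cross-check: maximize yield Y(S) = R(S) - S on a fine grid.
S = np.linspace(1.0, 2.0 / b, 200_000)
Y = S * np.exp(a - b * S) - S
print(round(s_msy, 1), round(S[np.argmax(Y)], 1), round(u_msy, 3))
```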

  10. Energy demand of the German and Dutch residential building stock under climate change

    NASA Astrophysics Data System (ADS)

    Olonscheck, Mady; Holsten, Anne; Walther, Carsten; Kropp, Jürgen P.

    2014-05-01

    In order to mitigate climate change, extraordinary measures are necessary in the future. The building sector, in particular, offers considerable potential for transformation to lower energy demand. On a national level, however, successful and far-reaching measures will likely be taken only if reliable estimates regarding future energy demand under different scenarios are available. The energy demand for space heating and cooling is determined by a combination of behavioral, climatic, constructional, and demographic factors. For two countries, namely Germany and the Netherlands, we analyze the combined effect of future climate and building stock changes as well as renovation measures on the future energy demand for room conditioning of residential buildings until 2060. We show how much the heating energy demand will decrease in the future and answer the question of whether this decrease will be exceeded by an increase in cooling energy demand. Based on a sensitivity analysis, we determine those influencing factors with the largest impact on the future energy demand of the building stock. Both countries have national targets regarding the reduction of energy demand in the future. We provide relevant information concerning the annual renovation rates that are necessary to reach these targets. Retrofitting buildings is a win-win option, as it not only helps to mitigate climate change and lower the dependency on fossil fuels but also transforms the building stock into one that is better equipped for extreme temperatures that may occur more frequently with climate change. For the Netherlands, the study concentrates not only on the national but also the provincial level, which should facilitate directed policy measures. Moreover, the analysis is done on a monthly basis in order to gain a deeper understanding of future seasonal energy demand changes. Our approach constitutes an important first step towards deeper insights into the internal dynamics
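
    The heating-versus-cooling trade-off in the abstract can be illustrated with degree days: a warmer daily temperature series lowers heating degree days and raises cooling degree days. The synthetic climate, warming offset, and base temperatures below are illustrative assumptions, not the study's climate scenarios.

```python
import numpy as np

rng = np.random.default_rng(3)

days = np.arange(365)
t_today = 10.0 + 10.0 * np.sin(2.0 * np.pi * (days - 110) / 365.0) + rng.normal(0.0, 3.0, 365)
t_future = t_today + 2.5                      # simple uniform warming assumption (K)

def heating_degree_days(t, base=18.0):
    return float(np.maximum(base - t, 0.0).sum())

def cooling_degree_days(t, base=22.0):
    return float(np.maximum(t - base, 0.0).sum())

print(heating_degree_days(t_today), heating_degree_days(t_future))
print(cooling_degree_days(t_today), cooling_degree_days(t_future))
```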

  11. Portfolio optimization for index tracking modelling in Malaysia stock market

    NASA Astrophysics Data System (ADS)

    Siew, Lam Weng; Jaaman, Saiful Hafizah; Ismail, Hamizun

    2016-06-01

    Index tracking is an investment strategy in portfolio management which aims to construct an optimal portfolio that generates a mean return similar to the stock market index mean return without purchasing all of the stocks that make up the index. The objective of this paper is to construct an optimal portfolio using an optimization model which adopts a regression approach in tracking the benchmark stock market index return. In this study, the data consist of weekly prices of stocks in the Malaysian market index, namely the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI), from January 2010 until December 2013. The results of this study show that the optimal portfolio is able to track the FBMKLCI Index at a minimum tracking error of 1.0027% with 0.0290% excess mean return over the mean return of the FBMKLCI Index. The significance of this study is the construction of the optimal portfolio using an optimization model which adopts a regression approach in tracking the stock market index without purchasing all index components.
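
    A minimal regression-based index-tracking sketch on synthetic weekly returns (the random data and equal-weight index below stand in for the FBMKLCI constituents; the paper's model and constraints are richer than this):

```python
import numpy as np

rng = np.random.default_rng(4)

n_weeks, n_stocks, n_selected = 200, 30, 8
stock_returns = rng.normal(0.002, 0.03, size=(n_weeks, n_stocks))
index_returns = stock_returns.mean(axis=1)          # pretend index = equal-weighted average

# Regress the index return on a small subset of stocks; the rescaled
# coefficients are the weights of the tracking portfolio.
subset = rng.choice(n_stocks, size=n_selected, replace=False)
X = stock_returns[:, subset]
beta, *_ = np.linalg.lstsq(X, index_returns, rcond=None)
weights = beta / beta.sum()

tracking_error = float(np.std(X @ weights - index_returns))
excess_mean = float((X @ weights - index_returns).mean())
print(subset, np.round(weights, 3), tracking_error, excess_mean)
```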

  12. Multilayer Stock Forecasting Model Using Fuzzy Time Series

    PubMed Central

    Javedani Sadaei, Hossein; Lee, Muhammad Hisyam

    2014-01-01

    After reviewing the vast body of literature on using FTS in stock market forecasting, certain deficiencies are distinguished in the hybridization of findings. In addition, the lack of constructive systematic framework, which can be helpful to indicate direction of growth in entire FTS forecasting systems, is outstanding. In this study, we propose a multilayer model for stock market forecasting including five logical significant layers. Every single layer has its detailed concern to assist forecast development by reconciling certain problems exclusively. To verify the model, a set of huge data containing Taiwan Stock Index (TAIEX), National Association of Securities Dealers Automated Quotations (NASDAQ), Dow Jones Industrial Average (DJI), and S&P 500 have been chosen as experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasts using FTS. PMID:24605058

  13. Multilayer stock forecasting model using fuzzy time series.

    PubMed

    Javedani Sadaei, Hossein; Lee, Muhammad Hisyam

    2014-01-01

    After reviewing the vast body of literature on using FTS in stock market forecasting, certain deficiencies are distinguished in the hybridization of findings. In addition, the lack of constructive systematic framework, which can be helpful to indicate direction of growth in entire FTS forecasting systems, is outstanding. In this study, we propose a multilayer model for stock market forecasting including five logical significant layers. Every single layer has its detailed concern to assist forecast development by reconciling certain problems exclusively. To verify the model, a set of huge data containing Taiwan Stock Index (TAIEX), National Association of Securities Dealers Automated Quotations (NASDAQ), Dow Jones Industrial Average (DJI), and S&P 500 have been chosen as experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasts using FTS.

  14. Modeling STOCK Market Based on Genetic Cellular Automata

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Zhou, Pei-Ling; Wang, Bing-Hong; Tang, Zi-Nan; Liu, Jun

    An artificial stock market is established with the modeling method and ideas of cellular automata. Cells are used to represent stockholders, who have the capability of self-teaching and are affected by the investing history of the neighboring ones. The neighborhood relationship among the stockholders is the expanded Von Neumann relationship, and the interaction among them is realized through selection operator and crossover operator. Experiments show that large events are frequent in the fluctuations of the stock price generated by the artificial stock market when compared with a normal process, and the price return distribution is a Lévy distribution in the central part followed by an approximately exponential truncation.

  15. Building Models with Bayes

    NASA Astrophysics Data System (ADS)

    Hart, Gus; Nelson, Lance J.; Reese, Shane

    2011-10-01

    The whole of modern Bayesian statistical methods is founded on the simple idea of Bayes rule, stated by the Reverend Thomas Bayes and presented in 1763. Bayes rule is merely a simple statement of conditional probability but can be used to make strong inferences. However, the application of Bayes rule to all but the simplest problems requires significant computation. As a result, Bayesian-based approaches have been largely impractical until high-speed computing became inexpensive in the last 20 years or so. We discuss the general idea behind Bayes rule, how to use it to build physical models, and illustrate the approach for a simple case of lattice gas models.
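
    Bayes rule itself, posterior proportional to likelihood times prior, fits in a few lines; the coin-bias example below is purely illustrative and unrelated to the lattice gas models mentioned in the talk.

```python
import numpy as np

theta = np.linspace(0.0, 1.0, 1001)          # grid of possible coin biases
prior = np.ones_like(theta)                  # flat prior
heads, tosses = 7, 10
likelihood = theta**heads * (1.0 - theta)**(tosses - heads)

posterior = likelihood * prior               # Bayes rule, up to normalization
posterior /= posterior.sum()                 # normalize on the grid

print(theta[np.argmax(posterior)])           # posterior mode: 0.7
```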

  16. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are the factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
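
    The conditional variance equation of the selected ARCH(1) component can be simulated directly; the coefficients below are illustrative, not the estimates reported for the PSEi.

```python
import numpy as np

rng = np.random.default_rng(5)

# ARCH(1): sigma_t^2 = omega + alpha * r_{t-1}^2, with r_t = sigma_t * z_t
omega, alpha, n = 1e-5, 0.3, 2000
r = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = omega / (1.0 - alpha)            # unconditional variance
for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Volatility clustering appears as positive autocorrelation of squared returns.
sq = r**2
print(round(np.corrcoef(sq[:-1], sq[1:])[0, 1], 3))
```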

  17. Revisiting the multifractality in stock returns and its modeling implications

    NASA Astrophysics Data System (ADS)

    He, Shanshan; Wang, Yudong

    2017-02-01

    In this paper, we investigate the multifractality of the Chinese and U.S. stock markets using a multifractal detrending moving average algorithm. The results show that stock returns in both markets are multifractal to a similar extent. We detect the source of multifractality and find that long-range correlations are one of the major sources of multifractality in the US market but not in the Chinese market. Fat-tailed distributions play a crucial role in the multifractality of both markets. As an innovation, we quantify the effect of extreme events on multifractality and find strong evidence of their contribution to multifractality. Furthermore, we investigate the usefulness of popular ARFIMA-GARCH models with skew-t distribution in capturing multifractality. Our results indicate that these models can capture only a fraction of multifractality. More complex models do not necessarily perform better than simple GARCH models in describing multifractality in stock returns.

  18. A quantum anharmonic oscillator model for the stock market

    NASA Astrophysics Data System (ADS)

    Gao, Tingting; Chen, Yu

    2017-02-01

    A financially interpretable quantum model is proposed to study the probability distributions of the stock price return. The dynamics of a quantum particle is considered an analog of the motion of the stock price. Then the probability distributions of the price return can be computed from the wave functions that evolve according to the Schrodinger equation. Instead of a harmonic oscillator as in previous studies, a quantum anharmonic oscillator is applied to the stock in a liquid market. The leptokurtic distributions of the price return can be reproduced by our quantum model with the introduction of mixed-state and multi-potential. The trend-following dominant market, in which the price return follows a bimodal distribution, is discussed as a specific case of the illiquid market.

  19. Dynamic material flow modeling: an effort to calibrate and validate aluminum stocks and flows in Austria.

    PubMed

    Buchner, Hanno; Laner, David; Rechberger, Helmut; Fellner, Johann

    2015-05-05

    A calibrated and validated dynamic material flow model of Austrian aluminum (Al) stocks and flows between 1964 and 2012 was developed. Calibration and extensive plausibility testing was performed to illustrate how the quality of dynamic material flow analysis can be improved on the basis of the consideration of independent bottom-up estimates. According to the model, total Austrian in-use Al stocks reached a level of 360 kg/capita in 2012, with buildings (45%) and transport applications (32%) being the major in-use stocks. Old scrap generation (including export of end-of-life vehicles) amounted to 12.5 kg/capita in 2012, still being on the increase, while Al final demand has remained rather constant at around 25 kg/capita in the past few years. The application of global sensitivity analysis showed that only small parts of the total variance of old scrap generation could be explained by the variation of single parameters, emphasizing the need for comprehensive sensitivity analysis tools accounting for interaction between parameters and time-delay effects in dynamic material flow models. Overall, it was possible to generate a detailed understanding of the evolution of Al stocks and flows in Austria, including plausibility evaluations of the results. Such models constitute a reliable basis for evaluating future recycling potentials, in particular with respect to application-specific qualities of current and future national Al scrap generation and utilization.

  20. Building Fractal Models with Manipulatives.

    ERIC Educational Resources Information Center

    Coes, Loring

    1993-01-01

    Uses manipulative materials to build and examine geometric models that simulate the self-similarity properties of fractals. Examples are discussed in two dimensions, three dimensions, and the fractal dimension. Discusses how models can be misleading. (Contains 10 references.) (MDH)

  2. Modeling the uncertainty of estimating forest carbon stocks in China

    NASA Astrophysics Data System (ADS)

    Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.

    2015-12-01

    Earth surface systems are controlled by a combination of global and local factors, which cannot be understood without accounting for both the local and global components. The system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory is able to accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote-sensing can supply spatially continuous information about the surface of forest carbon stocks, which is impossible from ground-based investigations, but their description has considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the Kriging method for spatial interpolation of ground sample plots and a satellite-observation-based approach as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. The data fusion had the lowest uncertainty by using an existing method for high accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA). The estimates produced with HASM-SOA were 26.1 and 28.4 % more accurate than the satellite-based approach and spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.

  3. Buildings Lean Maintenance Implementation Model

    NASA Astrophysics Data System (ADS)

    Abreu, Antonio; Calado, João; Requeijo, José

    2016-11-01

    Nowadays, companies in global markets have to achieve high levels of performance and competitiveness to stay "alive". Within this assumption, building maintenance cannot be done in a casual and improvised way due to the costs involved. Starting with some discussion about lean management and building maintenance, this paper introduces a model to support the Lean Building Maintenance (LBM) approach. Finally, based on a real case study from a Portuguese company, the benefits, challenges and difficulties are presented and discussed.

  4. Extreme value modelling of Ghana stock exchange index.

    PubMed

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling of such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying the extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series, before the EVT method was applied. The Peak Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The size of the extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
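
    The peaks-over-threshold step can be sketched with scipy: fit a Generalized Pareto Distribution to losses above a high threshold and read off a tail quantile (value at risk). Synthetic Student-t returns stand in here for the filtered Ghana all-shares returns, and the threshold choice is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

returns = 0.01 * stats.t.rvs(df=4, size=5000, random_state=rng)
losses = -returns
u = float(np.quantile(losses, 0.95))             # threshold: 95th percentile of losses
excesses = losses[losses > u] - u

xi, _, beta = stats.genpareto.fit(excesses, floc=0.0)
p_exceed = excesses.size / losses.size

# Tail quantile (VaR) at level q from the GPD approximation of the tail.
q = 0.99
var_q = u + (beta / xi) * (((1.0 - q) / p_exceed) ** (-xi) - 1.0)
print(round(xi, 3), round(var_q, 4), round(float(np.quantile(losses, q)), 4))
```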

  5. Building Model Motorcars.

    ERIC Educational Resources Information Center

    Altshuler, Ken

    1995-01-01

    Describes a project where students build a motorized car that can perform well in two distinctly different competitions: traveling 20 meters in the shortest time and pulling a 500-gram mass the farthest distance in 20 seconds. Enables students to apply physics principles to a real problem and to discover the importance of teamwork on large…

  7. Application of artificial neural network models and principal component analysis method in predicting stock prices on Tehran Stock Exchange

    NASA Astrophysics Data System (ADS)

    Zahedi, Javad; Rounaghi, Mohammad Mahdi

    2015-11-01

    Stock price changes are receiving the increasing attention of investors, especially those who have long-term aims. The present study intends to assess the predictability of prices on Tehran Stock Exchange through the application of artificial neural network models and principal component analysis method and using 20 accounting variables. Finally, goodness of fit for principal component analysis has been determined through real values, and the effective factors in Tehran Stock Exchange prices have been accurately predicted and modeled in the form of a new pattern consisting of all variables.
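
    A compact version of the principal component analysis plus neural network pipeline, on synthetic data standing in for the 20 accounting variables (scikit-learn is assumed; the architecture and component count are illustrative, not the paper's configuration):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# 20 correlated "accounting variables" driven by 3 latent factors, plus a price target.
n, p = 500, 20
latent = rng.normal(size=(n, 3))
X = latent @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))
y = latent @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=3),                               # keep the dominant components
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print(round(model.score(X_test, y_test), 3))            # R^2 on held-out data
```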

  8. Visual automated macromolecular model building.

    PubMed

    Langer, Gerrit G; Hazledine, Saul; Wiegels, Tim; Carolan, Ciaran; Lamzin, Victor S

    2013-04-01

    Automated model-building software aims at the objective interpretation of crystallographic diffraction data by means of the construction or completion of macromolecular models. Automated methods have rapidly gained in popularity as they are easy to use and generate reproducible and consistent results. However, the process of model building has become increasingly hidden and the user is often left to decide on how to proceed further with little feedback on what has preceded the output of the built model. Here, ArpNavigator, a molecular viewer tightly integrated into the ARP/wARP automated model-building package, is presented that directly controls model building and displays the evolving output in real time in order to make the procedure transparent to the user.

  9. Building Mental Models by Dissecting Physical Models

    ERIC Educational Resources Information Center

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions require greater supervision to…

  11. A fuzzy logic model to forecast stock market momentum in Indonesia's property and real estate sector

    NASA Astrophysics Data System (ADS)

    Penawar, H. K.; Rustam, Z.

    2017-07-01

    The capital market plays an important role in Indonesia's economy. The capital market not only supports Indonesia's economy but also serves as an indicator of its improvement. What is traded in the capital market is stock (the stock market). Nowadays, the stock market is full of uncertainty, and that uncertainty makes prediction something we have to do before making a decision in the stock market. One quantity that can be predicted in the stock market is momentum. To forecast stock market momentum, a fuzzy logic model can be used. In the modeling process, 14 days of historical data consisting of the open, high, low, and close values are used to predict the momentum categories for the next 5 days. There are three momentum categories, namely Bullish, Neutral, and Bearish. To illustrate the fuzzy logic model, we use stock data from several companies listed on the Indonesia Stock Exchange (IDX) in the property and real estate sector.
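
    The fuzzification step can be sketched with triangular membership functions over the 14-day relative price change; the breakpoints and the example prices are illustrative and not the paper's rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def momentum_category(prices_14d):
    change = (prices_14d[-1] - prices_14d[0]) / prices_14d[0]
    memberships = {
        "Bearish": tri(change, -0.10, -0.05, 0.00),
        "Neutral": tri(change, -0.03, 0.00, 0.03),
        "Bullish": tri(change, 0.00, 0.05, 0.10),
    }
    return max(memberships, key=memberships.get), memberships

prices = np.array([100, 101, 99, 102, 101, 103, 102, 104, 103, 104, 105, 104, 106, 105.0])
print(momentum_category(prices))   # a 5% rise over 14 days -> "Bullish"
```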

  12. Equation-based model for the stock market

    NASA Astrophysics Data System (ADS)

    Xavier, Paloma O. C.; Atman, A. P. F.; de Magalhães, A. R. Bosco

    2017-09-01

    We propose a stock market model which is investigated in the forms of difference and differential equations whose variables correspond to the demand or supply of each agent and to the price. In the model, agents are driven by the behavior of their trust contact network as well as by fundamental analysis. By means of the deterministic version of the model, the connection between such driving mechanisms and the price is analyzed: imitation behavior promotes market instability, finitude of resources is associated with stock index stability, and high sensitivity to the fair price provokes price oscillations. Long-range correlations in the price temporal series and heavy-tailed distribution of returns are observed for the version of the model which considers different proposals for stochasticity of microeconomic and macroeconomic origins.
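
    A toy difference-equation market in the spirit of the description above: each agent's demand is pulled toward its trust neighbours (imitation) and toward a fundamental fair price, and the price moves with aggregate excess demand. All coefficients are invented for illustration, and the trust network is taken as fully connected, unlike the paper's model.

```python
import numpy as np

rng = np.random.default_rng(8)

n_agents, steps = 50, 2000
k_imitation, k_fundamental, k_price = 0.05, 0.02, 0.01
fair_price = 100.0

price = np.empty(steps)
price[0] = fair_price
demand = rng.normal(0.0, 1.0, n_agents)
for t in range(1, steps):
    demand = (0.99 * demand                                    # demand slowly decays
              + k_imitation * (demand.mean() - demand)         # imitation of trust contacts
              + k_fundamental * (fair_price - price[t - 1])    # pull toward the fair price
              + rng.normal(0.0, 0.1, n_agents))
    price[t] = price[t - 1] + k_price * demand.sum()           # price follows excess demand

returns = np.diff(np.log(price))
print(round(price[-1], 2), round(float(returns.std()), 5))
```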

  13. A quantum mechanical model for the relationship between stock price and stock ownership

    SciTech Connect

    Cotfas, Liviu-Adrian

    2012-11-01

    The trade of a fixed stock can be regarded as the basic process that measures its momentary price. The stock price is exactly known only at the time of sale when the stock is between traders, that is, only in the case when the owner is unknown. We show that the stock price can be better described by a function indicating at any moment of time the probabilities for the possible values of price if a transaction takes place. This more general description contains partial information on the stock price, but it also contains partial information on the stock owner. By following the analogy with quantum mechanics, we assume that the time evolution of the function describing the stock price can be described by a Schroedinger type equation.

  14. A quantum mechanical model for the relationship between stock price and stock ownership

    NASA Astrophysics Data System (ADS)

    Cotfas, Liviu-Adrian

    2012-11-01

    The trade of a fixed stock can be regarded as the basic process that measures its momentary price. The stock price is exactly known only at the time of sale when the stock is between traders, that is, only in the case when the owner is unknown. We show that the stock price can be better described by a function indicating at any moment of time the probabilities for the possible values of price if a transaction takes place. This more general description contains partial information on the stock price, but it also contains partial information on the stock owner. By following the analogy with quantum mechanics, we assume that the time evolution of the function describing the stock price can be described by a Schrödinger type equation.

  15. Jeddah Historical Building Information Modeling "JHBIM" Old Jeddah - Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Baik, A.; Boehm, J.; Robson, S.

    2013-07-01

    The historic city of Jeddah faces serious issues in the conservation, documentation and recording of its valuable building stock. Terrestrial Laser Scanning and Architectural Photogrammetry have already been used in many Heritage sites in the world. The integration of heritage recording and Building Information Modelling (BIM) has been introduced as HBIM and is now a method to document and manage these buildings. In the last decade many traditional surveying methods were used to record the buildings in Old Jeddah. However, these methods take a long time, can sometimes provide unreliable information and often lack completeness. This paper will look at another approach for heritage recording by using the Jeddah Historical Building Information Modelling (JHBIM).

  16. Building a Computable Facility Model

    DTIC Science & Technology

    2002-10-01

    Subject terms: Building Composer; facility design; facility management; Fort Future; decision support tools; installation design; integrated software; simulation modeling

  17. Self-organized percolation model for stock market fluctuations

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich; Sornette, Didier

    1999-09-01

    In the Cont-Bouchaud model [cond-mat/9712318] of stock markets, percolation clusters act as buying or selling investors and their statistics controls that of the price variations. Rather than fixing the concentration controlling each cluster connectivity artificially at or close to the critical value, we propose that clusters shatter and aggregate continuously as the concentration evolves randomly, reflecting the incessant time evolution of groups of opinions and market moods. By the mechanism of “sweeping of an instability” [Sornette, J. Phys. I 4, 209 (1994)], this market model spontaneously exhibits reasonable power-law statistics for the distribution of price changes and accounts for the other important stylized facts of stock market price fluctuations.

  18. Generalized Bogoliubov Polariton Model: An Application to Stock Exchange Market

    NASA Astrophysics Data System (ADS)

    Thuy Anh, Chu; Anh, Truong Thi Ngoc; Lan, Nguyen Tri; Viet, Nguyen Ai

    2016-06-01

    A generalized Bogoliubov method for investigating non-simple and complex systems was developed. We take a two-branch polariton Hamiltonian model in the second-quantization representation and replace the energies of the quasi-particles by two distribution functions of the research objects. An application to the stock exchange market is given as an example, in which the change in the form of the return distribution functions from Boltzmann-like to Gaussian-like is studied.

  19. Conditional statistical model building

    NASA Astrophysics Data System (ADS)

    Hansen, Mads Fogtmann; Hansen, Michael Sass; Larsen, Rasmus

    2008-03-01

    We present a new statistical deformation model suited for parameterized grids with different resolutions. Our method models the covariances between multiple grid levels explicitly, and allows for very efficient fitting of the model to data on multiple scales. The model is validated on a data set consisting of 62 annotated MR images of Corpus Callosum. One fifth of the data set was used as a training set, whose images were non-rigidly registered to each other without a shape prior. From the non-rigidly registered training set a shape prior was constructed by performing principal component analysis on each grid level and using the results to construct a conditional shape model, conditioning the finer parameters on the coarser grid levels. The remaining shapes were registered with the constructed shape prior. The Dice measures for the registration without a prior and the registration with a prior were 0.875 ± 0.042 and 0.8615 ± 0.051, respectively.
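
    The conditioning step described above is, in essence, a conditional Gaussian. The short sketch below (Python) is a generic illustration with synthetic data, not the authors' registration pipeline: given a joint covariance estimated over stacked coarse- and fine-level parameters, the fine-level parameters are conditioned on observed coarse-level values.

      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.normal(size=(50, 6))            # 50 synthetic training shapes, 6 stacked parameters
      mu = X.mean(axis=0)
      S = np.cov(X, rowvar=False)

      coarse, fine = slice(0, 2), slice(2, 6) # assumed split: 2 coarse + 4 fine parameters
      S_cc, S_fc = S[coarse, coarse], S[fine, coarse]

      b = np.array([0.3, -0.1])               # observed coarse-level parameters
      mean_fine = mu[fine] + S_fc @ np.linalg.solve(S_cc, b - mu[coarse])
      cov_fine = S[fine, fine] - S_fc @ np.linalg.solve(S_cc, S_fc.T)
      print("conditional mean of fine parameters:", np.round(mean_fine, 3))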

  20. Calibrating Building Energy Models Using Supercomputer Trained Machine Learning Agents

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan; Edwards, Richard; Parker, Lynne Edwards

    2014-01-01

    Building Energy Modeling (BEM) is an approach to model the energy usage in buildings for design and retrofit purposes. EnergyPlus is the flagship Department of Energy software that performs BEM for different types of buildings. The input to EnergyPlus can extend to several thousand parameters, which must be calibrated manually by an expert for realistic energy modeling. This manual calibration is challenging and expensive, and makes building energy modeling infeasible for smaller projects. In this paper, we describe the Autotune research effort, which employs machine learning algorithms to generate agents for the different kinds of standard reference buildings in the U.S. building stock. The parametric space and the variety of building locations and types make this a challenging computational problem necessitating the use of supercomputers. Millions of EnergyPlus simulations are run on supercomputers and are subsequently used to train machine learning algorithms to generate agents. These agents, once created, can then run in a fraction of the time, thereby allowing cost-effective calibration of building models.

  1. Autotune E+ Building Energy Models

    SciTech Connect

    New, Joshua Ryan; Sanyal, Jibonananda; Bhandari, Mahabir S; Shrestha, Som S

    2012-01-01

    This paper introduces a novel Autotune methodology under development for calibrating building energy models (BEM). It is aimed at developing an automated BEM tuning methodology that enables models to reproduce measured data such as utility bills, sub-meter, and/or sensor data accurately and robustly by selecting best-match E+ input parameters in a systematic, automated, and repeatable fashion. The approach is applicable to a building retrofit scenario and aims to quantify the trade-offs between tuning accuracy and the minimal amount of ground truth data required to calibrate the model. Autotune will use a suite of machine-learning algorithms developed and run on supercomputers to generate calibration functions. Specifically, the project will begin with a de-tuned model and then perform Monte Carlo simulations on the model by perturbing the uncertain parameters within permitted ranges. Machine learning algorithms will then extract minimal perturbation combinations that result in modeled results that most closely track sensor data. A large database of parametric EnergyPlus (E+) simulations has been made publicly available. Autotune is currently being applied to a heavily instrumented residential building as well as three light commercial buildings in which a de-tuned model is autotuned using faux sensor data from the corresponding target E+ model.

  2. Quantum spatial-periodic harmonic model for daily price-limited stock markets

    NASA Astrophysics Data System (ADS)

    Meng, Xiangyi; Zhang, Jian-Wei; Xu, Jingjing; Guo, Hong

    2015-11-01

    We investigate the behaviors of stocks in daily price-limited stock markets by proposing a quantum spatial-periodic harmonic model. The stock price is considered to be oscillating and damping in a quantum spatial-periodic harmonic oscillator potential well. A complicated non-linear relation, including inter-band positive correlation and intra-band negative correlation between the volatility and trading volume of a stock, is numerically derived from the energy band structure of the model. The effectiveness of the price limit is re-examined, with some observed characteristics of price-limited stock markets in China studied by applying our quantum model.

  3. Darwinian Model Building

    NASA Astrophysics Data System (ADS)

    Kester, Do; Bontekoe, Romke

    2011-03-01

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment). Variation is achieved by cross-over and mutation operations on the genotype which consists in the present case of a single chromosome. The genotypes `live' in the environment of the data. Nested Sampling is used to optimize the free parameters of the models given the data, thus giving rise to the phenotypes. Selection is based on the phenotypes. The evidences which naturally follow from the Nested Sampling Algorithm are used in a second level of Nested Sampling to find increasingly better models. The data in this paper originate from the Leiden Cytology and Pathology Laboratory (LCPL), which screens pap smears for cervical cancer. We have data for 1750 women who on average underwent 5 tests each. The data on individual women are treated as a small time series. We will try to estimate the next value of the prime cancer indicator from previous tests of the same woman.

  4. Darwinian Model Building

    SciTech Connect

    Kester, Do; Bontekoe, Romke

    2011-03-14

    We present a way to generate heuristic mathematical models based on the Darwinian principles of variation and selection in a pool of individuals over many generations. Each individual has a genotype (the hereditary properties) and a phenotype (the expression of these properties in the environment). Variation is achieved by cross-over and mutation operations on the genotype which consists in the present case of a single chromosome. The genotypes 'live' in the environment of the data. Nested Sampling is used to optimize the free parameters of the models given the data, thus giving rise to the phenotypes. Selection is based on the phenotypes. The evidences which naturally follow from the Nested Sampling Algorithm are used in a second level of Nested Sampling to find increasingly better models. The data in this paper originate from the Leiden Cytology and Pathology Laboratory (LCPL), which screens pap smears for cervical cancer. We have data for 1750 women who on average underwent 5 tests each. The data on individual women are treated as a small time series. We will try to estimate the next value of the prime cancer indicator from previous tests of the same woman.

  5. Experimental Building Information Models

    DTIC Science & Technology

    2011-09-01

    link can later be removed. In some cases, it may also help to have a 2D drawing extracted from the 3D model, such as when laying out a large floor...plan. In such cases, a Revit plan view can be exported as a 2D Autocad DWG file using Revit’s export function. This 2D DWG file can then be...with government-furnished 2D Autocad DWG files. The 2D DWG files included architectural, mechanical, plumbing, and electrical layouts, and were

  6. Reaction-diffusion-branching models of stock price fluctuations

    NASA Astrophysics Data System (ADS)

    Tang, Lei-Han; Tian, Guang-Shan

    Several models of stock trading (Bak et al., Physica A 246 (1997) 430.) are analyzed in analogy with one-dimensional, two-species reaction-diffusion-branching processes. Using heuristic and scaling arguments, we show that the short-time market price variation is subdiffusive with a Hurst exponent H=1/4. Biased diffusion towards the market price and blind-eyed copying lead to crossovers to the empirically observed random-walk behavior ( H=1/2) at long times. The calculated crossover forms and diffusion constants are shown to agree well with simulation data.

  7. An Australian stocks and flows model for asbestos.

    PubMed

    Donovan, Sally; Pickin, Joe

    2016-10-01

    All available data on asbestos consumption in Australia were collated in order to determine the most common asbestos-containing materials remaining in the built environment. The proportion of asbestos contained within each material and the types of products these materials are most commonly found in were also determined. The lifetime of these asbestos-containing products was estimated in order to develop a model that projects stocks and flows of asbestos products in Australia through to the year 2100. The model is based on a Weibull distribution and was built in an Excel spreadsheet to make it user-friendly and accessible. The nature of the products under consideration means both their asbestos content and lifetime parameters are highly variable, and so for each of these a high and low estimate is presented along with the estimate used in the model. The user is able to vary the parameters in the model as better data become available. © The Author(s) 2016.
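
    The core of such a stocks-and-flows model is a cohort survival calculation. A minimal sketch is given below (Python); the installation series, Weibull shape and scale are placeholder values, not the report's estimates, and real use would carry low/central/high variants of each parameter as the abstract describes.

      import numpy as np

      install_year = np.arange(1950, 1988)                 # years asbestos products were installed
      installed_t = np.full(install_year.size, 100.0)      # tonnes installed per year (assumed)
      shape, scale = 2.5, 50.0                             # Weibull lifetime parameters (assumed)

      def surviving(age):
          """Weibull survival: share of a cohort still in place at a given age (0 before install)."""
          age = np.asarray(age, dtype=float)
          s = np.exp(-(np.clip(age, 0.0, None) / scale) ** shape)
          return np.where(age >= 0.0, s, 0.0)

      years = np.arange(1950, 2101)
      stock = np.array([(installed_t * surviving(y - install_year)).sum() for y in years])
      for y in (1990, 2020, 2050, 2100):
          print("remaining in-use asbestos stock in %d: %7.0f t" % (y, stock[years == y][0]))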

  8. Building mental models by dissecting physical models.

    PubMed

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to ensure focused learning; models that are too constrained require less supervision, but can be constructed mechanically, with little to no conceptual engagement. We propose "model-dissection" as an alternative to "model-building," whereby instructors could make efficient use of supervisory resources, while simultaneously promoting focused learning. We report empirical results from a study conducted with biology undergraduate students, where we demonstrate that asking them to "dissect" out specific conceptual structures from an already built 3D physical model leads to a significantly greater improvement in performance than asking them to build the 3D model from simpler components. Using questionnaires to measure understanding both before and after model-based interventions for two cohorts of students, we find that both the "builders" and the "dissectors" improve in the post-test, but it is the latter group who show statistically significant improvement. These results, in addition to the intrinsic time-efficiency of "model dissection," suggest that it could be a valuable pedagogical tool. © 2015 The International Union of Biochemistry and Molecular Biology.

  9. Virtual building environments (VBE) - Applying information modeling to buildings

    SciTech Connect

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It also reports on the VBE Initiative and the benefits observed in a couple of early VBE projects.

  10. Using data on biomass and fishing mortality in stock production modelling of flatfish

    NASA Astrophysics Data System (ADS)

    Zhang, Chang Ik; Gunderson, Donald R.; Sullivan, Patrick J.

    Stock production modelling was used to estimate population parameters such as the carrying capacity (B∞), as well as management parameters such as maximum sustainable yield (MSY), the instantaneous rate of fishing mortality at MSY (FMSY) and the sustainable biomass at MSY (BMSY). The input data were not catch and effort data, which usually require adjustments for changes in catchability, but biomass and catch (or fishing mortality), which are frequently available from cohort analysis or direct surveys. The model does not require the assumption of stock equilibrium for estimating parameters. The model was applied to data from the Alaska plaice, Pleuronectes quadrituberculatus, and yellowfin sole, Limanda aspera, stocks in the eastern Bering Sea, and the Pacific halibut, Hippoglossus stenolepis, stock in the Gulf of Alaska and Bering Sea. All three stocks are characterized by separation of nursery area and exploitable population. There are at least five age groups present in nursery areas and ten or more in the exploitable stock, so that recruitment levels and exploitable stock sizes are well-buffered. Predictions from the surplus production model provided reasonable fits to the biomass time series for all three stocks examined, given the sources of uncertainty in the biomass estimates available. It appears that the stock dynamics for the three species can be described by a relatively simple density-dependent model assuming instantaneous responses in stock biomass via recruitment and growth.

  11. Stochastic cellular automata model for stock market dynamics

    NASA Astrophysics Data System (ADS)

    Bartolozzi, M.; Thomas, A. W.

    2004-04-01

    In the present work we introduce a stochastic cellular automata model in order to simulate the dynamics of the stock market. A direct percolation method is used to create a hierarchy of clusters of active traders on a two-dimensional grid. Active traders are characterized by the decision to buy, σi(t) = +1, or sell, σi(t) = -1, a stock at a certain discrete time step. The remaining cells are inactive, σi(t) = 0. The trading dynamics is then determined by the stochastic interaction between traders belonging to the same cluster. Extreme, intermittent events, such as crashes or bubbles, are triggered by a phase transition in the state of the bigger clusters present on the grid, where almost all the active traders come to share the same spin orientation. Most of the stylized aspects of the financial market time series, including multifractal properties, are reproduced by the model. A direct comparison is made with the daily closures of the S&P500 index.

  12. Stochastic cellular automata model for stock market dynamics.

    PubMed

    Bartolozzi, M; Thomas, A W

    2004-04-01

    In the present work we introduce a stochastic cellular automata model in order to simulate the dynamics of the stock market. A direct percolation method is used to create a hierarchy of clusters of active traders on a two-dimensional grid. Active traders are characterized by the decision to buy, sigma(i)(t) = +1, or sell, sigma(i)(t) = -1, a stock at a certain discrete time step. The remaining cells are inactive, sigma(i)(t) = 0. The trading dynamics is then determined by the stochastic interaction between traders belonging to the same cluster. Extreme, intermittent events, such as crashes or bubbles, are triggered by a phase transition in the state of the bigger clusters present on the grid, where almost all the active traders come to share the same spin orientation. Most of the stylized aspects of the financial market time series, including multifractal properties, are reproduced by the model. A direct comparison is made with the daily closures of the S&P 500 index.

  13. Recommendation Method for Build-to-Order Products Considering Substitutability of Specifications and Stock Consumption Balance of Components

    NASA Astrophysics Data System (ADS)

    Shimoda, Atsushi; Kosugi, Hidenori; Karino, Takafumi; Komoda, Norihisa

    This study focuses on a stock reduction method for build-to-order (BTO) products that flows surplus parts out to the market using sale by recommendation. A sale by recommendation is repeated in each business negotiation using a recommended configuration selected from the inventory of parts so as to minimize the stock deficiency or excess at the end of a certain period of the production plan. The method is based on the potential for a customer specification to be replaced by an alternative one if the alternative is close to the initial customer specification. A recommendation method is proposed that decides the recommended product configuration by balancing part consumption so that the alternative specification of the configuration is close enough to the initial customer specification to be substitutable. The method was evaluated by a simulation using real BTO manufacturing data, and the result demonstrates that the imbalance in the consumption of the parts inventory is reduced.

  14. Modeling Long-term Behavior of Stock Market Prices Using Differential Equations

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxiang; Zhao, Conan; Mazilu, Irina

    2015-03-01

    Due to incomplete information available in the market and uncertainties associated with the price determination process, the stock prices fluctuate randomly during a short period of time. In the long run, however, certain economic factors, such as the interest rate, the inflation rate, and the company's revenue growth rate, will cause a gradual shift in the stock price. Thus, in this paper, a differential equation model has been constructed in order to study the effects of these factors on the stock prices. The model obtained accurately describes the general trends in the AAPL and XOM stock price changes over the last ten years.
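
    As a hedged illustration only (the paper's actual equation is not reproduced here), the sketch below (Python) assumes the simplest possible form, dP/dt = (g - i + r)·P, in which an assumed revenue growth rate g, inflation rate i and interest-rate term r jointly set the long-run drift, and integrates it with an explicit Euler step.

      import numpy as np

      g, i, r = 0.08, 0.02, 0.01           # illustrative annual rates (assumed, not from the paper)
      k = g - i + r                        # combined long-run drift

      t = np.linspace(0.0, 10.0, 1001)     # ten years
      dt = t[1] - t[0]
      P = np.empty_like(t)
      P[0] = 100.0                         # initial price
      for n in range(1, t.size):
          P[n] = P[n - 1] + dt * k * P[n - 1]      # explicit Euler step of dP/dt = k*P

      print("price after 10 years: %.2f (closed form: %.2f)" % (P[-1], P[0] * np.exp(k * t[-1])))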

  15. Energy Savings Modeling of Standard Commercial Building Re-tuning Measures: Large Office Buildings

    SciTech Connect

    Fernandez, Nicholas; Katipamula, Srinivas; Wang, Weimin; Huang, Yunzhi; Liu, Guopeng

    2012-06-01

    Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS's capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced lifetimes of the equipment. This report investigates the energy savings potential of several common HVAC system retuning measures on a typical large office building prototype model, using the Department of Energy's building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) to static pressure, supply air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated - each designed to conform to limitations to implementation of certain individual measures that might exist in typical buildings. All of these measures and combinations were simulated in 16 cities representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand-side of the building (air-systems and schedules). Many of the demand-side individual measures were capable of reducing annual HVAC system energy consumption by over 20% in most cities that were modeled. Supply side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy savings for

  16. A refined fuzzy time series model for stock market forecasting

    NASA Astrophysics Data System (ADS)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

    Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and the enrollments of the University of Alabama. It is shown that the proposed method works with higher accuracy compared to other fuzzy time series methods developed for forecasting the TAIEX and the enrollments of the University of Alabama.

  17. Future Premature Mortality Due to O3, Secondary Inorganic Aerosols and Primary PM in Europe — Sensitivity to Changes in Climate, Anthropogenic Emissions, Population and Building Stock

    PubMed Central

    Geels, Camilla; Andersson, Camilla; Hänninen, Otto; Lansø, Anne Sofie; Schwarze, Per E.; Ambelas Skjøth, Carsten; Brandt, Jørgen

    2015-01-01

    Air pollution is an important environmental factor associated with health impacts in Europe and considerable resources are used to reduce exposure to air pollution through emission reductions. These reductions will have non-linear effects on exposure due, e.g., to interactions between climate and atmospheric chemistry. By using an integrated assessment model, we quantify the effect of changes in climate, emissions and population demography on exposure and health impacts in Europe. The sensitivity to the changes is assessed by investigating the differences between the decades 2000–2009, 2050–2059 and 2080–2089. We focus on the number of premature deaths related to atmospheric ozone, Secondary Inorganic Aerosols and primary PM. For the Nordic region we furthermore include a projection of how population exposure might develop due to changes in building stock with increased energy efficiency. Reductions in emissions cause a large significant decrease in mortality, while climate effects on chemistry and emissions affect premature mortality by only a few percent. Changes in population demography lead to a larger relative increase in chronic mortality than the relative increase in population. Finally, the projected changes in building stock and infiltration rates in the Nordic region indicate that this factor may be very important for assessments of population exposure in the future. PMID:25749320

  18. Future premature mortality due to O3, secondary inorganic aerosols and primary PM in Europe--sensitivity to changes in climate, anthropogenic emissions, population and building stock.

    PubMed

    Geels, Camilla; Andersson, Camilla; Hänninen, Otto; Lansø, Anne Sofie; Schwarze, Per E; Skjøth, Carsten Ambelas; Brandt, Jørgen

    2015-03-04

    Air pollution is an important environmental factor associated with health impacts in Europe and considerable resources are used to reduce exposure to air pollution through emission reductions. These reductions will have non-linear effects on exposure due, e.g., to interactions between climate and atmospheric chemistry. By using an integrated assessment model, we quantify the effect of changes in climate, emissions and population demography on exposure and health impacts in Europe. The sensitivity to the changes is assessed by investigating the differences between the decades 2000-2009, 2050-2059 and 2080-2089. We focus on the number of premature deaths related to atmospheric ozone, Secondary Inorganic Aerosols and primary PM. For the Nordic region we furthermore include a projection of how population exposure might develop due to changes in building stock with increased energy efficiency. Reductions in emissions cause a large significant decrease in mortality, while climate effects on chemistry and emissions affect premature mortality by only a few percent. Changes in population demography lead to a larger relative increase in chronic mortality than the relative increase in population. Finally, the projected changes in building stock and infiltration rates in the Nordic region indicate that this factor may be very important for assessments of population exposure in the future.

  19. Prediction of Stock Returns Based on Cross-Sectional Multivariable Model

    NASA Astrophysics Data System (ADS)

    Yamada, Shinya; Takahashi, Shinsuke; Funabashi, Motohisa

    A new prediction method for stock returns was constructed from a cross-sectional multivariable model in which the explanatory variables are current financial indexes and the explained variable is a future stock return. To achieve precise prediction, explanatory variables were appropriately selected over time based on various test statistics and optimization of a performance index of expected portfolio return. A long-short portfolio, in which stocks with high predicted return were bought and stocks with low predicted return were sold short, was constructed to evaluate the proposed method. The simulation test showed that the proposed prediction method was effective in achieving high portfolio performance.
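
    A stripped-down sketch of the cross-sectional idea is given below (Python), with synthetic data standing in for the financial indexes and future returns; the variable selection and performance-index optimization described above are omitted. The point is only the mechanics: fit a cross-sectional regression of next-period returns on current indexes, then go long the highest predicted decile and short the lowest.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200                                             # stocks in the cross-section
      X = rng.normal(size=(n, 3))                         # current financial indexes (synthetic)
      beta_true = np.array([0.02, -0.01, 0.015])
      next_ret = X @ beta_true + rng.normal(scale=0.05, size=n)   # future stock returns (synthetic)

      # Cross-sectional multivariable model (in practice fitted on past cross-sections
      # and applied out of sample; here fitted in-sample purely for illustration).
      A = np.c_[np.ones(n), X]
      beta, *_ = np.linalg.lstsq(A, next_ret, rcond=None)
      predicted = A @ beta

      # Long-short portfolio: buy the top decile of predictions, sell the bottom decile short.
      order = np.argsort(predicted)
      short_leg, long_leg = order[: n // 10], order[-(n // 10):]
      print("long-short return: %.4f" % (next_ret[long_leg].mean() - next_ret[short_leg].mean()))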

  20. Building Knowledge Stocks: The Role of State Higher-Education Policies

    ERIC Educational Resources Information Center

    Groen, Jeffrey A.

    2009-01-01

    A variety of studies provide evidence that the stock of college-educated labor has fundamental effects on state and local economies through its association with wages, economic growth, personal incomes, and tax revenues. As a result, policymakers in many states try to increase the percentage of the state's population (or workforce) that has a…

  2. The uncertainty of modeled soil carbon stock change for Finland

    NASA Astrophysics Data System (ADS)

    Lehtonen, Aleksi; Heikkinen, Juha

    2013-04-01

    Countries should report the soil carbon stock changes of their forests under the Kyoto Protocol. Under the Protocol, one can omit reporting of a carbon pool by verifying that the pool is not a source of carbon, which is especially tempting for the soil pool. However, verifying that the soils of a nation are not a source of carbon in a given year seems to be nearly impossible. The Yasso07 model was parametrized against various decomposition data using an MCMC method. Soil carbon change in Finland between 1972 and 2011 was simulated with the Yasso07 model using litter input data derived from the National Forest Inventory (NFI) and fellings time series. The uncertainties of the biomass models, litter turnover rates, NFI sampling and the Yasso07 model were propagated with Monte Carlo simulations. Due to the biomass estimation methods, the uncertainties of the various litter input sources (e.g. living trees, natural mortality and fellings) correlate strongly with each other. We show how the original covariance matrices can be combined analytically, which greatly reduces the number of simulated components. While doing the simulations we found that proper handling of correlations may be even more essential than accurate estimates of standard errors. As a preliminary result, we found that both southern and northern Finland were soil carbon sinks, with coefficients of variation (CV) varying between 10% and 25% when the model was driven with long-term constant weather data. When we applied annual weather data, soils were both sinks and sources of carbon and CVs varied from 10% to 90%. This implies that the success of soil carbon sink verification depends on the weather data applied with the models. For this reason the IPCC should provide clear guidance on the weather data to be applied with soil carbon models and on soil carbon sink verification. In the UNFCCC reporting, carbon sinks of forest biomass have typically been averaged over five years - a similar period for soil model weather data would be logical.
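
    The point about correlated uncertainties can be made concrete with a small Monte Carlo sketch (Python). The covariance values and the linear placeholder "model" below are illustrative assumptions, not the Yasso07 parameterisation; the sketch only shows why ignoring the strong positive correlations between litter-input sources would understate the spread of the simulated carbon balance.

      import numpy as np

      rng = np.random.default_rng(42)

      mean_litter = np.array([2.0, 0.5, 1.2])      # living trees, mortality, fellings (t C/ha/yr, assumed)
      sd = np.array([0.2, 0.1, 0.15])
      corr = np.array([[1.0, 0.6, 0.4],
                       [0.6, 1.0, 0.3],
                       [0.4, 0.3, 1.0]])           # assumed strong positive correlations
      cov = np.outer(sd, sd) * corr

      # Placeholder linear response standing in for the soil carbon model.
      draws = rng.multivariate_normal(mean_litter, cov, size=100_000)
      sink = 0.25 * draws.sum(axis=1) - 0.8        # soil C balance (t C/ha/yr), correlated inputs

      sink_indep = 0.25 * rng.normal(mean_litter, sd, size=(100_000, 3)).sum(axis=1) - 0.8
      print("CV with correlations: %.1f%%, without: %.1f%%"
            % (100 * sink.std() / abs(sink.mean()), 100 * sink_indep.std() / abs(sink_indep.mean())))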

  3. Predicting the Direction of Stock Market Index Movement Using an Optimized Artificial Neural Network Model.

    PubMed

    Qiu, Mingyue; Song, Yu

    2016-01-01

    In the business sector, it has always been a difficult task to predict the exact daily price of the stock market index; hence, there is a great deal of research being conducted regarding the prediction of the direction of stock price index movement. Many factors such as political events, general economic conditions, and traders' expectations may have an influence on the stock market index. There are numerous research studies that use similar indicators to forecast the direction of the stock market index. In this study, we compare two basic types of input variables to predict the direction of the daily stock market index. The main contribution of this study is the ability to predict the direction of the next day's price of the Japanese stock market index by using an optimized artificial neural network (ANN) model. To improve the prediction accuracy of the trend of the stock market index in the future, we optimize the ANN model using genetic algorithms (GA). We demonstrate and verify the predictability of stock price direction by using the hybrid GA-ANN model and then compare the performance with prior studies. Empirical results show that the Type 2 input variables can generate a higher forecast accuracy and that it is possible to enhance the performance of the optimized ANN model by selecting input variables appropriately.

  4. Predicting the Direction of Stock Market Index Movement Using an Optimized Artificial Neural Network Model

    PubMed Central

    Qiu, Mingyue; Song, Yu

    2016-01-01

    In the business sector, it has always been a difficult task to predict the exact daily price of the stock market index; hence, there is a great deal of research being conducted regarding the prediction of the direction of stock price index movement. Many factors such as political events, general economic conditions, and traders’ expectations may have an influence on the stock market index. There are numerous research studies that use similar indicators to forecast the direction of the stock market index. In this study, we compare two basic types of input variables to predict the direction of the daily stock market index. The main contribution of this study is the ability to predict the direction of the next day’s price of the Japanese stock market index by using an optimized artificial neural network (ANN) model. To improve the prediction accuracy of the trend of the stock market index in the future, we optimize the ANN model using genetic algorithms (GA). We demonstrate and verify the predictability of stock price direction by using the hybrid GA-ANN model and then compare the performance with prior studies. Empirical results show that the Type 2 input variables can generate a higher forecast accuracy and that it is possible to enhance the performance of the optimized ANN model by selecting input variables appropriately. PMID:27196055

  5. An integrated material metabolism model for stocks of urban road system in Beijing, China.

    PubMed

    Guo, Zhen; Hu, Dan; Zhang, Fuhua; Huang, Guolong; Xiao, Qiang

    2014-02-01

    Rapid urbanization has greatly altered the urban metabolism of material and energy. As a significant part of the infrastructure, urban roads are being rapidly developed worldwide. Quantitative analysis of metabolic processes on urban road systems, especially the scale, composition and spatial distribution of their stocks, could help to assess the resource appropriation and potential environmental impacts, as well as improve urban metabolism models. In this paper, an integrated model, which covered all types of roads, intersection structures and ancillary facilities, was built for calculating the material stocks of urban road systems. Based on a bottom-up method, the total stocks were disassembled into a number of stock parts rather than obtained by input-output data, which provided an approach promoting data availability and inner structure understanding. The combination with GIS enabled the model to tackle the complex structures of road networks and avoid double counting. In the case study of Beijing, the following results are shown: 1) The total stocks for the entire road system reached 159 million tons, of which nearly 80% was stored in roads, and 20% in ancillary facilities. 2) Macadam was the largest stock (111 million tons), while stone mastic asphalt, polyurethane plastics, and atactic polypropylene accounted for smaller components of the overall system. 3) The stock per unit area of pedestrian overcrossing was higher than that of the other stock units in the entire system, and its steel stocks reached 0.49 t/m², which was 10 times as high as that in interchanges. 4) The high stock areas were mainly distributed in ring-shaped and radial expressways, as well as in major interchanges. 5) Expressways and arterials were excessively emphasized, while minor roads were relatively ignored. However, the variation of cross-sectional thickness in branches and neighborhood roads will have a significant impact on the scale of material stocks in the entire road system.
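
    The bottom-up logic described above reduces, for the road surface itself, to length × width × layer thickness × material density summed over road types. The sketch below (Python) uses entirely illustrative parameters, ignores intersections and ancillary facilities, and is not the paper's GIS-based model.

      # road type: (length_km, width_m, pavement_thickness_m, bulk_density_t_per_m3) - all assumed
      road_types = {
          "expressway":   (300.0, 25.0, 0.60, 2.3),
          "arterial":     (900.0, 15.0, 0.45, 2.3),
          "branch":       (1500.0, 9.0, 0.30, 2.2),
          "neighborhood": (2500.0, 6.0, 0.20, 2.2),
      }

      total_t = 0.0
      for name, (length_km, width_m, thick_m, density) in road_types.items():
          stock_t = length_km * 1000.0 * width_m * thick_m * density   # tonnes of material in place
          total_t += stock_t
          print("%-13s %6.2f Mt" % (name, stock_t / 1e6))
      print("%-13s %6.2f Mt" % ("total", total_t / 1e6))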

  6. Modeling and computing of stock index forecasting based on neural network and Markov chain.

    PubMed

    Dai, Yonghui; Han, Dongmei; Dai, Weihui

    2014-01-01

    The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on forecasting the stock index. However, traditional methods struggle to achieve ideal precision in a dynamic market because of the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, as well as its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of the Markov state region, computation of the state transition probability matrix, and prediction adjustment. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market.
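
    The Markov-chain step can be illustrated on its own, as sketched below (Python). A naive lag-1 forecast stands in for the BP neural network of the paper, residuals are divided into quartile states, a state transition matrix is estimated by counting, and the next forecast is adjusted by the expected residual of the next state; all data are synthetic.

      import numpy as np

      rng = np.random.default_rng(3)
      index = 3000.0 + np.cumsum(rng.normal(0.0, 20.0, size=300))   # synthetic stock index
      base_forecast = index[:-1]                 # naive lag-1 forecast (stand-in for the BP net)
      residual = index[1:] - base_forecast

      # Divide forecast errors into Markov states and estimate the transition matrix.
      edges = np.quantile(residual, [0.25, 0.50, 0.75])
      state = np.digitize(residual, edges)       # states 0..3
      T = np.zeros((4, 4))
      for s, s_next in zip(state[:-1], state[1:]):
          T[s, s_next] += 1.0
      T /= T.sum(axis=1, keepdims=True)

      # Adjust the next forecast by the expected residual of the predicted next state.
      state_mean = np.array([residual[state == s].mean() for s in range(4)])
      adjusted = index[-1] + T[state[-1]] @ state_mean
      print("adjusted one-step forecast: %.1f" % adjusted)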

  7. Modelling uncertainty of carbon stocks changes in peats.

    NASA Astrophysics Data System (ADS)

    Poggio, Laura; Gimona, Alessandro; Aalders, Inge; Morrice, Jane; Hough, Rupert

    2015-04-01

    Global warming might change the hydrology of upland blanket peats in Scotland, with an increased risk of release of the stored carbon. It is therefore important to model the loss of carbon in peat areas and to estimate the damage potential. The presented approach has the potential to provide important information for the assessment of carbon stocks over large areas, and also in the case of land use changes such as the construction of wind farms. The provided spatial uncertainty is important for including the results in further environmental and climate-change models and for decision making, in order to provide alternatives and prioritisation. In this study, the main peat properties (i.e. depth, water content, bulk density and carbon content) were modelled using a hybrid GAM-geostatistical 3D approach that allows full uncertainty propagation. The approach involves 1) modelling the trend with full 3D spatial correlation, i.e. exploiting the values of the neighbouring pixels in 3D space, and 2) 3D kriging as the spatial component. The uncertainty of the approach is assessed with iterations in both steps of the process. We studied the difference between local estimates of carbon content and bulk density obtained with the present method and local estimates obtained by assuming the global average value across the test area. To this end, virtual pits with a surface area of 30x30 m were excavated for the whole peat depth at randomly selected locations. The calculated uncertainty was used to estimate credible intervals of carbon loss. In this case the estimates obtained with the proposed approach are higher than what would be obtained by assuming spatial homogeneity and using average values across the area. This has implications for environmental decision making and planning, as it is likely that more carbon would be lost than estimated using traditional approaches.

  8. Underestimation of boreal soil carbon stocks by mathematical soil carbon models linked to soil nutrient status

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina A.; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-08-01

    Inaccurate estimates of the largest terrestrial carbon pool, the soil organic carbon (SOC) stock, are the major source of uncertainty in simulating the feedback of climate warming on ecosystem-atmosphere carbon dioxide exchange with process-based ecosystem and soil carbon models. Although the models need to simplify complex environmental processes of soil carbon sequestration, in a large mosaic of environments a missing key driver could lead to a modeling bias in predictions of SOC stock change. We aimed to evaluate the SOC stock estimates of process-based models (Yasso07, Q, and CENTURY soil sub-model v4) against a massive Swedish forest soil inventory data set (3230 samples) organized by a recursive partitioning method into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. For two-thirds of the measurements, all models predicted accurate SOC stock levels regardless of the detail of input data, e.g., whether they ignored or included soil properties. However, in fertile sites with high N deposition, high cation exchange capacity, or moderately increased soil water content, the Yasso07 and Q models underestimated SOC stocks. In comparison to Yasso07 and Q, accounting for the site-specific soil characteristics (e.g., clay content and topsoil mineral N) in CENTURY improved SOC stock estimates for sites with high clay content, but not for sites with high N deposition. Our analysis suggested that the soils with poorly predicted SOC stocks, characterized by high nutrient status and well-sorted parent material, indeed have had other predominant drivers of SOC stabilization lacking in the models, presumably mycorrhizal organic uptake and organo-mineral stabilization processes. Our results imply that the role of soil nutrient status as a regulator of organic matter mineralization has to be re-evaluated, since correct SOC stocks are decisive for predicting future SOC change and soil CO2 efflux.

  9. Black-Litterman model on non-normal stock return (Case study four banks at LQ-45 stock index)

    NASA Astrophysics Data System (ADS)

    Mahrivandi, Rizki; Noviyanti, Lienda; Setyanto, Gatot Riwi

    2017-03-01

    The formation of an optimal portfolio is a method that can help investors minimize risk and optimize profitability. One model for the optimal portfolio is the Black-Litterman (BL) model. The BL model can incorporate historical data and the views of investors to form a new prediction of portfolio returns as a basis for the asset weighting model. The BL model has two fundamental problems: the assumption of normality, and the estimation of parameters in the Bayesian prior framework of the market, which do not come from a normal distribution. This study provides an alternative solution in which the BL model is built with stock returns and investor views from a non-normal distribution.
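
    For reference, the standard (Gaussian) Black-Litterman posterior that the paper generalizes can be written down in a few lines; the sketch below (Python) uses an assumed covariance matrix, market weights and a single relative view for four hypothetical bank stocks, and does not implement the non-normal extension proposed here.

      import numpy as np

      Sigma = np.array([[0.04, 0.01, 0.01, 0.00],
                        [0.01, 0.05, 0.02, 0.01],
                        [0.01, 0.02, 0.06, 0.02],
                        [0.00, 0.01, 0.02, 0.07]])     # covariance of four bank stocks (assumed)
      w_mkt = np.array([0.3, 0.3, 0.2, 0.2])           # market-cap weights (assumed)
      delta, tau = 2.5, 0.05
      pi = delta * Sigma @ w_mkt                       # implied equilibrium returns

      P = np.array([[1.0, -1.0, 0.0, 0.0]])            # one view: stock 1 outperforms stock 2
      Q = np.array([0.02])
      Omega = np.array([[0.001]])                      # confidence in the view (assumed)

      A = np.linalg.inv(tau * Sigma) + P.T @ np.linalg.inv(Omega) @ P
      b = np.linalg.inv(tau * Sigma) @ pi + P.T @ np.linalg.inv(Omega) @ Q
      mu_bl = np.linalg.solve(A, b)                    # posterior (blended) expected returns
      print("Black-Litterman posterior returns:", np.round(mu_bl, 4))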

  10. Automatic Building Information Model Query Generation

    SciTech Connect

    Jiang, Yufei; Yu, Nan; Ming, Jiang; Lee, Sanghoon; DeGraw, Jason; Yen, John; Messner, John I.; Wu, Dinghao

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.

  11. Automatic building information model query generation

    SciTech Connect

    Jiang, Yufei; Yu, Nan; Ming, Jiang; Lee, Sanghoon; DeGraw, Jason; Yen, John; Messner, John I.; Wu, Dinghao

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. In conclusion, by demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.

  12. Evaluation of approaches focused on modelling of organic carbon stocks using the RothC model

    NASA Astrophysics Data System (ADS)

    Koco, Štefan; Skalský, Rastislav; Makovníková, Jarmila; Tarasovičová, Zuzana; Barančíková, Gabriela

    2014-05-01

    The aim of current efforts in the European area is the protection of soil organic matter, which is included in all relevant documents related to the protection of soil. The use of modelling of organic carbon stocks for anticipated climate change, or for land management, can significantly help in short- and long-term forecasting of the state of soil organic matter. The RothC model can be applied over time periods of several years to centuries and has been tested in long-term experiments within a large range of soil types and climatic conditions in Europe. For the initialization of the RothC model, knowledge about the carbon pool sizes is essential. Pool size characterization can be obtained from equilibrium model runs, but this approach is time consuming and tedious, especially for larger scale simulations. Due to this complexity we search for new ways to simplify and accelerate this process. The paper presents a comparison of two approaches for SOC stock modelling in the same area. The modelling was carried out on the basis of unique land use, management and soil data inputs for each simulation unit separately. We modelled 1617 simulation units of a 1x1 km grid on the territory of the agroclimatic region Žitný ostrov in the southwest of Slovakia. The first approach is the creation of groups of simulation units based on the evaluation of results for simulation units with similar input values. The groups were created after testing and validating the modelling results for individual simulation units against results obtained by modelling the average input values for the whole group. Tests of the equilibrium model for intervals in the range of 5 t.ha-1 from the initial SOC stock showed minimal differences in results compared with the result for the average value of the whole interval. Management input data for plant residues and farmyard manure for modelling of carbon turnover were also the same for several simulation units. Combining these groups (intervals of initial

  13. Probabilistic modeling of the indoor climates of residential buildings using EnergyPlus

    DOE PAGES

    Buechler, Elizabeth D.; Pallin, Simon B.; Boudreaux, Philip R.; ...

    2017-04-25

    The indoor air temperature and relative humidity in residential buildings significantly affect material moisture durability, HVAC system performance, and occupant comfort. Therefore, indoor climate data is generally required to define boundary conditions in numerical models that evaluate envelope durability and equipment performance. However, indoor climate data obtained from field studies is influenced by weather, occupant behavior and internal loads, and is generally unrepresentative of the residential building stock. Likewise, whole-building simulation models typically neglect stochastic variables and yield deterministic results that are applicable to only a single home in a specific climate. The

  14. Autotune Calibrates Models to Building Use Data

    ScienceCinema

    None

    2016-09-02

    Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.

  15. Autotune Calibrates Models to Building Use Data

    SciTech Connect

    2016-08-26

    Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.

  16. Forecasting Stock Exchange Movements Using Artificial Neural Network Models and Hybrid Models

    NASA Astrophysics Data System (ADS)

    Güreşen, Erkam; Kayakutlu, Gülgün

    Forecasting stock exchange rates is an important financial problem that is receiving increasing attention. During the last few years, a number of neural network models and hybrid models have been proposed for obtaining accurate prediction results, in an attempt to outperform the traditional linear and nonlinear approaches. This paper evaluates the effectiveness of neural network models: the recurrent neural network (RNN), the dynamic artificial neural network (DAN2), and hybrid neural networks that use generalized autoregressive conditional heteroscedasticity (GARCH) and exponential generalized autoregressive conditional heteroscedasticity (EGARCH) to extract new input variables. The comparison for each model is done from two viewpoints, MSE and MAD, using real daily values of the Istanbul Stock Exchange (ISE) index XU100.

  17. Assessment of South Pacific albacore stock (Thunnus alalunga) by improved Schaefer model

    NASA Astrophysics Data System (ADS)

    Wang, Chien-Hsiung; Wang, Shyh-Bin

    2006-04-01

    Based on catch and effort data of the tuna longline fishery operating in the South Pacific Ocean, the South Pacific albacore stock was assessed by an improved Schaefer model. The results revealed that the intrinsic growth rate was about 1.28374 and carrying capacities varied in the range from 73 734 to 266 732 metric tons. The growth ability of this species is remarkable. Stock dynamics mainly depend on environmental conditions. The stock is still in good condition. However, the continuing decrease of biomass in recent years should be noted.
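
    For the classical Schaefer surplus-production form dB/dt = rB(1 - B/K) - C, the management quantities follow directly from r and K as MSY = rK/4, BMSY = K/2 and FMSY = r/2; the improved model of the paper may modify these relations, so the small check below (Python) is only indicative, using the reported r and the two ends of the reported carrying-capacity range.

      r = 1.28374                                   # reported intrinsic growth rate
      for K in (73_734.0, 266_732.0):               # reported carrying-capacity range (t)
          msy, b_msy, f_msy = r * K / 4.0, K / 2.0, r / 2.0
          print("K = %8.0f t  MSY = %6.0f t/yr  Bmsy = %7.0f t  Fmsy = %.3f" % (K, msy, b_msy, f_msy))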

  18. A Seminar in Mathematical Model-Building.

    ERIC Educational Resources Information Center

    Smith, David A.

    1979-01-01

    A course in mathematical model-building is described. Suggested modeling projects include: urban problems, biology and ecology, economics, psychology, games and gaming, cosmology, medicine, history, computer science, energy, and music. (MK)

  20. DOE Commercial Building Benchmark Models: Preprint

    SciTech Connect

    Torcellini, P.; Deru, M.; Griffith, B.; Benne, K.; Halverson, M.; Winiarski, D.; Crawley, D. B.

    2008-07-01

    To provide a consistent baseline of comparison and save time conducting such simulations, the U.S. Department of Energy (DOE) has developed a set of standard benchmark building models. This paper will provide an executive summary overview of these benchmark buildings, and how they can save building analysts valuable time. Fully documented and implemented to use with the EnergyPlus energy simulation program, the benchmark models are publicly available and new versions will be created to maintain compatibility with new releases of EnergyPlus. The benchmark buildings will form the basis for research on specific building technologies, energy code development, appliance standards, and measurement of progress toward DOE energy goals. Having a common starting point allows us to better share and compare research results and move forward to make more energy efficient buildings.

  1. Stock price forecasting for companies listed on Tehran stock exchange using multivariate adaptive regression splines model and semi-parametric splines technique

    NASA Astrophysics Data System (ADS)

    Rounaghi, Mohammad Mahdi; Abbaszadeh, Mohammad Reza; Arashi, Mohammad

    2015-11-01

    One of the most important topics of interest to investors is stock price changes. Investors whose goals are long term are sensitive to stock prices and their changes and react to them. In this regard, we used the multivariate adaptive regression splines (MARS) model and a semi-parametric splines technique for predicting stock prices in this study. The MARS model, as a nonparametric method, is an adaptive method for regression and is suited to problems with high dimensions and several variables. Smoothing splines, a nonparametric regression method, was used as the semi-parametric splines technique. In this study, we used 40 variables (30 accounting variables and 10 economic variables) for predicting stock prices with the MARS model and with the semi-parametric splines technique. After investigating the models, we selected 4 accounting variables (book value per share, predicted earnings per share, P/E ratio and risk) as influential variables for predicting stock prices using the MARS model. After fitting the semi-parametric splines technique, only 4 accounting variables (dividends, net EPS, EPS forecast and P/E ratio) were selected as variables effective in forecasting stock prices.

  2. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  3. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  4. Investigation of market efficiency and Financial Stability between S&P 500 and London Stock Exchange: Monthly and yearly Forecasting of Time Series Stock Returns using ARMA model

    NASA Astrophysics Data System (ADS)

    Rounaghi, Mohammad Mahdi; Nassir Zadeh, Farzaneh

    2016-08-01

    We investigated the presence of, and changes in, long memory features in the returns and volatility dynamics of the S&P 500 and the London Stock Exchange using an ARMA model. Recently, multifractal analysis has evolved as an important way to explain the complexity of financial markets, which can hardly be described by the linear methods of efficient market theory. In financial markets, the weak form of the efficient market hypothesis implies that price returns are serially uncorrelated sequences; in other words, prices should follow a random walk. The random walk hypothesis is evaluated against alternatives accommodating either unifractality or multifractality. Several studies find that the return volatility of stocks tends to exhibit long-range dependence, heavy tails, and clustering. Because stochastic processes with self-similarity possess long-range dependence and heavy tails, it has been suggested that self-similar processes be employed to capture these characteristics in return volatility modeling. The present study applies monthly and yearly forecasting of time series stock returns in the S&P 500 and the London Stock Exchange using the ARMA model. The statistical analysis shows that the ARMA model for the S&P 500 outperforms that for the London Stock Exchange and is capable of predicting medium or long horizons using real known values. For the London Stock Exchange, the ARMA model for monthly stock returns outperforms the yearly model. A comparison between the S&P 500 and the London Stock Exchange shows that both markets are efficient and exhibit financial stability during periods of boom and bust.
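
    A minimal sketch of the kind of ARMA fit and multi-step forecast described here, using the statsmodels ARIMA class on a synthetic return series (an ARMA(1,1) corresponds to ARIMA(1,0,1)); the order and the data are assumptions, not the authors' settings.

```python
# Hedged sketch of ARMA-based return forecasting with statsmodels.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
returns = rng.normal(0, 0.02, 240)            # stand-in for monthly index returns

model = ARIMA(returns, order=(1, 0, 1))       # ARMA(1,1) expressed as ARIMA(1,0,1)
fit = model.fit()
print(fit.summary())
print("12-step-ahead forecast:", fit.forecast(steps=12))
```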

  5. The modified Black-Scholes model via constant elasticity of variance for stock options valuation

    NASA Astrophysics Data System (ADS)

    Edeki, S. O.; Owoloko, E. A.; Ugbebor, O. O.

    2016-02-01

    In this paper, the classical Black-Scholes option pricing model is visited. We present a modified version of the Black-Scholes model via the application of the constant elasticity of variance model (CEVM); in this case, the volatility of the stock price is shown to be a non-constant function unlike the assumption of the classical Black-Scholes model.
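
    One common way to write the constant elasticity of variance (CEV) dynamics the authors build on is shown below; the exact parameterization used in the paper may differ.

    \[
    dS_t = \mu S_t\,dt + \sigma S_t^{\gamma}\,dW_t ,
    \]

    so that the instantaneous volatility \(\sigma S_t^{\gamma-1}\) is a non-constant function of the stock price, and the classical Black-Scholes model is recovered when \(\gamma = 1\).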

  6. An economic model of international wood supply, forest stock and forest area change

    Treesearch

    James A. Turner; Joseph Buongiorno; Shushuai Zhu

    2006-01-01

    Wood supply, the link between roundwood removals and forest resources, is an important component of forest sector models. This paper develops a model of international wood supply within the structure of the spatial equilibrium Global Forest Products Model. The wood supply model determines, for each country, the annual forest harvest, the annual change of forest stock...

  7. Aboveground biomass and carbon stocks modelling using non-linear regression model

    NASA Astrophysics Data System (ADS)

    Ain Mohd Zaki, Nurul; Abd Latif, Zulkiflee; Nazip Suratman, Mohd; Zainee Zainal, Mohd

    2016-06-01

    Aboveground biomass (AGB) is an important source of uncertainty in carbon estimation for tropical forests because of the variation in species biodiversity and the complex structure of tropical rain forests. The tropical rainforest nevertheless holds the most extensive forest in the world, with a vast diversity of trees in layered canopies. Integrating optical sensor data with empirical models is a common way to assess AGB, and regression can link remotely sensed data with biophysical parameters of the forest. This paper therefore examines the accuracy of a non-linear regression equation of quadratic form for estimating AGB and carbon stocks in the tropical lowland Dipterocarp forest of the Ayer Hitam forest reserve, Selangor. The main aim of this investigation is to obtain the relationship between biophysical parameters measured in field plots and the remotely-sensed data using a nonlinear regression model. The results showed a good relationship between crown projection area (CPA) and carbon stocks (CS), with a Pearson correlation coefficient (r) of 0.671 (p < 0.01). The study concluded that integrating Worldview-3 imagery with a LiDAR-based canopy height model (CHM) raster was useful for quantifying AGB and carbon stocks over a larger sample area of the lowland Dipterocarp forest.
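
    As a purely illustrative sketch of fitting a quadratic (non-linear) regression between crown projection area and carbon stock, the snippet below uses numpy on synthetic placeholder data; the coefficients and units are assumptions, not the study's field measurements.

```python
# Illustrative quadratic regression of carbon stock on crown projection area.
import numpy as np

rng = np.random.default_rng(1)
cpa = rng.uniform(5, 60, 80)                        # crown projection area, m^2 (assumed units)
carbon = 0.02 * cpa**2 + 0.8 * cpa + rng.normal(0, 5, 80)   # synthetic carbon stock

a, b, c = np.polyfit(cpa, carbon, deg=2)            # quadratic model: a*CPA^2 + b*CPA + c
pred = np.polyval([a, b, c], cpa)
r = np.corrcoef(carbon, pred)[0, 1]
print(f"fitted coefficients: a={a:.3f}, b={b:.3f}, c={c:.3f}; r={r:.3f}")
```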

  8. Technology Prioritization: Transforming the U.S. Building Stock to Embrace Energy Efficiency

    SciTech Connect

    2013-07-01

    This paper discusses the efforts to accelerate the transformation in the U.S. building energy efficiency sector using a new technology prioritization framework. The underlying analysis examines building energy use micro segments using the Energy Information Administration Annual Energy Outlook and other publicly available information. The U.S. Department of Energy’s Building Technologies Office (BTO) has developed a prioritization tool in an effort to inform programmatic decision making based on the long-term national impact of different energy efficiency measures. The prioritization tool can be used to investigate energy efficiency measures under a variety of scenarios and has a built-in energy accounting framework to prevent double counting of energy savings within any given portfolio. This tool is developed to inform decision making and estimate long-term potential energy savings for different market adoption scenarios. It provides an objective comparison of new and existing measures and is being used to inform decision making with respect to BTO’s portfolio of projects.

  9. Structured building model reduction toward parallel simulation

    SciTech Connect

    Dobbs, Justin R.; Hencey, Brondon M.

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  10. The Stock Price Prediction and Sell-buy Strategy Model by Genetic Network Programming

    NASA Astrophysics Data System (ADS)

    Mori, Shigeo; Hirasawa, Kotaro; Hu, Jinglu

    Various models for predicting stock prices and determining sell-buy strategies have been proposed. They are classified into fundamental analysis, which uses the achievements of companies and business trends, and technical analysis, which carries out numerical analysis of the movement of stock prices. On the other hand, as a data mining method that finds regularities in a vast quantity of stock price data, the Genetic Algorithm (GA) has been applied widely. As a concrete example, the optimal parameter values of technical indices such as various moving averages and rates of deviation are computed by GA, and various methods for predicting stock prices and determining sell-buy strategies have been developed on that basis. However, it is hard to determine which index is the most effective with the conventional GA, and the most effective one depends on the brand. In this paper, a stock price prediction and sell-buy strategy model which searches for the optimal combination of various indices in the technical analysis is proposed using Genetic Network Programming, and its effectiveness is confirmed by simulations.

  11. Comparison of Building Energy Modeling Programs: Building Loads

    SciTech Connect

    Zhu, Dandan; Hong, Tianzhen; Yan, Da; Wang, Chuang

    2012-06-01

    This technical report presented the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort, between Lawrence Berkeley National Laboratory, USA and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). Energy Foundation, an industrial partner of CERC-BEE, was the co-sponsor of this study work. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on in-depth understandings of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China’s building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to

  12. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum but also by the relative values of factors like trading volume ranking and market capitalization ranking in each period. This article studies a new method for constructing stocks' reference groups; the method is called the quartile method. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimated by different models. The empirical results show that the spatiotemporal model performs surprisingly well in terms of capturing spatial dependences among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.

  13. Modeling Markov switching ARMA-GARCH neural networks models and an application to forecasting stock returns.

    PubMed

    Bildirici, Melike; Ersin, Özgür

    2014-01-01

    The study has two aims. The first aim is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties to MS-GARCH processes. The second purpose of the study is to augment the MS-GARCH type models with artificial neural networks to benefit from the universal approximation properties to achieve improved forecasting accuracy. Therefore, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, Recurrent NN, and Hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are utilized for modeling the daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provided promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, the models based on the Hybrid-MLP and Recurrent-NN, the MS-ARMA-FIAPGARCH-HybridMLP, and MS-ARMA-FIAPGARCH-RNN provided the best forecast performances over the baseline single regime GARCH models and further, over the Gray's MS-GARCH model. Therefore, the models are promising for various economic applications.

  14. Modeling Markov Switching ARMA-GARCH Neural Networks Models and an Application to Forecasting Stock Returns

    PubMed Central

    Bildirici, Melike; Ersin, Özgür

    2014-01-01

    The study has two aims. The first aim is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties to MS-GARCH processes. The second purpose of the study is to augment the MS-GARCH type models with artificial neural networks to benefit from the universal approximation properties to achieve improved forecasting accuracy. Therefore, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, Recurrent NN, and Hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are utilized for modeling the daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provided promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, the models based on the Hybrid-MLP and Recurrent-NN, the MS-ARMA-FIAPGARCH-HybridMLP, and MS-ARMA-FIAPGARCH-RNN provided the best forecast performances over the baseline single regime GARCH models and further, over the Gray's MS-GARCH model. Therefore, the models are promising for various economic applications. PMID:24977200

  15. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    ERIC Educational Resources Information Center

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  16. Integrating Building Information Modeling and Green Building Certification: The BIM-LEED Application Model Development

    ERIC Educational Resources Information Center

    Wu, Wei

    2010-01-01

    Building information modeling (BIM) and green building are currently two major trends in the architecture, engineering and construction (AEC) industry. This research recognizes the market demand for better solutions to achieve green building certification such as LEED in the United States. It proposes a new strategy based on the integration of BIM…

  17. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  18. Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint

    SciTech Connect

    Goupee, A.; Kimball, R.; de Ridder, E. J.; Helder, J.; Robertson, A.; Jonkman, J.

    2015-04-02

    In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine—for example, the International Energy Agency Wind Task 30’s Offshore Code Comparison Collaboration Continued, with Correlation project.

  19. Asymptotic Behavior of the Stock Price Distribution Density and Implied Volatility in Stochastic Volatility Models

    SciTech Connect

    Gulisashvili, Archil; Stein, Elias M.

    2010-06-15

    We study the asymptotic behavior of distribution densities arising in stock price models with stochastic volatility. The main objects of our interest in the present paper are the density of time averages of the squared volatility process and the density of the stock price process in the Stein-Stein and the Heston model. We find explicit formulas for leading terms in asymptotic expansions of these densities and give error estimates. As an application of our results, sharp asymptotic formulas for the implied volatility in the Stein-Stein and the Heston model are obtained.
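
    For reference, the Heston stochastic volatility dynamics referred to here are commonly written as below (the Stein-Stein model instead takes the volatility itself to follow an Ornstein-Uhlenbeck process); the notation follows the usual convention rather than the paper's.

    \[
    dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^{(1)}, \qquad
    dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dW_t^{(2)},
    \]

    with correlated Brownian motions satisfying \(dW_t^{(1)}dW_t^{(2)} = \rho\,dt\).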

  20. A note on Black-Scholes pricing model for theoretical values of stock options

    NASA Astrophysics Data System (ADS)

    Edeki, S. O.; Ugbebor, O. O.; Owoloko, E. A.

    2016-02-01

    In this paper, we consider some conditions that transform the classical Black-Scholes Model for stock options valuation from its partial differential equation (PDE) form to an equivalent ordinary differential equation (ODE) form. In addition, we propose a relatively new semi-analytical method for the solution of the transformed Black-Scholes model. The obtained solutions via this method can be used to find the theoretical values of the stock options in relation to their fair prices. In considering the reliability and efficiency of the models, we test some cases and the results are in good agreement with the exact solution.
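
    The classical Black-Scholes PDE referred to here, whose transformation to an ODE the paper studies, is the standard

    \[
    \frac{\partial V}{\partial t} + \tfrac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + r S \frac{\partial V}{\partial S} - rV = 0,
    \]

    for an option value \(V(S,t)\) on a stock with price \(S\), volatility \(\sigma\) and risk-free rate \(r\); the specific change of variables used in the paper is not reproduced here.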

  1. Automatic building information model query generation

    DOE PAGES

    Jiang, Yufei; Yu, Nan; Ming, Jiang; ...

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges of data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. In conclusion, by demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts to use BIM to drive building design with less labour and lower overhead cost.

  2. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.

  3. Peat Depth Assessment Using Airborne Geophysical Data for Carbon Stock Modelling

    NASA Astrophysics Data System (ADS)

    Keaney, Antoinette; McKinley, Jennifer; Ruffell, Alastair; Robinson, Martin; Graham, Conor; Hodgson, Jim; Desissa, Mohammednur

    2013-04-01

    gamma-ray spectrometry, moisture content and rainfall monitoring combined with a real-time Differential Global Positioning System (DGPS) to monitor temporal and spatial variability of bog elevations. This research will assist in determining the accuracy and limitations of modelling soil carbon and changes in peat stocks by investigating the attenuation of gamma-radiation from underlying rocks. Tellus Border is supported by the EU INTERREG IVA programme, which is managed by the Special EU Programmes Body in Northern Ireland, the border Region of Ireland and western Scotland. The Tellus project was funded by the Northern Ireland Department of Enterprise, Trade and Investment and by the Rural Development Programme through the Northern Ireland Programme for Building Sustainable Prosperity.

  4. RCrane: semi-automated RNA model building

    SciTech Connect

    Keating, Kevin S.; Pyle, Anna Marie

    2012-08-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  5. Building vulnerability assessment based on cloud model

    NASA Astrophysics Data System (ADS)

    Sun, Xixia; Cai, Chao

    2013-10-01

    This study aims at building a general framework for estimating building vulnerability to the blast-fragmentation warhead of a missile. Considering the fuzziness and randomness existing in the damage criterion rules, cloud models are applied to represent the qualitative concepts. On the basis of building geometric description, element criticality analysis, blast wave and fragment movement description, and meeting analysis of fragments and target, kill probabilities of the components are estimated by the shot line method. The damage state of the whole building given the threat is obtained by cloud model based uncertainty reasoning and the proposed similarity measure, enabling both the randomness of probability reasoning and the fuzziness of traditional fuzzy logic to be considered. Experimental results demonstrate that the proposed method can provide a useful reference for optimizing warhead design and mission efficiency evaluation.

  6. Modeling carbon stocks in a secondary tropical dry forest in the Yucatan Peninsula, Mexico

    Treesearch

    Zhaohua Dai; Richard A. Birdsey; Kristofer D. Johnson; Juan Manuel Dupuy; Jose Luis Hernandez-Stefanoni; Karen. Richardson

    2014-01-01

    The carbon balance of secondary dry tropical forests of Mexico’s Yucatan Peninsula is sensitive to human and natural disturbances and climate change. The spatially explicit process model Forest-DeNitrification-DeComposition (DNDC) was used to estimate forest carbon dynamics in this region, including the effects of disturbance on carbon stocks. Model evaluation using...

  7. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  8. Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain

    PubMed Central

    Dai, Yonghui; Han, Dongmei; Dai, Weihui

    2014-01-01

    The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on forecasting the stock index. However, traditional methods are limited in achieving ideal precision in a dynamic market because of the influence of many factors such as the economic situation, policy changes, and emergency events. Approaches based on adaptive modeling and conditional probability transfer have therefore attracted renewed attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, as well as its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of Markov state regions, computation of the state transition probability matrix, and prediction adjustment. Results of the empirical study show that this method can achieve high accuracy in stock index prediction and could provide a good reference for investment in the stock market. PMID:24782659
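
    A minimal sketch of the Markov-chain step of such a scheme: discretizing the residuals of an initial forecast into states and estimating the state-transition probability matrix. The three quantile-based states and the synthetic residuals are assumptions for illustration, not the paper's exact state regions.

```python
# Sketch: estimating a Markov state-transition matrix from discretized
# forecast errors (e.g. residuals of an initial neural-network forecast).
import numpy as np

rng = np.random.default_rng(7)
errors = rng.normal(0, 1, 300)                      # stand-in forecast residuals
edges = np.quantile(errors, [1 / 3, 2 / 3])         # split into low / medium / high error states
states = np.digitize(errors, edges)                 # state labels 0, 1, 2

n_states = 3
counts = np.zeros((n_states, n_states))
for s_now, s_next in zip(states[:-1], states[1:]):
    counts[s_now, s_next] += 1
transition = counts / counts.sum(axis=1, keepdims=True)
print("state transition probability matrix:\n", transition)
```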

  9. Enhancing Nursing Staffing Forecasting With Safety Stock Over Lead Time Modeling.

    PubMed

    McNair, Douglas S

    2015-01-01

    In balancing competing priorities, it is essential that nursing staffing provide enough nurses to safely and effectively care for the patients. Mathematical models to predict optimal "safety stocks" have been routine in supply chain management for many years but have up to now not been applied in nursing workforce management. There are various aspects that exhibit similarities between the 2 disciplines, such as an evolving demand forecast according to acuity and the fact that provisioning "stock" to meet demand in a future period has nonzero variable lead time. Under assumptions about the forecasts (eg, the demand process is well fit as an autoregressive process) and about the labor supply process (≥1 shifts' lead time), we show that safety stock over lead time for such systems is effectively equivalent to the corresponding well-studied problem for systems with stationary demand bounds and base stock policies. Hence, we can apply existing models from supply chain analytics to find the optimal safety levels of nurse staffing. We use a case study with real data to demonstrate that there are significant benefits from the inclusion of the forecast process when determining the optimal safety stocks.
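
    A standard textbook expression for safety stock under variable demand and variable lead time, of the kind the abstract draws on from supply chain analytics, is

    \[
    \mathrm{SS} = z \sqrt{\bar{L}\,\sigma_d^{2} + \bar{d}^{2}\,\sigma_L^{2}},
    \]

    where \(z\) is the service-level factor, \(\bar{d}\) and \(\sigma_d\) are the mean and standard deviation of demand per period, and \(\bar{L}\) and \(\sigma_L\) are the mean and standard deviation of lead time. The paper's model, which allows an evolving autoregressive demand forecast, is more elaborate than this sketch.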

  10. Modeling UHF Radio Propagation in Buildings.

    NASA Astrophysics Data System (ADS)

    Honcharenko, Walter

    The potential implementation of wireless Radio Local Area Networks and Personal Communication Services inside buildings requires a thorough understanding of signal propagation within buildings. This work describes a study leading to a theoretical understanding of wave propagation phenomenon inside buildings. Covered first is propagation in the clear space between the floor and ceiling, which is modeled using Kirchoff -Huygens diffraction theory. This along with ray tracing techniques are used to develop a model to predict signal coverage inside buildings. Simulations were conducted on a hotel building, two office buildings, and a university building to which measurements of CW signals were compared, with good agreement. Propagation to other floors was studied to determine the signal strength as a function of the number of floors separating transmitter and receiver. Diffraction paths and through the floor paths which carry significant power to the receivers were examined. Comparisons were made to measurements in a hotel building and an office building, in which agreements were excellent. As originally developed for Cellular Mobile Radio (CMR) systems, the sector average is obtained from the spatial average of the received signal as the mobile traverses a path of 20 or so wavelengths. This approach has also been applied indoors with the assumption that a unique average could be obtained by moving either end of the radio link. However, unlike in the CMR environment, inside buildings both ends of the radio link are in a rich multipath environment. It is shown both theoretically and experimentally that moving both ends of the link is required to achieve a unique average. Accurate modeling of the short pulse response of a signal within a building will provide insight for determining the hardware necessary for high speed data transmission and recovery, and a model for determining the impulse response is developed in detail. Lastly, the propagation characteristics of

  11. Cross-sectional test of the Fama-French three-factor model: Evidence from Bangladesh stock market

    NASA Astrophysics Data System (ADS)

    Hasan, Md. Zobaer; Kamil, Anton Abdulbasah

    2014-09-01

    Stock market is an important part of a country's economy. It supports the country's economic development and progress by encouraging the efficiency and profitability of firms. This research was designed to examine the risk-return association of companies in the Dhaka Stock Exchange (DSE) market of Bangladesh by using the Fama-French three-factor model structure. The model is based on three factors, which are stock beta, SMB (difference in returns of the portfolio with small market capitalisation minus that with big market capitalisation) and HML (difference in returns of the portfolio with high book-to-market ratio minus that with low book-to-market ratio). This study focused on the DSE market as it is one of the frontier emerging stock markets of South Asia. For this study, monthly stock returns from 71 non-financial companies were used for the period of January 2002 to December 2011. DSI Index was used as a proxy for the market portfolio and Bangladesh government 3-Month T-bill rate was used as the proxy for the risk-free asset. It was found that large capital stocks outperform small capital stocks and stocks with lower book-to-market ratios outperform stocks with higher book-to-market ratios in the context of Bangladesh stock market.
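
    The three-factor regression underlying the study is conventionally written as

    \[
    R_{it} - R_{ft} = \alpha_i + \beta_i\,(R_{mt} - R_{ft}) + s_i\,\mathrm{SMB}_t + h_i\,\mathrm{HML}_t + \varepsilon_{it},
    \]

    where \(R_{it}\) is the return of portfolio \(i\), \(R_{ft}\) the risk-free rate (here the Bangladesh 3-month T-bill rate) and \(R_{mt}\) the market return (here proxied by the DSI Index).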

  12. Supercomputer Assisted Generation of Machine Learning Agents for the Calibration of Building Energy Models

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan; Edwards, Richard

    2013-01-01

    Building Energy Modeling (BEM) is an approach to model the energy usage in buildings for design and retrofit purposes. EnergyPlus is the flagship Department of Energy software that performs BEM for different types of buildings. The input to EnergyPlus can often extend to the order of a few thousand parameters, which have to be calibrated manually by an expert for realistic energy modeling. This makes calibration challenging and expensive, thereby making building energy modeling unfeasible for smaller projects. In this paper, we describe the "Autotune" research, which employs machine learning algorithms to generate agents for the different kinds of standard reference buildings in the U.S. building stock. The parametric space and the variety of building locations and types make this a challenging computational problem necessitating the use of supercomputers. Millions of EnergyPlus simulations are run on supercomputers and subsequently used to train machine learning algorithms to generate agents. These agents, once created, can then run in a fraction of the time, thereby allowing cost-effective calibration of building models.

  13. Model Building for Conceptual Change

    ERIC Educational Resources Information Center

    Jonassen, David; Strobel, Johannes; Gottdenker, Joshua

    2005-01-01

    Conceptual change is a popular, contemporary conception of meaningful learning. Conceptual change describes changes in conceptual frameworks (mental models or personal theories) that learners construct to comprehend phenomena. Different theories of conceptual change describe the reorganization of conceptual frameworks that results from different…

  14. Budget Formulas and Model Building.

    ERIC Educational Resources Information Center

    Cope, Robert G.

    Selected budget formulas currently in use for university operations are described as a background for examining a budgetary model that would provide for the integration of separate formulas. Data on the formulas were collected from states with system-wide coordinating boards that are responsible for budgetary reviews. The most common formula…

  15. Modelling the effect of agricultural management practices on soil organic carbon stocks: does soil erosion matter?

    NASA Astrophysics Data System (ADS)

    Nadeu, Elisabet; Van Wesemael, Bas; Van Oost, Kristof

    2014-05-01

    Over the last decades, an increasing number of studies have been conducted to assess the effect of soil management practices on soil organic carbon (SOC) stocks. At regional scales, biogeochemical models such as CENTURY or Roth-C have been commonly applied. These models simulate SOC dynamics at the profile level (point basis) over long temporal scales but do not consider the continuous lateral transfer of sediment that takes place along geomorphic toposequences. As a consequence, the impact of soil redistribution on carbon fluxes is very seldom taken into account when evaluating changes in SOC stocks due to agricultural management practices on the short and long-term. To address this gap, we assessed the role of soil erosion by water and tillage on SOC stocks under different agricultural management practices in the Walloon region of Belgium. The SPEROS-C model was run for a 100-year period combining three typical crop rotations (using winter wheat, winter barley, sugar beet and maize) with three tillage scenarios (conventional tillage, reduced tillage and reduced tillage in combination with additional crop residues). The results showed that including soil erosion by water in the simulations led to a general decrease in SOC stocks relative to a baseline scenario (where no erosion took place). The SOC lost from these arable soils was mainly exported to adjacent sites and to the river system by lateral fluxes, with magnitudes differing between crop rotations and in all cases lower under conservation tillage practices than under conventional tillage. Although tillage erosion plays an important role in carbon redistribution within fields, lateral fluxes induced by water erosion led to a higher spatial and in-depth heterogeneity of SOC stocks with potential effects on the soil water holding capacity and crop yields. This indicates that studies assessing the effect of agricultural management practices on SOC stocks and other soil properties over the landscape should

  16. Stocking methods and parasite-induced reductions in capture: modelling Argulus foliaceus in trout fisheries.

    PubMed

    McPherson, N J; Norman, R A; Hoyle, A S; Bron, J E; Taylor, N G H

    2012-11-07

    Argulus foliaceus is a macroparasite which can have a significant impact on yield in recreational trout fisheries, partly by increasing fish mortalities but also by reducing the appetite of infected fish, making them less likely to respond to bait. The aim of this paper is to determine the impact of four commonly used fish stocking methods both on the parasite dynamics, and on fisheries' yields. The wider consequences of the resultant reduction in host feeding are also of interest. To this end, four different stocking methods were incorporated into Anderson and May's macroparasite model, which comprises three differential equations representing the host, attached parasite and free-living parasite populations. To each of these a reduction in the fish capture rate, inversely linked to the mean parasite burden, is added and the effects interpreted. Results show that (1) the common practice of increasing the stocking rate as catches drop may be counterproductive; (2) in the absence of any wild population of reservoir hosts, the parasite will be unable to survive if the stocking rate does not exceed the rate of capture; (3) compensatory stocking to account for fish mortalities can have disastrous consequences on yield; and (4) the parasite can, under certain circumstances, maintain the host population by preventing their capture.
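
    Purely as an illustration of the three-compartment structure described here (host, attached parasite and free-living parasite populations, with a stocking input and a capture rate suppressed by mean parasite burden), the following hypothetical Python sketch integrates such a system; the functional forms and parameter values are assumptions for demonstration, not the authors' Anderson-and-May-based model.

```python
# Hypothetical three-compartment host / attached-parasite / free-living sketch.
import numpy as np
from scipy.integrate import solve_ivp

def fishery(t, y, stock_rate=1.0, capture=0.5, attach=0.01,
            parasite_death=0.2, shed=5.0, free_death=1.0, host_death=0.05):
    H, P, W = y                                  # hosts, attached parasites, free-living stages
    mean_burden = P / max(H, 1e-9)
    capture_eff = capture / (1.0 + mean_burden)  # heavier burdens suppress capture (appetite effect)
    dH = stock_rate - capture_eff * H - host_death * H
    dP = attach * H * W - parasite_death * P
    dW = shed * P - free_death * W - attach * H * W
    return [dH, dP, dW]

sol = solve_ivp(fishery, (0, 365), [100.0, 50.0, 200.0], dense_output=True)
print("final host, attached-parasite, free-living numbers:", sol.y[:, -1])
```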

  17. On the choice of GARCH parameters for efficient modelling of real stock price dynamics

    NASA Astrophysics Data System (ADS)

    Pokhilchuk, K. A.; Savel'ev, S. E.

    2016-04-01

    We propose two different methods for optimal choice of GARCH(1,1) parameters for the efficient modelling of stock prices by using a particular return series. Using (as an example) stock return data for Intel Corporation, we vary parameters to fit the average volatility as well as the fourth (linked to kurtosis of data) and eighth statistical moments, and observe poor convergence of our simulated eighth moment to the stock data. Results indicate that fitting higher-order moments of a return series might not be an optimal approach for choosing GARCH parameters. In contrast, the simulated exponent of the Fourier spectrum decay is much less noisy and can easily fit the corresponding decay of the empirical Fourier spectrum of the used return series of Intel stock, allowing us to efficiently define all GARCH parameters. We compare the estimates of GARCH parameters obtained by fitting price data Fourier spectra with the ones obtained from standard software packages and conclude that the estimates obtained here are deeper in the stability region of parameters. Thus, the proposed method of using Fourier spectra of stock data to estimate GARCH parameters results in a more robust and stable stochastic process but with a shorter characteristic autocovariance time.
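
    A minimal GARCH(1,1) simulation of the kind such fitting procedures rest on is sketched below; the parameter values are illustrative placeholders, not estimates for the Intel return series.

```python
# Minimal GARCH(1,1) simulation sketch.
import numpy as np

def simulate_garch(n, omega=1e-6, alpha=0.08, beta=0.90, seed=0):
    """Simulate r_t = sigma_t * z_t with sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.full(n, omega / (1 - alpha - beta))   # start at the unconditional variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

returns, variance = simulate_garch(5000)
print("sample kurtosis:", ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2)
```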

  18. Modeling metal stocks and flows: a review of dynamic material flow analysis methods.

    PubMed

    Müller, Esther; Hilty, Lorenz M; Widmer, Rolf; Schluep, Mathias; Faulstich, Martin

    2014-02-18

    Dynamic material flow analysis (MFA) is a frequently used method to assess past, present, and future stocks and flows of metals in the anthroposphere. Over the past fifteen years, dynamic MFA has contributed to increased knowledge about the quantities, qualities, and locations of metal-containing goods. This article presents a literature review of the methodologies applied in 60 dynamic MFAs of metals. The review is based on a standardized model description format, the ODD (overview, design concepts, details) protocol. We focus on giving a comprehensive overview of modeling approaches and structure them according to essential aspects, such as their treatment of material dissipation, spatial dimension of flows, or data uncertainty. The reviewed literature features similar basic modeling principles but very diverse extrapolation methods. Basic principles include the calculation of outflows of the in-use stock based on inflow or stock data and a lifetime distribution function. For extrapolating stocks and flows, authors apply constant, linear, exponential, and logistic models or approaches based on socioeconomic variables, such as regression models or the intensity-of-use hypothesis. The consideration and treatment of further aspects, such as dissipation, spatial distribution, and data uncertainty, vary significantly and highly depends on the objectives of each study.
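
    A minimal sketch of the basic inflow-driven principle the review identifies (outflows computed from past inflows weighted by a lifetime distribution, with the in-use stock as the cumulative balance); the normal lifetime distribution, the constant inflow, and the units are assumptions for illustration.

```python
# Inflow-driven dynamic stock model sketch with a normal lifetime distribution.
import numpy as np
from scipy.stats import norm

years = np.arange(1950, 2021)
inflow = np.full(years.size, 100.0)                 # tonnes of metal entering use per year (assumed)
mean_life, sd_life = 15.0, 4.0                      # assumed product lifetime, years

outflow = np.zeros_like(inflow)
for i, y in enumerate(years):
    ages = y - years[: i + 1]
    discard = norm.pdf(ages, loc=mean_life, scale=sd_life)   # share of each cohort discarded this year
    outflow[i] = np.sum(inflow[: i + 1] * discard)

stock = np.cumsum(inflow - outflow)
print("in-use stock in 2020:", round(stock[-1], 1), "tonnes")
```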

  19. Impacts of Model Building Energy Codes

    SciTech Connect

    Athalye, Rahul A.; Sivaraman, Deepak; Elliott, Douglas B.; Liu, Bing; Bartlett, Rosemarie

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  20. Option pricing formulas based on a non-Gaussian stock price model.

    PubMed

    Borland, Lisa

    2002-08-26

    Options are financial instruments that depend on the underlying stock. We explain their non-Gaussian fluctuations using the nonextensive thermodynamics parameter q. A generalized form of the Black-Scholes (BS) partial differential equation and some closed-form solutions are obtained. The standard BS equation (q=1) which is used by economists to calculate option prices requires multiple values of the stock volatility (known as the volatility smile). Using q=1.5 which well models the empirical distribution of returns, we get a good description of option prices using a single volatility.

  1. Modeling Stock Order Flows and Learning Market-Making from Data

    DTIC Science & Technology

    2002-06-01

    and demand. In this paper, we demonstrate a novel method for modeling the market as a dynamic system and a reinforcement learning algorithm that learns...difficult dynamic system. Our reinforcement learning algorithm, based on likelihood ratios, is run on this partially-observable environment. We demonstrate learning results for two separate real stocks.

  2. Mobile Modelling for Crowdsourcing Building Interior Data

    NASA Astrophysics Data System (ADS)

    Rosser, J.; Morley, J.; Jackson, M.

    2012-06-01

    Indoor spatial data forms an important foundation to many ubiquitous computing applications. It gives context to users operating location-based applications, provides an important source of documentation of buildings and can be of value to computer systems where an understanding of environment is required. Unlike external geographic spaces, no centralised body or agency is charged with collecting or maintaining such information. Widespread deployment of mobile devices provides a potential tool that would allow rapid model capture and update by a building's users. Here we introduce some of the issues involved in volunteering building interior data and outline a simple mobile tool for capture of indoor models. The nature of indoor data is inherently private; however in-depth analysis of this issue and legal considerations are not discussed in detail here.

  3. A Non-Gaussian Stock Price Model: Options, Credit and a Multi-Timescale Memory

    NASA Astrophysics Data System (ADS)

    Borland, L.

    We review a recently proposed model of stock prices, based on a statistical feedback model that results in a non-Gaussian distribution of price changes. Applications to option pricing and the pricing of debt are discussed. A generalization to account for feedback effects over multiple timescales is also presented. This model reproduces most of the stylized facts (i.e. statistical anomalies) observed in real financial markets.

  4. 97. ORIGINAL ARCHITECT'S MODEL OF BUILDING AS FIRST DESIGNED, NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    97. ORIGINAL ARCHITECT'S MODEL OF BUILDING AS FIRST DESIGNED, NORTH FRONT - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  5. Promoting Mental Model Building in Astronomy Education

    ERIC Educational Resources Information Center

    Taylor, Ian; Barker, Miles; Jones, Alister

    2003-01-01

    While astronomy has recently re-emerged in many science curricula, there remain unresolved teaching and learning difficulties peculiar to astronomy education. This paper argues that mental model building, the core process in astronomy itself, should be reflected in astronomy education. Also, this crucial skill may promote a better understanding of…

  6. Promoting Mental Model Building in Astronomy Education

    ERIC Educational Resources Information Center

    Taylor, Ian; Barker, Miles; Jones, Alister

    2003-01-01

    While astronomy has recently re-emerged in many science curricula, there remain unresolved teaching and learning difficulties peculiar to astronomy education. This paper argues that mental model building, the core process in astronomy itself, should be reflected in astronomy education. Also, this crucial skill may promote a better understanding of…

  7. Scripted Building Energy Modeling and Analysis (Presentation)

    SciTech Connect

    Macumber, D.

    2012-10-01

    Building energy analysis is often time-intensive, error-prone, and non-reproducible. Entire energy analyses can be scripted end-to-end using the OpenStudio Ruby API. Common tasks within an analysis can be automated using OpenStudio Measures. Graphical user interfaces (GUI's) and component libraries reduce time, decrease errors, and improve repeatability in energy modeling.

  8. Modelling the growth of herring from four different stocks in the North Sea

    NASA Astrophysics Data System (ADS)

    Heath, M.; Scott, B.; Bryant, A. D.

    1997-12-01

    Variations in growth of the 1961-1983 year classes of North Sea herring larvae and juveniles from four different stocks in the North Sea have been modelled in a two-stage process. First, the ERSEM transport model and a database of temperature conditions in the North Sea have been used to simulate the year-specific dispersal and timing of recruitment of larvae to a model of juvenile growth. The juvenile model was forced by temperature and continuous plankton recorder (CPR) data, and migration was modelled from survey data on the relative distribution of stock components in the North Sea. The model explains the observed differences in mean growth from hatching to 1.5 years old of herring of different stock origins over the period 1970-1981, and therefore it has been concluded that the growth differences are generated mainly by the hydrographic conditions and plankton abundance along the drift trajectory of the larvae and migration route of the early juveniles. Comparison of the time series of modelled size-at-age for juveniles from the Shetland stock with observations for the same period shows that the model explains short-term year-to-year variability in growth, correctly identifying extreme years, but fails to explain the longer-term underlying trends. The model performed best over the period 1970-1981 when population biomass was uniformly low, and deviated during 1961-1969 when biomass was declining from high levels. The inclusion of population biomass as an independent explanatory variable in the comparison of model results with the longer-term data accounts for up to 58% of the total variance in the observations. Thus, it is concluded that hydrographic and planktonic conditions in the North Sea account for the short-term year-to-year variability in growth, but the major underlying trends over the last 40 years are due primarily to density dependence.

  9. Stochastic frontier model approach for measuring stock market efficiency with different distributions.

    PubMed

    Hasan, Md Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md Azizul

    2012-01-01

    The stock market is considered essential for economic growth and expected to contribute to improved productivity. An efficient pricing mechanism of the stock market can be a driving force for channeling savings into profitable investments and thus facilitating optimal allocation of capital. This study investigated the technical efficiency of selected groups of companies of Bangladesh Stock Market that is the Dhaka Stock Exchange (DSE) market, using the stochastic frontier production function approach. For this, the authors considered the Cobb-Douglas Stochastic frontier in which the technical inefficiency effects are defined by a model with two distributional assumptions. Truncated normal and half-normal distributions were used in the model and both time-variant and time-invariant inefficiency effects were estimated. The results reveal that technical efficiency decreased gradually over the reference period and that truncated normal distribution is preferable to half-normal distribution for technical inefficiency effects. The value of technical efficiency was high for the investment group and low for the bank group, as compared with other groups in the DSE market for both distributions in time-varying environment whereas it was high for the investment group but low for the ceramic group as compared with other groups in the DSE market for both distributions in time-invariant situation.
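
    The Cobb-Douglas stochastic frontier referred to here is conventionally specified (in logs) as

    \[
    \ln y_{it} = \beta_0 + \sum_{k}\beta_k \ln x_{kit} + v_{it} - u_{it},
    \]

    where \(v_{it} \sim N(0,\sigma_v^2)\) is random noise and \(u_{it} \ge 0\) captures technical inefficiency, assumed half-normal or truncated normal as in the two distributional cases compared in the study; the exact input variables used for the DSE companies are not reproduced here.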

  10. Stochastic Frontier Model Approach for Measuring Stock Market Efficiency with Different Distributions

    PubMed Central

    Hasan, Md. Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md. Azizul

    2012-01-01

    The stock market is considered essential for economic growth and expected to contribute to improved productivity. An efficient pricing mechanism of the stock market can be a driving force for channeling savings into profitable investments and thus facilitating optimal allocation of capital. This study investigated the technical efficiency of selected groups of companies of Bangladesh Stock Market that is the Dhaka Stock Exchange (DSE) market, using the stochastic frontier production function approach. For this, the authors considered the Cobb-Douglas Stochastic frontier in which the technical inefficiency effects are defined by a model with two distributional assumptions. Truncated normal and half-normal distributions were used in the model and both time-variant and time-invariant inefficiency effects were estimated. The results reveal that technical efficiency decreased gradually over the reference period and that truncated normal distribution is preferable to half-normal distribution for technical inefficiency effects. The value of technical efficiency was high for the investment group and low for the bank group, as compared with other groups in the DSE market for both distributions in time- varying environment whereas it was high for the investment group but low for the ceramic group as compared with other groups in the DSE market for both distributions in time-invariant situation. PMID:22629352

  11. Theoretical development of a simplified wheelset model to evaluate collision-induced derailments of rolling stock

    NASA Astrophysics Data System (ADS)

    Koo, Jeong Seo; Choi, Se Young

    2012-06-01

    A theoretical method is proposed to predict and evaluate collision-induced derailments of rolling stock by using a simplified wheelset model and is verified with dynamic simulations. Because the impact forces occurring during collision are transmitted from the car body to the bogies and axles through suspensions, rolling stock leads to derailment as a result of the combination of horizontal and vertical impact forces applied to the axle and a simplified wheelset model enforced at the axle can be used to theoretically formulate derailment behaviors. The derailment type depends on the combination of the horizontal and vertical forces, the flange angle and the friction coefficient. According to collision conditions, wheel-climb, wheel-lift or roll-over derailment can occur between the wheel and the rail. In this theoretical derailment model of a simplified wheelset, the derailment types are classified as Slip-up, Slip/roll-over, Climb-up, Climb/roll-over and pure Roll-over according to the derailment mechanisms between the wheel and the rail and the theoretical conditions needed to generate each derailment mechanism are proposed. The theoretical wheelset model is verified by dynamic simulation and its applicability is demonstrated by comparing the simulation results of the theoretical wheelset model with those of an actual wheelset model. The theoretical derailment wheelset model is in good agreement with the virtual testing model simulation for a collision-induced derailment of rolling stock.

  12. Indoor Air Quality Building Education and Assessment Model

    EPA Pesticide Factsheets

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  13. Indoor Air Quality Building Education and Assessment Model Forms

    EPA Pesticide Factsheets

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  14. Building the RHIC tracking lattice model

    SciTech Connect

    Luo, Y.; Fischer, W.; Tepikian, S.

    2010-01-27

    In this note we outline the procedure to build a realistic lattice model for the RHIC beam-beam tracking simulation. We will install multipole field errors in the arc main dipoles, arc main quadrupoles and interaction region magnets (DX, D0, and triplets) and introduce a residual closed orbit, tune ripples, and physical apertures in the tracking lattice model. Nonlinearities such as local IR multipoles, second order chromaticities and third order resonance driving terms are also corrected before tracking.

  15. Use of anthropometric dummies of mathematical models in the safety and comfortableness analysis of a passenger rolling stock

    NASA Astrophysics Data System (ADS)

    Kobishchanov, V.; Antipin, D.; Shorokhov, S.; Mitrakov, A.

    2016-04-01

    Approaches to the safety and comfort analysis of railway passenger rolling stock using anthropometric dummies in mathematical models are offered. Recommendations on rolling stock design are given, based on an analysis of injuries to passengers and train crew members, as well as on comfort parameters under various modes of train movement.

  16. From Models to Measurements: Comparing Downed Dead Wood Carbon Stock Estimates in the U.S. Forest Inventory

    Treesearch

    Grant M. Domke; Christopher W. Woodall; Brian F. Walters; James E. Smith

    2013-01-01

    The inventory and monitoring of coarse woody debris (CWD) carbon (C) stocks is an essential component of any comprehensive National Greenhouse Gas Inventory (NGHGI). Due to the expense and difficulty associated with conducting field inventories of CWD pools, CWD C stocks are often modeled as a function of more commonly measured stand attributes such as live tree C...

  17. Development of Residential Prototype Building Models and Analysis System for Large-Scale Energy Efficiency Studies Using EnergyPlus

    SciTech Connect

    Mendon, Vrushali V.; Taylor, Zachary T.

    2014-09-10

    Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is allowing enough flexibility to address variability in house features such as geometry, configuration and HVAC systems. Researchers solved this problem in a novel way by creating a simulation structure capable of generating fully functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent a majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.
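
    A minimal sketch of how prototype simulation results and weighting factors combine into a stock-level figure. The prototype categories, energy-use intensities and weights below are invented placeholders, not values from the PNNL study.

```python
# Aggregate prototype results to a stock-level estimate with weighting factors.
# All names and numbers are illustrative placeholders.
prototype_eui = {                       # simulated site EUI in kWh/m2/yr
    ("single_family", "gas_furnace"): 120.0,
    ("single_family", "heat_pump"):    95.0,
    ("multifamily",   "gas_furnace"): 100.0,
    ("multifamily",   "heat_pump"):    80.0,
}
floor_area_weight = {                   # share of new-construction floor area (sums to 1)
    ("single_family", "gas_furnace"): 0.40,
    ("single_family", "heat_pump"):   0.25,
    ("multifamily",   "gas_furnace"): 0.20,
    ("multifamily",   "heat_pump"):   0.15,
}

stock_average_eui = sum(prototype_eui[k] * floor_area_weight[k] for k in prototype_eui)
print(f"weighted stock-average EUI: {stock_average_eui:.1f} kWh/m2/yr")
```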

  18. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

    In empirical modeling, there have been two strands for pricing in the options literature, namely the parametric and nonparametric models. Often, the support for the nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice for FNN models is due to their well-studied universal approximation properties of an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools, over their sophisticated parametric counterparts in predictive settings. There are two routes to explain the superiority of FNN models over the parametric models in forecast settings. These are nonnormality of return distributions and adaptive learning.

  19. Fractality of profit landscapes and validation of time series models for stock prices

    NASA Astrophysics Data System (ADS)

    Yi, Il Gu; Oh, Gabjin; Kim, Beom Jun

    2013-08-01

    We apply a simple trading strategy to various time series of real and artificial stock prices to understand the origin of the fractality observed in the resulting profit landscapes. The strategy contains only two parameters p and q, and the sell (buy) decision is made when the log return is larger (smaller) than p (-q). We discretize the unit square (p,q) ∈ [0,1] × [0,1] into an N × N square grid and the profit Π(p,q) is calculated at the center of each cell. We confirm the previous finding that local maxima in profit landscapes are scattered in a fractal-like fashion: the number M of local maxima follows the power-law form M ~ N^a, but the scaling exponent a is found to differ for different time series. From comparisons of real and artificial stock prices, we find that the fat-tailed return distribution is closely related to the exponent a ≈ 1.6 observed for real stock markets. We suggest that the fractality of the profit landscape characterized by a ≈ 1.6 can be a useful measure to validate time series models for stock prices.
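
    A rough sketch of the procedure on synthetic data: evaluate the two-parameter strategy on a grid and count local maxima of the profit landscape. The long/flat bookkeeping and the rescaled threshold range are assumptions made so that signals fire with these synthetic returns; the paper's own accounting and its unit-square grid may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
log_price = np.cumsum(rng.normal(0.0, 0.02, 5000))   # artificial log-price series

def profit(p: float, q: float) -> float:
    """Simplified strategy: go long after a 'buy' signal (log return < -q),
    go flat after a 'sell' signal (log return > p)."""
    position, pnl = 0, 0.0
    for r in np.diff(log_price):
        pnl += position * r
        if r > p:
            position = 0
        elif r < -q:
            position = 1
    return pnl

N = 20                                                # grid resolution (kept small for speed)
scale = 0.05                                          # threshold range, rescaled from the paper's [0, 1]
grid = np.array([[profit((i + 0.5) / N * scale, (j + 0.5) / N * scale)
                  for j in range(N)] for i in range(N)])

# Count strict local maxima over the 4-neighbourhood; the paper studies how
# this number M scales with the grid resolution N (M ~ N^a).
padded = np.pad(grid, 1, constant_values=-np.inf)
is_max = ((grid > padded[:-2, 1:-1]) & (grid > padded[2:, 1:-1]) &
          (grid > padded[1:-1, :-2]) & (grid > padded[1:-1, 2:]))
print("local maxima found:", int(is_max.sum()))
```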

  20. Stochastic modeling of stock price process induced from the conjugate heat equation

    NASA Astrophysics Data System (ADS)

    Paeng, Seong-Hun

    2015-02-01

    Currency can be considered as a ruler for values of commodities. Then the price is the measured value by the ruler. We can suppose that inflation and variation of exchange rate are caused by variation of the scale of the ruler. In geometry, variation of the scale means that the metric is time-dependent. The conjugate heat equation is the modified heat equation which satisfies the heat conservation law for the time-dependent metric space. We propose a new model of stock prices by using the stochastic process whose transition probability is determined by the kernel of the conjugate heat equation. Our model of stock prices shows how the volatility term is affected by inflation and exchange rate. This model modifies the Black-Scholes equation in light of inflation and exchange rate.

  1. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  2. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-liv...

  5. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Jun

    2012-10-01

    A continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations therefore needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model. Here, the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of the continuum percolation is used to define a cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model with several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.

  6. Quantum modeling of nonlinear dynamics of stock prices: Bohmian approach

    NASA Astrophysics Data System (ADS)

    Choustova, O.

    2007-08-01

    We use quantum mechanical methods to model the price dynamics in the financial market mathematically. We propose describing behavioral financial factors using the pilot-wave (Bohmian) model of quantum mechanics. The real price trajectories are determined (via the financial analogue of the second Newton law) by two financial potentials: the classical-like potential V (q) (“hard” market conditions) and the quantumlike potential U(q) (behavioral market conditions).

  7. Prediction of stock markets by the evolutionary mix-game model

    NASA Astrophysics Data System (ADS)

    Chen, Fang; Gou, Chengling; Guo, Xiaoqian; Gao, Jieping

    2008-06-01

    This paper presents the efforts of using the evolutionary mix-game model, which is a modified form of the agent-based mix-game model, to predict financial time series. Here, we have carried out three methods to improve the original mix-game model by adding the abilities of strategy evolution to agents, and then applying the new model referred to as the evolutionary mix-game model to forecast the Shanghai Stock Exchange Composite Index. The results show that these modifications can improve the accuracy of prediction greatly when proper parameters are chosen.

  8. Scripted Building Energy Modeling and Analysis: Preprint

    SciTech Connect

    Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.

    2012-08-01

    Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.

  9. Forest soil carbon stock estimates in a nationwide inventory: evaluating performance of the ROMULv and Yasso07 models in Finland

    NASA Astrophysics Data System (ADS)

    Lehtonen, Aleksi; Linkosalo, Tapio; Peltoniemi, Mikko; Sievänen, Risto; Mäkipää, Raisa; Tamminen, Pekka; Salemaa, Maija; Nieminen, Tiina; Ťupek, Boris; Heikkinen, Juha; Komarov, Alexander

    2016-11-01

    Dynamic soil models are needed for estimating the impact of weather and climate change on soil carbon stocks and fluxes. Here, we evaluate the performance of the Yasso07 and ROMULv models against forest soil carbon stock measurements. More specifically, we ask whether litter quantity, litter quality and weather data are sufficient drivers for soil carbon stock estimation. We also test whether inclusion of soil water holding capacity improves the reliability of modelled soil carbon stock estimates. Litter input of trees was estimated from stem volume maps provided by the National Forest Inventory, while understorey vegetation was estimated using new biomass models. The litter production rates of trees were based on earlier research, while for understorey biomass they were estimated from measured data. We applied the Yasso07 and ROMULv models across Finland and ran them to steady state; thereafter, measured soil carbon stocks were compared with model estimates. We found that the role of understorey litter input was underestimated when the Yasso07 model was parameterised, especially in northern Finland. We also found that the inclusion of soil water holding capacity in the ROMULv model improved predictions, especially in southern Finland. Our simulations and measurements show that models using only litter quality, litter quantity and weather data underestimate the soil carbon stock in southern Finland, and this underestimation is due to omission of the impact of droughts on the decomposition of organic layers. Our results also imply that the ecosystem modelling community and greenhouse gas inventories should improve understorey litter estimation in the northern latitudes.
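
    Both models were run to steady state, so a one-pool, first-order sketch illustrates the underlying idea: with litter input I and a climate-modified decomposition rate k, the equilibrium stock is C_ss = I / k. The climate modifier and every parameter value below are invented placeholders, not the Yasso07 or ROMULv response functions.

```python
import math

def steady_state_stock(litter_input, base_rate, temp_c, precip_mm):
    """One-pool first-order sketch: dC/dt = I - k(T, P) * C  =>  C_ss = I / k.
    The climate modifier is an invented placeholder, not a Yasso07/ROMULv function."""
    climate_modifier = math.exp(0.07 * (temp_c - 4.0)) * min(precip_mm / 600.0, 1.5)
    k = base_rate * climate_modifier
    return litter_input / k

# Illustrative southern vs northern site (litter input in kg C m-2 yr-1, stock in kg C m-2).
print(round(steady_state_stock(0.25, 0.05, temp_c=5.0, precip_mm=650), 1))
print(round(steady_state_stock(0.15, 0.05, temp_c=0.0, precip_mm=500), 1))
```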

  10. Mapping soil organic carbon stocks by robust geostatistical and boosted regression models

    NASA Astrophysics Data System (ADS)

    Nussbaum, Madlene; Papritz, Andreas; Baltensweiler, Andri; Walthert, Lorenz

    2013-04-01

    Carbon (C) sequestration in forests offsets greenhouse gas emissions. Therefore, quantifying C stocks and fluxes in forest ecosystems is of interest for greenhouse gas reporting according to the Kyoto protocol. In Switzerland, the National Forest Inventory offers comprehensive data to quantify the aboveground forest biomass and its change over time. Estimating stocks of soil organic C (SOC) in forests is more difficult because the variables needed to quantify stocks vary strongly in space and precise quantification of some of them is very costly. Based on data from 1033 plots, we modeled SOC stocks of the organic layer and the mineral soil to depths of 30 cm and 100 cm for the Swiss forested area. For the statistical modeling a broad range of covariates was available: climate data (e.g. precipitation, temperature), two elevation models (resolutions 25 and 2 m) with respective terrain attributes, and spectral reflectance data representing vegetation. Furthermore, the main mapping units of an overview soil map and a coarse-scale geological map were used to coarsely represent the parent material of the soils. The selection of important covariates for SOC stock modeling out of a large set was a major challenge for the statistical modeling. We used two approaches to deal with this problem: 1) A robust restricted maximum likelihood method to fit a linear regression model with spatially correlated errors. The large number of covariates was first reduced by LASSO (Least Absolute Shrinkage and Selection Operator) and then further narrowed down to a parsimonious set of important covariates by cross-validation of the robustly fitted model. To account for nonlinear dependencies of the response on the covariates, interaction terms of the latter were included in the model if this improved the fit. 2) A boosted structured regression model with componentwise linear least squares or componentwise smoothing splines as base procedures. The selection of important covariates was done by the
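
    A minimal sketch of the LASSO screening step named in approach 1), using scikit-learn on synthetic data; the covariates, their number and the true effects are fabricated stand-ins, and the subsequent robust spatial regression and boosting steps are not shown.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)

# Synthetic stand-in for the plot data: many candidate covariates (climate,
# terrain, reflectance, map units), only a few of which truly matter.
n_plots, n_covariates = 300, 40
X = rng.normal(size=(n_plots, n_covariates))
true_beta = np.zeros(n_covariates)
true_beta[[0, 3, 7]] = [0.8, -0.5, 0.3]            # e.g. temperature, slope, reflectance
log_soc_stock = X @ true_beta + rng.normal(0.0, 0.3, n_plots)

# LASSO with a cross-validated penalty shrinks most coefficients to zero,
# leaving a reduced covariate set for the subsequent (robust) regression fit.
lasso = LassoCV(cv=5).fit(X, log_soc_stock)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
print("covariates retained by LASSO:", selected)
```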

  11. Urbanization has a positive net effect on soil carbon stocks: modelling outcomes for the Moscow region

    NASA Astrophysics Data System (ADS)

    Vasenev, Viacheslav; Stoorvogel, Jetse; Leemans, Rik; Valentini, Riccardo

    2016-04-01

    Urbanization is responsible for large environmental changes worldwide. Urbanization has traditionally been associated with negative environmental impacts, but recent research highlights the potential to store soil carbon (C) in urban areas. The net effect of urbanization on soil C is, however, poorly understood. Negative influences of construction and soil sealing can be compensated for by establishing green areas. We explored possible net effects of future urbanization on soil C-stocks in the Moscow Region. Urbanization was modelled as a function of environmental, socio-economic and neighbourhood factors. This yielded three alternative scenarios: i) including neighbourhood factors; ii) excluding neighbourhood factors and focusing on environmental drivers; and iii) considering the New Moscow Project, which establishes 1500 km2 of new urbanized area following governmental regulation. All three scenarios showed substantial urbanization of 500 to 2000 km2 of former forests and arable land. Our analysis shows a positive net effect on SOC stocks of 5 to 11 TgC. The highest increase occurred on the less fertile Orthic Podzols and Eutric Podzoluvisols, whereas C-storage in Orthic Luvisols, Luvic Chernozems, Dystric Histosols and Eutric Fluvisols increased less. Subsoil C-stocks were affected much more, with an extra 4 to 10 TgC, than those in the topsoils. The highest increase of both topsoil and subsoil C-stocks occurred in the New Moscow scenario with the highest urbanization. Even when the relatively high uncertainties of the absolute C-values are considered, a clear positive net effect of urbanization on C-stocks is apparent. This highlights the potential of cities to enhance C-storage, which will become progressively more important with increasing worldwide urbanization.

  12. Building shape models from lousy data.

    PubMed

    Lüthi, Marcel; Albrecht, Thomas; Vetter, Thomas

    2009-01-01

    Statistical shape models have gained widespread use in medical image analysis. In order for such models to be statistically meaningful, a large number of data sets have to be included. The number of available data sets is usually limited and often the data is corrupted by imaging artifacts or missing information. We propose a method for building a statistical shape model from such "lousy" data sets. The method works by identifying the corrupted parts of a shape as statistical outliers and excluding these parts from the model. Only the parts of a shape that were identified as outliers are discarded, while all the intact parts are included in the model. The model building is then performed using the EM algorithm for probabilistic principal component analysis, which allows for a principled way to handle missing data. Our experiments on 2D synthetic and real 3D medical data sets confirm the feasibility of the approach. We show that it yields superior models compared to approaches using robust statistics, which only downweight the influence of outliers.

  13. One-factor model for the cross-correlation matrix in the Vietnamese stock market

    NASA Astrophysics Data System (ADS)

    Nguyen, Quang

    2013-07-01

    Random matrix theory (RMT) has been applied to the analysis of the cross-correlation matrix of a financial time series. The most important findings of previous studies using this method are that the eigenvalue spectrum largely follows that of random matrices but the largest eigenvalue is at least one order of magnitude higher than the maximum eigenvalue predicted by RMT. In this work, we investigate the cross-correlation matrix in the Vietnamese stock market using RMT and find similar results to those of studies realized in developed markets (US, Europe, Japan) [9-18] as well as in other emerging markets [20,21,19,22]. Importantly, we found that the largest eigenvalue could be approximated by the product of the average cross-correlation coefficient and the number of stocks studied. We demonstrate this dependence using a simple one-factor model. The model could be extended to describe other characteristics of the realistic data.
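
    The reported approximation (largest eigenvalue ≈ average correlation × number of stocks) can be checked directly on a correlation matrix with a constant off-diagonal entry, for which the exact largest eigenvalue is 1 + (N - 1)c; the size and mean correlation below are illustrative.

```python
import numpy as np

N, c = 100, 0.25                 # number of stocks and mean cross-correlation (illustrative)
C = np.full((N, N), c)
np.fill_diagonal(C, 1.0)

largest = np.linalg.eigvalsh(C)[-1]      # eigenvalues returned in ascending order
print(largest, 1 + (N - 1) * c, c * N)   # exact value 1 + (N-1)c, and the approximation c*N
```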

  14. Modeling and Scaling of the Distribution of Trade Avalanches in a STOCK Market

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Joo

    We study the trading activity in the Korea Stock Exchange by considering trade avalanches. A series of successive trades with small inter-trade time intervals is regarded as a trade avalanche, whose size s is defined as the number of trades in the series. We measure the distribution of trade avalanche sizes P(s) and find that it follows the power-law behavior P(s) ~ s^-α with the exponent α ≈ 2 for the two stocks with the largest number of trades. A simple stochastic model which describes the power-law behavior of the distribution of trade avalanche sizes is introduced. In the model it is assumed that some trades induce accompanying trades, which results in the trade avalanches, and we find that the distribution of the trade avalanche size also follows power-law behavior with the exponent α ≈ 2.
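
    A sketch of the avalanche construction on synthetic tick data: a maximal run of consecutive inter-trade gaps below a cut-off is one avalanche, and its size is the number of trades in the run. The exponential gaps and the cut-off are assumptions; with such memoryless synthetic data the sizes come out geometric rather than power-law, so real, clustered tick data would be needed to reproduce the reported exponent.

```python
import numpy as np

rng = np.random.default_rng(3)
inter_trade = rng.exponential(1.0, 200_000)   # seconds between trades (synthetic)
threshold = 0.2                               # "small interval" cut-off (illustrative)

# An avalanche is a maximal run of consecutive gaps below the threshold;
# its size s is the number of trades in that run.
sizes, run = [], 1
for gap in inter_trade:
    if gap < threshold:
        run += 1
    else:
        sizes.append(run)
        run = 1
sizes.append(run)

s, counts = np.unique(np.array(sizes), return_counts=True)
print(list(zip(s[:5], counts[:5])))           # empirical size distribution P(s)
```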

  15. Simulating Soil C Stock with the Process-based Model

    USDA-ARS?s Scientific Manuscript database

    The prospect of storing carbon (C) in soil, as soil organic matter (SOM), provides an opportunity for agriculture to contribute to the reduction of carbon dioxide in the atmosphere while enhancing soil properties. Soil C models are useful for examining the complex interactions between crop, soil man...

  16. Modelling soil organic carbon stocks along topographic transects under climate change scenarios using CarboSOIL

    NASA Astrophysics Data System (ADS)

    Kotb Abd-Elmabod, Sameh; Muñoz-Rojas, Miriam; Jordán, Antonio; Anaya-Romero, María; de la Rosa, Diego

    2014-05-01

    CarboSOIL is a land evaluation model for soil organic carbon (SOC) accounting under global change scenarios (Muñoz-Rojas et al., 2013a; 2013b) and is a new component of the MicroLEIS Decision Support System. MicroLEIS is a tool for decision-makers dealing with specific agro-ecological problems such as soil contamination risks (Abd-Elmabod et al., 2010; Abd-Elmabod et al., 2012), and it has been designed as a knowledge-based approach incorporating a set of interlinked databases. Global change and land use changes in recent decades have caused substantial impacts on vegetation carbon stocks (Muñoz-Rojas et al., 2011) and soil organic carbon stocks, especially in sensitive areas such as the Mediterranean region (Muñoz-Rojas et al., 2012a; 2012b). This study aims to investigate the influence of topography, climate, land use and soil factors on SOC stocks through the application of CarboSOIL in a representative area of the Mediterranean region (Seville, Spain). Two topographic transects (S-N and W-E oriented) were considered, including 63 points separated by 4 km each. These points are associated with 41 soil profiles extracted from the SDBm soil database (De la Rosa et al., 2001) and with climatic information (average minimum temperature, average maximum temperature and average rainfall per month) extracted from raster databases (Andalusian Environmental Information Network, REDIAM). CarboSOIL has been applied along the topographic transects at different soil depths and under different climate change scenarios. Climate scenarios have been calculated according to the global climate model CNRMCM3 by extracting spatial climate data under the IPCC A1B scenario for the current period (average data from 1960-2000), 2040, 2070 and 2100. In the current scenario, results show that the highest SOC stock values are located on Typic Haploxeralfs under olive groves for the 0-25 cm and 25-50 cm soil sections, but the highest values were determined on fruit-cropped Rendolic Xerothent in the 50-75 cm

  17. Building information models for astronomy projects

    NASA Astrophysics Data System (ADS)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties such as bills of quantities, definitions of COTS components, the status of materials in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and a large amount of special equipment and mechanisms is involved as a fundamental part of the facility. The detailed design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software, which can be used before the construction phase starts for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows the construction sequence to be planned efficiently (4D). This is a powerful tool to study and analyze in detail alternative construction sequences and to ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST enclosures as application examples.

  18. Microscopic spin model for the dynamics of the return distribution of the Korean stock market index

    NASA Astrophysics Data System (ADS)

    Yang, Jae-Suk; Chae, Seungbyung; Jung, Woo-Sung; Moon, Hie-Tae

    2006-05-01

    In this paper, we studied the dynamics of the log-return distribution of the Korean Composition Stock Price Index (KOSPI) from 1992 to 2004. Based on the microscopic spin model, we found that while the index during the late 1990s showed a power-law distribution, the distribution in the early 2000s was exponential. This change in distribution shape was caused by the duration and velocity, among other parameters, of the information that flowed into the market.

  19. Boxes of Model Building and Visualization.

    PubMed

    Turk, Dušan

    2017-01-01

    Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and the availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now determine an average single structure better than ever. The tools work better, the required engagement of the human brain is lower, and the frontier of intellectual and scientific challenges has moved on. The quest to resolve new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.

  20. A dynamical model describing stock market price distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Porrà, Josep M.

    2000-08-01

    High-frequency data in finance have led to a deeper understanding of the probability distributions of market prices. Several facts seem to be well established by empirical evidence. Specifically, probability distributions have the following properties: (i) They are not Gaussian and their center is well adjusted by Lévy distributions. (ii) They are long-tailed but have finite moments of any order. (iii) They are self-similar on many time scales. Finally, (iv) at small time scales, price volatility follows a non-diffusive behavior. We extend Merton's ideas on speculative price formation and present a dynamical model resulting in a characteristic function that explains in a natural way all of the above features. The knowledge of such a distribution opens a new and useful way of quantifying financial risk. The results of the model agree, with a high degree of accuracy, with empirical data taken from historical records of the Standard & Poor's 500 cash index.

  1. Fractional Brownian Motion with Stochastic Variance:. Modeling Absolute Returns in STOCK Markets

    NASA Astrophysics Data System (ADS)

    Roman, H. E.; Porto, M.

    We discuss a model for simulating a long-time memory in time series characterized in addition by a stochastic variance. The model is based on a combination of fractional Brownian motion (FBM) concepts, for dealing with the long-time memory, with an autoregressive scheme with conditional heteroskedasticity (ARCH), responsible for the stochastic variance of the series, and is denoted as FBMARCH. Unlike well-known fractionally integrated autoregressive models, FBMARCH admits finite second moments. The resulting probability distribution functions have power-law tails with exponents similar to ARCH models. This idea is applied to the description of long-time autocorrelations of absolute returns ubiquitously observed in stock markets.

  2. Modeling soil organic carbon stocks at national scales - systematic validation of models and carbon input estimations

    NASA Astrophysics Data System (ADS)

    Riggers, Catharina; Dechow, Rene; Poeplau, Christopher; Don, Axel

    2017-04-01

    Soil organic carbon (SOC) content of arable soils is an important factor which not only influences soil fertility but also formation of greenhouse gases. SOC models try to simulate and predict the changes in carbon content in soils depending on parameters like temperature, precipitation, clay content and also carbon (C) input. For future climate mitigation strategies, it is necessary to minimize uncertainty while predicting trends in soil carbon stocks. The aim of our study is to conduct model based estimations of trends of local, regional and national SOC contents on German grassland and arable soils and to quantify scale dependent uncertainties arising from input data uncertainty, parameter uncertainty and model structural uncertainty. Preanalysis of SOC models showed that a large fraction of uncertainty in SOC trends is related to C-input estimates from crop residues and organic fertilisation. Therefore, we are going to combine six different SOC models (RothC, C-Tool, Yasso07, Century, ICBM/2, CCB) with five different approaches to estimate carbon input (Bolinder, CCB, C-Tool, ICBM, IPCC). This set of model combinations will be evaluated with data from German permanent soil monitoring sites and long term field experiments. With the best model combinations, we will conduct parameter estimations to calibrate the models for Germany. Finally, the calibrated model ensemble will be combined with data from the German agricultural soil inventory which sampled agricultural soils in Germany in an 8x8 km2 grid following standardized protocols to quantify German SOC trends and associated uncertainties by Monte Carlo methods.

  3. Testing Yasso07 and CENTURY soil C models with boreal forest soil C stocks and CO2 efflux measurements

    NASA Astrophysics Data System (ADS)

    Tupek, Boris; Peltoniemi, Mikko; Launiainen, Samuli; Kulmala, Liisa; Penttilä, Timo; Lehtonen, Aleksi

    2017-04-01

    Soil C models need further development, especially in terms of factors influencing the spatial variability of soil C stocks and soil C stock changes. In this study we tested the estimates of soil C stocks and C stock changes of two widely used soil C models (Yasso07 and CENTURY) against measurements of the boreal forest soil C stock and CO2 efflux at four forest sites in Finland. In addition we evaluated the effects of using coarse versus detailed meteorological, soil, and plant litter input data on modeled monthly CO2 estimates. We found that the CO2 estimates of both models showed a similar seasonal CO2 efflux pattern to the upscaled monthly measurements, regardless of whether the models used soil properties as input data. Winter and early summer CO2 fluxes agreed somewhat better between estimates and measurements than summer CO2 peaks and autumn CO2 levels, which were underestimated by the models. Both models also underestimated equilibrium soil organic carbon (SOC) stocks, although the SOC stocks of CENTURY were larger than those of Yasso07. CENTURY was more sensitive than Yasso07 to variation in the meteorological input data, and also to the functional form of the temperature response of decomposition. In conclusion, for modeling boreal forest soil C, Yasso07 would benefit from including soil properties in the model structure, while CENTURY would benefit from a reformulation of its temperature and moisture functions.

  4. Building Chaotic Model From Incomplete Time Series

    NASA Astrophysics Data System (ADS)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission intermittently fails for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sums of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of the Port of Rotterdam. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a random process to the original (complete) time series, is utilized. Two main performance measures are used in this work: (1) error measures between the actual
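
    A compact sketch of the core prediction step named in the abstract: time-delay embedding of a scalar series followed by a one-step forecast from the nearest dynamical neighbours. The embedding dimension, delay, neighbour count and the noisy-sine test series are illustrative choices, and the imputation and missing-value handling the paper focuses on are not shown.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into a dim-dimensional phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def local_model_forecast(x, dim=3, tau=1, k=5):
    """One-step forecast from the k nearest dynamical neighbours of the most
    recent reconstructed state (an adaptive local model in its simplest form)."""
    emb = delay_embed(x, dim, tau)
    query, history = emb[-1], emb[:-1]
    nn = np.argsort(np.linalg.norm(history - query, axis=1))[:k]
    next_idx = nn + (dim - 1) * tau + 1   # observation following each neighbour's state
    return float(np.mean(x[next_idx]))

# Noisy sine as a stand-in for the hourly surge series.
t = np.arange(2000)
series = np.sin(0.1 * t) + 0.05 * np.random.default_rng(4).standard_normal(2000)
print(round(local_model_forecast(series), 3))
```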

  5. Simplified Models for Dark Matter Model Building

    NASA Astrophysics Data System (ADS)

    DiFranzo, Anthony Paul

    The largest mass component of the universe is a longstanding mystery to the physics community. As a glaring source of new physics beyond the Standard Model, there is a large effort to uncover the quantum nature of dark matter. Many probes have been formed to search for this elusive matter; cultivating a rich environment for a phenomenologist. In addition to the primary probes---colliders, direct detection, and indirect detection---each with their own complexities, there is a plethora of prospects to illuminate our unanswered questions. In this work, phenomenological techniques for studying dark matter and other possible hints of new physics will be discussed. This work primarily focuses on the use of Simplified Models, which are intended to be a compromise between generality and validity of the theoretical description. They are often used to parameterize a particular search, develop a well-defined sense of complementarity between searches, or motivate new search strategies. Explicit examples of such models and how they may be used will be the highlight of each chapter.

  6. Building character: a model for reflective practice.

    PubMed

    Bryan, Charles S; Babelay, Allison M

    2009-09-01

    In 1950, Harrison and colleagues proposed that the physician's ultimate and sufficient destiny should be to "build an enduring edifice of character." Recent work in philosophy underscores the importance of character ethics (virtue ethics) as a complement to ethical systems based on duty (deontology) or results (consequentialism). Recent work in psychology suggests that virtues and character strengths can, to at least some extent, be analyzed and taught. Building character might be enhanced by promoting among students, residents, and faculty a four-step method of reflective practice that includes (1) the details of a situation, (2) the relevant virtues, (3) the relevant principles, values, and ethical frameworks, and (4) the range of acceptable courses of action. Exercises using such a model bring together the major goals of ethics education in U.S. medical schools--teaching the set of skills needed for resolving ethical dilemmas and promoting virtue and professionalism among physicians.

  7. Model building in nonproportional hazard regression.

    PubMed

    Rodríguez-Girondo, Mar; Kneib, Thomas; Cadarso-Suárez, Carmen; Abu-Assi, Emad

    2013-12-30

    Recent developments in statistical methods allow for very flexible modeling of covariates affecting survival times via the hazard rate, including also the inspection of possible time-dependent associations. Despite their immediate appeal in terms of flexibility, these models typically introduce additional difficulties when a subset of covariates and the corresponding modeling alternatives have to be chosen, that is, for building the most suitable model for the given data. This is particularly true when potentially time-varying associations are allowed for. We propose to conduct a piecewise exponential representation of the original survival data to link hazard regression with estimation schemes based on the Poisson likelihood, so as to make recent advances in model building for exponential family regression accessible also in the nonproportional hazard regression context. A two-stage stepwise selection approach, an approach based on doubly penalized likelihood, and a componentwise functional gradient descent approach are adapted to the piecewise exponential regression problem. These three techniques were compared via an intensive simulation study. An application to prognosis after discharge for patients who suffered a myocardial infarction supplements the simulation to demonstrate the pros and cons of the approaches in real data analyses.
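
    A minimal sketch of the piecewise exponential trick on synthetic data: follow-up time is split into intervals, each subject contributes one row per interval with its exposure time and event indicator, and a Poisson GLM with a log-exposure offset then reproduces the piecewise exponential hazard model. The data, the cut points and the single covariate are illustrative; the selection methods compared in the paper are not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic survival data with one covariate and random censoring (true log-hazard = 0.5 * x).
n = 400
x = rng.normal(size=n)
event_time = rng.exponential(scale=np.exp(-0.5 * x))
cens_time = rng.exponential(scale=2.0, size=n)
obs_time = np.minimum(event_time, cens_time)
event = (event_time <= cens_time).astype(int)

# Piecewise exponential expansion: one row per subject-interval.
cuts = np.quantile(obs_time, [0.25, 0.5, 0.75])
rows = []
for ti, di, xi in zip(obs_time, event, x):
    start = 0.0
    for j, end in enumerate(list(cuts) + [np.inf]):
        stop = min(ti, end)
        rows.append({"interval": j, "x": xi, "exposure": stop - start,
                     "event": int(di) if ti <= end else 0})
        if ti <= end:
            break
        start = end
long_data = pd.DataFrame(rows)

# Poisson likelihood with a log-exposure offset; interval dummies give the baseline hazard.
X = pd.get_dummies(long_data["interval"], prefix="int", dtype=float).join(long_data["x"])
fit = sm.GLM(long_data["event"], X, family=sm.families.Poisson(),
             offset=np.log(long_data["exposure"])).fit()
print(fit.params.round(2))    # coefficient on x should be near the true value 0.5
```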

  8. Building models of animals from video.

    PubMed

    Ramanan, Deva; Forsyth, David A; Barnard, Kobus

    2006-08-01

    This paper argues that tracking, object detection, and model building are all similar activities. We describe a fully automatic system that builds 2D articulated models known as pictorial structures from videos of animals. The learned model can be used to detect the animal in the original video--in this sense, the system can be viewed as a generalized tracker (one that is capable of modeling objects while tracking them). The learned model can be matched to a visual library; here, the system can be viewed as a video recognition algorithm. The learned model can also be used to detect the animal in novel images--in this case, the system can be seen as a method for learning models for object recognition. We find that we can significantly improve the pictorial structures by augmenting them with a discriminative texture model learned from a texture library. We develop a novel texture descriptor that outperforms the state-of-the-art for animal textures. We demonstrate the entire system on real video sequences of three different animals. We show that we can automatically track and identify the given animal. We use the learned models to recognize animals from two data sets; images taken by professional photographers from the Corel collection, and assorted images from the Web returned by Google. We demonstrate quite good performance on both data sets. Comparing our results with simple baselines, we show that, for the Google set, we can detect, localize, and recover part articulations from a collection demonstrably hard for object recognition.

  9. A Master Equation Approach to Modeling Short-term Behaviors of the Stock Market

    NASA Astrophysics Data System (ADS)

    Zhao, Conan; Yang, Xiaoxiang; Mazilu, Irina

    2015-03-01

    Short term fluctuations in stock prices are highly random, due to the multitude of external factors acting on the price determination process. While long-term economic factors such as inflation and revenue growth rate affect short-term price fluctuation, it is difficult to obtain the complete set of information and uncertainties associated with a given period of time. Instead, we propose a simpler short-term model based on only prior price averages and extrema. In this paper, we take a master equation under the random walk hypothesis and fit parameters based on AAPL stock price data over the past ten years. We report results for small system sizes and for the short term average price. These results may lead to a general closed-form solution to this particular master equation.

  10. Methodology for Modeling Building Energy Performance across the Commercial Sector

    SciTech Connect

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  11. Some new results on the Levy, Levy and Solomon microscopic stock market model

    NASA Astrophysics Data System (ADS)

    Zschischang, Elmar; Lux, Thomas

    2001-03-01

    We report some findings from our simulations of the Levy, Levy and Solomon microscopic stock market model. Our results cast doubt on some of the results published in the original papers (i.e., chaotic stock price movements). We also point out the possibility of sensitive dependence of the emerging wealth distribution among agents on initial conditions. Extensions of the model set-up show that, with varying degrees of risk aversion, the less risk averse traders will tend to dominate the market. Similarly, when introducing a new trader group (or even a single trader) with a constant share of stocks in their portfolio, the latter will eventually take over and marginalize the other groups. The better performance of the more sober investors is in accordance with traditional perceptions in financial economics. Hence, the survival of ‘noise traders’ looking at short-term trends and patterns remains as much of a puzzle in this framework as in the traditional Efficient Market Theory.

  12. Are stock prices too volatile to be justified by the dividend discount model?

    NASA Astrophysics Data System (ADS)

    Akdeniz, Levent; Salih, Aslıhan Altay; Ok, Süleyman Tuluğ

    2007-03-01

    This study investigates excess stock price volatility using the variance bound framework of LeRoy and Porter [The present-value relation: tests based on implied variance bounds, Econometrica 49 (1981) 555-574] and of Shiller [Do stock prices move too much to be justified by subsequent changes in dividends? Am. Econ. Rev. 71 (1981) 421-436]. The conditional variance bound relationship is examined using cross-sectional data simulated from the general equilibrium asset pricing model of Brock [Asset prices in a production economy, in: J.J. McCall (Ed.), The Economics of Information and Uncertainty, University of Chicago Press, Chicago (for N.B.E.R.), 1982]. Results show that the conditional variance bounds hold; hence, our hypothesis of the validity of the dividend discount model cannot be rejected. Moreover, in our setting, markets are efficient and stock prices are neither affected by herd psychology nor by the outcome of noise trading by naive investors; thus, we are able to control for market efficiency. Consequently, we show that one cannot infer any conclusions about market efficiency from the unconditional variance bounds tests.
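
    A toy illustration of the Shiller/LeRoy-Porter variance bound on simulated data: for prices generated by the dividend discount model, the variance of the actual price should not exceed the variance of the ex-post rational (perfect-foresight) price. The i.i.d. dividend process, discount rate and terminal condition are illustrative assumptions, not the Brock-model simulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

T, r = 500, 0.05
gamma = 1.0 / (1.0 + r)
dividends = 1.0 + 0.1 * rng.standard_normal(T)     # i.i.d. dividends (illustrative)

# Ex-post rational price: discounted realised dividends, computed backwards
# from a terminal value equal to the unconditional mean price.
p_star = np.empty(T)
p_star[-1] = dividends.mean() / r
for t in range(T - 2, -1, -1):
    p_star[t] = gamma * (dividends[t + 1] + p_star[t + 1])

# Dividend-discount-model price with i.i.d. dividends: a constant expectation.
p_model = np.full(T, dividends.mean() / r)

# The bound Var(p) <= Var(p*) holds here by construction; Shiller's point was
# that observed stock prices appear to violate it.
print(p_model.var(), p_star.var())
```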

  13. [Stock assessment and management for Illex argentinus in Southwest Atlantic Ocean based on Bayesian Schaefer model].

    PubMed

    Lu, Hua-Jie; Chen, Xin-Jun; Li, Gang; Cao, Jie

    2013-07-01

    A Bayesian Schaefer model was applied to assess the stock of Illex argentinus in the Southwest Atlantic Ocean, and the risks of alternative management strategies for the squid were analyzed. Under the normal and uniform prior assumptions, the estimated model parameters and reference points were similar, and higher than the values under the log-normal prior assumption. Under the three proposed scenarios, the fishing mortalities and the total catches in 2001-2010 were lower than the reference point F0.1 and the maximum sustainable yield (MSY), indicating that I. argentinus was exploited at a sustainable level and was neither subject to overfishing nor overfished. The results of the decision analysis indicated that, at the same harvest rate, the stock of I. argentinus in 2025 would be lowest under the log-normal prior assumption, and the probability of collapse would be highest. Under all three scenarios, the harvest rate in 2025 would be 0.6 if the catch were maximized. However, if the harvest rate were set to 0.6, the stock of I. argentinus after 2025 would face a definite risk; thus, a harvest rate of 0.4 and a catch of 550000 t appeared to be the best management regulation, or baseline case.
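
    A deterministic sketch of the Schaefer surplus-production dynamics that sit inside the Bayesian assessment, B[t+1] = B[t] + r·B[t]·(1 - B[t]/K) - C[t], with MSY = rK/4. The parameter values, starting biomass and catch scenario are invented for illustration and are not the estimates for Illex argentinus.

```python
import numpy as np

def schaefer_projection(b0, r, k, catches):
    """Schaefer surplus-production dynamics: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
    b = [b0]
    for c in catches:
        b.append(max(b[-1] + r * b[-1] * (1.0 - b[-1] / k) - c, 1e-6))
    return np.array(b)

r, k = 0.8, 2.0e6                        # growth rate and carrying capacity (t), illustrative
msy = r * k / 4.0                        # Schaefer reference point
catches = np.full(15, 3.5e5)             # constant catch scenario below MSY, illustrative
trajectory = schaefer_projection(b0=1.2e6, r=r, k=k, catches=catches)
print(f"MSY = {msy:,.0f} t, final biomass = {trajectory[-1]:,.0f} t")
```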

  14. A universal approach to estimate biomass and carbon stock in tropical forests using generic allometric models.

    PubMed

    Vieilledent, G; Vaudry, R; Andriamanohisoa, S F D; Rakotonarivo, O S; Randrianasolo, H Z; Razafindrabe, H N; Rakotoarivony, C Bidaud; Ebeling, J; Rasamoelina, M

    2012-03-01

    Allometric equations allow aboveground tree biomass and carbon stock to be estimated from tree size. The allometric scaling theory suggests the existence of a universal power-law relationship between tree biomass and tree diameter with a fixed scaling exponent close to 8/3. In addition, generic empirical models, like Chave's or Brown's models, have been proposed for tropical forests in America and Asia. These generic models have been used to estimate forest biomass and carbon worldwide. However, tree allometry depends on environmental and genetic factors that vary from region to region. Consequently, theoretical models that include too few ecological explicative variables or empirical generic models that have been calibrated at particular sites are unlikely to yield accurate tree biomass estimates at other sites. In this study, we based our analysis on a destructive sample of 481 trees in Madagascar spiny dry and moist forests characterized by a high rate of endemism (> 95%). We show that, among the available generic allometric models, Chave's model including diameter, height, and wood specific gravity as explicative variables for a particular forest type (dry, moist, or wet tropical forest) was the only one that gave accurate tree biomass estimates for Madagascar (R2 > 83%, bias < 6%), with estimates comparable to those obtained with regional allometric models. When biomass allometric models are not available for a given forest site, this result shows that a simple height-diameter allometry is needed to accurately estimate biomass and carbon stock from plot inventories.
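
    For orientation, the sketch below evaluates a generic tropical-forest allometry of the form AGB = a·ρ·D²·H, with the coefficient a = 0.0509 commonly quoted for Chave's moist-forest model including height; treat the coefficient, the carbon fraction and the example tree as assumptions to be checked against the original publications before any operational use.

```python
def moist_forest_agb(dbh_cm: float, height_m: float, wood_density: float) -> float:
    """Generic allometry AGB = a * rho * D^2 * H (AGB in kg, D in cm, H in m,
    rho in g/cm3); a = 0.0509 is the value commonly quoted for Chave's
    moist-forest model with height, used here only as an illustration."""
    return 0.0509 * wood_density * dbh_cm ** 2 * height_m

agb_kg = moist_forest_agb(dbh_cm=30.0, height_m=22.0, wood_density=0.6)
carbon_kg = 0.47 * agb_kg    # a commonly used biomass-to-carbon conversion factor
print(round(agb_kg, 1), round(carbon_kg, 1))
```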

  15. Modeling pollutant penetration across building envelopes

    SciTech Connect

    Liu, De-Ling; Nazaroff, William W.

    2001-04-01

    As air infiltrates through unintentional openings in building envelopes, pollutants may interact with adjacent surfaces. Such interactions can alter human exposure to air pollutants of outdoor origin. We present modeling explorations of the proportion of particles and reactive gases (e.g., ozone) that penetrate building envelopes as air enters through cracks and wall cavities. Calculations were performed for idealized rectangular cracks, assuming regular geometry, smooth inner crack surface and steady airflow. Particles of 0.1-1.0 µm diameter are predicted to have the highest penetration efficiency, nearly unity for crack heights of 0.25 mm or larger, assuming a pressure difference of 4 Pa or greater and a flow path length of 3 cm or less. Supermicron and ultrafine particles are significantly removed by means of gravitational settling and Brownian diffusion, respectively. In addition to crack geometry, ozone penetration depends on its reactivity with crack surfaces, as parameterized by the reaction probability. For reaction probabilities less than ~10^-5, penetration is complete for crack heights greater than 1 mm. However, penetration through mm-scale cracks is small if the reaction probability is ~10^-4 or greater. For wall cavities, fiberglass insulation is an efficient particle filter, but particles would penetrate efficiently through uninsulated wall cavities or through insulated cavities with significant airflow bypass. The ozone reaction probability on fiberglass fibers was measured to be 10^-7 for fibers previously exposed to high ozone levels and 6 x 10^-6 for unexposed fibers. Over this range, ozone penetration through fiberglass insulation would vary from >90% to ~10-40%. Thus, under many conditions penetration is high; however, there are realistic circumstances in which building envelopes can provide substantial pollutant removal. Not enough is yet known about the detailed nature of pollutant penetration

  16. Critical comparison of several order-book models for stock-market fluctuations

    NASA Astrophysics Data System (ADS)

    Slanina, F.

    2008-01-01

    Far-from-equilibrium models of interacting particles in one dimension are used as a basis for modelling stock-market fluctuations. Particle types and their positions are interpreted as buy and sell orders placed on a price axis in the order book. We revisit some modifications of well-known models, starting with the Bak-Paczuski-Shubik model. We look at the four-decades-old Stigler model and investigate its variants. One of them is the simplified version of the Genoa artificial market. The list of studied models is completed by the models of Maslov and of Daniels et al. Generically, in all cases we compare the return distribution, the absolute return autocorrelation and the value of the Hurst exponent. It turns out that none of the models reproduces all the empirical data satisfactorily, but the most promising candidates for further development are the Genoa artificial market and the Maslov model with moderate order evaporation.

  17. Iterative build OMIT maps: Map improvement by iterative model-building and refinement without model bias

    SciTech Connect

    Los Alamos National Laboratory, Mailstop M888, Los Alamos, NM 87545, USA; Lawrence Berkeley National Laboratory, One Cyclotron Road, Building 64R0121, Berkeley, CA 94720, USA; Department of Haematology, University of Cambridge, Cambridge CB2 0XY, England; Terwilliger, Thomas; Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D.

    2008-02-12

    A procedure for carrying out iterative model-building, density modification and refinement is presented in which the density in an OMIT region is essentially unbiased by an atomic model. Density from a set of overlapping OMIT regions can be combined to create a composite 'Iterative-Build' OMIT map that is everywhere unbiased by an atomic model but also everywhere benefiting from the model-based information present elsewhere in the unit cell. The procedure may have applications in the validation of specific features in atomic models as well as in overall model validation. The procedure is demonstrated with a molecular replacement structure and with an experimentally-phased structure, and a variation on the method is demonstrated by removing model bias from a structure from the Protein Data Bank.

  18. Directions for model building from asymptotic safety

    NASA Astrophysics Data System (ADS)

    Bond, Andrew D.; Hiller, Gudrun; Kowalska, Kamila; Litim, Daniel F.

    2017-08-01

    Building on recent advances in the understanding of gauge-Yukawa theories we explore possibilities to UV-complete the Standard Model in an asymptotically safe manner. Minimal extensions are based on a large flavor sector of additional fermions coupled to a scalar singlet matrix field. We find that asymptotic safety requires fermions in higher representations of SU(3)_C × SU(2)_L. Possible signatures at colliders are worked out and include R-hadron searches, diboson signatures and the evolution of the strong and weak coupling constants.

  19. A focus on building information modelling.

    PubMed

    Ryan, Alison

    2014-03-01

    The Government Construction Strategy requires a strengthening of the public sector's capability to implement Building Information Modelling (BIM) protocols, the goal being that all central government departments will adopt, as a minimum, collaborative Level 2 BIM by 2016. Against this background, Alison Ryan, of consulting engineers DSSR, explains the principles behind BIM, its history and evolution, and some of the considerable benefits it can offer. These include lowering capital project costs through enhanced co-ordination, cutting carbon emissions, and the ability to manage facilities more efficiently.

  20. The role of building models in the evaluation of heat-related risks

    NASA Astrophysics Data System (ADS)

    Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix

    2016-04-01

    Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the developed risk concept can still be based on outdoor conditions while also including exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models of differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard explains the variability in the risk data better than outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.

  1. Application of a Delay-difference model for the stock assessment of southern Atlantic albacore ( Thunnus alalunga)

    NASA Astrophysics Data System (ADS)

    Zhang, Kui; Liu, Qun; Kalhoro, Muhsan Ali

    2015-06-01

    Delay-difference models are intermediate between simple surplus-production models and complicated age-structured models. Such intermediate models are more efficient and require less data than age-structured models. In this study, a delay-difference model was applied to fit catch and catch per unit effort (CPUE) data (1975-2011) of the southern Atlantic albacore (Thunnus alalunga) stock. The proposed delay-difference model captures annual fluctuations in the predicted CPUE data better than the Fox model. In a Monte Carlo simulation, white noise (CVs) was superimposed on the observed CPUE data at four levels. The relative estimation error was then calculated to compare the estimated results with the true values of parameters α and β in the Ricker stock-recruitment model and the catchability coefficient q. α is more sensitive to the CV than β and q. We also calculated an 80% percentile confidence interval of the maximum sustainable yield (MSY, 21756 t to 23408 t; median 22490 t) with the delay-difference model. The yield of the southern Atlantic albacore stock in 2011 was 24122 t, and the estimated ratios of catch to MSY for the past seven years were approximately 1.0. We suggest that care should be taken to protect the albacore fishery in the southern Atlantic Ocean. The proposed delay-difference model provides a good fit to the data of the southern Atlantic albacore stock and may be a useful choice for the assessment of regional albacore stocks.
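
    As background for the parameters mentioned above, a minimal sketch of the standard Ricker stock-recruitment curve and the catchability relation is given below; the symbols (spawning stock S, recruitment R, biomass B) and function names are illustrative and not taken from the paper, which embeds these relations in a fuller delay-difference formulation.

    ```python
    import numpy as np

    def ricker_recruitment(spawning_stock, alpha, beta):
        """Standard Ricker stock-recruitment curve: R = alpha * S * exp(-beta * S)."""
        return alpha * spawning_stock * np.exp(-beta * spawning_stock)

    def predicted_cpue(biomass, q):
        """CPUE assumed proportional to stock biomass via the catchability coefficient q."""
        return q * biomass

    # Usage with placeholder parameter values
    print(ricker_recruitment(spawning_stock=50_000.0, alpha=1.2, beta=2e-5))
    ```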

  2. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  3. The scientific modeling assistant: An advanced software tool for scientific model building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  4. Interactive model building for Q-learning.

    PubMed

    Laber, Eric B; Linn, Kristin A; Stefanski, Leonard A

    2014-10-20

    Evidence-based rules for optimal treatment allocation are key components in the quest for efficient, effective health care delivery. Q-learning, an approximate dynamic programming algorithm, is a popular method for estimating optimal sequential decision rules from data. Q-learning requires the modeling of nonsmooth, nonmonotone transformations of the data, complicating the search for adequately expressive, yet parsimonious, statistical models. The default Q-learning working model is multiple linear regression, which is not only provably misspecified under most data-generating models, but also results in nonregular regression estimators, complicating inference. We propose an alternative strategy for estimating optimal sequential decision rules for which the requisite statistical modeling does not depend on nonsmooth, nonmonotone transformed data, does not result in nonregular regression estimators, is consistent under a broader array of data-generation models than Q-learning, results in estimated sequential decision rules that have better sampling properties, and is amenable to established statistical approaches for exploratory data analysis, model building, and validation. We derive the new method, IQ-learning, via an interchange in the order of certain steps in Q-learning. In simulated experiments IQ-learning improves on Q-learning in terms of integrated mean squared error and power. The method is illustrated using data from a study of major depressive disorder.

  5. Incorporating covariates into fisheries stock assessment models with application to Pacific herring.

    PubMed

    Deriso, Richard B; Maunder, Mark N; Pearson, Walter H

    2008-07-01

    We present a framework for evaluating the cause of fishery declines by integrating covariates into a fisheries stock assessment model. This allows the evaluation of fisheries' effects vs. natural and other human impacts. The analyses presented are based on integrating ecological science and statistics and form the basis for environmental decision-making advice. Hypothesis tests are described to rank hypotheses and determine the size of a multiple covariate model. We extend recent developments in integrated analysis and use novel methods to produce effect size estimates that are relevant to policy makers and include estimates of uncertainty. Results can be directly applied to evaluate trade-offs among alternative management decisions. The methods and results are also broadly applicable outside fisheries stock assessment. We show that multiple factors influence populations and that analysis of factors in isolation can be misleading. We illustrate the framework by applying it to Pacific herring of Prince William Sound, Alaska (USA). The Pacific herring stock that spawns in Prince William Sound is a stock that has collapsed, but there are several competing or alternative hypotheses to account for the initial collapse and subsequent lack of recovery. Factors failing the initial screening tests for statistical significance included indicators of the 1989 Exxon Valdez oil spill, coho salmon predation, sea lion predation, Pacific Decadal Oscillation, Northern Oscillation Index, and effects of containment in the herring egg-on-kelp pound fishery. The overall results indicate that the most statistically significant factors related to the lack of recovery of the herring stock involve competition or predation by juvenile hatchery pink salmon on herring juveniles. Secondary factors identified in the analysis were poor nutrition in the winter, ocean (Gulf of Alaska) temperature in the winter, the viral hemorrhagic septicemia virus, and the pathogen Ichthyophonus hoferi. The

  6. Cattle stocking rates estimated in temperate intensive grasslands with a spring growth model derived from MODIS NDVI time-series

    NASA Astrophysics Data System (ADS)

    Green, Stuart; Cawkwell, Fiona; Dwyer, Edward

    2016-10-01

    There is an identified need for high-resolution animal stocking rate data in temperate grassland systems. Here we present a 250 m scale characterization of early spring vegetation growth (DOY 32-DOY 120) from 2003 to 2012, based on MODIS NDVI products for this period for Ireland. The average rate of grass growth is determined locally as a simple linear model for each pixel, using only the highest quality data for the period. These decadal spring growth model coefficients, start-of-season cover and growth rate, are regressed against the log of stocking rate (r2 = 0.75). This modelled stocking rate is used to map grassland use intensity in Ireland, which, when tested against an independent set of stocking rate data, is shown to be successful with an RMSE of 0.13 for a range of stocking densities from 0.1 to 3.0 LSU/ha. This model provides the first validated high-resolution approach to mapping stocking rates in intensively managed European grassland systems.

  7. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  8. Photograph of model projected new hospital building and new landscaping ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photograph of model projected new hospital building and new landscaping for area north of building 500. Model displayed on the mezzanine level of building 500. - Fitzsimons General Hospital, Bounded by East Colfax to south, Peoria Street to west, Denver City/County & Adams County Line to north, & U.S. Route 255 to east, Aurora, Adams County, CO

  9. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  10. Generalized Weierstrass-Mandelbrot Function Model for Actual Stocks Markets Indexes with Nonlinear Characteristics

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Yu, C.; Sun, J. Q.

    2015-03-01

    It is difficult to simulate the dynamical behavior of actual financial market indexes effectively, especially when they have nonlinear characteristics, so it is worthwhile to propose a mathematical model with these characteristics. In this paper, we first investigate a generalized Weierstrass-Mandelbrot function (WMF) model with two nonlinear characteristics: a fractal dimension D with 1.5 < D < 2 and a Hurst exponent H with 0.5 < H < 1. We then study the dynamical behavior of H for the WMF as D and the spectrum of the time series γ change in three-dimensional space. Because the WMF and actual stock market indexes share two features, fractal behavior (characterized by the fractal dimension) and a long-memory effect (characterized by the Hurst exponent), we study the relationship between the WMF and actual stock market indexes. We choose a random value of γ and a fixed value of D for the WMF to simulate the S&P 500 index over different time ranges. As the simulation results in three-dimensional space show, γ is important in the WMF model and different γ may have the same effect on the nonlinearity of the WMF. We then calculate the skewness and kurtosis of the actual daily S&P 500 index over different time ranges, which can be used to choose the value of γ. Based on these results, we choose appropriate values of γ and D and an initial value for the WMF to simulate daily S&P 500 indexes. Using the fit-line method in two-dimensional space for the simulated values, we find that the generalized WMF model is effective for simulating different actual stock market indexes over different time ranges. It may be useful for understanding the dynamical behavior of many different financial markets.
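
    A minimal numerical sketch of one common real-valued form of the Weierstrass-Mandelbrot function is shown below, truncated to a finite number of terms. The truncation limits and the absence of random phases are simplifying assumptions; the generalized form studied in the paper may differ.

    ```python
    import numpy as np

    def weierstrass_mandelbrot(t, D=1.7, gamma=1.5, n_min=-30, n_max=30):
        """Truncated real-valued Weierstrass-Mandelbrot function:
        W(t) = sum_n (1 - cos(gamma**n * t)) / gamma**((2 - D) * n),
        with fractal dimension 1 < D < 2 and frequency ratio gamma > 1."""
        t = np.atleast_1d(np.asarray(t, dtype=float))
        n = np.arange(n_min, n_max + 1)
        terms = (1.0 - np.cos(np.outer(t, gamma ** n))) / gamma ** ((2.0 - D) * n)
        return terms.sum(axis=1)

    # Usage: a rough fractal path sampled on [0, 1]
    t = np.linspace(0.0, 1.0, 1000)
    w = weierstrass_mandelbrot(t, D=1.7, gamma=1.5)
    ```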

  11. Near-Source Modeling Updates: Building Downwash & Near-Road

    EPA Science Inventory

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  12. Near-Source Modeling Updates: Building Downwash & Near-Road

    EPA Science Inventory

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  13. Modelling carbon stocks and fluxes in the wood product sector: a comparative review.

    PubMed

    Brunet-Navarro, Pau; Jochheim, Hubert; Muys, Bart

    2016-07-01

    In addition to forest ecosystems, wood products are carbon pools that can be strategically managed to mitigate climate change. Wood product models (WPMs) simulating the carbon balance of wood production, use and end of life can complement forest growth models to evaluate the mitigation potential of the forest sector as a whole. WPMs can be used to compare scenarios of product use and explore mitigation strategies. A considerable number of WPMs have been developed in the last three decades, but there is no review available analysing their functionality and performance. This study analyses and compares 41 WPMs. One surprising initial result was that we discovered the erroneous implementation of a few concepts and assumptions in some of the models. We further described and compared the models using six model characteristics (bucking allocation, industrial processes, carbon pools, product removal, recycling and substitution effects) and three model-use characteristics (system boundaries, model initialization and evaluation of results). Using a set of indicators based on the model characteristics, we classified models using a hierarchical clustering technique and differentiated them according to their increasing degrees of complexity and varying levels of user support. For purposes of simulating carbon stock in wood products, models with a simple structure may be sufficient, but to compare climate change mitigation options, complex models are needed. The number of models has increased substantially over the last ten years, introducing more diversity and accuracy. Calculation of substitution effects and recycling has also become more prominent. However, the lack of data is still an important constraint for a more realistic estimation of carbon stocks and fluxes. Therefore, if the sector wants to demonstrate the environmental quality of its products, it should make it a priority to provide reliable life cycle inventory data, particularly regarding aspects of time and

  14. Building Markov state models with solvent dynamics.

    PubMed

    Gu, Chen; Chang, Huang-Wei; Maibaum, Lutz; Pande, Vijay S; Carlsson, Gunnar E; Guibas, Leonidas J

    2013-01-01

    Markov state models have been widely used to study conformational changes of biological macromolecules. These models are built from short-timescale simulations and then propagated to extract long-timescale dynamics. However, the solvent information in molecular simulations is often ignored in current methods, because of the large number of solvent molecules in a system and the indistinguishability of solvent molecules upon their exchange. We present a solvent signature that compactly summarizes the solvent distribution in the high-dimensional data, and then define a distance metric between different configurations using this signature. We next incorporate the solvent information into the construction of Markov state models and present a fast geometric clustering algorithm which combines both the solute-based and solvent-based distances. We have tested our method on several different molecular dynamical systems, including alanine dipeptide, carbon nanotube, and benzene rings. With the new solvent-based signatures, we are able to identify different solvent distributions near the solute. Furthermore, when the solute has a concave shape, we can also capture the number of water molecules inside the solute structure. Finally, we have compared the performances of different Markov state models. The experimental results show that our approach improves on the existing methods in both computational running time and metastability. In this paper we have initiated a study to build Markov state models for molecular dynamical systems with solvent degrees of freedom. The methods we described should also be broadly applicable to a wide range of biomolecular simulation analyses.

  15. Building Markov state models with solvent dynamics

    PubMed Central

    2013-01-01

    Background Markov state models have been widely used to study conformational changes of biological macromolecules. These models are built from short-timescale simulations and then propagated to extract long-timescale dynamics. However, the solvent information in molecular simulations is often ignored in current methods, because of the large number of solvent molecules in a system and the indistinguishability of solvent molecules upon their exchange. Methods We present a solvent signature that compactly summarizes the solvent distribution in the high-dimensional data, and then define a distance metric between different configurations using this signature. We next incorporate the solvent information into the construction of Markov state models and present a fast geometric clustering algorithm which combines both the solute-based and solvent-based distances. Results We have tested our method on several different molecular dynamical systems, including alanine dipeptide, carbon nanotube, and benzene rings. With the new solvent-based signatures, we are able to identify different solvent distributions near the solute. Furthermore, when the solute has a concave shape, we can also capture the number of water molecules inside the solute structure. Finally, we have compared the performances of different Markov state models. The experimental results show that our approach improves on the existing methods in both computational running time and metastability. Conclusions In this paper we have initiated a study to build Markov state models for molecular dynamical systems with solvent degrees of freedom. The methods we described should also be broadly applicable to a wide range of biomolecular simulation analyses. PMID:23368418

  16. LFRic: Building a new Unified Model

    NASA Astrophysics Data System (ADS)

    Melvin, Thomas; Mullerworth, Steve; Ford, Rupert; Maynard, Chris; Hobson, Mike

    2017-04-01

    The LFRic project, named for Lewis Fry Richardson, aims to develop a replacement for the Met Office Unified Model in order to meet the challenges which will be presented by the next generation of exascale supercomputers. This project, a collaboration between the Met Office, STFC Daresbury and the University of Manchester, builds on the earlier GungHo project to redesign the dynamical core, in partnership with NERC. The new atmospheric model aims to retain the performance of the current ENDGame dynamical core and associated subgrid physics, while also enabling far greater scalability and flexibility to accommodate future supercomputer architectures. Design of the model revolves around the principle of a 'separation of concerns', whereby the natural science aspects of the code can be developed without worrying about the underlying architecture, while machine-dependent optimisations can be carried out at a high level. These principles are put into practice through the development of an autogenerated Parallel Systems software layer (known as the PSy layer) using a domain-specific compiler called PSyclone. The prototype model includes a re-write of the dynamical core using a mixed finite element method, in which different function spaces are used to represent the various fields. It is able to run in parallel with MPI and OpenMP and has been tested on over 200,000 cores. In this talk an overview of both the natural science and computational science implementations of the model will be presented.

  17. Virtual Solar System Project: Building Understanding through Model Building.

    ERIC Educational Resources Information Center

    Barab, Sasha A.; Hay, Kenneth E.; Barnett, Michael; Keating, Thomas

    2000-01-01

    Describes an introductory astronomy course for undergraduate students in which students use three-dimensional (3-D) modeling tools to model the solar system and develop rich understandings of astronomical phenomena. Indicates that 3-D modeling can be used effectively in regular undergraduate university courses as a tool to develop understandings…

  18. Virtual Solar System Project: Building Understanding through Model Building.

    ERIC Educational Resources Information Center

    Barab, Sasha A.; Hay, Kenneth E.; Barnett, Michael; Keating, Thomas

    2000-01-01

    Describes an introductory astronomy course for undergraduate students in which students use three-dimensional (3-D) modeling tools to model the solar system and develop rich understandings of astronomical phenomena. Indicates that 3-D modeling can be used effectively in regular undergraduate university courses as a tool to develop understandings…

  19. Allometric models and aboveground biomass stocks of a West African Sudan Savannah watershed in Benin.

    PubMed

    Chabi, Adéyèmi; Lautenbach, Sven; Orekan, Vincent Oladokoun Agnila; Kyei-Baffour, Nicholas

    2016-12-01

    The estimation of forest biomass changes due to land-use change is of significant importance for estimates of the global carbon budget. The accuracy of biomass density maps depends on the availability of reliable allometric models used in combination with data derived from satellite images and forest inventory data. To reduce the uncertainty in estimates of carbon emissions resulting from deforestation and forest degradation, better information on allometric equations and the spatial distribution of aboveground biomass stocks in each land use/land cover (LULC) class is needed for the different ecological zones. Such information has been sparse for the West African Sudan Savannah zone. This paper provides new data and results for this important zone. The analysis combines satellite images and locally derived allometric models based on non-destructive measurements to estimate aboveground biomass stocks at the watershed level in the Sudan Savannah zone in Benin. We compared three types of empirically fitted allometric models of varying complexity with respect to the number of input parameters that are easy to measure on the ground: model type I, based only on the diameter at breast height (DBH); type II, which used DBH and tree height; and model type III, which used DBH, tree height and wood density as predictors. While model III outperformed the other models for most LULC classes, even the simple model I showed good performance. The estimated mean dry biomass density values and associated standard errors for the different LULC classes were 3.28 ± 0.31 (for cropland and fallow), 3.62 ± 0.36 (for Savanna grassland), 4.86 ± 1.03 (for Settlements), 14.05 ± 0.72 (for Shrub savanna), 45.29 ± 2.51 (for Savanna Woodland), 46.06 ± 14.40 (for Agroforestry), 94.58 ± 4.98 (for riparian forest and woodland), 162 ± 64.88 (for Tectona grandis plantations), 179.62 ± 57.61 (for Azadirachta indica plantations), 25.17 ± 7.46 (for Gmelina arborea plantations
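
    Allometric biomass models of the three complexity levels described above typically take power-law forms along the following lines; the functional forms shown are common textbook choices and the coefficients a, b are hypothetical placeholders, not the equations or values fitted in the study.

    ```python
    # Illustrative power-law forms for the three allometric model types described above.
    def agb_type1(dbh, a, b):
        """Type I: aboveground biomass from diameter at breast height (DBH) only."""
        return a * dbh ** b

    def agb_type2(dbh, height, a, b):
        """Type II: DBH and tree height."""
        return a * (dbh ** 2 * height) ** b

    def agb_type3(dbh, height, wood_density, a, b):
        """Type III: DBH, tree height and wood density."""
        return a * (wood_density * dbh ** 2 * height) ** b
    ```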

  20. Simulating Soil C Stock with the Process-based Model CQESTR

    NASA Astrophysics Data System (ADS)

    Gollany, H.; Liang, Y.; Rickman, R.; Albrecht, S.; Follett, R.; Wilhelm, W.; Novak, J.; Douglas, C.

    2009-04-01

    The prospect of storing carbon (C) in soil, as soil organic matter (SOM), provides an opportunity for agriculture to contribute to the reduction of carbon dioxide in the atmosphere while enhancing soil properties. Soil C models are useful for examining the complex interactions between crop, soil management practices and climate and their effects on long-term carbon storage or loss. The process-based carbon model CQESTR, pronounced ‘sequester,' was developed by USDA-ARS scientists at the Columbia Plateau Conservation Research Center, Pendleton, Oregon, USA. It computes the rate of biological decomposition of crop residues or organic amendments as they convert to SOM. CQESTR uses readily available field-scale data to assess long-term effects of cropping systems or crop residue removal on SOM accretion/loss in agricultural soil. Data inputs include weather, above- ground and below-ground biomass additions, N content of residues and amendments, soil properties, and management factors such as tillage and crop rotation. The model was calibrated using information from six long-term experiments across North America (Florence, SC, 19 yrs; Lincoln, NE, 26 yrs; Hoytville, OH, 31 yrs; Breton, AB, 60 yrs; Pendleton, OR, 76 yrs; and Columbia, MO, >100 yrs) having a range of soil properties and climate. CQESTR was validated using data from several additional long-term experiments (8 - 106 yrs) across North America having a range of SOM (7.3 - 57.9 g SOM/kg). Regression analysis of 306 pairs of predicted and measured SOM data under diverse climate, soil texture and drainage classes, and agronomic practices at 13 agricultural sites resulted in a linear relationship with an r2 of 0.95 (P < 0.0001) and a 95% confidence interval of 4.3 g SOM/kg. Estimated SOC values from CQESTR and IPCC (the Intergovernmental Panel on Climate Change) were compared to observed values in three relatively long-term experiments (20 - 24 years). At one site, CQESTR and IPCC estimates of SOC stocks were

  1. Demonstration of reduced-order urban scale building energy models

    DOE PAGES

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...

    2017-09-08

    The aim of this study is to demonstrate a developed framework to rapidly create urban-scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy models systematically in order to report the model complexity alongside the building energy model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified exchange of data. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.

  2. Modeling Distributed Electricity Generation in the NEMS Buildings Models

    EIA Publications

    2011-01-01

    This paper presents the modeling methodology, projected market penetration, and impact of distributed generation with respect to offsetting future electricity needs and carbon dioxide emissions in the residential and commercial buildings sector in the Annual Energy Outlook 2000 (AEO2000) reference case.

  3. A national scale estimation of soil carbon stocks of Pinus densiflora forests in Korea: a modelling approach

    NASA Astrophysics Data System (ADS)

    Yi, K.; Park, C.; Ryu, S.; Lee, K.; Yi, M.; Kim, C.; Park, G.; Kim, R.; Son, Y.

    2011-12-01

    Soil carbon (C) stocks of Pinus densiflora forests in Korea were estimated using a generic forest soil C dynamics model based on the process of dead organic matter input and decomposition. Annual input of dead organic matter to the soil was determined by stand biomass and turnover rates of tree components (stem, branch, twig, foliage, coarse root, and fine root). The model was designed to have a simplified structure consisting of three dead organic matter C (DOC) pools (aboveground woody debris (AWD), belowground woody debris (BWD), and litter (LTR)) and one soil organic C (SOC) pool. C flows in the model were regulated by six turnover rates, for stem, branch, twig, foliage, coarse root, and fine root, and four decay rates, for AWD, BWD, LTR, and SOC. To simulate the soil C stocks of P. densiflora forests, statistical data on forest land area (1,339,791 ha) and growing stock (191,896,089 m3), sorted by region (nine provinces and seven metropolitan cities) and stand age class (11 to 20- (II), 21 to 30- (III), 31 to 40- (IV), 41 to 50- (V), and 51 to 60-year-old (VI)), were used. The growing stock of each stand age class was calculated for every region and a representative site index was also determined by consulting the yield table. Other model parameters related to stand biomass, annual input of dead organic matter and decomposition were estimated from previous studies conducted on P. densiflora forests in Korea, which were also applied for model validation. As a result of the simulation, the total soil C stock of P. densiflora forests was estimated as 53.9 MtC and soil C stocks per unit area ranged from 28.71 to 47.81 tC ha-1 within a soil depth of 30 cm. Also, soil C stocks in the P. densiflora forests of age classes II, III, IV, V, and VI were 16,780,818, 21,450,812, 12,677,872, 2,366,939, and 578,623 tC, respectively, and highly related to the distribution of age classes. Soil C stocks per unit area initially decreased with stand age class and started to increase
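
    The pool structure described above can be illustrated with a minimal annual book-keeping sketch in which each dead-organic-matter pool receives litter input and loses carbon by first-order decay, part of which is transferred to the soil organic C pool. The humification fraction and parameter names are illustrative assumptions, not the calibrated values of the model.

    ```python
    def step_soil_carbon(pools, inputs, decay, humification=0.2):
        """Advance AWD, BWD, LTR and SOC carbon pools by one year.

        pools        -- dict of current C stocks, e.g. {"AWD": ..., "BWD": ..., "LTR": ..., "SOC": ...}
        inputs       -- dict of annual dead-organic-matter inputs to AWD, BWD and LTR
        decay        -- dict of first-order decay rates (1/yr) for all four pools
        humification -- illustrative fraction of decayed DOC transferred to SOC (rest respired)
        """
        new = dict(pools)
        transfer_to_soc = 0.0
        for pool in ("AWD", "BWD", "LTR"):
            decayed = decay[pool] * pools[pool]
            new[pool] = pools[pool] + inputs.get(pool, 0.0) - decayed
            transfer_to_soc += humification * decayed
        new["SOC"] = pools["SOC"] + transfer_to_soc - decay["SOC"] * pools["SOC"]
        return new
    ```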

  4. Encoding Dissimilarity Data for Statistical Model Building

    PubMed Central

    Wahba, Grace

    2010-01-01

    We summarize, review and comment upon three papers which discuss the use of discrete, noisy, incomplete, scattered pairwise dissimilarity data in statistical model building. Convex cone optimization codes are used to embed the objects into a Euclidean space which respects the dissimilarity information while controlling the dimension of the space. A “newbie” algorithm is provided for embedding new objects into this space. This allows the dissimilarity information to be incorporated into a Smoothing Spline ANOVA penalized likelihood model, a Support Vector Machine, or any model that will admit Reproducing Kernel Hilbert Space components, for nonparametric regression, supervised learning, or semi-supervised learning. Future work and open questions are discussed. The papers are: F. Lu, S. Keles, S. Wright and G. Wahba 2005. A framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences 102, 12332–1233. G. Corrada Bravo, G. Wahba, K. Lee, B. Klein, R. Klein and S. Iyengar 2009. Examining the relative influence of familial, genetic and environmental covariate information in flexible risk models. Proceedings of the National Academy of Sciences 106, 8128–8133. F. Lu, Y. Lin and G. Wahba. Robust manifold unfolding with kernel regularization. TR 1008, Department of Statistics, University of Wisconsin-Madison. PMID:20814436

  5. Encoding Dissimilarity Data for Statistical Model Building.

    PubMed

    Wahba, Grace

    2010-12-01

    We summarize, review and comment upon three papers which discuss the use of discrete, noisy, incomplete, scattered pairwise dissimilarity data in statistical model building. Convex cone optimization codes are used to embed the objects into a Euclidean space which respects the dissimilarity information while controlling the dimension of the space. A "newbie" algorithm is provided for embedding new objects into this space. This allows the dissimilarity information to be incorporated into a Smoothing Spline ANOVA penalized likelihood model, a Support Vector Machine, or any model that will admit Reproducing Kernel Hilbert Space components, for nonparametric regression, supervised learning, or semi-supervised learning. Future work and open questions are discussed. The papers are: F. Lu, S. Keles, S. Wright and G. Wahba 2005. A framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences 102, 12332-1233. G. Corrada Bravo, G. Wahba, K. Lee, B. Klein, R. Klein and S. Iyengar 2009. Examining the relative influence of familial, genetic and environmental covariate information in flexible risk models. Proceedings of the National Academy of Sciences 106, 8128-8133. F. Lu, Y. Lin and G. Wahba. Robust manifold unfolding with kernel regularization. TR 1008, Department of Statistics, University of Wisconsin-Madison.

  6. Model requirements for estimating and reporting soil C stock changes in national greenhouse gas inventories

    NASA Astrophysics Data System (ADS)

    Didion, Markus; Blujdea, Viorel; Grassi, Giacomo; Hernández, Laura; Jandl, Robert; Kriiska, Kaie; Lehtonen, Aleksi; Saint-André, Laurent

    2016-04-01

    Globally, soils are the largest terrestrial store of carbon (C) and small changes may contribute significantly to the global C balance. Due to the potential implications for climate change, accurate and consistent estimates of C fluxes at the large scale are important, as recognized, for example, in international agreements such as the United Nations Framework Convention on Climate Change (UNFCCC). Under the UNFCCC and also under the Kyoto Protocol it is required to report C balances annually. Most measurement-based soil inventories are currently not able to detect annual changes in soil C stocks consistently across space and representatively at national scales. The use of models to obtain relevant estimates is considered an appropriate alternative under the UNFCCC and the Kyoto Protocol. Several soil carbon models have been developed, but few are suitable for consistent application across larger scales. Consistency is often limited by the lack of input data for models, which can result in biased estimates, and thus the reporting criterion of accuracy (i.e., emission and removal estimates are systematically neither over nor under true emissions or removals) may not be met. Based on a qualitative assessment of the ability to meet the criteria established for GHG reporting under the UNFCCC, including accuracy, consistency, comparability, completeness, and transparency, we identified the suitability of commonly used simulation models for estimating annual C stock changes in mineral soil in European forests. Among the six simulation models discussed, we found a clear trend toward models providing quantitatively precise site-specific estimates, which may lead to biased estimates across space. To meet reporting needs for national GHG inventories, we conclude that there is a need for models producing qualitatively realistic results in a transparent and comparable manner. Based on the application of one model along a gradient from Boreal forests in Finland to Mediterranean forests

  7. Spatiotemporal modeling of soil organic carbon stocks across a subtropical region.

    PubMed

    Ross, Christopher Wade; Grunwald, Sabine; Myers, David Brenton

    2013-09-01

    Given the significance and complex nature of soil organic carbon in the context of the global carbon cycle, the need exists for more accurate and economically feasible means of soil organic carbon analysis and of characterizing its underlying spatial variation at the regional scale. The overarching goal of this study was to assess both the spatial and temporal variability of soil organic carbon within a subtropical region of Florida, USA. Specifically, the objectives were to: i) quantify regional soil organic carbon stocks for historical and current conditions and ii) determine whether the soils have acted as a net sink or a net source for atmospheric carbon dioxide over an approximately 40-year period. To achieve these objectives, geostatistical interpolation models were used in conjunction with "historical" and "current" datasets to predict soil organic carbon stocks for the upper 20 cm soil profile of the study area. Soil organic carbon estimates derived from the models ranged from 102 to 108 Tg for historical conditions and 211 to 320 Tg for current conditions, indicating that soils in the study area have acted as a net sink for atmospheric carbon over the last 40 years. A paired resampling of historical sites supported the geostatistical estimates, and resulted in an average increase of 0.8 g carbon m-2 yr-1 across all collocated samples. Accurately assessing the spatial and temporal state of soil organic carbon at the regional scale is critical to further our understanding of global carbon stocks and to provide a baseline so that the effects of sustainable land use policy can be evaluated.

  8. Global socioeconomic material stocks rise 23-fold over the 20th century and require half of annual resource use.

    PubMed

    Krausmann, Fridolin; Wiedenhofer, Dominik; Lauk, Christian; Haas, Willi; Tanikawa, Hiroki; Fishman, Tomer; Miatto, Alessio; Schandl, Heinz; Haberl, Helmut

    2017-02-21

    Human-made material stocks accumulating in buildings, infrastructure, and machinery play a crucial but underappreciated role in shaping the use of material and energy resources. Building, maintaining, and in particular operating in-use stocks of materials require raw materials and energy. Material stocks create long-term path-dependencies because of their longevity. Fostering a transition toward environmentally sustainable patterns of resource use requires a more complete understanding of stock-flow relations. Here we show that about half of all materials extracted globally by humans each year are used to build up or renew in-use stocks of materials. Based on a dynamic stock-flow model, we analyze stocks, inflows, and outflows of all materials and their relation to economic growth, energy use, and CO2 emissions from 1900 to 2010. Over this period, global material stocks increased 23-fold, reaching 792 Pg (±5%) in 2010. Despite efforts to improve recycling rates, continuous stock growth precludes closing material loops; recycling still only contributes 12% of inflows to stocks. Stocks are likely to continue to grow, driven by large infrastructure and building requirements in emerging economies. A convergence of material stocks at the level of industrial countries would lead to a fourfold increase in global stocks, and CO2 emissions exceeding climate change goals. Reducing expected future increases of material and energy demand and greenhouse gas emissions will require decoupling of services from the stocks and flows of materials through, for example, more intensive utilization of existing stocks, longer service lifetimes, and more efficient design.
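
    The dynamic stock-flow logic underlying such analyses can be sketched in a few lines: inflows add to the in-use stock, and outflows are the delayed release of past inflows according to an assumed lifetime distribution. The constant annual discard probability and the growth rate of inflows used below are purely illustrative choices, not parameters from the study.

    ```python
    import numpy as np

    def stock_flow(inflows, annual_discard_rate=0.02):
        """Cohort-free stock-flow bookkeeping with a constant annual discard probability."""
        years = len(inflows)
        stock = np.zeros(years)
        outflow = np.zeros(years)
        surviving = 0.0
        for t in range(years):
            surviving += inflows[t]
            outflow[t] = annual_discard_rate * surviving   # materials leaving the in-use stock
            surviving -= outflow[t]
            stock[t] = surviving
        return stock, outflow

    # Usage: exponentially growing material inflows over 111 years (1900-2010)
    inflows = 1.0 * 1.03 ** np.arange(111)
    stock, outflow = stock_flow(inflows)
    ```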

  9. Global socioeconomic material stocks rise 23-fold over the 20th century and require half of annual resource use

    PubMed Central

    Wiedenhofer, Dominik; Lauk, Christian; Haas, Willi; Tanikawa, Hiroki; Miatto, Alessio; Haberl, Helmut

    2017-01-01

    Human-made material stocks accumulating in buildings, infrastructure, and machinery play a crucial but underappreciated role in shaping the use of material and energy resources. Building, maintaining, and in particular operating in-use stocks of materials require raw materials and energy. Material stocks create long-term path-dependencies because of their longevity. Fostering a transition toward environmentally sustainable patterns of resource use requires a more complete understanding of stock-flow relations. Here we show that about half of all materials extracted globally by humans each year are used to build up or renew in-use stocks of materials. Based on a dynamic stock-flow model, we analyze stocks, inflows, and outflows of all materials and their relation to economic growth, energy use, and CO2 emissions from 1900 to 2010. Over this period, global material stocks increased 23-fold, reaching 792 Pg (±5%) in 2010. Despite efforts to improve recycling rates, continuous stock growth precludes closing material loops; recycling still only contributes 12% of inflows to stocks. Stocks are likely to continue to grow, driven by large infrastructure and building requirements in emerging economies. A convergence of material stocks at the level of industrial countries would lead to a fourfold increase in global stocks, and CO2 emissions exceeding climate change goals. Reducing expected future increases of material and energy demand and greenhouse gas emissions will require decoupling of services from the stocks and flows of materials through, for example, more intensive utilization of existing stocks, longer service lifetimes, and more efficient design. PMID:28167761

  10. Managing critical materials with a technology-specific stocks and flows model.

    PubMed

    Busch, Jonathan; Steinberger, Julia K; Dawson, David A; Purnell, Phil; Roelich, Katy

    2014-01-21

    The transition to low carbon infrastructure systems required to meet climate change mitigation targets will involve an unprecedented roll-out of technologies reliant upon materials not previously widespread in infrastructure. Many of these materials (including lithium and rare earth metals) are at risk of supply disruption. To ensure the future sustainability and resilience of infrastructure, circular economy policies must be crafted to manage these critical materials effectively. These policies can only be effective if supported by an understanding of the material demands of infrastructure transition and what reuse and recycling options are possible given the future availability of end-of-life stocks. This Article presents a novel, enhanced stocks and flows model for the dynamic assessment of material demands resulting from infrastructure transitions. By including a hierarchical, nested description of infrastructure technologies, their components, and the materials they contain, this model can be used to quantify the effectiveness of recovery at both a technology remanufacturing and reuse level and a material recycling level. The model's potential is demonstrated on a case study on the roll-out of electric vehicles in the UK forecast by UK Department of Energy and Climate Change scenarios. The results suggest policy action should be taken to ensure Li-ion battery recycling infrastructure is in place by 2025 and NdFeB motor magnets should be designed for reuse. This could result in a reduction in primary demand for lithium of 40% and neodymium of 70%.
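
    The hierarchical, nested description mentioned above can be represented with a simple data structure in which a technology contains components and each component contains masses of materials; the composition figures below are placeholders for illustration only, not values from the study.

    ```python
    # Illustrative nested description: technology -> components -> materials (kg per unit).
    # All masses are hypothetical placeholders, not data from the study.
    electric_vehicle = {
        "battery": {"lithium": 8.0, "cobalt": 10.0},
        "traction_motor": {"neodymium": 1.0, "copper": 10.0},
    }

    def material_demand(fleet_size, technology):
        """Total primary material demand for a fleet of identical units (no recovery)."""
        demand = {}
        for component in technology.values():
            for material, mass in component.items():
                demand[material] = demand.get(material, 0.0) + fleet_size * mass
        return demand

    print(material_demand(1_000_000, electric_vehicle))
    ```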

  11. A geodynamic model of Andean mountain building

    NASA Astrophysics Data System (ADS)

    Schellart, Wouter P.

    2017-04-01

    The Andes mountain range in South America is the longest in the world and is unique in that it has formed at a subduction zone and not at a continent-continent collision zone. The mountain range has formed due to overriding plate shortening since the Late Cretaceous, and its origin and the driving mechanism(s) responsible for its formation remain a topic of intense debate. Here I present a buoyancy-driven geodynamic model of South American-style subduction, mantle flow and overriding plate deformation, illustrating how subduction-induced mantle flow drives overriding plate deformation. The model reproduces several first-order characteristics of the Andes, including major crustal thickening (up to double the initial crustal thickness) and hundreds of km of east-west shortening in the Central Andes, as well as a slab geometry that is comparable to that of the Nazca slab below the Central Andes. Ultimately, the geodynamic model shows that subduction-induced mantle flow is responsible for Andean-style mountain building.

  12. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  13. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data.

    PubMed

    Storch, Laura S; Glaser, Sarah M; Ye, Hao; Rosenberg, Andrew A

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model's utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts.

  14. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical uses such as Value-at-Risk estimation and portfolio analysis. We perform an empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates on a much larger time scale than the return itself, and the inverse of the variance, or “inverse temperature”, β obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method for Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework for portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single-β model, and empirically examine the agreement between the model and an imaginary portfolio composed of Dow Jones indices. It turns out that the single-β model gives a good approximation for portfolios composed of assets with non-Gaussian and correlated returns.
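
    A minimal Monte Carlo sketch of the Beck-type superstatistics described above: the inverse variance β is drawn from a Γ-distribution and held fixed over slow blocks, while returns are conditionally Gaussian, which yields a fat-tailed (q-Gaussian-like) unconditional return distribution. The block length and Γ parameters are illustrative only, not values estimated in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def beck_returns(n_returns, block_length=100, shape=3.0, scale=1.0):
        """Simulate returns whose inverse variance beta ~ Gamma(shape, scale)
        fluctuates slowly, staying constant over blocks of `block_length` steps."""
        n_blocks = n_returns // block_length + 1
        betas = rng.gamma(shape, scale, size=n_blocks)        # slowly fluctuating inverse variance
        sigmas = 1.0 / np.sqrt(np.repeat(betas, block_length))[:n_returns]
        return rng.normal(0.0, sigmas)                        # conditionally Gaussian returns

    returns = beck_returns(100_000)
    # Fat tails relative to a Gaussian show up as positive excess kurtosis
    excess_kurtosis = np.mean(returns ** 4) / np.var(returns) ** 2 - 3.0
    print(excess_kurtosis)
    ```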

  15. Energy savings modelling of re-tuning energy conservation measures in large office buildings

    SciTech Connect

    Fernandez, Nick; Katipamula, Srinivas; Wang, Weimin; Huang, Yunzhi; Liu, Guopeng

    2014-10-20

    Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS’s capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced lifetimes of the equipment. This paper investigates the energy savings potential of several common HVAC system re-tuning measures on a typical large office building, using the Department of Energy’s building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) to static pressure, supply-air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures have been formulated – each designed to conform to limitations to implementation of certain individual measures that might exist in typical buildings. All the individual measures and combinations were simulated in 16 climate locations representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand-side of the building (air-systems and schedules). Many of the demand-side individual measures were capable of reducing annual total HVAC system energy consumption by over 20% in most cities that were modeled. Supply side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy

  16. Teaching Model Building to High School Students: Theory and Reality.

    ERIC Educational Resources Information Center

    Roberts, Nancy; Barclay, Tim

    1988-01-01

    Builds on a National Science Foundation (NSF) microcomputer-based laboratory project to introduce system dynamics into the precollege setting. Focuses on providing students with powerful and investigatory theory-building tools. Discusses developed hardware, software, and curriculum materials used to introduce model building and simulations into…

  17. Evidence of Large Fluctuations of Stock Return and Financial Crises from Turkey: Using Wavelet Coherency and Varma Modeling to Forecast Stock Return

    NASA Astrophysics Data System (ADS)

    Oygur, Tunc; Unal, Gazanfer

    Shocks, jumps, booms and busts are typical large-fluctuation markers which appear in crises. Models and leading indicators vary according to crisis type, even though the literature offers many different models and leading indicators for determining the structure of a crisis. In this paper, we investigate the structure of the dynamic correlation of stock return, interest rate, exchange rate and trade balance differences during crisis periods in Turkey over the period between October 1990 and March 2015, applying wavelet coherency methodologies to determine the nature of the crises. The time period includes Turkey's currency and banking crises, the US sub-prime mortgage crisis and the European sovereign debt crisis, which occurred in 1994, 2001, 2008 and 2009, respectively. Empirical results showed that stock return, interest rate, exchange rate and trade balance differences are significantly linked during the financial crises in Turkey. The cross wavelet power, the wavelet coherency, the multiple wavelet coherency and the quadruple wavelet coherency methodologies have been used to examine the structure of the dynamic correlation. Moreover, as a consequence of the quadruple and multiple wavelet coherence results, strongly correlated large scales indicate linear behavior, and hence a VARMA (vector autoregressive moving average) model gives better fitting and forecasting performance. In addition, increasing the dimension of the model for strongly correlated scales leads to more accurate results than its scalar counterparts.
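
    A minimal sketch of fitting a VARMA model to multivariate differenced series with statsmodels is given below; the synthetic data, the variable names and the (1, 1) order are illustrative assumptions, not the specification selected in the paper.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.varmax import VARMAX

    # Synthetic stand-ins for the differenced stock return, interest rate,
    # exchange rate and trade balance series used in the paper.
    rng = np.random.default_rng(1)
    data = pd.DataFrame(
        rng.standard_normal((300, 4)),
        columns=["stock_return", "interest_rate", "exchange_rate", "trade_balance"])

    model = VARMAX(data, order=(1, 1))     # VARMA(1, 1); the order is a modelling choice
    results = model.fit(disp=False)
    forecast = results.forecast(steps=12)  # 12-step-ahead joint forecast
    print(forecast.head())
    ```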

  18. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data

    PubMed Central

    Storch, Laura S.; Glaser, Sarah M.; Ye, Hao; Rosenberg, Andrew A.

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model’s utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts. PMID:28199344

  19. Managing Critical Materials with a Technology-Specific Stocks and Flows Model

    PubMed Central

    2013-01-01

    The transition to low carbon infrastructure systems required to meet climate change mitigation targets will involve an unprecedented roll-out of technologies reliant upon materials not previously widespread in infrastructure. Many of these materials (including lithium and rare earth metals) are at risk of supply disruption. To ensure the future sustainability and resilience of infrastructure, circular economy policies must be crafted to manage these critical materials effectively. These policies can only be effective if supported by an understanding of the material demands of infrastructure transition and what reuse and recycling options are possible given the future availability of end-of-life stocks. This Article presents a novel, enhanced stocks and flows model for the dynamic assessment of material demands resulting from infrastructure transitions. By including a hierarchical, nested description of infrastructure technologies, their components, and the materials they contain, this model can be used to quantify the effectiveness of recovery at both a technology remanufacturing and reuse level and a material recycling level. The model’s potential is demonstrated on a case study on the roll-out of electric vehicles in the UK forecast by UK Department of Energy and Climate Change scenarios. The results suggest policy action should be taken to ensure Li-ion battery recycling infrastructure is in place by 2025 and NdFeB motor magnets should be designed for reuse. This could result in a reduction in primary demand for lithium of 40% and neodymium of 70%. PMID:24328245

  20. Building groundwater modeling capacity in Mongolia

    USGS Publications Warehouse

    Valder, Joshua F.; Carter, Janet M.; Anderson, Mark T.; Davis, Kyle W.; Haynes, Michelle A.; Dorjsuren Dechinlhundev,

    2016-06-16

    Ulaanbaatar, the capital city of Mongolia (fig. 1), is dependent on groundwater for its municipal and industrial water supply. The population of Mongolia is about 3 million people, with about one-half the population residing in or near Ulaanbaatar (World Population Review, 2016). Groundwater is drawn from a network of shallow wells in an alluvial aquifer along the Tuul River. Evidence indicates that current water use may not be sustainable from existing water sources, especially when factoring the projected water demand from a rapidly growing urban population (Ministry of Environment and Green Development, 2013). In response, the Government of Mongolia Ministry of Environment, Green Development, and Tourism (MEGDT) and the Freshwater Institute, Mongolia, requested technical assistance on groundwater modeling through the U.S. Army Corps of Engineers (USACE) to the U.S. Geological Survey (USGS). Scientists from the USGS and USACE provided two workshops in 2015 to Mongolian hydrology experts on basic principles of groundwater modeling using the USGS groundwater modeling program MODFLOW-2005 (Harbaugh, 2005). The purpose of the workshops was to bring together representatives from the Government of Mongolia, local universities, technical experts, and other key stakeholders to build in-country capacity in hydrogeology and groundwater modeling. A preliminary steady-state groundwater-flow model was developed as part of the workshops to demonstrate groundwater modeling techniques to simulate groundwater conditions in alluvial deposits along the Tuul River in the vicinity of Ulaanbaatar. ModelMuse (Winston, 2009) was used as the graphical user interface for MODFLOW for training purposes during the workshops. Basic and advanced groundwater modeling concepts included in the workshops were groundwater principles; estimating hydraulic properties; developing model grids, data sets, and MODFLOW input files; and viewing and evaluating MODFLOW output files. A key to success was
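
    As a conceptual illustration of what a steady-state groundwater-flow model solves (not a MODFLOW-2005 input file and not the workshop model), the sketch below iterates a finite-difference Laplace equation for hydraulic head on a small grid with fixed-head side boundaries and no-flow top/bottom boundaries.

    ```python
    # Conceptual finite-difference sketch of steady-state groundwater flow: relax
    # Laplace's equation for head with fixed-head side boundaries and no-flow
    # top/bottom boundaries. A stand-in for the idea behind a MODFLOW model only.
    import numpy as np

    nrow, ncol = 20, 40
    head = np.zeros((nrow, ncol))
    head[:, 0] = 100.0            # fixed head, left boundary (e.g. river stage)
    head[:, -1] = 95.0            # fixed head, right boundary

    for _ in range(5000):         # Jacobi-style relaxation to steady state
        head[1:-1, 1:-1] = 0.25 * (head[:-2, 1:-1] + head[2:, 1:-1] +
                                   head[1:-1, :-2] + head[1:-1, 2:])
        head[0, 1:-1] = head[1, 1:-1]     # no-flow (zero-gradient) top boundary
        head[-1, 1:-1] = head[-2, 1:-1]   # no-flow bottom boundary

    print(np.round(head[nrow // 2, ::10], 2))   # head profile along the middle row
    ```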

  1. A model for the evaluation of systemic risk in stock markets

    NASA Astrophysics Data System (ADS)

    Caetano, Marco Antonio Leonel; Yoneyama, Takashi

    2011-06-01

    Systemic risk refers to the possibility of a collapse of an entire financial system or market, differing from the risk associated with any particular individual or a group pertaining to the system, which may include banks, government, brokers, and creditors. After the 2008 financial crisis, a significant amount of effort has been directed to the study of systemic risk and its consequences around the world. Although it is very difficult to predict when people begin to lose confidence in a financial system, it is possible to model the relationships among the stock markets of different countries and perform a Monte Carlo-type analysis to study the contagion effect. Because some larger and stronger markets influence smaller ones, a model inspired by a catalytic chemical model is proposed. In chemical reactions, reagents with higher concentrations tend to favor their conversion to products. In order to modulate the conversion process, catalyzers may be used. In this work, a mathematical model is proposed based on the catalytic chemical reaction model. More specifically, the Hang Seng and Dow Jones indices are assumed to dominate Ibovespa (the Brazilian Stock Market index), such that the indices of strong markets are taken as being analogous to the concentrations of the reagents and the indices of smaller markets as concentrations of products. The role of the catalyst is to model the degree of influence of one index on another. The actual data used to fit the model parameters consisted of the Hang Seng index, Dow Jones index, and Ibovespa, since 1993. “What if” analyses were carried out considering some intervention policies.
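
    A minimal sketch of the catalytic analogy (with hypothetical couplings, not the fitted parameters from the paper): two dominant indices act as "reagents" that drive a dependent index, and coupling constants play the catalyst's role of setting the conversion rate.

    ```python
    # Hedged sketch of the catalytic-reaction analogy for contagion: two dominant
    # indices drive a smaller market index at a rate set by catalyst-like couplings.
    # Couplings and the shock scenario are hypothetical, not the paper's estimates.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_hs, k_dj, relax = 0.08, 0.05, 0.10          # assumed couplings and mean reversion

    def rhs(t, y, hang_seng, dow_jones):
        return [k_hs * hang_seng(t) + k_dj * dow_jones(t) - relax * y[0]]

    hang_seng = lambda t: 1.0                      # placeholder normalised index levels
    dow_jones = lambda t: 1.0 if t < 50 else 0.6   # a "what if" shock after t = 50

    sol = solve_ivp(rhs, (0, 100), [1.0], args=(hang_seng, dow_jones), max_step=1.0)
    print(round(float(sol.y[0, -1]), 3))           # long-run level of the dependent index
    ```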

  2. Investigating the Influence Relationship Models for Stocks in Indian Equity Market: A Weighted Network Modelling Study.

    PubMed

    Bhattacharjee, Biplab; Shafi, Muhammad; Acharjee, Animesh

    2016-01-01

    The socio-economic systems today possess high levels of both interconnectedness and interdependencies, and such system-level relationships behave very dynamically. In such situations, it is all around perceived that influence is a perplexing power that has an overseeing part in affecting the dynamics and behaviours of involved ones. As a result of the force & direction of influence, the transformative change of one entity has a cogent aftereffect on the other entities in the system. The current study employs directed weighted networks for investigating the influential relationship patterns existent in a typical equity market as an outcome of inter-stock interactions happening at the market level, the sectorial level and the industrial level. The study dataset is derived from 335 constituent stocks of 'Standard & Poor Bombay Stock Exchange 500 index' and the study period is 1st June 2005 to 30th June 2015. The study identifies the set of most dynamically influential stocks & their respective temporal patterns at three hierarchical levels: the complete equity market, different sectors, and constituting industry segments of those sectors. A detailed influence relationship analysis is performed for the sectorial level network of the construction sector, and it was found that stocks belonging to the cement industry possessed high influence within this sector. Also, the detailed network analysis of the construction sector revealed that it follows scale-free characteristics and power law distribution. In the industry-specific influence relationship analysis for the cement industry, methods based on threshold filtering and minimum spanning tree were employed to derive a set of sub-graphs having a temporally stable high-correlation structure over this ten-year period.
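
    The threshold-filtering and minimum-spanning-tree steps used for the industry-level analysis can be sketched in a few lines; the tickers and returns below are placeholders, and the distance transform follows the common Mantegna-style mapping from correlation to distance.

    ```python
    # Minimal sketch of correlation-network filtering: build a correlation matrix from
    # placeholder return series, keep edges above a threshold, and extract the MST.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    names = ["CEM_A", "CEM_B", "CEM_C", "CEM_D", "CEM_E"]   # hypothetical cement stocks
    returns = rng.normal(size=(250, len(names)))
    returns[:, 1] += 0.8 * returns[:, 0]                    # inject some co-movement
    corr = np.corrcoef(returns, rowvar=False)

    G = nx.Graph()
    threshold = 0.3
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if abs(corr[i, j]) >= threshold:                # threshold filtering
                dist = np.sqrt(2.0 * (1.0 - corr[i, j]))    # correlation -> distance
                G.add_edge(names[i], names[j], weight=dist)

    mst = nx.minimum_spanning_tree(G, weight="weight")
    print(sorted(mst.edges(data="weight")))
    ```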

  3. Investigating the Influence Relationship Models for Stocks in Indian Equity Market: A Weighted Network Modelling Study

    PubMed Central

    Acharjee, Animesh

    2016-01-01

    The socio-economic systems today possess high levels of both interconnectedness and interdependencies, and such system-level relationships behave very dynamically. In such situations, it is all around perceived that influence is a perplexing power that has an overseeing part in affecting the dynamics and behaviours of involved ones. As a result of the force & direction of influence, the transformative change of one entity has a cogent aftereffect on the other entities in the system. The current study employs directed weighted networks for investigating the influential relationship patterns existent in a typical equity market as an outcome of inter-stock interactions happening at the market level, the sectorial level and the industrial level. The study dataset is derived from 335 constituent stocks of ‘Standard & Poor Bombay Stock Exchange 500 index’ and the study period is 1st June 2005 to 30th June 2015. The study identifies the set of most dynamically influential stocks & their respective temporal patterns at three hierarchical levels: the complete equity market, different sectors, and constituting industry segments of those sectors. A detailed influence relationship analysis is performed for the sectorial level network of the construction sector, and it was found that stocks belonging to the cement industry possessed high influence within this sector. Also, the detailed network analysis of the construction sector revealed that it follows scale-free characteristics and power law distribution. In the industry-specific influence relationship analysis for the cement industry, methods based on threshold filtering and minimum spanning tree were employed to derive a set of sub-graphs having a temporally stable high-correlation structure over this ten-year period. PMID:27846251

  4. Optimizing Energy Consumption in Building Designs Using Building Information Model (BIM)

    NASA Astrophysics Data System (ADS)

    Egwunatum, Samuel; Joseph-Akwara, Esther; Akaigwe, Richard

    2016-09-01

    Given the ability of a Building Information Model (BIM) to serve as a multi-disciplinary data repository, this paper seeks to explore and exploit the sustainability value of Building Information Modelling/models in delivering buildings that require less energy for their operation, emit less CO2 and at the same time provide a comfortable living environment for their occupants. This objective was achieved by a critical and extensive review of the literature covering: (1) building energy consumption, (2) building energy performance and analysis, and (3) building information modeling and energy assessment. The literature cited in this paper showed that linking an energy analysis tool with a BIM model helped project design teams to predict and optimize energy consumption. To validate this finding, an in-depth analysis was carried out on a completed BIM integrated construction project using the Arboleda Project in the Dominican Republic. The findings showed that the BIM-based energy analysis helped the design team achieve the world's first 103% positive energy building. From the research findings, the paper concludes that linking an energy analysis tool with a BIM model helps to expedite the energy analysis process, provide more detailed and accurate results, and deliver energy-efficient buildings. The study further recommends that the adoption of a level 2 BIM and the integration of BIM in energy optimization analyses should be made compulsory for all projects, irrespective of the method of procurement (government-funded or otherwise) or project size.

  5. Iterative-build OMIT maps: map improvement by iterative model building and refinement without model bias

    PubMed Central

    Terwilliger, Thomas C.; Grosse-Kunstleve, Ralf W.; Afonine, Pavel V.; Moriarty, Nigel W.; Adams, Paul D.; Read, Randy J.; Zwart, Peter H.; Hung, Li-Wei

    2008-01-01

    A procedure for carrying out iterative model building, density modification and refinement is presented in which the density in an OMIT region is essentially unbiased by an atomic model. Density from a set of overlapping OMIT regions can be combined to create a composite ‘iterative-build’ OMIT map that is everywhere unbiased by an atomic model but also everywhere benefiting from the model-based information present elsewhere in the unit cell. The procedure may have applications in the validation of specific features in atomic models as well as in overall model validation. The procedure is demonstrated with a molecular-replacement structure and with an experimentally phased structure and a variation on the method is demonstrated by removing model bias from a structure from the Protein Data Bank. PMID:18453687

  6. Model Predictive Control for the Operation of Building Cooling Systems

    SciTech Connect

    Ma, Yudong; Borrelli, Francesco; Hencey, Brandon; Coffey, Brian; Bengea, Sorin; Haves, Philip

    2010-06-29

    A model-based predictive control (MPC) is designed for optimal thermal energy storage in building cooling systems. We focus on buildings equipped with a water tank used for actively storing cold water produced by a series of chillers. Typically the chillers are operated at night to recharge the storage tank in order to meet the building demands on the following day. In this paper, we build on our previous work, improve the building load model, and present experimental results. The experiments show that MPC can achieve reduction in the central plant electricity cost and improvement of its efficiency.
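
    A minimal convex sketch in the spirit of this formulation (illustrative bounds, prices and loads, not the experimental plant model): choose chiller output over a 24-hour horizon so that the storage tank plus chillers meet a forecast cooling load at minimum electricity cost.

    ```python
    # Hedged MPC-style sketch for chilled-water storage: minimise electricity cost of
    # meeting a forecast cooling load, with simple tank dynamics and capacity limits.
    # All parameters (prices, loads, COP, tank size) are illustrative.
    import numpy as np
    import cvxpy as cp

    H = 24
    price = np.where(np.arange(H) < 7, 0.06, 0.18)                     # $/kWh_e, cheap at night
    load = 200 + 150 * np.clip(np.sin(np.pi * (np.arange(H) - 6) / 12), 0, None)  # kWh_th/h

    chiller = cp.Variable(H, nonneg=True)   # chiller thermal output per hour
    soc = cp.Variable(H + 1)                # tank state of charge (kWh_th)
    cop = 3.5                               # assumed chiller coefficient of performance

    constraints = [soc[0] == 300, soc >= 0, soc <= 1000, chiller <= 500]
    for t in range(H):
        constraints.append(soc[t + 1] == soc[t] + chiller[t] - load[t])  # tank balance

    cost = cp.sum(cp.multiply(price, chiller)) / cop                     # electricity cost
    cp.Problem(cp.Minimize(cost), constraints).solve()
    print(np.round(chiller.value, 1))
    ```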

  7. A legacy building model for holistic nursing.

    PubMed

    Lange, Bernadette; Zahourek, Rothlyn P; Mariano, Carla

    2014-06-01

    This pilot project was an effort to record the historical roots, development, and legacy of holistic nursing through the visionary spirit of four older American Holistic Nurses Association (AHNA) members. The aim was twofold: (a) to capture the holistic nursing career experiences of elder AHNA members and (b) to begin to create a Legacy Building Model for Holistic Nursing. The narratives will help initiate an ongoing, systematic method for the collection of historical data and serve as a perpetual archive of knowledge and inspiration for present and future holistic nurses. An aesthetic inquiry approach was used to conduct in-depth interviews with four older AHNA members who have made significant contributions to holistic nursing. The narratives provide a rich description of their personal and professional evolution as holistic nurses. The narratives are presented in an aesthetic format of the art forms of snapshot, pastiche, and collage rather than traditional presentations of research findings. A synopsis of the narratives is a dialogue between the three authors and provides insight for how a Legacy Model can guide our future. Considerations for practice, education, and research are discussed based on the words of wisdom from the four older holistic nurses.

  8. Convergent modelling of past soil organic carbon stocks but divergent projections

    NASA Astrophysics Data System (ADS)

    Luo, Z.; Wang, E.; Zheng, H.; Baldock, J. A.; Sun, O. J.; Shao, Q.

    2015-07-01

    Soil carbon (C) models are important tools for understanding soil C balance and projecting C stocks in terrestrial ecosystems, particularly under global change. The initialization and/or parameterization of soil C models can vary among studies even when the same model and data set are used, causing potential uncertainties in projections. Although a few studies have assessed such uncertainties, it is yet unclear what these uncertainties are correlated with and how they change across varying environmental and management conditions. Here, applying a process-based biogeochemical model to 90 individual field experiments (ranging from 5 to 82 years of experimental duration) across the Australian cereal-growing regions, we demonstrated that well-designed optimization procedures enabled the model to accurately simulate changes in measured C stocks, but did not guarantee convergent forward projections (100 years). Major causes of the projection uncertainty were insufficient understanding of how microbial processes and soil C pools change to modulate C turnover. For a given site, the uncertainty significantly increased with the magnitude of future C input and years of the projection. Across sites, the uncertainty correlated positively with temperature but negatively with rainfall. On average, a 331 % uncertainty in projected C sequestration ability can be inferred in Australian agricultural soils. This uncertainty would increase further if projections were made for future warming and drying conditions. Future improvement in soil C modelling should focus on how the microbial community and its C use efficiency change in response to environmental changes, and better conceptualization of heterogeneous soil C pools and the C transformation among those pools.

  9. Convergent modeling of past soil organic carbon stocks but divergent projections

    NASA Astrophysics Data System (ADS)

    Luo, Z.; Wang, E.; Zheng, H.; Baldock, J. A.; Sun, O. J.; Shao, Q.

    2015-03-01

    Soil carbon models are important tools for understanding soil carbon balance and projecting carbon stocks in terrestrial ecosystems, particularly under global change. The initialization and/or parameterization of soil carbon models can vary among studies even when the same model and dataset are used, causing potential uncertainties in projections. Although a few studies have assessed such uncertainties, it is yet unclear what these uncertainties are correlated with and how they change across varying environmental and management conditions. Here, applying a process-based biogeochemical model to 90 individual field experiments (ranging from 5 to 82 years of experimental duration) across the Australian cereal-growing regions, we demonstrated that well-designed calibration procedures enabled the model to accurately simulate changes in measured carbon stocks, but did not guarantee convergent forward projections (100 years). Major causes of the projection uncertainty were insufficient understanding of how microbial processes and soil carbon composition change to modulate carbon turnover. For a given site, the uncertainty significantly increased with the magnitude of future carbon input and years of the projection. Across sites, the uncertainty correlated positively with temperature, but negatively with rainfall. On average, a 331% uncertainty in projected carbon sequestration ability can be inferred in Australian agricultural soils. This uncertainty would increase further if projections were made for future warming and drying conditions. Future improvement in soil carbon modeling should focus on how the microbial community and its carbon use efficiency change in response to environmental changes, better quantification of the composition of soil carbon and its change, and how the soil carbon composition will affect its turnover time.

  10. Predicting the microbial exposure risks in urban floods using GIS, building simulation, and microbial models.

    PubMed

    Taylor, Jonathon; Biddulph, Phillip; Davies, Michael; Lai, Ka man

    2013-01-01

    London is expected to experience more frequent periods of intense rainfall and tidal surges, leading to an increase in the risk of flooding. Damp and flooded dwellings can support microbial growth, including mould, bacteria, and protozoa, as well as persistence of flood-borne microorganisms. The amount of time flooded dwellings remain damp will depend on the duration and height of the flood, the contents of the flood water, the drying conditions, and the building construction, leading to particular properties and property types being prone to lingering damp and human pathogen growth or persistence. The impact of flooding on buildings can be simulated using Heat Air and Moisture (HAM) models of varying complexity in order to understand how water can be absorbed and dry out of the building structure. This paper describes the simulation of the drying of building archetypes representative of the English building stock using the EnergyPlus based tool 'UCL-HAMT' in order to determine the drying rates of different abandoned structures flooded to different heights and during different seasons. The results are mapped out using GIS in order to estimate the spatial risk across London in terms of comparative flood vulnerability, as well as for specific flood events. Areas of South and East London were found to be particularly vulnerable to long-term microbial exposure following major flood events. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. DEVELOPMENT OF A FLEXIBLE, MULTIZONE, MULTIFAMILY BUILDING SIMULATION MODEL

    SciTech Connect

    Malhotra, Mini; Im, Piljae

    2012-01-01

    Weatherization of multifamily buildings is gaining increased attention in the U.S. Available energy audit tools for multifamily buildings were found to need desirable improvements. On the wish list of field experts for enhanced features was the basic ability to model multizone buildings (i.e., one thermal zone per dwelling unit) with simplified user inputs, which allows a better analysis of decentralized and centralized HVAC and domestic hot water systems of multifamily buildings without having to create detailed building models. To address the desired capabilities, development of an enhanced energy audit tool was begun in 2011. The tool is a strategically structured, flexible, one-zone-per-unit, DOE-2.1e model coupled with a simplified user interface to model small to large multifamily buildings with decentralized or centralized systems and associated energy measures. This paper describes the modeling concept and its implementation.

  12. A View on Future Building System Modeling and Simulation

    SciTech Connect

    Wetter, Michael

    2011-04-01

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  13. Modeling arson - An exercise in qualitative model building

    NASA Technical Reports Server (NTRS)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (in the theory of games and economic behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider is the class of agents to which the model is applicable, and accordingly more confidence is put in the derived results. A methodological discussion on modeling human behavior is included.

  15. A spatial model approach for assessing windbreak growth and carbon stocks.

    PubMed

    Hou, Qingjiang; Young, Linda J; Brandle, James R; Schoeneberger, Michele M

    2011-01-01

    Agroforestry, the deliberate integration of trees into agricultural operations, sequesters carbon (C) while providing valuable services on agricultural lands. However, methods to quantify present and projected C stocks in these open-grown woody systems are limited. As an initial step to address C accounting in agroforestry systems, a spatial Markov random field model for predicting the natural logarithm (log) of the mean aboveground volume of green ash (Fraxinus pennsylvanica Marsh.) within a shelterbelt, referred to as the log of aboveground volume, was developed using data from an earlier study and web-available soil and climate information. Windbreak characteristics, site, and climate variables were used to model the large-scale trend of the log of aboveground volume. The residuals from this initial model were correlated among sites up to 24 km from a point of interest. Therefore, a spatial dependence parameter was used to incorporate information from sites within 24 km into the prediction of the log of the aboveground volume. Age is an important windbreak characteristic in the model. Thus, the log of aboveground volume can be predicted for a given windbreak age and for values of other explanatory variables associated with a site of interest. Such predictions can be exponentiated to obtain predictions of aboveground volume for windbreaks without repeated inventory. With the capability of quantifying uncertainty, the model has the potential for large regional planning efforts and C stock assessments for many deciduous tree species used in windbreaks and riparian buffers once it is calibrated. American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.

  16. Iterative model-building, structure refinement, and density modification with the PHENIX AutoBuild Wizard

    SciTech Connect

    Terwilliger, T.C.; Grosse-Kunstleve, Ralf Wilhelm; Afonine, P.V.; Moriarty, N.W.; Zwart, P.H.; Hung, L.-W.; Read, R.J.; Adams, P.D.

    2007-04-29

    The PHENIX AutoBuild Wizard is a highly automated tool for iterative model-building, structure refinement and density modification using RESOLVE or TEXTAL model-building, RESOLVE statistical density modification, and phenix.refine structure refinement. Recent advances in the AutoBuild Wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model completion algorithms, and automated solvent molecule picking. Model completion algorithms in the AutoBuild Wizard include loop-building, crossovers between chains in different models of a structure, and side-chain optimization. The AutoBuild Wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 Å to 3.2 Å, resulting in a mean R-factor of 0.24 and a mean free R factor of 0.29. The R-factor of the final model is dependent on the quality of the starting electron density, and relatively independent of resolution.

  17. Iterative model building, structure refinement and density modification with the PHENIX AutoBuild wizard.

    PubMed

    Terwilliger, Thomas C; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Moriarty, Nigel W; Zwart, Peter H; Hung, Li Wei; Read, Randy J; Adams, Paul D

    2008-01-01

    The PHENIX AutoBuild wizard is a highly automated tool for iterative model building, structure refinement and density modification using RESOLVE model building, RESOLVE statistical density modification and phenix.refine structure refinement. Recent advances in the AutoBuild wizard and phenix.refine include automated detection and application of NCS from models as they are built, extensive model-completion algorithms and automated solvent-molecule picking. Model-completion algorithms in the AutoBuild wizard include loop building, crossovers between chains in different models of a structure and side-chain optimization. The AutoBuild wizard has been applied to a set of 48 structures at resolutions ranging from 1.1 to 3.2 Å, resulting in a mean R factor of 0.24 and a mean free R factor of 0.29. The R factor of the final model is dependent on the quality of the starting electron density and is relatively independent of resolution.

  18. SENCAR mouse skin tumorigenesis model versus other strains and stocks of mice.

    PubMed Central

    Slaga, T J

    1986-01-01

    The SENCAR mouse stock was selectively bred for eight generations for sensitivity to skin tumor induction by the two-stage tumorigenesis protocol using 7,12-dimethylbenz(a)anthracene (DMBA) as the initiator and 12-O-tetradecanoylphorbol-13-acetate (TPA) as the promoter. The SENCAR mouse was derived by crossing Charles River CD-1 mice with skin-tumor-sensitive mice (STS). The SENCAR mice are much more sensitive to both DMBA tumor initiation and TPA tumor promotion than CD-1, BALB/c, and DBA/2 mice. An even greater difference in the sensitivity to two-stage skin tumorigenesis is apparent between SENCAR and C57BL/6 mice when using DMBA-TPA treatment. However, the SENCAR and C57BL/6 mice have a similar tumor response to DMBA-benzoyl peroxide treatment, suggesting that TPA is not an effective promoter in C57BL/6 mice. The DBA/2 mice respond in a similar manner to the SENCAR mice when using N-methyl-N-nitro-N-nitrosoguanidine (MNNG)-TPA treatment. The SENCAR mouse model provides a good dose-response relationship for many carcinogens used as tumor initiators and for many compounds used as tumor promoters. When compared to other stocks and strains of mice, the SENCAR mouse has one of the largest databases for carcinogens and promoters. PMID:3096709

  19. SENCAR mouse skin tumorigenesis model versus other strains and stocks of mice

    SciTech Connect

    Slaga, T.J.

    1986-09-01

    The SENCAR mouse stock was selectively bred for eight generations for sensitivity to skin tumor induction by the two-stage tumorigenesis protocol using 7,12-dimethylbenz(a)anthracene (DMBA) as the initiator and 12-O-tetradecanoylphorbol-13-acetate (TPA) as the promoter. The SENCAR mouse was derived by crossing Charles River CD-1 mice with skin-tumor-sensitive mice (STS). The SENCAR mice are much more sensitive to both DMBA tumor initiation and TPA tumor promotion than CD-1, BALB/c, and DBA/2 mice. An even greater difference in the sensitivity to two-stage skin tumorigenesis is apparent between SENCAR and C57BL/6 mice when using DMBA-TPA treatment. However, the SENCAR and C57BL/6 mice have a similar tumor response to DMBA-benzoyl peroxide treatment, suggesting that TPA is not an effective promoter in C57BL/6 mice. The DBA/2 mice respond in a similar manner to the SENCAR mice when using N-methyl-N-nitro-N-nitrosoguanidine (MNNG)-TPA treatment. The SENCAR mouse model provides a good dose-response relationship for many carcinogens used as tumor initiators and for many compounds used as tumor promoters. When compared to other stocks and strains of mice, the SENCAR mouse has one of the largest databases for carcinogens and promoters.

  20. Stock management in hospital pharmacy using chance-constrained model predictive control.

    PubMed

    Jurado, I; Maestre, J M; Velarde, P; Ocampo-Martinez, C; Fernández, I; Tejera, B Isla; Prado, J R Del

    2016-05-01

    One of the most important problems in the pharmacy department of a hospital is stock management. The clinical need for drugs must be satisfied with limited work labor while minimizing the use of economic resources. The complexity of the problem resides in the random nature of the drug demand and the multiple constraints that must be taken into account in every decision. In this article, chance-constrained model predictive control is proposed to deal with this problem. The flexibility of model predictive control allows taking into account explicitly the different objectives and constraints involved in the problem while the use of chance constraints provides a trade-off between conservativeness and efficiency. The solution proposed is assessed to study its implementation in two Spanish hospitals. Copyright © 2015 Elsevier Ltd. All rights reserved.
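
    Under a Gaussian demand assumption, the per-period chance constraints used in this kind of formulation have a simple deterministic equivalent (inventory must cover the demand mean plus a quantile-scaled safety margin); the sketch below shows that reformulation for a single drug with hypothetical parameters, not the hospitals' data or the authors' exact model.

    ```python
    # Hedged sketch of chance-constrained ordering for one drug: with Gaussian daily
    # demand, P(stock-out) <= eps becomes inventory >= mean demand + z*sigma.
    # Demand statistics, costs and bounds are illustrative placeholders.
    import numpy as np
    import cvxpy as cp
    from scipy.stats import norm

    H = 7                                   # planning horizon (days)
    mu_d = np.full(H, 40.0)                 # assumed mean daily demand (units)
    sigma_d = np.full(H, 8.0)               # assumed demand standard deviation
    z = norm.ppf(1 - 0.05)                  # quantile for a 5% stock-out risk

    order = cp.Variable(H, nonneg=True)
    inv = cp.Variable(H + 1)

    constraints = [inv[0] == 60, order <= 200]
    for t in range(H):
        constraints.append(inv[t + 1] == inv[t] + order[t] - mu_d[t])      # expected balance
        constraints.append(inv[t] + order[t] >= mu_d[t] + z * sigma_d[t])  # chance constraint

    holding, ordering_cost = 0.1, 1.0
    objective = cp.Minimize(holding * cp.sum(inv) + ordering_cost * cp.sum(order))
    cp.Problem(objective, constraints).solve()
    print(np.round(order.value, 1))
    ```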

  1. Building Energy Modeling: A Data-Driven Approach

    NASA Astrophysics Data System (ADS)

    Cui, Can

    Buildings consume nearly 50% of the total energy in the United States, which drives the need to develop high-fidelity models for building energy systems. Extensive methods and techniques have been developed, studied, and applied to building energy simulation and forecasting, but most work has focused on developing dedicated modeling approaches for generic buildings. In this study, an integrated, computationally efficient, high-fidelity building energy modeling framework is proposed, with a focus on developing a generalized modeling approach for various types of buildings. First, a number of data-driven simulation models are reviewed and assessed on various types of computationally expensive simulation problems. Motivated by the conclusion that no model outperforms the others when amortized over diverse problems, a meta-learning based recommendation system for data-driven simulation modeling is proposed. To test the feasibility of the proposed framework on building energy systems, an extended application of the recommendation system for short-term building energy forecasting is deployed on various buildings. Finally, a Kalman filter-based data fusion technique is incorporated into the recommendation system for on-line energy forecasting. Data fusion enables model calibration to update the state estimate in real time, which filters out noise and yields a more accurate energy forecast. The framework is composed of two modules: an off-line model recommendation module and an on-line model calibration module. Specifically, the off-line model recommendation module includes 6 widely used data-driven simulation models, which are ranked by the meta-learning recommendation system for off-line energy modeling on a given building scenario. Only a selective set of building physical and operational characteristic features is needed to complete the recommendation task. The on-line calibration module effectively addresses system uncertainties, where data fusion on
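
    The Kalman-filter data fusion mentioned for the on-line module can be illustrated with a scalar filter that blends a (biased) model forecast with noisy meter readings; the sketch below only illustrates that fusion idea, with made-up load, bias and noise levels, and is not the framework's implementation.

    ```python
    # Hedged scalar Kalman-filter sketch: fuse a biased model forecast of building load
    # with noisy meter readings to track the true load. All signals are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    true_load = 50 + 10 * np.sin(np.linspace(0, 4 * np.pi, 48))      # "true" load (kW)
    model_forecast = true_load + 5.0                                  # model with constant bias
    meter = true_load + rng.normal(scale=2.0, size=true_load.size)    # noisy measurements

    q, r = 0.5, 4.0                 # assumed process / measurement noise variances
    x, p = model_forecast[0], 10.0  # initial state estimate and variance
    fused = []
    for k in range(true_load.size):
        if k > 0:                                   # predict: follow the model increment
            x += model_forecast[k] - model_forecast[k - 1]
            p += q
        gain = p / (p + r)                          # update: blend in the meter reading
        x += gain * (meter[k] - x)
        p *= (1 - gain)
        fused.append(x)

    print(round(float(np.mean(np.abs(np.array(fused) - true_load))), 2), "kW mean abs. error")
    ```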

  2. Comparing field- and model-based standing dead tree carbon stock estimates across forests of the US

    Treesearch

    Christopher W. Woodall; Grant M. Domke; David W. MacFarlane; Christopher M. Oswalt

    2012-01-01

    As a signatory to the United Nations Framework Convention on Climate Change, the US has been estimating standing dead tree (SDT) carbon (C) stocks using a model based on live tree attributes. The USDA Forest Service began sampling SDTs nationwide in 1999. With comprehensive field data now available, the objective of this study was to compare field- and model-based...

  3. Vibration Response of Multi Storey Building Using Finite Element Modelling

    NASA Astrophysics Data System (ADS)

    Chik, T. N. T.; Zakaria, M. F.; Remali, M. A.; Yusoff, N. A.

    2016-07-01

    The interaction between a building, its foundation type and the geotechnical parameters of the ground can have a significant effect on the building. In general, stiffer foundations result in higher natural frequencies of the building-soil system, and higher input frequencies are often associated with other ground conditions. Ground-borne vibrations transmitted to buildings are often noticeable and can be felt, and their effects on the building can worsen if the vibration level is not controlled. The UTHM building is prone to ground-borne vibration because of its close distance to the main road and the construction activities adjacent to the building. This paper investigates the natural frequencies and vibration modes of a multi-storey office building modelled with and without its foundation system and compares the two cases. The finite element modelling (FEM) package LUSAS is used to perform the vibration analysis of the building, which is modelled from the original plans with the foundation system included in the structural model. The FEM results indicate that the structure modelled with a rigid base has higher natural frequencies than the structure with the foundation system. This may be due to soil-structure interaction and to the damping of the system, which is related to the amount of energy dissipated through the foundation soil. The paper therefore suggests that modelling the soil is necessary to capture its influence on the vibration response of the structure.

  4. Houston biosecurity: building a national model.

    PubMed Central

    Casscells, Ward; Mirhaji, Parsa; Lillibridge, Scott; Madjid, Mohammad

    2004-01-01

    On September 11, 2001, Al Qaeda terrorists committed an atrocity when they used domestic jetliners to crash into buildings in New York City and Washington, DC, killing thousands of people. In October 2001, another act of savagery occurred, this time using anthrax, not airplanes, to take innocent lives. Each incident demonstrates the vulnerability of an open society, and Americans are left to wonder how such acts can be prevented. Two years later, Al Qaeda operatives are reportedly regrouping, recruiting, and changing their tactics to distribute money and messages to operatives around the world. Many experts believe that terrorist attacks are inevitable. Every city is vulnerable to an attack, and none are fully prepared to handle the residual impact of a biological or chemical attack. A survey conducted by the Cable News Network (CNN) in January 2002, studied 30 major US cities, ranking them based on 6 statistical indices of vulnerability. Thirteen cities were deemed better prepared than Houston, 10 were in a similar state of preparedness, and only 6 were less prepared than Houston. We will discuss the protective measures that have been put in place in Houston, and future steps to take. Other cities can model Houston's experience to develop similar plans nation-wide. PMID:17060983

  5. Complementarity of Historic Building Information Modelling and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.

    2016-06-01

    In this paper, we discuss the potential of integrating both semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build the detailed 3D historic model. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, as e.g. XYZ (3D), time and non-architectural information that are necessary for construction and management of buildings. GIS has potential in handling and managing spatial data especially exploring spatial relationships and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting the historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in BIM and GIS environment, how to build the enriched historic model, and why to construct different levels of details? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily for visualization.

  6. Emotional stocks and bonds: a metaphorical model for conceptualizing and treating codependency and other forms of emotional overinvesting.

    PubMed

    Daire, Andrew P; Jacobson, Lamerial; Carlson, Ryan G

    2012-01-01

    Codependent behaviors are associated with an unhealthy reliance on others for meeting emotional needs. This over-reliance on others often leads to dysfunctional interpersonal relationships. This article presents emotional stocks and bonds (ESB), a metaphorical model for use with clients who display codependent behaviors. Emotional stocks and bonds incorporates theoretical tenets from Bowen family systems and attachment theory and aids clients in understanding and changing unhealthy relationship behavior patterns. In addition to an overview of the model's key concepts and its use in clinical practice, we provide a case illustration and a discussion of practice implications and limitations.

  7. The asymmetric reactions of mean and volatility of stock returns to domestic and international information based on a four-regime double-threshold GARCH model

    NASA Astrophysics Data System (ADS)

    Chen, Cathy W. S.; Yang, Ming Jing; Gerlach, Richard; Jim Lo, H.

    2006-07-01

    In this paper, we investigate the asymmetric reactions of mean and volatility of stock returns in five major markets to their own local news and the US information via linear and nonlinear models. We introduce a four-regime Double-Threshold GARCH (DTGARCH) model, which allows asymmetry in both the conditional mean and variance equations simultaneously by employing two threshold variables, to analyze the stock markets’ reactions to different types of information (good/bad news) generated from the domestic markets and the US stock market. By applying the four-regime DTGARCH model, this study finds that the interaction between the information of domestic and US stock markets leads to the asymmetric reactions of stock returns and their variability. In addition, this research also finds that the positive autocorrelation reported in the previous studies of financial markets may in fact be mis-specified, and actually due to the local market's positive response to the US stock market.
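
    For readers unfamiliar with the model class, a generic four-regime double-threshold GARCH(1,1) specification can be written as below; the notation is schematic (regimes selected by two lagged threshold variables, e.g. local and US news) and is not the paper's exact parameterization.

    ```latex
    % Schematic four-regime double-threshold GARCH(1,1); regime j is picked by two
    % threshold variables z^{(1)}_{t-1} (domestic news) and z^{(2)}_{t-d} (US news).
    \[
    r_t = \phi_0^{(j)} + \phi_1^{(j)} r_{t-1} + a_t, \qquad
    a_t = \sqrt{h_t}\,\varepsilon_t, \quad \varepsilon_t \sim \mathcal{N}(0,1),
    \]
    \[
    h_t = \alpha_0^{(j)} + \alpha_1^{(j)} a_{t-1}^2 + \beta_1^{(j)} h_{t-1},
    \qquad
    j = \begin{cases}
    1, & z^{(1)}_{t-1} \le r_1 \text{ and } z^{(2)}_{t-d} \le r_2,\\
    2, & z^{(1)}_{t-1} \le r_1 \text{ and } z^{(2)}_{t-d} > r_2,\\
    3, & z^{(1)}_{t-1} > r_1 \text{ and } z^{(2)}_{t-d} \le r_2,\\
    4, & z^{(1)}_{t-1} > r_1 \text{ and } z^{(2)}_{t-d} > r_2.
    \end{cases}
    \]
    ```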

  8. Temperature response functions introduce high uncertainty in modelled carbon stocks in cold temperature regimes

    NASA Astrophysics Data System (ADS)

    Portner, H.; Bugmann, H.; Wolf, A.

    2009-08-01

    Models of carbon cycling in terrestrial ecosystems contain formulations for the dependence of respiration on temperature, but the sensitivity of predicted carbon pools and fluxes to these formulations and their parameterization is not understood. Thus, we performed an uncertainty analysis of soil organic matter decomposition with respect to its temperature dependency using the ecosystem model LPJ-GUESS. We used five temperature response functions (Exponential, Arrhenius, Lloyd-Taylor, Gaussian, Van't Hoff). We determined the parameter uncertainty ranges of the functions by nonlinear regression analysis based on eight experimental datasets from northern hemisphere ecosystems. We sampled over the uncertainty bounds of the parameters and ran simulations for each pair of temperature response function and calibration site. The uncertainty in both long-term and short-term soil carbon dynamics was analyzed over an elevation gradient in southern Switzerland. The Lloyd-Taylor function turned out to be adequate for modelling the temperature dependency of soil organic matter decomposition, whereas the other functions either resulted in poor fits (Exponential, Arrhenius) or were not applicable for all datasets (Gaussian, Van't Hoff). There were two main sources of uncertainty for model simulations: (1) the uncertainty in the parameter estimates of the response functions, which increased with increasing temperature and (2) the uncertainty in the simulated size of carbon pools, which increased with elevation, as slower turn-over times lead to higher carbon stocks and higher associated uncertainties. The higher uncertainty in carbon pools with slow turn-over rates has important implications for the uncertainty in the projection of the change of soil carbon stocks driven by climate change, which turned out to be more uncertain for higher elevations and hence higher latitudes, which are of key importance for the global terrestrial carbon budget.
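
    The Lloyd-Taylor response that the study finds adequate can be written down and fitted in a few lines; the sketch below uses the fixed T0 = 227.13 K constant from Lloyd & Taylor (1994) and synthetic respiration data, not the eight experimental datasets used in the paper.

    ```python
    # Hedged sketch: fit the Lloyd-Taylor temperature response
    #   R(T) = R10 * exp(E0 * (1/(283.15 - T0) - 1/(T - T0))),  T0 = 227.13 K,
    # to synthetic respiration data by nonlinear least squares.
    import numpy as np
    from scipy.optimize import curve_fit

    T0 = 227.13  # K, constant from Lloyd & Taylor (1994)

    def lloyd_taylor(T, R10, E0):
        return R10 * np.exp(E0 * (1.0 / (283.15 - T0) - 1.0 / (T - T0)))

    rng = np.random.default_rng(3)
    T = np.linspace(268.0, 298.0, 60)                                # soil temperature (K)
    resp = lloyd_taylor(T, 2.0, 308.56) * rng.lognormal(sigma=0.1, size=T.size)

    popt, pcov = curve_fit(lloyd_taylor, T, resp, p0=[1.0, 200.0])
    perr = np.sqrt(np.diag(pcov))
    print("R10 = %.2f +/- %.2f, E0 = %.1f +/- %.1f" % (popt[0], perr[0], popt[1], perr[1]))
    ```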

  9. Building Energy Model Development for Retrofit Homes

    SciTech Connect

    Chasar, David; McIlvaine, Janet; Blanchard, Jeremy; Widder, Sarah H.; Baechler, Michael C.

    2012-09-30

    Based on previous research conducted by Pacific Northwest National Laboratory and Florida Solar Energy Center providing technical assistance to implement 22 deep energy retrofits across the nation, 6 homes were selected in Florida and Texas for detailed post-retrofit energy modeling to assess realized energy savings (Chandra et al., 2012). However, assessing realized savings can be difficult for some homes where pre-retrofit occupancy and energy performance are unknown. Initially, savings had been estimated using a HERS Index comparison for these homes. However, this does not account for confounding factors such as occupancy and weather. This research addresses a method to more reliably assess energy savings achieved in deep energy retrofits for which pre-retrofit utility bills or occupancy information is not available. A metered home, Riverdale, was selected as a test case for development of a modeling procedure to account for occupancy and weather factors, potentially creating more accurate estimates of energy savings. This “true up” procedure was developed using Energy Gauge USA software and post-retrofit homeowner information and utility bills. The 12 step process adjusts the post-retrofit modeling results to correlate with post-retrofit utility bills and known occupancy information. The “trued” post-retrofit model is then used to estimate pre-retrofit energy consumption by changing the building efficiency characteristics to reflect the pre-retrofit condition, but keeping all weather and occupancy-related factors the same. This creates a pre-retrofit model that is more comparable to the post-retrofit energy use profile and can improve energy savings estimates. For this test case, a home for which pre- and post-retrofit utility bills were available was selected for comparison and assessment of the accuracy of the “true up” procedure. Based on the current method, this procedure is quite time intensive. However, streamlined processing spreadsheets or

  10. 12. Exterior of main offices, stock room and payroll offices ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Exterior of main offices, stock room and payroll offices view from yard (middle building formerly mold loft #1). Building at left is stock room desk and offices. - Barbour Boat Works, Tryon Palace Drive, New Bern, Craven County, NC

  11. 13. Exterior of main offices and stock room from yard. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Exterior of main offices and stock room from yard. Middle Building formerly mold loft #1. Building at left is stock room desk and offices. - Barbour Boat Works, Tryon Palace Drive, New Bern, Craven County, NC

  12. From Models to Measurements: Comparing Downed Dead Wood Carbon Stock Estimates in the U.S. Forest Inventory

    PubMed Central

    Domke, Grant M.; Woodall, Christopher W.; Walters, Brian F.; Smith, James E.

    2013-01-01

    The inventory and monitoring of coarse woody debris (CWD) carbon (C) stocks is an essential component of any comprehensive National Greenhouse Gas Inventory (NGHGI). Due to the expense and difficulty associated with conducting field inventories of CWD pools, CWD C stocks are often modeled as a function of more commonly measured stand attributes such as live tree C density. In order to assess potential benefits of adopting a field-based inventory of CWD C stocks in lieu of the current model-based approach, a national inventory of downed dead wood C across the U.S. was compared to estimates calculated from models associated with the U.S.’s NGHGI and used in the USDA Forest Service, Forest Inventory and Analysis program. The model-based population estimate of C stocks for CWD (i.e., pieces and slash piles) in the conterminous U.S. was 9 percent (145.1 Tg) greater than the field-based estimate. The relatively small absolute difference was driven by contrasting results for each CWD component. The model-based population estimate of C stocks from CWD pieces was 17 percent (230.3 Tg) greater than the field-based estimate, while the model-based estimate of C stocks from CWD slash piles was 27 percent (85.2 Tg) smaller than the field-based estimate. In general, models overestimated the C density per-unit-area from slash piles early in stand development and underestimated the C density from CWD pieces in young stands. This resulted in significant differences in CWD C stocks by region and ownership. The disparity in estimates across spatial scales illustrates the complexity in estimating CWD C in a NGHGI. Based on the results of this study, it is suggested that the U.S. adopt field-based estimates of CWD C stocks as a component of its NGHGI to both reduce the uncertainty within the inventory and improve the sensitivity to potential management and climate change events. PMID:23544112

  13. Underestimation of soil carbon stocks by Yasso07, Q, and CENTURY models in boreal forest linked to overlooking site fertility

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-04-01

    The soil organic carbon (SOC) stock changes estimated by most process-based soil carbon models (e.g. Yasso07, Q and CENTURY), which are needed for reporting changes in soil carbon amounts under the United Nations Framework Convention on Climate Change (UNFCCC) and for mitigating anthropogenic CO2 emissions through soil carbon management, can be biased if, across a large mosaic of environments, the models are missing a key factor driving SOC sequestration. To our knowledge, soil nutrient status has not been tested as a missing driver of these models in previous studies, although it is known that the models fail to reconstruct spatial variation and that soil nutrient status drives ecosystem carbon use efficiency and soil carbon sequestration. We evaluated the SOC stock estimates of the Yasso07, Q and CENTURY process-based models against field data from the Swedish Forest Soil National Inventories (3230 samples), organized by a recursive partitioning method (RPART) into distinct soil groups whose underlying SOC stock development is linked to physicochemical conditions. These models worked for most soils with approximately average SOC stocks, but could not reproduce the higher measured SOC stocks in our application. The Yasso07 and Q models, which used only climate and litterfall input data and ignored soil properties, generally agreed with two thirds of the measurements. However, when comparing with measurements grouped along the gradient of soil nutrient status, we found that the models underestimated SOC stocks for the Swedish boreal forest soils with higher site fertility. Accounting for soil texture (clay, silt, and sand content) and structure (bulk density) in the CENTURY model showed no improvement in carbon stock estimates, as CENTURY deviated in a similar manner. We highlight the mechanisms by which the models deviate from the measurements and ways of considering soil nutrient status in further model development. Our analysis suggested that the models indeed lack other predominant drivers of SOC stabilization

  14. Compression stockings

    MedlinePlus

    ... knee bend. Compression Stockings Can Be Hard to Put on If it's hard for you to put on the stockings, try these tips: Apply lotion ... your legs, but let it dry before you put on the stockings. Use a little baby powder ...

  15. A Comparison of Two Balance Calibration Model Building Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Ulbrich, Norbert

    2007-01-01

    Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.

  16. Parameters estimation in marine ecosystem models: limitations of typical standing stock observations

    NASA Astrophysics Data System (ADS)

    Loeptien, Ulrike; Dietze, Heiner

    2015-04-01

    Marine biogeochemical models coupled to 3-dimensional numerical models of the climate system have matured into general tools employed to assess impacts of a warming world and to explore geo-engineering options. Typically, the nucleus of these biogeochemical modules is based on a set of partial differential equations which describe the interaction between prognostic variables such as nutrients, phytoplankton, zooplankton, and sinking detritus. The dynamics of those differential equations is governed by a set of parameters such as, e.g., the maximum growth rate of phytoplankton. These parameters are, per se, not known. A generic way to estimate these parameters in 3-dimensional (and computationally expensive) frameworks is trial-and-error exercises where parameters are changed until a "reasonable" similarity with observed standing stocks of prognostic variables is achieved. Based on recent advances in computing hardware, offline techniques and optimization, the development of more objective approaches to estimating parameters is underway. Here we add to the ongoing development by exploring with twin experiments (i.e. synthetic "observations") the demands on observations that would allow for a more objective estimation of model parameters. We start with parameter retrieval experiments based on "perfect" (synthetic) observations of standing stocks which we, step by step, distort to approach realistic conditions, and confirm our findings with real-world observations. We illustrate that even the modest noise (10%) inherent in observations can already forestall parameter retrieval and that, e.g., the parameters of the hyperbolic Michaelis-Menten (MM) formulation (commonly used to describe nutrient and light limitation of phytoplankton growth) are particularly difficult to constrain.
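
    The twin-experiment argument can be reproduced in miniature: generate synthetic "observations" from a Michaelis-Menten limitation term with known parameters, add 10% noise, and refit; the spread of the recovered parameters over repeated trials indicates how weakly they are constrained. The values below are illustrative, not the study's configuration.

    ```python
    # Hedged twin-experiment sketch for the Michaelis-Menten term mu = mu_max*N/(K_N+N):
    # known "true" parameters generate observations, 10% noise is added, and the
    # parameters are re-estimated repeatedly to gauge how well they can be retrieved.
    import numpy as np
    from scipy.optimize import curve_fit

    def mm(N, mu_max, K_N):
        return mu_max * N / (K_N + N)

    rng = np.random.default_rng(4)
    N = np.linspace(0.05, 2.0, 40)             # nutrient concentration (hypothetical units)
    true = mm(N, 0.6, 0.3)

    estimates = []
    for _ in range(200):
        obs = true * (1 + 0.1 * rng.normal(size=N.size))          # 10% multiplicative noise
        popt, _ = curve_fit(mm, N, obs, p0=[0.5, 0.5], maxfev=5000)
        estimates.append(popt)

    est = np.array(estimates)
    print("mu_max: %.2f +/- %.2f" % (est[:, 0].mean(), est[:, 0].std()))
    print("K_N:    %.2f +/- %.2f" % (est[:, 1].mean(), est[:, 1].std()))
    ```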

  17. Rhode Island Model Evaluation & Support System: Building Administrator. Edition III

    ERIC Educational Resources Information Center

    Rhode Island Department of Education, 2015

    2015-01-01

    Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching, learning, and school leadership. The primary purpose of the Rhode Island Model Building Administrator Evaluation and Support System (Rhode Island Model) is to help all building administrators improve.…

  18. 40. Photocopy of building model photograph, ca. 1974, photographer unknown. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    40. Photocopy of building model photograph, ca. 1974, photographer unknown. Original photograph property of United States Air Force, 21st Space Command. CAPE COD AIR STATION PAVE PAWS FACILITY MODEL - ELEVATION SHOWING FLOOR AND EQUIPMENT LAYOUT. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA

  19. 39. Photocopy of building model photograph, ca. 1974, photographer unknown. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    39. Photocopy of building model photograph, ca. 1974, photographer unknown. Original photograph property of United States Air Force, 21st Space Command. CAPE COD AIR STATION PAVE PAWS FACILITY MODEL - SHOWING "A" AND "B" FACES. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA

  20. Digital Learning Material for Model Building in Molecular Biology

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Janssen, Fred; Hartog, Rob; Bisseling, Ton

    2005-01-01

    Building models to describe processes forms an essential part of molecular biology research. However, in molecular biology curricula little attention is generally being paid to the development of this skill. In order to provide students the opportunity to improve their model building skills, we decided to develop a number of digital cases about…

  2. Modelling stock order flows with non-homogeneous intensities from high-frequency data

    NASA Astrophysics Data System (ADS)

    Gorshenin, Andrey K.; Korolev, Victor Yu.; Zeifman, Alexander I.; Shorgin, Sergey Ya.; Chertok, Andrey V.; Evstafyev, Artem I.; Korchagin, Alexander Yu.

    2013-10-01

    A micro-scale model is proposed for the evolution of an information system such as the limit order book in financial markets. Within this model, the flows of orders (claims) are described by doubly stochastic Poisson processes taking account of the stochastic character of intensities of buy and sell orders that determine the price discovery mechanism. The proposed multiplicative model of stochastic intensities makes it possible to analyze the characteristics of the order flows as well as the instantaneous proportion of the forces of buyers and sellers, that is, the imbalance process, without modelling the external information background. The proposed model gives the opportunity to link the micro-scale (high-frequency) dynamics of the limit order book with the macro-scale models of stock price processes of the form of subordinated Wiener processes by means of limit theorems of probability theory and hence, to use the normal variance-mean mixture models of the corresponding heavy-tailed distributions. The approach can be useful in different areas with similar properties (e.g., in plasma physics).
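
    A minimal sketch (Python; the lognormal intensities, unit-time bins and all parameter values are assumptions, not the paper's specification) of simulating buy and sell order flows as doubly stochastic Poisson processes and computing the resulting imbalance:

    ```python
    # Minimal sketch: doubly stochastic Poisson (Cox) order flows and order imbalance.
    # Intensities are random per interval (here lognormal), standing in for a
    # stochastic information background; all parameter choices are illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    n_intervals = 1000

    # Stochastic intensities for buy and sell orders (orders per interval).
    lam_buy = rng.lognormal(mean=2.0, sigma=0.4, size=n_intervals)
    lam_sell = rng.lognormal(mean=2.0, sigma=0.4, size=n_intervals)

    # Conditional on the intensities, counts are Poisson (doubly stochastic Poisson).
    buys = rng.poisson(lam_buy)
    sells = rng.poisson(lam_sell)

    # Instantaneous imbalance between the "forces" of buyers and sellers.
    imbalance = (buys - sells) / np.maximum(buys + sells, 1)

    print("mean counts  buy %.1f  sell %.1f" % (buys.mean(), sells.mean()))
    print("imbalance: mean %.3f, std %.3f, excess kurtosis %.2f" %
          (imbalance.mean(), imbalance.std(),
           ((imbalance - imbalance.mean()) ** 4).mean() / imbalance.var() ** 2 - 3))
    ```

    Randomizing the intensities, rather than holding them fixed, is what produces the heavier-than-Poisson variability alluded to in the abstract.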

  3. Perception-based shape retrieval for 3D building models

    NASA Astrophysics Data System (ADS)

    Zhang, Man; Zhang, Liqiang; Takis Mathiopoulos, P.; Ding, Yusi; Wang, Hao

    2013-01-01

    With the help of 3D search engines, a large number of 3D building models can be retrieved freely online. A serious disadvantage of most rotation-insensitive shape descriptors is their inability to distinguish between two 3D building models which are different at their main axes, but appear similar when one of them is rotated. To resolve this problem, we present a novel upright-based normalization method which not only correctly rotates such building models, but also greatly simplifies and accelerates the abstraction and the matching of building models' shape descriptors. Moreover, the abundance of architectural styles significantly hinders the effective shape retrieval of building models. Our research has shown that buildings with different designs are not well distinguished by the widely recognized shape descriptors for general 3D models. Motivated by this observation and to further improve the shape retrieval quality, a new building matching method is introduced and analyzed based on concepts found in the field of perception theory and the well-known Light Field descriptor. The resulting normalized building models are first classified using the qualitative shape descriptors of Shell and Unevenness, which outline integral geometrical and topological information. These models are then arranged in an orderly fashion with the help of an improved quantitative shape descriptor, which we term the Horizontal Light Field Descriptor, since it assembles detailed shape characteristics. To accurately evaluate the proposed methodology, an enlarged building shape database which extends previous well-known shape benchmarks was implemented, as well as a model retrieval system supporting inputs from 2D sketches and 3D models. Various experimental performance evaluation results have shown that, as compared to previous methods, retrievals employing the proposed matching methodology are faster and more consistent with human recognition of spatial objects. In addition these performance…

  4. Distribution characteristics of stock market liquidity

    NASA Astrophysics Data System (ADS)

    Luo, Jiawen; Chen, Langnan; Liu, Hao

    2013-12-01

    We examine the distribution characteristics of stock market liquidity by employing the generalized additive model for location, scale and shape (GAMLSS) and three-minute frequency data from Chinese stock markets. We find that the BCPE distribution within the GAMLSS framework fits the distributions of stock market liquidity well, according to diagnostic tests. We also find that the stock market index exhibits a significant impact on the distributions of stock market liquidity. Stock market liquidity usually exhibits positive skewness, approximating a normal distribution at low levels of the stock market index and a high-peaked, fat-tailed shape at high levels of the index.

  5. Estimating shaking-induced casualties and building damage for global earthquake events: a proposed modelling approach

    USGS Publications Warehouse

    So, Emily; Spence, Robin

    2013-01-01

    Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake on 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high death tolls, casualties and numbers of survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths, with a further 11,000 people suffering serious or moderate injuries and 100,000 people left homeless in this mountainous region of China. In such events, relief efforts can benefit significantly from the availability of rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect of the very different types of buildings (by climatic zone, urban or rural location, culture, income level, etc.) on casualties. The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.
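
    The sketch below (Python) illustrates the general shape of such a semi-empirical calculation; the building classes, damage probabilities and fatality rates are invented placeholders, not values from CEQID.

    ```python
    # Illustrative semi-empirical casualty estimate:
    # expected deaths = sum over building classes of
    #   occupants x P(severe damage | intensity, class) x fatality rate(class).
    # All numbers below are placeholders, not CEQID values.
    building_stock = {                      # occupants exposed per class
        "unreinforced_masonry": 50_000,
        "reinforced_concrete":  80_000,
        "timber_frame":         30_000,
    }
    # P(collapse-grade damage) at a given shaking intensity, per class (assumed).
    damage_rate = {"unreinforced_masonry": 0.12,
                   "reinforced_concrete":  0.03,
                   "timber_frame":         0.01}
    # Fatality rate among occupants of collapsed buildings, per class (assumed).
    fatality_rate = {"unreinforced_masonry": 0.15,
                     "reinforced_concrete":  0.25,
                     "timber_frame":         0.05}

    expected_deaths = sum(building_stock[c] * damage_rate[c] * fatality_rate[c]
                          for c in building_stock)
    print("expected deaths:", round(expected_deaths))
    ```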

  6. Modeling forest carbon and nitrogen cycles based on long term carbon stock field measurement in the Delaware River Basin

    NASA Astrophysics Data System (ADS)

    Xu, B.; Pan, Y.; McCullough, K.; Plante, A. F.; Birdsey, R.

    2015-12-01

    Process-based models are a powerful approach to test our understanding of biogeochemical processes, to extrapolate ground survey data from limited plots to the landscape scale and to simulate the effects of climate change, nitrogen deposition, elevated atmospheric CO2, increasing natural disturbances and land use change on ecological processes. However, in most studies, the models are calibrated using ground measurements from only a few sites, though they may be extrapolated to much larger areas. Estimation accuracy can be improved if the models are parameterized using long-term carbon stock data from multiple sites representative of the simulated region. In this study, vegetation biomass and soil carbon stocks, and changes in these stocks over a recent decade, were measured in 61 forested plots located in three small watersheds in the Delaware River Basin (DRB). On average, total vegetation biomass was 160.2 Mg C ha-1 and the soil carbon stock was 76.6 Mg C ha-1, measured during 2012-2014. The biomass carbon stock increased by 2.45 Mg C ha-1 yr-1 from 2001-2003 to 2012-2014. This dataset was subsequently used to parameterize the PnET-CN model at the individual plot basis, and averaged parameters among plots were then applied to generate new watershed-scale model parameters for each of the three watersheds. The parameterized model was further validated by the field measurements in each of the major forest types. The spatial distribution of forest carbon pools and fluxes in three watersheds were mapped based on the simulation results from the newly parameterized PnET-CN model. The model will also be run under different scenarios to test the effects of climate change, altered atmospheric composition, land use change, and their interactions within the three watersheds and across the whole DRB.

  7. Controls of soil carbon stock development – comparison of Swedish forest soil carbon inventory measurements and two process based models

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Ortiz, Carina; Stendahl, Johan; Hashimoto, Shoji; Dahlgren, Jonas; Lehtonen, Aleksi

    2015-04-01

    The key question in greenhouse gas research, whether soils will continue to sequester carbon under the conditions of climate change, is mainly evaluated by process-based modelling. However, models based on the key processes of the carbon cycle ignore more complex environmental effects for the sake of simplicity. In our study, based on extensive measurements from the Swedish forest soil carbon inventory, we used recursive partitioning and boosted regression tree methods to identify the governing controls of soil carbon stocks, and for these controls we compared the measured carbon stocks with carbon estimates from the state-of-the-art models Yasso07 and CENTURY. The models were strongly vegetation and weather driven, whereas the soil carbon stocks of the measurements were controlled mainly by soil factors (e.g. cation exchange capacity, C/N ratio). Contrary to our expectation, the more complex CENTURY, which indirectly accounted for exchangeable cations by incorporating clay content into the model structure, still depended heavily on the amount of litter input and generally performed worse than the simpler Yasso07, which ignored the soil properties. When estimating carbon stocks for specific soil types and management, soil properties should be considered while keeping the plant- and weather-related processes and parameters at their calibrated optimum.
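
    A hedged sketch (Python/scikit-learn, with synthetic data standing in for the Swedish inventory and an invented relationship) of using boosted regression trees to rank candidate controls of soil carbon stocks:

    ```python
    # Sketch: rank candidate controls of soil C stocks with boosted regression trees.
    # Synthetic data only; in the study the predictors come from the Swedish
    # forest soil inventory (e.g. CEC, C/N ratio, climate, vegetation).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(7)
    n = 500
    X = np.column_stack([
        rng.uniform(2, 40, n),     # cation exchange capacity (assumed units)
        rng.uniform(10, 35, n),    # C/N ratio
        rng.uniform(-2, 8, n),     # mean annual temperature
        rng.uniform(400, 1200, n)  # annual precipitation
    ])
    # Assumed relationship: soil-factor driven, with weaker climate effects plus noise.
    y = 0.8 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * X[:, 2] + 0.005 * X[:, 3] + rng.normal(0, 5, n)

    model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
    model.fit(X, y)

    for name, importance in zip(["CEC", "C/N", "temperature", "precipitation"],
                                model.feature_importances_):
        print(f"{name:>13s}: {importance:.2f}")
    ```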

  8. Long-time fluctuations in a dynamical model of stock market indices.

    PubMed

    Biham, O; Huang, Z F; Malcai, O; Solomon, S

    2001-08-01

    Financial time series typically exhibit strong fluctuations that cannot be described by a Gaussian distribution. Recent empirical studies of stock market indices examined whether the distribution P(r) of returns r(τ) after some time τ can be described by a (truncated) Lévy-stable distribution Lα(r) with some index 0 < α ≤ 2. We present a dynamical model that describes the dynamics of stock market indices. For the distributions P(r) generated by this model, we observe that the scaling of the central peak is consistent with a Lévy distribution while the tails exhibit a power-law distribution with an exponent α > 2, namely, beyond the range of Lévy-stable distributions. Our results are in agreement with both empirical studies and reconcile the apparent disagreement between their results.

  9. An agent-based model of stock markets incorporating momentum investors

    NASA Astrophysics Data System (ADS)

    Wei, J. R.; Huang, J. P.; Hui, P. M.

    2013-06-01

    It has been widely accepted that there exist investors who adopt momentum strategies in real stock markets. Understanding the momentum behavior is of both academic and practical importance. For this purpose, we propose and study a simple agent-based model of trading incorporating momentum investors and random investors. The random investors trade randomly all the time. The momentum investors could be idle, buying or selling, and they decide on their action by implementing an action threshold that assesses the most recent price movement. The model is able to reproduce some of the stylized facts observed in real markets, including the fat-tails in returns, weak long-term correlation and scaling behavior in the kurtosis of returns. An analytic treatment of the model relates the model parameters to several quantities that can be extracted from real data sets. To illustrate how the model can be applied, we show that real market data can be used to constrain the model parameters, which in turn provide information on the behavior of momentum investors in different markets.
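
    A minimal, hedged sketch (Python; all parameters, thresholds and the holding rule are invented simplifications, not the paper's calibration) of a market with random traders and threshold-based momentum traders, reporting the excess kurtosis of the simulated returns:

    ```python
    # Minimal agent-based sketch: random traders plus momentum traders who buy (sell)
    # once the latest return exceeds (falls below) their personal action threshold.
    # All parameter values are illustrative, not taken from the paper.
    import numpy as np

    rng = np.random.default_rng(3)
    n_random, n_momentum, steps = 200, 100, 10_000
    liquidity = 5_000.0                                  # price impact scale
    thresholds = rng.uniform(0.001, 0.01, n_momentum)    # per-trader action thresholds
    holding = np.zeros(n_momentum, dtype=bool)           # True if a momentum trader is invested

    prices = [100.0, 100.0]
    for _ in range(steps):
        last_return = np.log(prices[-1] / prices[-2])
        random_orders = rng.choice([-1, 1], size=n_random).sum()
        buyers = (~holding) & (last_return > thresholds)     # idle traders entering
        sellers = holding & (last_return < -thresholds)      # invested traders exiting
        momentum_orders = int(buyers.sum()) - int(sellers.sum())
        holding[buyers], holding[sellers] = True, False
        excess_demand = random_orders + momentum_orders
        prices.append(prices[-1] * np.exp(excess_demand / liquidity))

    returns = np.diff(np.log(prices))
    excess_kurtosis = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3
    print("return std %.4f, excess kurtosis %.2f (0 for a Gaussian)"
          % (returns.std(), excess_kurtosis))
    ```

    The finite-holdings rule is a crude stand-in for the idle/buying/selling states described in the abstract; it caps the positive feedback so that momentum activity appears as intermittent bursts rather than a runaway trend.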

  10. Volume of the steady-state space of financial flows in a monetary stock-flow-consistent model

    NASA Astrophysics Data System (ADS)

    Hazan, Aurélien

    2017-05-01

    We show that a steady-state stock-flow consistent macro-economic model can be represented as a Constraint Satisfaction Problem (CSP). The set of solutions is a polytope, whose volume depends on the constraints applied and reveals the potential fragility of the economic circuit, with no need to study the dynamics. Several methods to compute the volume are compared, both exact and approximate, inspired by operations research methods and the analysis of metabolic networks. We also introduce a random transaction matrix, and study the particular case of linear flows with respect to money stocks.
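
    A hedged sketch (Python) of the simplest hit-and-miss Monte Carlo idea for estimating the volume of a polytope of admissible flows defined by linear constraints; the constraints here are invented toy inequalities, not the model's accounting identities:

    ```python
    # Sketch: hit-and-miss Monte Carlo estimate of the volume of a polytope
    # {x : A x <= b, 0 <= x <= 1} of admissible flows. Toy constraints only.
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy linear constraints on three flows (e.g. budget-type inequalities).
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])
    b = np.array([1.2, 1.0, 1.5])

    n_samples = 1_000_000
    x = rng.uniform(0.0, 1.0, size=(n_samples, 3))        # sample the bounding unit cube
    inside = np.all(x @ A.T <= b, axis=1)

    box_volume = 1.0                                      # volume of the unit cube
    volume = box_volume * inside.mean()
    stderr = box_volume * inside.std() / np.sqrt(n_samples)
    print("estimated polytope volume: %.4f +/- %.4f" % (volume, stderr))
    ```

    Hit-and-miss sampling degrades quickly as the dimension grows, which is why the abstract points to exact and hit-and-run style methods from operations research and metabolic network analysis.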

  11. A new baseline of organic carbon stock in European agricultural soils using a modelling approach.

    PubMed

    Lugato, Emanuele; Panagos, Panos; Bampa, Francesca; Jones, Arwyn; Montanarella, Luca

    2014-01-01

    Proposed European policy in the agricultural sector will place higher emphasis on soil organic carbon (SOC), both as an indicator of soil quality and as a means to offset CO2 emissions through soil carbon (C) sequestration. Despite detailed national SOC data sets in several European Union (EU) Member States, a consistent C stock estimation at EU scale remains problematic. Data are often not directly comparable, different methods have been used to obtain values (e.g. sampling, laboratory analysis) and access may be restricted. Therefore, any evolution of EU policies on C accounting and sequestration may be constrained by a lack of an accurate SOC estimation and the availability of tools to carry out scenario analysis, especially for agricultural soils. In this context, a comprehensive model platform was established at a pan-European scale (EU + Serbia, Bosnia and Herzegovina, Croatia, Montenegro, Albania, Former Yugoslav Republic of Macedonia and Norway) using the agro-ecosystem SOC model CENTURY. Almost 164 000 combinations of soil-climate-land use were computed, including the main arable crops, orchards and pasture. The model was implemented with the main management practices (e.g. irrigation, mineral and organic fertilization, tillage) derived from official statistics. The model results were tested against inventories from the European Environment and Observation Network (EIONET) and approximately 20 000 soil samples from the 2009 LUCAS survey, a monitoring project aiming at producing the first coherent, comprehensive and harmonized top-soil data set of the EU based on harmonized sampling and analytical methods. The CENTURY model estimation of the current 0-30 cm SOC stock of agricultural soils was 17.63 Gt; the model uncertainty estimation was below 36% in half of the NUTS2 regions considered. The model predicted an overall increase of this pool according to different climate-emission scenarios up to 2100, with C loss in the south and east of the area

  12. Examining the impact of lahars on buildings using numerical modelling

    NASA Astrophysics Data System (ADS)

    Mead, Stuart R.; Magill, Christina; Lemiale, Vincent; Thouret, Jean-Claude; Prakash, Mahesh

    2017-05-01

    Lahars are volcanic flows containing a mixture of fluid and sediment which have the potential to cause significant damage to buildings, critical infrastructure and human life. The extent of this damage is controlled by properties of the lahar, location of elements at risk and susceptibility of these elements to the lahar. Here we focus on understanding lahar-induced building damage. Quantification of building damage can be difficult due to the complexity of lahar behaviour (hazard), varying number and type of buildings exposed to the lahar (exposure) and the uncertain susceptibility of buildings to lahar impacts (vulnerability). In this paper, we quantify and examine the importance of lahar hazard, exposure and vulnerability in determining building damage with reference to a case study in the city of Arequipa, Peru. Numerical modelling is used to investigate lahar properties that are important in determining the inundation area and forces applied to buildings. Building vulnerability is quantified through the development of critical depth-pressure curves based on the ultimate bending moment of masonry structures. In the case study area, results suggest that building strength plays a minor role in determining overall building losses in comparison to the effects of building exposure and hydraulic characteristics of the lahar.
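
    As a hedged illustration of a critical depth-pressure relationship, the sketch below (Python) treats a masonry wall panel as a vertical cantilever loaded by a hydrostatic-like lahar pressure and finds the inundation depth at which the base bending moment reaches an assumed ultimate moment; the geometry, flow density and capacity values are placeholders, not those used for Arequipa.

    ```python
    # Sketch: critical inundation depth from an assumed ultimate bending moment.
    # Wall treated as a vertical cantilever under a triangular (hydrostatic-like)
    # lahar pressure; base moment per unit width M(d) = rho * g * d**3 / 6.
    # All numbers are placeholders.
    g = 9.81                 # m s^-2
    rho_lahar = 1800.0       # kg m^-3, assumed bulk density of the flow
    M_ultimate = 15_000.0    # N m per metre of wall, assumed masonry capacity

    def base_moment(depth):
        """Overturning moment per unit wall width for inundation depth d (m)."""
        return rho_lahar * g * depth ** 3 / 6.0

    critical_depth = (6.0 * M_ultimate / (rho_lahar * g)) ** (1.0 / 3.0)

    for d in (0.5, 1.0, 1.5, 2.0):
        print("depth %.1f m: moment %6.0f N m/m (capacity %.0f N m/m)"
              % (d, base_moment(d), M_ultimate))
    print("critical inundation depth: %.2f m" % critical_depth)
    ```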

  13. Allometric models for predicting aboveground biomass and carbon stock of tropical perennial C4 grasses in Hawaii

    USDA-ARS?s Scientific Manuscript database

    Biomass represents a promising renewable energy opportunity that may provide a more sustainable alternative to the use of fossil resources by minimizing the net production of greenhouse gases. Yet, allometric models that allow the prediction of biomass, biomass carbon (C) and nitrogen (N) stocks rap...

  14. Modeling climate and fuel reduction impacts on mixed-conifer forest carbon stocks in the Sierra Nevada, California

    Treesearch

    Matthew D. Hurteau; Timothy A. Robards; Donald Stevens; David Saah; Malcolm North; George W. Koch

    2014-01-01

    Quantifying the impacts of changing climatic conditions on forest growth is integral to estimating future forest carbon balance. We used a growth-and-yield model, modified for climate sensitivity, to quantify the effects of altered climate on mixed-conifer forest growth in the Lake Tahoe Basin, California. Estimates of forest growth and live tree carbon stocks were...

  15. Modeling weather and stocking rate threshold effects on forage and steer production in northern mixed-grass prairie

    USDA-ARS?s Scientific Manuscript database

    Model evaluations of forage production and yearling steer weight gain (SWG) responses to stocking density (SD) and seasonal weather patterns are presented for semi-arid northern mixed-grass prairie. We used the improved Great Plains Framework for Agricultural Resource Management-Range (GPFARM-Range)...

  16. Measuring and modeling carbon stock change estimates for US forests and uncertainties from apparent inter-annual variability

    Treesearch

    James E. Smith; Linda S. Heath

    2015-01-01

    Our approach is based on a collection of models that convert or augment the USDA Forest Inventory and Analysis program survey data to estimate all forest carbon component stocks, including live and standing dead tree aboveground and belowground biomass, forest floor (litter), down deadwood, and soil organic carbon, for each inventory plot. The data, which include...

  17. Simulating tropical carbon stocks and fluxes in a changing world using an individual-based forest model.

    NASA Astrophysics Data System (ADS)

    Fischer, Rico; Huth, Andreas

    2014-05-01

    Large areas of tropical forests are disturbed due to climate change and human influence. Experts estimate that the last remaining rainforests could be destroyed in less than 100 years with strong consequences for both developing and industrial countries. Using a modelling approach we analyse how disturbances modify carbon stocks and carbon fluxes of African rainforests. In this study we use the process-based, individual-oriented forest model FORMIND. The main processes of this model are tree growth, mortality, regeneration and competition. The study regions are tropical rainforests in the Kilimanjaro region and Madagascar. Modelling above and below ground carbon stocks, we analyze the impact of disturbances and climate change on forest dynamics and forest carbon stocks. Droughts and fire events change the structure of tropical rainforests. Human influence like logging intensify this effect. With the presented results we could establish new allometric relationships between forest variables and above ground carbon stocks in tropical regions. Using remote sensing techniques, these relationships would offer the possibility for a global monitoring of the above ground carbon stored in the vegetation.

  18. Observations and modeling of aboveground tree carbon stocks and fluxes following a bark beetle outbreak in the western United States

    Treesearch

    Eric M. Pfeifer; Jeffrey A. Hicke; Arjan J.H. Meddens

    2011-01-01

    Bark beetle epidemics result in tree mortality across millions of hectares in North America. However, few studies have quantified impacts on carbon (C) cycling. In this study, we quantified the immediate response and subsequent trajectories of stand-level aboveground tree C stocks and fluxes using field measurements and modeling for a location in central Idaho, USA...

  19. Armagh Observatory - Historic Building Information Modelling for Virtual Learning in Building Conservation

    NASA Astrophysics Data System (ADS)

    Murphy, M.; Chenaux, A.; Keenaghan, G.; Gibson, V.; Butler, J.; Pybusr, C.

    2017-08-01

    In this paper the recording and design of a Virtual Reality Immersive Model of Armagh Observatory is presented, which will replicate the historic buildings and landscape with distant meridian markers and the position of its principal historic instruments within a model of the night sky showing the positions of bright stars. The virtual reality model can be used for educational purposes, allowing the instruments within the historic building model to be manipulated in 3D space to demonstrate how position measurements of stars were made in the 18th century. A description is given of current student and researcher activities concerning on-site recording and surveying and the virtual modelling of the buildings and landscape. This is followed by a design for a Virtual Reality Immersive Model of Armagh Observatory using game engines and virtual learning platforms and concepts.

  20. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    NASA Astrophysics Data System (ADS)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  1. Landscape scale estimation of soil carbon stock using 3D modelling.

    PubMed

    Veronesi, F; Corstanje, R; Mayr, T

    2014-07-15

    Soil C is the largest pool of carbon in the terrestrial biosphere, and yet the processes of C accumulation, transformation and loss are poorly accounted for. This, in part, is due to the fact that soil C is not uniformly distributed through the soil depth profile and most current landscape-level predictions of C do not adequately account for the vertical distribution of soil C. In this study, we apply a method based on simple soil-specific depth functions to map the soil C stock in three dimensions at landscape scale. We used soil C and bulk density data from the Soil Survey for England and Wales to map an area in the West Midlands region of approximately 13,948 km². We applied a method which describes the variation through the soil profile and interpolates this across the landscape using well established soil drivers such as relief, land cover and geology. The results indicate that this mapping method can effectively reproduce the observed variation in the sampled soil profiles. The mapping results were validated using cross-validation and an independent validation. The cross-validation resulted in an R² of 36% for soil C and 44% for BULKD. These results are generally in line with previous validated studies. In addition, an independent validation was undertaken, comparing the predictions against the National Soil Inventory (NSI) dataset. The majority of the residuals of this validation are within ±5% of soil C. This indicates a high level of accuracy in replicating topsoil values. In addition, the results were compared to a previous study estimating the carbon stock of the UK. We discuss the implications of our results within the context of soil C loss factors such as erosion and the impact on regional C process models. Copyright © 2014 Elsevier B.V. All rights reserved.
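
    A hedged sketch (Python; synthetic profile data and an assumed exponential depth function rather than the authors' soil-specific functions) of fitting a depth function to soil C observations and integrating it, with an assumed constant bulk density, to a stock:

    ```python
    # Sketch: fit an exponential depth function to soil C concentrations and
    # integrate (with an assumed constant bulk density) to a 0-150 cm stock.
    # Synthetic profile data; the study uses soil-specific depth functions.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.integrate import quad

    def c_profile(depth_cm, c0, k):
        """Soil C concentration (%) declining exponentially with depth (cm)."""
        return c0 * np.exp(-k * depth_cm)

    depths = np.array([5.0, 15.0, 30.0, 60.0, 100.0, 150.0])        # sample depths, cm
    carbon = np.array([4.1, 3.2, 2.0, 1.1, 0.6, 0.35])              # observed C, %

    (c0, k), _ = curve_fit(c_profile, depths, carbon, p0=[4.0, 0.02])

    bulk_density = 1.3       # g cm^-3, assumed constant with depth
    # Stock (kg C m^-2) = integral over depth (cm) of (C% / 100) * bulk density * 10.
    stock, _ = quad(lambda z: c_profile(z, c0, k) / 100.0 * bulk_density * 10.0, 0.0, 150.0)
    print("fitted c0 = %.2f %%, k = %.4f cm^-1, 0-150 cm stock ~ %.1f kg C m^-2"
          % (c0, k, stock))
    ```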

  2. A Multidisciplinary Model of Evaluation Capacity Building

    ERIC Educational Resources Information Center

    Preskill, Hallie; Boyle, Shanelle

    2008-01-01

    Evaluation capacity building (ECB) has become a hot topic of conversation, activity, and study within the evaluation field. Seeking to enhance stakeholders' understanding of evaluation concepts and practices, and in an effort to create evaluation cultures, organizations have been implementing a variety of strategies to help their members learn…

  3. Partnerships for Knowledge Building: An Emerging Model

    ERIC Educational Resources Information Center

    Laferriere, Therese; Montane, Mireia; Gros, Begona; Alvarez, Isabel; Bernaus, Merce; Breuleux, Alain; Allaire, Stephane; Hamel, Christine; Lamon, Mary

    2010-01-01

    Knowledge Building is approached in this study from an organizational perspective, with a focus on the nature of school-university-government partnerships to support research-based educational innovation. The paper starts with an overview of what is known about effective partnerships and elaborates a conceptual framework for Knowledge Building…

  4. Evolution and anti-evolution in a minimal stock market model

    NASA Astrophysics Data System (ADS)

    Rothenstein, R.; Pawelzik, K.

    2003-08-01

    We present a novel microscopic stock market model consisting of a large number of random agents modeling traders in a market. Each agent is characterized by a set of parameters that serve to make iterated predictions of two successive returns. The future price is determined according to the offer and the demand of all agents. The system evolves by redistributing the capital among the agents in each trading cycle. Without noise the dynamics of this system is nearly regular and thereby fails to reproduce the stochastic return fluctuations observed in real markets. However, when in each cycle a small amount of noise is introduced we find the typical features of real financial time series like fat-tails of the return distribution and large temporal correlations in the volatility without significant correlations in the price returns. Introducing the noise by an evolutionary process leads to different scalings of the return distributions that depend on the definition of fitness. Because our realistic model has only very few parameters and the results appear to be robust with respect to the noise level and the number of agents, we expect that our framework may serve as a new paradigm for modeling self-generated return fluctuations in markets.

  5. A continuous time delay-difference type model (CTDDM) applied to stock assessment of the southern Atlantic albacore Thunnus alalunga

    NASA Astrophysics Data System (ADS)

    Liao, Baochao; Liu, Qun; Zhang, Kui; Baset, Abdul; Memon, Aamir Mahmood; Memon, Khadim Hussain; Han, Yanan

    2016-09-01

    A continuous time delay-difference model (CTDDM) has been established that considers continuous time delays of biological processes. The southern Atlantic albacore (Thunnus alalunga) stock is one of the commercially important tuna populations in the marine world. The age-structured production model (ASPM) and the surplus production model (SPM) have already been used to assess the albacore stock. However, the ASPM requires detailed biological information and the SPM lacks biological realism. In this study, we focus on applying a CTDDM to the southern Atlantic albacore (T. alalunga) stock, which provides an alternative method to assess this fishery. It is the first time that a CTDDM has been applied to assessing the Atlantic albacore (T. alalunga) fishery. The CTDDM obtained an 80% confidence interval for MSY (maximum sustainable yield) of (21 510 t, 23 118 t). The catch in 2011 (24 100 t) is higher than the MSY values and the relative fishing mortality ratio (F2011/FMSY) is higher than 1.0. The results of the CTDDM were analyzed to verify the proposed methodology and provide reference information for the sustainable management of the southern Atlantic albacore stock. The CTDDM treats the recruitment, growth, and mortality rates as all varying continuously over time, and fills the gap between the ASPM and the SPM in this stock assessment.
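
    The CTDDM itself is not reproduced here; as a simpler point of reference, the sketch below (Python, with invented parameter values) projects the classical Schaefer surplus-production dynamics on which the SPM family mentioned above is built, and reports the corresponding MSY reference points.

    ```python
    # Sketch of Schaefer surplus-production dynamics (the SPM family referred to above),
    # not the continuous time delay-difference model itself. Parameters are invented.
    import numpy as np

    r, K = 0.35, 250_000.0           # intrinsic growth rate (yr^-1), carrying capacity (t)
    msy = r * K / 4.0                # Schaefer reference points
    f_msy = r / 2.0
    b_msy = K / 2.0

    def project(b0, catches):
        """Project biomass forward: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
        biomass = [b0]
        for c in catches:
            b = biomass[-1]
            biomass.append(max(b + r * b * (1.0 - b / K) - c, 1.0))
        return np.array(biomass)

    catches = np.full(20, 24_100.0)          # constant catch held above this toy MSY
    trajectory = project(0.6 * K, catches)
    print("MSY %.0f t, B_MSY %.0f t, F_MSY %.2f" % (msy, b_msy, f_msy))
    print("biomass after 20 years of catch > MSY: %.0f t (start %.0f t)"
          % (trajectory[-1], trajectory[0]))
    ```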

  6. Fast, Automated, Photo-realistic, 3D Modeling of Building Interiors

    DTIC Science & Technology

    2016-09-12

    …important for applications in augmented and virtual reality, indoor navigation, and building simulation software. This paper presents a method to… navigation, augmented and virtual reality, as well as building energy simulation software. These applications require watertight models with limited… preservation, entertainment, and augmented reality, the demand for both fast and accurate scanning technologies has dramatically increased. In this…

  7. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires are hazardous events that can lead to disaster and massive destruction. The management and disposal of building fires has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) were implemented, and the relationships between the elements were also discussed. Finally, with the theory and framework of VGE, the technology of a building fire scene system with VGE was designed within the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  8. Making Connections to the "Real World": A Model Building Lesson

    ERIC Educational Resources Information Center

    Horibe, Shusaku; Underwood, Bret

    2009-01-01

    Classroom activities that include the process of model building, in which students build simplified physical representations of a system, have the potential to help students make meaningful connections between physics and the real world. We describe a lesson designed with this intent for an introductory college classroom that engages students in…

  9. Climate change impacts on carbon stocks of Mediterranean soils: a CarboSOIL model application

    NASA Astrophysics Data System (ADS)

    Muñoz-Rojas, Miriam; Jordán, Antonio; Zavala, Lorena M.; de la Rosa, Diego; González-Peñaloza, Félix A.; Kotb Abd-Elmabod, Sameh; Anaya-Romero, María

    2013-04-01

    The Mediterranean area is among the regions most sensitive to climate change, and large increases in temperature as well as drought periods and heavy rainfall events have been forecast for the next decades. Soil organic C (SOC) protects against soil erosion and desertification and enhances biodiversity. Therefore, soil C accumulation capacity should be considered in adaptation strategies to climate change, in view of the high resilience of soils with an adequate level of organic C to a warming, drying climate. In this research we propose a new methodology to predict SOC contents and changes under different climate change scenarios: the CarboSOIL model. CarboSOIL is part of the land evaluation decision support system MicroLEIS DSS and was designed as a GIS tool to predict SOC stored at different depths (0-25, 25-50, 50-75 and 0-75 cm). The model includes site, land use, climate and soil variables, and was trained and validated in two Mediterranean areas (Andalusia, S Spain, and Valencia, E Spain, respectively) and applied to different IPCC scenarios (A1B, A2 and B1) according to different Global Climate Models (BCCR-BCM2, CNRMCM3 and ECHAM5) downscaled for the region of Andalusia. Output data were linked to spatial datasets (soil and land use) and spatial analysis was performed to quantify organic C stocks for different soil types under a range of land uses. Results highlight the negative impact of climate change on SOC. In particular, SOC contents are expected to decrease severely in the medium-high emissions A2 scenario by 2100. There is an overall trend towards decreasing organic C stocks in the upper soil sections (0-25 cm and 25-50 cm) of most soil types. In Regosols under "open spaces", 80.4% of the current SOC is predicted to be lost by 2100 under the A2 scenario. CarboSOIL has proved its ability to predict the short-, intermediate- and long-term trends (2040s, 2070s and 2100s) of SOC dynamics and sequestration under projected future scenarios of

  10. Building Component Library: An Online Repository to Facilitate Building Energy Model Creation; Preprint

    SciTech Connect

    Fleming, K.; Long, N.; Swindler, A.

    2012-05-01

    This paper describes the Building Component Library (BCL), the U.S. Department of Energy's (DOE) online repository of building components that can be directly used to create energy models. This comprehensive, searchable library consists of components and measures as well as the metadata which describes them. The library is also designed to allow contributors to easily add new components, providing a continuously growing, standardized list of components for users to draw upon.

  11. Models for estimating organic emissions from building materials: Formaldehyde example

    NASA Astrophysics Data System (ADS)

    Hawthorne, Alan R.; Matthews, Thomas G.

    One important source of chronic exposure to low levels of organic compounds in the indoor environment is emissions from building materials. Because removal of offending products may be costly or otherwise impractical, it is important that the emissions of organic pollutants be understood prior to incorporation of these materials into buildings. Once the organic pollutants of concern are identified, based on potential health effects and emission potential from the building material, it is necessary that an emission model be developed to predict the behavior of emission rates under various indoor conditions. Examples of the type of requirements that must be addressed in developing models for estimating organic emissions from building materials into the indoor environment are presented. Important factors include the products' characteristic source strengths at standard test conditions, impact of variations in environmental conditions (such as temperature and humidity), concentrations of the modeled organic pollutants in indoor environments and product ages. Ideally, emission models should have physical/chemical bases so that the important physical factors can be identified and their relative importance quantified. Although a universal model describing organic emissions from all building materials may not be feasible due to the tremendous variety of organic products and building materials in use, the most studied of the volatile organic compounds from building materials, formaldehyde, is used to illustrate an approach to the development of a specific model for organic emissions.
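
    A hedged sketch (Python, placeholder values throughout) of the kind of calculation such emission models feed: a single-zone steady-state mass balance giving the indoor concentration from an area-specific emission rate, material loading and air exchange, with a simple first-order decline of the source strength with product age.

    ```python
    # Sketch: single-zone steady-state mass balance for an emitted organic compound.
    # C_ss = (E * A) / (N * V), with the emission factor E decaying with product age.
    # All values are placeholders, not measured formaldehyde data.
    import math

    volume = 300.0            # m^3, room/house volume
    air_exchange = 0.5        # h^-1, air changes per hour (N)
    panel_area = 100.0        # m^2 of emitting material (loading, A)
    e0 = 0.12                 # mg m^-2 h^-1, assumed initial emission factor
    half_life_months = 36.0   # assumed emission half-life of the product

    def emission_factor(age_months):
        """First-order decline of the area-specific emission rate with product age."""
        return e0 * math.exp(-math.log(2.0) * age_months / half_life_months)

    def steady_state_concentration(age_months):
        """Indoor concentration in mg m^-3 for a given product age."""
        return emission_factor(age_months) * panel_area / (air_exchange * volume)

    for age in (0, 12, 36, 72):
        print("age %3d months: %.3f mg m^-3" % (age, steady_state_concentration(age)))
    ```

    Temperature and humidity corrections of the kind described in the abstract would enter as multiplicative adjustments to the emission factor; they are omitted here.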

  12. 3D Topological Indoor Building Modeling Integrated with Open Street Map

    NASA Astrophysics Data System (ADS)

    Jamali, A.; Rahman, A. Abdul; Boguslawski, P.

    2016-09-01

    Considering the various fields of application for building surveying and their various demands, the geometric representation of a building is the most crucial aspect of a building survey. The interiors of buildings need to be described along with the relative locations of the rooms, corridors, doors and exits in many kinds of emergency response, such as fire, bombs, smoke, and pollution. Topological representation is a challenging task within the Geographic Information Science (GIS) environment, as the data structures required to express these relationships are particularly difficult to develop. Even within the Computer Aided Design (CAD) community, the structures for expressing the relationships between adjacent building parts are complex and often incomplete. In this paper, an integration of 3D topological indoor building modeling in the Dual Half Edge (DHE) data structure and an outdoor navigation network from Open Street Map (OSM) is presented.

  13. Implementation and evaluation of a web based system for pharmacy stock management in rural Haiti.

    PubMed

    Berger, Elisabeth J; Jazayeri, Darius; Sauveur, Marcel; Manasse, Jean Joel; Plancher, Inel; Fiefe, Marquise; Laurat, Guerline; Joseph, Samahel; Kempton, Kathryn; Fraser, Hamish S F

    2007-10-11

    Managing the stock and supply of medication is essential for the provision of health care, especially in resource poor areas of the world. We have developed an innovative, web-based stock management system to support nine clinics in rural Haiti. Building on our experience with a web-based EMR system for our HIV patients, we developed a comprehensive stock tracking system that is modeled on the appearance of standardized WHO stock cards. The system allows pharmacy staff at all clinics to enter stock levels and also to request drugs and track shipments. Use of the system over the last 2 years has increased rapidly and we now track 450 products supporting care for 1.78 million patient visits annually. Over the last year drug stockouts have fallen from 2.6% to 1.1% and 97% of stock requests delivered were shipped within 1 day. We are now setting up this system in our clinics in rural Rwanda.

  14. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.; McCarthy, S.; Brechin, F.; Casidy, C.; Dirix, E.

    2015-02-01

    In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The Historic Building Information Model (HBIM) forms the basis for both structural and conservation analysis to measure the impact of war damage which still affects the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created and is presented as two separate workflows in this paper. The first is the model created from historic data, the second a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  15. Modelling inspection policies for building maintenance.

    PubMed

    Christer, A H

    1982-08-01

    A method of assessing the potential of an inspection maintenance policy as opposed to an existing breakdown maintenance policy for a building complex is developed. The method is based upon information likely to be available and specific subjective assessments which could be made available. Estimates of the expected number of defects identified at an inspection and the consequential cost saving are presented as functions of the inspection frequency.
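
    A hedged sketch (Python, with invented rates and costs) in the spirit of such inspection models: defects arise at a constant rate, each becomes a breakdown if its delay time elapses before the next inspection and is otherwise found and repaired at the inspection, and the expected cost per unit time is compared across inspection intervals.

    ```python
    # Sketch of a delay-time style inspection model (invented numbers).
    # Defects arise at rate k; a defect becomes a breakdown if its "delay time"
    # (defect -> failure) elapses before the next inspection, otherwise it is
    # found and repaired at the inspection. Compare cost rates over intervals T.
    import numpy as np

    k = 0.8                   # defects per week arising in the building complex
    mean_delay = 6.0          # weeks, mean delay time (assumed exponential)
    c_inspect, c_repair, c_breakdown = 200.0, 50.0, 400.0   # costs per event

    def breakdown_fraction(T, n=2000):
        """Fraction of defects arising in (0, T) that fail before the inspection at T."""
        u = np.linspace(0.0, T, n)                 # defect arrival times
        F = 1.0 - np.exp(-(T - u) / mean_delay)    # P(delay < time remaining)
        return np.trapz(F, u) / T

    def cost_rate(T):
        b = breakdown_fraction(T)
        defects = k * T
        return (c_inspect + defects * (b * c_breakdown + (1.0 - b) * c_repair)) / T

    for T in (1, 2, 4, 8, 13, 26, 52):
        print("T = %2d weeks: breakdown fraction %.2f, cost %.1f per week"
              % (T, breakdown_fraction(T), cost_rate(T)))
    ```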

  16. Stock enhancement to address multiple recreational fisheries objectives: an integrated model applied to red drum Sciaenops ocellatus in Florida.

    PubMed

    Camp, E V; Lorenzen, K; Ahrens, R N M; Allen, M S

    2014-12-01

    An integrated socioecological model was developed to evaluate the potential for stock enhancement with hatchery fishes to achieve socioeconomic and conservation objectives in recreational fisheries. As a case study, this model was applied to the red drum Sciaenops ocellatus recreational fishery in the Tampa Bay estuary, Florida, U.S.A. The results suggest that stocking of juvenile fish larger than the size at which the strongest density dependence in mortality occurs can help increase angler satisfaction and total fishing effort (socioeconomic objectives) but are likely to result in decreases to the abundance of wild fishes (a conservation objective). Stocking of small juveniles that are susceptible to density-dependent mortality after release does not achieve socioeconomic objectives (or only at excessive cost) but still leads to a reduction of wild fish abundance. The intensity and type of socioeconomic gains depended on assumptions of dynamic angler-effort responses and importance of catch-related satisfaction, with greatest gains possible if aggregate effort is responsive to increases in abundance and satisfaction that are greatly related to catch rates. These results emphasize the view of stock enhancement, not as a panacea but rather as a management tool with inherent costs that is best applied to recreational fisheries under certain conditions. © 2014 The Fisheries Society of the British Isles.

  17. Building energy modeling for green architecture and intelligent dashboard applications

    NASA Astrophysics Data System (ADS)

    DeBlois, Justin

    Buildings are responsible for 40% of the carbon emissions in the United States. Energy efficiency in this sector is key to reducing overall greenhouse gas emissions. This work studied the passive technique called the roof solar chimney for reducing the cooling load in homes architecturally. Three models of the chimney were created: a zonal building energy model, computational fluid dynamics model, and numerical analytic model. The study estimated the error introduced to the building energy model (BEM) through key assumptions, and then used a sensitivity analysis to examine the impact on the model outputs. The conclusion was that the error in the building energy model is small enough to use it for building simulation reliably. Further studies simulated the roof solar chimney in a whole building, integrated into one side of the roof. Comparisons were made between high and low efficiency constructions, and three ventilation strategies. The results showed that in four US climates, the roof solar chimney results in significant cooling load energy savings of up to 90%. After developing this new method for the small scale representation of a passive architecture technique in BEM, the study expanded the scope to address a fundamental issue in modeling - the implementation of the uncertainty from and improvement of occupant behavior. This is believed to be one of the weakest links in both accurate modeling and proper, energy efficient building operation. A calibrated model of the Mascaro Center for Sustainable Innovation's LEED Gold, 3,400 m2 building was created. Then algorithms were developed for integration to the building's dashboard application that show the occupant the energy savings for a variety of behaviors in real time. An approach using neural networks to act on real-time building automation system data was found to be the most accurate and efficient way to predict the current energy savings for each scenario. A stochastic study examined the impact of the

  18. Jeddah Historical Building Information Modelling "JHBIM" - Object Library

    NASA Astrophysics Data System (ADS)

    Baik, A.; Alitany, A.; Boehm, J.; Robson, S.

    2014-05-01

    Building Information Modelling (BIM) has been used at several heritage sites worldwide for conserving, documenting, managing, and creating full engineering drawings and information. However, one of the most serious issues facing experts who wish to use Historic Building Information Modelling (HBIM) is modelling the complicated architectural elements of these historical buildings. In fact, many of these outstanding architectural elements were designed and created on site to fit their exact location. This issue has likewise faced the experts in Old Jeddah seeking to use the BIM method for Old Jeddah's historical buildings. The Saudi Arabian city has a long history and contains a large number of historic houses and buildings built since the 16th century. Furthermore, creating the BIM model of a historical building in Old Jeddah always takes a long time, due to the uniqueness of the Hijazi architectural elements and the absence of a library of such elements, which therefore have to be modelled individually. This paper focuses on building a library of Hijazi architectural elements based on laser scanner and image survey data. This solution will reduce the time needed to complete the HBIM model and offer an in-depth and rich digital library of architectural elements to be used in any heritage project in the Al-Balad district of Jeddah City.

  19. Artificial intelligence support for scientific model-building

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.

  20. Estimation of a simple agent-based model of financial markets: An application to Australian stock and foreign exchange data

    NASA Astrophysics Data System (ADS)

    Alfarano, Simone; Lux, Thomas; Wagner, Friedrich

    2006-10-01

    Following Alfarano et al. [Estimation of agent-based models: the case of an asymmetric herding model, Comput. Econ. 26 (2005) 19-49; Excess volatility and herding in an artificial financial market: analytical approach and estimation, in: W. Franz, H. Ramser, M. Stadler (Eds.), Funktionsfähigkeit und Stabilität von Finanzmärkten, Mohr Siebeck, Tübingen, 2005, pp. 241-254], we consider a simple agent-based model of a highly stylized financial market. The model takes Kirman's ant process [A. Kirman, Epidemics of opinion and speculative bubbles in financial markets, in: M.P. Taylor (Ed.), Money and Financial Markets, Blackwell, Cambridge, 1991, pp. 354-368; A. Kirman, Ants, rationality, and recruitment, Q. J. Econ. 108 (1993) 137-156] of mimetic contagion as its starting point, but allows for asymmetry in the attractiveness of both groups. Embedding the contagion process into a standard asset-pricing framework, and identifying the abstract groups of the herding model as chartists and fundamentalist traders, a market with periodic bubbles and bursts is obtained. Taking stock of the availability of a closed-form solution for the stationary distribution of returns for this model, we can estimate its parameters via maximum likelihood. Expanding our earlier work, this paper presents pertinent estimates for the Australian dollar/US dollar exchange rate and the Australian stock market index. As it turns out, our model indicates dominance of fundamentalist behavior in both the stock and foreign exchange market.

  1. Integration of Building Knowledge Into Binary Space Partitioning for the Reconstruction of Regularized Building Models

    NASA Astrophysics Data System (ADS)

    Wichmann, A.; Jung, J.; Sohn, G.; Kada, M.; Ehlers, M.

    2015-09-01

    Recent approaches for the automatic reconstruction of 3D building models from airborne point cloud data integrate prior knowledge of roof shapes with the intention to improve the regularization of the resulting models without lessening the flexibility to generate all real-world occurring roof shapes. In this paper, we present a method to integrate building knowledge into the data-driven approach that uses binary space partitioning (BSP) for modeling the 3D building geometry. A retrospective regularization of polygons that emerge from the BSP tree is not without difficulty because it has to deal with the 2D BSP subdivision itself and the plane definitions of the resulting partition regions to ensure topological correctness. This is aggravated by the use of hyperplanes during the binary subdivision that often splits planar roof regions into several parts that are stored in different subtrees of the BSP tree. We therefore introduce the use of hyperpolylines in the generation of the BSP tree to avoid unnecessary spatial subdivisions, so that the spatial integrity of planar roof regions is better maintained. The hyperpolylines are shown to result from basic building roof knowledge that is extracted based on roof topology graphs. An adjustment of the underlying point segments ensures that the positions of the extracted hyperpolylines result in regularized 2D partitions as well as topologically correct 3D building models. The validity and limitations of the approach are demonstrated on real-world examples.

  2. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  3. A Tutorial for Building CMMI Process Performance Models

    DTIC Science & Technology

    2010-04-26

    Tutorial presented by Robert Stoddard and Dave Zubrow (Software Engineering Institute, Carnegie Mellon University) at SSTC 2010 on building CMMI process performance models. Topics include process simulation and other advanced modeling techniques such as Markov models, Petri nets, neural networks, and systems dynamics.

  4. Estimating national forest carbon stocks and dynamics: combining models and remotely sensed information

    NASA Astrophysics Data System (ADS)

    Smallman, Thomas Luke; Exbrayat, Jean-François; Bloom, Anthony; Williams, Mathew

    2017-04-01

    Forests are a critical component of the global carbon cycle, storing significant amounts of carbon, split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle: it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. It has been a major challenge to generate robust carbon budgets across landscapes due to data scarcity. Models have been used for estimating carbon budgets, but outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis-Hastings Markov Chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, in addition to forest planting information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information, with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 Mg C ha-1 yr-1, with a 95% confidence interval between -4.0 and -3.1 Mg C ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 Mg C ha-1 yr-1. The analysis estimate of total forest biomass carbon stock in 2010 is 229 (177/232) TgC, while the NFI gives an estimated total forest biomass carbon stock of 216 TgC. Leaf carbon per area (LCA) is a key plant trait which we are able to estimate using our analysis. Comparison of median LCA estimates retrieved from the analysis with a UK land cover map shows that higher and lower values of LCA are estimated in areas dominated by needle-leaf and broadleaf forests respectively, consistent with
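
    A heavily simplified, hedged sketch (Python) of the MH-MCMC idea: a one-pool carbon model with an unknown turnover rate is calibrated against noisy synthetic stock observations. The real analysis assimilates MODIS LAI, biomass and inventory data into the DALEC model; everything below is an invented toy problem.

    ```python
    # Minimal Metropolis-Hastings sketch: estimate the turnover rate of a one-pool
    # carbon model dC/dt = input - k*C from noisy synthetic stock observations.
    # Illustrative only; the study calibrates the full DALEC model.
    import numpy as np

    rng = np.random.default_rng(11)
    years = np.arange(0, 11)
    litter_input, k_true, c0 = 4.0, 0.05, 60.0           # MgC/ha/yr, yr^-1, MgC/ha

    def simulate(k):
        c, out = c0, []
        for _ in years:
            out.append(c)
            c = c + litter_input - k * c                 # annual explicit step
        return np.array(out)

    obs = simulate(k_true) + rng.normal(0.0, 2.0, years.size)   # noisy "observations"
    obs_sigma = 2.0

    def log_likelihood(k):
        if k <= 0.0 or k >= 1.0:                         # flat prior on (0, 1)
            return -np.inf
        resid = obs - simulate(k)
        return -0.5 * np.sum((resid / obs_sigma) ** 2)

    chain, k_cur, ll_cur = [], 0.1, log_likelihood(0.1)
    for _ in range(20000):
        k_prop = k_cur + rng.normal(0.0, 0.01)           # random-walk proposal
        ll_prop = log_likelihood(k_prop)
        if np.log(rng.uniform()) < ll_prop - ll_cur:     # Metropolis acceptance rule
            k_cur, ll_cur = k_prop, ll_prop
        chain.append(k_cur)

    burned = np.array(chain[5000:])
    print("true k = %.3f, posterior k = %.3f (95%% CI %.3f-%.3f)"
          % (k_true, burned.mean(), *np.percentile(burned, [2.5, 97.5])))
    ```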

  5. Modelling growth variability in longline mussel farms as a function of stocking density and farm design

    NASA Astrophysics Data System (ADS)

    Rosland, Rune; Bacher, Cédric; Strand, Øivind; Aure, Jan; Strohmeier, Tore

    2011-11-01

    Mussels (Mytilus edulis) are commonly cultivated on artificial structures like rafts, poles or longlines to facilitate farming operations. Farm structures and dense mussel populations may result in water flow reduction and seston depletion and thus reduced individual mussel growth and spatial growth variability inside a farm. One of the challenges in mussel farming is thus to scale and configure farms in order to optimise total mussel production and individual mussel quality under different environmental regimes. Here we present a spatially resolved model for simulation of flow reduction, seston depletion and individual mussel growth inside a longline farm based on information about farm configuration (spacing between longlines, farm length and stocking density) and background environmental conditions (current speed, seston concentration and temperature). The model simulations are forced by environmental data from two fjords in south-western Norway and the farm configurations are defined within operational ranges. The simulations demonstrate spatial growth patterns at longlines under environmental settings and farm configurations where flow reduction and seston depletion have significant impacts on individual mussel growth. Longline spacing has a strong impact on the spatial distribution of individual growth, and the spacing is characterised by a threshold value. Below the threshold, growth reduction and spatial growth variability increase rapidly as a consequence of reduced water flow and seston supply rate, but increased filtration due to higher mussel densities also contributes to the growth reduction. The spacing threshold is moderated by other farm configuration factors and environmental conditions. Comparisons with seston depletion reported from other farm sites show that the model simulations are within observed ranges. A demonstration is provided on how the model can guide farm configuration with the aim of optimising total farm biomass and individual
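
    A toy sketch of the depletion part of such a model: seston concentration is reduced by a fraction at each successive longline, with the removed fraction depending on an aggregate clearance parameter, line spacing and current speed. The functional form and all parameter names are illustrative assumptions, not the published model.

```python
import numpy as np

def seston_along_farm(c_in, u, spacing, n_lines, clearance_per_line):
    """Toy sketch: seston concentration after each successive longline.

    c_in               background seston concentration (mg/L)
    u                  current speed through the farm (m/s)
    spacing            distance between longlines (m)
    n_lines            number of longlines crossed by the flow
    clearance_per_line aggregate clearance per metre of flow path (1/m), an
                       illustrative lumping of stocking density and filtration
    """
    c = np.empty(n_lines + 1)
    c[0] = c_in
    for i in range(n_lines):
        # fraction of particles removed while the water crosses one spacing;
        # the seston supply rate scales with the current speed u
        removed = 1.0 - np.exp(-clearance_per_line * spacing / max(u, 1e-6))
        c[i + 1] = c[i] * (1.0 - removed)
    return c

profile = seston_along_farm(c_in=1.0, u=0.05, spacing=10.0,
                            n_lines=20, clearance_per_line=0.002)
print("relative seston at the last longline:", round(profile[-1], 3))
```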

  6. On the probability distribution of stock returns in the Mike-Farmer model

    NASA Astrophysics Data System (ADS)

    Gu, G.-F.; Zhou, W.-X.

    2009-02-01

    Recently, Mike and Farmer have constructed a very powerful and realistic behavioral model to mimic the dynamic process of stock price formation based on the empirical regularities of order placement and cancelation in a purely order-driven market, which can successfully reproduce the whole distribution of returns, not only the well-known power-law tails but also several other important stylized facts. There are three key ingredients in the Mike-Farmer (MF) model: the long memory of order signs characterized by the Hurst index Hs, the distribution of relative order prices x in reference to the same best price, described by a Student distribution (or Tsallis’ q-Gaussian), and the dynamics of order cancelation. They showed that different values of the Hurst index Hs and the degrees of freedom αx of the Student distribution can always produce power-law tails in the return distribution fr(r) with different tail exponents αr. In this paper, we study the origin of the power-law tails of the return distribution fr(r) in the MF model, based on extensive simulations with different combinations of the left part L(x) for x < 0 and the right part R(x) for x > 0 of fx(x). We find that power-law tails appear only when L(x) has a power-law tail, regardless of whether R(x) has a power-law tail or not. In addition, we find that the distributions of returns in the MF model at different timescales can be well modeled by Student distributions, whose tail exponents are close to the well-known cubic law and increase with the timescale.
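
    The tail-exponent analysis can be sketched with a Hill estimator applied to simulated Student-t returns; the t distribution with three degrees of freedom stands in for the cubic-law tails discussed above, and the sample size and cutoff k are arbitrary choices, not values from the MF model.

```python
import numpy as np

def hill_tail_exponent(returns, k=200):
    """Hill estimator of the tail exponent from the k largest absolute returns."""
    x = np.sort(np.abs(returns))[::-1][:k + 1]
    return k / np.sum(np.log(x[:k] / x[k]))

rng = np.random.default_rng(0)
# Student-t returns with 3 degrees of freedom, mimicking cubic-law tails
returns = rng.standard_t(df=3, size=100_000)
print("estimated tail exponent:", round(hill_tail_exponent(returns), 2))
```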

  7. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
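
    Fragility functions conditioned on spectral acceleration are conventionally written as lognormal CDFs, and a vulnerability curve follows by weighting damage-state probabilities with loss ratios. The sketch below uses that standard form with invented medians, dispersions and loss ratios, not the HAZUS-derived values of the paper.

```python
import numpy as np
from scipy.stats import norm

def fragility(sa, median, beta):
    """P(damage state >= ds | Sa): lognormal fragility curve.
    median = Sa at 50% exceedance probability (g); beta = lognormal dispersion."""
    return norm.cdf(np.log(sa / median) / beta)

def vulnerability(sa, medians, betas, loss_ratios):
    """Expected loss ratio at a given Sa, combining mutually exclusive
    damage-state probabilities (illustrative damage-state parameters)."""
    p_exceed = np.array([fragility(sa, m, b) for m, b in zip(medians, betas)])
    p_state = np.append(p_exceed, 0.0)
    p_in_state = p_state[:-1] - p_state[1:]          # P(exactly in state i)
    return float(np.dot(p_in_state, loss_ratios))

sa = 0.4  # spectral acceleration in g
print(vulnerability(sa, medians=[0.2, 0.4, 0.8, 1.6],
                    betas=[0.6, 0.6, 0.6, 0.6],
                    loss_ratios=[0.02, 0.10, 0.45, 1.00]))
```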

  8. Knowledge-based model building of proteins: concepts and examples.

    PubMed Central

    Bajorath, J.; Stenkamp, R.; Aruffo, A.

    1993-01-01

    We describe how to build protein models from structural templates. Methods to identify structural similarities between proteins in cases of significant, moderate to low, or virtually absent sequence similarity are discussed. The detection and evaluation of structural relationships is emphasized as a central aspect of protein modeling, distinct from the more technical aspects of model building. Computational techniques to generate and complement comparative protein models are also reviewed. Two examples, P-selectin and gp39, are presented to illustrate the derivation of protein model structures and their use in experimental studies. PMID:7505680

  9. A stock-and-flow simulation model of the US blood supply.

    PubMed

    Simonetti, Arianna; Forshee, Richard A; Anderson, Steven A; Walderhaug, Mark

    2014-03-01

    Lack of reporting requirements for the amount of blood stored in blood banks and hospitals poses challenges to effectively monitoring the US blood supply. Effective strategies to minimize collection and donation disruptions in the supply require an understanding of the daily amount of blood available in the system. A stock-and-flow simulation model of the US blood supply was developed to obtain estimates of the daily on-hand availability of blood, with uncertainty and by ABO/Rh type. The model simulated the potential impact on supply of using different blood management practices for transfusion: first in-first out (FIFO), using the oldest stored red blood cell units first; non-FIFO likely oldest, preferentially selecting older blood; and non-FIFO likely newest, preferentially selecting younger blood. Simulation results showed higher estimates of the steady-state blood supply level for FIFO (1,630,000 units, 95% prediction interval [PI] 1,610,000-1,650,000) than for the non-FIFO scenarios (likely oldest, 1,530,000 units, 95% PI 1,500,000-1,550,000; and likely newest, 1,190,000 units, 95% PI 1,160,000-1,220,000), both for overall blood and by blood type. To our knowledge, this model represents a first attempt to evaluate the impact of different blood management practices on the daily availability and distribution of blood in the US blood supply. The average storage time before blood is issued was influenced by blood management practices, both by preferences for younger blood and by the use of specific blood types. The model also suggests which practice could best approximate the current blood management system and may serve as a useful tool for blood management. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
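
    A minimal discrete-time stock-and-flow sketch of the issuing policies compared above: units are tracked by age, expired units are discarded, and demand is filled either oldest-first (FIFO) or newest-first. Daily collection, demand and shelf-life numbers are toy values, and ABO/Rh types are ignored.

```python
def simulate(days=365, daily_in=300, daily_out=290,
             shelf_life=42, newest_first=False):
    """Track stored red-cell units by age (days); return the mean on-hand
    stock and the mean age at issue under FIFO or newest-first issuing."""
    stock = []                       # list of unit ages in days
    on_hand, ages_issued = [], []
    for _ in range(days):
        stock = [a + 1 for a in stock if a + 1 <= shelf_life]   # age & expire
        stock.extend([0] * daily_in)                            # new donations
        stock.sort(reverse=newest_first)   # last element = unit issued next
        demand = daily_out
        while demand > 0 and stock:
            ages_issued.append(stock.pop())   # pop oldest (FIFO) or newest
            demand -= 1
        on_hand.append(len(stock))
    return sum(on_hand) / len(on_hand), sum(ages_issued) / len(ages_issued)

for policy in (False, True):
    stock_level, mean_age = simulate(newest_first=policy)
    print("newest_first =", policy, "| mean stock:", round(stock_level),
          "| mean age at issue:", round(mean_age, 1), "days")
```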

  10. Modeling and Analysis of Solar Radiation Potentials on Building Rooftops

    SciTech Connect

    Omitaomu, Olufemi A; Kodysh, Jeffrey B; Bhaduri, Budhendra L

    2012-01-01

    The active application of photovoltaics for electricity generation could effectively transform neighborhoods and commercial districts into small, localized power plants. This application, however, relies heavily on an accurate estimation of the amount of solar radiation that is available on individual building rooftops. While many solar energy maps exist at higher spatial resolution for concentrated solar energy applications, the data from these maps are not suitable for roof-mounted photovoltaics for several reasons, including lack of data at the appropriate spatial resolution and lack of integration of building-specific characteristics into the models used to generate the maps. To address this problem, we have developed a modeling framework for estimating solar radiation potentials on individual building rooftops that is suitable for utility-scale applications as well as building-specific applications. The framework uses light detection and ranging (LIDAR) data at approximately 1-meter horizontal resolution and 0.3-meter vertical resolution as input for modeling a large number of buildings quickly. One of the strengths of this framework is the ability to parallelize its implementation. Furthermore, the framework accounts for building-specific characteristics, such as roof slope, roof aspect, and shadowing effects, that are critical to roof-mounted photovoltaic systems. The resulting data have helped us to identify the so-called solar panel sweet spots on individual building rooftops and obtain accurate statistics of the variation in solar radiation as a function of time of year and geographical location.
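
    The building-specific geometry enters through the solar incidence angle on the roof plane. A small sketch of that calculation is given below, assuming the usual tilted-surface formula with azimuths measured clockwise from north; shadowing and the LIDAR-derived surface model are omitted.

```python
import math

def incidence_cosine(sun_zenith, sun_azimuth, roof_slope, roof_aspect):
    """Cosine of the angle between the sun vector and the roof-plane normal.
    All angles in degrees; azimuths measured clockwise from north.
    Returns 0 when the sun is behind the roof plane."""
    z, a = math.radians(sun_zenith), math.radians(sun_azimuth)
    s, w = math.radians(roof_slope), math.radians(roof_aspect)
    cos_inc = (math.cos(z) * math.cos(s)
               + math.sin(z) * math.sin(s) * math.cos(a - w))
    return max(cos_inc, 0.0)

# South-facing roof pitched at 30 degrees, sun at 40 degrees zenith due south
print(incidence_cosine(sun_zenith=40, sun_azimuth=180,
                       roof_slope=30, roof_aspect=180))
```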

  11. Building footprint extraction from digital surface models using neural networks

    NASA Astrophysics Data System (ADS)

    Davydova, Ksenia; Cui, Shiyong; Reinartz, Peter

    2016-10-01

    Two-dimensional building footprints are a basis for many applications, from cartography to three-dimensional building model generation. Although many methodologies have been proposed for building footprint extraction, this topic remains an open research area. Neural networks are able to model the complex relationships between the multivariate input vector and the target vector. Based on these abilities, we propose a methodology using neural networks and Markov Random Fields (MRF) for automatic building footprint extraction from normalized Digital Surface Models (nDSM) and satellite images within urban areas. The proposed approach has two main steps. In the first step, the unary terms for the MRF energy function are learned by a four-layer neural network. The neural network is trained on a large set of patches consisting of both nDSM and Normalized Difference Vegetation Index (NDVI). Then prediction is performed to calculate the unary terms that are used in the MRF. In the second step, the energy function is minimized using a maxflow algorithm, which leads to a binary building mask. The building extraction results are compared with available ground truth. The comparison illustrates the efficiency of the proposed algorithm, which can extract approximately 80% of buildings from nDSM with high accuracy.

  12. An Occupant Behavior Model for Building Energy Efficiency and Safety

    NASA Astrophysics Data System (ADS)

    Pan, L. L.; Chen, T.; Jia, Q. S.; Yuan, R. X.; Wang, H. T.; Ding, R.

    2010-05-01

    An occupant behavior model is suggested to improve building energy efficiency and safety. This paper provides a generic outline of the model, which includes occupancy behavior abstraction, model framework and primary structure, input and output, computer simulation results, as well as summary and outlook. Using information technology, it is now possible to collect large amounts of occupancy information. Yet these data provide only partial and historical information, so it is important to develop a model that offers a full view of the studied building as well as prediction. We used an infrared monitoring system installed at the front door of the Low Energy Demo Building (LEDB) at Tsinghua University in China to provide the time variation of the total number of occupants in the building. This information is used as input data for the model. An RFID system installed on the first floor provides the time variation of the occupants' locations in each region. The collected data are used to validate the model. The simulation results show that the presented model provides a feasible framework to simulate occupants' behavior and predict the time variation of the number of occupants in the building. Further development and application of the model is also discussed.

  13. Building aggregate timber supply models from individual harvest choice

    Treesearch

    Maksym Polyakov; David N. Wear; Robert Huggett

    2009-01-01

    Timber supply has traditionally been modelled using aggregate data. In this paper, we build aggregate supply models for four roundwood products for the US state of North Carolina from a stand-level harvest choice model applied to detailed forest inventory. The simulated elasticities of pulpwood supply are much lower than reported by previous studies. Cross price...

  14. A New Method of Building Scale-Model Houses

    Treesearch

    Richard N. Malcolm

    1978-01-01

    Scale-model houses are used to display new architectural and construction designs. Some scale-model houses will not withstand the abuse of shipping and handling. This report describes how to build a solid-core model house which is rigid, lightweight, and sturdy.

  15. Building Test Cases through Model Driven Engineering

    NASA Astrophysics Data System (ADS)

    Sousa, Helaine; Lopes, Denivaldo; Abdelouahab, Zair; Hammoudi, Slimane; Claro, Daniela Barreiro

    Recently, Model Driven Engineering (MDE) has been proposed to address the complexity in the development, maintenance and evolution of large and distributed software systems. Model Driven Architecture (MDA) is an example of MDE. In this context, model transformations enable a large reuse of software systems through the transformation of a Platform Independent Model into a Platform Specific Model. Although source code can be generated from models, defects can be injected during the modeling or transformation process. In order to deliver software systems without defects that cause errors and failures, the source code must be tested. In this paper, we present an approach that addresses testing across the whole software life cycle, i.e., it starts at the modeling level and finishes with testing the source code of software systems. We provide an example to illustrate our approach.

  16. Building Simple Hidden Markov Models. Classroom Notes

    ERIC Educational Resources Information Center

    Ching, Wai-Ki; Ng, Michael K.

    2004-01-01

    Hidden Markov models (HMMs) are widely used in bioinformatics, speech recognition and many other areas. This note presents HMMs via the framework of classical Markov chain models. A simple example is given to illustrate the model. An estimation method for the transition probabilities of the hidden states is also discussed.
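
    A minimal forward-algorithm sketch makes the classical-Markov-chain framing concrete: it computes the likelihood of an observation sequence for a small discrete HMM. The two-state transition, emission and initial probabilities are purely illustrative.

```python
import numpy as np

def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: P(observation sequence) for a discrete HMM.
    pi: initial state probabilities, A: state transition matrix,
    B: emission matrix (rows = hidden states, cols = observation symbols)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

pi = np.array([0.6, 0.4])                     # two hidden states
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],                # three observation symbols
              [0.1, 0.3, 0.6]])
print(forward_likelihood([0, 2, 1, 1], pi, A, B))
```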

  17. Temperature response functions introduce high uncertainty in modelled carbon stocks in cold temperature regimes

    NASA Astrophysics Data System (ADS)

    Portner, H.; Bugmann, H.; Wolf, A.

    2010-11-01

    Models of carbon cycling in terrestrial ecosystems contain formulations for the dependence of respiration on temperature, but the sensitivity of predicted carbon pools and fluxes to these formulations and their parameterization is not well understood. Thus, we performed an uncertainty analysis of soil organic matter decomposition with respect to its temperature dependency using the ecosystem model LPJ-GUESS. We used five temperature response functions (Exponential, Arrhenius, Lloyd-Taylor, Gaussian, Van't Hoff). We determined the parameter confidence ranges of the formulations by nonlinear regression analysis based on eight experimental datasets from Northern Hemisphere ecosystems. We sampled over the confidence ranges of the parameters and ran simulations for each pair of temperature response function and calibration site. We analyzed both the long-term and the short-term heterotrophic soil carbon dynamics over a virtual elevation gradient in southern Switzerland. The temperature relationship of Lloyd-Taylor fitted the overall data set best as the other functions either resulted in poor fits (Exponential, Arrhenius) or were not applicable for all datasets (Gaussian, Van't Hoff). There were two main sources of uncertainty for model simulations: (1) the lack of confidence in the parameter estimates of the temperature response, which increased with increasing temperature, and (2) the size of the simulated soil carbon pools, which increased with elevation, as slower turnover leads to higher carbon stocks and higher associated uncertainties. Our results therefore indicate that such projections are more uncertain for higher elevations and hence also higher latitudes, which are of key importance for the global terrestrial carbon budget.
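
    The nonlinear regression step for one of the response functions can be sketched with scipy's curve_fit applied to the Lloyd-Taylor form; the synthetic respiration data, noise level and starting values below are illustrative, not taken from the eight calibration datasets.

```python
import numpy as np
from scipy.optimize import curve_fit

def lloyd_taylor(T, R10, E0):
    """Lloyd-Taylor respiration response; T in deg C, T0 fixed at 227.13 K."""
    return R10 * np.exp(E0 * (1.0 / (283.15 - 227.13)
                              - 1.0 / (T + 273.15 - 227.13)))

rng = np.random.default_rng(3)
T_obs = np.linspace(-5, 25, 40)                           # soil temperatures (deg C)
R_obs = lloyd_taylor(T_obs, 1.8, 308.0) * rng.normal(1.0, 0.1, T_obs.size)

popt, pcov = curve_fit(lloyd_taylor, T_obs, R_obs, p0=[1.0, 300.0])
perr = np.sqrt(np.diag(pcov))                             # 1-sigma parameter errors
print("R10 = %.2f +/- %.2f, E0 = %.0f +/- %.0f" % (popt[0], perr[0],
                                                   popt[1], perr[1]))
```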

  18. Heterogeneous stock rat: a unique animal model for mapping genes influencing bone fragility.

    PubMed

    Alam, Imranul; Koller, Daniel L; Sun, Qiwei; Roeder, Ryan K; Cañete, Toni; Blázquez, Gloria; López-Aumatell, Regina; Martínez-Membrives, Esther; Vicens-Costa, Elia; Mont, Carme; Díaz, Sira; Tobeña, Adolf; Fernández-Teruel, Alberto; Whitley, Adam; Strid, Pernilla; Diez, Margarita; Johannesson, Martina; Flint, Jonathan; Econs, Michael J; Turner, Charles H; Foroud, Tatiana

    2011-05-01

    Previously, we demonstrated that skeletal mass, structure and biomechanical properties vary considerably among 11 different inbred rat strains. Subsequently, we performed quantitative trait loci (QTL) analysis in four inbred rat strains (F344, LEW, COP and DA) for different bone phenotypes and identified several candidate genes influencing various bone traits. The standard approach to narrowing QTL intervals down to a few candidate genes typically employs the generation of congenic lines, which is time consuming and often not successful. A potential alternative approach is to use a highly genetically informative animal model resource capable of delivering very high resolution gene mapping such as Heterogeneous stock (HS) rat. HS rat was derived from eight inbred progenitors: ACI/N, BN/SsN, BUF/N, F344/N, M520/N, MR/N, WKY/N and WN/N. The genetic recombination pattern generated across 50 generations in these rats has been shown to deliver ultra-high even gene-level resolution for complex genetic studies. The purpose of this study is to investigate the usefulness of the HS rat model for fine mapping and identification of genes underlying bone fragility phenotypes. We compared bone geometry, density and strength phenotypes at multiple skeletal sites in HS rats with those obtained from five of the eight progenitor inbred strains. In addition, we estimated the heritability for different bone phenotypes in these rats and employed principal component analysis to explore relationships among bone phenotypes in the HS rats. Our study demonstrates that significant variability exists for different skeletal phenotypes in HS rats compared with their inbred progenitors. In addition, we estimated high heritability for several bone phenotypes and biologically interpretable factors explaining significant overall variability, suggesting that the HS rat model could be a unique genetic resource for rapid and efficient discovery of the genetic determinants of bone fragility. Copyright

  19. Heterogeneous stock rats: a new model to study the genetics of renal phenotypes

    PubMed Central

    Solberg Woods, Leah C.; Stelloh, Cary; Regner, Kevin R.; Schwabe, Tiffany; Eisenhauer, Jessica

    2010-01-01

    Chronic kidney disease is a growing medical concern, with an estimated 25.6 million people in the United States exhibiting some degree of kidney injury and/or decline in kidney function. Animal models provide great insight into the study of the genetics of complex diseases. In particular, heterogeneous stock (HS) rats represent a unique genetic resource enabling rapid fine-mapping of complex traits. However, they have not been explored as a model to study renal phenotypes. To evaluate the usefulness of HS rats in the genetics of renal traits, a time course evaluation (weeks 8–40) was performed for several renal phenotypes. As expected, a large degree of variation was seen for most renal traits. By week 24, three (of 40) rats exhibited marked proteinuria that increased gradually until week 40 and ranged from 33.7 to 80.2 mg/24 h. Detailed histological analysis confirmed renal damage in these rats. In addition, several rats consistently exhibited significant hematuria (5/41). Interestingly, these rats were not the same rats that exhibited proteinuria, indicating that susceptibility to different types of kidney injury is likely segregating within the HS population. One HS rat exhibited unilateral renal agenesis (URA), which was accompanied by a significant degree of proteinuria and glomerular and tubulointerstitial injury. The parents of this HS rat were identified and bred further. Additional offspring of this pair were observed to exhibit URA at a frequency between 40% and 60%. In summary, these novel data demonstrate that HS rats exhibit variation in proteinuria and other kidney-related traits, confirming that the model harbors susceptibility alleles for kidney injury and providing the basis for further genetic studies. PMID:20219828

  20. Modelling forest carbon stock changes as affected by harvest and natural disturbances. I. Comparison with countries' estimates for forest management.

    PubMed

    Pilli, Roberto; Grassi, Giacomo; Kurz, Werner A; Viñas, Raúl Abad; Guerrero, Nuria Hue

    2016-12-01

    According to the post-2012 rules under the Kyoto protocol, developed countries that are signatories to the protocol have to estimate and report the greenhouse gas (GHG) emissions and removals from forest management (FM), with the option to exclude the emissions associated with natural disturbances, following the Intergovernmental Panel on Climate Change (IPCC) guidelines. To increase confidence in GHG estimates, the IPCC recommends performing verification activities, i.e. comparing country data with independent estimates. However, countries currently conduct relatively few verification efforts. The aim of this study is to implement a consistent methodological approach using the Carbon Budget Model (CBM) to estimate the net CO2 emissions from FM in 26 European Union (EU) countries for the period 2000-2012, including the impacts of natural disturbances. We validated our results against a totally independent case study and then we compared the CBM results with the data reported by countries in their 2014 Greenhouse Gas Inventories (GHGIs) submitted to the United Nations Framework Convention on Climate Change (UNFCCC). The match between the CBM results and the GHGIs was good in nine countries (i.e. the average of our results is within ±25 % compared to the GHGI and the correlation between CBM and GHGI is significant at P < 0.05) and partially good in ten countries. When the comparison was not satisfactory, in most cases we were able to identify possible reasons for these discrepancies, including: (1) a different representation of the interannual variability, e.g. where the GHGIs used the stock-change approach; (2) different assumptions for non-biomass pools, and for CO2 emissions from fires and harvest residues. In a few cases, further analysis will be needed to identify any possible inappropriate data used by the CBM or problems in the GHGI. Finally, the frequent updates to the data and methods used by countries to prepare GHGIs make the implementation of a consistent

  1. Duct thermal performance models for large commercial buildings

    SciTech Connect

    Wray, Craig P.

    2003-10-01

    Despite the potential for significant energy savings by reducing duct leakage or other thermal losses from duct systems in large commercial buildings, California Title 24 has no provisions to credit energy-efficient duct systems in these buildings. A substantial reason is the lack of readily available simulation tools to demonstrate the energy-saving benefits associated with efficient duct systems in large commercial buildings. The overall goal of the Efficient Distribution Systems (EDS) project within the PIER High Performance Commercial Building Systems Program is to bridge the gaps in current duct thermal performance modeling capabilities, and to expand our understanding of duct thermal performance in California large commercial buildings. As steps toward this goal, our strategy in the EDS project involves two parts: (1) developing a whole-building energy simulation approach for analyzing duct thermal performance in large commercial buildings, and (2) using the tool to identify the energy impacts of duct leakage in California large commercial buildings, in support of future recommendations to address duct performance in the Title 24 Energy Efficiency Standards for Nonresidential Buildings. The specific technical objectives for the EDS project were to: (1) Identify a near-term whole-building energy simulation approach that can be used in the impacts analysis task of this project (see Objective 3), with little or no modification. A secondary objective is to recommend how to proceed with long-term development of an improved compliance tool for Title 24 that addresses duct thermal performance. (2) Develop an Alternative Calculation Method (ACM) change proposal to include a new metric for thermal distribution system efficiency in the reporting requirements for the 2005 Title 24 Standards. The metric will facilitate future comparisons of different system types using a common "yardstick". (3) Using the selected near-term simulation approach, assess the impacts of

  2. Desk-top model buildings for dynamic earthquake response demonstrations

    USGS Publications Warehouse

    Brady, A. Gerald

    1992-01-01

    Models of buildings that illustrate dynamic resonance behavior when excited by hand are designed and built. Two types of buildings are considered, one with columns stronger than floors, the other with columns weaker than floors. Combinations and variations of these two types are possible. Floor masses and column stiffnesses are chosen in order that the frequency of the second mode is approximately five cycles per second, so that first and second modes can be excited manually. The models are expected to be resonated by hand by schoolchildren or persons unfamiliar with the dynamic resonant response of tall buildings, to gain an understanding of structural behavior during earthquakes. Among other things, this experience will develop a level of confidence in the builder and experimenter should they be in a high-rise building during an earthquake, sensing both these resonances and other violent shaking.
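
    The tuning of floor masses and column stiffnesses so that the second mode falls near five cycles per second can be checked with a small eigenvalue computation for a two-storey lumped-mass shear model; the mass and stiffness values below are illustrative, chosen only so the second mode lands near 5 Hz.

```python
import numpy as np

def mode_frequencies(masses, stiffnesses):
    """Natural frequencies (Hz) of a lumped-mass shear building.
    masses[i] = floor mass (kg); stiffnesses[i] = storey stiffness (N/m)."""
    n = len(masses)
    M = np.diag(masses)
    K = np.zeros((n, n))
    for i, k in enumerate(stiffnesses):
        K[i, i] += k
        if i > 0:                      # couple each storey to the floor below
            K[i - 1, i - 1] += k
            K[i - 1, i] -= k
            K[i, i - 1] -= k
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))    # omega^2 values
    return np.sort(np.sqrt(np.real(eigvals))) / (2 * np.pi)

# Two 0.5 kg floors on storeys of equal stiffness; prints roughly [1.9, 5.0] Hz
print(mode_frequencies([0.5, 0.5], [190.0, 190.0]))
```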

  3. NREL's Building Component Library for Use with Energy Models

    DOE Data Explorer

    The Building Component Library (BCL) is the U.S. Department of Energy’s comprehensive online searchable library of energy modeling building blocks and descriptive metadata. Novice users and seasoned practitioners can use the freely available and uniquely identifiable components to create energy models and cite the sources of input data, which will increase the credibility and reproducibility of their simulations. The BCL contains components which are the building blocks of an energy model. They can represent physical characteristics of the building such as roofs, walls, and windows, or can refer to related operational information such as occupancy and equipment schedules and weather information. Each component is identified through a set of attributes that are specific to its type, as well as other metadata such as provenance information and associated files. The BCL also contains energy conservation measures (ECM), referred to as measures, which describe a change to a building and its associated model. For the BCL, this description attempts to define a measure for reproducible application, either to compare it to a baseline model, to estimate potential energy savings, or to examine the effects of a particular implementation. The BCL currently contains more than 30,000 components and measures. A faceted search mechanism has been implemented on the BCL that allows users to filter through the search results using various facets. Facet categories include component and measure types, data source, and energy modeling software type. All attributes of a component or measure can also be used to filter the results.

  4. Heterogeneous stock rats: a model to study the genetics of despair-like behavior in adolescence.

    PubMed

    Holl, Katie; He, Hong; Wedemeyer, Michael; Clopton, Larissa; Wert, Stephanie; Meckes, Jeanie K; Cheng, Riyan; Kastner, Abigail; Palmer, Abraham A; Redei, Eva E; Solberg Woods, Leah C

    2017-08-22

    Major depressive disorder (MDD) is a complex illness caused by both genetic and environmental factors. Antidepressant resistance also has a genetic component. To date, however, very few genes have been identified for major depression or antidepressant resistance. In the current study, we investigated whether outbred heterogeneous stock (HS) rats would be a suitable model to uncover the genetics of depression and its connection to antidepressant resistance. The Wistar Kyoto (WKY) rat, one of the eight founders of the HS, is a recognized animal model of juvenile depression and is resistant to fluoxetine antidepressant treatment. We therefore hypothesized that adolescent HS rats would exhibit variation in both despair-like behavior and response to fluoxetine treatment. We assessed heritability of despair-like behavior and response to sub-acute fluoxetine using a modified forced swim test (FST) in four-week old HS rats. We also tested whether blood transcript levels previously identified as depression biomarkers in adolescent human subjects are differentially expressed in HS rats with high versus low FST immobility. We demonstrate heritability of despair-like behavior in four-week old HS rats and show that many HS rats are resistant to fluoxetine treatment. In addition, blood transcript levels of Amfr, Cdr2, and Kiaa1539, genes previously identified in human adolescents with MDD, are differentially expressed between HS rats with high vs low immobility. These data demonstrate that FST despair-like behavior will be amenable to genetic fine-mapping in adolescent HS rats. The overlap between human and HS blood biomarkers suggest that these studies may translate to depression in humans. This article is protected by copyright. All rights reserved.

  5. Team Learning: Building Shared Mental Models

    ERIC Educational Resources Information Center

    Van den Bossche, Piet; Gijselaers, Wim; Segers, Mien; Woltjer, Geert; Kirschner, Paul

    2011-01-01

    To gain insight in the social processes that underlie knowledge sharing in teams, this article questions which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning behaviors and team effectiveness. Analyses were…

  6. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not aid in linking the Basic Events to the data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian Updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility for the entire model often rests on the credibility and traceability of the data.

  7. Building Water Models: A Different Approach

    PubMed Central

    2015-01-01

    Simplified classical water models are currently an indispensable component in practical atomistic simulations. Yet, despite several decades of intense research, these models are still far from perfect. Presented here is an alternative approach to constructing widely used point charge water models. In contrast to the conventional approach, we do not impose any geometry constraints on the model other than the symmetry. Instead, we optimize the distribution of point charges to best describe the “electrostatics” of the water molecule. The resulting “optimal” 3-charge, 4-point rigid water model (OPC) reproduces a comprehensive set of bulk properties significantly more accurately than commonly used rigid models: average error relative to experiment is 0.76%. Close agreement with experiment holds over a wide range of temperatures. The improvements in the proposed model extend beyond bulk properties: compared to common rigid models, predicted hydration free energies of small molecules using OPC are uniformly closer to experiment, with root-mean-square error <1 kcal/mol. PMID:25400877

  9. Career Pathways Skill-Building Instructional Model.

    ERIC Educational Resources Information Center

    Community Coll. of Rhode Island, Warwick.

    As part of an effort to develop a skill-based education program for students that relates academic skills with workplace skills, the Community College of Rhode Island developed a working instructional model consisting of 6 areas, or strands, and 31 skills. The model is directed at students in grades 9 through 12 and recognizes the importance of…

  10. Carbon stock and carbon turnover in boreal and temperate forests - Integration of remote sensing data and global vegetation models

    NASA Astrophysics Data System (ADS)

    Thurner, Martin; Beer, Christian; Carvalhais, Nuno; Forkel, Matthias; Tito Rademacher, Tim; Santoro, Maurizio; Tum, Markus; Schmullius, Christiane

    2016-04-01

    Long-term vegetation dynamics are one of the key uncertainties of the carbon cycle. There are large differences in simulated vegetation carbon stocks and fluxes, including productivity, respiration and carbon turnover, between global vegetation models. In particular, the implementation of climate-related mortality processes, for instance drought, fire, frost or insect effects, is often lacking or insufficient in current models, and their importance at the global scale is highly uncertain. These shortcomings have been due to the lack of spatially extensive information on vegetation carbon stocks, which cannot be provided by inventory data alone. Instead, we recently have been able to estimate northern boreal and temperate forest carbon stocks based on radar remote sensing data. Our spatially explicit product (0.01° resolution) shows strong agreement with inventory-based estimates at a regional scale and allows for a spatial evaluation of carbon stocks and dynamics simulated by global vegetation models. By combining this state-of-the-art biomass product and NPP datasets originating from remote sensing, we are able to study the relation between carbon turnover rate and a set of climate indices in northern boreal and temperate forests along spatial gradients. We observe an increasing turnover rate with colder winter temperatures and longer winters in boreal forests, suggesting frost damage and the trade-off between frost adaptation and growth being important mortality processes in this ecosystem. In contrast, turnover rate increases with climatic conditions favouring drought and insect outbreaks in temperate forests. Investigated global vegetation models from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), including HYBRID4, JeDi, JULES, LPJml, ORCHIDEE, SDGVM, and VISIT, are able to reproduce observation-based spatial climate - turnover rate relationships only to a limited extent. While most of the models compare relatively well in terms of NPP, simulated
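
    The observation-based turnover diagnostic amounts to dividing NPP by biomass on a common grid and relating the result to a climate index. The sketch below does exactly that, with random stand-in arrays in place of the radar biomass, remote-sensing NPP and climate fields.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (180, 360)                                   # toy 1-degree grid
biomass = rng.uniform(1.0, 15.0, shape)              # vegetation carbon, kgC m-2
npp = rng.uniform(0.1, 1.2, shape)                   # net primary production, kgC m-2 yr-1
winter_temp = rng.uniform(-30.0, 5.0, shape)         # stand-in climate index, deg C
forest_mask = rng.uniform(size=shape) > 0.5          # keep only "forest" cells

# Turnover rate (yr-1): assuming steady state, outflux ~ NPP, so k = NPP / biomass
turnover = np.where(forest_mask, npp / biomass, np.nan)

valid = np.isfinite(turnover)
r = np.corrcoef(turnover[valid], winter_temp[valid])[0, 1]
print("spatial correlation of turnover rate with winter temperature:", round(r, 3))
```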

  11. Feasibility of dsRNA treatment for post-clearing SPF shrimp stocks of newly discovered viral infections using Laem Singh virus (LSNV) as a model.

    PubMed

    Saksmerprome, Vanvimon; Charoonnart, Patai; Flegel, Timothy W

    2017-05-02

    Using post-larvae derived from specific pathogen free (SPF) stocks in penaeid shrimp farming has led to a dramatic increase in production. At the same time, new pathogens of farmed shrimp are continually being discovered. Sometimes these pathogens are carried by shrimp and other crustaceans as persistent infections without gross signs of disease. Thus, a 5-generation stock of Penaeus monodon that was SPF for several pathogens was found, after stock development, to be persistently infected with the newly discovered Laem Singh virus (LSNV). In this situation, the stock developers were faced with destroying their existing stock (developed over a long period at considerable cost) and starting the whole stock development process anew in order to add LSNV to its SPF list. As an alternative, it was hypothesized that injection of complementary dsRNA into viral-infected broodstock prior to mating might inhibit replication of the target virus sufficiently to reduce or eliminate its transmission to their offspring. Subsequent selection of uninfected offspring would allow for post-clearing of LSNV from the existing stock and for conversion of the stock to LSNV-free status. Testing this hypothesis using the LSNV-infected stock described above, we found that transmission was substantially reduced in several treated broodstock compared to much higher transmission in buffer-injected broodstock. Based on these results, the model is proposed for post-clearing of SPF stocks using dsRNA treatment. The model may also be applicable to post-clearing of exceptional, individual performers from grow-out ponds for return to a nucleus breeding center. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. To stock or not to stock? Assessing restoration potential of a remnant American shad spawning run with hatchery supplementation

    USGS Publications Warehouse

    Bailey, Michael M.; Zydlewski, Joseph

    2013-01-01

    Hatchery supplementation has been widely used as a restoration technique for American Shad Alosa sapidissima on the East Coast of the USA, but results have been equivocal. In the Penobscot River, Maine, dam removals and other improvements to fish passage will likely reestablish access to the majority of this species’ historic spawning habitat. Additional efforts being considered include the stocking of larval American Shad. The decision about whether to stock a river system undergoing restoration should be made after evaluating the probability of natural recolonization and examining the costs and benefits of potentially accelerating recovery using a stocking program. However, appropriate evaluation can be confounded by a dearth of information about the starting population size and age structure of the remnant American Shad spawning run in the river. We used the Penobscot River as a case study to assess the theoretical sensitivity of recovery time to either scenario (stocking or not) by building a deterministic model of an American Shad population. This model is based on the best available estimates of size at age, fecundity, rate of iteroparity, and recruitment. Density dependence was imposed, such that the population reached a plateau at an arbitrary recovery goal of 633,000 spawning adults. Stocking had a strong accelerating effect on the time to modeled recovery (as measured by the time to reach 50% of the recovery goal) in the base model, but stocking had diminishing effects with larger population sizes. There is a diminishing return to stocking when the starting population is modestly increased. With a low starting population (a spawning run of 1,000), supplementation with 12 million larvae annually accelerated modeled recovery by 12 years. Only a 2-year acceleration was observed if the starting population was 15,000. Such a heuristic model may aid managers in assessing the costs and benefits of stocking by incorporating a structured decision framework.
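
    A toy projection illustrates the mechanism: a single-age spawner model with Beverton-Holt style density dependence scaled so that the equilibrium equals the recovery goal, with stocked larvae contributing additional recruits. All rates are invented for illustration; the published model's age structure, iteroparity and maturation delays are ignored.

```python
def years_to_half_goal(start_spawners, stocked_larvae_per_year,
                       goal=633_000, recruits_per_spawner=1.5,
                       larva_to_adult=3.0e-3, max_years=200):
    """Toy single-age shad projection with Beverton-Holt density dependence.
    Returns the number of years until the run reaches 50% of the goal."""
    K = goal / (recruits_per_spawner - 1.0)      # equilibrium equals the goal
    spawners = float(start_spawners)
    for year in range(1, max_years + 1):
        natural = recruits_per_spawner * spawners
        stocked = larva_to_adult * stocked_larvae_per_year
        spawners = (natural + stocked) / (1.0 + spawners / K)
        if spawners >= 0.5 * goal:
            return year
    return None

for start in (1_000, 15_000):
    print(f"start {start}: unstocked {years_to_half_goal(start, 0)} yr, "
          f"stocked {years_to_half_goal(start, 12_000_000)} yr")
```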

  13. Back to basics for Bayesian model building in genomic selection.

    PubMed

    Kärkkäinen, Hanni P; Sillanpää, Mikko J

    2012-07-01

    Numerous Bayesian methods of phenotype prediction and genomic breeding value estimation based on multilocus association models have been proposed. Computationally the methods have been based either on Markov chain Monte Carlo or on faster maximum a posteriori estimation. The demand for more accurate and more efficient estimation has led to the rapid emergence of workable methods, unfortunately at the expense of well-defined principles for Bayesian model building. In this article we go back to the basics and build a Bayesian multilocus association model for quantitative and binary traits with carefully defined hierarchical parameterization of Student's t and Laplace priors. In this treatment we consider alternative model structures, using indicator variables and polygenic terms. We make the most of the conjugate analysis, enabled by the hierarchical formulation of the prior densities, by deriving the fully conditional posterior densities of the parameters and using the acquired known distributions in building fast generalized expectation-maximization estimation algorithms.

  14. A Case of Business Model Conversion Using Planning and Scheduling Application Having Intermediate Cutting Stock

    NASA Astrophysics Data System (ADS)

    Takada, Masayoshi

    This paper describes the experience of developing a production planning and scheduling system, which makes more than 1700 kinds of end products from more than 300 kinds of large plates stocked in the intermediate warehouse. This system was mostly achieved with Constraint Logic Programming. As our business environment changed, we were in urgent need of converting our production method from the push-type to the pull-type. With the push-type production method, we had to keep our finished products in stock for three to four months and refill the stock in 30 days. The features of our system: it solved the complex network problem of 38 patterns found in the whole production process, from cutting to shipment, in a generic way. At the upstream cutting process, it also solved a setup optimization problem in the two-phase batch-type production.

  15. A Pathway Idea for Model Building.

    PubMed

    Mathai, A M; Moschopoulos, Panagis

    2012-01-01

    Models, mathematical or stochastic, which move from one functional form to another through pathway parameters, so that in between stages can be captured, are examined in this article. Models which move from generalized type-1 beta family to type-2 beta family, to generalized gamma family to generalized Mittag-Leffler family to Lévy distributions are examined here. It is known that one can likely find an approximate model for the data at hand whether the data are coming from biological, physical, engineering, social sciences or other areas. Different families of functions are connected through the pathway parameters and hence one will find a suitable member from within one of the families or in between stages of two families. Graphs are provided to show the movement of the different models showing thicker tails, thinner tails, right tail cut off etc.
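
    A compact way to see the switching is the real scalar pathway density. The forms below follow Mathai's usual pathway parameterization and are reproduced here as an assumption about notation rather than quoted from this abstract; c1, c2, c3 are normalizing constants, alpha is the pathway parameter, and a, delta, eta, gamma are positive.

```latex
% Pathway family in the real scalar positive case (Mathai's parameterization, assumed)
f_{1}(x) = c_{1}\, x^{\gamma-1}\left[1 - a(1-\alpha)x^{\delta}\right]^{\frac{\eta}{1-\alpha}},
  \qquad 1 - a(1-\alpha)x^{\delta} > 0,\ \alpha < 1
  \quad\text{(generalized type-1 beta form)}

f_{2}(x) = c_{2}\, x^{\gamma-1}\left[1 + a(\alpha-1)x^{\delta}\right]^{-\frac{\eta}{\alpha-1}},
  \qquad \alpha > 1
  \quad\text{(generalized type-2 beta form)}

f_{3}(x) = c_{3}\, x^{\gamma-1}\, \mathrm{e}^{-a\eta x^{\delta}},
  \qquad \alpha \to 1
  \quad\text{(generalized gamma limit)}
```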

  16. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or, in the case of telerobots, serve as an interface between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with the changes in the environment and to allow exploring entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine and update or generate a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes: Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results shows great promise for dealing with noisy input data. The performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
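
    A toy sketch of the voxel update: voxels along a simulated range ray are marked Void up to the measured distance and the voxel at the hit point is marked Full, leaving the rest Unknown. Straight stepping stands in for exact ray traversal, and the grid size, voxel size and sensor pose are illustrative.

```python
import numpy as np

UNKNOWN, VOID, FULL = 0, 1, 2

def integrate_range_reading(grid, sensor, direction, hit_distance, voxel=0.1):
    """Mark voxels along a ray as VOID up to the measured range, and the voxel
    at the hit point as FULL (simple stepping instead of exact ray traversal)."""
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    for d in np.arange(0.0, hit_distance, voxel / 2):
        idx = tuple(((sensor + d * direction) / voxel).astype(int))
        if all(0 <= i < s for i, s in zip(idx, grid.shape)):
            grid[idx] = VOID
    hit = tuple(((sensor + hit_distance * direction) / voxel).astype(int))
    if all(0 <= i < s for i, s in zip(hit, grid.shape)):
        grid[hit] = FULL

grid = np.full((50, 50, 20), UNKNOWN, dtype=np.uint8)   # 5 m x 5 m x 2 m at 0.1 m
sensor = np.array([2.5, 2.5, 1.0])
integrate_range_reading(grid, sensor, direction=[1.0, 0.2, 0.0], hit_distance=1.8)
print("void voxels:", int((grid == VOID).sum()),
      "full voxels:", int((grid == FULL).sum()))
```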

  17. A model for the sustainable selection of building envelope assemblies

    SciTech Connect

    Huedo, Patricia; Mulet, Elena; López-Mesa, Belinda

    2016-02-15

    The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.

  18. Testing steady states carbon stocks of Yasso07 and ROMUL models against soil inventory data in Finland

    NASA Astrophysics Data System (ADS)

    Lehtonen, Aleksi; Linkosalo, Tapio; Heikkinen, Juha; Peltoniemi, Mikko; Sievänen, Risto; Mäkipää, Raisa; Tamminen, Pekka; Salemaa, Maija; Komarov, Alexander

    2015-04-01

    The soil carbon pool is a significant carbon store. Unfortunately, the significance of different drivers of this pool is still unknown. In order to predict future feedbacks of soils to climate change at the global level, Earth system models (ESMs) are needed. These ESMs have been tested against soil carbon inventories in order to judge whether the models can be used for future prediction. Unfortunately, results have been poor: e.g. Guenet et al. (2013) present a test in which soil carbon stocks from the ORCHIDEE model are plotted at plot level against measurements without any correlation. Similarly, Todd-Brown et al. (2013) conclude that most ESMs are not able to reproduce measured soil carbon stocks at the grid level. Here we estimated litter input to soil from trees and understorey vegetation, based on data from the 9th national forest inventory. Biomass estimates for both trees and understorey vegetation were smoothed with ordinary kriging methods, and thereafter litter input was modelled by dominant tree species. Regional litter input from natural mortality and harvesting residues was also added. Thereafter we applied the Yasso07 (Tuomi et al. 2011) and ROMUL (Chertov et al. 2001) soil models to estimate steady-state carbon stocks for mineral soils of Finland on a 10*10 km2 grid. We ran the Yasso07 model with an annual time step, using parameters based on Scandinavian data (Rantakari et al. 2012) and also parameters based on a global data set (Tuomi et al. 2011). The ROMUL model was applied with and without soil water holding capacity information. Results were compared against Biosoil measurements of soil carbon stocks (n=521). We found that the best match between model estimates and measurements by latitudinal bands (n=43) was obtained with the ROMUL model including soil water holding capacity, with an RMSE of 9.9 Mg C. The second best match was with Yasso07 using Scandinavian parameters, with an RMSE of 15.3 Mg C. Results of this study highlight two things: it is essential to run dynamic soil models with time
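
    The steady state of a first-order decomposition pool is simply litter input divided by its decay rate, and the model-measurement comparison above reduces to an RMSE over latitudinal bands. The sketch below shows both with invented pool inputs, decay rates and stock values; these are not Yasso07 or ROMUL parameters.

```python
import numpy as np

def steady_state_stock(litter_input, decay_rates):
    """Steady state of first-order pools: dC/dt = I - k*C = 0  =>  C = I / k."""
    return sum(i / k for i, k in zip(litter_input, decay_rates))

# Invented three-pool example: fast, slow, humus (not Yasso07/ROMUL values)
litter_input = [1.2, 0.6, 0.2]     # Mg C ha-1 yr-1 entering each pool
decay_rates  = [0.8, 0.05, 0.002]  # yr-1
print("steady-state SOC:",
      round(steady_state_stock(litter_input, decay_rates), 1), "Mg C ha-1")

# RMSE between modelled and measured stocks, as used for the model comparison
modelled = np.array([112.0, 95.0, 80.0, 60.0])
measured = np.array([100.0, 90.0, 85.0, 70.0])
print("RMSE:", round(float(np.sqrt(np.mean((modelled - measured) ** 2))), 1), "Mg C")
```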

  19. Generating 3D building models from architectural drawings: a survey.

    PubMed

    Yin, Xuetao; Wonka, Peter; Razdan, Anshuman

    2009-01-01

    Automatically generating 3D building models from 2D architectural drawings has many useful applications in the architecture engineering and construction community. This survey of model generation from paper and CAD-based architectural drawings covers the common pipeline and compares various algorithms for each step of the process.

  20. Concordance between criteria for covariate model building.

    PubMed

    Hennig, Stefanie; Karlsson, Mats O

    2014-04-01

    When performing a population pharmacokinetic modelling analysis, covariates are often added to the model. Such additions are often justified by improved goodness of fit and/or a decrease in unexplained (random) parameter variability. Increased goodness of fit is most commonly measured by the decrease in the objective function value. Parameter variability can be defined as the sum of unexplained (random) and explained (predictable) variability. An increase in the magnitude of explained parameter variability could be another possible criterion for judging improvement in the model. The agreement between these three criteria was explored by diagnosing covariate-parameter relationships of different strengths and natures using stochastic simulations and estimations, as well as by assessing covariate-parameter relationships in four previously published real data examples. Total estimated parameter variability was found to vary with the number of covariates introduced on the parameter. In the simulated examples and two real examples, the parameter variability increased with increasing number of included covariates. For the other real examples, parameter variability decreased or did not change systematically with the addition of covariates. The three criteria were highly correlated, with the decrease in unexplained variability being more closely associated with changes in objective function values than increases in explained parameter variability were. The often used assumption that inclusion of covariates in models only shifts unexplained parameter variability to explained parameter variability appears not to be true, which may have implications for modelling decisions.

  1. Organic carbon stock modelling for the quantification of the carbon sinks in terrestrial ecosystems

    NASA Astrophysics Data System (ADS)

    Durante, Pilar; Algeet, Nur; Oyonarte, Cecilio

    2017-04-01

    Given the recent environmental policies derived from the serious threats caused by global change, practical measures to decrease net CO2 emissions have to be put in place. Regarding this, carbon sequestration is a major measure to reduce atmospheric CO2 concentrations within a short and medium term, where terrestrial ecosystems play a basic role as carbon sinks. The development of tools for the quantification, assessment and management of organic carbon in ecosystems, at different scales and under different management scenarios, is essential to achieve these commitments. The aim of this study is to establish a methodological framework for the modelling of this tool, applied to sustainable land use planning and management at spatial and temporal scales. The methodology for carbon stock estimation in ecosystems is based on merging estimates of the carbon stored in soils and in aerial biomass. For this purpose, both a spatial variability map of soil organic carbon (SOC) and algorithms for the calculation of forest species biomass will be created. For the modelling of the SOC spatial distribution at different map scales, it is necessary to harmonize and screen the available information from legacy soil databases. Subsequently, SOC modelling will be based on the SCORPAN model, a quantitative model used to assess the correlation among soil-forming factors measured at the same site location. These factors will be selected from both static (terrain morphometric variables) and dynamic variables (climatic variables and vegetation indexes such as NDVI), giving the model its spatio-temporal character. After the predictive model, spatial inference techniques will be used to achieve the final map and to extrapolate the data to areas without information (automated random forest regression kriging). The estimated uncertainty will be calculated to assess the model performance at different scales. The organic carbon of aerial biomass will be estimated using LiDAR (Light Detection And Ranging
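
    The "automated random forest regression kriging" step can be sketched as a random forest trend fitted on environmental covariates plus spatial interpolation of its residuals. In the sketch below a simple inverse-distance weighting stands in for the kriging of residuals, and all locations, covariates and SOC values are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 300
xy = rng.uniform(0, 100, size=(n, 2))                  # sample locations (km)
covariates = rng.normal(size=(n, 3))                   # e.g. elevation, NDVI, temperature
soc = (40 + 8 * covariates[:, 0] - 5 * covariates[:, 2]
       + 3 * np.sin(xy[:, 0] / 15) + rng.normal(0, 2, n))  # synthetic SOC (Mg C ha-1)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(covariates, soc)
residuals = soc - rf.predict(covariates)

def idw(target_xy, sample_xy, values, power=2.0):
    """Inverse-distance weighting of residuals (stand-in for residual kriging)."""
    d = np.linalg.norm(sample_xy - target_xy, axis=1)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return float(np.sum(w * values) / np.sum(w))

# Prediction at a new location = random-forest trend + interpolated residual
new_xy, new_cov = np.array([50.0, 50.0]), rng.normal(size=(1, 3))
prediction = rf.predict(new_cov)[0] + idw(new_xy, xy, residuals)
print("predicted SOC stock:", round(prediction, 1), "Mg C ha-1")
```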

  2. Estimating national forest carbon stocks and dynamics: combining models and remotely sensed information

    NASA Astrophysics Data System (ADS)

    Smallman, Luke; Williams, Mathew

    2016-04-01

    Forests are a critical component of the global carbon cycle, storing significant amounts of carbon, split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle - it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. It has been a major challenge to generate robust carbon budgets across landscapes due to data scarcity. Models have been used, but outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis Hastings - Markov Chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, in addition to forest planting and clear-felling information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information, with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha-1 yr-1, with a 95% confidence interval between -4.0 and -3.1 MgC ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha-1 yr-1. The analysis estimates the total forest biomass carbon stock in 2010 at 229 (177/232) TgC, while the NFI gives an estimated total forest biomass carbon stock of 216 TgC. Leaf carbon area (LCA) is a key plant trait which we are able to estimate using our analysis. Comparison of median LCA estimates retrieved from the analysis with a UK land cover map shows that higher and lower LCA values are estimated in areas dominated by needle leaf and broad leaf forests, respectively, consistent with ecological

  3. Temperature effects on stocks and stability of a phytoplankton-zooplankton model and the dependence on light and nutrients

    USGS Publications Warehouse

    Norberg, J.; DeAngelis, D.L.

    1997-01-01

    A model of a closed phytoplankton-zooplankton ecosystem was analyzed for effects of temperature on stocks and stability and for the dependence of these effects on light and the total nutrient concentration of the system. An analysis of the steady state equations showed that the effect of temperature on zooplankton and POM biomass was levelled off when primary production was nutrient limited. A temperature increase had a generally negative effect on all biomasses at high nutrient levels due to increased maintenance costs. Nutrient limitation of net primary production is the main factor governing the effect of temperature on stocks and flows as well as on the stability of the system. All components of the system, except for phytoplankton biomass, are proportional to net production and thus to the net effect of light on photosynthesis; however, temperature determines the slope of that relationship. The resilience of the system was measured by calculating the eigenvalues at the steady state. Under oligotrophic conditions, the system can be stable, but an increase in temperature can cause instability or a decrease in resilience. This conclusion is discussed in light of recent models that take spatial heterogeneity into account and display far more stable behavior, in better agreement with empirical data. Using simulations, we found that the amplitude of fluctuations of the herbivore stock increases with temperature while the mean biomass and minimum values decrease in comparison with steady state predictions.
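
    The resilience calculation mentioned above (eigenvalues at the steady state) can be sketched for a generic phytoplankton-zooplankton system as follows; the equations and parameter values are illustrative placeholders, not those of the analyzed model.

```python
# Sketch: locate the equilibrium of a generic P-Z model and check the eigenvalues
# of the numerically estimated Jacobian; negative real parts indicate local stability,
# and the least-negative real part sets the resilience (return rate).
import numpy as np
from scipy.optimize import fsolve

r, K, a, e, m = 1.0, 10.0, 0.4, 0.3, 0.2   # growth, capacity, grazing, efficiency, mortality

def rhs(state):
    P, Z = state
    dP = r * P * (1 - P / K) - a * P * Z
    dZ = e * a * P * Z - m * Z
    return [dP, dZ]

eq = fsolve(rhs, x0=[1.0, 1.0])            # interior equilibrium

def jacobian(state, eps=1e-6):
    J = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2); dx[j] = eps
        J[:, j] = (np.array(rhs(state + dx)) - np.array(rhs(state - dx))) / (2 * eps)
    return J

print("equilibrium:", eq)
print("eigenvalues:", np.linalg.eigvals(jacobian(eq)))
```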

  4. A Pathway Idea in Model Building

    NASA Astrophysics Data System (ADS)

    Mathai, A. M.; Haubold, H. J.

    2014-01-01

    The pathway idea is a way of going from one family of functions to another family of functions, and yet another family of functions, through a parameter in the model, so that a switching mechanism is introduced into the model through that parameter. The advantage of the idea is that the model can cover the ideal or stable situation in a physical system as well as the unstable neighborhoods, or describe the movement from unstable neighborhoods to the stable situation. The basic idea is illustrated for the real scalar case here, and its connections to topics in astrophysics and non-extensive statistical mechanics, namely superstatistics and Tsallis statistics, Mittag-Leffler models, hypergeometric functions and generalized special functions such as the H-function, etc., are pointed out. The pathway idea is available for the real and complex rectangular matrix variate cases, but only the real scalar case is illustrated here.
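
    For orientation, one commonly quoted real scalar form of the pathway model is reproduced below; normalizing constants and admissible parameter ranges are omitted, the notation is generic, and the exact form in the paper may differ.

```latex
% Illustrative real scalar pathway model: the pathway parameter \alpha switches the
% family from a generalized type-1 beta form (alpha < 1), through a type-2 beta form
% (alpha > 1), to a generalized gamma form in the limit alpha -> 1.
\begin{align*}
f_1(x) &= c_1\, x^{\gamma-1}\left[1 - a(1-\alpha)\,x^{\delta}\right]^{\frac{\eta}{1-\alpha}},
        && \alpha < 1,\quad 1 - a(1-\alpha)x^{\delta} > 0,\\
f_2(x) &= c_2\, x^{\gamma-1}\left[1 + a(\alpha-1)\,x^{\delta}\right]^{-\frac{\eta}{\alpha-1}},
        && \alpha > 1,\\
f_3(x) &= c_3\, x^{\gamma-1}\, e^{-a\eta x^{\delta}},
        && \alpha \to 1 .
\end{align*}
```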

  5. Impact of the U.S. National Building Information Model Standard (NBIMS) on Building Energy Performance Simulation

    SciTech Connect

    Bazjanac, Vladimir

    2007-08-01

    The U.S. National Institute of Building Sciences (NIBS) started the development of the National Building Information Model Standard (NBIMS). Its goal is to define standard sets of data required to describe any given building in the necessary detail so that any given AECO industry discipline application can find needed data at any point in the building lifecycle. This will include all data that are used in or are pertinent to building energy performance simulation and analysis. This paper describes the background that led to the development of NBIMS, its goals and development methodology, its Part 1 (Version 1.0), and its probable impact on building energy performance simulation and analysis.

  6. Time-varying volatility in Malaysian stock exchange: An empirical study using multiple-volatility-shift fractionally integrated model

    NASA Astrophysics Data System (ADS)

    Cheong, Chin Wen

    2008-02-01

    This article investigated the influences of structural breaks on the fractionally integrated time-varying volatility model in the Malaysian stock markets which included the Kuala Lumpur composite index and four major sectoral indices. A fractionally integrated time-varying volatility model combined with sudden changes is developed to study the possibility of structural change in the empirical data sets. Our empirical results showed substantial reduction in fractional differencing parameters after the inclusion of structural change during the Asian financial and currency crises. Moreover, the fractionally integrated model with sudden change in volatility performed better in the estimation and specification evaluations.
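
    A standard way to gauge the long-memory behaviour described above is the Geweke-Porter-Hudak (GPH) log-periodogram regression applied to a volatility proxy such as squared returns; the sketch below uses synthetic data and is offered only as an illustration, not as the paper's estimator.

```python
# GPH log-periodogram regression: regress log I(lambda_j) on -2*log(2*sin(lambda_j/2))
# over the lowest frequencies; the slope approximates the fractional differencing d.
import numpy as np

def gph_estimate(x, frac=0.5):
    """Return the GPH estimate of d from series x, using n**frac low frequencies."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    m = int(n ** frac)                               # number of low-frequency ordinates
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -2 * np.log(2 * np.sin(freqs / 2))
    return np.polyfit(regressor, np.log(periodogram), 1)[0]   # slope ~ d

# Example with synthetic returns (replace with KLCI or sectoral index returns):
rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=2000) * 0.01
print("d estimate for squared returns:", gph_estimate(returns ** 2))
```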

  7. Application of the ORCHIDEE global vegetation model to evaluate biomass and soil carbon stocks of Qinghai-Tibetan grasslands

    NASA Astrophysics Data System (ADS)

    Tan, Kun; Ciais, Philippe; Piao, Shilong; Wu, Xiaopu; Tang, Yanhong; Vuichard, Nicolas; Liang, Shuang; Fang, Jingyun

    2010-03-01

    The cold grasslands of the Qinghai-Tibetan Plateau form a globally significant biome, which represents 6% of the world's grasslands and 44% of China's grasslands. Yet little is known about carbon cycling in this biome. In this study, we calibrated and applied a process-based ecosystem model called Organizing Carbon and Hydrology in Dynamic Ecosystems (ORCHIDEE) to estimate the C fluxes and stocks of these grasslands. First, the parameterizations of ORCHIDEE were improved and calibrated against multiple time-scale and spatial-scale observations of (1) eddy-covariance fluxes of CO2 above one alpine meadow site; (2) soil temperature collocated with 30 meteorological stations; (3) satellite leaf area index (LAI) data collocated with the meteorological stations; and (4) soil organic carbon (SOC) density profiles from China's Second National Soil Survey. The extensive SOC survey data were used to extrapolate local fluxes to the entire grassland biome. After calibration, we show that ORCHIDEE can successfully capture the seasonal variation of net ecosystem exchange (NEE), as well as the LAI and SOC spatial distribution. We applied the calibrated model to estimate 0.3 Pg C yr-1 (1 Pg = 1015 g) of total annual net primary productivity (NPP), 0.4 Pg C of vegetation total biomass (aboveground and belowground), and 12 Pg C of SOC stocks for Qinghai-Tibetan grasslands covering an area of 1.4 × 106 km2. The mean annual NPP, vegetation biomass, and soil carbon stocks decrease from the southeast to the northwest, along with precipitation gradients. Our results also suggest that in response to an increase of temperature by 2°C, approximately 10% of current SOC stocks in Qinghai-Tibetan grasslands could be lost, even though NPP increases by about 9%. This result implies that Qinghai-Tibetan grasslands may be a vulnerable component of the terrestrial carbon cycle to future climate warming.

  8. Building Action Principles for Extended MHD Models

    NASA Astrophysics Data System (ADS)

    Keramidas Charidakos, Ioannis; Lingam, Manasvi; Morrison, Philip; White, Ryan; Wurm, Alexander

    2014-10-01

    The general, non-dissipative, two-fluid model in plasma physics is Hamiltonian, but this property is sometimes lost in the process of deriving simplified two-fluid or one-fluid models from the two-fluid equations of motion. One way to ensure that the reduced models are Hamiltonian is to derive them from an action. We start with the general two-fluid action functional for an electron and an ion fluid interacting with an electromagnetic field, expressed in Lagrangian variables. We perform a change of variables and make various approximations (e.g., quasineutrality and ordering of the fields) and small parameter expansions directly in the action. The resulting equations of motion are then mapped to the Eulerian fluid variables using a novel nonlocal Lagrange-Euler map. The correct Eulerian equations are obtained after we impose locality. Using this method and the proper approximations and expansions, we recover Lüst's general two-fluid model, extended MHD, Hall MHD, and Electron MHD from a unified framework. The variational formulation allows us to use Noether's theorem to derive conserved quantities for each symmetry of the action. U.S. Dept. of Energy Contract # DE-FG05-80ET-53088, Western New England University Research Fund.

  9. Evolutionary Tuning of Building Models to Monthly Electrical Consumption

    SciTech Connect

    Garrett, Aaron; New, Joshua Ryan; Chandler, Theodore

    2013-01-01

    Building energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Calibrating models is costly because it is currently an art which requires significant manual effort by an experienced and skilled professional. An automated methodology could significantly decrease this cost and facilitate greater adoption of energy simulation capabilities into the marketplace. The Autotune project is a novel methodology which leverages supercomputing, large databases of simulation data, and machine learning to allow automatic calibration of simulations to match measured experimental data on commodity hardware. This paper shares initial results from the automated methodology applied to the calibration of building energy models (BEM) for EnergyPlus (E+) to reproduce measured monthly electrical data.

  10. Allometric Models for Predicting Aboveground Biomass and Carbon Stock of Tropical Perennial C4 Grasses in Hawaii.

    PubMed

    Youkhana, Adel H; Ogoshi, Richard M; Kiniry, James R; Meki, Manyowa N; Nakahata, Mae H; Crow, Susan E

    2017-01-01

    Biomass is a promising renewable energy option that provides a more environmentally sustainable alternative to fossil resources by reducing the net flux of greenhouse gasses to the atmosphere. Yet, allometric models that allow non-destructive prediction of aboveground biomass (AGB) and biomass carbon (C) stock have not yet been developed for tropical perennial C4 grasses currently under consideration as potential bioenergy feedstock in Hawaii and other subtropical and tropical locations. The objectives of this study were to develop optimal allometric relationships and site-specific models to predict AGB and biomass C stock of napiergrass, energycane, and sugarcane under cultivation practices for renewable energy, and to validate these site-specific models against independent data sets generated from sites with widely different environments. Several allometric models were developed for each species from data at a low-elevation field on the island of Maui, Hawaii. A simple power model with stalk diameter (D) was best related to AGB and biomass C stock for napiergrass, energycane, and sugarcane (R2 = 0.98, 0.96, and 0.97, respectively). The models were then tested against data collected from independent fields across an environmental gradient. For all crops, the models over-predicted AGB in plants with lower stalk D, but under-predicted AGB in plants with higher stalk D. The models using stalk D were better for biomass prediction than dewlap H (height from the base cut to the most recently exposed leaf dewlap) models, which showed weak validation performance. Although the stalk D model performed better, the systematic mean square error (MSE) ranged from 23 to 43% of the total MSE for all crops. A strong relationship existed between the model coefficient and rainfall, even though these were irrigated systems, suggesting a simple site-specific coefficient modulator for rainfall to reduce systematic errors in water-limited areas. These allometric equations provide a
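
    The simple power model reported above can be fitted by ordinary least squares on a log-log scale, as sketched below with invented plot data; the fitted coefficients are placeholders, not the paper's values.

```python
# Fit the power-law allometry AGB = a * D**b by linear regression on log-transformed data.
import numpy as np
import pandas as pd

plots = pd.DataFrame({
    "stalk_diameter_cm": [1.2, 1.8, 2.3, 2.9, 3.4],   # hypothetical measurements
    "agb_kg":            [0.21, 0.55, 1.02, 1.85, 2.70],
})

# ln(AGB) = ln(a) + b * ln(D)
b, ln_a = np.polyfit(np.log(plots["stalk_diameter_cm"]), np.log(plots["agb_kg"]), 1)
a = np.exp(ln_a)

def predict_agb(d_cm):
    return a * d_cm ** b

print(f"AGB ~ {a:.3f} * D^{b:.3f}")
print("predicted AGB at D = 2.5 cm:", predict_agb(2.5))
# Biomass C stock would follow by multiplying predicted AGB by a carbon fraction.
```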

  11. Allometric Models for Predicting Aboveground Biomass and Carbon Stock of Tropical Perennial C4 Grasses in Hawaii

    DOE PAGES

    Youkhana, Adel H.; Ogoshi, Richard M.; Kiniry, James R.; ...

    2017-05-02

    Biomass is a promising renewable energy option that provides a more environmentally sustainable alternative to fossil resources by reducing the net flux of greenhouse gasses to the atmosphere. Yet, allometric models that allow non-destructive prediction of aboveground biomass (AGB) and biomass carbon (C) stock have not yet been developed for tropical perennial C4 grasses currently under consideration as potential bioenergy feedstock in Hawaii and other subtropical and tropical locations. The objectives of this study were to develop optimal allometric relationships and site-specific models to predict AGB and biomass C stock of napiergrass, energycane, and sugarcane under cultivation practices for renewable energy, and to validate these site-specific models against independent data sets generated from sites with widely different environments. Several allometric models were developed for each species from data at a low-elevation field on the island of Maui, Hawaii. A simple power model with stalk diameter (D) was best related to AGB and biomass C stock for napiergrass, energycane, and sugarcane (R2 = 0.98, 0.96, and 0.97, respectively). The models were then tested against data collected from independent fields across an environmental gradient. For all crops, the models over-predicted AGB in plants with lower stalk D, but under-predicted AGB in plants with higher stalk D. The models using stalk D were better for biomass prediction than dewlap H (height from the base cut to the most recently exposed leaf dewlap) models, which showed weak validation performance. Although the stalk D model performed better, the systematic mean square error (MSE) ranged from 23 to 43% of the total MSE for all crops. A strong relationship existed between the model coefficient and rainfall, even though these were irrigated systems, suggesting a simple site-specific coefficient modulator for rainfall to reduce systematic errors in water-limited areas. These allometric equations

  12. Exploitation of Semantic Building Model in Indoor Navigation Systems

    NASA Astrophysics Data System (ADS)

    Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min

    2009-04-01

    There are many types of indoor and outdoor navigation tools and methodologies available. The majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but for large-scale building environments such as hospitals, governmental offices, etc., the end-user will need more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution that is based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android concepts to support the realization of the results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. The built environment is a central factor in our daily life, and a large portion of human life is spent inside buildings. Traditionally, buildings are documented using maps and plans produced with IT tools such as computer-aided design (CAD) applications. Documenting maps electronically is already pervasive, but CAD drawings do not satisfy the requirements for effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with advances in emerging technologies such as GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve interpretability for building inhabitants and building visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication

  13. Building metaphors and extending models of grief.

    PubMed

    VandeCreek, L

    1985-01-01

    Persons in grief turn to metaphors as they seek to understand and express their experience. Metaphors illustrated in this article include "grief is a whirlwind," "grief is the Great Depression all over again" and "grief is gray, cloudy and rainy weather." Hospice personnel can enhance their bereavement efforts by identifying and cultivating the expression of personal metaphors from patients and families. Two metaphors have gained wide cultural acceptance and lie behind contemporary scientific explorations of grief. These are "grief is recovery from illness" (Bowlby and Parkes) and "death is the last stage of growth and grief is the adjustment reaction to this growth" (Kubler-Ross). These models have developed linear perspectives of grief but have neglected to study the fluctuating intensity of symptoms. Adopting Worden's four-part typology of grief, the author illustrates how the pie graph can be used to display this important aspect of the grief experience, thus enhancing these models.

  14. Visualization and model building in medical imaging.

    PubMed

    McDonald, J P; Siebert, J P; Fryer, R J; Urquhart, C W

    1994-01-01

    We present technologies and ideas, developed from the JFIT 'Active Stereo Probe Project', which are applicable to problems within medical measurement and monitoring. Two related areas are considered. The first concerns patient body surface modelling. During the project two state-of-the-art non-contact surface measurement techniques have been developed which are applicable to medical situations requiring dense and accurate body surface modelling. Such applications include, for example, prosthetic appliance fabrication, presurgical planning and non-invasive deformity analysis. The second is concerned with overlay projection. Using this enabling technology the information content of a scene can be enhanced as an aid to medical personnel. Results and illustrative applications of the newly developed technology are presented.

  15. The Schwarzschild Method for Building Galaxy Models

    NASA Astrophysics Data System (ADS)

    de Zeeuw, P. T.

    1998-09-01

    Martin Schwarzschild is most widely known as one of the towering figures of the theory of stellar evolution. However, from the early fifties onward he displayed a strong interest in dynamical astronomy, and in particular in its application to the structure of star clusters and galaxies. This resulted in a string of remarkable investigations, including the discovery of what became known as the Spitzer-Schwarzschild mechanism, the invention of the strip count method for mass determinations, the demonstration of the existence of dark matter on large scales, and the study of the nucleus of M31, based on his own Stratoscope II balloon observations. With his retirement approaching he decided to leave the field of stellar evolution, and to make his lifelong hobby of stellar dynamics a full-time occupation, and to tackle the problem of self-consistent equilibria for elliptical galaxies, which by then were suspected to have a triaxial shape. Rather than following classical methods, which had trouble already in dealing with axisymmetric systems, he invented a simple numerical technique, which seeks to populate individual stellar orbits in the galaxy potential so as to reproduce the associated model density. This is now known as Schwarzschild's method. He showed by numerical calculation that most stellar orbits in a triaxial potential relevant for elliptical galaxies have two effective integrals of motion in addition to the classical energy integral, and then constructed the first ever self-consistent equilibrium model for a realistic triaxial galaxy. This provided a very strong stimulus to research in the dynamics of flattened galaxies. This talk will review how Schwarzschild's Method is used today, in problems ranging from the existence of equilibrium models as a function of shape, central cusp slope, tumbling rate, and presence of a central point mass, to modeling of individual galaxies to find stellar dynamical evidence for dark matter in extended halos, and/or massive

  16. Building Qualitative Models of Thermodynamic Processes

    DTIC Science & Technology

    2007-01-01

    [Only fragments of this report are available: list-of-figures entries on a definition for fluid flow, modifying flow rates according to conductance assumptions, and transfer of heat during fluid flow, together with body text on enforcing the consistent use of modeling assumptions (e.g., if an object is assumed to be a Thermal-Physob, that assumption must be applied consistently) and on instances whose :type and :conditions modifiers cannot all hold.]

  17. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    NASA Astrophysics Data System (ADS)

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and elucidate what kind of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices to participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  18. Involving stakeholders in building integrated fisheries models using Bayesian methods.

    PubMed

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and elucidate what kind of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices to participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  19. A stock-flow consistent input-output model with applications to energy price shocks, interest rates, and heat emissions

    NASA Astrophysics Data System (ADS)

    Berg, Matthew; Hartley, Brian; Richters, Oliver

    2015-01-01

    By synthesizing stock-flow consistent models, input-output models, and aspects of ecological macroeconomics, a method is developed to simultaneously model monetary flows through the financial system, flows of produced goods and services through the real economy, and flows of physical materials through the natural environment. This paper highlights the linkages between the physical environment and the economic system by emphasizing the role of the energy industry. A conceptual model is developed in general form with an arbitrary number of sectors, while emphasizing connections with the agent-based, econophysics, and complexity economics literature. First, we use the model to challenge claims that 0% interest rates are a necessary condition for a stationary economy and conduct a stability analysis within the parameter space of interest rates and consumption parameters of an economy in stock-flow equilibrium. Second, we analyze the role of energy price shocks in contributing to recessions, incorporating several propagation and amplification mechanisms. Third, implied heat emissions from energy conversion and the effect of anthropogenic heat flux on climate change are considered in light of a minimal single-layer atmosphere climate model, although the model is only implicitly, not explicitly, linked to the economic model.
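
    To illustrate the stock-flow consistent bookkeeping such models rest on, the sketch below simulates the textbook "SIM" structure of Godley and Lavoie rather than the paper's input-output model: every flow has a source and a sink, and the single stock (household money) accumulates the household surplus each period. Parameter values are arbitrary.

```python
# Toy stock-flow consistent simulation (textbook SIM structure), for illustration only.
alpha1, alpha2, theta = 0.6, 0.4, 0.2     # propensities to consume out of income / wealth, tax rate
G = 20.0                                  # government spending per period
H = 0.0                                   # household money stock (the stock variable)

for _ in range(60):
    # Solve the within-period flows (Y = C + G, T = theta*Y, YD = Y - T,
    # C = alpha1*YD + alpha2*H) by simple fixed-point iteration.
    C = 0.0
    for _ in range(200):
        Y = C + G
        YD = Y - theta * Y
        C = alpha1 * YD + alpha2 * H
    H += YD - C                           # the stock accumulates the household surplus

print("steady-state income Y  ~", round(Y, 2))    # converges to G/theta = 100
print("steady-state money H   ~", round(H, 2))
```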

  20. Stabilities of Stock States in Chinese Stock Markets

    NASA Astrophysics Data System (ADS)

    Lim, Gyuchang; Seo, Kyungho; Kim, Soo Yong; Kim, Kyungsik

    We study the evolution of correlation-based clusters of stocks, which usually accord with business groups. By segmenting the whole time series into several overlapping segments, we trace the dynamical evolution of each business sector in terms of the multi-factor model, and in particular treat the stock prices of the Shanghai composites, which are not incorporated into developed markets such as those of the Financial Times Stock Exchange (FTSE) index.

  1. Fractional Market Model and its Verification on the Warsaw STOCK Exchange

    NASA Astrophysics Data System (ADS)

    Kozłowska, Marzena; Kasprzak, Andrzej; Kutner, Ryszard

    We analyzed the rising and relaxation of the cusp-like local peaks superposed with oscillations which were well defined by the Warsaw Stock Exchange index WIG in a daily time horizon. We found that the falling paths of all index peaks were described by a generalized exponential function or the Mittag-Leffler (ML) one superposed with various types of oscillations. However, the rising paths (except the first one of WIG which rises exponentially and the most important last one which rises again according to the ML function) can be better described by bullish anti-bubbles or inverted bubbles [2-4]. The ML function superposed with oscillations is a solution of the nonhomogeneous fractional relaxation equation which defines here our Fractional Market Model (FMM) of index dynamics which can be also called the Rheological Model of Market. This solution is a generalized analog of an exactly solvable fractional version of the Standard or Zener Solid Model of viscoelastic materials commonly used in modern rheology [5]. For example, we found that the falling paths of the index can be considered to be a system in the intermediate state lying between two complex ones, defined by short and long-time limits of the Mittag-Leffler function; these limits are given by the Kohlrausch-Williams-Watts (KWW) law for the initial times, and the power-law or the Nutting law for asymptotic time. Some rising paths (i.e., the bullish anti-bubbles) are a kind of log-periodic oscillations of the market in the bullish state initiated by a crash. The peaks of the index can be viewed as precritical or precrash ones since: (i) the financial market changes its state too early from the bullish to bearish one before it reaches a scaling region (defined by the diverging power-law of return per unit time), and (ii) they are affected by a finite size effect. These features could be a reminiscence of a significant risk aversion of the investors and their finite number, respectively. However, this means that the
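
    Because the relaxation paths are described by the Mittag-Leffler function, a naive series evaluation is sketched below; it is adequate for moderate arguments and is offered only as an illustration, not as the numerics used in the paper.

```python
# Evaluate the one-parameter Mittag-Leffler function E_alpha(z) = sum_k z**k / Gamma(alpha*k + 1)
# by direct series summation, and use it for a relaxation curve f(t) = E_alpha(-(t/tau)**alpha),
# which interpolates between stretched-exponential (KWW) and power-law behaviour.
from math import gamma

def mittag_leffler(z, alpha, n_terms=200):
    return sum(z ** k / gamma(alpha * k + 1) for k in range(n_terms))

tau, alpha = 30.0, 0.8          # illustrative relaxation time and exponent
for t in (1, 10, 100):
    print(t, mittag_leffler(-(t / tau) ** alpha, alpha))
```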

  2. Current State of the Art Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  3. Modelling the Carbon Stocks Estimation of the Tropical Lowland Dipterocarp Forest Using LIDAR and Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Zaki, N. A. M.; Latif, Z. A.; Suratman, M. N.; Zainal, M. Z.

    2016-06-01

    Tropical forest holds a large stock of carbon in the global carbon cycle and contributes an enormous amount of above- and below-ground biomass. The carbon kept in the aboveground living biomass of trees is typically the largest pool and the most directly impacted by anthropogenic factors such as deforestation and forest degradation. However, few studies have modelled carbon for tropical rain forest, and the quantification still carries large uncertainties. Multiple linear regression (MLR) is one method for defining the relationship between field inventory measurements and statistics extracted from remotely sensed data, here LiDAR and WorldView-3 (WV-3) imagery. This paper highlights the development of a model that fuses multispectral WV-3 with LiDAR metrics to estimate the carbon of the tropical lowland Dipterocarp forest of the study area. The over-segmentation and under-segmentation values for this output are 0.19 and 0.11 respectively, and the D-value for the classification is 0.19, corresponding to 81% accuracy. Overall, this study produced significant correlation coefficients (r) between crown projection area (CPA) and carbon stocks (CS), between height from LiDAR (H_LDR) and CS, and between CPA and H_LDR of 0.671, 0.709 and 0.549, respectively. The CPA from the segmentation was found to be spatially representative, with a high correlation between diameter at breast height (DBH) and carbon stocks (Pearson correlation, p = 0.000, p < 0.01, r = 0.909), showing that there is a good relationship between carbon and the DBH predictor that can improve inventory estimates of carbon using the multiple linear regression method. The study concluded that the integration of WV-3 imagery with the LiDAR-based CHM raster was useful for quantifying AGB and carbon stocks for a larger sample area of the lowland Dipterocarp forest.
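
    The multiple-linear-regression step (carbon stock as a function of CPA and LiDAR height) can be sketched as follows; the plot data and column names are hypothetical placeholders for the field measurements.

```python
# Carbon stock ~ crown projection area (CPA) + LiDAR-derived height, via ordinary least squares.
import pandas as pd
import statsmodels.formula.api as smf

plots = pd.DataFrame({
    "carbon_kg": [120.0, 185.0, 240.0, 310.0, 150.0, 275.0],
    "cpa_m2":    [18.0, 26.0, 34.0, 41.0, 21.0, 38.0],
    "h_lidar_m": [14.0, 18.0, 22.0, 27.0, 16.0, 24.0],
})

model = smf.ols("carbon_kg ~ cpa_m2 + h_lidar_m", data=plots).fit()
print(model.params)        # intercept and coefficients for CPA and LiDAR height
print(model.rsquared)

# Prediction for a newly segmented crown:
new_crown = pd.DataFrame({"cpa_m2": [30.0], "h_lidar_m": [20.0]})
print(model.predict(new_crown))
```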

  4. Dynamic stock and end-of-life flow identification based on the internal cycle model and mean-age monitoring.

    PubMed

    Tsiliyannis, Christos Aristeides

    2014-07-01

    Planning of end-of-life (EoL) product take-back systems and sizing of dismantling and recycling centers entail knowledge of the EoL flow (EoLF) that originates from the product dynamic stock (DS). Several uncertain factors (economic, technological, health, social and environmental) render both the EoLF and the remaining stock uncertain. Early losses of products during use due to biodegradation, wear and uncertain factors such as withdrawals and exports of used products may diminish the stock and the EoLF. Life-expectancy prediction methods are static, ignore early losses and are inapt under dynamic conditions. Existing dynamic methods either consider a single uncertain factor (e.g. GDP), approximately or heuristically modelled, and ignore other factors that may become dominant, or assume knowledge of the DS and of the center axis of the EoL exit distribution, which are unknown for most products. As a result, reliable dynamic EoLF prediction for both durables and consumer end-products is still challenging. The present work develops an identification method for estimating the early loss and DS and predicting the dynamic EoLF, based on available input data (production + net imports) and on sampled measurements of the stock mean-age and the EoLF mean-age. The mean ages are scaled quantities that vary slowly, even under dynamic conditions, and can be reliably determined even from small-size and/or frequent samples. The method identifies the early loss sequence, as well as the center axis and spread of the EoL exit distribution, which are subsequently used to determine the DS and EoLF profiles, enabling consistent and reliable predictions.
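
    A conventional inflow-driven dynamic stock model, shown below for contrast with the identification method proposed in the paper, convolves past inflows with a lifetime survival curve to obtain the in-use stock and the EoL flow; all numbers are illustrative, and early losses could be represented by steepening the survival curve.

```python
# Inflow-driven dynamic stock model: stock(t) = sum_j inflow(j) * survival(t - j),
# EoL flow(t) = sum_j inflow(j) * [survival(age-1) - survival(age)].
import numpy as np
from scipy.stats import norm

years = np.arange(2000, 2021)
inflow = np.linspace(80.0, 120.0, len(years))        # production + net imports (units/yr)

mean_life, sd = 8.0, 2.5                             # placeholder lifetime distribution
ages = np.arange(len(years))
survival = 1.0 - norm.cdf(ages, loc=mean_life, scale=sd)   # P(still in use at age a)

stock = np.zeros(len(years))
eol_flow = np.zeros(len(years))
for i in range(len(years)):
    for j in range(i + 1):                           # cohort that entered in year j
        age = i - j
        stock[i] += inflow[j] * survival[age]
        prev = 1.0 if age == 0 else survival[age - 1]
        eol_flow[i] += inflow[j] * (prev - survival[age])

print("stock in 2020:", round(stock[-1], 1))
print("EoL flow in 2020:", round(eol_flow[-1], 1))
```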

  5. Model-building codes for membrane proteins.

    SciTech Connect

    Shirley, David Noyes; Hunt, Thomas W.; Brown, W. Michael; Schoeniger, Joseph S.; Slepoy, Alexander; Sale, Kenneth L.; Young, Malin M.; Faulon, Jean-Loup Michel; Gray, Genetha Anne

    2005-01-01

    We have developed a novel approach to modeling the transmembrane spanning helical bundles of integral membrane proteins using only a sparse set of distance constraints, such as those derived from MS3-D, dipolar-EPR and FRET experiments. Algorithms have been written for searching the conformational space of membrane protein folds matching the set of distance constraints, which provides initial structures for local conformational searches. Local conformation search is achieved by optimizing these candidates against a custom penalty function that incorporates both measures derived from statistical analysis of solved membrane protein structures and distance constraints obtained from experiments. This results in refined helical bundles to which the interhelical loops and amino acid side-chains are added. Using a set of only 27 distance constraints extracted from the literature, our methods successfully recover the structure of dark-adapted rhodopsin to within 3.2 Å of the crystal structure.

  6. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.

    1992-01-01

    The key elements in the second year (1991-92) of our project are: (1) implementation of the distributed system prototype; (2) successful passing of the candidacy examination and a PhD proposal acceptance by the funded student; (3) design of storage efficient schemes for replicated distributed systems; and (4) modeling of gracefully degrading reliable computing systems. In the third year of the project (1992-93), we propose to: (1) complete the testing of the prototype; (2) enhance the functionality of the modules by enabling the experimentation with more complex protocols; (3) use the prototype to verify the theoretically predicted performance of locking protocols, etc.; and (4) work on issues related to real-time distributed systems. This should result in efficient protocols for these systems.

  7. Modeling of heat and mass transfer in lateritic building envelopes

    NASA Astrophysics Data System (ADS)

    Meukam, Pierre; Noumowe, Albert

    2005-12-01

    The aim of the present work is to investigate the behavior of building envelopes made of local lateritic soil bricks subjected to different climatic conditions. The building envelopes studied in this work consist of lateritic soil bricks incorporating natural pozzolan or sawdust in order to obtain materials with low thermal conductivity and low density. To describe coupled heat and moisture transfer in wet porous materials, the coupled equations were solved by introducing diffusion coefficients. A numerical model, HMtrans, developed for prediction of heat and moisture transfer in multi-layered building components, was used to simulate the temperature, water content and relative humidity profiles within the building envelopes. The results allow prediction of how long building walls can remain exposed to the local weather conditions. They show that the durability of building envelopes made of lateritic soil bricks incorporating natural pozzolan or sawdust is not strongly affected by the climatic conditions in tropical and equatorial areas.
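
    As a very reduced illustration of the transport modelled by HMtrans-type codes, the sketch below solves only the 1-D transient heat-conduction half of the problem with an explicit finite-difference scheme; the coupled moisture equation would add an analogous diffusion equation with moisture-dependent coefficients, and the material properties used are rough placeholders for a lateritic soil brick.

```python
# Explicit finite-difference solution of 1-D transient heat conduction through a wall layer.
import numpy as np

k, rho, cp = 0.6, 1800.0, 900.0          # W/m.K, kg/m3, J/kg.K (illustrative values)
alpha = k / (rho * cp)                   # thermal diffusivity, m2/s

L, nx = 0.15, 31                         # wall thickness (m), number of grid nodes
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha                 # stable time step for the explicit scheme

T = np.full(nx, 24.0)                    # initial wall temperature (deg C)
T_out, T_in = 32.0, 24.0                 # outdoor / indoor boundary temperatures (deg C)

t, horizon = 0.0, 12 * 3600.0            # simulate 12 hours
while t < horizon:
    T[0], T[-1] = T_out, T_in            # Dirichlet boundaries (simplified)
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    t += dt

print("temperature profile after 12 h:", np.round(T, 2))
```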

  8. Improving Traditional Building Repair Construction Quality Using Historic Building Information Modeling Concept

    NASA Astrophysics Data System (ADS)

    Wu, T. C.; Lin, Y. C.; Hsu, M. F.; Zheng, N. W.; Chen, W. L.

    2013-07-01

    In addition to following the repair principles contemplated by heritage experts, the repair construction process should be recorded and measured at any time for monitoring to ensure the quality of the repair. Conventional construction record methods mostly depend on localized shooting of 2D digital images, coupled with text and tables for illustration, to achieve the purpose of monitoring. Such methods cannot fully and comprehensively record the 3D spatial relationships in the real world. Therefore, the construction records of traditional buildings are very important but cannot serve their function due to technical limitations. This study applied 3D laser scanning technology to establish a 3D point cloud model for the repair construction of historical buildings. It also broke down the detailed components of the 3D point cloud model using the concept of historic building information modeling, and established the 3D models of the various components and their attribute data in a 3DGIS platform database. During construction, according to the time of completion of each stage as laid out in the construction project, this study conducted 3D laser scanning and database establishment for each stage, and applied comparison and analysis of the 3DGIS spatial and attribute information to analyse the differences between the completed stages, thereby improving the quality of traditional building repair construction. This method helps to improve the quality of repair construction work on tangible cultural assets worldwide. The established 3DGIS platform can be used as a powerful tool for subsequent management and maintenance.

  9. Combined Grammar for the Modeling of Building Interiors

    NASA Astrophysics Data System (ADS)

    Becker, S.; Peter, M.; Fritsch, D.; Philipp, D.; Baier, P.; Dibak, C.

    2013-11-01

    As spatial grammars have proven successful and efficient to deliver LOD3 models, the next challenge is their extension to indoor applications, leading to LOD4 models. Therefore, a combined indoor grammar for the automatic generation of indoor models from erroneous and incomplete observation data is presented. In building interiors where inaccurate observation data is available, the grammar can be used to make the reconstruction process robust, and verify the reconstructed geometries. In unobserved building interiors, the grammar can generate hypotheses about possible indoor geometries matching the style of the rest of the building. The grammar combines concepts from L-systems and split grammars. It is designed in such way that it can be derived from observation data fully automatically. Thus, manual predefinitions of the grammar rules usually required to tune the grammar to a specific building style, become obsolete. The potential benefit of using our grammar as support for indoor modeling is evaluated based on an example where the grammar has been applied to automatically generate an indoor model from erroneous and incomplete traces gathered by foot-mounted MEMS/IMU positioning systems.

  10. Evaluation study of building-resolved urban dispersion models

    SciTech Connect

    Flaherty, Julia E.; Allwine, K Jerry; Brown, Mike J.; Coirier, WIlliam J.; Ericson, Shawn C.; Hansen, Olav R.; Huber, Alan H.; Kim, Sura; Leach, Martin J.; Mirocha, Jeff D.; Newsom, Rob K.; Patnaik, Gopal; Senocak, Inanc

    2007-09-10

    For effective emergency response and recovery planning, it is critically important that building-resolved urban dispersion models be evaluated using field data. Several full-physics computational fluid dynamics (CFD) models and semi-empirical building-resolved (SEB) models are being advanced and applied to simulating flow and dispersion in urban areas. To obtain an estimate of the current state-of-readiness of these classes of models, the Department of Homeland Security (DHS) funded a study to compare five CFD models and one SEB model with tracer data from the extensive Midtown Manhattan field study (MID05) conducted during August 2005 as part of the DHS Urban Dispersion Program (UDP; Allwine and Flaherty 2007). Six days of tracer and meteorological experiments were conducted over an approximately 2-km-by-2-km area in Midtown Manhattan just south of Central Park in New York City. A subset of these data was used for model evaluations. The study was conducted such that an evaluation team, independent of the six modeling teams, provided all the input data (e.g., building data, meteorological data and tracer release rates) and run conditions for each of four experimental periods simulated. Tracer concentration data for two of the four experimental periods were provided to the modeling teams for their own evaluation of their respective models to ensure proper setup and operation. Tracer data were not provided for the second two experimental periods to provide for an independent evaluation of the models. The tracer concentrations resulting from the model simulations were provided to the evaluation team in a standard format for consistency in inter-comparing model results. An overview of the model evaluation approach will be given, followed by a discussion on the qualitative comparison of the respective models with the field data. Future model development efforts needed to address modeling gaps identified in this study will also be discussed.

  11. Simulation of changes in arctic terrestrial carbon stocks using the ecosys mathematical model

    NASA Astrophysics Data System (ADS)

    Metivier, K.; Grant, R. F.; Humphreys, E. R.; Lafleur, P.; Zhang, H.

    2010-12-01

    better represented. The study showed the importance of using the ecosys mathematical model, in conjunction with measured data, to assess both the short- and long-term sustainability of these northern ecosystems. The research will also allow recommendations of sustainable (soil, water, air, plant and other habitat quality) best management practices (BMPs) for northern ecosystems in Canada and around the world, today and tomorrow. A healthy environment will in turn help people in northern communities, e.g. with food, water and economic security. This research will contribute to many other areas, e.g. quantification of carbon stocks in inventories, carbon trading, IPCC Tier III methodology for the Kyoto Protocol, and policy decisions. We hope that the research can contribute to avoiding climate change, since climate change may disrupt the sustainability of these ecosystems that are vital for northern communities, as well as affect other regions of the world, such as the Tropics, more negatively.

  12. The Use of Mixed Effects Models for Obtaining Low-Cost Ecosystem Carbon Stock Estimates in Mangroves of the Asia-Pacific.

    PubMed

    Bukoski, Jacob J; Broadhead, Jeremy S; Donato, Daniel C; Murdiyarso, Daniel; Gregoire, Timothy G

    2017-01-01

    Mangroves provide extensive ecosystem services that support local livelihoods and international environmental goals, including coastal protection, biodiversity conservation and the sequestration of carbon (C). While voluntary C market projects seeking to preserve and enhance forest C stocks offer a potential means of generating finance for mangrove conservation, their implementation faces barriers due to the high costs of quantifying C stocks through field inventories. To streamline C quantification in mangrove conservation projects, we develop predictive models for (i) biomass-based C stocks, and (ii) soil-based C stocks for the mangroves of the Asia-Pacific. We compile datasets of mangrove biomass C (197 observations from 48 sites) and soil organic C (99 observations from 27 sites) to parameterize the predictive models, and use linear mixed effect models to model the expected C as a function of stand attributes. The most parsimonious biomass model predicts total biomass C stocks as a function of both basal area and the interaction between latitude and basal area, whereas the most parsimonious soil C model predicts soil C stocks as a function of the logarithmic transformations of both latitude and basal area. Random effects are specified by site for both models, which are found to explain a substantial proportion of variance within the estimation datasets and indicate significant heterogeneity across-sites within the region. The root mean square error (RMSE) of the biomass C model is approximated at 24.6 Mg/ha (18.4% of mean biomass C in the dataset), whereas the RMSE of the soil C model is estimated at 4.9 mg C/cm3 (14.1% of mean soil C). The results point to a need for standardization of forest metrics to facilitate meta-analyses, as well as provide important considerations for refining ecosystem C stock models in mangroves.
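
    The mixed-effects structure described above (fixed effects for stand attributes, plus a random intercept per site) can be sketched with statsmodels as follows; the data frame is synthetic and the formula is only an approximation of the published models.

```python
# Linear mixed-effects model: biomass C ~ basal area + basal area x latitude,
# with a random intercept grouped by site.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 120
df = pd.DataFrame({
    "site": rng.integers(0, 12, n).astype(str),        # hypothetical site labels
    "basal_area": rng.uniform(5, 40, n),
    "latitude": rng.uniform(-10, 20, n),
})
df["biomass_c"] = (2.5 * df["basal_area"]
                   + 0.05 * df["basal_area"] * df["latitude"]
                   + rng.normal(0, 8, n))

model = smf.mixedlm("biomass_c ~ basal_area + basal_area:latitude",
                    data=df, groups=df["site"]).fit()
print(model.summary())
```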

  13. The Use of Mixed Effects Models for Obtaining Low-Cost Ecosystem Carbon Stock Estimates in Mangroves of the Asia-Pacific

    PubMed Central

    Bukoski, Jacob J.; Broadhead, Jeremy S.; Donato, Daniel C.; Murdiyarso, Daniel; Gregoire, Timothy G.

    2017-01-01

    Mangroves provide extensive ecosystem services that support local livelihoods and international environmental goals, including coastal protection, biodiversity conservation and the sequestration of carbon (C). While voluntary C market projects seeking to preserve and enhance forest C stocks offer a potential means of generating finance for mangrove conservation, their implementation faces barriers due to the high costs of quantifying C stocks through field inventories. To streamline C quantification in mangrove conservation projects, we develop predictive models for (i) biomass-based C stocks, and (ii) soil-based C stocks for the mangroves of the Asia-Pacific. We compile datasets of mangrove biomass C (197 observations from 48 sites) and soil organic C (99 observations from 27 sites) to parameterize the predictive models, and use linear mixed effect models to model the expected C as a function of stand attributes. The most parsimonious biomass model predicts total biomass C stocks as a function of both basal area and the interaction between latitude and basal area, whereas the most parsimonious soil C model predicts soil C stocks as a function of the logarithmic transformations of both latitude and basal area. Random effects are specified by site for both models, which are found to explain a substantial proportion of variance within the estimation datasets and indicate significant heterogeneity across-sites within the region. The root mean square error (RMSE) of the biomass C model is approximated at 24.6 Mg/ha (18.4% of mean biomass C in the dataset), whereas the RMSE of the soil C model is estimated at 4.9 mg C/cm3 (14.1% of mean soil C). The results point to a need for standardization of forest metrics to facilitate meta-analyses, as well as provide important considerations for refining ecosystem C stock models in mangroves. PMID:28068361

  14. Effect of alternative models for increasing stocking density on the short-term behavior and hygiene of Holstein dairy cows.

    PubMed

    Krawczel, P D; Mooney, C S; Dann, H M; Carter, M P; Butzler, R E; Ballard, C S; Grant, R J

    2012-05-01

    imposing stocking density were bioequivalent for responses in behaviors, DMI, and hygiene. Future stocking density experiments in 4-row barns should simply deny resting and feeding space to simulate overcrowded housing conditions for lactating dairy cows because it is bioequivalent to more complicated, and potentially confounding, research models. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  15. Activity-dependent branching ratios in stocks, solar x-ray flux, and the Bak-Tang-Wiesenfeld sandpile model.

    PubMed

    Martin, Elliot; Shreim, Amer; Paczuski, Maya

    2010-01-01

    We define an activity-dependent branching ratio that allows comparison of different time series X(t). The branching ratio b(x) is defined as b(x)=E[xi(x)/x]. The random variable xi(x) is the value of the next signal given that the previous one is equal to x, so xi(x)=[X(t+1) | X(t)=x]. If b(x)>1, the process is on average supercritical when the signal is equal to x, while if b(x)<1, it is subcritical. For stock prices we find b(x)=1 within statistical uncertainty, for all x, consistent with an "efficient market hypothesis." For stock volumes, solar x-ray flux intensities, and the Bak-Tang-Wiesenfeld (BTW) sandpile model, b(x) is supercritical for small values of activity and subcritical for the largest ones, indicating a tendency to return to a typical value. For stock volumes this tendency has an approximate power-law behavior. For solar x-ray flux and the BTW model, there is a broad regime of activity where b(x) approximately equal 1, which we interpret as an indicator of critical behavior. This is true despite different underlying probability distributions for X(t) and for xi(x). For the BTW model the distribution of xi(x) is Gaussian, for x sufficiently larger than 1, and its variance grows linearly with x. Hence, the activity in the BTW model obeys a central limit theorem when sampling over past histories. The broad region of activity where b(x) is close to one disappears once bulk dissipation is introduced in the BTW model-supporting our hypothesis that it is an indicator of criticality.
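
    The branching ratio defined above can be estimated empirically from a single series by binning the current activity, as in the sketch below; the binning scheme and synthetic series are illustrative choices, not those of the paper.

```python
# Empirical activity-dependent branching ratio b(x) = E[X(t+1)/x | X(t) = x],
# estimated with quantile bins in the current activity x.
import numpy as np

def branching_ratio(x, n_bins=20):
    x = np.asarray(x, dtype=float)
    current, nxt = x[:-1], x[1:]
    bins = np.quantile(current, np.linspace(0, 1, n_bins + 1))
    centers, b = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (current >= lo) & (current < hi)
        if mask.sum() > 10:
            centers.append(current[mask].mean())
            b.append(np.mean(nxt[mask] / current[mask]))
    return np.array(centers), np.array(b)

# Example with a synthetic positive series (replace with stock volumes or flux data):
rng = np.random.default_rng(3)
series = np.exp(np.cumsum(rng.normal(0, 0.05, 20000)))   # geometric random walk
xc, bx = branching_ratio(series)
print(np.round(bx, 3))    # values near 1 indicate (near-)critical behaviour
```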

  16. Semi-Automatic Modelling of Building FAÇADES with Shape Grammars Using Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  17. Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations

    ERIC Educational Resources Information Center

    Sung, Christopher Teh Boon

    2011-01-01

    Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…

  18. Facilities Management of Existing School Buildings: Two Models.

    ERIC Educational Resources Information Center

    Building Technology, Inc., Silver Spring, MD.

    While all school districts are responsible for the management of their existing buildings, they often approach the task in different ways. This document presents two models that offer ways a school district administration, regardless of size, may introduce activities into its ongoing management process that will lead to improvements in earthquake…

  19. Reframing Leadership Pedagogy through Model and Theory Building.

    ERIC Educational Resources Information Center

    Mello, Jeffrey A.

    1999-01-01

    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  20. Getting Started and Working with Building Information Modeling

    ERIC Educational Resources Information Center

    Smith, Dana K.

    2009-01-01

    This article will assume that one has heard of Building Information Modeling or BIM but has not developed a strategy as to how to get the most out of it. The National BIM Standard (NBIMS) has defined BIM as a digital representation of physical and functional characteristics of a facility. As such, it serves as a shared knowledge resource for…

  1. Overcoming Microsoft Excel's Weaknesses for Crop Model Building and Simulations

    ERIC Educational Resources Information Center

    Sung, Christopher Teh Boon

    2011-01-01

    Using spreadsheets such as Microsoft Excel for building crop models and running simulations can be beneficial. Excel is easy to use, powerful, and versatile, and it requires the least proficiency in computer programming compared to other programming platforms. Excel, however, has several weaknesses: it does not directly support loops for iterative…

  2. Building and Sustaining Digital Collections: Models for Libraries and Museums.

    ERIC Educational Resources Information Center

    Council on Library and Information Resources, Washington, DC.

    In February 2001, the Council on Library and Information Resources (CLIR) and the National Initiative for a Networked Cultural Heritage (NINCH) convened a meeting to discuss how museums and libraries are building digital collections and what business models are available to sustain them. A group of museum and library senior executives met with…

  3. A Synergetic Model for Building an Intelligent Documentation System (IDS).

    ERIC Educational Resources Information Center

    Emdad, Ali

    1990-01-01

    Presents a conceptual framework for building an intelligent documentation system (IDS) for computer software by integrating hypermedia and expert systems technologies. The need for online computer documentation for end-users is discussed, and elements of the synergetic model are described, including knowledge representation, the hypermedia…

  4. A Relationship-Building Model for the Web Retail Marketplace.

    ERIC Educational Resources Information Center

    Wang, Fang; Head, Milena; Archer, Norm

    2000-01-01

    Discusses the effects of the Web on marketing practices. Introduces the concept and theory of relationship marketing. The relationship network concept, which typically is only applied to the business-to-business market, is discussed within the business-to-consumer market, and a new relationship-building model for the Web marketplace is proposed.…

  5. Building a Model PE Curriculum: Education Reform in Action

    ERIC Educational Resources Information Center

    Moore, John

    2012-01-01

    The blueprint to build a model physical education (PE) curriculum begins by establishing a sound curricular foundation based on a lesson plan template that incorporates clear and concise program goals, the alignment of lessons to state or national content standards, and the collection, analysis and use of objective assessment data that informs…

  6. Building information modeling (BIM) approach to the GMT Project

    NASA Astrophysics Data System (ADS)

    Teran, Jose; Sheehan, Michael; Neff, Daniel H.; Adriaanse, David; Grigel, Eric; Farahani, Arash

    2014-07-01

    The Giant Magellan Telescope (GMT), one of several next-generation Extremely Large Telescopes (ELTs), is a 25.4 meter diameter altitude-over-azimuth design set to be built at the summit of Cerro Las Campanas at the Las Campanas Observatory in Chile. The paper describes the use of Building Information Modeling (BIM) for the GMT project.

  7. The Creation of Space Vector Models of Buildings From RPAS Photogrammetry Data

    NASA Astrophysics Data System (ADS)

    Trhan, Ondrej

    2017-06-01

    The results of Remotely Piloted Aircraft System (RPAS) photogrammetry are digital surface models and orthophotos. The main problem with the digital surface models obtained is that building walls are not perpendicular and roof shapes are deformed. The task of this paper is to obtain a more accurate digital surface model using building reconstruction. The paper discusses the problem of obtaining and approximating building footprints, reconstructing the final spatial vector digital building model, and modifying the buildings on the digital surface model.

  8. Activity-dependent branching ratios in stocks, solar x-ray flux, and the Bak-Tang-Wiesenfeld sandpile model

    NASA Astrophysics Data System (ADS)

    Martin, Elliot; Shreim, Amer; Paczuski, Maya

    2010-01-01

    We define an activity-dependent branching ratio that allows comparison of different time series X_t. The branching ratio b_x is defined as b_x = E[ξ_x / x]. The random variable ξ_x is the value of the next signal given that the previous one is equal to x, so ξ_x = {X_{t+1} | X_t = x}. If b_x > 1, the process is on average supercritical when the signal is equal to x, while if b_x < 1, it is subcritical. For stock prices we find b_x = 1 within statistical uncertainty, for all x, consistent with an "efficient market hypothesis." For stock volumes, solar x-ray flux intensities, and the Bak-Tang-Wiesenfeld (BTW) sandpile model, b_x is supercritical for small values of activity and subcritical for the largest ones, indicating a tendency to return to a typical value. For stock volumes this tendency has an approximate power-law behavior. For solar x-ray flux and the BTW model, there is a broad regime of activity where b_x ≃ 1, which we interpret as an indicator of critical behavior. This is true despite different underlying probability distributions for X_t and for ξ_x. For the BTW model the distribution of ξ_x is Gaussian, for x sufficiently larger than 1, and its variance grows linearly with x. Hence, the activity in the BTW model obeys a central limit theorem when sampling over past histories. The broad region of activity where b_x is close to one disappears once bulk dissipation is introduced in the BTW model, supporting our hypothesis that it is an indicator of criticality.
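
    As a rough illustration of the quantity defined above, the sketch below (Python, assuming NumPy is available) estimates an activity-dependent branching ratio from an arbitrary positive time series by binning the current activity and averaging X_{t+1}/X_t within each bin; the quantile binning and the toy series are illustrative choices, not the authors' procedure.

```python
import numpy as np

def branching_ratio(series, bins=20):
    """Estimate an activity-dependent branching ratio b_x = E[X_{t+1}/X_t | X_t ~ x].

    Current activity is grouped into quantile bins (an illustrative choice);
    within each bin the mean next-to-current ratio is returned."""
    x = np.asarray(series, dtype=float)
    current, nxt = x[:-1], x[1:]
    mask = current > 0                      # avoid division by zero
    ratios = nxt[mask] / current[mask]
    edges = np.quantile(current[mask], np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(current[mask], edges) - 1, 0, bins - 1)
    b = np.array([ratios[idx == k].mean() if np.any(idx == k) else np.nan
                  for k in range(bins)])
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, b   # b > 1: supercritical at that activity; b < 1: subcritical

# toy usage: a mean-reverting positive series should give b_x < 1 at large x
rng = np.random.default_rng(0)
xc, bx = branching_ratio(np.abs(rng.normal(10, 3, 5000)))
print(np.round(bx, 2))
```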

  9. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive computational tools, supporting their design and optimization of energy consumption. This paper demonstrates the possibility of consideration of such seemingly contradictory requirements, using the simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
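
    The electric-circuit analogy mentioned above can be illustrated with a minimal lumped-parameter sketch: one thermal capacitance for the interior coupled to the outdoor temperature through a single resistance, integrated with an explicit Euler step. All parameter values below are placeholders and the model is far simpler than the one developed in the paper.

```python
import numpy as np

# One thermal capacitance C (interior air plus mass) exchanging heat with the
# outdoor temperature through a single envelope resistance R, driven by a
# constant heating power Q; integrated with an explicit Euler step.
C = 5.0e7      # J/K, lumped interior heat capacity (placeholder)
R = 2.0e-3     # K/W, envelope thermal resistance (placeholder)
Q = 1500.0     # W, constant heating power
dt = 60.0      # s
hours = 48

T_in = 20.0
T_out = lambda t: 5.0 + 5.0 * np.sin(2.0 * np.pi * t / 86400.0)  # daily cycle

for step in range(int(hours * 3600 / dt)):
    t = step * dt
    # C dT_in/dt = (T_out - T_in)/R + Q
    T_in += dt / C * ((T_out(t) - T_in) / R + Q)

print(f"indoor temperature after {hours} h: {T_in:.2f} degC")
```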

  10. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    SciTech Connect

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  11. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  12. Introducing a decomposition rate modifier in the Rothamsted Carbon Model to predict soil organic carbon stocks in saline soils.

    PubMed

    Setia, Raj; Smith, Pete; Marschner, Petra; Baldock, Jeff; Chittleborough, David; Smith, Jo

    2011-08-01

    Soil organic carbon (SOC) models such as the Rothamsted Carbon Model (RothC) have been used to estimate SOC dynamics in soils over different time scales but, until recently, their ability to accurately predict SOC stocks/carbon dioxide (CO2) emissions from salt-affected soils has not been assessed. Given the large extent of salt-affected soils (19% of the 20.8 billion ha of arable land on Earth), this may lead to misestimation of CO2 release. Using soils from two salt-affected regions (one in Punjab, India and one in South Australia), an incubation study was carried out measuring CO2 release over 120 days. The soils varied both in salinity (measured as electrical conductivity (EC) and calculated as osmotic potential using EC and water content) and sodicity (measured as sodium adsorption ratio, SAR). For soils from both regions, the osmotic potential had a significant positive relationship with CO2-C release, but no significant relationship was found between SAR and CO2-C release. The monthly cumulative CO2-C was simulated using RothC. RothC was modified to take into account reductions in plant inputs due to salinity. A subset of non-salt-affected soils was used to derive an equation for a "lab-effect" modifier to account for changes in decomposition under lab conditions, and this modifier was significantly related with pH. Using a subset of salt-affected soils, a decomposition rate modifier (as a function of osmotic potential) was developed to match measured and modelled CO2-C release after correcting for the lab effect. Using this decomposition rate modifier, we found an agreement (R2 = 0.92) between modelled and independently measured data for a set of soils from the incubation experiment. RothC, modified by including reduced plant inputs due to salinity and the salinity decomposition rate modifier, was used to predict SOC stocks of soils in a field in South Australia. The predictions clearly showed that SOC stocks are reduced in saline soils
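
    A minimal sketch of how a multiplicative decomposition-rate modifier enters a first-order carbon pool, in the spirit of RothC's modifier chain, is shown below; the salinity function is hypothetical, since the relationship actually fitted in the paper is not reproduced here.

```python
import numpy as np

def decompose(c0, k, years, rate_modifier=1.0):
    """First-order pool decomposition with a multiplicative rate modifier,
    in the spirit of RothC's modifier chain (illustrative only)."""
    t = np.arange(years + 1)
    return c0 * np.exp(-k * rate_modifier * t)

def salinity_modifier(osmotic_potential_mpa):
    # Hypothetical: decomposition slows as osmotic potential becomes more
    # negative; the function actually fitted in the paper is not reproduced.
    return float(np.clip(1.0 + 0.3 * osmotic_potential_mpa, 0.1, 1.0))

c_nonsaline = decompose(30.0, 0.05, 50)                         # t C/ha
c_saline = decompose(30.0, 0.05, 50, salinity_modifier(-1.5))   # saline soil
print(c_nonsaline[-1], c_saline[-1])
```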

  13. 7 CFR Exhibit E to Subpart A of... - Voluntary National Model Building Codes

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    7 CFR, Agriculture, Exhibit E to Subpart A of Part 1924: Voluntary National Model Building Codes. The following documents address the health and safety aspects of buildings and related structures and are voluntary national model building codes as defined in § 1924.4(h)(2)...

  14. Links Related to the Indoor Air Quality Building Education and Assessment Model

    EPA Pesticide Factsheets

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  15. Bibliography for the Indoor Air Quality Building Education and Assessment Model

    EPA Pesticide Factsheets

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  16. Building America Top Innovations 2012: Model Simulating Real Domestic Hot Water Use

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes Building America research that is improving domestic hot water modeling capabilities to more effectively address one of the largest energy uses in residential buildings.

  17. Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2009-02-01

    AFOSR Grant No. FA9550-06-1-0096, February 1, 2006 to November 30, 2008. Authors: David E. Goldberg, Kumara Sastry, Martin Pelikan. …simulations. We (Todd Martinez (2005 MacArthur fellow), Duane Johnson, Kumara Sastry and David E. Goldberg) have applied multiobjective GAs and model…

  18. Resolving model parameter values from carbon and nitrogen stock measurements in a wide range of tropical mature forests using nonlinear inversion and regression trees

    Treesearch

    Shuguang Liu; Pamela Anderson; Guoyi Zhou; Boone Kauffman; Flint Hughes; David Schimel; Vicente Watson; Joseph Tosi

    2008-01-01

    Objectively assessing the performance of a model and deriving model parameter values from observations are critical and challenging in landscape to regional modeling. In this paper, we applied a nonlinear inversion technique to calibrate the ecosystem model CENTURY against carbon (C) and nitrogen (N) stock measurements collected from 39 mature tropical forest sites in...

  19. First Prismatic Building Model Reconstruction from TomoSAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Shahzad, M.; Zhu, X.

    2016-06-01

    This paper demonstrates for the first time the potential of explicitly modelling individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings via DSM generation and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed by Dabov et al. (2007) and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height- and polygon-complexity-constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. A coarse outline of each roof segment is then reconstructed and later refined using a quadtree-based regularization plus zig-zag line simplification scheme. Finally, height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.

  20. A consistent model for tsunami actions on buildings

    NASA Astrophysics Data System (ADS)

    Foster, A.; Rossetto, T.; Eames, I.; Chandler, I.; Allsop, W.

    2016-12-01

    The Japan (2011) and Indian Ocean (2004) tsunamis resulted in significant loss of life, buildings, and critical infrastructure. The tsunami forces imposed upon structures in coastal regions are initially due to wave slamming, after which the quasi-steady flow of the sea water around buildings becomes important. An essential requirement in both design and loss assessment is a consistent model that can accurately predict these forces. A model suitable for predicting forces in the quasi-steady range has been established as part of a systematic programme of research by the UCL EPICentre to understand the fundamental physical processes of tsunami actions on buildings, and more generally their social and economic consequences. Using the pioneering tsunami generator at HR Wallingford, this study considers the influence of unsteady flow conditions on the forces acting upon a rectangular building occupying 10-80% of a channel for 20-240 second wave periods. A mathematical model based upon basic open-channel flow principles is proposed, which provides empirical estimates for drag and hydrostatic coefficients. A simple force prediction equation, requiring only basic flow velocity and wave height inputs, is then developed, providing good agreement with the experimental results. The results of this study demonstrate that the unsteady forces from the very long waves encountered during tsunami events can be predicted with a level of accuracy and simplicity suitable for design and risk assessment.
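
    The drag-plus-hydrostatic decomposition described above can be sketched generically as follows; the coefficient values are placeholders rather than the empirical values fitted in the study.

```python
RHO = 1025.0   # kg/m^3, seawater density
G = 9.81       # m/s^2

def tsunami_force(velocity, depth, width, c_d=2.0, c_h=1.0):
    """Quasi-steady force on a rectangular building face as a drag term plus a
    hydrostatic term; c_d and c_h are placeholders, not the fitted values."""
    drag = 0.5 * c_d * RHO * depth * width * velocity ** 2
    hydrostatic = 0.5 * c_h * RHO * G * width * depth ** 2
    return drag + hydrostatic

# example: 2 m deep flow at 5 m/s against a 10 m wide face
print(f"{tsunami_force(5.0, 2.0, 10.0) / 1e3:.1f} kN")
```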

  1. Development of a stock-recruitment model and assessment of biological reference points for the Lake Erie walleye fishery

    USGS Publications Warehouse

    Zhao, Yingming; Kocovsky, Patrick M.; Madenjian, Charles P.

    2013-01-01

    We developed an updated stock–recruitment relationship for Lake Erie Walleye Sander vitreus using the Akaike information criterion model selection approach. Our best stock–recruitment relationship was a Ricker spawner–recruit function to which spring warming rate was added as an environmental variable, and this regression model explained 39% of the variability in Walleye recruitment over the 1978 through 2006 year-classes. Thus, most of the variability in Lake Erie Walleye recruitment appeared to be attributable to factors other than spawning stock size and spring warming rate. The abundance of age-0 Gizzard Shad Dorosoma cepedianum, which was an important term in previous models, may still be an important factor for Walleye recruitment, but poorer ability to monitor Gizzard Shad since the late 1990s could have led to that term failing to appear in our best model. Secondly, we used numerical simulation to demonstrate how to use the stock recruitment relationship to characterize the population dynamics (such as stable age structure, carrying capacity, and maximum sustainable yield) and some biological reference points (such as fishing rates at different important biomass or harvest levels) for an age-structured population in a deterministic way.
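
    A Ricker spawner-recruit function with an environmental covariate, R = a*S*exp(-b*S + c*W), can be fitted by linear regression on ln(R/S), as sketched below with synthetic data (not the Lake Erie series).

```python
import numpy as np

# Ricker spawner-recruit model with an environmental covariate,
# R = a * S * exp(-b*S + c*W), fitted by ordinary least squares on
# ln(R/S) = ln(a) - b*S + c*W. Data below are synthetic, not Lake Erie data.
rng = np.random.default_rng(1)
S = rng.uniform(5, 50, 29)                 # spawning stock index
W = rng.normal(0, 1, 29)                   # spring warming rate (standardized)
R = 4.0 * S * np.exp(-0.04 * S + 0.5 * W + rng.normal(0, 0.3, 29))

y = np.log(R / S)
X = np.column_stack([np.ones_like(S), S, W])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a_hat, b_hat, c_hat = np.exp(coef[0]), -coef[1], coef[2]
print(a_hat, b_hat, c_hat)
```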

  2. A method for building 3D models of barchan dunes

    NASA Astrophysics Data System (ADS)

    Nai, Yang; Li-lan, Su; Lin, Wan; Jie, Yang; Shi-yi, Chen; Wei-lu, Hu

    2016-01-01

    The distributions of barchan dunes are usually represented by digital terrain models (DTMs) overlaid with digital orthophoto maps. Given that most regions with barchan dunes have low relief, a 3D map obtained from a DTM may ineffectively show the stereoscopic shape of each dune. The method of building 3D models of barchan dunes using existing modeling software seldom considers the geographical environment. As a result, barchan dune models are often inconsistent with actual DTMs and incompletely express the morphological characteristics of dunes. Manual construction of barchan dune models is also costly and time consuming. Considering these problems, the morphological characteristics of barchan dunes and the mathematical relationships between the morphological parameters of the dunes, such as length, height, and width, are analyzed in this study. The methods of extracting the morphological feature points of barchan dunes, calculating their morphological parameters, and building dune outlines and skeleton lines based on the medial axes are also presented. The dune outlines, skeleton lines, and part of the medial axes of dunes are used to construct a constrained triangulated irregular network. C# and ArcEngine are employed to build 3D models of barchan dunes automatically. Experimental results of a study conducted in the Tengger Desert show that the method can be used to approximate the morphological characteristics of barchan dunes and is less time consuming than manual methods.

  3. Modelling soil carbon flows and stocks following a carbon balance approach at regional scale for the EU-27

    NASA Astrophysics Data System (ADS)

    Lesschen, Jan Peter; Sikirica, Natasa; Bonten, Luc; Dibari, Camilla; Sanchez, Berta; Kuikman, Peter

    2014-05-01

    Soil Organic Carbon (SOC) is a key parameter to many soil functions and services. SOC is essential to support water retention and nutrient buffering and mineralization in the soil as well as to enhance soil biodiversity. Consequently, loss of SOC or low SOC levels might threaten soil productivity or even lead to a collapse of a farming system. Identification of areas in Europe with critically low SOC levels or with a negative carbon balance is a challenge in order to apply the appropriate strategies to restore these areas or prevent further SOC losses. The objective of this study is to assess current soil carbon flows and stocks at a regional scale; we follow a carbon balance approach which we developed within the MITERRA-Europe model. MITERRA-Europe is an environmental impact assessment model and calculates nitrogen and greenhouse gas emissions on a deterministic and annual basis using emission and leaching factors at regional level (NUTS2, comparable to province level) in the EU27. The model already contained a soil carbon module based on the IPCC stock change approach. Within the EU FP7 SmartSoil project we developed a SOC balance approach, for which we quantified the input of carbon (manure, crop residues, other organic inputs) and the losses of carbon (decomposition, leaching and erosion). The calculation rules from the Roth-C model were used to estimate SOC decomposition. For the actual soil carbon stocks we used the data from the LUCAS soil sample survey. LUCAS collected soil samples in 2009 at about 22000 locations across the EU, which were analysed for a range of soil properties. Land management practices are accounted for, based on data from the EU-wide Survey on Agricultural Production Methods in the 2010 Farm Structure Survey. The survey comprises data on the application of soil tillage, soil cover, crop rotation and irrigation. Based on the simulated soil carbon balance and the actual carbon stocks from LUCAS we now can identify regions within the EU that

  4. Applied Concepts in PBPK Modeling: How to Build a PBPK/PD Model

    PubMed Central

    Kuepfer, L; Niederalt, C; Wendl, T; Schlender, J‐F; Willmann, S; Lippert, J; Block, M; Eissing, T

    2016-01-01

    The aim of this tutorial is to introduce the fundamental concepts of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) modeling with a special focus on their practical implementation in a typical PBPK model building workflow. To illustrate basic steps in PBPK model building, a PBPK model for ciprofloxacin will be constructed and coupled to a pharmacodynamic model to simulate the antibacterial activity of ciprofloxacin treatment. PMID:27653238
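
    As a much-reduced illustration of the PK/PD coupling step in such a workflow, the sketch below links a one-compartment pharmacokinetic model to an instantaneous Emax effect using SciPy; it is not a PBPK model and all parameter values are illustrative, not taken from the tutorial.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-compartment PK model coupled to an instantaneous Emax effect; parameter
# values are illustrative and the structure is far simpler than a PBPK model.
V, CL = 30.0, 5.0            # volume (L) and clearance (L/h)
dose = 400.0                 # mg, IV bolus
emax, ec50 = 1.0, 1.5        # PD parameters (effect units, mg/L)

def pk(t, y):
    amount = y[0]
    return [-CL / V * amount]          # first-order elimination

sol = solve_ivp(pk, (0.0, 24.0), [dose], dense_output=True, max_step=0.1)
t = np.linspace(0.0, 24.0, 200)
conc = sol.sol(t)[0] / V               # plasma concentration, mg/L
effect = emax * conc / (ec50 + conc)   # Emax pharmacodynamic response
print(f"Cmax = {conc.max():.2f} mg/L, peak effect = {effect.max():.2f}")
```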

  5. Infiltration modeling guidelines for commercial building energy analysis

    SciTech Connect

    Gowri, Krishnan; Winiarski, David W.; Jarnagin, Ronald E.

    2009-09-30

    This report presents a methodology for modeling air infiltration in EnergyPlus to account for envelope air barrier characteristics. Based on a review of various infiltration modeling options available in EnergyPlus and sensitivity analysis, the linear wind velocity coefficient based on DOE-2 infiltration model is recommended. The methodology described in this report can be used to calculate the EnergyPlus infiltration input for any given building level infiltration rate specified at known pressure difference. The sensitivity analysis shows that EnergyPlus calculates the wind speed based on zone altitude, and the linear wind velocity coefficient represents the variation in infiltration heat loss consistent with building location and weather data.
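
    The design-flow-rate form of infiltration used by EnergyPlus multiplies a design flow by a schedule and a weather-dependent polynomial; the sketch below assumes the commonly cited DOE-2-style coefficients (0, 0, 0.224, 0), which should be checked against the report before use.

```python
def infiltration_flow(i_design, t_zone, t_odb, wind_speed,
                      a=0.0, b=0.0, c=0.224, d=0.0, schedule=1.0):
    """EnergyPlus-style design-flow-rate infiltration:
    I = I_design * F_sched * (A + B*|Tzone - Todb| + C*Ws + D*Ws**2).
    The (0, 0, 0.224, 0) defaults are the commonly cited DOE-2-style
    coefficients and should be checked against the report before use."""
    return i_design * schedule * (a + b * abs(t_zone - t_odb)
                                  + c * wind_speed + d * wind_speed ** 2)

# example: 0.05 m3/s design infiltration, 21 C zone, 5 C outdoors, 4 m/s wind
print(infiltration_flow(0.05, 21.0, 5.0, 4.0))   # -> 0.0448 m3/s
```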

  6. Active buildings: modelling physical activity and movement in office buildings. An observational study protocol.

    PubMed

    Smith, Lee; Ucci, Marcella; Marmot, Alexi; Spinney, Richard; Laskowski, Marek; Sawyer, Alexia; Konstantatou, Marina; Hamer, Mark; Ambler, Gareth; Wardle, Jane; Fisher, Abigail

    2013-11-12

    Health benefits of regular participation in physical activity are well documented but population levels are low. Office layout, and in particular the number and location of office building destinations (eg, print and meeting rooms), may influence both walking time and characteristics of sitting time. No research to date has focused on the role that the layout of the indoor office environment plays in facilitating or inhibiting step counts and characteristics of sitting time. The primary aim of this study was to investigate associations between office layout and physical activity, as well as sitting time using objective measures. Active buildings is a unique collaboration between public health, built environment and computer science researchers. The study involves objective monitoring complemented by a larger questionnaire arm. UK office buildings will be selected based on a variety of features, including office floor area and number of occupants. Questionnaires will include items on standard demographics, well-being, physical activity behaviour and putative socioecological correlates of workplace physical activity. Based on survey responses, approximately 30 participants will be recruited from each building into the objective monitoring arm. Participants will wear accelerometers (to monitor physical activity and sitting inside and outside the office) and a novel tracking device will be placed in the office (to record participant location) for five consecutive days. Data will be analysed using regression analyses, as well as novel agent-based modelling techniques. The results of this study will be disseminated through peer-reviewed publications and scientific presentations. Ethical approval was obtained through the University College London Research Ethics Committee (Reference number 4400/001).

  7. Active buildings: modelling physical activity and movement in office buildings. An observational study protocol

    PubMed Central

    Smith, Lee; Ucci, Marcella; Marmot, Alexi; Spinney, Richard; Laskowski, Marek; Sawyer, Alexia; Konstantatou, Marina; Hamer, Mark; Ambler, Gareth; Wardle, Jane; Fisher, Abigail

    2013-01-01

    Introduction Health benefits of regular participation in physical activity are well documented but population levels are low. Office layout, and in particular the number and location of office building destinations (eg, print and meeting rooms), may influence both walking time and characteristics of sitting time. No research to date has focused on the role that the layout of the indoor office environment plays in facilitating or inhibiting step counts and characteristics of sitting time. The primary aim of this study was to investigate associations between office layout and physical activity, as well as sitting time using objective measures. Methods and analysis Active buildings is a unique collaboration between public health, built environment and computer science researchers. The study involves objective monitoring complemented by a larger questionnaire arm. UK office buildings will be selected based on a variety of features, including office floor area and number of occupants. Questionnaires will include items on standard demographics, well-being, physical activity behaviour and putative socioecological correlates of workplace physical activity. Based on survey responses, approximately 30 participants will be recruited from each building into the objective monitoring arm. Participants will wear accelerometers (to monitor physical activity and sitting inside and outside the office) and a novel tracking device will be placed in the office (to record participant location) for five consecutive days. Data will be analysed using regression analyses, as well as novel agent-based modelling techniques. Ethics and dissemination The results of this study will be disseminated through peer-reviewed publications and scientific presentations. Ethical approval was obtained through the University College London Research Ethics Committee (Reference number 4400/001). PMID:24227873

  8. Air Dispersion Modeling for Building 3026C/D Demolition

    SciTech Connect

    Ward, Richard C; Sjoreen, Andrea L; Eckerman, Keith F

    2010-06-01

    This report presents estimates of dispersion coefficients and effective dose for potential air dispersion scenarios of uncontrolled releases from Oak Ridge National Laboratory (ORNL) buildings 3026C, 3026D, and 3140 prior to or during the demolition of the 3026 Complex. The Environmental Protection Agency (EPA) AERMOD system was used to compute these estimates. AERMOD stands for AERMIC Model, where AERMIC is the American Meteorological Society-EPA Regulatory Model Improvement Committee. Five source locations (three in building 3026D and one each in building 3026C and the filter house 3140) and associated source characteristics were determined with the customer. In addition, the area of study was determined and building footprints and intake locations of air-handling systems were obtained. Besides the air intakes, receptor sites were included consisting of ground-level locations on four polar grids (50 m, 100 m, 200 m, and 500 m) and two intersecting lines of points (50 m separation) corresponding to sidewalks along Central Avenue and Fifth Street. Three years of meteorological data (2006-2008) were used, each consisting of three datasets: 1) National Weather Service data; 2) upper air data for the Knoxville-Oak Ridge area; and 3) local weather data from Tower C (10 m, 30 m and 100 m) on the ORNL reservation. Annual average air concentrations, highest 1 h average and highest 3 h average air concentrations were computed using AERMOD for the five source locations for the three years of meteorological data. The highest 1 h average air concentrations were converted to dispersion coefficients to characterize the atmospheric dispersion, as the customer was interested in the most significant response and the highest 1 h average data reflect the best time-averaged values available from the AERMOD code. Results are presented in tabular and graphical form. The results for dose were obtained using radionuclide activities for each of the buildings provided by the customer.

  9. Building 235-F Goldsim Fate And Transport Model

    SciTech Connect

    Taylor, G. A.; Phifer, M. A.

    2012-09-14

    Savannah River National Laboratory (SRNL) personnel, at the request of Area Completion Projects (ACP), evaluated In-Situ Disposal (ISD) alternatives that are under consideration for deactivation and decommissioning (D&D) of Building 235-F and the Building 294-2F Sand Filter. SRNL personnel developed and used a GoldSim fate and transport model, which is consistent with Musall 2012, to evaluate, relative to groundwater protection, ISD alternatives that involve either source removal and/or the grouting of portions or all of 235-F. This evaluation was conducted through the development and use of a Building 235-F GoldSim fate and transport model. The model simulates contaminant release from four 235-F process areas and the 294-2F Sand Filter. In addition, it simulates the fate and transport through the vadose zone, the Upper Three Runs (UTR) aquifer, and the Upper Three Runs (UTR) creek. The model is designed as a stochastic model, and as such it can provide both deterministic and stochastic (probabilistic) results. The results show that the median radium activity concentrations exceed the 5 pCi/L radium MCL at the edge of the building for all ISD alternatives after 10,000 years, except those with a sufficient amount of inventory removed. A very interesting result was that grouting was shown to have minimal effect on the radium activity concentration. During the first 1,000 years grouting may have some small positive benefit relative to radium; however, after that it may have a slightly deleterious effect. The Pb-210 results, relative to its 0.06 pCi/L PRG, are essentially identical to the radium results, but the Pb-210 results exhibit a lesser degree of exceedance. In summary, some level of inventory removal will be required to ensure that groundwater standards are met.

  10. Assessing Predicted Contacts for Building Protein Three-Dimensional Models.

    PubMed

    Adhikari, Badri; Bhattacharya, Debswapna; Cao, Renzhi; Cheng, Jianlin

    2017-01-01

    Recent successes of contact-guided protein structure prediction methods have revived interest in solving the long-standing problem of ab initio protein structure prediction. With homology modeling failing for many protein sequences that do not have templates, contact-guided structure prediction has shown promise, and consequently, contact prediction has gained a lot of interest recently. Although a few dozen contact prediction tools are already currently available as web servers and downloadables, not enough research has been done towards using existing measures like precision and recall to evaluate these contacts with the goal of building three-dimensional models. Moreover, when we do not have a native structure for a set of predicted contacts, the only analysis we can perform is a simple contact map visualization of the predicted contacts. A wider and more rigorous assessment of the predicted contacts is needed, in order to build tertiary structure models. This chapter discusses instructions and protocols for using tools and applying techniques in order to assess predicted contacts for building three-dimensional models.
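
    A typical precision calculation for predicted contacts, such as the precision of the top-L/2 long-range predictions, can be sketched as follows; the separation threshold and cutoff are common conventions in contact assessment, not a prescription from this chapter.

```python
def contact_precision(predicted, native, L, top=None, min_sep=6):
    """Precision of top-ranked predicted contacts against native contacts.

    predicted: list of (i, j, score); native: set of (i, j) pairs with i < j.
    The top-L/2 cutoff and the sequence-separation filter follow common
    practice in contact assessment, not a prescription from this chapter."""
    n = L // 2 if top is None else int(top)
    ranked = sorted(predicted, key=lambda c: -c[2])
    kept = [(min(i, j), max(i, j)) for i, j, _ in ranked if abs(i - j) >= min_sep][:n]
    hits = sum(1 for pair in kept if pair in native)
    return hits / max(len(kept), 1)

native = {(1, 20), (3, 30), (5, 40)}
preds = [(1, 20, 0.9), (2, 25, 0.8), (3, 30, 0.7), (4, 12, 0.6)]
print(contact_precision(preds, native, L=50))   # 0.5
```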

  11. Building Detection Using Aerial Images and Digital Surface Models

    NASA Astrophysics Data System (ADS)

    Mu, J.; Cui, S.; Reinartz, P.

    2017-05-01

    In this paper a method for building detection in aerial images based on variational inference of logistic regression is proposed. It consists of three steps. In order to characterize the appearance of buildings in aerial images, an effective bag-of-words (BoW) method is applied for feature extraction in the first step. In the second step, a classifier of logistic regression is learned using these local features. The logistic regression can be trained using different methods. In this paper we adopt a fully Bayesian treatment for learning the classifier, which has a number of obvious advantages over other learning methods. Due to the presence of a hyperprior in the probabilistic model of logistic regression, approximate inference methods have to be applied for prediction. In order to speed up the inference, a variational inference method based on mean field, instead of stochastic approximation such as Markov chain Monte Carlo, is applied. After the prediction, a probabilistic map is obtained. In the third step, a fully connected conditional random field model is formulated and the probabilistic map is used as the data term in the model. A mean field inference is utilized in order to obtain a binary building mask. A benchmark data set consisting of aerial images and digital surface models (DSM) released by ISPRS for 2D semantic labeling is used for performance evaluation. The results demonstrate the effectiveness of the proposed method.
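
    The classification step can be sketched with an ordinary (non-Bayesian) logistic regression standing in for the variational-Bayes classifier of the paper; the BoW features below are random placeholders and the CRF refinement step is omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Classification step only: BoW feature vectors per image patch labelled
# building / non-building. A plain (non-Bayesian) logistic regression stands
# in for the variational-Bayes classifier of the paper, and the CRF
# refinement is omitted. Features and labels below are synthetic placeholders.
rng = np.random.default_rng(0)
n_patches, n_words = 500, 64
X = rng.random((n_patches, n_words))             # BoW histograms (placeholder)
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(int)  # synthetic labels

clf = LogisticRegression(max_iter=1000).fit(X, y)
prob_map = clf.predict_proba(X)[:, 1]            # per-patch building probability
print(prob_map[:5])
```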

  12. Toward a General Research Process for Using Dubin's Theory Building Model

    ERIC Educational Resources Information Center

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  14. The Use of Mixed Effects Models for Obtaining Low-Cost Ecosystem Carbon Stock Estimates in Mangroves of the Asia-Pacific

    NASA Astrophysics Data System (ADS)

    Bukoski, J. J.; Broadhead, J. S.; Donato, D.; Murdiyarso, D.; Gregoire, T. G.

    2016-12-01

    Mangroves provide extensive ecosystem services that support both local livelihoods and international environmental goals, including coastal protection, water filtration, biodiversity conservation and the sequestration of carbon (C). While voluntary C market projects that seek to preserve and enhance forest C stocks offer a potential means of generating finance for mangrove conservation, their implementation faces barriers due to the high costs of quantifying C stocks through measurement, reporting and verification (MRV) activities. To streamline MRV activities in mangrove C forestry projects, we develop predictive models for (i) biomass-based C stocks, and (ii) soil-based C stocks for the mangroves of the Asia-Pacific. We use linear mixed effect models to account for spatial correlation in modeling the expected C as a function of stand attributes. The most parsimonious biomass model predicts total biomass C stocks as a function of both basal area and the interaction between latitude and basal area, whereas the most parsimonious soil C model predicts soil C stocks as a function of the logarithmic transformations of both latitude and basal area. Random effects are specified by site for both models, and are found to explain a substantial proportion of variance within the estimation datasets. The root mean square error (RMSE) of the biomass C model is approximated at 24.6 Mg/ha (18.4% of mean biomass C in the dataset), whereas the RMSE of the soil C model is estimated at 4.9 mg C/cm3 (14.1% of mean soil C). A substantial proportion of the variation in soil C, however, is explained by the random effects and thus the use of the soil C model may be most valuable for sites in which field measurements of soil C exist.
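
    A site-level random-intercept model of the kind described can be sketched with statsmodels; the synthetic data and the exact fixed-effect terms below are illustrative assumptions rather than the fitted models from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Site-level random-intercept model: biomass C as a function of basal area and
# a latitude-by-basal-area interaction. Data are synthetic placeholders.
rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "site": rng.integers(0, 20, n),
    "basal_area": rng.uniform(10, 50, n),
    "latitude": rng.uniform(-10, 25, n),
})
site_effect = rng.normal(0, 15, 20)[df["site"]]
df["biomass_c"] = (40 + 2.5 * df["basal_area"]
                   - 0.03 * df["latitude"] * df["basal_area"]
                   + site_effect + rng.normal(0, 10, n))

model = smf.mixedlm("biomass_c ~ basal_area + latitude:basal_area",
                    df, groups=df["site"]).fit()
print(model.summary())
```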

  15. Understanding Building Infrastructure and Building Operation through DOE Asset Score Model: Lessons Learned from a Pilot Project

    SciTech Connect

    Wang, Na; Goel, Supriya; Gorrissen, Willy J.; Makhmalbaf, Atefe

    2013-06-24

    The U.S. Department of Energy (DOE) is developing a national voluntary energy asset score system to help building owners evaluate the as-built physical characteristics (including the building envelope and the mechanical and electrical systems) and overall building energy efficiency, independent of occupancy and operational choices. The energy asset score breaks down building energy use information by simulating building performance under typical operating and occupancy conditions for a given use type. A web-based modeling tool, the energy asset score tool, facilitates the implementation of the asset score system. The tool consists of a simplified user interface built on a centralized simulation engine (EnergyPlus). It is intended both to reduce the implementation cost for users and to increase modeling standardization compared with an approach that requires users to build their own energy models. A pilot project with forty-two buildings (consisting mostly of offices and schools) was conducted in 2012. This paper reports the findings. Participants were asked to collect a minimum set of building data and enter it into the asset score tool. Participants also provided their utility bills, existing ENERGY STAR scores, and previous energy audit/modeling results if available. The results from the asset score tool were compared with the building energy use data provided by the pilot participants. Three comparisons were performed. First, the actual building energy use, either from the utility bills or via ENERGY STAR Portfolio Manager, was compared with the modeled energy use. This was intended to examine how well the energy asset score represents a building's system efficiencies, and how well it is correlated with a building's actual energy consumption. Second, calibrated building energy models (where they exist) were used to examine any discrepancies between the asset score model and the pilot participant buildings' [known] energy use pattern. This comparison examined the end

  16. Combining a Detailed Building Energy Model with a Physically-Based Urban Canopy Model

    NASA Astrophysics Data System (ADS)

    Bueno, Bruno; Norford, Leslie; Pigeon, Grégoire; Britter, Rex

    2011-09-01

    A scheme that couples a detailed building energy model, EnergyPlus, and an urban canopy model, the Town Energy Balance (TEB), is presented. Both models are well accepted and evaluated within their individual scientific communities. The coupled scheme proposes a more realistic representation of buildings and heating, ventilation and air-conditioning (HVAC) systems, which allows a broader analysis of the two-way interactions between the energy performance of buildings and the urban climate around the buildings. The scheme can be used to evaluate the building energy models that are being developed within the urban climate community. In this study, the coupled scheme is evaluated using measurements conducted over the dense urban centre of Toulouse, France. The comparison includes electricity and natural gas energy consumption of buildings, building façade temperatures, and urban canyon air temperatures. The coupled scheme is then used to analyze the effect of different building and HVAC system configurations on building energy consumption, waste heat released from HVAC systems, and outdoor air temperatures for the case study of Toulouse. Three different energy efficiency strategies are analyzed: shading devices, economizers, and heat recovery.

  17. Lidar-Equipped UAV for Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Roca, D.; Armesto, J.; Lagüela, S.; Díaz-Vilariño, L.

    2014-06-01

    The trend to miniaturize electronic devices in the last decades applies to Unmanned Airborne Vehicles (UAVs) as well as to sensor technologies and imaging devices, resulting in a strong revolution in the surveying and mapping industries. However, only within the last few years has LIDAR sensor technology achieved a sufficient reduction in size and weight to be considered for UAV platforms. This paper presents an innovative solution to capture point cloud data from a Lidar-equipped UAV and further perform the 3D modelling of the whole envelope of buildings in BIM format. A mini-UAV platform is used (weighing less than 5 kg, with up to 1.5 kg of sensor payload), and data from two different acquisition methodologies are processed and compared with the aim of finding the optimal configuration for the generation of 3D models of buildings for energy studies.

  18. Modelling the impact of agricultural management on soil carbon stocks at the regional scale: the role of lateral fluxes.

    PubMed

    Nadeu, Elisabet; Gobin, Anne; Fiener, Peter; van Wesemael, Bas; van Oost, Kristof

    2015-08-01

    Agricultural management has received increased attention over the last decades due to its central role in carbon (C) sequestration and greenhouse gas mitigation. Yet, despite the large body of literature on the effects of soil erosion by tillage and water on soil organic carbon (SOC) stocks in agricultural landscapes, the significance of soil redistribution for the overall C budget and the C sequestration potential of land management options remains poorly quantified. In this study, we explore the role of lateral SOC fluxes in regional scale modelling of SOC stocks under three different agricultural management practices in central Belgium: conventional tillage (CT), reduced tillage (RT) and reduced tillage with additional carbon input (RT+i). We assessed each management scenario twice: using a conventional approach that did not account for lateral fluxes and an alternative approach that included soil erosion-induced lateral SOC fluxes. The results show that accounting for lateral fluxes increased C sequestration rates by 2.7, 2.5 and 1.5 g C m-2 yr-1 for CT, RT and RT+i, respectively, relative to the conventional approach. Soil redistribution also led to a reduction of SOC concentration in the plough layer and increased the spatial variability of SOC stocks, suggesting that C sequestration studies relying on changes in the plough layer may underestimate the soil's C sequestration potential due to the effects of soil erosion. Additionally, lateral C export from cropland was of the same order of magnitude as C sequestration; hence, the fate of C exported from cropland into other land uses is crucial to determine the ultimate impact of management and erosion on the landscape C balance. Consequently, soil management strategies targeting C sequestration will be most effective when accompanied by measures that reduce soil erosion, given that erosion loss can balance potential C uptake, particularly in sloping areas. © 2015 John Wiley & Sons Ltd.

  19. Reducing stock-outs of essential tuberculosis medicines: a system dynamics modelling approach to supply chain management.

    PubMed

    Bam, L; McLaren, Z M; Coetzee, E; von Leipzig, K H

    2017-10-01

    The under-performance of supply chains presents a significant hindrance to disease control in developing countries. Stock-outs of essential medicines lead to treatment interruption which can force changes in patient drug regimens, drive drug resistance and increase mortality. This study is one of few to quantitatively evaluate the effectiveness of supply chain policies in reducing shortages and costs. This study develops a system dynamics simulation model of the downstream supply chain for amikacin, a second-line tuberculosis drug, using 10 years of South African data. We evaluate current supply chain performance in terms of reliability, responsiveness and agility, following the widely-used Supply Chain Operations Reference framework. We simulate 141 scenarios that represent different combinations of supplier characteristics, inventory management strategies and demand forecasting methods to identify the Pareto optimal set of management policies that jointly minimize the number of shortages and total cost. Despite long supplier lead times and unpredictable demand, the amikacin supply chain is 98% reliable and agile enough to accommodate a 20% increase in demand without a shortage. However, this is accomplished by overstocking amikacin by 167%, which incurs high holding costs. The responsiveness of suppliers is low: only 57% of orders are delivered to the central provincial drug depot within one month. We identify three Pareto optimal safety stock management policies. Short supplier lead time can produce Pareto optimal outcomes even in the absence of other optimal policies. This study produces concrete, actionable guidelines to cost-effectively reduce stock-outs by implementing optimal supply chain policies. Preferentially selecting drug suppliers with short lead times accommodates unexpected changes in demand. Optimal supply chain management should be an essential component of national policy to reduce the mortality rate. © The Author 2017. Published by Oxford
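
    A toy reorder-point simulation illustrates the kind of inventory policy evaluated in such models, counting stock-out months under a supplier lead time; it is a deliberately simplified stand-in for the paper's system dynamics model, with hypothetical parameters.

```python
import numpy as np

# Toy reorder-point inventory simulation: orders are placed when the inventory
# position falls below a reorder point and arrive after a supplier lead time.
# Demand and all parameter values are hypothetical.
rng = np.random.default_rng(3)
months, lead_time = 120, 3
reorder_point, order_qty, on_hand = 400, 600, 800
pipeline, shortage_months, holding = [], 0, 0.0

for _ in range(months):
    pipeline = [(eta - 1, qty) for eta, qty in pipeline]      # orders age by a month
    on_hand += sum(qty for eta, qty in pipeline if eta <= 0)  # receive arrivals
    pipeline = [(eta, qty) for eta, qty in pipeline if eta > 0]
    demand = max(0, int(rng.normal(150, 60)))                 # unpredictable demand
    if demand > on_hand:
        shortage_months += 1
        on_hand = 0
    else:
        on_hand -= demand
    position = on_hand + sum(qty for _, qty in pipeline)
    if position < reorder_point:
        pipeline.append((lead_time, order_qty))
    holding += on_hand

print(f"stock-out months: {shortage_months}, mean on-hand: {holding / months:.0f}")
```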

  20. Empirical regularities of order placement in the Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gu, Gao-Feng; Chen, Wei; Zhou, Wei-Xing

    2008-05-01

    Using ultra-high-frequency data extracted from the order flows of 23 stocks traded on the Shenzhen Stock Exchange, we study the empirical regularities of order placement in the opening call auction, cool period and continuous auction. The distributions of relative logarithmic prices against reference prices in the three time periods are qualitatively the same with quantitative discrepancies. The order placement behavior is asymmetric between buyers and sellers and between the inside-the-book orders and outside-the-book orders. In addition, the conditional distributions of relative prices in the continuous auction are independent of the bid-ask spread and volatility. These findings are crucial to build an empirical behavioral microscopic model based on order flows for Chinese stocks.

  1. A model for simulating airflow and pollutant dispersion around buildings

    SciTech Connect

    Chan, S T; Lee, R L

    1999-02-24

    A three-dimensional, numerical model for simulating airflow and pollutant dispersion around buildings is described. The model is based on an innovative finite element approach and fully implicit time integration techniques. Linear and nonlinear eddy viscosity/diffusivity submodels are provided for turbulence parameterization. Model predictions for the flow field and dispersion patterns around a surface-mounted cube are compared with measured data from laboratory experiments.

  2. FORTRAN M as a language for building earth system models

    SciTech Connect

    Foster, I.

    1992-12-31

    FORTRAN M is a small set of extensions to FORTRAN 77 that supports a modular or object-oriented approach to the development of parallel programs. In this paper, I discuss the use of FORTRAN M as a tool for building earth system models on massively parallel computers. I hypothesize that the use of FORTRAN M has software engineering advantages and outline experiments that we are conducting to investigate this hypothesis.

  3. FORTRAN M as a language for building earth system models

    SciTech Connect

    Foster, I.

    1992-01-01

    FORTRAN M is a small set of extensions to FORTRAN 77 that supports a modular or object-oriented approach to the development of parallel programs. In this paper, I discuss the use of FORTRAN M as a tool for building earth system models on massively parallel computers. I hypothesize that the use of FORTRAN M has software engineering advantages and outline experiments that we are conducting to investigate this hypothesis.

  4. Modeling and identification of multistory buildings with seismic recordings

    NASA Astrophysics Data System (ADS)

    Gargab, Lotfi O.

    This study proposes a continuous-discrete model for one-dimensional wave propagation in a multi-story building with seismic excitation and shows its applications in forward prediction analysis and inverse system identification. In particular, the building is modeled as a series of continuous shear beams for columns/walls in inter-stories and discrete lumped masses for floors. Wave response at one location of the building is then derived from an impulsive motion at another location in the time and frequency domains, termed here as wave-based or generalized impulse and frequency response functions (GIRF and GFRF). The GIRF and GFRF are fundamental in relating seismic wave responses at the two locations of a building structure subjected to seismic excitation that is not fully known due to the complicated soil-structure interaction. Additionally, they play a key role in characterizing seismic structural responses, as well as in identifying dynamic parameters and subsequently diagnosing local damage of the structure. For illustration, this study examines the ten-story Millikan Library in Pasadena, California with recordings of the Yorba Linda earthquake of September 3, 2002. With the use of the proposed continuous-discrete model as well as its degenerate forms, seismic wave responses are interpreted from the perspective of wave propagation and, more importantly, validated with the recordings and pertinent discrete-model-based results. Finally, a wave-based approach for system identification with a limited number of seismic recordings is presented, which can be used to evaluate structural integrity and detect damage in post-earthquake structural condition assessment.

  5. Simulation and Big Data Challenges in Tuning Building Energy Models

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan

    2013-01-01

    EnergyPlus is the flagship building energy simulation software used to model whole-building energy consumption for residential and commercial establishments. A typical input to the program often has hundreds, sometimes thousands, of parameters which are typically tweaked by a buildings expert to get it right. This process can sometimes take months. Autotune is an ongoing research effort employing machine learning techniques to automate the tuning of the input parameters for an EnergyPlus input description of a building. Even with automation, the computational challenge faced to run the tuning simulation ensemble is daunting and requires the use of supercomputers to make it tractable in time. In this proposal, we describe the scope of the problem, the technical challenges faced and overcome, the machine learning techniques developed and employed, and the software infrastructure developed or in development for taking the EnergyPlus engine, which was primarily designed to run on desktops, and scaling it to run on shared-memory supercomputers (Nautilus) and distributed-memory supercomputers (Frost and Titan). The parametric simulations produce data on the order of tens to a couple of hundred terabytes. We describe the approaches employed to streamline and reduce bottlenecks in the workflow for this data, which is subsequently being made available for the tuning effort as well as made publicly available for open science.

  6. Validation of Building Energy Modeling Tools Under Idealized and Realistic Conditions

    SciTech Connect

    Ryan, Emily M.; Sanquist, Thomas F.

    2012-04-02

    Building energy models provide valuable insight into the energy use of commercial and residential buildings based on the building architecture, materials and thermal loads. They are used in the design of new buildings and the retrofitting to increase the efficiency of older buildings. The accuracy of these models is crucial to reducing the energy use of the United States and building a sustainable energy future. In addition to the architecture and thermal loads of a building, building energy models also must account for the effects of the building's occupants on the energy use of the building. Traditionally simple schedule based methods have been used to account for the effects of the occupants. However, newer research has shown that these methods often result in large differences between the modeled and actual energy use of buildings. In this paper we discuss building energy models and their accuracy in predicting building energy use. In particular we focus on the different types of validation methods which have been used to investigate the accuracy of building energy models and how they account for (or do not account for) the effects of occupants. We also review some of the newer work on stochastic methods for estimating the effects of occupants on building energy use and discuss the improvements necessary to increase the accuracy of building energy models.

  7. Building and testing models with extended Higgs sectors

    NASA Astrophysics Data System (ADS)

    Ivanov, Igor P.

    2017-07-01

    Models with non-minimal Higgs sectors represent a mainstream direction in theoretical exploration of physics opportunities beyond the Standard Model. Extended scalar sectors help alleviate difficulties of the Standard Model and lead to a rich spectrum of characteristic collider signatures and astroparticle consequences. In this review, we introduce the reader to the world of extended Higgs sectors. Not pretending to exhaustively cover the entire body of literature, we walk through a selection of the most popular examples: the two- and multi-Higgs-doublet models, as well as singlet and triplet extensions. We will show how one typically builds models with extended Higgs sectors, describe the main goals and the challenges which arise on the way, and mention some methods to overcome them. We will also describe how such models can be tested, what are the key observables one focuses on, and illustrate the general strategy with a subjective selection of results.

  8. Model building strategy for logistic regression: purposeful selection.

    PubMed

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. This article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness-of-fit (GOF); in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
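
    The likelihood-ratio step described above can be sketched in Python (statsmodels) rather than R: fit nested logistic models with and without one candidate variable and compare their log-likelihoods. The variable names and the synthetic data below are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

# Compare nested logistic models with and without one candidate variable.
# Variable names and data are synthetic placeholders.
rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({"age": rng.normal(60, 10, n), "sofa": rng.normal(6, 3, n)})
p = 1.0 / (1.0 + np.exp(-(-5.0 + 0.05 * df["age"] + 0.3 * df["sofa"])))
df["death"] = rng.binomial(1, p)

full = smf.logit("death ~ age + sofa", df).fit(disp=False)
reduced = smf.logit("death ~ age", df).fit(disp=False)
lr_stat = 2.0 * (full.llf - reduced.llf)
p_value = chi2.sf(lr_stat, df=1)                  # one dropped parameter
print(f"LR = {lr_stat:.2f}, p = {p_value:.4f}")   # small p: keep 'sofa'
```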

  9. Toward Accessing Spatial Structure from Building Information Models

    NASA Astrophysics Data System (ADS)

    Schultz, C.; Bhatt, M.

    2011-08-01

    Data about building designs and layouts is becoming increasingly available. In the near future, service personnel (such as maintenance staff or emergency rescue workers) arriving at a building site will have immediate real-time access to enormous amounts of data relating to structural properties, utilities, materials, temperature, and so on. The critical problem for users is the taxing and error-prone task of interpreting such a large body of facts in order to extract salient information. This is necessary for comprehending a situation and deciding on a plan of action, and is a particularly serious issue in time-critical and safety-critical activities such as firefighting. Current unifying building models such as the Industry Foundation Classes (IFC), while being comprehensive, do not directly provide data structures that focus on spatial reasoning and spatial modalities that are required for high-level analytical tasks. The aim of the research presented in this paper is to provide computational tools for higher-level querying and reasoning that shift the cognitive burden of dealing with enormous amounts of data away from the user. The user can then spend more energy and time on planning and decision making in order to accomplish the tasks at hand. We present an overview of our framework that provides users with an enhanced model of "built-up space". In order to test our approach using realistic design data (in terms of both scale and the nature of the building models) we describe how our system interfaces with IFC, and we conduct timing experiments to determine the practicality of our approach. We discuss general computational approaches for deriving higher-level spatial modalities by focusing on the example of route graphs. Finally, we present a firefighting scenario with alternative route graphs to motivate the application of our framework.

  10. An Evolving Model for Capacity Building with Earth Observation Imagery

    NASA Astrophysics Data System (ADS)

    Sylak-Glassman, E. J.

    2015-12-01

    For the first forty years of Earth observation satellite imagery, all imagery was collected by civilian or military governmental satellites. Over this timeframe, countries without Earth observation satellite capabilities had very limited access to Earth observation data or imagery. In response to this limited access, capacity building efforts were focused on satellite manufacturing. Wood and Weigel (2012) describe the evolution of satellite programs in developing countries with a technology ladder: a country moves up the ladder as it moves from producing satellites with training services to building satellites locally. While the ladder model may be appropriate if the goal is to develop autonomous satellite manufacturing capability, in the realm of Earth observation the goal is generally to derive societal benefit from the use of Earth observation-derived information. In this case, the model for developing Earth observation capacity is more appropriately described by a hub-and-spoke model in which the use of Earth observation imagery is the "hub," and the "spokes" describe the various paths to acquiring that imagery: the building of a satellite (either independently or with assistance), the purchase of a satellite, participation in a constellation of satellites, and the use of freely available or purchased satellite imagery. We discuss the different capacity-building activities conducted along each of these pathways, such as the "Know-How Transfer and Training" program developed by Surrey Satellite Technology Ltd., Earth observation imagery training courses run by SERVIR in developing countries, and the use of national or regional remote sensing centers (such as those in Morocco, Malaysia, and Kenya) to disseminate imagery and training. In addition, we explore the factors that determine through which "spoke" a country arrives at the ability to use Earth observation imagery, and discuss best practices for achieving the capability to use that imagery.

  11. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    SciTech Connect

    Liao, James C.

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  12. Spatiotemporal models of global soil organic carbon stock to support land degradation assessments at regional and global scales: limitations, challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Hengl, Tomislav; Heuvelink, Gerard; Sanderman, Jonathan; MacMillan, Robert

    2017-04-01

    There is increasing interest in fitting and applying spatiotemporal models that can be used to assess and monitor soil organic carbon stocks (SOCS), for example in support of the '4 pourmille' initiative, which aims at soil carbon sequestration for climate change adaptation and mitigation, the UN's Land Degradation Neutrality indicators, and similar degradation assessment projects at regional and global scales. The land cover mapping community has already produced several spatiotemporal data sets with global coverage and at relatively fine resolution, e.g. USGS MODIS annual land cover maps for the period 2000-2014; European Space Agency land cover maps at 300 m resolution for the years 2000, 2005 and 2010; the Chinese GlobeLand30 dataset available for the years 2000 and 2010; and Columbia University's WRI GlobalForestWatch deforestation maps at 30 m resolution for the period 2000-2016 (Hansen et al. 2013). These data sets can be used for land degradation assessment and scenario testing at global and regional scales (Wei et al. 2014). Currently, however, no compatible global spatiotemporal data sets exist on the status of soil quality and/or soil health (Powlson et al. 2013). This paper describes an initial effort to devise and evaluate a procedure for mapping spatio-temporal changes in SOC stocks using a complete stack of soil-forming factors (climate, relief, land cover, land use, lithology and living organisms) represented mainly through remote-sensing-based time series of Earth images. For model building we used some 75,000 geo-referenced soil profiles and a stack of space-time covariates (land cover, land use, biomass, climate) at two standard resolutions: (1) 10 km resolution with data available for the period 1920-2014 and (2) 1000 m resolution with data available for the period 2000-2014. The initial results show that, although it is technically feasible to produce space-time estimates of SOCS that demonstrate the procedure, the estimates are relatively uncertain (<45% of variation
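
    The abstract does not name the learner used on the covariate stack; as a hedged illustration of the general workflow, the sketch below fits a random forest (an assumption, not necessarily the authors' method) to a synthetic table of space-time covariates and evaluates it by cross-validation.

```python
# Generic sketch of space-time SOC stock modelling from stacked covariates.
# The specific learner (random forest) and the column names are assumptions,
# not the authors' exact procedure; the data are synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000  # stand-in for ~75,000 geo-referenced soil profiles
profiles = pd.DataFrame({
    "year": rng.integers(2000, 2015, n),
    "mean_temp": rng.normal(15, 8, n),
    "precip": rng.gamma(2.0, 400.0, n),
    "ndvi": rng.uniform(0.1, 0.9, n),
    "land_cover": rng.integers(0, 10, n),
})
# Synthetic target: SOC stock (t C / ha) loosely driven by the covariates.
profiles["socs"] = (40 + 30 * profiles["ndvi"] - 0.5 * profiles["mean_temp"]
                    + rng.normal(0, 10, n)).clip(lower=0)

X, y = profiles.drop(columns="socs"), profiles["socs"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("Cross-validated R2:", scores.round(2))
# Predicting for a new year would reuse the fitted model on a covariate
# stack for that date, yielding a space-time SOCS map per grid cell.
```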

  13. An Approach for Incorporating Context in Building Probabilistic Predictive Models

    PubMed Central

    Wu, Juan Anna; Hsu, William; Bui, Alex AT

    2016-01-01

    With the increasing amount of information collected through clinical practice and scientific experimentation, a growing challenge is how to utilize available resources to construct predictive models to facilitate clinical decision making. Clinicians often have questions related to the treatment and outcome of a medical problem for individual patients; however, few tools exist that leverage the large collection of patient data and scientific knowledge to answer these questions. Without appropriate context, existing data that have been collected for a specific task may not be suitable for creating new models that answer different questions. This paper presents an approach that leverages available structured or unstructured data to build a probabilistic predictive model that assists physicians with answering clinical questions on individual patients. Various challenges related to transforming available data to an end-user application are addressed: problem decomposition, variable selection, context representation, automated extraction of information from unstructured data sources, model generation, and development of an intuitive application to query the model and present the results. We describe our efforts towards building a model that predicts the risk of vasospasm in aneurysm patients. PMID:27617299

  14. An Approach for Incorporating Context in Building Probabilistic Predictive Models.

    PubMed

    Wu, Juan Anna; Hsu, William; Bui, Alex At

    2012-09-01

    With the increasing amount of information collected through clinical practice and scientific experimentation, a growing challenge is how to utilize available resources to construct predictive models to facilitate clinical decision making. Clinicians often have questions related to the treatment and outcome of a medical problem for individual patients; however, few tools exist that leverage the large collection of patient data and scientific knowledge to answer these questions. Without appropriate context, existing data that have been collected for a specific task may not be suitable for creating new models that answer different questions. This paper presents an approach that leverages available structured or unstructured data to build a probabilistic predictive model that assists physicians with answering clinical questions on individual patients. Various challenges related to transforming available data to an end-user application are addressed: problem decomposition, variable selection, context representation, automated extraction of information from unstructured data sources, model generation, and development of an intuitive application to query the model and present the results. We describe our efforts towards building a model that predicts the risk of vasospasm in aneurysm patients.

  15. Roll System and Stock's Multi-parameter Coupling Dynamic Modeling Based on the Shape Control of Steel Strip

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Peng, Yan; Sun, Jianliang; Zang, Yong

    2017-03-01

    The existence of the rolling deformation area is the main characteristic that distinguishes the rolling mill system from other machinery. In order to analyze the dynamic behaviour of the roll system's flexural deformation, it is necessary to simultaneously consider the transverse periodic movement of the stock in the rolling deformation area, which is caused by the flexural deformation of the roll system. Therefore, the displacement field of the roll system and the flow of metal in the deformation area are described by kinematic analysis of the dynamic system. By introducing the lateral displacement function of the metal in the deformation area, the dynamic variation of the per-unit-width rolling force can be determined at the same time. The coupling law caused by the combined effect of rigid movement and flexural deformation of the system's structural elements is then determined. Furthermore, a multi-parameter coupling dynamic model of the roll system and stock is established by the principle of virtual work. More explicitly, a coupled-motion modal analysis is performed for the roll system, and analytical solutions for the mode shape functions of the rolls' flexural deformation are discussed. In addition, the dynamic characteristics of the lateral flow of metal in the rolling deformation area are analyzed. The establishment of the dynamic lateral displacement function of the metal in the deformation area lays the foundation for analyzing the coupling law between the roll system and the rolling deformation area, and provides a theoretical basis for realizing dynamic shape control of the steel strip.

  16. Roll System and Stock's Multi-parameter Coupling Dynamic Modeling Based on the Shape Control of Steel Strip

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Peng, Yan; Sun, Jianliang; Zang, Yong

    2017-05-01

    The existence of the rolling deformation area is the main characteristic that distinguishes the rolling mill system from other machinery. In order to analyze the dynamic behaviour of the roll system's flexural deformation, it is necessary to simultaneously consider the transverse periodic movement of the stock in the rolling deformation area, which is caused by the flexural deformation of the roll system. Therefore, the displacement field of the roll system and the flow of metal in the deformation area are described by kinematic analysis of the dynamic system. By introducing the lateral displacement function of the metal in the deformation area, the dynamic variation of the per-unit-width rolling force can be determined at the same time. The coupling law caused by the combined effect of rigid movement and flexural deformation of the system's structural elements is then determined. Furthermore, a multi-parameter coupling dynamic model of the roll system and stock is established by the principle of virtual work. More explicitly, a coupled-motion modal analysis is performed for the roll system, and analytical solutions for the mode shape functions of the rolls' flexural deformation are discussed. In addition, the dynamic characteristics of the lateral flow of metal in the rolling deformation area are analyzed. The establishment of the dynamic lateral displacement function of the metal in the deformation area lays the foundation for analyzing the coupling law between the roll system and the rolling deformation area, and provides a theoretical basis for realizing dynamic shape control of the steel strip.

  17. [Ecological carrying capacity of Chinese shrimp stock enhancement in Laizhou Bay of East China based on Ecopath model].

    PubMed

    Lin, Qun; Li, Xian-sen; Li, Zhong-yi; Jin, Xian-shi

    2013-04-01

    Stock enhancement is an important means of fishery resource conservation, which can increase high-quality fishery resources and improve fish population structure. The study of ecological carrying capacity is a prerequisite for the scientific implementation of stock enhancement. Based on survey data on the fishery resources and ecological environment of Laizhou Bay from 2009 to 2010, an Ecopath mass-balance model of the Laizhou Bay ecosystem consisting of 26 functional groups was constructed and applied to analyze the overall characteristics of the ecosystem, the trophic interrelationships, and the keystone species, and to calculate the ecological carrying capacity of Chinese shrimp enhancement. Regarding the overall characteristics of the ecosystem, the ratio of total primary production to total respiration (TPP/TR) was 1.53, total primary production/total biomass (TPP/B) was 24.54, Finn's cycling index was low (0.07), surplus production was high (434.41 t km-2 a-1), and the system connectance index was low (0.29), indicating that this ecosystem was at an early development stage. The keystone species analysis showed that Chinese shrimp was not a keystone species of this ecosystem. At present, the biomass of Chinese shrimp in the ecosystem is 0.1143 t km-2, leaving substantial potential for continued enhancement: the ecological carrying capacity of 2.9489 t km-2 would not be exceeded even if the biomass of Chinese shrimp were increased 25.8-fold.

  18. Time-varying coefficient vector autoregressions model based on dynamic correlation with an application to crude oil and stock markets.

    PubMed

    Lu, Fengbin; Qiao, Han; Wang, Shouyang; Lai, Kin Keung; Li, Yuze

    2017-01-01

    This paper proposes a new time-varying coefficient vector autoregression (VAR) model, in which the coefficient is a linear function of dynamic lagged correlation. The proposed model allows flexibility in the choice of dynamic correlation model (e.g. dynamic conditional correlation generalized autoregressive conditional heteroskedasticity (GARCH) models, Markov-switching GARCH models and multivariate stochastic volatility models), which means that it can describe many types of time-varying causal effects. Time-varying causal relations between the West Texas Intermediate (WTI) crude oil and the US Standard and Poor's 500 (S&P 500) stock markets are examined with the proposed model. The empirical results show that their causal relations evolve with time and display complex characteristics. Both positive and negative causal effects of the WTI on the S&P 500 are found in subperiods and confirmed by traditional VAR models. Similar results are obtained for the causal effects of the S&P 500 on WTI. In addition, the proposed model outperforms the traditional VAR model.
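
    As a point of reference for the proposed model, the sketch below fits the traditional constant-coefficient VAR baseline that the paper compares against, using statsmodels on synthetic stand-ins for the WTI and S&P 500 return series; the time-varying extension would make the cross coefficients depend on a dynamic correlation estimate rather than remain fixed.

```python
# Sketch of the constant-coefficient VAR baseline the paper compares against,
# applied to two synthetic return series standing in for WTI and the S&P 500.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 1000
wti = rng.normal(0, 1.5, n)
sp500 = 0.3 * np.roll(wti, 1) + rng.normal(0, 1.0, n)  # lagged spillover
data = pd.DataFrame({"wti": wti, "sp500": sp500}).iloc[1:]

results = VAR(data).fit(maxlags=5, ic="aic")
print(results.summary())

# Does WTI Granger-cause the S&P 500 returns in this fixed-coefficient model?
test = results.test_causality("sp500", ["wti"], kind="f")
print(test.summary())
```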

  19. Contam airflow models of three large buildings: Model descriptions and validation

    SciTech Connect

    Black, Douglas R.; Price, Phillip N.

    2009-09-30

    Airflow and pollutant transport models are useful for several reasons, including protection from or response to biological terrorism. In recent years they have been used for deciding how many biological agent samplers are needed in a given building to detect the release of an agent; to figure out where those samplers should be located; to predict the number of people at risk in the event of a release of a given size and location; to devise response strategies in the event of a release; to determine optimal trade-offs between sampler characteristics (such as detection limit and response time); and so on. For some of these purposes it is necessary to model a specific building of interest: if you are trying to determine optimal sampling locations, you must have a model of your building and not some different building. But for many purposes generic or 'prototypical' building models would suffice. For example, for determining trade-offs between sampler characteristics, results from one building will carry over to other, similar buildings. Prototypical building models are also useful for comparing or testing different algorithms or computational approaches: different researchers can use the same models, thus allowing direct comparison of results in a way that is not otherwise possible. This document discusses prototypical building models developed by the Airflow and Pollutant Transport Group at Lawrence Berkeley National Laboratory. The models are implemented in the Contam v2.4c modeling program, available from the National Institute of Standards and Technology. We present Contam airflow models of three virtual buildings: a convention center, an airport terminal, and a multi-story office building. All of the models are based to some extent on specific real buildings. Our goal is to produce models that are realistic, in terms of approximate magnitudes, directions, and speeds of airflow and pollutant transport. The three models vary substantially in detail. The airport model

  20. Microscopic Spin Model for the STOCK Market with Attractor Bubbling on Regular and Small-World Lattices

    NASA Astrophysics Data System (ADS)

    Krawiecki, A.

    A multi-agent spin model for price changes in the stock market, based on an Ising-like cellular automaton with interactions between traders that vary randomly in time, is investigated by means of Monte Carlo simulations. The structure of interactions has the topology of a small-world network, obtained from regular two-dimensional square lattices with various coordination numbers by randomly cutting and rewiring edges. Simulations of the model on regular lattices do not yield time series of logarithmic price returns with statistical properties comparable to the empirical ones. In contrast, for networks with a certain degree of randomness, the time series of logarithmic price returns exhibit, over a wide range of parameters, intermittent bursting typical of volatility clustering. The tails of the return distributions also obey a power-law scaling with exponents comparable to those obtained from the empirical data.
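
    A schematic Monte Carlo sketch of the same family of models is given below: traders are spins on a two-dimensional lattice and the log-return is taken proportional to the magnetization. The paper's randomly time-varying couplings and small-world rewiring are omitted, so this is an illustration of the modelling idea rather than a reproduction of the study.

```python
# Schematic Monte Carlo sketch of an Ising-like market model: traders are
# spins (+1 buy / -1 sell) on a 2D lattice and the log-return is taken to be
# the lattice magnetization. Time-varying couplings and small-world rewiring
# from the study are omitted for brevity.
import numpy as np

rng = np.random.default_rng(3)
L, steps, J, noise = 32, 2000, 1.0, 2.0
spins = rng.choice([-1, 1], size=(L, L))
returns = []

for _ in range(steps):
    # Random sequential update of a fraction of traders per time step.
    for _ in range(L * L // 10):
        i, j = rng.integers(0, L, 2)
        neighbours = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        field = J * neighbours + noise * rng.normal()
        spins[i, j] = 1 if field > 0 else -1
    returns.append(spins.mean())  # log-return proxy = magnetization

returns = np.array(returns)
kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2
print(f"kurtosis of simulated returns: {kurt:.2f}")
```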

  1. BUILD: a program generator for modelling experimental biological data.

    PubMed

    Rodriguez, F; Altibelli, A; Lopez, A

    1994-04-01

    BUILD is a program generator acting at the source code level. The generated code corresponds to a whole application for modelling a biological process of interest through iterative fitting to experimental data. The program is designed to be executed in command-line mode for processing multiple data files, with individual execution control for each file. The results are complemented by modular statistical and graphical functions. This approach has been shown to reduce the time and the amount of work needed for program development, debugging and maintenance. To date, BUILD has been successfully used in the mathematical analysis of phenomenological approaches, but other fields of application, such as educational software, are also conceivable.

  2. Developing Inventory Projection Models Using Empirical Net Forest Growth and Growing-Stock Density Relationships Across U.S. Regions and Species Group

    Treesearch

    Prakash Nepal; Peter J. Ince; Kenneth E. Skog; Sun J. Chang

    2012-01-01

    This paper describes a set of empirical net forest growth models based on forest growing-stock density relationships for three U.S. regions (North, South, and West) and two species groups (softwoods and hardwoods) at the regional aggregate level. The growth models accurately predict historical U.S. timber inventory trends when we incorporate historical timber harvests...

  3. Development and assessment of a physics-based simulation model to investigate residential PM2.5 infiltration across the US housing stock

    EPA Science Inventory

    The Lawrence Berkeley National Laboratory Population Impact Assessment Modeling Framework (PIAMF) was expanded to enable determination of indoor PM2.5 concentrations and exposures in a set of 50,000 homes representing the US housing stock. A mass-balance model is used to calculat...

  4. Development and assessment of a physics-based simulation model to investigate residential PM2.5 infiltration across the US housing stock

    EPA Science Inventory

    The Lawrence Berkeley National Laboratory Population Impact Assessment Modeling Framework (PIAMF) was expanded to enable determination of indoor PM2.5 concentrations and exposures in a set of 50,000 homes representing the US housing stock. A mass-balance model is used to calculat...

  5. A Stock Market Forecasting Model Combining Two-Directional Two-Dimensional Principal Component Analysis and Radial Basis Function Neural Network

    PubMed Central

    Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J.

    2015-01-01

    In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron. PMID:25849483

  6. A stock market forecasting model combining two-directional two-dimensional principal component analysis and radial basis function neural network.

    PubMed

    Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J

    2015-01-01

    In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron.
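
    A rough sketch of the forecasting pipeline follows, with ordinary PCA and an RBF-kernel ridge regressor standing in for the paper's (2D)2PCA and radial basis function neural network; the data are synthetic and the window length is arbitrary.

```python
# Rough sketch of the pipeline: dimension reduction on sliding windows of
# technical indicators, followed by a nonlinear regressor predicting the
# next-day return. Ordinary PCA and an RBF-kernel ridge regressor are
# stand-ins for the paper's (2D)2PCA and RBF neural network.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_days, n_features, window = 1200, 36, 10
indicators = rng.normal(size=(n_days, n_features))  # 36 technical variables
# Synthetic next-day return driven by the previous day's first indicator.
returns = 0.02 * np.roll(indicators[:, 0], 1) + rng.normal(0, 0.01, n_days)
returns[0] = 0.0  # no lagged indicator exists for the first day

# Sliding window: flatten the last `window` days of indicators per sample.
X = np.stack([indicators[t - window:t].ravel() for t in range(window, n_days)])
y = returns[window:]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)
model = make_pipeline(PCA(n_components=20), KernelRidge(kernel="rbf", alpha=1.0))
model.fit(X_tr, y_tr)
print("out-of-sample R2:", model.score(X_te, y_te))
```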

  7. 3D Building Evacuation Route Modelling and Visualization

    NASA Astrophysics Data System (ADS)

    Chan, W.; Armenakis, C.

    2014-11-01

    The most common building evacuation approach currently applied is to plan evacuation routes prior to emergency events. These routes are usually the shortest and most practical path from each building room to the closest exit. The problem with this approach is that it is not adaptive: it cannot be reconfigured in response to the type, intensity, or location of the emergency risk. Moreover, it provides no information to the affected persons or the emergency responders, nor does it allow for the review of simulated hazard scenarios and alternative evacuation routes. In this paper we address two main tasks. The first is modelling the spatial risk caused by a hazardous event, leading to the choice of the optimal evacuation route from a set of options. The second is generating a 3D visual representation of the model output. A multicriteria decision making (MCDM) approach is used to model the risk, with the aim of finding the optimal evacuation route. This is achieved by applying the analytical hierarchy process (AHP) to the criteria describing the alternative evacuation routes; the best route is then the alternative with the least cost. The 3D visual representation of the model displays the building, the surrounding environment, the evacuee's location, the hazard location, the risk areas and the optimal evacuation pathway to the target safety location. The work has been performed using ESRI's ArcGIS. Using the developed models, the user can input the location of the hazard and the location of the evacuee; the system then determines the optimum evacuation route and displays it in 3D.
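
    The AHP step can be sketched as follows: criterion weights are obtained from the principal eigenvector of a pairwise-comparison matrix, and each candidate route's cost is the weighted sum of its criterion scores. The criteria and numbers are illustrative, not those of the paper.

```python
# Sketch of the AHP step used to rank alternative evacuation routes:
# criterion weights come from the principal eigenvector of a pairwise-
# comparison matrix, and each route's cost is the weighted sum of its
# normalised criterion scores. Criteria and values are illustrative only.
import numpy as np

# Pairwise comparisons (Saaty scale) for: distance, proximity to hazard, congestion.
A = np.array([[1.0, 1 / 3, 2.0],
              [3.0, 1.0,   4.0],
              [0.5, 1 / 4, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()
print("criterion weights:", weights.round(3))

# Normalised criterion scores (higher = worse) for three candidate routes.
routes = {"route_1": [0.2, 0.8, 0.3],
          "route_2": [0.5, 0.1, 0.4],
          "route_3": [0.9, 0.3, 0.2]}
costs = {name: float(np.dot(weights, s)) for name, s in routes.items()}
best = min(costs, key=costs.get)
print("route costs:", costs, "-> optimal:", best)
```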

  8. Scalable tuning of building models to hourly data

    DOE PAGES

    Garrett, Aaron; New, Joshua Ryan

    2015-03-31

    Energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Manual tuning requires a skilled professional, is prohibitively expensive for small projects, imperfect, non-repeatable, non-transferable, and not scalable to the dozens of sensor channels that smart meters, smart appliances, and cheap/ubiquitous sensors are beginning to make available today. A scalable, automated methodology is needed to quickly and intelligently calibrate building energy models to all available data, increase the usefulness of those models, and facilitate speed-and-scale penetration of simulation-based capabilities into the marketplace for actualized energy savings. The "Autotune" project is a novel, model-agnostic methodology which leverages supercomputing, large simulation ensembles, and big data mining with multiple machine learning algorithms to allow automatic calibration of simulations that match measured experimental data in a way that is deployable on commodity hardware. This paper shares several methodologies employed to reduce the combinatorial complexity to a computationally tractable search problem for hundreds of input parameters. Furthermore, accuracy metrics are provided which quantify model error to measured data for either monthly or hourly electrical usage from a highly-instrumented, emulated-occupancy research home.

  9. Scalable tuning of building models to hourly data

    SciTech Connect

    Garrett, Aaron; New, Joshua Ryan

    2015-03-31

    Energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Manual tuning requires a skilled professional, is prohibitively expensive for small projects, imperfect, non-repeatable, non-transferable, and not scalable to the dozens of sensor channels that smart meters, smart appliances, and cheap/ubiquitous sensors are beginning to make available today. A scalable, automated methodology is needed to quickly and intelligently calibrate building energy models to all available data, increase the usefulness of those models, and facilitate speed-and-scale penetration of simulation-based capabilities into the marketplace for actualized energy savings. The "Autotune" project is a novel, model-agnostic methodology which leverages supercomputing, large simulation ensembles, and big data mining with multiple machine learning algorithms to allow automatic calibration of simulations that match measured experimental data in a way that is deployable on commodity hardware. This paper shares several methodologies employed to reduce the combinatorial complexity to a computationally tractable search problem for hundreds of input parameters. Furthermore, accuracy metrics are provided which quantify model error to measured data for either monthly or hourly electrical usage from a highly-instrumented, emulated-occupancy research home.
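
    The calibration idea, stripped to its essentials, is a parameter search that minimises the mismatch between simulated and measured hourly data. The toy sketch below tunes two parameters of a mock simulator with a generic optimiser; the actual Autotune project relies on large simulation ensembles and machine learning rather than this direct approach.

```python
# Toy illustration of calibrating a building model to hourly data: tune two
# parameters of a mock simulator so simulated load matches "measured" load by
# minimising hourly RMSE. This is not the Autotune methodology itself.
import numpy as np
from scipy.optimize import differential_evolution

hours = np.arange(24 * 7)

def simulate(params, outdoor_temp):
    """Mock building model: load = base + UA * max(T_out - 20, 0)."""
    base_load, ua = params
    return base_load + ua * np.clip(outdoor_temp - 20.0, 0, None)

rng = np.random.default_rng(5)
outdoor_temp = 25 + 8 * np.sin(2 * np.pi * hours / 24)
measured = simulate([3.0, 0.8], outdoor_temp) + rng.normal(0, 0.2, hours.size)

def rmse(params):
    return np.sqrt(np.mean((simulate(params, outdoor_temp) - measured) ** 2))

result = differential_evolution(rmse, bounds=[(0, 10), (0, 5)], seed=0)
print("calibrated parameters:", result.x.round(2), "RMSE:", round(result.fun, 3))
```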

  10. Automatic shape model building based on principal geodesic analysis bootstrapping.

    PubMed

    Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M

    2008-04-01

    We present a novel method for automatic shape model building from a collection of training shapes. The result is a shape model consisting of the mean model and the major modes of variation, with a dense correspondence map between individual shapes. The framework consists of iterations in which a medial shape representation is deformed into the training shapes, followed by computation of the shape mean and the modes of shape variation. In the first iteration, a generic shape model is used as the starting point; in the following iterations of the bootstrap method, the resulting mean and modes from the previous iteration are used. In this way, the shape variation in the training collection is captured progressively better. Convergence of the method is explicitly enforced. The method is evaluated on collections of artificial training shapes where the expected shape mean and modes of variation are known by design. Furthermore, collections of real prostates and cartilage sheets are used in the evaluation. The evaluation shows that the method is able to capture the training shapes close to the attainable accuracy already in the first iteration. Furthermore, the correspondence properties measured by generality, specificity, and compactness are improved during the shape model building iterations.

  11. Procedural Modeling for Rapid-Prototyping of Multiple Building Phases

    NASA Astrophysics Data System (ADS)

    Saldana, M.; Johanson, C.

    2013-02-01

    RomeLab is a multidisciplinary working group at UCLA that uses the city of Rome as a laboratory for the exploration of research approaches and dissemination practices centered on the intersection of space and time in antiquity. In this paper we present a multiplatform workflow for the rapid prototyping of historical cityscapes through the use of geographic information systems, procedural modeling, and interactive game development. Our workflow begins by aggregating archaeological data in a GIS database. Next, 3D building models are generated from the ArcMap shapefiles in Esri CityEngine using procedural modeling techniques. A GIS-based terrain model is also adjusted in CityEngine to fit the building elevations. Finally, the terrain and city models are combined in Unity, a game engine that we used to produce web-based interactive environments linked to the GIS data using Keyhole Markup Language (KML). The goal of our workflow is to demonstrate that knowledge generated within a first-person virtual world experience can inform the evaluation of data derived from textual and archaeological sources, and vice versa.

  12. Arbitrage and Volatility in Chinese Stock's Markets

    NASA Astrophysics Data System (ADS)

    Lu, Shu Quan; Ito, Takao; Zhang, Jianbo

    From the point of view of no-arbitrage pricing, what matters is how much volatility a stock has, for volatility measures the amount of profit that can be made from shorting stocks and purchasing options. Under short-sales constraints or in the absence of options, however, high volatility is likely to imply arbitrage opportunities in the stock market. As China's stock markets are emerging markets, investors are increasingly concerned about the volatility of its two stock exchanges. We estimate volatility models for Chinese stock market indexes using the Markov chain Monte Carlo (MCMC) method and GARCH. We find that the estimated volatility parameters are very high for all data frequencies, suggesting that stock returns in Chinese markets are extremely volatile even over long horizons. Furthermore, this result suggests that there may be arbitrage opportunities in Chinese stock markets.

  13. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    SciTech Connect

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over a period of interest, such as a month or a year. The procedures are designed according to the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way, the software vendor is not required to divulge or share proprietary information about how the software works, while stakeholders are still able to assess its performance.

  14. Digital Learning Material for Student-Directed Model Building in Molecular Biology

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Coppens, Marjolijn; Janssen, Fred; Hartog, Rob; Bisseling, Ton

    2005-01-01

    The building of models to explain data and make predictions constitutes an important goal in molecular biology research. To give students the opportunity to practice such model building, two digital cases had previously been developed in which students are guided to build a model step by step. In this article, the development and initial…

  15. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  16. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  17. Digital Learning Material for Student-Directed Model Building in Molecular Biology

    ERIC Educational Resources Information Center

    Aegerter-Wilmsen, Tinri; Coppens, Marjolijn; Janssen, Fred; Hartog, Rob; Bisseling, Ton

    2005-01-01

    The building of models to explain data and make predictions constitutes an important goal in molecular biology research. To give students the opportunity to practice such model building, two digital cases had previously been developed in which students are guided to build a model step by step. In this article, the development and initial…

  18. Occupants' satisfaction toward building environmental quality: structural equation modeling approach.

    PubMed

    Kamaruzzaman, Syahrul Nizam; Egbu, C O; Zawawi, Emma Marinie Ahmad; Karim, Saipol Bari Abd; Woon, Chen Jia

    2015-05-01

    It is accepted that occupants who are more satisfied with the internal environment of their workplace building are more productive. The main objective of the study was to measure the occupants' level of satisfaction and the perceived importance of design or refurbishment for office conditions. The study also attempted to determine the factors affecting the occupants' satisfaction with their building or office conditions. Post-occupancy evaluations were conducted using a structured questionnaire developed by the Built Environment Research Group at the University of Manchester, UK. Our questionnaires incorporate 22 factors relating to the internal environment and rate these in terms of "user satisfaction" and "degree of importance." The questions were modified to reflect the specific setting of the study and take into consideration the local conditions and climate in Malaysia. The overall mean satisfaction of the occupants with their office environment was 5.35, measured by a single item on overall liking of office conditions in general. Occupants were more satisfied with their state of health in the workplace, but they were extremely dissatisfied with their distance from a window. The factor analysis divided the variables into three groups, namely intrusion, air quality, and office appearance. Structural equation modeling (SEM) was then used to determine which factor had the most significant influence on occupants' satisfaction: appearance. The findings from the study suggest that continuous improvement in aspects of the building's appearance needs to be supported with effective and comprehensive maintenance to sustain the occupants' satisfaction.

  19. Structural equation modeling: building and evaluating causal models: Chapter 8

    USGS Publications Warehouse

    Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.

    2015-01-01

    Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.

  20. Determinants of residential electricity consumption: Using smart meter data to examine the effect of climate, building characteristics, appliance stock, and occupants' behavior

    SciTech Connect

    Kavousian, A; Rajagopal, R; Fischer, M

    2013-06-15

    We propose a method to examine structural and behavioral determinants of residential electricity consumption by developing separate models for daily maximum (peak) and minimum (idle) consumption. We apply our method to a data set of 1628 households' electricity consumption. The results show that weather, location and floor area are among the most important determinants of residential electricity consumption. In addition to these variables, the number of refrigerators and entertainment devices (e.g., VCRs) is among the most important determinants of daily minimum consumption, while the number of occupants and high-consumption appliances such as electric water heaters are the most significant determinants of daily maximum consumption. Installing double-pane windows and energy-efficient lights helped to reduce consumption, as did the energy-conscious use of electric heaters. Acknowledging climate change as a motivation to save energy was correlated with lower electricity consumption. Households with individuals over 55 or between 19 and 35 years old recorded lower electricity consumption, while pet owners showed higher consumption. Contrary to some previous studies, we observed no significant correlation between electricity consumption and income level, home ownership, or building age. Some otherwise energy-efficient features, such as energy-efficient appliances, programmable thermostats, and insulation, were correlated with a slight increase in electricity consumption.
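
    The core modelling idea, separate regressions for daily peak and idle consumption, can be sketched as follows on synthetic household data; the covariates are illustrative and not the study's full variable set.

```python
# Sketch of the modelling idea: reduce smart-meter readings to daily maximum
# (peak) and minimum (idle) consumption per household, then regress each on
# structural and behavioural covariates. Variables here are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_homes = 400
homes = pd.DataFrame({
    "floor_area": rng.normal(150, 40, n_homes),
    "occupants": rng.integers(1, 6, n_homes),
    "refrigerators": rng.integers(1, 3, n_homes),
    "electric_water_heater": rng.integers(0, 2, n_homes),
})
homes["daily_min_kwh"] = (0.2 + 0.003 * homes.floor_area
                          + 0.3 * homes.refrigerators
                          + rng.normal(0, 0.1, n_homes))
homes["daily_max_kwh"] = (1.0 + 0.01 * homes.floor_area + 0.5 * homes.occupants
                          + 1.5 * homes.electric_water_heater
                          + rng.normal(0, 0.5, n_homes))

idle_model = smf.ols("daily_min_kwh ~ floor_area + refrigerators", data=homes).fit()
peak_model = smf.ols(
    "daily_max_kwh ~ floor_area + occupants + electric_water_heater", data=homes
).fit()
print(idle_model.params.round(3))
print(peak_model.params.round(3))
```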

  1. Comparison between the probability distribution of returns in the Heston model and empirical data for stock indexes

    NASA Astrophysics Data System (ADS)

    Silva, A. Christian; Yakovenko, Victor M.

    2003-06-01

    We compare the probability distribution of returns for the three major stock-market indexes (Nasdaq, S&P 500, and Dow Jones) with an analytical formula recently derived by Drăgulescu and Yakovenko for the Heston model with stochastic variance. For the period 1982-1999, we find very good agreement between theory and data for a wide range of time lags, from 1 to 250 days. On the other hand, deviations start to appear when the data for 2000-2002 are included. We interpret this as statistical evidence of a major change in the market, from a positive growth rate in the 1980s and 1990s to a negative rate in the 2000s.
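
    For readers wanting to experiment, the sketch below simulates the Heston model with an Euler scheme and examines the resulting distribution of log-returns over a chosen lag, the quantity compared with index data in the study; parameter values are illustrative and the closed-form Drăgulescu-Yakovenko density is not reproduced here.

```python
# Euler-scheme simulation of the Heston model (stochastic variance) producing
# a distribution of log-returns over a chosen time lag. Parameter values are
# illustrative only; the closed-form Dragulescu-Yakovenko density is not
# reproduced here.
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps, dt = 20000, 250, 1 / 250   # one trading year, daily steps
mu, kappa, theta, xi, rho = 0.05, 4.0, 0.04, 0.3, -0.5

v = np.full(n_paths, theta)                  # instantaneous variance
log_s = np.zeros(n_paths)                    # log-price (S0 = 1)
for _ in range(n_steps):
    z1 = rng.normal(size=n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=n_paths)
    log_s += (mu - 0.5 * v) * dt + np.sqrt(v * dt) * z1
    v = np.abs(v + kappa * (theta - v) * dt + xi * np.sqrt(v * dt) * z2)  # reflect

returns = log_s                              # log-return over the full lag
excess_kurt = ((returns - returns.mean())**4).mean() / returns.var()**2 - 3
print(f"std: {returns.std():.3f}, excess kurtosis: {excess_kurt:.2f}")
```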

  2. Carbon stocks and cycling in the Amazon basin: Measurement and modeling of natural disturbance and recovery using airborne LIDAR

    NASA Astrophysics Data System (ADS)

    Hunter, Maria O'Healy

    Forest structure, the three dimensional distribution of living and dead plant material including live crowns, understory vegetation and coarse woody debris, is the concrete physical form of carbon storage, the framework for biodiversity, and the instantaneous manifestation of disturbance and recovery processes. The frequency of disturbance and rate of decomposition drives the fractions of living and dead biomass, and the size of and intensity of disturbance drives the rate and species composition of forest recovery; both are primary sinks and sources in the carbon cycle. To improve understanding of disturbance and recovery processes, high-resolution airborne LIDAR (light detection and ranging) data from the Amazon region is combined with field measurements to analyze forest structure. These measurements are incorporated into a simple model to estimate light availability and the associated changes in carbon stocks. This work improves the understanding of Amazon forest dynamics and its role in the carbon cycle.

  3. Planetary Boundary-Layer Modelling and Tall Building Design

    NASA Astrophysics Data System (ADS)

    Simiu, Emil; Shi, Liang; Yeo, DongHun

    2016-04-01

    Characteristics of flow in the planetary boundary layer (PBL) strongly affect the design of tall structures. PBL modelling in building codes, based as it is on empirical data from the 1960s and 1970s, differs significantly from contemporary PBL models, which account for both "neutral" flows and "conventionally neutral" flows. PBL heights estimated in these relatively sophisticated models are typically about half as large as those obtained using the classical asymptotic similarity approach, and are one order of magnitude larger than those specified in North American and Japanese building codes. A simple method is proposed for estimating the friction velocity and PBL height as functions of specified surface roughness and geostrophic wind speed. Based on published results, it is tentatively determined that, even at elevations as high as 800 m above the surface, the contribution to the resultant mean flow velocity of the component V normal to the surface stress is negligible and that the veering angle is of the order of only 5°. This note aims to encourage dialogue between boundary-layer meteorologists and structural engineers.

  4. An instanton toolbox for F-theory model building

    NASA Astrophysics Data System (ADS)

    Marsano, Joseph; Saulina, Natalia; Schäfer-Nameki, Sakura

    2010-01-01

    Several dimensionful parameters needed for model building can be engineered in a certain class of SU(5) F-theory GUTs by adding extra singlet fields which are localized along pairwise intersections of D7-branes. The values of these parameters, however, depend on dynamics external to the GUT which causes the singlets to acquire suitable masses or expectation values. In this note, we demonstrate that D3-instantons which wrap one of the intersecting D7’s can provide precisely the needed dynamics to generate several important scales, including the supersymmetry-breaking scale and the right-handed neutrino mass. Furthermore, these instantons seem unable to directly generate the μ term suggesting that, at least in this class of models, it should perhaps be tied to one of the other scales in the problem. More specifically, we study the simple system consisting of a pair of D7-branes wrapping del Pezzo surfaces which intersect along a curve Σ of genus 0 or 1 and classify all instanton configurations which can potentially contribute to the superpotential. This allows one to formulate topological conditions which must be imposed on Σ for various model-building applications.

  5. Inflation model building with an accurate measure of e-folding

    NASA Astrophysics Data System (ADS)

    Chongchitnan, Sirichai

    2016-08-01

    It has become standard practice to take the logarithmic growth of the scale factor as a measure of the amount of inflation, despite the well-known fact that this is only an approximation to the true amount of inflation required to solve the horizon and flatness problems. The aim of this work is to show how this approximation can be completely avoided using an alternative framework for inflation model building. We show that, using the inverse comoving Hubble radius, ℋ = aH, as the key dynamical parameter, the correct number of e-foldings arises naturally as a measure of inflation. As an application, we present an interesting model in which the entire inflationary dynamics can be solved analytically and exactly and which, in special cases, reduces to the familiar class of power-law models.

  6. From neurons to nests: nest-building behaviour as a model in behavioural and comparative neuroscience.

    PubMed

    Hall, Zachary J; Meddle, Simone L; Healy, Susan D

    Despite centuries of observing the nest building of most extant bird species, we know surprisingly little about how birds build nests and, specifically, how the avian brain controls nest building. Here, we argue that nest building in birds may be a useful model behaviour in which to study how the brain controls behaviour. Specifically, we argue that nest building as a behavioural model provides a unique opportunity to study not only the mechanisms through which the brain controls behaviour within individuals of a single species but also how evolution may have shaped the brain to produce interspecific variation in nest-building behaviour. In this review, we outline the questions in both behavioural and comparative neuroscience that nest building could be used to address, summarize recent findings regarding the neurobiology of nest building in lab-reared zebra finches and across species building different nest structures, and suggest some future directions for the neurobiology of nest building.

  7. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to the locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest are assigned to the background based on the distributions of historical earthquakes and the strain rate. We evaluate available ground motion prediction equations (GMPEs) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic regions. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and for business and land-use planning.
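
    One building block, the tapered Gutenberg-Richter (TGR) magnitude-frequency relation, can be sketched as below using the standard Kagan-style taper and the Hanks-Kanamori moment-magnitude relation; the parameter values are illustrative, whereas in the actual model the corner magnitude is constrained by the geodetic moment rate.

```python
# Hedged sketch of the tapered Gutenberg-Richter (TGR) magnitude-frequency
# relation used to model seismicity rates per source zone. The constants
# follow the common Hanks-Kanamori and Kagan conventions; numerical values
# below are illustrative, not the study's.
import numpy as np

def moment_from_mw(mw):
    """Seismic moment in N*m (log10 M0 ~ 1.5*Mw + 9.05)."""
    return 10 ** (1.5 * mw + 9.05)

def tgr_cumulative_rate(mw, rate_at_mt, mt=5.0, beta=0.65, corner_mw=8.0):
    """Annual rate of events with magnitude >= mw under a tapered GR law."""
    m0 = moment_from_mw(mw)
    m0_t = moment_from_mw(mt)        # threshold moment
    m0_c = moment_from_mw(corner_mw)  # corner moment (taper scale)
    return rate_at_mt * (m0_t / m0) ** beta * np.exp((m0_t - m0) / m0_c)

mags = np.arange(5.0, 8.6, 0.5)
rates = tgr_cumulative_rate(mags, rate_at_mt=2.0)  # 2 events/yr at Mw >= 5
for m, r in zip(mags, rates):
    print(f"Mw >= {m:.1f}: {r:.4f} events/yr")
```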

  8. Simulating Carbon Stocks and Fluxes of an African Tropical Montane Forest with an Individual-Based Forest Model

    PubMed Central

    Fischer, Rico; Ensslin, Andreas; Rutten, Gemma; Fischer, Markus; Schellenberger Costa, David; Kleyer, Michael; Hemp, Andreas; Paulick, Sebastian; Huth, Andreas

    2015-01-01

    Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 t C ha-1 yr-1. Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances. PMID:25915854

  9. Simulating carbon stocks and fluxes of an African tropical montane forest with an individual-based forest model.

    PubMed

    Fischer, Rico; Ensslin, Andreas; Rutten, Gemma; Fischer, Markus; Schellenberger Costa, David; Kleyer, Michael; Hemp, Andreas; Paulick, Sebastian; Huth, Andreas

    2015-01-01

    Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 t C ha(-1) yr(-1). Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances.

  10. Vision-based building energy diagnostics and retrofit analysis using 3D thermography and building information modeling

    NASA Astrophysics Data System (ADS)

    Ham, Youngjib

    The emerging energy crisis in the building sector and the legislative measures on improving energy efficiency are steering the construction industry towards adopting new energy efficient design concepts and construction methods that decrease the overall energy loads. However, the problems of energy efficiency are not only limited to the design and construction of new buildings. Today, a significant amount of input energy in existing buildings is still being wasted during the operational phase. One primary source of the energy waste is attributed to unnecessary heat flows through building envelopes during hot and cold seasons. This inefficiency increases the operational frequency of heating and cooling systems to keep the desired thermal comfort of building occupants, and ultimately results in excessive energy use. Improving thermal performance of building envelopes can reduce the energy consumption required for space conditioning and in turn provide building occupants with an optimal thermal comfort at a lower energy cost. In this sense, energy diagnostics and retrofit analysis for existing building envelopes are key enablers for improving energy efficiency. Since proper retrofit decisions of existing buildings directly translate into energy cost saving in the future, building practitioners are increasingly interested in methods for reliable identification of potential performance problems so that they can take timely corrective actions. However, sensing what and where energy problems are emerging or are likely to emerge and then analyzing how the problems influence the energy consumption are not trivial tasks. The overarching goal of this dissertation focuses on understanding the gaps in knowledge in methods for building energy diagnostics and retrofit analysis, and filling these gaps by devising a new method for multi-modal visual sensing and analytics using thermography and Building Information Modeling (BIM). First, to address the challenges in scaling and

  11. Emerging Challenges and Opportunities in Building Information Modeling for the US Army Installation Management Command

    DTIC Science & Technology

    2012-07-01

    Building Information Modeling (BIM) is the process of generating and managing building data during a facility's entire life cycle. New BIM standards for ... cycle Building Information Modeling (BIM) as a new standard for building information data repositories can serve as the foundation for automation and ... Building Information Modeling (BIM) is defined as "a digital representation of physical and functional

  12. The dependence of Islamic and conventional stocks: A copula approach

    NASA Astrophysics Data System (ADS)

    Razak, Ruzanna Ab; Ismail, Noriszura

    2015-09-01

    Recent studies have found that Islamic stocks are dependent on conventional stocks and appear to be more risky. In Asia, particularly in Islamic countries, research on dependence involving Islamic and non-Islamic stock markets is limited. The objective of this study is to investigate the dependence between the Financial Times Stock Exchange (FTSE) Hijrah Shariah index and conventional stock indices (the EMAS and KLCI indices). Using the copula approach, with a time series model for each marginal distribution function, the copula parameters were estimated. An elliptical copula was selected to represent the dependence structure of each pairing of the Islamic and conventional indices. Specifically, the Islamic-versus-conventional pairs (Shariah-EMAS and Shariah-KLCI) showed lower dependence than the conventional-versus-conventional pair (EMAS-KLCI). These findings suggest that the occurrence of shocks in a conventional stock index will not have a strong impact on the Islamic index.
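
    A minimal sketch of copula-based dependence estimation is given below: each return series is converted to pseudo-observations via ranks, mapped to normal scores, and the Gaussian (elliptical) copula correlation is read off. The study additionally fits a time series model to each marginal first; that step is omitted here and the data are synthetic.

```python
# Minimal sketch of estimating dependence with a Gaussian (elliptical) copula:
# convert each return series to pseudo-observations via ranks, map them to
# normal scores, and read off the copula correlation. The marginal time
# series modelling step used in the study is omitted; data are synthetic.
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(8)
n = 1500
# Synthetic stand-ins for the Shariah and conventional index returns.
common = rng.normal(size=n)
shariah = 0.6 * common + 0.8 * rng.normal(size=n)
klci = 0.8 * common + 0.6 * rng.normal(size=n)

def normal_scores(x):
    u = rankdata(x) / (len(x) + 1)   # pseudo-observations in (0, 1)
    return norm.ppf(u)

z = np.column_stack([normal_scores(shariah), normal_scores(klci)])
copula_rho = np.corrcoef(z, rowvar=False)[0, 1]
print(f"Gaussian-copula correlation: {copula_rho:.3f}")
```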

  13. Strategies for carbohydrate model building, refinement and validation.

    PubMed

    Agirre, Jon

    2017-02-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein-sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade.

  14. Strategies for carbohydrate model building, refinement and validation

    PubMed Central

    2017-01-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein–sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade. PMID:28177313

  15. A New Modeling Application of Legacy Data on Ecosystem Stocks and Fluxes in Multiple Land Uses in the Eastern Amazon

    NASA Astrophysics Data System (ADS)

    Nifong, R. L.; Davidson, E. A.

    2015-12-01

    Land-use change and its interaction with climate change remain significant threats to the integrity of Amazonian ecosystems. The responses and feedbacks of biogeochemical cycles to these changes play an important role in determining ecosystem responses to possible future trajectories for land stewardship, through effects on rates of secondary forest regrowth, soil emissions of greenhouse gases, inputs of nutrients to groundwater and streamwater, and nutrient management in agroecosystems. The Terrestrial Ecology program at NASA supported numerous studies on these topics in the Amazon and Cerrado regions, both before and during the LBA-ECO project. Here we present analyses of data from this body of work on nutrient cycling in cattle pastures, secondary forests, and mature forests of the Paragominas area, where we are developing a stoichiometric model relating C-N-P interactions during land-use change, constrained by multiple observations of ecosystem stocks and fluxes in each land use. Whereas P is conservatively cycled in all land uses, we demonstrate how pyrolysis of N during pasture formation and management depletes available-N pools, consistent with observations of lower rates of N leaching and trace gas emission and with secondary forest growth responses to experimental N amendments. Although the soils store large stocks of N and P, our parameterization of the available forms of these nutrients for steady-state dynamics in the mature forest yields reasonable estimates of net N and P mineralization available to grasses and secondary forest species, at rates consistent with observed biomass accumulation and productivity in these modified ecosystems. The multiple data constraints from measurements made by the type of integrated studies supported by the NASA TE program provide an important legacy that continues to support exploration of the functions, vulnerabilities, and resiliencies of these ecosystems.

  16. Modelling of cosmic-ray muon exposure in building's interior.

    PubMed

    Fujitaka, K; Abe, S

    1984-06-01

    Physical parameters for indoor exposure to cosmic-ray muons were determined in order to perform computer simulations. Previously known information was compiled, and the unknowns were newly calculated. The assumptions and approximations required to build a practical model are also described. The stopping power and range of muons in normal concrete, as well as in air, were calculated for energies up to hundreds of GeV. These results were found to be satisfactorily consistent with published tables, although comparisons were available only in the low-energy tail. The scattering of cosmic-ray muons in a building's interior was examined numerically through very simple model calculations. It was found that the overall scattering effect can be ignored unless very small variations are of interest. The iron fraction in reinforced concrete, as well as the density of the concrete, was also shown to be an ineffective factor.
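
    As an illustration of the stopping-power and range calculation mentioned above (not the authors' code), the sketch below uses the common high-energy approximation dE/dx ≈ a + bE for muon energy loss and integrates it analytically to a continuous-slowing-down range; the coefficients and concrete density are rough, generic assumptions rather than the paper's parameters.

    ```python
    # Sketch: continuous-slowing-down range of a muon from dE/dx ~ a + b*E,
    # where a covers ionization losses and b covers radiative losses.
    # Coefficients are rough generic values for a concrete-like material.
    import numpy as np

    A_IONIZATION = 2.0      # MeV cm^2 / g (approximate)
    B_RADIATIVE = 4.0e-6    # cm^2 / g (approximate)
    CONCRETE_DENSITY = 2.3  # g / cm^3 (typical ordinary concrete)

    def muon_range_cm(energy_mev: float) -> float:
        """Integrate dE / (a + b*E) from 0 to E analytically; convert to cm."""
        mass_range_g_cm2 = np.log(1.0 + B_RADIATIVE * energy_mev / A_IONIZATION) / B_RADIATIVE
        return mass_range_g_cm2 / CONCRETE_DENSITY

    for e_gev in (1, 10, 100):
        print(f"{e_gev:5d} GeV muon: range ~ {muon_range_cm(e_gev * 1000.0):.0f} cm of concrete")
    ```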

  17. Modeling National Impacts for the Building America Program

    SciTech Connect

    Coughlin, Katie M.; McNeil, Michael A.

    2006-06-15

    In this paper we present a model to estimate the national energy and economic impacts of the Department of Energy Building America program. The program goal is to improve energy performance in new residential construction by working with builders to design and construct energy-efficient homes at minimal cost. The model is an adaptation of the method used to calculate the national energy savings for appliance energy efficiency standards. The main difference is that the key decision here is not the consumer decision to buy an efficient house, but rather the builder decision to offer such a house in the market. The builder decision is treated by developing a number of scenarios in which the relative importance of first costs vs. energy savings is varied.
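
    A minimal sketch of the kind of accounting such a national-impacts model performs (an assumed structure for illustration, not the authors' implementation): per-home annual savings multiplied by the cumulative stock of efficient homes built under a builder-adoption scenario, summed over the forecast horizon. All figures are placeholders.

    ```python
    # Sketch of national savings accounting under a builder-adoption scenario.

    def national_savings_mwh(years, annual_starts, adoption_by_year, savings_per_home_mwh):
        """Sum savings from the growing stock of efficient homes over the horizon."""
        efficient_stock = 0.0
        total_savings_mwh = 0.0
        for year in years:
            efficient_stock += annual_starts * adoption_by_year[year]    # new efficient homes
            total_savings_mwh += efficient_stock * savings_per_home_mwh  # whole stock saves each year
        return total_savings_mwh

    years = range(2006, 2016)
    adoption = {y: min(0.05 * (y - 2005), 0.4) for y in years}  # ramping builder adoption
    print(f"{national_savings_mwh(years, 1.5e6, adoption, 3.0):.2e} MWh saved (illustrative)")
    ```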

  18. Stereovision vibration measurement test of a masonry building model

    NASA Astrophysics Data System (ADS)

    Shan, Baohua; Gao, Yunli; Shen, Yu

    2016-04-01

    To monitor the 3D deformations of structural vibration response, a stereovision-based 3D deformation measurement method is proposed in this paper. A world coordinate system is established on the structural surface, and 3D displacement equations of the structural vibration response are obtained through coordinate transformation. Algorithms for edge detection, center fitting, and matching constraints are developed for circular targets. A shaking table test of a masonry building model under the Taft and El Centro earthquakes at different peak accelerations was performed in the laboratory, and 3D displacement time histories of the model were acquired by the integrated stereovision measurement system. The in-plane displacement curves obtained by the two methods show good agreement, which suggests that the proposed method is reliable for monitoring structural vibration response. The out-of-plane displacement curves indicate that the proposed method is feasible and useful for monitoring the 3D deformations of vibration response.
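
    As a rough illustration of the circular-target detection and stereo triangulation steps described above (not the authors' implementation), the OpenCV sketch below finds a circle center in each camera view and triangulates it into 3D; the image paths and projection matrices are placeholders for a calibrated stereo rig.

    ```python
    # Sketch: detect a circular target in each camera view and triangulate it.
    import cv2
    import numpy as np

    def detect_circle_center(image_path):
        """Return the (x, y) center of the most prominent circle as a 2x1 array."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                                   param1=100, param2=30, minRadius=5, maxRadius=80)
        if circles is None:
            raise RuntimeError(f"no circular target found in {image_path}")
        x, y, _radius = circles[0][0]
        return np.array([[x], [y]], dtype=np.float64)

    # Hypothetical 3x4 projection matrices from a prior stereo calibration.
    P_left = np.hstack([np.eye(3), np.zeros((3, 1))])
    P_right = np.hstack([np.eye(3), np.array([[-120.0], [0.0], [0.0]])])

    pt_left = detect_circle_center("frame_left.png")
    pt_right = detect_circle_center("frame_right.png")

    homogeneous = cv2.triangulatePoints(P_left, P_right, pt_left, pt_right)  # 4x1
    xyz = (homogeneous[:3] / homogeneous[3]).ravel()
    print("target position (world units):", xyz)
    ```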

  19. Compressive sensing as a paradigm for building physics models

    NASA Astrophysics Data System (ADS)

    Nelson, Lance J.; Hart, Gus L. W.; Zhou, Fei; Ozoliņš, Vidvuds

    2013-01-01

    The widely accepted intuition that the important properties of solids are determined by a few key variables underpins many methods in physics. Though this reductionist paradigm is applicable in many physical problems, its utility can be limited because the intuition for identifying the key variables often does not exist or is difficult to develop. Machine learning algorithms (genetic programming, neural networks, Bayesian methods, etc.) attempt to eliminate the a priori need for such intuition but often do so with increased computational burden and human time. A recently developed technique in the field of signal processing, compressive sensing (CS), provides a simple, general, and efficient way of finding the key descriptive variables. CS is a powerful paradigm for model building; we show that its models are more physical and predict more accurately than current state-of-the-art approaches and can be constructed at a fraction of the computational cost and user effort.
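
    A minimal sketch of the core idea (a generic sparse-recovery example, not the authors' cluster-expansion code): with many candidate descriptors but few measurements, an L1-penalized fit recovers the small set of variables that actually matter.

    ```python
    # Generic compressive-sensing-style sketch: recover a sparse coefficient
    # vector from fewer measurements than candidate descriptors via an
    # L1-penalized least-squares fit.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    n_samples, n_descriptors = 60, 200            # underdetermined problem
    true_coeffs = np.zeros(n_descriptors)
    true_coeffs[[3, 47, 150]] = [1.5, -2.0, 0.7]  # only a few descriptors matter

    A = rng.normal(size=(n_samples, n_descriptors))      # sensing / descriptor matrix
    y = A @ true_coeffs + 0.01 * rng.normal(size=n_samples)

    model = Lasso(alpha=0.05).fit(A, y)
    recovered = np.flatnonzero(np.abs(model.coef_) > 0.1)
    print("recovered key descriptors:", recovered)       # expect [3, 47, 150]
    ```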

  20. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    PubMed

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has become an increasingly important method in the fields of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in orthopaedic surgery the solid modelling of screws and drill holes limits their use for individual cases and increases computational cost. To cope with these requirements, different methods for numerical screw modelling were investigated to improve the diversity of their application. As an example, fixation of a large segmental femoral bone defect was performed with an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study: without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility of implementing automatically generated screws with variable geometry on arbitrary FE models. Structural screws were generated automatically by a parametric Python script in the FE software Abaqus/CAE, on both a tetrahedrally and a hexahedrally meshed femur. The accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques, a sufficient correlation of approximately 95% between the numerical and experimental analyses was found. Furthermore, when structural elements were used for screw modelling, the computational time could be reduced by 85% by using hexahedral instead of tetrahedral elements for the femur mesh. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent
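
    The parametric, script-based screw generation described above can be sketched generically as follows (plain Python, not the Abaqus/CAE scripting interface used in the study): given plate hole positions, a screw axis direction, and a screw length, emit node coordinates and beam-element connectivity that a preprocessor could consume. All names and values are hypothetical.

    ```python
    # Generic sketch of parametric screw generation as structural (beam) elements.
    import numpy as np

    def generate_screw_beams(hole_positions, axis_direction, screw_length_mm):
        """Return (nodes, elements): node coordinates and beam node-pair connectivity."""
        direction = np.asarray(axis_direction, dtype=float)
        direction /= np.linalg.norm(direction)
        nodes, elements = [], []
        for hole in hole_positions:
            head = np.asarray(hole, dtype=float)          # screw head at the plate hole
            tip = head + screw_length_mm * direction      # screw tip in the bone stock
            nodes.extend([head, tip])
            elements.append((len(nodes) - 2, len(nodes) - 1))  # beam between head and tip
        return np.array(nodes), elements

    # Example: four plate holes, screws driven along -z into the bone, 40 mm long.
    holes = [(0.0, y, 0.0) for y in (0.0, 15.0, 30.0, 45.0)]
    nodes, beams = generate_screw_beams(holes, axis_direction=(0, 0, -1), screw_length_mm=40.0)
    print(nodes)
    print(beams)
    ```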