Los Alamos National Laboratory Economic Analysis Capability Overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boero, Riccardo; Edwards, Brian Keith; Pasqualini, Donatella
Los Alamos National Laboratory has developed two types of models to compute the economic impact of infrastructure disruptions. FastEcon is a fast-running model that estimates first-order economic impacts of large-scale events such as hurricanes and floods and can be used to identify the amount of economic activity that occurs in a specific area. LANL's Computable General Equilibrium (CGE) model estimates more comprehensive static and dynamic economic impacts of a broader array of events and captures the interactions between sectors and industries when estimating economic impacts.
Macromod: Computer Simulation For Introductory Economics
ERIC Educational Resources Information Center
Ross, Thomas
1977-01-01
The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)
Gaming via Computer Simulation Techniques for Junior College Economics Education. Final Report.
ERIC Educational Resources Information Center
Thompson, Fred A.
A study designed to answer the need for more attractive and effective economics education involved the teaching of one junior college economics class by the conventional (lecture) method and an experimental class by computer simulation techniques. Econometric models approximating the "real world" were computer programed to enable the experimental…
NASA Astrophysics Data System (ADS)
Aygunes, Gunes
2017-07-01
The objective of this paper is to survey and determine the macroeconomic factors affecting the level of venture capital (VC) investments in a country. The literature relates venture capitalists' quality to countries' venture capital investments. The aim of this paper is to characterize the relationship between venture capital investment and macroeconomic variables via statistical computation methods. We investigate the countries and macroeconomic variables and derive correlations between venture capital investments and macroeconomic variables. According to a logistic regression model (logit regression or logit model), the macroeconomic variables are correlated with each other in three groups. Venture capitalists regard these correlations as an indicator. Finally, we give the correlation matrix of our results.
NEUROBIOLOGY OF ECONOMIC CHOICE: A GOOD-BASED MODEL
Padoa-Schioppa, Camillo
2012-01-01
Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and I propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensori-motor contingencies of choice. Evidence from neurophysiology, imaging and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions. PMID:21456961
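The core computational claim of the good-based model (values computed independently per good, decisions made by comparing scalar values, which implies transitivity) can be illustrated with a small sketch. The attribute weights and goods below are hypothetical, purely to show the structure of the argument:

```python
# Hypothetical attribute weights: how one decision-maker integrates
# a good's determinants into a single subjective value.
weights = {"quantity": 1.0, "quality": 2.0, "delay_cost": -0.5}

goods = {
    "A": {"quantity": 3, "quality": 1, "delay_cost": 2},
    "B": {"quantity": 1, "quality": 2, "delay_cost": 0},
    "C": {"quantity": 2, "quality": 2, "delay_cost": 4},
}

def subjective_value(attributes):
    """Integrate a good's attributes into one scalar value."""
    return sum(weights[k] * v for k, v in attributes.items())

def choose(options):
    """Decide by comparing values, each computed independently per good."""
    return max(options, key=lambda g: subjective_value(goods[g]))

values = {g: subjective_value(a) for g, a in goods.items()}
print(values, choose(["A", "B", "C"]))
```

Because each good maps to one scalar before any comparison, pairwise choices derived from these values cannot cycle, which is the transitivity property the abstract mentions.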
Computer models for economic and silvicultural decisions
Rosalie J. Ingram
1989-01-01
Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.
Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation
NASA Astrophysics Data System (ADS)
Downey, W. T.; Hendrick, P. L.
1982-07-01
Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.
NASA Astrophysics Data System (ADS)
Halkos, George E.; Tsilika, Kyriaki D.
2011-09-01
In this paper we examine the property of asymptotic stability in several dynamic economic systems, modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of the initial conditions. Existence of economic equilibrium in continuous-time models is checked via a symbolic language, the Xcas program editor. Using stability theorems for differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results of ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity, and brevity of the input and output of our computer codes.
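For the linear case, the standard eigenvalue test behind such stability checks is short enough to sketch numerically (here with NumPy rather than Xcas; the matrices are made-up examples, not systems from the paper):

```python
import numpy as np

def is_asymptotically_stable(A):
    """A linear system dx/dt = A x is asymptotically stable
    iff every eigenvalue of A has a strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Hypothetical stable system: negative trace, positive determinant,
# so both eigenvalues lie in the left half-plane.
A_stable = np.array([[-2.0, 1.0],
                     [0.5, -3.0]])

# Counterexample: one eigenvalue (+1) is in the right half-plane.
A_unstable = np.array([[1.0, 0.0],
                       [0.0, -1.0]])

print(is_asymptotically_stable(A_stable))    # True
print(is_asymptotically_stable(A_unstable))  # False
```

When stability holds, trajectories converge to the equilibrium from any initial condition, which is exactly the "regardless of initial conditions" property described above.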
NASA Astrophysics Data System (ADS)
Babu, C. Rajesh; Kumar, P.; Rajamohan, G.
2017-07-01
Computation of fluid flow and heat transfer in an economizer is simulated by a porous medium approach, with plain tubes in a horizontal in-line arrangement and cross-flow arrangement in a coal-fired thermal power plant. The economizer is a thermal mechanical device that captures waste heat from the exhaust flue gases through heat transfer surfaces to preheat boiler feed water. In order to evaluate the fluid flow and heat transfer on tubes, a numerical analysis of heat transfer performance is carried out on an 110 t/h MCR (Maximum continuous rating) boiler unit. In this study, thermal performance is investigated using computational fluid dynamics (CFD) simulation with ANSYS FLUENT. The fouling factor ε and the overall heat transfer coefficient ψ are employed to evaluate the fluid flow and heat transfer. The model demands significant computational detail for geometric modeling, grid generation, and numerical calculation to evaluate the thermal performance of an economizer. The simulation results show that the overall heat transfer coefficient of 37.76 W/(m2 K) and the economizer coil-side pressure drop of 0.2 kg/cm2 are found to be in conformity with tolerable limits when compared with existing industrial economizer data.
The development of computer networks: First results from a microeconomic model
NASA Astrophysics Data System (ADS)
Maier, Gunther; Kaufmann, Alexander
Computer networks like the Internet are gaining importance in social and economic life. The accelerating pace of the adoption of network technologies for business purposes is a rather recent phenomenon. Many applications are still in the early, sometimes even experimental, phase. Nevertheless, it seems certain that networks will change the socioeconomic structures we know today. This is the background for our special interest in the development of networks: in the role of spatial factors in network formation, in the consequences of networks for spatial structures, and in the role of externalities. This paper discusses a simple economic model, based on a microeconomic calculus, that incorporates the main factors generating the growth of computer networks, and provides analytic results about their formation. It discusses (1) under what conditions economic factors will initiate the process of network formation, (2) the relationship between individual and social evaluation, and (3) the efficiency of a network that is generated by economic mechanisms.
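The initiation question in point (1) can be made concrete with a toy sequential-adoption model (my own illustration, not the paper's actual calculus): each agent joins only when the network externality from the existing membership covers its private connection cost, so a network with no members may never get started.

```python
def simulate_network_growth(benefit_per_member, costs):
    """Toy adoption model: an agent joins when the externality benefit
    (benefit per existing member times current network size) covers
    its private connection cost. Agents decide in cost order."""
    members = 0
    for cost in sorted(costs):  # cheapest-to-connect agents decide first
        if benefit_per_member * members >= cost:
            members += 1
    return members

# Hypothetical parameters. With no costless seed, formation never starts:
no_seed = simulate_network_growth(1.0, [0.5, 1.0, 2.0])
# A single zero-cost adopter can trigger an adoption cascade:
with_seed = simulate_network_growth(1.0, [0.0, 1.0, 2.0, 5.0])
print(no_seed, with_seed)
```

The gap between the two runs is the start-up problem created by network externalities: individual evaluation (each agent's private cost-benefit test) undervalues joining relative to the social evaluation, which is point (2) in the abstract.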
The economic impact of public resource supply constraints in northeast Oregon.
Edward C Waters; David W. Holland; Richard W. Haynes
1977-01-01
Traditional, fixed-price (input-output) economic models provide a useful framework for conceptualizing links in a regional economy. Apparent shortcomings in these models, however, can severely restrict our ability to deduce valid prescriptions for public policy and economic development. A more efficient approach using regional computable general equilibrium (CGE)...
NASA Astrophysics Data System (ADS)
Iglesias, A.; Quiroga, S.; Garrote, L.; Cunningham, R.
2012-04-01
This paper provides monetary estimates of the effects of agricultural adaptation to climate change in Europe. The model computes spatial crop productivity changes as a response to climate change, linking biophysical and socioeconomic components. It combines available data sets of crop productivity changes under climate change (Iglesias et al 2011, Ciscar et al 2011), statistical functions of productivity response to water and nitrogen inputs, catchment-level water availability, and environmental policy scenarios. Future global change scenarios are derived from several socio-economic futures of representative concentration pathways and regional climate models. The economic valuation is conducted using the GTAP general equilibrium model. The marginal productivity changes have been used as an input for the economic general equilibrium model in order to analyse the economic impact of the agricultural changes induced by climate change in the world. The study also includes the analysis of an adaptive capacity index computed using the socio-economic results of GTAP. The results are combined to prioritize agricultural adaptation policy needs in Europe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV language and can be run on a mainframe or a mini-computer system. An abbreviated version of the model is being developed for usage in conjunction with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, requirement of supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single stage systems as well as two stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boero, Riccardo; Edwards, Brian Keith
Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.
System capacity and economic modeling computer tool for satellite mobile communications systems
NASA Technical Reports Server (NTRS)
Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.
1988-01-01
A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph, Earl C.; Conway, Steve; Dekate, Chirag
This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.
Economic Modeling as a Component of Academic Strategic Planning.
ERIC Educational Resources Information Center
MacKinnon, Joyce; Sothmann, Mark; Johnson, James
2001-01-01
Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)
Coupling Climate Models and Forward-Looking Economic Models
NASA Astrophysics Data System (ADS)
Judd, K.; Brock, W. A.
2010-12-01
Authors: Dr. Kenneth L. Judd, Hoover Institution, and Prof. William A. Brock, University of Wisconsin. Current climate models range from General Circulation Models (GCM's) with millions of degrees of freedom to models with few degrees of freedom. Simple Energy Balance Climate Models (EBCM's) help us understand the dynamics of GCM's. The same is true in economics with Computable General Equilibrium Models (CGE's), where some models are infinite-dimensional differential equations but some are simple models. Nordhaus (2007, 2010) couples a simple EBCM with a simple economic model. One- and two-dimensional EBCM's do better at approximating damages across the globe and the positive and negative feedbacks from anthropogenic forcing (North et al. (1981), Wu and North (2007)). A proper coupling of climate and economic systems is crucial for arriving at effective policies. Brock and Xepapadeas (2010) have used Fourier/Legendre based expansions to study the shape of socially optimal carbon taxes over time at the planetary level in the face of damages caused by polar ice cap melt (as discussed by Oppenheimer, 2005), but in only a “one dimensional” EBCM. Economists have used orthogonal polynomial expansions to solve dynamic, forward-looking economic models (Judd, 1992, 1998). This presentation will couple EBCM climate models with basic forward-looking economic models, and examine the effectiveness and scaling properties of alternative solution methods. We will use a two-dimensional EBCM model on the sphere (Wu and North, 2007) and a multicountry, multisector regional model of the economic system. Our aim will be to gain insights into the intertemporal shape of the optimal carbon tax schedule, and its impact on global food production, as modeled by Golub and Hertel (2009). We will initially have limited computing resources and will need to focus on highly aggregated models.
However, this will be more complex than existing models with forward-looking economic modules, and the initial models will help guide the construction of more refined models that can effectively use more powerful computational environments to analyze economic policies related to climate change. REFERENCES Brock, W., Xepapadeas, A., 2010, “An Integration of Simple Dynamic Energy Balance Climate Models and Ramsey Growth Models,” Department of Economics, University of Wisconsin, Madison, and University of Athens. Golub, A., Hertel, T., et al., 2009, “The opportunity cost of land use and the global potential for greenhouse gas mitigation in agriculture and forestry,” RESOURCE AND ENERGY ECONOMICS, 31, 299-319. Judd, K., 1992, “Projection methods for solving aggregate growth models,” JOURNAL OF ECONOMIC THEORY, 58: 410-52. Judd, K., 1998, NUMERICAL METHODS IN ECONOMICS, MIT Press, Cambridge, Mass. Nordhaus, W., 2007, A QUESTION OF BALANCE: ECONOMIC MODELS OF CLIMATE CHANGE, Yale University Press, New Haven, CT. North, G. R., Cahalan, R., Coakley, J., 1981, “Energy balance climate models,” REVIEWS OF GEOPHYSICS AND SPACE PHYSICS, Vol. 19, No. 1, 91-121, February. Wu, W., North, G. R., 2007, “Thermal decay modes of a 2-D energy balance climate model,” TELLUS, 59A, 618-626.
Analysis and assessment of STES technologies
NASA Astrophysics Data System (ADS)
Brown, D. R.; Blahnik, D. E.; Huber, H. D.
1982-12-01
Technical and economic assessments completed in FY 1982 in support of the Seasonal Thermal Energy Storage (STES) segment of the Underground Energy Storage Program included: (1) a detailed economic investigation of the cost of heat storage in aquifers, (2) documentation for AQUASTOR, a computer model for analyzing aquifer thermal energy storage (ATES) coupled with district heating or cooling, and (3) a technical and economic evaluation of several ice storage concepts. This paper summarizes the research efforts and main results of each of these three activities. In addition, a detailed economic investigation of the cost of chill storage in aquifers is currently in progress. The work parallels that done for ATES heat storage with technical and economic assumptions being varied in a parametric analysis of the cost of ATES delivered chill. The computer model AQUASTOR is the principal analytical tool being employed.
Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.
Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca
2018-02-01
The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to computational difficulties associated with calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators, and it accurately estimates the EVSI. The computational time for this method is competitive with other methods. We have developed a new calculation method for the EVSI which is computationally efficient and accurate. This novel method relies on some additional simulation, so it can be expensive in models with a large computational cost.
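The idea of working with the distribution of the preposterior mean, rather than nested simulation, can be shown in a toy conjugate setting. This sketch is not the paper's moment-matching estimator; it uses a normal-normal model (made-up numbers) where that distribution is known in closed form, so a single layer of simulation suffices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decision problem: incremental net benefit (INB) has
# prior mean m0 and prior sd s0; a future study of size n would
# observe it with per-observation sampling sd s.
m0, s0, s = 1000.0, 4000.0, 8000.0

def evsi_normal(m0, s0, s, n, draws=200_000):
    """EVSI for a normal-normal conjugate model, via the distribution
    of the preposterior mean (the posterior mean as a random variable
    before the study is run)."""
    var_data = s**2 / n
    # Closed-form sd of the preposterior mean in this conjugate model:
    sd_prepost = s0**2 / np.sqrt(s0**2 + var_data)
    post_means = rng.normal(m0, sd_prepost, size=draws)
    # Decision rule: adopt iff expected INB is positive.
    value_with_info = np.maximum(post_means, 0.0).mean()
    value_without = max(m0, 0.0)
    return value_with_info - value_without

print(round(evsi_normal(m0, s0, s, n=50), 1))
```

Larger studies shift more prior uncertainty into the preposterior mean, so the EVSI grows with n toward the expected value of perfect information; real health economic models lack this closed form, which is where the paper's moment matching comes in.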
LIME SPRAY DRYER FLUE GAS DESULFURIZATION COMPUTER MODEL USERS MANUAL
The report describes a lime spray dryer/baghouse (FORTRAN) computer model that simulates SO2 removal and permits study of related impacts on design and economics as functions of design parameters and operating conditions for coal-fired electric generating units. The model allows ...
Computer model for economic study of unbleached kraft paperboard production
Peter J. Ince
1984-01-01
Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...
Computer Series, 13: Bits and Pieces, 11.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1982-01-01
Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…
Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-04-01
The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
BEST (bioreactor economics, size and time of operation) is an Excel™ spreadsheet-based model that is used in conjunction with the public domain geochemical modeling software, PHREEQCI. The BEST model is used in the design process of sulfate-reducing bacteria (SRB) field bioreacto...
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
Investment Justification of Robotic Technology in Aerospace Manufacturing. User’s Manual
1984-10-01
assessing the economic attractiveness of investments in robotics and/or flexible manufacturing systems (FMS). It models the cash flows...relative. 5. RIDM assesses the inherent economic attractiveness of robotic/FMS implementation. The model is based on real economic events and not...provided for an optional analysis of state and local tax impacts, to be custom designed by the user. (2) Computation of Depreciation
Patricia K. Lebow; Henry Spelter; Peter J. Ince
2003-01-01
This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...
Economic models for management of resources in peer-to-peer and grid computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David
2001-07-01
The accelerated development of Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development, and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include commodity markets, posted prices, tenders, and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
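The commodity-market model mentioned above prices a resource by supply and demand. A minimal sketch of such price discovery is tatonnement-style adjustment: raise the price while demand exceeds supply, lower it otherwise. The linear demand and supply curves here are hypothetical, not taken from the Nimrod/G work:

```python
def tatonnement(demand, supply, price=1.0, step=0.05,
                tol=1e-6, max_iter=10_000):
    """Commodity-market price discovery for a shared resource
    (e.g. CPU hours): adjust the price in proportion to excess
    demand until the market approximately clears."""
    for _ in range(max_iter):
        excess = demand(price) - supply(price)
        if abs(excess) < tol:
            break
        price += step * excess
    return price

# Hypothetical linear curves: demand falls and supply rises in price.
p_star = tatonnement(demand=lambda p: 100 - 10 * p,
                     supply=lambda p: 20 + 6 * p)
print(round(p_star, 3))  # clears near p = 5, where 100-50 = 20+30
```

Posted-price, tender, and auction models replace this iterative adjustment with one-shot or bid-based mechanisms, but all of them need the banking and enforcement infrastructure the abstract describes.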
Photogrammetry and computer-aided piping design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keneflick, J.F.; Chirillo, R.D.
1985-02-18
Three-dimensional measurements taken from photographs of a plant model can be digitized and linked with computer-aided piping design. This can short-cut the design and construction of new plants and expedite repair and retrofitting projects. Some designers bridge the gap between model and computer by digitizing from orthographic prints obtained via orthophotography or the laser scanning of model sections. The processing of such digitized valve and fitting data is described in this paper. The marriage of photogrammetry and computer-aided piping design can economically produce such numerical drawings.
NASA Astrophysics Data System (ADS)
Kang, Yoonyoung
While vast resources have been invested in the development of computational models for cost-benefit analysis for the "whole world" or for the largest economies (e.g. United States, Japan, Germany), the remainder have been thrown together into one model for the "rest of the world." This study presents a multi-sectoral, dynamic, computable general equilibrium (CGE) model for Korea. This research evaluates the impacts of controlling CO2 emissions using a multisectoral CGE model. This CGE economy-energy-environment model analyzes and quantifies the interactions between CO2, energy and economy. This study examines interactions and influences of key environmental policy components: applied economic instruments, emission targets, and environmental tax revenue recycling methods. The most cost-effective economic instrument is the carbon tax. The economic effects discussed include impacts on main macroeconomic variables (in particular, economic growth), sectoral production, and the energy market. This study considers several aspects of various CO2 control policies, such as the basic variables in the economy: capital stock and net foreign debt. The results indicate emissions might be stabilized in Korea at the expense of economic growth and with dramatic sectoral allocation effects. Carbon dioxide emissions stabilization could be achieved to the tune of a 600 trillion won loss over a 20 year period (1990-2010). The average annual real GDP would decrease by 2.10% over the simulation period compared to the 5.87% increase in the Business-as-Usual. This model satisfies an immediate need for a policy simulation model for Korea and provides the basic framework for similar economies. It is critical to keep the central economic question at the forefront of any discussion regarding environmental protection. How much will reform cost, and what does the economy stand to gain and lose? Without this model, the policy makers might resort to hesitation or even blind speculation.
With the model, the policy makers gain the power of prediction. This model serves as a tool for constructing the most effective strategy for Korea.
Economic Analysis. Computer Simulation Models.
ERIC Educational Resources Information Center
Sterling Inst., Washington, DC. Educational Technology Center.
A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…
A simulation model for wind energy storage systems. Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Warren, A. W.; Edsinger, R. W.; Chan, Y. K.
1977-01-01
A comprehensive computer program was developed for modeling wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic). The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler that generates computer models (in FORTRAN) of complex wind source/storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for convergence of variables. The SIMWEST program, as described, runs on UNIVAC 1100 series computers.
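The component-library idea behind a tool like SIMWEST can be illustrated with a toy energy-balance loop; the capacity, efficiency, and wind/load series below are invented for illustration and are not SIMWEST data or code.

```python
# Toy energy-balance loop of the kind a wind/storage simulator assembles from
# components (all numbers below are invented for illustration).

def simulate_storage(wind, load, capacity, efficiency=0.8):
    """Charge storage with surplus wind, discharge to cover deficits.

    Returns the final storage state and the total unserved load."""
    state, unserved = 0.0, 0.0
    for w, l in zip(wind, load):
        surplus = w - l
        if surplus >= 0:
            # Store surplus generation, losing (1 - efficiency) on the way in.
            state = min(capacity, state + surplus * efficiency)
        else:
            # Draw down storage; whatever cannot be covered is unserved.
            draw = min(state, -surplus)
            state -= draw
            unserved += -surplus - draw
    return state, unserved

wind = [5.0, 8.0, 2.0, 0.0, 6.0]   # hypothetical hourly wind output
load = [4.0, 4.0, 4.0, 4.0, 4.0]   # hypothetical flat load
final_state, unserved = simulate_storage(wind, load, capacity=3.0)
```

An economic layer, as in SIMWEST, would then price the unserved energy and the capital cost of the storage capacity.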
A Seminar in Mathematical Model-Building.
ERIC Educational Resources Information Center
Smith, David A.
1979-01-01
A course in mathematical model-building is described. Suggested modeling projects include: urban problems, biology and ecology, economics, psychology, games and gaming, cosmology, medicine, history, computer science, energy, and music. (MK)
Input-output model for MACCS nuclear accident impacts estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N
Since the original economic model for MACCS was developed, better-quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
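The input-output logic underlying tools of this kind can be sketched with a two-sector Leontief system; the coefficient matrix, demand vectors, and value-added ratios below are hypothetical, not the actual REAcct or MACCS data.

```python
# Minimal Leontief input-output sketch in the spirit of I-O impact tools
# (illustrative coefficients; not the actual MACCS/REAcct data).
import numpy as np

# Technical coefficients A[i][j]: input from sector i per unit output of sector j.
A = np.array([[0.10, 0.20],
              [0.30, 0.15]])
leontief_inverse = np.linalg.inv(np.eye(2) - A)

final_demand = np.array([100.0, 80.0])   # baseline final demand by sector
lost_demand = np.array([10.0, 0.0])      # demand lost during the outage

baseline_output = leontief_inverse @ final_demand
disrupted_output = leontief_inverse @ (final_demand - lost_demand)

# Direct + indirect output loss; a GDP loss scales this by sector
# value-added ratios (the ratios here are a hypothetical vector).
output_loss = baseline_output - disrupted_output
value_added_ratio = np.array([0.4, 0.5])
gdp_loss = float(value_added_ratio @ output_loss)
```

Note that the output loss in the shocked sector exceeds the direct demand loss: the Leontief inverse captures the indirect, inter-industry ripple effects.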
On the need and use of models to explore the role of economic confidence: a survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprigg, James A.; Paez, Paul J.; Hand, Michael S.
2005-04-01
Empirical studies suggest that consumption is more sensitive to current income than suggested under the permanent income hypothesis, which raises questions regarding expectations for future income, risk aversion, and the role of economic confidence measures. This report surveys a body of fundamental economic literature as well as burgeoning computational modeling methods to support efforts to better anticipate cascading economic responses to terrorist threats and attacks. This is a three-part survey to support the incorporation of models of economic confidence into agent-based microeconomic simulations. We first review broad underlying economic principles related to this topic. We then review the economic principle of confidence and related empirical studies. Finally, we provide a brief survey of efforts and publications related to agent-based economic simulation.
Strategies for Balanced Rural-Urban Growth. Agricultural Information Bulletin No. 392.
ERIC Educational Resources Information Center
Edwards, Clark
Summarizing an Economic Research Service (ERS) publication, this guide to balanced rural-urban growth describes the results of a computer-based ERS model that examined seven strategies to improve rural economic development. Based on 1960-70 trends, the model is described as asking how much would be required of each of the following strategies…
Evaluation of trade influence on economic growth rate by computational intelligence approach
NASA Astrophysics Data System (ADS)
Sokolov-Mladenović, Svetlana; Milovančević, Milos; Mladenović, Igor
2017-01-01
This study analyzed the influence of trade parameters on economic growth forecasting accuracy. A computational intelligence method was used for the analysis since it can handle highly nonlinear data. It is known that economic growth can be modeled based on different trade parameters. In this study five input parameters were considered: trade in services, exports of goods and services, imports of goods and services, trade, and merchandise trade. All of these parameters were calculated as percentages of gross domestic product (GDP). The main goal was to select the parameters with the most impact on the economic growth percentage. GDP was used as the economic growth indicator. Results show that imports of goods and services have the highest influence on economic growth forecasting accuracy.
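The parameter-selection idea can be illustrated with a simple leave-one-feature-out comparison on synthetic data; the paper itself uses a computational-intelligence method, so the ordinary-least-squares fit below is only a stand-in for the selection logic, and all data are invented.

```python
# Hedged sketch: ranking input parameters by leave-one-out predictive value,
# using OLS on synthetic data (a stand-in for the paper's method).
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic "trade" inputs; column 2 is made most informative by construction.
X = rng.normal(size=(n, 3))
y = 0.2 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

def fit_error(X, y):
    """Mean squared residual of an OLS fit with intercept."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    return float(resid @ resid / len(y))

base = fit_error(X, y)
# Importance of feature j = error increase when feature j is removed.
importance = [fit_error(np.delete(X, j, axis=1), y) - base for j in range(3)]
best = int(np.argmax(importance))  # index of the most influential input
```

The same leave-one-out comparison works with any predictor, including the nonlinear computational-intelligence models the study relies on.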
An economic and financial exploratory
NASA Astrophysics Data System (ADS)
Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.
2012-11-01
This paper describes the vision of a European Exploratory for economics and finance using an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers, who will combine their expertise to address the enormous challenges of the 21st century. This academic public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities, and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists to collaborate in investigating major issues in economics and finance. It is also considered a cradle for training and collaboration with the private sector to spur spin-offs and job creation in Europe's finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of the economies of countries in their various components, (ii) use, extend and develop a large variety of methods including data mining, process mining, computational and artificial intelligence and other computer-science and complexity-science techniques coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analysis, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.
A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing
NASA Astrophysics Data System (ADS)
Amini, Ahmad; Jamil, Norziana
2018-05-01
Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economic savings and minimal management effort. Although elasticity and flexibility bring tremendous benefits, cloud computing still raises many information security issues due to its unique characteristics that allow ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses the characteristics of cloud computing, which the existing models do not.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waters, E.C.; Holland, D.W.; Haynes, R.W.
1997-04-01
Traditional, fixed-price (input-output) economic models provide a useful framework for conceptualizing links in a regional economy. Apparent shortcomings in these models, however, severely restrict our ability to deduce valid prescriptions for public policy and economic development. A more efficient approach using regional computable general equilibrium (CGE) models is presented, along with a brief survey of relevant literature. Computable general equilibrium results under several different resource policy scenarios are examined and contrasted with a fixed-price analysis. In the most severe CGE scenario, elimination of Federal range programs caused the loss of 1,371 jobs (2.3 percent of regional employment) and $29 million (1.6 percent) of household income, and an 80-percent reduction in Federal log supplies resulted in the loss of 3,329 jobs (5.5 percent of regional employment) and $76 million (4.2 percent) of household income. These results do not include positive economic impacts associated with improvement in salmon runs. Economic counter-scenarios indicate that increases in tourism and high-technology manufacturing and growth in the population of retirees can largely offset total employment and income losses.
Modeling of terminal-area airplane fuel consumption
DOT National Transportation Integrated Search
2009-08-01
Accurate modeling of airplane fuel consumption is necessary for air transportation policy-makers to properly : adjudicate trades between competing environmental and economic demands. Existing public models used for : computing terminal-area airplane ...
Sea/Lake Water Air Conditioning at Naval Facilities.
1980-05-01
Describes the results of an operational test at Naval Security Group Activity (NSGA) Winter Harbor, Me., and the economics of Navywide application. In FY76 an assessment of the economics of Navywide application of sea/lake water AC indicated that cost and energy savings at the sites of some Naval facilities are possible, depending…
Development of Aspen: A microanalytic simulation model of the US economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pryor, R.J.; Basu, N.; Quint, T.
1996-02-01
This report describes the development of an agent-based microanalytic simulation model of the US economy. The microsimulation model capitalizes on recent technological advances in evolutionary learning and parallel computing. Results are reported for a test problem that was run using the model. The test results demonstrate the model's ability to predict business-like cycles in an economy where prices and inventories are allowed to vary. Since most economic forecasting models have difficulty predicting any kind of cyclic behavior, these results show the potential of microanalytic simulation models to improve economic policy analysis and to provide new insights into underlying economic principles. Work has already begun on a more detailed model.
Pharmaceutical industry and trade liberalization using computable general equilibrium model.
Barouni, M; Ghaderi, H; Banouei, Aa
2012-01-01
Computable general equilibrium models are known as a powerful instrument in economic analysis and have been widely used to evaluate trade liberalization effects. The purpose of this study was to assess the impacts of trade openness on the pharmaceutical industry using a CGE model. Using a computable general equilibrium model, the effects of decreases in tariffs, as a symbol of trade liberalization, on key variables of Iranian pharmaceutical products were studied. Simulation was performed via two scenarios. The first scenario was the effect of decreases in tariffs on pharmaceutical products of 10, 30, 50, and 100 percent on key drug variables, and the second was the effect of tariff decreases in all sectors except pharmaceutical products on vital economic variables of pharmaceutical products. The required data were obtained and the model parameters were calibrated according to the social accounting matrix of Iran in 2006. The simulation results demonstrated that under the first scenario import, export, drug supply to markets and household consumption increased, while import, export, supply of product to market, and household consumption of pharmaceutical products would on average decrease under the second scenario. Ultimately, social welfare would improve in all scenarios. We present and synthesize a CGE model that can be used to analyze trade liberalization policy issues in developing countries (like Iran), and thus provide information that policymakers can use to improve pharmaceutical economics.
John Bishir; James Roberds; Brian Strom; Xiaohai Wan
2009-01-01
SPLOB is a computer simulation model for the interaction between loblolly pine (Pinus taeda L.), the economically most important forest crop in the United States, and the southern pine beetle (SPB: Dendroctonus frontalis Zimm.), the major insect pest for this species. The model simulates loblolly pine stands from time of planting...
Lateral Orbitofrontal Inactivation Dissociates Devaluation-Sensitive Behavior and Economic Choice.
Gardner, Matthew P H; Conroy, Jessica S; Shaham, Michael H; Styer, Clay V; Schoenbaum, Geoffrey
2017-12-06
How do we choose between goods that have different subjective values, like apples and oranges? Neuroeconomics proposes that this is done by reducing complex goods to a single unitary value to allow comparison. This value is computed "on the fly" from the underlying model of the goods space, allowing decisions to meet current needs. This is termed "model-based" behavior to distinguish it from pre-determined, habitual, or "model-free" behavior. The lateral orbitofrontal cortex (OFC) supports model-based behavior in rats and primates, but whether the OFC is necessary for economic choice is less clear. Here we tested this question by optogenetically inactivating the lateral OFC in rats in a classic model-based task and during economic choice. Contrary to predictions, inactivation disrupted model-based behavior without affecting economic choice. Published by Elsevier Inc.
Economics of liquid hydrogen from water electrolysis
NASA Technical Reports Server (NTRS)
Lin, F. N.; Moore, W. I.; Walker, S. W.
1985-01-01
An economic model for preliminary analysis of LH2 cost from water electrolysis is presented. The model is based on data from vendors and the open literature, and is suitable for computer analysis of different scenarios for 'directional' purposes. Cost data associated with a production rate of 10,886 kg/day are presented. With minimal modification, the model can also be used to predict LH2 cost from any electrolyzer once the electrolyzer's cost data are available.
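A 'directional' cost model of this kind reduces, at its simplest, to electricity consumed per kilogram of liquid hydrogen times the electricity price. The energy figures below are representative literature values assumed for illustration, not the paper's vendor data.

```python
# Directional cost sketch for liquid hydrogen from electrolysis.  The energy
# figures are representative literature values, not the paper's vendor data.

HHV_KWH_PER_KG = 39.4            # higher heating value of H2, kWh per kg
ELECTROLYZER_EFF = 0.70          # assumed electrolyzer efficiency (HHV basis)
LIQUEFACTION_KWH_PER_KG = 12.0   # assumed liquefaction energy, kWh per kg

def lh2_electricity_cost(price_per_kwh):
    """Electricity cost per kg of liquid H2 (capital and water costs ignored)."""
    electrolysis_kwh = HHV_KWH_PER_KG / ELECTROLYZER_EFF
    return (electrolysis_kwh + LIQUEFACTION_KWH_PER_KG) * price_per_kwh

# Daily electricity cost at the abstract's production rate of 10,886 kg/day,
# at an assumed electricity price of $0.05/kWh.
daily_cost = 10886 * lh2_electricity_cost(0.05)
```

A fuller model would add capital charges, maintenance, and water feedstock, which is where the vendor data in the paper come in.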
Offshore Wind Jobs and Economic Development Impact: Four Regional Scenarios (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tegen, S.
NREL's Jobs and Economic Development Impact (JEDI) Model for Offshore Wind is a computer tool for studying the economic impacts of fixed-bottom offshore wind projects in the United States. This presentation provides the results of an analysis of four offshore wind development scenarios in the Southeast Atlantic, Great Lakes, Mid-Atlantic, and Gulf of Mexico regions.
ERIC Educational Resources Information Center
Morrison, James L.
A computerized delivery system in consumer economics developed at the University of Delaware uses the PLATO system to provide a basis for analyzing consumer behavior in the marketplace. The 16 sequential lessons, part of the Consumer in the Marketplace Series (CMS), demonstrate consumer economic theory in layman's terms and are structured to focus…
Socio-economic and climate change impacts on agriculture: an integrated assessment, 1990–2080
Fischer, Günther; Shah, Mahendra; N. Tubiello, Francesco; van Velhuizen, Harrij
2005-01-01
A comprehensive assessment of the impacts of climate change on agro-ecosystems over this century is developed, up to 2080 and at a global level, albeit with significant regional detail. To this end, an integrated ecological–economic modelling framework is employed, encompassing climate scenarios, agro-ecological zoning information, socio-economic drivers, and world food trade dynamics. Specifically, global simulations are performed using the FAO/IIASA agro-ecological zone model, in conjunction with IIASA's global food system model, using climate variables from five different general circulation models, under four different socio-economic scenarios from the Intergovernmental Panel on Climate Change. First, impacts of different scenarios of climate change on bio-physical soil and crop growth determinants of yield are evaluated on a 5′×5′ latitude/longitude global grid; second, the extent of potential agricultural land and related potential crop production is computed. The detailed bio-physical results are then fed into an economic analysis to assess how climate impacts may interact with alternative development pathways, and key trends expected over this century for food demand, production and trade, as well as key composite indices such as risk of hunger and malnutrition, are computed. This modelling approach connects the relevant bio-physical and socio-economic variables within a unified and coherent framework to produce a global assessment of food production and security under climate change. The results suggest that critical impact asymmetries due to both climate and socio-economic structures may deepen current production and consumption gaps between the developed and developing world; it is suggested that adaptation of agricultural techniques will be central to limiting potential damages under climate change. PMID:16433094
Comparative analysis of economic models in selected solar energy computer programs
NASA Astrophysics Data System (ADS)
Powell, J. W.; Barnes, K. A.
1982-01-01
The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life-cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program, developed by the National Bureau of Standards specifically to meet the Federal life-cycle cost requirements, serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single-family residence and a low-rise office building.
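The life-cycle-cost arithmetic these programs automate can be sketched in a few lines; the discount rate, lifetime, and costs below are illustrative assumptions, not values from any of the compared programs.

```python
# Minimal life-cycle-cost sketch of the kind these programs automate
# (discount rate, lifetime, and costs are illustrative assumptions).

def life_cycle_cost(first_cost, annual_energy_cost, discount_rate, years):
    """First cost plus the present value of recurring annual energy costs."""
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t
                    for t in range(1, years + 1))
    return first_cost + annual_energy_cost * pv_factor

# Hypothetical solar vs conventional comparison over a 25-year study period.
solar = life_cycle_cost(first_cost=8000.0, annual_energy_cost=200.0,
                        discount_rate=0.07, years=25)
conventional = life_cycle_cost(first_cost=1000.0, annual_energy_cost=900.0,
                               discount_rate=0.07, years=25)
solar_is_cost_effective = solar < conventional
```

Differences among the compared programs largely come down to how they choose the discount rate, escalation assumptions, and which recurring costs enter this present-value sum.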
NASA Astrophysics Data System (ADS)
McDonald, G. W.; Cronin, S. J.; Kim, J.-H.; Smith, N. J.; Murray, C. A.; Procter, J. N.
2017-12-01
The economic impacts of volcanism extend well beyond the direct costs of loss of life and asset damage. This paper presents one of the first attempts to assess the economic consequences of disruption associated with volcanic impacts at a range of temporal and spatial scales using multi-regional and dynamic computable general equilibrium (CGE) modelling. Based on the last decade of volcanic research findings at Mt. Taranaki, three volcanic event scenarios (Tahurangi, Inglewood and Opua) differentiated by critical physical thresholds were generated. In turn, the corresponding disruption economic impacts were calculated for each scenario. Under the Tahurangi scenario (annual probability of 0.01-0.02), a small-scale explosive (Volcanic Explosivity Index (VEI) 2-3) and dome-forming eruption, the economic impacts were negligible, with complete economic recovery experienced within a year. The larger Inglewood sub-Plinian to Plinian eruption scenario (VEI > 4, annualised probability of 0.003) produced significant impacts on the Taranaki region economy of NZ$207 million (representing 4.0% of regional gross domestic product (GDP) 1 year after the event, in 2007 New Zealand dollars) that will take around 5 years to recover from. The Opua scenario, the largest-magnitude volcanic hazard modelled, is a major flank collapse and debris avalanche event with an annual probability of 0.00018. The associated economic impacts of this scenario were NZ$397 million (representing 7.7% of regional GDP 1 year after the event), with the Taranaki region economy suffering permanent structural changes. Our dynamic analysis illustrates that different economic impacts play out at different stages of a volcanic crisis. We also discuss the key strengths and weaknesses of our modelling along with potential extensions.
Dong, Xianlei; Bollen, Johan
2015-01-01
Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting. PMID:25826692
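The core of such an index is relating a search-volume series to a confidence series. The sketch below uses a plain Pearson correlation on invented monthly data as a stand-in for the paper's fuller computational treatment.

```python
# Sketch of relating a search-volume series to a confidence series, in the
# spirit of the C3I index (synthetic data; Pearson correlation stands in for
# the paper's full computational treatment).
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series: normalized search volume for an economic term
# and a consumer-confidence index for the same months.
search_volume = [42.0, 55.0, 61.0, 48.0, 70.0, 66.0]
confidence = [101.2, 98.5, 97.0, 100.1, 95.3, 96.0]

r = pearson(search_volume, confidence)
```

In this invented example, heavier searching coincides with lower confidence, so the correlation comes out negative; the paper builds a composite index from many such query series rather than a single correlation.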
Computable general equilibrium model fiscal year 2013 capability development report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Brian Keith; Rivera, Michael Kelly; Boero, Riccardo
This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.
Numerical Optimization Using Desktop Computers
1980-09-11
Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle… A third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical… The objective of this thesis was to develop a system of interactive programs for the Hewlett…
Assessing the benefits and economic values of trees
David J. Nowak
2017-01-01
Understanding the environmental, economic, and social/community benefits of nature, in particular trees and forests, can lead to better vegetation management and designs to optimize environmental quality and human health for current and future generations. Computer models have been developed to assess forest composition and its associated effects on environmental...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state-of-the-art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into the computerized systems used by policy analysts to estimate the economic impacts of various types of transportation system failures due to natural hazards, human-related attacks, or technological accidents. This paper presents a reduced-form approach that simplifies the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the resulting "synthetic data" in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focus on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
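The reduced-form procedure (Latin hypercube sampling of shock parameters, then regression on the simulated outcomes) can be sketched as follows; the loss function below is a synthetic stand-in for actual CGE runs, and all parameter ranges are invented.

```python
# Sketch of the reduced-form idea: sample parameters by Latin hypercube,
# evaluate a stand-in for the CGE model, then fit a linear response surface.
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified draw per interval in each dimension, shuffled."""
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        samples[:, d] = rng.permutation(strata)
    return samples

n = 100
u = latin_hypercube(n, 2, rng)
magnitude = u[:, 0] * 0.5     # shock size: 0-50% capacity loss (assumed range)
duration = u[:, 1] * 30.0     # disruption length: 0-30 days (assumed range)

def cge_loss(m, d):
    """Synthetic economic-loss surface standing in for full CGE runs."""
    return 120.0 * m * d + 5.0 * d

loss = cge_loss(magnitude, duration)

# Ordinary least squares on the "synthetic data" -> reduced-form equation.
X = np.column_stack([np.ones(n), magnitude, duration, magnitude * duration])
beta, *_ = np.linalg.lstsq(X, loss, rcond=None)
```

The fitted coefficients become the lightweight equations embedded in the analysts' tool; quantile regression would be fitted the same way to capture the spread of outcomes rather than the mean.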
DOT National Transportation Integrated Search
2000-06-01
The Highway Economic Requirements System (HERS) computer model estimates investment requirements for the nation's highways by adding together the costs of highway improvements that the model's benefit-cost analyses indicate are warranted. In making i...
Computational Methods to Assess the Production Potential of Bio-Based Chemicals.
Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M; Herrgård, Markus J
2018-01-01
Elevated costs and long implementation times of bio-based processes for producing chemicals represent a bottleneck for moving to a bio-based economy. A prospective analysis able to elucidate economically and technically feasible product targets at early research phases is mandatory. Computational tools can be implemented to explore the biological and technical spectrum of feasibility while constraining the operational space for desired chemicals. In this chapter, two computational tools for assessing the potential for bio-based production of chemicals from different perspectives are described in detail. The first tool is GEM-Path, an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and guiding policy makers' decisions as we progress toward planning a sustainable chemical industry.
Annual economic impacts of seasonal influenza on US counties: Spatial heterogeneity and patterns
2012-01-01
Economic impacts of seasonal influenza vary across US counties, but little estimation has been conducted at the county level. This research computed annual economic costs of seasonal influenza for 3,143 US counties based on Census 2010, identified inherent spatial patterns, and investigated the cost-benefits of vaccination strategies. The computing model modified existing methods for national-level estimation, and further emphasized spatial variations between counties in terms of population size, age structure, influenza activity, and income level. Using this model, four vaccination strategies that prioritize different types of counties were simulated and their net returns were examined. The results indicate that the annual economic costs of influenza varied from $13.9 thousand to $957.5 million across US counties, with a median of $2.47 million. Prioritizing vaccines to counties with high influenza attack rates produces the fewest influenza cases and highest net returns. This research fills the current knowledge gap by downscaling the estimation to the county level, and adds spatial variability to studies of influenza economics and interventions. Compared to the national estimates, the presented statistics and maps will offer detailed guidance for local health agencies to fight against influenza. PMID:22594494
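The county-level scaling logic can be sketched as population times attack rate times per-case costs, with productivity losses scaled by local income; every rate and unit cost below is an illustrative placeholder, not a calibrated value from the paper.

```python
# Hedged sketch of county-level influenza cost scaling (all rates and unit
# costs are illustrative placeholders, not the paper's calibrated values).

def county_flu_cost(population, attack_rate, median_income):
    """Direct medical cost plus income-scaled productivity loss per season."""
    cases = population * attack_rate
    direct_medical = cases * 150.0       # assumed medical cost per case, $
    workdays_lost = cases * 3.0          # assumed workdays lost per case
    daily_wage = median_income / 250.0   # approx. working days per year
    indirect = workdays_lost * daily_wage
    return direct_medical + indirect

# Two hypothetical counties differing in size, attack rate, and income.
small = county_flu_cost(population=5_000, attack_rate=0.08,
                        median_income=40_000)
large = county_flu_cost(population=900_000, attack_rate=0.10,
                        median_income=55_000)
```

Varying these inputs county by county is what produces the wide spatial spread of costs the paper reports, and a vaccination strategy's benefit can be evaluated as the cost averted per dose.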
NASA Astrophysics Data System (ADS)
Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.
2014-04-01
Disaster damages have negative effects on the economy, whereas reconstruction investment has positive effects. The aim of this study is to model the economic consequences of disasters and recovery, incorporating the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and, furthermore, avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock reduced on the supply side of the economy; a portion of investments restores the capital stock in an existing period; an investment-driven dynamic model is formulated according to available reconstruction data; and the rest of the country's saving is set as an endogenous variable to balance the fixed investment. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual; the differences between S1 and S0, and between S2 and S0, can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 was found to be closer to real data than that from S2. Economic loss under S2 is roughly 1.5 times that under S1. The gap in the economic aggregate between S1 and S0 is reduced to 3% at the end of government-led reconstruction activity, a level that should take another four years to achieve under S2.
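The capital-stock accounting that distinguishes the three scenarios can be sketched with a simple recursion; the depreciation rate, shock size, and reconstruction shares below are illustrative assumptions, not the Wenchuan calibration.

```python
# Sketch of the capital-stock accounting behind the three scenarios
# (depreciation, shock, and reconstruction values are illustrative).

def capital_path(k0, depreciation, investment, shock_period=None,
                 shock=0.0, rebuild=0.0, years=8):
    """K[t+1] = (1 - depreciation) * K[t] + I[t], with an optional one-time
    destruction of capital and extra reconstruction investment afterwards."""
    k = [k0]
    for t in range(years):
        extra = rebuild if (shock_period is not None
                            and t > shock_period) else 0.0
        k_next = (1.0 - depreciation) * k[-1] + investment + extra
        if t == shock_period:
            k_next -= shock  # direct loss: capital destroyed this period
        k.append(k_next)
    return k

s0 = capital_path(100.0, 0.05, 6.0)                                   # no disaster
s1 = capital_path(100.0, 0.05, 6.0, shock_period=2, shock=20.0, rebuild=4.0)
s2 = capital_path(100.0, 0.05, 6.0, shock_period=2, shock=20.0, rebuild=0.0)
```

Comparing s1 and s2 against the s0 baseline mirrors the paper's loss definitions including and excluding reconstruction; in a full CGE model, the reconstruction investment would additionally crowd out other spending via the endogenous saving balance.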
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunn, B. D.; Diamond, S. C.; Bennett, G. A.
1977-10-01
A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand the Cal-ERDA set of computer programs in detail. The programs described include: an EXECUTIVE processor to create computer system control commands; a BDL processor to analyze input instructions, execute computer system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effects of ambient weather conditions; internal occupancy, lighting, and equipment within the building; and variations in the size, location, orientation, construction, walls, roofs, floors, fenestration, attachments (awnings, balconies), and shape of a building; a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components, including fans, coils, economizers, and humidifiers, in 16 standard configurations operated according to various temperature and humidity control schedules; a PLANT equipment program that models the operation of boilers, chillers, electrical generation equipment (diesel or turbines), heat storage apparatus (chilled or heated water), and solar heating and/or cooling systems; an ECONOMICS analysis program that calculates life-cycle costs; a REPORT program that produces tables of user-selected variables and arranges them according to user-specified formats; and a set of WEATHER ANALYSIS programs that manipulate, summarize and plot weather data. Libraries of weather data, schedule data, and building data were prepared.
Forest management and economics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buongiorno, J.; Gilless, J.K.
1987-01-01
This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
Energy-economic policy modeling
NASA Astrophysics Data System (ADS)
Sanstad, Alan H.
2018-01-01
Computational models based on economic principles and methods are powerful tools for understanding and analyzing problems in energy and the environment and for designing policies to address them. Among their other features, some current models of this type incorporate information on sustainable energy technologies and can be used to examine their potential role in addressing the problem of global climate change. The underlying principles and the characteristics of the models are summarized, and examples of this class of model and their applications are presented. Modeling epistemology and related issues are discussed, as well as critiques of the models. The paper concludes with remarks on the evolution of the models and possibilities for their continued development.
76 FR 33399 - Advisory Committee on International Economic Policy; Notice of Open Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... to the advent of cloud computing as a new business model in international trade, the implications of... of State for Economic, Energy, and Business Affairs Jose W. Fernandez and Committee Chair Ted... and Business Affairs, at (202) 647-2231 or [email protected] . This announcement might appear in the...
Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo
2014-04-01
This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.
A physical and economic model of the nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Schneider, Erich Alfred
A model of the nuclear fuel cycle that is suitable for use in strategic planning and economic forecasting is presented. The model, to be made available as a stand-alone software package, requires only a small set of fuel cycle and reactor specific input parameters. Critical design criteria include ease of use by nonspecialists, suppression of errors to within a range dictated by unit cost uncertainties, and limitation of runtime to under one minute on a typical desktop computer. Collision probability approximations to the neutron transport equation that lead to a computationally efficient decoupling of the spatial and energy variables are presented and implemented. The energy dependent flux, governed by coupled integral equations, is treated by multigroup or continuous thermalization methods. The model's output includes a comprehensive nuclear materials flowchart that begins with ore requirements, calculates the buildup of 24 actinides as well as fission products, and concludes with spent fuel or reprocessed material composition. The costs, direct and hidden, of the fuel cycle under study are also computed. In addition to direct disposal and plutonium recycling strategies in current use, the model addresses hypothetical cycles. These include cycles chosen for minor actinide burning and for their low weapons-usable content.
NASA Astrophysics Data System (ADS)
Wibowo, Wahyu; Wene, Chatrien; Budiantara, I. Nyoman; Permatasari, Erma Oktania
2017-03-01
Multiresponse semiparametric regression is a simultaneous-equation regression model that fuses parametric and nonparametric components. The regression model comprises several equations, each with two components, one parametric and one nonparametric. The model used here has a linear function as the parametric component and a polynomial truncated spline as the nonparametric component, so it can handle both linear and nonlinear relationships between the responses and the sets of predictor variables. The aim of this paper is to demonstrate the application of the regression model to modeling the effect of regional socio-economic conditions on the use of information technology. Specifically, the response variables are the percentage of households with internet access and the percentage of households with a personal computer, while the predictor variables are the literacy rate, the electrification rate, and the economic growth rate. Based on identification of the relationships between the response and predictor variables, economic growth is treated as the nonparametric predictor and the others as parametric predictors. The results show that multiresponse semiparametric regression applies well here, as indicated by the high coefficient of determination of 90 percent.
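A minimal single-response sketch of a semiparametric fit of this kind: linear parametric terms plus a truncated linear spline in the nonparametric predictor. The knot positions, coefficients, and synthetic data are invented; this is not the authors' estimator:

```python
import numpy as np

# Synthetic semiparametric fit: linear parametric terms plus a truncated
# linear spline basis max(0, x - knot) for the nonparametric predictor.
rng = np.random.default_rng(0)
n = 200
x_lin = rng.uniform(size=(n, 2))                  # parametric predictors
x_np = rng.uniform(0.0, 10.0, size=n)             # nonparametric predictor
y = (1.0 + 2.0 * x_lin[:, 0] - x_lin[:, 1]
     + np.maximum(0.0, x_np - 5.0)                # true nonlinear effect
     + rng.normal(0.0, 0.1, n))                   # noise

knots = [2.5, 5.0, 7.5]
basis = [np.maximum(0.0, x_np - k) for k in knots]  # truncated spline terms
X = np.column_stack([np.ones(n), x_lin, x_np] + basis)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # ordinary least squares
r2 = 1.0 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
```

Because the spline basis enters the design matrix linearly, the mixed model still reduces to a single least-squares problem.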
Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.
This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from industry compilations of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance, based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional-risk metrics with alternate-risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on the safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated-risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.
High Performance Computing for Modeling Wind Farms and Their Impact
NASA Astrophysics Data System (ADS)
Mavriplis, D.; Naughton, J. W.; Stoellinger, M. K.
2016-12-01
As energy generated by wind penetrates further into our electrical system, modeling of power production, power distribution, and the economic impact of wind-generated electricity is growing in importance. The models used for this work range in fidelity from simple codes that run on a single computer to those that require high performance computing capabilities. Over the past several years, high fidelity models have been developed and deployed on the NCAR-Wyoming Supercomputing Center's Yellowstone machine. One of the primary modeling efforts focuses on developing the capability to compute the behavior of a wind farm in complex terrain under realistic atmospheric conditions. Fully modeling this system requires simulating everything from continental-scale flows down to the flow over a wind turbine blade, including the blade boundary layer, spanning fully 10 orders of magnitude in scale. To accomplish this, the simulations are broken up by scale, with information from the larger scales being passed to the smaller-scale models. In the code being developed, four scale levels are included: the continental weather scale, the local atmospheric flow in complex terrain, the wind plant scale, and the turbine scale. The current state of the models in the latter three scales will be discussed. These simulations are based on a high-order accurate dynamic overset and adaptive mesh approach, which runs at large scale on the NWSC Yellowstone machine. A second effort on modeling the economic impact of new wind development, as well as improvements in wind plant performance and enhancements to the transmission infrastructure, will also be discussed.
Macroeconomic and household-level impacts of HIV/AIDS in Botswana.
Jefferis, Keith; Kinghorn, Anthony; Siphambe, Happy; Thurlow, James
2008-07-01
To measure the impact of HIV/AIDS on economic growth and poverty in Botswana and estimate how providing treatment can mitigate its effects. Demographic and financial projections were combined with economic simulation models, including a macroeconomic growth model and a macro-microeconomic computable general equilibrium and microsimulation model. HIV/AIDS significantly reduces economic growth and increases household poverty. The impact is now severe enough to be affecting the economy as a whole, and threatens to pull some of the uninfected population into poverty. Providing antiretroviral therapy can partly offset this negative effect. Treatment increases health's share of government expenditure only marginally, because it increases economic growth and because withholding treatment raises the cost of other health services. Botswana's treatment programme is appropriate from a macroeconomic perspective. Conducting macroeconomic impact assessments is important in countries where prevalence rates are particularly high.
Energy: Economic activity and energy demand; link to energy flow. Example: France
NASA Astrophysics Data System (ADS)
1980-10-01
The data derived from EXPLOR and EPOM, the Energy Flow Optimization Model, are described. The core of the EXPLOR model is a circular system of relations involving consumers' demand, producers' outputs, and market prices. The solution of this system of relations is obtained by successive iterations; the final output is a coherent system of economic accounts. The computer program for this transition is described. The work conducted by comparing different energy demand models is summarized. The procedure is illustrated by a numerical projection to 1980 and 1985 using the existing version of the EXPLOR France model.
An economic evaluation of solar radiation management.
Aaheim, Asbjørn; Romstad, Bård; Wei, Taoyuan; Kristjánsson, Jón Egill; Muri, Helene; Niemeier, Ulrike; Schmidt, Hauke
2015-11-01
Economic evaluations of solar radiation management (SRM) usually assume that the temperature will be stabilized, with no economic impacts of climate change but with possible side-effects. We know from experiments with climate models, however, that unlike emission control, the spatial and temporal distributions of temperature, precipitation and wind conditions will change. Hence, SRM may have economic consequences under a stabilization of global mean temperature even if side-effects other than those related to the climatic responses are disregarded. This paper addresses the economic impacts of implementing two SRM technologies: stratospheric sulfur injection and marine cloud brightening. Using a computable general equilibrium model, we estimate the economic impacts of climatic responses based on the results from two earth system models, MPI-ESM and NorESM. We find that under a moderately increasing greenhouse-gas concentration path, RCP4.5, the economic benefits of implementing climate engineering are small, and may become negative. Global GDP increases in three of the four experiments, and all experiments include regions where the benefits from climate engineering are negative.
NASA Astrophysics Data System (ADS)
Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.
2017-12-01
Recent studies indicate that modeling at finer spatial resolutions significantly improves the representation of urban land-use dynamics. Geo-computational models such as cellular automata and agent-based models have provided clear evidence for quantifying urban growth patterns within an urban boundary. Recent studies consider socio-economic factors such as demography, education rate, household density, current-year parcel price, and distance to roads, schools, hospitals, commercial centers and police stations to be the major factors influencing a city's Land Use Land Cover (LULC) pattern. These factors have a unidirectional relationship to land-use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with an agent-based model to evaluate the impact of socio-economic factors on land-use pattern. Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from field surveys, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3×3 simulation window is used to capture neighborhood effects on LULC. The cellular automata model results are examined to identify hot-spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation of each factor with LULC and to classify the available area into low-density, medium-density, or high-density residential or commercial use. In the modeling phase, transition rules, neighborhood effects, and cell-change factors are used to improve the representation of built-up classes. Accuracy for the built-up classes improves from 84% to 89%; after incorporating the agent-based model with the cellular automata model, accuracy improves further, from 89% to 94%, across three urban classes (low density, medium density, and commercial).
A sensitivity study of the model indicates that the southern and south-western parts of the city show growth, with small patches of growth also observed in the north-western part. The study highlights the growing importance of socio-economic factors and geo-computational modeling approaches in the changing LULC of the newly growing cities of modern India.
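One cellular-automata transition step of the kind described can be sketched with a 3×3 neighborhood rule, using a socio-economic suitability score to stand in for the agent-based factors. The grid, thresholds, and scores below are invented:

```python
# Illustrative CA step (invented parameters): a cell becomes built-up when
# enough of its 3x3 neighborhood is built-up AND a socio-economic suitability
# score (standing in for the agent-based factors) is high enough.
def ca_step(grid, suitability, neighbor_threshold=3, suit_threshold=0.5):
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]          # synchronous update
    for i in range(rows):
        for j in range(cols):
            if grid[i][j]:
                continue                    # already built-up
            built = sum(grid[a][b]
                        for a in range(max(0, i - 1), min(rows, i + 2))
                        for b in range(max(0, j - 1), min(cols, j + 2))
                        if (a, b) != (i, j))
            if built >= neighbor_threshold and suitability[i][j] >= suit_threshold:
                new[i][j] = 1
    return new

grid = [[1, 1, 0],
        [1, 0, 0],
        [0, 0, 0]]
suit = [[0.9, 0.9, 0.2],
        [0.9, 0.8, 0.2],
        [0.1, 0.1, 0.1]]
after = ca_step(grid, suit)   # only the suitable, well-connected center converts
```

Coupling the suitability surface to regression-derived socio-economic scores is what links the CA to the agent-based component.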
Brown, Zachary S.; Dickinson, Katherine L.; Kramer, Randall A.
2014-01-01
The evolutionary dynamics of insecticide resistance in harmful arthropods has economic implications, not only for the control of agricultural pests (as has been well studied), but also for the control of disease vectors, such as malaria-transmitting Anopheles mosquitoes. Previous economic work on insecticide resistance illustrates the policy relevance of knowing whether insecticide resistance mutations involve fitness costs. Using a theoretical model, this article investigates economically optimal strategies for controlling malaria-transmitting mosquitoes when there is the potential for mosquitoes to evolve resistance to insecticides. Consistent with previous literature, we find that fitness costs are a key element in the computation of economically optimal resistance management strategies. Additionally, our models indicate that different biological mechanisms underlying these fitness costs (e.g., increased adult mortality and/or decreased fecundity) can significantly alter economically optimal resistance management strategies. PMID:23448053
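The role of fitness costs can be illustrated with a standard haploid selection recurrence (not the authors' model); the fitness values and generation counts here are invented:

```python
# Toy haploid selection recurrence: resistance allele frequency p rises under
# insecticide pressure, then falls when spraying stops if the allele carries
# a fitness cost. All fitness values are invented.
def next_freq(p, w_resistant, w_susceptible):
    """One generation of haploid selection."""
    mean_w = p * w_resistant + (1 - p) * w_susceptible
    return p * w_resistant / mean_w

p = 0.01
for _ in range(30):                        # spraying: resistant types favored
    p = next_freq(p, w_resistant=1.0, w_susceptible=0.6)
p_after_spray = p
for _ in range(30):                        # spraying halted: fitness cost of 0.1
    p = next_freq(p, w_resistant=0.9, w_susceptible=1.0)
p_after_pause = p
```

With a nonzero fitness cost, susceptibility recovers during spray-free periods, which is why the cost (and its biological mechanism) changes the economically optimal rotation strategy.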
Computationally intensive econometrics using a distributed matrix-programming language.
Doornik, Jurgen A; Hendry, David F; Shephard, Neil
2002-06-15
This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.
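Level (i), parallelism explicit in user code, can be sketched in Python rather than Ox: a Monte Carlo mean is split into chunks, with one fixed seed per chunk so the result is deterministic regardless of scheduling, as the paper stresses. The pool type and chunk sizes are illustrative choices:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Parallelism explicit in user code: the caller decides the chunking and the
# per-chunk seeds; fixed seeds make the parallel result fully reproducible.
def chunk_mean(seed, n):
    rng = random.Random(seed)              # per-chunk generator => deterministic
    return sum(rng.random() for _ in range(n)) / n

def parallel_mean(n_chunks=4, n=10_000):
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(chunk_mean, seed, n) for seed in range(n_chunks)]
        return sum(f.result() for f in futures) / n_chunks
```

A process pool or a cluster scheduler could replace the thread pool without changing the per-chunk seeding, which is what keeps the computation deterministic.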
Manual of phosphoric acid fuel cell power plant cost model and computer program
NASA Technical Reports Server (NTRS)
Lu, C. Y.; Alkasab, K. A.
1984-01-01
The cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
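Levelized annual cost calculations of the kind the manual refers to are conventionally built from a capital recovery factor that annuitizes the capital cost. The sketch below uses invented cost figures and is not taken from the manual itself:

```python
# Levelized annual cost sketch; the cost, rate, and lifetime are invented.
def capital_recovery_factor(rate, years):
    """Annuity factor that spreads a present capital cost over `years`
    at discount rate `rate`."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def levelized_annual_cost(capital_cost, rate, years, annual_om):
    """Annualized capital charge plus yearly operation & maintenance."""
    return capital_cost * capital_recovery_factor(rate, years) + annual_om

cost = levelized_annual_cost(capital_cost=1_000_000, rate=0.08,
                             years=20, annual_om=25_000)
```

At 8% over 20 years the factor is roughly 0.102, i.e. about a tenth of the capital cost is charged each year.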
Participatory modeling of recreation and tourism
Lisa C. Chase; Roelof M.J. Boumans; Stephanie Morse
2007-01-01
Communities involved in recreation and tourism planning need to understand the broad range of benefits and challenges--economic, social, and ecological--in order to make informed decisions. Participatory computer modeling is a methodology that involves a community in the process of collectively building a model about a particular situation that affects participants...
Fernández-Carrión, E; Ivorra, B; Martínez-López, B; Ramos, A M; Sánchez-Vizcaíno, J M
2016-04-01
Be-FAST is a computer program based on a time-spatial stochastic spread mathematical model for studying the transmission of infectious livestock diseases within and between farms. The present work describes a new module integrated into Be-FAST to model the economic consequences of the spread of classical swine fever (CSF) and other infectious livestock diseases within and between farms. CSF is financially one of the most damaging diseases in the swine industry worldwide. In Spain specifically, the economic costs of the last two CSF epidemics (1997 and 2001) jointly exceeded 108 million euros. The present analysis suggests that severe CSF epidemics are associated with significant economic costs, approximately 80% of which are related to animal culling. Direct costs associated with control measures are strongly associated with the number of infected farms, while indirect costs are more strongly associated with epidemic duration. The economic model has been validated against economic information from the last outbreaks in Spain. These results suggest that our economic module may be useful for analysing and predicting the economic consequences of livestock disease epidemics.
Effects of economic interactions on credit risk
NASA Astrophysics Data System (ADS)
Hatchett, J. P. L.; Kühn, R.
2006-03-01
We study a credit-risk model which captures effects of economic interactions on a firm's default probability. Economic interactions are represented as a functionally defined graph, and the existence of both cooperative and competitive business relations is taken into account. We provide an analytic solution of the model in a limit where the number of business relations of each company is large, but the overall fraction of the economy with which a given company interacts may be small. While the effects of economic interactions are relatively weak in typical (most probable) scenarios, they are pronounced in situations of economic stress, and thus lead to a substantial fattening of the tails of loss distributions in large loan portfolios. This manifests itself in a pronounced enhancement of the value at risk computed for interacting economies in comparison with their non-interacting counterparts.
NASA Astrophysics Data System (ADS)
Li, Ying; Luo, Zhiling; Yin, Jianwei; Xu, Lida; Yin, Yuyu; Wu, Zhaohui
2017-01-01
A modern service company (MSC) is an enterprise in specialized domains, such as the financial industry, the information service industry and the technology development industry, that depends heavily on information technology. Modelling of such enterprises has attracted much research attention because it promises to help enterprise managers analyse basic business strategies (e.g. the pricing strategy) and even optimise the business process (BP) to gain benefits. While the existing models proposed by economists cover the economic elements, they fail to address the basic BP and its relationship with the economic characteristics. Those proposed in computer science, despite great success in BP modelling, perform poorly in supporting economic analysis. The existing approaches therefore fail to satisfy the requirements of enterprise modelling for MSCs, which demand simultaneous consideration of both economic analysis and business processing. In this article, we provide a unified enterprise modelling approach named Enterprise Pattern (EP) which bridges the gap between the BP model and the enterprise economic model of an MSC. Proposing a language named Enterprise Pattern Description Language (EPDL) covering all the basic language elements of EP, we formulate the language syntax and two basic extraction rules assisting economic analysis. Furthermore, we extend Business Process Model and Notation (BPMN) to support EPDL, named BPMN for Enterprise Pattern (BPMN4EP). The example of a mobile application platform is studied in detail for a better understanding of EPDL.
An expanded system simulation model for solar energy storage (technical report), volume 1
NASA Technical Reports Server (NTRS)
Warren, A. W.
1979-01-01
The simulation model for wind energy storage (SIMWEST) program now includes wind and/or photovoltaic systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) and is available for the UNIVAC 1100 series and the CDC 6000 series computers. The level of detail is consistent with a role of evaluating the economic feasibility as well as the general performance of wind and/or photovoltaic energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind and/or photovoltaic source/storage/application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables.
A simulation model for wind energy storage systems. Volume 2: Operation manual
NASA Technical Reports Server (NTRS)
Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.
1977-01-01
A comprehensive computer program (SIMWEST) developed for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel, and pneumatic) is described. Features of the program include: a precompiler which generates computer models (in FORTRAN) of complex wind source/storage/application systems from user specifications, using the respective library components; a program which provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables; and the capability to evaluate the economic feasibility as well as the general performance of wind energy systems. The SIMWEST operation manual is presented, and the usage of the SIMWEST program and the design of the library components are described. A number of example simulations intended to familiarize the user with the program's operation are given, along with a listing of each SIMWEST library subroutine.
Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne
2015-01-01
Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers differs from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations that encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure safely and productively.
Operating experience with LEAP from the perspective of the computing applications analyst
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, W.E. III; Horwedel, J.E.; McAdoo, J.W.
1981-05-01
The Long-Term Energy Analysis Program (LEAP), which was used for the energy price-quantity projections in the 1978 Annual Report to Congress (ARC '78) and in an ORNL research program to develop and demonstrate a procedure for evaluating energy-economic modeling computer codes and the important results derived therefrom, is discussed. The LEAP system used in the ORNL research, the mechanics of executing LEAP, and the personnel skills required to execute the system are described. In addition, a LEAP sample problem, subroutine hierarchical flowcharts, and input tables for the ARC '78 energy-economic model are included. Results of a study to test the capability of the LEAP system used in the ORNL research to reproduce the ARC '78 results credited to LEAP are presented.
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
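A minimal discrete event simulation of the kind described can be sketched with a single request queue feeding a pool of servers and exponentially distributed demand; the rates and server counts below are invented:

```python
import heapq
import random

# Minimal discrete-event sketch (not the authors' model): requests arrive at
# a shared server pool and wait when all servers are busy. A heap tracks when
# each server next becomes free.
def simulate(n_requests, n_servers, arrival_gap, service_time, seed=1):
    rng = random.Random(seed)
    free_at = [0.0] * n_servers            # next-free times per server
    heapq.heapify(free_at)
    t, waits = 0.0, []
    for _ in range(n_requests):
        t += rng.expovariate(1.0 / arrival_gap)   # next arrival time
        earliest = heapq.heappop(free_at)
        start = max(t, earliest)                  # wait if all servers busy
        waits.append(start - t)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / service_time))
    return sum(waits) / len(waits)                # mean wait

# Provisioning more (virtual) servers shortens the wait for the same demand:
wait_small = simulate(5000, n_servers=2, arrival_gap=1.0, service_time=1.8)
wait_large = simulate(5000, n_servers=4, arrival_gap=1.0, service_time=1.8)
```

Sweeping the demand distribution and server count in such a model is one way to trade provisioning cost against predictable performance.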
Understanding the Hows and Whys of Decision-Making: From Expected Utility to Divisive Normalization.
Glimcher, Paul
2014-01-01
Over the course of the last century, economists and ethologists have built detailed models from first principles of how humans and animals should make decisions. Over the course of the last few decades, psychologists and behavioral economists have gathered a wealth of data at variance with the predictions of these economic models. This has led to the development of highly descriptive models that can often predict what choices people or animals will make but without offering any insight into why people make the choices that they do--especially when those choices reduce a decision-maker's well-being. Over the course of the last two decades, neurobiologists working with economists and psychologists have begun to use our growing understanding of how the nervous system works to develop new models of how the nervous system makes decisions. The result, a growing revolution at the interdisciplinary border of neuroscience, psychology, and economics, is a new field called Neuroeconomics. Emerging neuroeconomic models stand to revolutionize our understanding of human and animal choice behavior by combining fundamental properties of neurobiological representation with decision-theoretic analyses. In this overview, one class of these models, based on the widely observed neural computation known as divisive normalization, is presented in detail. The work demonstrates not only that a discrete class of computation widely observed in the nervous system is fundamentally ubiquitous, but how that computation shapes behaviors ranging from visual perception to financial decision-making. It also offers the hope of reconciling economic analysis of what choices we should make with psychological observations of the choices we actually do make.
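The divisive normalization computation mentioned above has a simple canonical form: each option's value is divided by a constant plus the (weighted) sum of the values of all options in the choice set. The parameter values below are illustrative, not taken from the paper:

```python
# Canonical divisive normalization: response_i = v_i / (sigma + w * sum(v)).
# sigma and w are illustrative parameters.
def divisive_normalization(values, sigma=1.0, weight=1.0):
    total = weight * sum(values)
    return [v / (sigma + total) for v in values]

firing = divisive_normalization([4.0, 2.0, 1.0])
```

The rank order of the options is preserved, but every response is compressed as the total value of the choice set grows, which is what links the computation to context-dependent choice behavior.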
Francisco Rodríguez y Silva; Juan Ramón Molina Martínez; Miguel Ángel Herrera Machuca; Jesús Mª Rodríguez Leal
2013-01-01
Progress made in recent years in fire science, particularly as applied to forest fire protection, coupled with the increased power offered by mathematical processors integrated into computers, has led to important developments in the field of dynamic and static simulation of forest fires. Furthermore, and similarly, econometric models applied to economic...
A decision support model for investment on P2P lending platform.
Zeng, Xiangxiang; Liu, Li; Leung, Stephen; Du, Jiangze; Wang, Xun; Li, Tao
2017-01-01
Peer-to-peer (P2P) lending, a novel economic lending model, has introduced new challenges in making effective investment decisions. In a P2P lending platform, one lender can invest in N loans and a loan may be accepted by M investors, thus forming a bipartite graph. Based on the bipartite graph model, we built an iterative computation model to evaluate unknown loans. To validate the proposed model, we performed extensive experiments on real-world data from the largest American P2P lending marketplace, Prosper. By comparing our experimental results with those obtained by Bayes and logistic regression classifiers, we show that our computation model can help borrowers select good loans and help lenders make good investment decisions. Experimental results also show that the logistic classification model is a good complement to our iterative computation model, which motivated us to integrate the two models. The experimental results of the hybrid classification model demonstrate that the logistic classification model and our iterative computation model are complementary to each other. We conclude that the hybrid model (i.e., the integration of the iterative computation model and the logistic classification model) is more efficient and stable than either individual model alone. PMID:28877234
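The mutual-reinforcement idea behind iterating on a lender-loan bipartite graph can be sketched with a HITS-style update: a loan's score comes from the lenders funding it, and a lender's score from the loans they fund. The update rule below is an illustrative stand-in, not the authors' exact iteration model.

```python
def score_bipartite(edges, n_iter=50):
    """HITS-style mutual reinforcement on a lender-loan bipartite graph.

    `edges` is a list of (lender, loan) pairs. A loan is scored by the
    quality of the lenders funding it, and vice versa. Illustrative
    stand-in for the paper's iteration model.
    """
    lenders = {u for u, _ in edges}
    loans = {v for _, v in edges}
    ls = {u: 1.0 for u in lenders}
    qs = {v: 1.0 for v in loans}
    for _ in range(n_iter):
        qs = {v: sum(ls[u] for u, w in edges if w == v) for v in loans}
        ls = {u: sum(qs[v] for w, v in edges if w == u) for u in lenders}
        # L2-normalize each side so the scores converge instead of growing.
        qn = sum(x * x for x in qs.values()) ** 0.5
        ln = sum(x * x for x in ls.values()) ** 0.5
        qs = {v: x / qn for v, x in qs.items()}
        ls = {u: x / ln for u, x in ls.items()}
    return ls, qs

edges = [("a", "loan1"), ("a", "loan2"), ("b", "loan1"), ("c", "loan3")]
lender_scores, loan_scores = score_bipartite(edges)
# loan1, funded by two lenders, outranks the single-lender loans
```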
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsen, R.J.; Westley, G.W.; Herzog, H.W. Jr.
This report documents the development of MULTIREGION, a computer model of regional and interregional socio-economic development. The MULTIREGION model interprets the economy of each BEA economic area as a labor market, measures all activity in terms of people as members of the population (labor supply) or as employees (labor demand), and simultaneously simulates or forecasts the demands and supplies of labor in all BEA economic areas at five-year intervals. In general the outputs of MULTIREGION are intended to resemble those of the Water Resource Council's OBERS projections and to be put to similar planning and analysis purposes. This report has been written at two levels to serve the needs of multiple audiences. The body of the report serves as a fairly nontechnical overview of the entire MULTIREGION project; a series of technical appendixes provide detailed descriptions of the background empirical studies of births, deaths, migration, labor force participation, natural resource employment, manufacturing employment location, and local service employment used to construct the model.
Computer-aided engineering of semiconductor integrated circuits
NASA Astrophysics Data System (ADS)
Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.
1980-07-01
Economical procurement of small quantities of high performance custom integrated circuits for military systems is impeded by inadequate process, device and circuit models that handicap low cost computer aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, (4) device simulation and analytic measurements. This report discusses the fourth year of the program.
Huntington II Simulation Program - MALAR. Student Workbook, Teacher's Guide, and Resource Handbook.
ERIC Educational Resources Information Center
Friedland, James; Frishman, Austin
Described is the computer model "MALAR" which deals with malaria and its eradication. A computer program allows the tenth- to twelfth-grade student to attempt to control a malaria epidemic. This simulation provides a context within which to study the biological, economic, social, political, and ecological aspects of a classic world health problem.…
Simulating Quantile Models with Applications to Economics and Management
NASA Astrophysics Data System (ADS)
Machado, José A. F.
2010-05-01
The massive increase in the speed of computers over the past forty years has changed the way that social scientists, applied economists and statisticians approach their trades, and also the very nature of the problems that they can feasibly tackle. The new methods that make intensive use of computer power go by the names "computer-intensive" or "simulation" methods. My lecture will start with a bird's eye view of the uses of simulation in Economics and Statistics. I will then turn to my own research on uses of computer-intensive methods. From a methodological point of view, the question I address is how to infer marginal distributions having estimated a conditional quantile process ("Counterfactual Decomposition of Changes in Wage Distributions Using Quantile Regression," Journal of Applied Econometrics 20, 2005). Illustrations will be provided of the use of the method to perform counterfactual analysis in several different areas of knowledge.
Dairy cow culling strategies: making economical culling decisions.
Lehenbauer, T W; Oltjen, J W
1998-01-01
The purpose of this report was to examine important economic elements of culling decisions, to review progress in development of culling decision support systems, and to discern some of the potentially rewarding areas for future research on culling models. Culling decisions have an important influence on the economic performance of the dairy but are often made in a nonprogrammed fashion and based partly on the intuition of the decision maker. The computer technology that is available for dairy herd management has made feasible the use of economic models to support culling decisions. Financial components--including profit, cash flow, and risk--are major economic factors affecting culling decisions. Culling strategies are further influenced by short-term fluctuations in cow numbers as well as by planned herd expansion. Changes in herd size affect the opportunity cost for postponed replacement and may alter the relevance of optimization strategies that assume a fixed herd size. Improvements in model components related to biological factors affecting future cow performance, including milk production, reproductive status, and mastitis, appear to offer the greatest economic potential for enhancing culling decision support systems. The ultimate value of any culling decision support system for developing economic culling strategies will be determined by its results under field conditions.
Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.
Dasbach, Erik J; Elbasha, Elamin H
2017-07-01
Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS USING THE BEST MODEL
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with a public domain computer software package, PHREEQCI. BEST is intended to be used in the design process of sulfate-reducing bacteria (SRB)field bioreactors to pas...
Some Automated Cartography Developments at the Defense Mapping Agency.
1981-01-01
on a pantographic router creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA’s...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these
Optimization Scheduling Model for Wind-thermal Power System Considering the Dynamic penalty factor
NASA Astrophysics Data System (ADS)
PENG, Siyu; LUO, Jianchun; WANG, Yunyu; YANG, Jun; RAN, Hong; PENG, Xiaodong; HUANG, Ming; LIU, Wanyu
2018-03-01
In this paper, a new dynamic economic dispatch model for power systems is presented. The objective function of the proposed model introduces a major novelty to dynamic economic dispatch with wind farms: a "dynamic penalty factor." This factor is computed using fuzzy logic, considering both the variable nature of active wind power and power demand, and it changes the wind curtailment cost according to the state of the power system. Case studies were carried out on the IEEE 30-bus system. Results show that the proposed optimization model can mitigate wind curtailment and reduce the total cost effectively, demonstrating the validity and effectiveness of the proposed model.
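As a rough illustration of how a penalty factor might be made dynamic via fuzzy logic, the sketch below combines two triangular fuzzy memberships with a fuzzy AND: when wind availability is high and load is low, curtailment is likelier, so its cost is penalized more heavily. The membership breakpoints and the factor range are assumptions for illustration, not the paper's rule base.

```python
def dynamic_penalty_factor(wind_ratio, load_level, k_min=0.5, k_max=2.0):
    """Illustrative fuzzy-logic dynamic penalty factor for wind curtailment.

    wind_ratio: available wind power / total demand, in [0, 1]
    load_level: current demand / peak demand, in [0, 1]
    Breakpoints and the [k_min, k_max] range are illustrative assumptions.
    """
    # Triangular membership of "wind is high": ramps from 0 at 0.2 to 1 at 0.8.
    high_wind = max(0.0, min(1.0, (wind_ratio - 0.2) / 0.6))
    # Triangular membership of "load is low": ramps from 1 at 0.2 to 0 at 0.8.
    low_load = max(0.0, min(1.0, (0.8 - load_level) / 0.6))
    degree = min(high_wind, low_load)  # fuzzy AND (minimum t-norm)
    return k_min + (k_max - k_min) * degree
```

High wind with light load yields the maximum factor (2.0); low wind with heavy load yields the minimum (0.5).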
When water saving limits recycling: Modelling economy-wide linkages of wastewater use.
Luckmann, Jonas; Grethe, Harald; McDonald, Scott
2016-01-01
The reclamation of wastewater is an increasingly important water source in parts of the world. It is claimed that wastewater recycling is a cheap and reliable form of water supply, which preserves water resources and is economically efficient. However, the quantity of reclaimed wastewater depends on water consumption by economic agents connected to a sewage system. This study uses a Computable General Equilibrium (CGE) model to analyse such a cascading water system. A case study of Israel shows that failing to include this linkage can lead to an overestimation of the potential of wastewater recycling, especially when economic agents engage in water saving. Copyright © 2015 Elsevier Ltd. All rights reserved.
Chris B. LeDoux; Gary W. Miller
2008-01-01
In this study we used data from 16 Appalachian hardwood stands, a growth and yield computer simulation model, and stump-to-mill logging cost-estimating software to evaluate the optimal economic timing of crop tree release (CTR) treatments. The simulated CTR treatments consisted of one-time logging operations at stand age 11, 23, 31, or 36 years, with the residual...
NASA Technical Reports Server (NTRS)
1972-01-01
An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.
Transportation Planning for Your Community
DOT National Transportation Integrated Search
2000-12-01
The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...
Economic Assessment of Correlated Energy-Water Impacts using Computable General Equilibrium Modeling
NASA Astrophysics Data System (ADS)
Qiu, F.; Andrew, S.; Wang, J.; Yan, E.; Zhou, Z.; Veselka, T.
2016-12-01
Many studies on energy and water are rightfully interested in the interaction of water and energy, and their projected dependence into the future. Water is indeed an essential input to the power sector currently, and energy is required to pump water for end use in either household consumption or in industrial uses. However, each presented study either qualitatively discusses the issues, particularly how better understanding the interconnectedness of the system is paramount to better policy recommendations, or considers a partial equilibrium framework in which water use and energy use changes are considered explicitly without regard to other repercussions throughout the regional, national, and international economic landscapes. While many studies are beginning to ask the right questions, the lack of numerical rigor raises concerns about the conclusions drawn. Most use life cycle analysis as a method for providing numerical results, though this lacks the flexibility that economics can provide. In this study, we perform economic analysis using computable general equilibrium models with energy-water interdependencies captured as an important factor. We attempt to answer important and interesting questions in these studies: how can we characterize the economic choice of energy technology adoption and its implications for water use in the domestic economy? Moreover, given predictions of reduced rainfall in the near future, how does this impact the water supply in the midst of this energy-water trade-off?
NASA Technical Reports Server (NTRS)
1974-01-01
This report presents the derivation, description, and operating instructions for a computer program (TEKVAL) which measures the economic value of advanced technology features applied to long range commercial passenger aircraft. The program consists of three modules: an airplane sizing routine, a direct operating cost routine, and an airline return-on-investment routine. These modules are linked such that they may be operated sequentially or individually, with one routine generating the input for the next or with the option of externally specifying the input for either of the economic routines. A very simple airplane sizing technique was previously developed, based on the Breguet range equation. For this program, that sizing technique has been greatly expanded and combined with the formerly separate DOC and ROI programs to produce TEKVAL.
Lokkerbol, Joran; Adema, Dirk; Cuijpers, Pim; Reynolds, Charles F; Schulz, Richard; Weehuizen, Rifka; Smit, Filip
2014-03-01
Depressive disorders are significant causes of disease burden and are associated with substantial economic costs. It is therefore important to design a healthcare system that can effectively manage depression at sustainable costs. This article computes the benefit-to-cost ratio of the current Dutch healthcare system for depression, and investigates whether offering more online preventive interventions improves the cost-effectiveness overall. A health economic (Markov) model was used to synthesize clinical and economic evidence and to compute population-level costs and effects of interventions. The model compared a base case scenario without preventive telemedicine and alternative scenarios with preventive telemedicine. The central outcome was the benefit-to-cost ratio, also known as return-on-investment (ROI). In terms of ROI, a healthcare system with preventive telemedicine for depressive disorders offers better value for money than a healthcare system without Internet-based prevention. Overall, the ROI increases from €1.45 ($1.72) in the base case scenario to €1.76 ($2.09) in the alternative scenario in which preventive telemedicine is offered. In a scenario in which the costs of offering preventive telemedicine are balanced by reducing the expenditures for curative interventions, ROI increases to €1.77 ($2.10), while keeping the healthcare budget constant. For a healthcare system for depressive disorders to remain economically sustainable, its cost-benefit ratio needs to be improved. Offering preventive telemedicine at a large scale is likely to introduce such an improvement. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
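The population-level ROI estimate described here rests on a Markov cohort model: a cohort distribution is stepped through health states by a transition matrix while per-cycle costs and monetized benefits accumulate. A toy sketch of the mechanics follows; the states, transition probabilities, and per-cycle values are all illustrative placeholders, not the study's calibrated inputs.

```python
import numpy as np

def markov_cohort(P, costs, benefits, horizon=10):
    """Tiny Markov cohort model of the kind used for ROI estimates.

    P: row-stochastic transition matrix between health states
    costs/benefits: per-cycle cost and monetized benefit per state
    Returns the benefit-to-cost ratio (return on investment).
    """
    dist = np.array([1.0, 0.0, 0.0])  # entire cohort starts healthy
    total_cost = total_benefit = 0.0
    for _ in range(horizon):
        total_cost += dist @ costs
        total_benefit += dist @ benefits
        dist = dist @ P  # advance the cohort one cycle
    return total_benefit / total_cost

# Illustrative states: healthy, depressed, recovered
P = np.array([[0.90, 0.10, 0.00],
              [0.00, 0.60, 0.40],
              [0.10, 0.05, 0.85]])
costs = np.array([100.0, 2000.0, 300.0])
benefits = np.array([400.0, 500.0, 600.0])
roi = markov_cohort(P, costs, benefits)
```

Comparing `roi` across scenarios that differ only in the transition matrix (e.g., with and without a preventive intervention) mirrors the base-case-versus-alternative comparison in the abstract.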
Performability evaluation of the SIFT computer
NASA Technical Reports Server (NTRS)
Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.
1979-01-01
Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function, which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.
ERIC Educational Resources Information Center
Schenk, Robert E.
Intended for use with college students in introductory macroeconomics or American economic history courses, these two computer simulations of two basic macroeconomic models--a simple Keynesian-type model and a quantity-theory-of-money model--present largely incompatible explanations of the Great Depression. Written in Basic, the simulations are…
ERIC Educational Resources Information Center
McKeever, Barbara
An award-winning fourth-grade unit combines computer and economics education by examining the impact of computer usage on various segments of the economy. Students spent one semester becoming familiar with a classroom computer and gaining a general understanding of basic economic concepts through class discussion, field trips, and bulletin boards.…
Version 3.0 of EMINERS - Economic Mineral Resource Simulator
Duval, Joseph S.
2012-01-01
Quantitative mineral resource assessment, as developed by the U.S. Geological Survey (USGS), consists of three parts: (1) development of grade and tonnage mineral deposit models; (2) delineation of tracts permissive for each deposit type; and (3) probabilistic estimation of the numbers of undiscovered deposits for each deposit type. The estimate of the number of undiscovered deposits at different levels of probability is the input to the EMINERS (Economic Mineral Resource Simulator) program. EMINERS uses a Monte Carlo statistical process to combine probabilistic estimates of undiscovered mineral deposits with models of mineral deposit grade and tonnage to estimate mineral resources. Version 3.0 of the EMINERS program is available as USGS Open-File Report 2004-1344. Changes from version 2.0 include updating 87 grade and tonnage models, designing new templates to produce graphs showing cumulative distribution and summary tables, and disabling economic filters. The economic filters were disabled because embedded data for costs of labor and materials, mining techniques, and beneficiation methods are out of date. However, the cost algorithms used in the disabled economic filters are still in the program and available for reference for mining methods and milling techniques. The release notes included with this report give more details on changes in EMINERS over the years. EMINERS is written in C++ and depends upon the Microsoft Visual C++ 6.0 programming environment. The code depends heavily on the use of Microsoft Foundation Classes (MFC) for implementation of the Windows interface. The program works only on Microsoft Windows XP or newer personal computers. It does not work on Macintosh computers. For help in using the program in this report, see the "Quick-Start Guide for Version 3.0 of EMINERS-Economic Mineral Resource Simulator" (W.J. Bawiec and G.T. Spanski, 2012, USGS Open-File Report 2009-1057), which demonstrates how to execute the EMINERS software using default settings and existing deposit models.
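The Monte Carlo combination EMINERS performs can be illustrated in a few lines: for each trial, draw a number of undiscovered deposits from a probabilistic estimate, then draw a tonnage and grade for each deposit from lognormal grade-tonnage models and sum the contained metal. The deposit-count probabilities and distribution parameters below are made-up placeholders, not USGS model values.

```python
import random

def simulate_contained_metal(n_trials=10000, seed=42):
    """Monte Carlo combination of an undiscovered-deposit estimate with a
    grade-tonnage model, in the spirit of EMINERS. All numeric inputs are
    illustrative placeholders, not USGS values."""
    rng = random.Random(seed)
    # P(number of undiscovered deposits): e.g. 50% chance of 1, 30% of 2, 20% of 5
    counts, probs = [1, 2, 5], [0.5, 0.3, 0.2]
    totals = []
    for _ in range(n_trials):
        n = rng.choices(counts, probs)[0]
        metal = 0.0
        for _ in range(n):
            tonnage = rng.lognormvariate(16.0, 1.5)  # tonnes of ore
            grade = rng.lognormvariate(-5.0, 0.7)    # metal fraction of ore
            metal += tonnage * grade
        totals.append(metal)
    totals.sort()
    # Report selected quantiles of total contained metal across trials.
    return {p: totals[int(p * n_trials)] for p in (0.1, 0.5, 0.9)}
```

The resulting quantiles play the role of EMINERS's cumulative-distribution output: a 90% chance the resource exceeds the 0.1 quantile, and so on.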
Analysing child mortality in Nigeria with geoadditive discrete-time survival models.
Adebayo, Samson B; Fahrmeir, Ludwig
2005-03-15
Child mortality reflects a country's level of socio-economic development and quality of life. In developing countries, mortality rates are not only influenced by socio-economic, demographic and health variables but they also vary considerably across regions and districts. In this paper, we analysed child mortality in Nigeria with flexible geoadditive discrete-time survival models. This class of models allows us to measure small-area district-specific spatial effects simultaneously with possibly non-linear or time-varying effects of other factors. Inference is fully Bayesian and uses computationally efficient Markov chain Monte Carlo (MCMC) simulation techniques. The application is based on the 1999 Nigeria Demographic and Health Survey. Our method assesses effects at a high level of temporal and spatial resolution not available with traditional parametric models, and the results provide some evidence on how to reduce child mortality by improving socio-economic and public health conditions. Copyright (c) 2004 John Wiley & Sons, Ltd.
Simulation of Local Blood Flow in Human Brain under Altered Gravity
NASA Technical Reports Server (NTRS)
Kim, Chang Sung; Kiris, Cetin; Kwak, Dochan
2003-01-01
In addition to the altered gravitational forces, specific shapes and connections of arteries in the brain vary in the human population (Cebral et al., 2000; Ferrandez et al., 2002). Considering the geometric variations, pulsatile unsteadiness, and moving walls, computational approach in analyzing altered blood circulation will offer an economical alternative to experiments. This paper presents a computational approach for modeling the local blood flow through the human brain under altered gravity. This computational approach has been verified through steady and unsteady experimental measurements and then applied to the unsteady blood flows through a carotid bifurcation model and an idealized Circle of Willis (COW) configuration under altered gravity conditions.
NASA Astrophysics Data System (ADS)
Yu, Long-Bao; Zhang, Wen-Hai; Ye, Liu
2007-09-01
We propose a simple scheme to realize 1→M economical phase-covariant quantum cloning machine (EPQCM) with superconducting quantum interference device (SQUID) qubits. In our scheme, multi-SQUIDs are fixed into a microwave cavity by adiabatic passage for their manipulation. Based on this model, we can realize the EPQCM with high fidelity via adiabatic quantum computation.
ERIC Educational Resources Information Center
Holyoak, Arlene, Ed.
These proceedings consist of 16 papers, some of which are followed by discussants' comments. They include: "Growing Older in a Rural Retirement Community" (Brokaw, Peters, Tripple; discussants Olson, Tucker); "An Interactive Computer Model for Achieving Personal Financial Goals" (Dilbeck, Hinds, Ulivi; discussants Burton, Peterson); "The Economics…
The potential value of Clostridium difficile vaccine: an economic computer simulation model.
Lee, Bruce Y; Popovich, Michael J; Tian, Ye; Bailey, Rachel R; Ufberg, Paul J; Wiringa, Ann E; Muder, Robert R
2010-07-19
Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at-risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies, especially when being used post-CDI treatment to prevent recurrent disease. (c) 2010 Elsevier Ltd. All rights reserved. PMID:20541582
A solid reactor core thermal model for nuclear thermal rockets
NASA Astrophysics Data System (ADS)
Rider, William J.; Cappiello, Michael W.; Liles, Dennis R.
1991-01-01
A Helium/Hydrogen Cooled Reactor Analysis (HERA) computer code has been developed. HERA has the ability to model arbitrary geometries in three dimensions, which allows the user to easily analyze reactor cores constructed of prismatic graphite elements. The code accounts for heat generation in the fuel, control rods, and other structures; conduction and radiation across gaps; convection to the coolant; and a variety of boundary conditions. The numerical solution scheme has been optimized for vector computers, making long transient analyses economical. Time integration is either explicit or implicit, which allows the use of the model to accurately calculate both short- or long-term transients with an efficient use of computer time. Both the basic spatial and temporal integration schemes have been benchmarked against analytical solutions.
Parallel Optimization of 3D Cardiac Electrophysiological Model Using GPU
Xia, Yong; Wang, Kuanquan; Zhang, Henggui
2015-01-01
Large-scale 3D virtual heart model simulations are highly demanding in computational resources. This imposes a big challenge to the traditional computation resources based on CPU environment, which already cannot meet the requirement of the whole computation demands or are not easily available due to expensive costs. GPU as a parallel computing environment therefore provides an alternative to solve the large-scale computational problems of whole heart modeling. In this study, using a 3D sheep atrial model as a test bed, we developed a GPU-based simulation algorithm to simulate the conduction of electrical excitation waves in the 3D atria. In the GPU algorithm, a multicellular tissue model was split into two components: one is the single cell model (ordinary differential equation) and the other is the diffusion term of the monodomain model (partial differential equation). Such a decoupling enabled realization of the GPU parallel algorithm. Furthermore, several optimization strategies were proposed based on the features of the virtual heart model, which enabled a 200-fold speedup as compared to a CPU implementation. In conclusion, an optimized GPU algorithm has been developed that provides an economic and powerful platform for 3D whole heart simulations. PMID:26581957
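The ODE/PDE decoupling the abstract describes amounts to operator splitting: each cell's reaction kinetics advance independently (ideal for one GPU thread per cell), and a separate diffusion step couples neighboring cells. Below is a serial 1D sketch with toy FitzHugh-Nagumo-style kinetics standing in for the paper's atrial cell model; the parameter values are illustrative.

```python
import numpy as np

def step_monodomain(v, w, dt, dx, D=0.1):
    """One operator-split time step of a 1D monodomain-style model.

    Splits the update into (a) a per-cell reaction ODE (toy FitzHugh-
    Nagumo-like kinetics, an illustrative stand-in) and (b) an explicit
    finite-difference diffusion step. On a GPU, each part maps to an
    independent per-cell kernel.
    """
    # (a) local reaction ODE, independent per cell -> embarrassingly parallel
    dv = v * (1.0 - v) * (v - 0.1) - w
    dw = 0.01 * (0.5 * v - w)
    v = v + dt * dv
    w = w + dt * dw
    # (b) diffusion (PDE) step: 3-point Laplacian, crude no-flux boundaries
    lap = np.zeros_like(v)
    lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]
    return v + dt * D * lap, w

# Excite one end of the fiber and let the wave propagate:
v = np.zeros(100)
v[:5] = 1.0
w = np.zeros(100)
for _ in range(2000):
    v, w = step_monodomain(v, w, dt=0.1, dx=1.0)
```

Because step (a) touches each cell independently, it parallelizes trivially across GPU threads; only step (b) needs neighbor communication, which is the design point the speedup in the abstract exploits.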
Pricing the Computing Resources: Reading Between the Lines and Beyond
NASA Technical Reports Server (NTRS)
Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)
2001-01-01
Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that potential is realized only when the necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not contribute to the assertion that markets provide the best framework for global scheduling. We examine the well-known scheduling schemes that concern pricing and markets, using our criterion for what qualifies as an application of economics. Our conclusion is that none of the schemes examined makes full use of economics.
NASA Astrophysics Data System (ADS)
Tootle, G. A.; Gutenson, J. L.; Zhu, L.; Ernest, A. N. S.; Oubeidillah, A.; Zhang, X.
2015-12-01
The National Flood Interoperability Experiment (NFIE), held June 3-July 17, 2015 at the National Water Center (NWC) in Tuscaloosa, Alabama, sought to demonstrate an increase in flood predictive capacity for the coterminous United States (CONUS). Accordingly, NFIE-derived technologies and workflows offer the ability to forecast flood damage and economic consequence estimates that coincide with the hydrologic and hydraulic estimations these physics-based models generate. A model providing an accurate prediction of damage and economic consequences is a valuable asset when allocating funding for disaster response, recovery, and relief. Damage prediction and economic consequence assessment also offer an adaptation planning mechanism for defending particularly valuable or vulnerable structures. The NFIE, held at the NWC on The University of Alabama (UA) campus, led to the development of a large-scale flow and inundation forecasting framework. Currently, the system can produce 15-hour lead-time forecasts for the entire CONUS, a capability anticipated to become operational at the NWC as of May 2016. The processing of such a large-scale, fine-resolution model is accomplished in a parallel computing environment using large supercomputing clusters. Traditionally, flood damage and economic consequence assessment is calculated in a desktop computing environment with an assortment of meteorological, hydrologic, hydraulic, and damage assessment tools. In the United States, a range of flood damage and economic consequence assessment software packages is available to local, state, and federal emergency management agencies. Among the more commonly used and freely accessible models are the Hydrologic Engineering Center's Flood Damage Reduction Analysis (HEC-FDA), Flood Impact Assessment (HEC-FIA), and the Federal Emergency Management Agency's (FEMA's) United States Multi-Hazard model (Hazus-MH), all of which exist only in a desktop environment.
With this in mind, the authors submit an initial framework for estimating damage and economic consequences of floods using flow and inundation products from the NFIE framework. This adaptive system utilizes existing nationwide datasets describing the location and use of structures and can assimilate a range of data resolutions.
Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun
2016-12-01
Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage of PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects due to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, departing from conventional valuation methods, this paper introduces the two conducting variables into a computable general equilibrium (CGE) model to assess the impacts on individual sectors and on the whole economic system caused by PM2.5 pollution. The results show that substantial health effects from PM2.5 pollution occurred among Beijing residents in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, Beijing's gross domestic product loss due to the health impact of PM2.5 pollution is estimated at 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects but also brings huge economic loss.
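The exposure-response step in studies of this kind can be illustrated with a minimal log-linear calculation. All numbers below (the coefficient beta, reference concentration, baseline rate, and population) are illustrative placeholders, not the values used in the Beijing study.

```python
import math

def attributable_cases(beta, conc, conc_ref, baseline_rate, population):
    """Attributable health burden from a log-linear exposure-response
    function: relative risk RR = exp(beta * (C - C0)).  The population
    attributable fraction is AF = (RR - 1) / RR."""
    rr = math.exp(beta * max(conc - conc_ref, 0.0))
    af = (rr - 1.0) / rr
    return af * baseline_rate * population

# Illustrative inputs only: a mortality coefficient per (ug/m3), an
# annual-mean PM2.5 of 89.5 ug/m3 against a 10 ug/m3 reference level.
deaths = attributable_cases(beta=0.00038, conc=89.5, conc_ref=10.0,
                            baseline_rate=0.005, population=20_000_000)
```

The resulting case counts would then feed the labour-loss and medical-expenditure variables that enter the CGE model.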
Modelling the role of forests on water provision services: a hydro-economic valuation approach
NASA Astrophysics Data System (ADS)
Beguería, S.; Campos, P.
2015-12-01
Hydro-economic models that allow integrating the ecological, hydrological, infrastructure, economic and social aspects into a coherent, scientifically informed framework constitute preferred tools for supporting decision making in the context of integrated water resources management. We present a case study of water regulation and provision services of forests in the Andalusia region of Spain. Our model computes the physical water flows and conducts an economic environmental income and asset valuation of forest surface and underground water yield. Based on available hydrologic and economic data, we develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is integrated within a much larger project aiming at providing a robust and easily replicable accounting tool to evaluate yearly the total income and capital of forests, encompassing all measurable sources of private and public incomes (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). We also force our simulation with future socio-economic scenarios to quantify the physical and economic effects of expected trends or simulated public and private policies on future water resources. Only a comprehensive integrated tool may serve as a basis for the development of integrated policies, such as those internationally agreed and recommended for the management of water resources.
Modeling and Simulation of the Economics of Mining in the Bitcoin Market.
Cocco, Luisanna; Marchesi, Michele
2016-01-01
On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations required to mine Bitcoin have become more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA, and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon, and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network.
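The core economic calculation faced by each miner agent in a model of this kind can be sketched as expected block revenue minus electricity cost. The function and all the figures below are illustrative assumptions for exposition, not parameters or data from the paper.

```python
def daily_mining_profit(hash_rate, network_hash_rate, block_reward,
                        btc_price, power_watts, electricity_price):
    """Expected daily profit of one miner: its share of network hash
    power times the expected block rewards per day, minus electricity
    cost.  Inputs are illustrative, not historical market data."""
    blocks_per_day = 24 * 6                     # one block every ~10 minutes
    revenue = (hash_rate / network_hash_rate) * blocks_per_day \
              * block_reward * btc_price
    cost = power_watts / 1000.0 * 24 * electricity_price   # kWh per day
    return revenue - cost

# Hypothetical miner: 1 TH/s in a 100 PH/s network.
profit = daily_mining_profit(hash_rate=1e12, network_hash_rate=1e17,
                             block_reward=6.25, btc_price=30000.0,
                             power_watts=1500.0, electricity_price=0.10)
```

In an agent-based market, each miner compares this quantity against hardware depreciation to decide whether to invest in new hardware or exit, which is what drives the hashing-capability and expenditure dynamics the abstract mentions.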
Stochastic Multi-Timescale Power System Operations With Variable Wind Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hongyu; Krad, Ibrahim; Florita, Anthony
This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models while maintaining their computational tractability. Comparative case studies against two deterministic approaches, one with perfect forecasts and the other with current state-of-the-art but imperfect forecasts, are conducted in low-wind and high-wind penetration scenarios to highlight the advantages of the proposed methodology. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
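Stripped of security constraints and stochastic scenarios, the economic dispatch stage reduces to merit-order loading: cheapest units first until demand is met. The sketch below shows only that simplified core (unit names and data are hypothetical), not the paper's SCED formulation.

```python
def economic_dispatch(demand, units):
    """Merit-order economic dispatch: sort units by marginal cost and
    load each to its capacity until demand is met.  A minimal stand-in
    for the SCED stage; the paper's models add network security
    constraints and stochastic scenarios on top of this logic.

    units: iterable of (name, capacity_MW, marginal_cost_per_MWh).
    """
    dispatch = {}
    remaining = demand
    for name, capacity, cost in sorted(units, key=lambda u: u[2]):
        g = min(capacity, remaining)
        dispatch[name] = g
        remaining -= g
    if remaining > 1e-9:
        raise ValueError("insufficient capacity to meet demand")
    return dispatch

# Hypothetical fleet: zero-marginal-cost wind is dispatched first.
units = [("coal", 100.0, 20.0), ("gas", 80.0, 40.0), ("wind", 50.0, 0.0)]
plan = economic_dispatch(150.0, units)
```

A stochastic version solves this for many load/wind scenarios and couples the scenario solutions, which is where the progressive hedging algorithm comes in.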
Shizgal, Peter
2012-01-01
Almost 80 years ago, Lionel Robbins proposed a highly influential definition of the subject matter of economics: the allocation of scarce means that have alternative ends. Robbins confined his definition to human behavior, and he strove to separate economics from the natural sciences in general and from psychology in particular. Nonetheless, I extend his definition to the behavior of non-human animals, rooting my account in psychological processes and their neural underpinnings. Some historical developments are reviewed that render such a view more plausible today than would have been the case in Robbins' time. To illustrate a neuroeconomic perspective on decision making in non-human animals, I discuss research on the rewarding effect of electrical brain stimulation. Central to this discussion is an empirically based, functional/computational model of how the subjective intensity of the electrical reward is computed and combined with subjective costs so as to determine the allocation of time to the pursuit of reward. Some successes achieved by applying the model are discussed, along with limitations, and evidence is presented regarding the roles played by several different neural populations in processes posited by the model. I present a rationale for marshaling convergent experimental methods to ground psychological and computational processes in the activity of identified neural populations, and I discuss the strengths, weaknesses, and complementarity of the individual approaches. I then sketch some recent developments that hold great promise for advancing our understanding of structure-function relationships in neuroscience in general and in the neuroeconomic study of decision making in particular.
Smith, Richard D; Keogh-Brown, Marcus R; Barnett, Tony; Tait, Joyce
2009-11-19
To estimate the potential economic impact of pandemic influenza, associated behavioural responses, school closures, and vaccination on the United Kingdom. A computable general equilibrium model of the UK economy was specified for various combinations of mortality and morbidity from pandemic influenza, vaccine efficacy, school closures, and prophylactic absenteeism using published data. The 2004 UK economy (the most up to date available with suitable economic data). The economic impact of various scenarios with different pandemic severity, vaccination, school closure, and prophylactic absenteeism specified in terms of gross domestic product, output from different economic sectors, and equivalent variation. The costs related to illness alone ranged between 0.5% and 1.0% of gross domestic product (£8.4bn to £16.8bn) for low fatality scenarios, 3.3% and 4.3% (£55.5bn to £72.3bn) for high fatality scenarios, and larger still for an extreme pandemic. School closure increases the economic impact, particularly for mild pandemics. If widespread behavioural change takes place and there is large scale prophylactic absence from work, the economic impact would be notably increased with few health benefits. Vaccination with a pre-pandemic vaccine could save 0.13% to 2.3% of gross domestic product (£2.2bn to £38.6bn); a single dose of a matched vaccine could save 0.3% to 4.3% (£5.0bn to £72.3bn); and two doses of a matched vaccine could limit the overall economic impact to about 1% of gross domestic product for all disease scenarios. Balancing school closure against "business as usual" and obtaining sufficient stocks of effective vaccine are more important factors in determining the economic impact of an influenza pandemic than is the disease itself. Prophylactic absence from work in response to fear of infection can add considerably to the economic impact.
Hallmann, Kirstin; Breuer, Christoph
2014-01-01
This article analyses sport participation using a demographic-economic model extended by the construct 'social recognition'. Social recognition was integrated into the model on the understanding that each individual seeks to maximise his or her utility. A computer-assisted telephone interview survey was conducted in the city of Rheinberg, Germany, producing an overall sample of n=1934. Regression analyses were performed to estimate the impact of socio-demographic and economic determinants and of social recognition on sport participation. The results suggest that various socio-economic factors and social recognition are important determinants of sport participation on the one hand and of sport frequency on the other. Social recognition plays a significant yet different role for both sport participation and sport frequency: while friends' involvement with sport influences one's sport participation, parents' involvement with sport influences one's sport frequency.
Volume of the steady-state space of financial flows in a monetary stock-flow-consistent model
NASA Astrophysics Data System (ADS)
Hazan, Aurélien
2017-05-01
We show that a steady-state stock-flow consistent macro-economic model can be represented as a Constraint Satisfaction Problem (CSP). The set of solutions is a polytope whose volume depends on the constraints applied and reveals the potential fragility of the economic circuit, with no need to study the dynamics. Several methods to compute the volume, both exact and approximate, are compared, inspired by operations research methods and the analysis of metabolic networks. We also introduce a random transaction matrix and study the particular case of linear flows with respect to money stocks.
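The simplest approximate method for such a volume is Monte Carlo rejection sampling inside a bounding box, sketched below with NumPy and checked on a 2-D simplex. This is an assumption-laden toy, workable only in low dimension; the exact and hit-and-run style methods the paper compares scale far better as the number of financial-flow constraints grows.

```python
import numpy as np

def polytope_volume_mc(A, b, lower, upper, n_samples=200_000, seed=0):
    """Estimate the volume of the polytope {x : A x <= b} by rejection
    sampling inside the bounding box [lower, upper]^d."""
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = rng.uniform(lower, upper, size=(n_samples, d))
    inside = np.all(x @ A.T <= b, axis=1)          # satisfy every constraint
    box_volume = (upper - lower) ** d
    return box_volume * inside.mean()

# Sanity check on the 2-D simplex {x, y >= 0, x + y <= 1}: volume 1/2.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
vol = polytope_volume_mc(A, b, lower=0.0, upper=1.0)
```

In the paper's setting, the rows of A encode accounting (stock-flow consistency) constraints, and a small relative volume signals a fragile economic circuit with little slack in feasible flows.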
Modeling the internal combustion engine
NASA Technical Reports Server (NTRS)
Zeleznik, F. J.; Mcbride, B. J.
1985-01-01
A flexible and computationally economical model of the internal combustion engine was developed for use on large digital computer systems. It is based on a system of ordinary differential equations for cylinder-averaged properties. The computer program is capable of multicycle calculations, with some parameters varying from cycle to cycle, and has restart capabilities. It can accommodate a broad spectrum of reactants, permits changes in physical properties, and offers a wide selection of alternative modeling functions without any reprogramming. It readily adapts to the amount of information available in a particular case because the model is in fact a hierarchy of five models. The models range from a simple model requiring only thermodynamic properties to a complex model demanding full combustion kinetics, transport properties, and poppet valve flow characteristics. Among its many features the model includes heat transfer, valve timing, supercharging, motoring, finite burning rates, cycle-to-cycle variations in air-fuel ratio, humid air, residual and recirculated exhaust gas, and full combustion kinetics.
NASA Astrophysics Data System (ADS)
Bataille, Christopher G. F.
2005-11-01
Are further energy efficiency gains, or more recently greenhouse gas reductions, expensive or cheap? Analysts provide conflicting advice to policy makers based on divergent modelling perspectives, a 'top-down/bottom-up debate' in which economists use equation-based models that equilibrate markets by maximizing consumer welfare, while technologists use technology simulation models that minimize the financial cost of providing energy services. This thesis summarizes a long-term research project to find a middle ground between these two positions that is more useful to policy makers. Starting with the individual components of a behaviourally realistic and technologically explicit simulation model (ISTUM---Inter Sectoral Technology Use Model), or "hybrid", the individual sectors of the economy are linked using a framework of micro- and macro-economic feedbacks. These feedbacks are taken from the economic theory that informs the computable general equilibrium (CGE) family of models. Speaking the languages of both economists and engineers, the resulting "physical" equilibrium model of Canada (CIMS---Canadian Integrated Modeling System) equilibrates energy and end-product markets, including imports and exports, for seven regions and 15 economic sectors, including primary industry, manufacturing, transportation, commerce, residences, governmental infrastructure and the energy supply sectors. Several different policy experiments demonstrate the value added by the model and how its results compare to top-down and bottom-up practice. In general, the results show that technical adjustments make up about half the response to simulated energy policy, and macroeconomic demand adjustments the other half. Induced technical adjustments predominate with minor policies, while the importance of macroeconomic demand adjustment increases with the strength of the policy. 
Results are also shown for an experiment to derive estimates of future elasticity of substitution (ESUB) and autonomous energy efficiency index (AEEI) values from the model, parameters that could be used in long-run computable general equilibrium (CGE) analysis. The thesis concludes with a summary of the strengths and weaknesses of the new model as a policy tool, a work plan for its further improvement, and a discussion of the general potential for technologically explicit general equilibrium modelling.
Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing
NASA Astrophysics Data System (ADS)
Wyld, David C.
Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power - as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks at the risks and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.
Physics and financial economics (1776-2014): puzzles, Ising and agent-based models.
Sornette, Didier
2014-06-01
This short review presents a selected history of the mutual fertilization between physics and economics, from Isaac Newton and Adam Smith to the present. The fundamentally different perspectives embraced in theories developed in financial economics compared with physics are dissected with the examples of the volatility smile and of the excess volatility puzzle. The role of the Ising model of phase transitions in modeling social and financial systems is reviewed, with the concepts of random utilities and the logit model as the analog of the Boltzmann factor in statistical physics. Recent extensions in terms of quantum decision theory are also covered. A wealth of models that build on the Ising model and generalize it to account for the many stylized facts of financial markets are discussed briefly. A summary of the relevance of the Ising model and its extensions is provided to account for financial bubbles and crashes. The review would be incomplete if it did not cover the dynamical field of agent-based models (ABMs), also known as computational economic models, of which the Ising-type models are just special ABM implementations. We formulate the 'Emerging Intelligence Market Hypothesis' to reconcile the pervasive presence of 'noise traders' with the near efficiency of financial markets. Finally, we note that evolutionary biology, more than physics, is now playing a growing role in inspiring models of financial markets.
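The logit/Boltzmann analogy mentioned in this abstract is compact enough to state in code: the multinomial logit choice probability has exactly the form of a Boltzmann distribution, with the noise parameter beta playing the role of inverse temperature. A minimal sketch with illustrative utilities:

```python
import numpy as np

def logit_choice_probabilities(utilities, beta):
    """Multinomial logit: P(i) = exp(beta*U_i) / sum_j exp(beta*U_j),
    formally identical to the Boltzmann factor with beta as inverse
    temperature.  beta -> 0 gives uniform random choice; beta -> infinity
    gives deterministic utility maximization."""
    z = beta * np.asarray(utilities, dtype=float)
    z -= z.max()                      # shift for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Illustrative utilities for three alternatives, at two noise levels.
p_noisy = logit_choice_probabilities([1.0, 2.0, 3.0], beta=0.1)
p_sharp = logit_choice_probabilities([1.0, 2.0, 3.0], beta=10.0)
```

In Ising-type market models, the same beta then controls the transition between disordered (noise-trader dominated) and ordered (herding) regimes.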
Economic analysis and assessment of syngas production using a modeling approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei
Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm3/h capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters except loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for economic analysis and assessment of syngas production using a modeling approach.
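The parametric approach described here, annualized capital plus operating cost divided by annual output, can be sketched in a few lines. The function and every input value below are placeholders for illustration, not the measured Mississippi State facility data, and the capital recovery factor is a standard engineering-economics formula rather than the paper's exact C++ implementation.

```python
def syngas_unit_cost(capital_cost, loan_life, interest_rate,
                     annual_operating_cost, capacity_nm3_h, hours_per_year):
    """Unit production cost ($/Nm3): capital cost annualized with the
    capital recovery factor CRF = r(1+r)^n / ((1+r)^n - 1), plus annual
    operating cost, divided by annual syngas output."""
    r, n = interest_rate, loan_life
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    annual_cost = capital_cost * crf + annual_operating_cost
    return annual_cost / (capacity_nm3_h * hours_per_year)

# Hypothetical inputs for a small gasifier.
cost = syngas_unit_cost(capital_cost=500_000.0, loan_life=10,
                        interest_rate=0.05, annual_operating_cost=150_000.0,
                        capacity_nm3_h=60.0, hours_per_year=7000.0)
```

The CRF structure also explains the sensitivity result: a longer loan life spreads capital over more years and lowers the unit cost, opposite in direction to every other parameter.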
Economic decision making and the application of nonparametric prediction models
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
2008-01-01
Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright © 2008 Society of Petroleum Engineers.
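One of the simplest nonparametric local regression estimators of the kind this abstract refers to is the Nadaraya-Watson kernel smoother: the prediction at an untested site is a distance-weighted average of observations at drilled sites. The sketch below is a generic stand-in with synthetic data, not the paper's model of the Antrim-shale play.

```python
import numpy as np

def kernel_regression(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson regression with a Gaussian kernel: each predicted
    value is a weighted average of y_train, with weights decaying in the
    distance between the query point and each training point."""
    d = x_query[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

# Synthetic 1-D illustration: observed volumes along a transect.
x_train = np.linspace(0.0, 10.0, 50)
y_train = np.sin(x_train) + 2.0            # synthetic "recoverable volume"
x_query = np.array([1.0, 5.0, 9.0])        # untested sites
pred = kernel_regression(x_train, y_train, x_query, bandwidth=0.5)
```

The bandwidth choice controls exactly the smoothing trade-off the paper flags: wider bandwidths predict aggregate volumes well but flatten the site-to-site granularity that economic cost functions depend on.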
Applied Dynamic Analysis of the Global Economy (ADAGE)
ADAGE is a dynamic computable general equilibrium (CGE) model capable of examining many types of economic, energy, environmental, climate change mitigation, and trade policies at the international, national, U.S. regional, and U.S. state levels. To investigate proposed policy eff...
NASA Astrophysics Data System (ADS)
Albersen, Peter J.; Houba, Harold E. D.; Keyzer, Michiel A.
A general approach is presented to value the stocks and flows of water, as well as the physical structure of the basin, on the basis of an arbitrary process-based hydrological model. This approach adapts concepts from the economic theory of capital accumulation, which are based on Lagrange multipliers that reflect market prices in the absence of markets. This permits deriving a financial account, complementing the water balance, in which the value of deliveries by the hydrological system fully balances with the value of resources, including physical characteristics reflected in the shape of the functions in the model. The approach naturally suggests the use of numerical optimization software to compute the multipliers, without the need to impose an immensely large number of small perturbations on the simulation model or to calculate all derivatives analytically. A novel procedure is proposed to circumvent numerical problems in the computation, and it is implemented in a numerical application using AQUA, an existing model of the Upper-Zambezi River. It appears, not unexpectedly, that most end value accrues to agriculture. Irrigated agriculture receives a remarkably large share and is by far the most rewarding activity. Furthermore, according to the model, the economic value would be higher if temperature were lower, pointing to the detrimental effect of climate change. We also find that a significant economic value is stored in the groundwater stock because of its critical role in the dry season. As groundwater emerges as the main capital asset of the basin, its mining could be harmful.
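The central idea, that a Lagrange multiplier on a resource constraint is its shadow price, can be shown on a stylized allocation problem with a closed-form solution. Everything below is a toy assumption for exposition (log-utility sector weights, a single water constraint), not the AQUA model; the finite-difference check is included only to verify the multiplier, whereas the paper's point is precisely that such perturbations are unnecessary when the multipliers come from the optimizer.

```python
import numpy as np

def optimal_value(a, W):
    """Maximize sum_i a_i * ln(x_i) subject to sum_i x_i = W, a stylized
    water-allocation problem (a_i: sector weights, W: total water).
    First-order conditions give x_i = a_i * W / sum(a)."""
    a = np.asarray(a, dtype=float)
    x = a * W / a.sum()
    return np.sum(a * np.log(x))

a = [3.0, 1.0, 2.0]          # illustrative sector weights
W = 100.0

# Shadow price of water: the Lagrange multiplier lambda = sum(a) / W ...
lam = sum(a) / W
# ... equals the marginal value of the resource, dV*/dW, checked here
# by central finite differences.
fd = (optimal_value(a, W + 1e-4) - optimal_value(a, W - 1e-4)) / 2e-4
```

In the basin-scale model the same multipliers, one per water stock and flow, populate the financial account that complements the physical water balance.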
Modeling of an intelligent pressure sensor using functional link artificial neural networks.
Patra, J C; van den Bos, A
2000-01-01
A capacitive pressure sensor (CPS) is modeled for accurate readout of applied pressure using a novel artificial neural network (ANN). The proposed functional link ANN (FLANN) is a computationally efficient nonlinear network capable of complex nonlinear mapping between its input and output pattern spaces. The nonlinearity is introduced into the FLANN by passing the input pattern through a functional expansion unit. Three different polynomial expansions, namely Chebyshev, Legendre, and power series, have been employed in the FLANN. The FLANN offers a computational advantage over a multilayer perceptron (MLP) for similar performance in modeling the CPS. The prime aim of the present paper is to develop an intelligent model of the CPS involving less computational complexity, so that its implementation can be economical and robust. It is shown that, over a wide temperature variation ranging from -50 to 150 degrees C, the maximum error of estimation of pressure remains within ±3%. With the help of computer simulation, the performance of the three types of FLANN models has been compared to that of an MLP-based model.
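The FLANN structure, a functional expansion unit followed by a single trainable linear layer, can be sketched with the Chebyshev variant. The fit below uses a synthetic nonlinear response rather than CPS data, and plain least squares in place of the online LMS training a sensor implementation would typically use.

```python
import numpy as np

def chebyshev_expand(x, order):
    """Functional expansion unit: map each scalar input in [-1, 1] to
    Chebyshev polynomials T_0..T_order via the recurrence
    T_{n+1}(x) = 2 x T_n(x) - T_{n-1}(x)."""
    feats = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        feats.append(2 * x * feats[-1] - feats[-2])
    return np.stack(feats[: order + 1], axis=-1)

# A single-layer FLANN is just a linear map on the expanded features:
# no hidden layer, hence the computational advantage over an MLP.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 0.5 * x**3 - 0.2 * x + 0.1            # synthetic nonlinear "sensor" response
Phi = chebyshev_expand(x, order=4)         # (200, 5) feature matrix
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w
```

Because the only trainable parameters are the output weights, training reduces to a linear problem, which is what makes the FLANN cheap enough for an embedded sensor readout.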
The mineral sector and economic development in Ghana: A computable general equilibrium analysis
NASA Astrophysics Data System (ADS)
Addy, Samuel N.
A computable general equilibrium (CGE) model is formulated for conducting mineral policy analysis in the context of national economic development for Ghana. The model, called GHANAMIN, places strong emphasis on production, trade, and investment. It can be used to examine both micro- and macro-economic impacts of policies associated with mineral investment, taxation, and terms of trade changes, as well as mineral sector performance impacts due to technological change or the discovery of new deposits. Its economywide structure enables the study of broader development policy with a focus on individual or multiple sectors simultaneously. After going through a period of contraction for about two decades, mining in Ghana has rebounded significantly and is currently the main foreign exchange earner. Gold alone contributed 44.7 percent of 1994 total export earnings. GHANAMIN is used to investigate the economywide impacts of mineral tax policies, world market mineral price changes, mining investment, and increased mineral exports. It is also used for identifying key sectors for economic development. Various simulations were undertaken, with the following results: Recently implemented mineral tax policies are welfare increasing, but have an accompanying decrease in the output of other export sectors. World mineral price rises stimulate an increase in real GDP; however, this increase is smaller than the real GDP decrease associated with a corresponding price decline. Investment in the non-gold mining sector increases real GDP more than investment in gold mining, because of the former's stronger linkages to the rest of the economy. Increased mineral exports are very beneficial to the overall economy. Foreign direct investment (FDI) in mining increases welfare more so than domestic capital, which is very limited. 
Mining investment and the increased mineral exports since 1986 have contributed significantly to the country's economic recovery, with gold mining accounting for 95 percent of the mineral sector's contribution. The mining sector in general is identified as a leading sector for economic development.
Testing simulation and structural models with applications to energy demand
NASA Astrophysics Data System (ADS)
Wolff, Hendrik
2007-12-01
This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand, with examples illustrating the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation models. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis, theory restricts the shape as well as other characteristics of the functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity-preserving point estimates, (c) to avoid biases present in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory on a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, it is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations.
Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality theory. Both results would not necessarily be achieved using standard econometric methods. The final chapter "Daylight Time and Energy" uses a quasi-experiment to evaluate a popular energy conservation policy: we challenge the conventional wisdom that extending Daylight Saving Time (DST) reduces energy demand. Using detailed panel data on half-hourly electricity consumption, prices, and weather conditions from four Australian states we employ a novel 'triple-difference' technique to test the electricity-saving hypothesis. We show that the extension failed to reduce electricity demand and instead increased electricity prices. We also apply the most sophisticated electricity simulation model available in the literature to the Australian data. We find that prior simulation models significantly overstate electricity savings. Our results suggest that extending DST will fail as an instrument to save energy resources.
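The "triple-difference" idea in the final chapter can be illustrated with invented cell means (the numbers below are made up, not the Australian estimates): difference consumption across DST-affected versus unaffected hours, extension versus baseline period, and treated versus control states, so that common shocks cancel at each level.

```python
# Illustrative triple-difference (DDD) computation with invented cell means.
# Dimensions: state group (treated/control), period (extension/baseline),
# and hours of day (affected by DST / unaffected). The DDD estimate is the
# difference-in-differences among treated states minus that among controls.

def did(cells, state):
    """Difference-in-differences over period x hours for one state group."""
    return ((cells[(state, "ext", "affected")] - cells[(state, "ext", "unaffected")])
            - (cells[(state, "base", "affected")] - cells[(state, "base", "unaffected")]))

def ddd(cells):
    return did(cells, "treated") - did(cells, "control")

# Hypothetical mean half-hourly consumption (MWh) per cell.
cells = {
    ("treated", "base", "affected"): 100.0, ("treated", "base", "unaffected"): 80.0,
    ("treated", "ext",  "affected"): 103.0, ("treated", "ext",  "unaffected"): 81.0,
    ("control", "base", "affected"): 90.0,  ("control", "base", "unaffected"): 70.0,
    ("control", "ext",  "affected"): 91.0,  ("control", "ext",  "unaffected"): 71.0,
}
effect = ddd(cells)  # 2.0 in this made-up example: demand rose, not fell
```

In practice the estimator is run as a regression with the full set of two-way interactions and a triple-interaction term, which also yields standard errors; the cell-mean arithmetic above is the underlying identification logic.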
The Use of Computer Simulation Gaming in Teaching Broadcast Economics.
ERIC Educational Resources Information Center
Mancuso, Louis C.
The purpose of this study was to develop a broadcast economics computer simulation and to ascertain how a lecture-computer simulation game compared as a teaching method with more traditional lecture and case-study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…
Optimal Energy Extraction From a Hot Water Geothermal Reservoir
NASA Astrophysics Data System (ADS)
Golabi, Kamal; Scherer, Charles R.; Tsang, Chin Fu; Mozumder, Sashi
1981-01-01
An analytical decision model is presented for determining optimal energy extraction rates from hot water geothermal reservoirs when cooled brine is reinjected into the hot water aquifer. This applied economic management model computes the optimal fluid pumping rate and reinjection temperature and the project (reservoir) life consistent with maximum present worth of the net revenues from sales of energy for space heating. The real value of product energy is assumed to increase with time, as is the cost of energy used in pumping the aquifer. The economic model is implemented by using a hydrothermal model that relates hydraulic pumping rate to the quality (temperature) of remaining heat energy in the aquifer. The results of a numerical application to space heating show that profit-maximizing extraction rate increases with interest (discount) rate and decreases as the rate of rise of real energy value increases. The economic life of the reservoir generally varies inversely with extraction rate. Results were shown to be sensitive to permeability, initial equilibrium temperature, well cost, and well life.
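The trade-off that drives the model above can be caricatured in a few lines; every number below is invented, and the linear cooling rule stands in for the paper's hydrothermal model. Pumping faster brings revenue forward but cools the reservoir sooner, shortening its economic life.

```python
# Toy version of the extraction trade-off (all parameters invented):
# pumping faster earns revenue sooner but cools the reservoir faster,
# ending the project earlier. We grid-search the constant pumping rate
# that maximizes discounted net revenue.

def npv(rate, r=0.08, t0=150.0, tmin=60.0, price=2.0, pump_cost=0.5):
    """Discounted net revenue for a constant pumping rate (toy model)."""
    temp, value, year = t0, 0.0, 0
    while year <= 200:
        # Annual net revenue: energy sales minus pumping cost.
        net = price * rate * (temp - tmin) - pump_cost * rate
        if net <= 0:
            break  # economic life of the reservoir reached
        value += net / (1 + r) ** year
        temp -= 0.05 * rate * (temp - tmin)  # cooling from reinjected brine
        year += 1
    return value

best_rate = max((npv(q), q) for q in [1, 2, 4, 8, 16])[1]
```

Raising the discount rate r in this sketch favors faster extraction, mirroring the paper's finding that the profit-maximizing rate increases with the interest rate.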
Quantum-like Probabilistic Models Outside Physics
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
We present a quantum-like (QL) model in which contexts (complexes of, e.g., mental, social, biological, economic, or even political conditions) are represented by complex probability amplitudes. This approach makes it possible to apply the mathematical quantum formalism to probabilities induced in any domain of science. In our model, quantum randomness appears not as irreducible randomness (as is commonly accepted in conventional quantum mechanics, e.g., by von Neumann and Dirac), but as a consequence of obtaining incomplete information about a system. We pay particular attention to the QL description of the processing of incomplete information. Our QL model can be useful in the cognitive, social, and political sciences, as well as in economics and artificial intelligence. In this paper we consider in more detail one special application: QL modeling of the brain's functioning. The brain is modeled as a QL-computer.
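The operational difference between a QL model and classical probability is an interference term added to the law of total probability. A minimal numeric sketch, with invented probabilities and phase:

```python
import math

# Classical law of total probability vs. the quantum-like (QL) version
# with an interference term, as used in QL models of cognition. The
# context probabilities and phase below are invented for illustration.

def classical_total(p_a, p_b_given_a):
    """P(b) = sum_a P(a) P(b|a) -- classical total probability."""
    return sum(pa * pb for pa, pb in zip(p_a, p_b_given_a))

def ql_total(p_a, p_b_given_a, theta):
    """QL total probability with an interference (cosine) correction."""
    base = classical_total(p_a, p_b_given_a)
    interference = 2.0 * math.sqrt(
        p_a[0] * p_b_given_a[0] * p_a[1] * p_b_given_a[1]) * math.cos(theta)
    return base + interference

p_a = [0.5, 0.5]   # probabilities of the two contexts
p_b = [0.4, 0.6]   # P(b | context) under each context
# theta = pi/2 reproduces the classical answer; other phases shift it,
# which is how incomplete contextual information manifests in the model.
```

The phase theta encodes the contextual incompatibility of the two conditioning contexts; estimating it from observed probabilities is what lets the formalism be applied outside physics.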
Integrated Assessment of Health-related Economic Impacts of U.S. Air Pollution Policy
NASA Astrophysics Data System (ADS)
Saari, R. K.; Rausch, S.; Selin, N. E.
2012-12-01
We examine the environmental impacts, health-related economic benefits, and distributional effects of a new US regulation to reduce smog from power plants: the Cross-State Air Pollution Rule. Using integrated assessment methods that link atmospheric and economic models, we assess the magnitude of economy-wide effects and distributional consequences that are not captured by traditional regulatory impact assessment methods. The Rule is a modified allowance trading scheme that caps emissions of nitrogen oxides and sulfur dioxide from power plants in the eastern United States and thus reduces ozone and particulate matter pollution. We use results from the regulatory regional air quality model, CAMx (the Comprehensive Air Quality Model with extensions), and epidemiologic studies in BenMAP (the Environmental Benefits Mapping and Analysis Program) to quantify differences in morbidities and mortalities due to this policy. To assess the economy-wide and distributional consequences of these health impacts, we apply a recently developed economic and policy model, the US Regional Energy and Environmental Policy Model (USREP), a multi-region, multi-sector, multi-household, recursive dynamic computable general equilibrium model of the US that provides a detailed representation of the energy sector and the ability to represent energy and environmental policies. We add to USREP a representation of air pollution impacts, including the estimation and valuation of health outcomes and their effects on health services, welfare, and factor markets. We find that the economic welfare benefits of the Rule are underestimated by traditional methods, which omit economy-wide impacts.
We also quantify the distribution of benefits, which have varying effects across US regions, income groups, and pollutants, and we identify factors influencing this distribution, including the geographic variation of pollution and population as well as underlying economic conditions.
Tufts, Jennifer B; Weathersby, Paul K; Rodriguez, Francisco A
2010-05-01
The purpose of this paper is to demonstrate the feasibility and utility of developing economic cost models for noise-induced hearing loss (NIHL). First, we outline an economic model of NIHL for a population of US Navy sailors with an "industrial"-type noise exposure. Next, we describe the effect on NIHL-related cost of varying the two central model inputs--the noise-exposure level and the duration of exposure. Such an analysis can help prioritize promising areas to which limited resources to reduce NIHL-related costs should be devoted. NIHL-related costs borne by the US government were computed on a yearly basis using a finite element approach that took into account varying levels of susceptibility to NIHL. Predicted hearing thresholds for the population were computed with ANSI S3.44-1996 and then used as the basis for the calculation of NIHL-related costs. Annual and cumulative costs were tracked. Noise-exposure level and duration were systematically varied to determine their effects on the expected lifetime NIHL-related cost of a specific US Navy sailor population. Our nominal noise-exposure case [93 dB(A) for six years] yielded a total expected lifetime cost of US $13,472 per sailor, with plausible lower and upper bounds of US $2,500 and US $26,000. Starting with the nominal case, a decrease of 50% in exposure level or duration would yield cost savings of approximately 23% and 19%, respectively. We concluded that a reduction in noise level would be somewhat more cost-effective than the same percentage reduction in years of exposure. Our economic cost model can be used to estimate the changes in NIHL-related costs that would result from changes in noise-exposure level and/or duration for a single military population. Although the model is limited at present, suggestions are provided for adapting it to civilian populations.
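The structure of such a cost model, though not the paper's actual ANSI S3.44 computation or dollar figures, can be sketched as follows; the dose-response rule, susceptibility quantiles, compensation fence, and cost rate are all invented placeholders:

```python
import math

# Toy sketch of the cost-model structure (NOT the ANSI S3.44 procedure or
# the paper's figures): threshold shift grows with exposure level above a
# floor and with log-duration, susceptibility varies across the
# population, and cost accrues in proportion to shift above a "fence".

def expected_cost(level_dba, years, cost_per_db=500.0):
    susceptibility = [0.5, 0.8, 1.0, 1.2, 1.5]  # population quantiles (invented)
    fence = 10.0                                # dB of shift where cost starts
    total = 0.0
    for s in susceptibility:
        shift = s * max(level_dba - 78.0, 0.0) * math.log10(1 + years)
        total += max(shift - fence, 0.0) * cost_per_db
    return total / len(susceptibility)

base = expected_cost(93.0, 6.0)
less_level = expected_cost(85.5, 6.0)   # halve the excess exposure level
less_years = expected_cost(93.0, 3.0)   # halve the exposure duration
```

Even this toy reproduces the qualitative finding: halving the excess exposure level cuts costs more than halving the duration, because threshold shift grows only logarithmically with years of exposure.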
The importance of employing computational resources for the automation of drug discovery.
Rosales-Hernández, Martha Cecilia; Correa-Basurto, José
2015-03-01
The application of computational tools to drug discovery helps researchers design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automation for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET) profiles, as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers design and discover new drugs for the treatment of several human diseases without side effects, allowing for the evaluation of millions of compounds at a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that after submitting one target, this software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.
DOT National Transportation Integrated Search
1996-11-01
The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...
NASA Astrophysics Data System (ADS)
Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.
2012-09-01
We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top-level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of the framework, as well as modifications and limitations.
Ou-Yang, Si-sheng; Lu, Jun-yan; Kong, Xiang-qian; Liang, Zhong-jie; Luo, Cheng; Jiang, Hualiang
2012-01-01
Computational drug discovery is an effective strategy for accelerating and economizing the drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization, and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms and successful applications in this field. PMID:22922346
Boundary-layer computational model for predicting the flow and heat transfer in sudden expansions
NASA Technical Reports Server (NTRS)
Lewis, J. P.; Pletcher, R. H.
1986-01-01
Fully developed turbulent and laminar flows through symmetric planar and axisymmetric expansions with heat transfer were modeled using a finite-difference discretization of the boundary-layer equations. By using the boundary-layer equations to model separated flow in place of the Navier-Stokes equations, computational effort was reduced, permitting turbulence modeling studies to be carried out economically. For laminar flow, the reattachment length was well predicted for Reynolds numbers as low as 20, and the details of the trapped eddy were well predicted for Reynolds numbers above 200. For turbulent flows, the Boussinesq assumption was used to express the Reynolds stresses in terms of a turbulent viscosity. Near-wall algebraic turbulence models based on Prandtl's mixing-length model and the maximum Reynolds shear stress were compared.
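Prandtl's mixing-length model, one of the near-wall closures compared above, can be stated in two lines; the sketch below uses generic illustrative values, not the study's flow conditions:

```python
# Prandtl mixing-length closure: the turbulent (eddy) viscosity is
#   nu_t = l^2 * |du/dy|,  with mixing length l = kappa * y near a wall
# (kappa ~ 0.41, the von Karman constant). Under the Boussinesq
# assumption the modeled Reynolds shear stress is then
#   -rho<u'v'> = rho * nu_t * du/dy.  Values below are illustrative only.

KAPPA = 0.41  # von Karman constant

def eddy_viscosity(y, dudy):
    """Turbulent viscosity from Prandtl's mixing-length model."""
    l = KAPPA * y              # near-wall mixing length
    return l * l * abs(dudy)

def reynolds_shear_stress(rho, y, dudy):
    """Modeled Reynolds shear stress via the Boussinesq assumption."""
    return rho * eddy_viscosity(y, dudy) * dudy

# Example: air-like density, 1 cm from the wall, shear rate 50 1/s.
tau_t = reynolds_shear_stress(1.2, 0.01, 50.0)
```

In a boundary-layer code this closure is evaluated at every grid point each sweep, which is why algebraic models of this kind keep the turbulence-modeling studies cheap.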
Salipur, Zdravko; Bertocci, Gina
2010-01-01
It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions.
Financial Structure and Economic Welfare: Applied General Equilibrium Development Economics.
Townsend, Robert
2010-09-01
This review provides a common framework for researchers thinking about the next generation of micro-founded macro models of growth, inequality, and financial deepening, as well as direction for policy makers targeting microfinance programs to alleviate poverty. Topics include treatment of financial structure general equilibrium models: testing for as-if-complete markets or other financial underpinnings; examining dual-sector models with both a perfectly intermediated sector and a sector in financial autarky, as well as a second generation of these models that embeds information problems and other obstacles to trade; designing surveys to capture measures of income, investment/savings, and flow of funds; and aggregating individuals and households to the level of network, village, or national economy. The review concludes with new directions that overcome conceptual and computational limitations.
An economic model of friendship and enmity for measuring social balance in networks
NASA Astrophysics Data System (ADS)
Lee, Kyu-Min; Shin, Euncheol; You, Seungil
2017-12-01
We propose a dynamic economic model of networks in which agents can be friends or enemies with one another. This is a decentralized relationship model in that agents decide whether to change their relationships so as to minimize their imbalanced triads. The model has a single parameter, which we call social temperature, that captures the degree to which agents care about social balance in their relationships. We show that the global structure of the relationship configuration converges to a unique stationary distribution. Using this stationary distribution, we characterize the maximum likelihood estimator of the social temperature parameter. Since the estimator is computationally challenging to calculate from real social network datasets, we provide a simple simulation algorithm and verify its performance on such data.
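A sketch of this kind of decentralized balance dynamics follows; it is an illustration in the spirit of the model, not the authors' exact specification, and the acceptance rule and parameter values are assumptions.

```python
import itertools
import math
import random

# Sketch of decentralized balance dynamics on a signed complete graph.
# Each edge is +1 (friends) or -1 (enemies); a triad is balanced when the
# product of its three signs is +1. An edge flip is kept with a logistic
# probability that depends on how many imbalanced triads the flip removes,
# scaled by a "social temperature" parameter beta (illustrative choice).

def imbalanced_triads(sign, nodes):
    return sum(1 for a, b, c in itertools.combinations(nodes, 3)
               if sign[(a, b)] * sign[(a, c)] * sign[(b, c)] < 0)

def step(sign, nodes, beta, rng):
    i, j = sorted(rng.sample(nodes, 2))
    before = imbalanced_triads(sign, nodes)
    sign[(i, j)] *= -1
    after = imbalanced_triads(sign, nodes)
    # High beta => strong preference for balance; beta = 0 => random flips.
    if rng.random() >= 1.0 / (1.0 + math.exp(-beta * (before - after))):
        sign[(i, j)] *= -1  # revert the flip

rng = random.Random(0)
nodes = list(range(6))
sign = {(i, j): rng.choice([-1, 1]) for i, j in itertools.combinations(nodes, 2)}
start = imbalanced_triads(sign, nodes)
for _ in range(2000):
    step(sign, nodes, beta=5.0, rng=rng)
```

With high beta the chain settles near balanced configurations (all-friends, or two mutually hostile friendship blocs); lowering beta toward zero recovers an unstructured random signing, which is what makes the temperature parameter estimable from how balanced an observed network is.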
Nihonsugi, Tsuyoshi; Ihara, Aya; Haruno, Masahiko
2015-02-25
The intention behind another's action and the impact of the outcome are major determinants of human economic behavior. It is poorly understood, however, whether the two systems share a core neural computation. Here, we investigated whether the two systems are causally dissociable in the brain by integrating computational modeling, functional magnetic resonance imaging, and transcranial direct current stimulation experiments in a newly developed trust game task. We show not only that right dorsolateral prefrontal cortex (DLPFC) activity is correlated with intention-based economic decisions and that ventral striatum and amygdala activity are correlated with outcome-based decisions, but also that stimulation to the DLPFC selectively enhances intention-based decisions. These findings suggest that the right DLPFC is involved in the implementation of intention-based decisions in the processing of cooperative decisions. This causal dissociation of cortical and subcortical backgrounds may indicate evolutionary and developmental differences in the two decision systems.
Economic decision making and the application of nonparametric prediction models
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
2007-01-01
Sustained increases in energy prices have focused attention on gas resources in low permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are large. Planning and development decisions for extraction of such resources must be area-wide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm the decision to enter such plays depends on reconnaissance level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional scale cost functions. The context of the worked example is the Devonian Antrim shale gas play, Michigan Basin. One finding relates to selection of the resource prediction model to be used with economic models. Models which can best predict aggregate volume over larger areas (many hundreds of sites) may lose granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined by extraneous factors. The paper also shows that when these simple prediction models are used to strategically order drilling prospects, the gain in gas volume over volumes associated with simple random site selection amounts to 15 to 20 percent. It also discusses why the observed benefit of updating predictions from results of new drilling, as opposed to following static predictions, is somewhat smaller.
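A minimal instance of nonparametric local regression in this setting is a Nadaraya-Watson kernel smoother; the sketch below is generic, not the authors' specification, and the site coordinates, volumes, and bandwidth are invented.

```python
import math

# Minimal nonparametric predictor in the spirit of local regression (a
# generic Nadaraya-Watson kernel smoother, not the paper's exact model):
# the predicted recoverable volume at an untested site is a
# distance-weighted average of volumes observed at drilled sites.

def predict_volume(site, drilled, bandwidth=1.0):
    """drilled: list of ((x, y), volume) pairs; site: (x, y) location."""
    num = den = 0.0
    for (x, y), vol in drilled:
        d2 = (site[0] - x) ** 2 + (site[1] - y) ** 2
        w = math.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weight
        num += w * vol
        den += w
    return num / den

# Invented drilled-site data: locations and recovered volumes.
drilled = [((0.0, 0.0), 10.0), ((1.0, 0.0), 14.0), ((5.0, 5.0), 2.0)]
v = predict_volume((0.5, 0.0), drilled)
```

Because nearby sites dominate the weights, predictions at individual sites retain local detail; aggregating such predictions over a region, and ranking untested sites by predicted volume, is what feeds the cost functions and the strategic drilling order discussed above.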
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oladosu, Gbadebo A; Rose, Adam; Bumsoo, Lee
2013-01-01
The foot and mouth disease (FMD) virus has high agro-terrorism potential because it is contagious, can be easily transmitted via inanimate objects and can be spread by wind. An outbreak of FMD in developed countries results in massive slaughtering of animals (for disease control) and disruptions in meat supply chains and trade, with potentially large economic losses. Although the United States has been FMD-free since 1929, the potential of FMD as a deliberate terrorist weapon calls for estimates of the physical and economic damage that could result from an outbreak. This paper estimates the economic impacts of three alternative scenarios of potential FMD attacks using a computable general equilibrium (CGE) model of the US economy. The three scenarios range from a small outbreak successfully contained within a state to a large multi-state attack resulting in slaughtering of 30 percent of the national livestock. Overall, the value of total output losses in our simulations ranges between $37 billion (0.15% of 2006 baseline economic output) and $228 billion (0.92%). Major impacts stem from the supply constraint on livestock due to massive animal slaughtering. As expected, the economic losses are heavily concentrated in the agriculture and food manufacturing sectors, with losses ranging from $23 billion to $61 billion in the two industries.
Economic resilience through "One-Water" management
Hanson, Randall T.; Schmid, Wolfgang
2013-01-01
Disruption of water availability leads to food scarcity and loss of economic opportunity. Development of effective water-resource policies and management strategies could provide resilience to local economies in the face of water disruptions such as drought, flood, and climate change. To accomplish this, a detailed understanding of human water use and natural water-resource availability is needed. A hydrologic model is a computer software system that simulates the movement and use of water in a geographic area. It takes into account all components of the water cycle--“One Water”--and helps estimate water budgets for groundwater, surface water, and landscape features. The U.S. Geological Survey MODFLOW One-Water Integrated Hydrologic Model (MODFLOW-OWHM) software and scientific methods can provide water managers and political leaders with the hydrologic information they need to help ensure water security and economic resilience.
ERIC Educational Resources Information Center
Miller, John; Weil, Gordon
1986-01-01
The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)
Robustness of disaggregate oil and gas discovery forecasting models
Attanasi, E.D.; Schuenemeyer, J.H.
1989-01-01
The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals, or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by the information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive lognormal distribution associated with the creaming methods appears to understate the potential contribution of small fields.
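The "creaming" decline in discovery sizes follows directly from the assumption, shared by Arps-Roberts type models, that discovery probability is roughly proportional to field size. A deterministic illustration with invented field sizes:

```python
# Why discovery sizes "cream" downward: if the chance of finding a field
# is proportional to its size, the expected size of the FIRST discovery
# is the size-biased mean sum(s_i^2) / sum(s_i), which exceeds the
# arithmetic mean whenever sizes are unequal. Field sizes are invented.

def arithmetic_mean(sizes):
    return sum(sizes) / len(sizes)

def size_biased_mean(sizes):
    """Expected size of the first discovery under proportional-to-size sampling."""
    return sum(s * s for s in sizes) / sum(sizes)

sizes = [100.0, 40.0, 20.0, 10.0, 5.0, 5.0, 2.0, 1.0, 1.0, 1.0]
plain = arithmetic_mean(sizes)   # 18.5
first = size_biased_mean(sizes)  # ~65.7, far larger than the plain mean
```

As the large fields are removed from the pool, the size-biased mean of the remainder falls toward the many small fields, producing the declining "creaming curve" that these models extrapolate; it also hints at why a fitted lognormal tail can understate the eventual contribution of small fields.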
Economic Value Biases Uncertain Perceptual Choices in the Parietal and Prefrontal Cortices
Summerfield, Christopher; Koechlin, Etienne
2010-01-01
An observer detecting a noisy sensory signal is biased by the costs and benefits associated with its presence or absence. When these costs and benefits are asymmetric, sensory and economic information must be integrated to inform the final choice. However, it remains unknown how this information is combined at the neural or computational level. To address this question, we asked healthy human observers to judge the presence or absence of a noisy sensory signal under economic conditions that favored yes responses (liberal blocks), no responses (conservative blocks), or neither response (neutral blocks). Economic information biased fast choices more than slow choices, suggesting that value and sensory information are integrated early in the decision epoch. More formal simulation analyses using an Ornstein–Uhlenbeck process demonstrated that the influence of economic information was best captured by shifting the origin of evidence accumulation toward the more valuable bound. We then used the computational model to generate trial-by-trial estimates of decision-related evidence based on combined sensory and economic information (the decision variable, or DV), and regressed these against fMRI activity recorded whilst participants performed the task. Extrastriate visual regions responded to the level of sensory input (momentary evidence), but fMRI signals in the parietal and prefrontal cortices responded to the decision variable. These findings support recent single-neuron data suggesting that economic information biases decision-related signals in higher cortical regions. PMID:21267421
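The origin-shift mechanism identified above can be sketched with a bounded leaky accumulator; the parameters below are invented, and the process is simplified relative to the paper's fitted model.

```python
import random

# Sketch of the model comparison in the paper: a bounded, leaky
# (Ornstein-Uhlenbeck-style) accumulator in which economic value shifts
# the ORIGIN of evidence accumulation toward the more valuable bound.
# All parameter values are invented for illustration.

def run_trial(start, drift, rng, leak=0.02, noise=1.0, bound=3.0, dt=0.01):
    x = start
    for _ in range(3000):
        # Euler step of dx = (drift - leak*x) dt + noise dW.
        x += (drift - leak * x) * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        if abs(x) >= bound:
            return x > 0       # True: the "yes" bound was reached first
    return x > 0               # timeout: report the sign of the evidence

def p_yes(start, n=400, seed=1):
    rng = random.Random(seed)
    return sum(run_trial(start, 0.0, rng) for _ in range(n)) / n

neutral = p_yes(0.0)   # symmetric starting point (neutral block)
liberal = p_yes(1.5)   # origin shifted toward the "yes" bound (liberal block)
```

Shifting the start point changes fast choices most, because early bound crossings are dominated by the origin while late ones are dominated by accumulated noise, which matches the behavioral signature reported above.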
Water resources planning and management : A stochastic dual dynamic programming approach
NASA Astrophysics Data System (ADS)
Goor, Q.; Pinte, D.; Tilmant, A.
2008-12-01
Allocating water between different users and uses, including the environment, is one of the most challenging tasks facing water resources managers and has always been at the heart of Integrated Water Resources Management (IWRM). As water scarcity is expected to increase over time, allocation decisions among the different uses will have to be made taking into account the complex interactions between water and the economy. Hydro-economic optimization models can capture those interactions while prescribing efficient allocation policies. Many hydro-economic models found in the literature are formulated as large-scale nonlinear optimization problems (NLP), seeking to maximize net benefits from system operation while meeting operational and/or institutional constraints and describing the main hydrological processes. However, those models rarely incorporate the uncertainty inherent in the availability of water, essentially because of the computational difficulties associated with stochastic formulations. The purpose of this presentation is to describe a stochastic programming model that can identify economically efficient allocation policies in large-scale multipurpose multireservoir systems. The model is based on stochastic dual dynamic programming (SDDP), an extension of traditional SDP that is not affected by the curse of dimensionality. SDDP identifies efficient allocation policies while considering hydrologic uncertainty. The objective function includes the net benefits from the hydropower and irrigation sectors, as well as penalties for not meeting operational and/or institutional constraints. To implement the efficient decomposition scheme that removes the computational burden, the one-stage SDDP problem has to be a linear program. Recent developments improve the representation of the nonlinear and mildly non-convex hydropower function through a convex hull approximation of the true hydropower function.
The model is illustrated on a cascade of 14 reservoirs in the Nile River basin.
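For contrast with SDDP, a toy discrete stochastic dynamic program for a single reservoir (all numbers invented) makes the Bellman recursion explicit; discretizing storage this way is exactly what becomes intractable with many reservoirs, which motivates SDDP's cutting-plane approximation of the future-value function.

```python
# Toy stochastic dynamic program for one reservoir (invented numbers),
# showing the Bellman recursion that SDDP approximates with Benders cuts
# instead of state discretization. Storage and release are discretized
# here; with N reservoirs the state grid grows exponentially, which is
# the curse of dimensionality that SDDP avoids.

INFLOWS = [(0.3, 0), (0.4, 1), (0.3, 2)]  # (probability, inflow volume)
LEVELS = range(0, 5)                       # feasible discrete storage levels
T = 6                                      # number of planning stages

def benefit(release):
    """Concave one-stage benefit of released water (hydropower/irrigation)."""
    return [0.0, 10.0, 16.0, 19.0][min(release, 3)]

def solve():
    value = {s: 0.0 for s in LEVELS}       # terminal value function
    for _ in range(T):                     # backward recursion over stages
        new = {}
        for s in LEVELS:
            best = float("-inf")
            for release in range(0, s + 1):
                expected = 0.0
                for p, q in INFLOWS:
                    s_next = min(s - release + q, max(LEVELS))  # spill at top
                    expected += p * value[s_next]
                best = max(best, benefit(release) + expected)
            new[s] = best
        value = new
    return value

v = solve()  # expected benefit-to-go from each starting storage level
```

SDDP replaces the `value` table with a piecewise-linear lower approximation built from dual solutions of one-stage linear programs, so each stage problem stays an LP regardless of how many reservoirs the state contains.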
1981-03-12
agriculture, raw materials, energy sources, computers, lasers, space and aeronautics, high energy physics, and genetics. The four modernizations will be... accomplished and the strong socialist country that is born at the end of the century will be a keyhole for the promotion of science and technology... Process (FNP). Its purpose is to connect with the Kiautsu University computer (model 108) and then to connect a data terminal. This will make a
NASA Astrophysics Data System (ADS)
Plegnière, Sabrina; Casper, Markus; Hecker, Benjamin; Müller-Fürstenberger, Georg
2014-05-01
Many models that calculate and assess climate change and its consequences are based on annual means of temperature and precipitation. This approach leads to many uncertainties, especially at the regional or local level: the results are unrealistic or too coarse. Particularly in agriculture, single events and the distribution of precipitation and temperature during the growing season have an enormous influence on plant growth. Therefore, the temporal distribution of climate variables should not be ignored. To this end, a high-resolution ecological-economic model was developed that combines a complex plant growth model (STICS) with an economic model. In this context, the input data of the plant growth model are daily climate values for a specific climate station calculated by the statistical climate model WETTREG. The economic model is deduced from the results of the plant growth model STICS. The chosen plant is corn, because corn is widely cultivated and used in many different ways. First of all, a sensitivity analysis showed that the plant growth model STICS is suitable for calculating the influences of different cultivation methods and climate on plant growth and yield, as well as on soil fertility (e.g. through nitrate leaching), in a realistic way. Additional simulations helped to assess a production function that is the key element of the economic model. In doing so, the problems of using mean values of temperature and precipitation to compute a production function by linear regression are pointed out. Several examples show why a linear regression to assess a production function based on mean climate values or a smoothed natural distribution leads to imperfect results, and why it is not possible to deduce a unique climate factor in the production function. One solution to this problem is the additional consideration of stress indices that show the impairment of plants by water or nitrate shortage.
Thus, the resulting model takes into account not only the ecological factors (e.g. plant growth) and the economic factors as a simple monetary calculation, but also their mutual influences. Finally, the ecological-economic model enables us to make a risk assessment or evaluate adaptation strategies.
The Economics of Educational Software Portability.
ERIC Educational Resources Information Center
Oliveira, Joao Batista Araujo e
1990-01-01
Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…
Volume sharing of reservoir water
NASA Astrophysics Data System (ADS)
Dudley, Norman J.
1988-05-01
Previous models optimize short-, intermediate-, and long-run irrigation decision making in a simplified river valley system characterized by highly variable water supplies and demands for a single decision maker controlling both reservoir releases and farm water use. A major problem in relaxing the assumption of one decision maker is communicating the stochastic nature of supplies and demands between reservoir and farm managers. In this paper, an optimizing model is used to develop release rules for reservoir management when all users share equally in releases, and computer simulation is used to generate an historical time sequence of announced releases. These announced releases become a state variable in a farm management model which optimizes farm area-to-irrigate decisions through time. Such modeling envisages the use of growing area climatic data by the reservoir authority to gauge water demand and the transfer of water supply data from reservoir to farm managers via computer data files. Alternative model forms, including allocating water on a priority basis, are discussed briefly. Results show lower mean aggregate farm income and lower variance of aggregate farm income than in the single decision-maker case. This short-run economic efficiency loss coupled with likely long-run economic efficiency losses due to the attenuated nature of property rights indicates the need for quite different ways of integrating reservoir and farm management.
COMPUTER SIMULATOR (BEST) FOR DESIGNING SULFATE-REDUCING BACTERIA FIELD BIOREACTORS
BEST (bioreactor economics, size and time of operation) is a spreadsheet-based model that is used in conjunction with public domain software, PhreeqcI. BEST is used in the design process of sulfate-reducing bacteria (SRB) field bioreactors to passively treat acid mine drainage (A...
Molinos-Senante, María; Farías, Rodrigo
2018-06-04
The privatization of water and sewerage services (WSS) has led to the foundation of water economic groups, which integrate several water companies and have gained notable importance at the global level. Within the framework of benchmarking studies, no prior work has explored the impact that economic groups have on the efficiency and quality of service provided by water companies. This study investigates, for the first time, whether the membership of water companies in an economic group influences their performance. Quantity- and quality-adjusted efficiency scores were computed using data envelopment analysis models. An empirical application was developed for the Chilean water industry, since most of its water companies are private and belong to an economic group. The results show that independent water companies provide WSS with better quality than do water companies that belong to an economic group. From a statistical point of view, it was evident that membership in an economic group impacts both the quantity- and quality-adjusted efficiency scores of water companies. The results of this study illustrate that applying the model-firm regulation to the Chilean water industry has significant drawbacks that should be addressed by the water regulator to promote the long-term sustainability of the water industry.
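The efficiency scores mentioned above come from data envelopment analysis. In the special case of a single input and a single output, the CCR score reduces to a simple productivity ratio, which the sketch below uses with invented company data; the study's real models handle multiple inputs and outputs (including quality indicators as extra outputs) via linear programs:

```python
def dea_ccr_single(inputs, outputs):
    """CCR efficiency with one input and one output reduces to a ratio test:
    each unit's productivity divided by the best observed productivity."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# four hypothetical water companies: operating cost (input), volume delivered (output)
cost = [100.0, 80.0, 120.0, 90.0]
volume = [500.0, 480.0, 540.0, 360.0]
scores = dea_ccr_single(cost, volume)   # company 2 defines the frontier
```

A score of 1.0 marks a company on the efficient frontier; scores below 1.0 measure how far a company's productivity falls short of the best peer.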
Introduction to Computers for Home Economics Teachers.
ERIC Educational Resources Information Center
Thompson, Cecelia; And Others
Written in simple language and designed in a large-print format, this short guide is aimed at teaching home economics teachers to use computers in their classrooms. The guide is organized in six sections. The first section covers the basics of computer equipment and explains how computers work while the second section outlines how to use…
A neuro-computational model of economic decisions.
Rustichini, Aldo; Padoa-Schioppa, Camillo
2015-09-01
Neuronal recordings and lesion studies indicate that key aspects of economic decisions take place in the orbitofrontal cortex (OFC). Previous work identified in this area three groups of neurons encoding the offer value, the chosen value, and the identity of the chosen good. An important and open question is whether and how decisions could emerge from a neural circuit formed by these three populations. Here we adapted a biophysically realistic neural network previously proposed for perceptual decisions (Wang XJ. Neuron 36: 955-968, 2002; Wong KF, Wang XJ. J Neurosci 26: 1314-1328, 2006). The domain of economic decisions is significantly broader than that for which the model was originally designed, yet the model performed remarkably well. The input and output nodes of the network were naturally mapped onto two groups of cells in OFC. Surprisingly, the activity of interneurons in the network closely resembled that of the third group of cells, namely, chosen value cells. The model reproduced several phenomena related to the neuronal origins of choice variability. It also generated testable predictions on the excitatory/inhibitory nature of different neuronal populations and on their connectivity. Some aspects of the empirical data were not reproduced, but simple extensions of the model could overcome these limitations. These results render a biologically credible model for the neuronal mechanisms of economic decisions. They demonstrate that choices could emerge from the activity of cells in the OFC, suggesting that chosen value cells directly participate in the decision process. Importantly, Wang's model provides a platform to investigate the implications of neuroscience results for economic theory. Copyright © 2015 the American Physiological Society.
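The winner-take-all attractor dynamics described above can be caricatured with a two-pool rate model combining self-excitation and cross-inhibition. The parameters, the rate cap, and the linear rectified transfer function below are all invented for illustration; this is not the biophysically realistic Wang (2002) network:

```python
def decide(offer_a, offer_b, steps=2000, dt=0.001, tau=0.02):
    """Two pools receive offer-value input; recurrent self-excitation plus
    cross-inhibition drives the network into a winner-take-all state."""
    ra = rb = 0.0
    for _ in range(steps):
        ia = offer_a + 2.0 * ra - 3.0 * rb            # net input to pool A
        ib = offer_b + 2.0 * rb - 3.0 * ra            # net input to pool B
        ra += dt / tau * (-ra + min(100.0, max(0.0, ia)))  # rectified, capped rate
        rb += dt / tau * (-rb + min(100.0, max(0.0, ib)))
    return "A" if ra > rb else "B"
```

With deterministic dynamics the pool receiving the larger offer value always wins; adding noise to the inputs would reproduce the choice variability the paper discusses.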
Boundary formulations for sensitivity analysis without matrix derivatives
NASA Technical Reports Server (NTRS)
Kane, J. H.; Guru Prasad, K.
1993-01-01
A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.
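The iterative reanalysis idea, reusing the solve of the unperturbed system rather than factoring the perturbed matrix, can be sketched on a tiny 2x2 system. The matrices are invented, and `solve2` stands in for the already-factored BEA solve:

```python
def solve2(A, b):
    """Direct solve of a 2x2 system (stands in for the factored BEA solve)."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [( A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (-A[1][0] * b[0] + A[0][0] * b[1]) / det]

def reanalysis(A, dA, b, iters=50):
    """Solve (A + dA) x = b without factoring A + dA, via the
    fixed-point iteration x <- A^{-1} (b - dA x)."""
    x = solve2(A, b)                    # unperturbed solution as starting point
    for _ in range(iters):
        rhs = [b[i] - sum(dA[i][j] * x[j] for j in range(2)) for i in range(2)]
        x = solve2(A, rhs)              # reuses the same "factorization" of A
    return x

x_approx = reanalysis([[4.0, 1.0], [1.0, 3.0]],
                      [[0.2, 0.0], [0.0, 0.1]],
                      [1.0, 2.0])
```

The iteration converges whenever the perturbation is small relative to A, which is exactly the univariate-perturbation regime of the UPFD sensitivity step.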
PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process
Henry Spelter
1990-01-01
This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...
André, Francisco J; Cardenete, M Alejandro; Romero, Carlos
2009-05-01
Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and economic goals in different ways: maximum aggregate performance, maximum balance, and a lexicographic hierarchy of the goals.
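A minimal sketch of the weighted goal-programming logic follows, with one invented policy lever and two goals standing in for the article's macroeconomic and environmental targets; the real application evaluates outcomes inside a CGE model rather than with a toy `predict` function:

```python
def weighted_gp(goals, predict, levers):
    """Pick the lever value minimizing the weighted sum of unwanted deviations."""
    best = None
    for x in levers:
        outcome = predict(x)
        penalty = sum(w * max(0.0, sign * (outcome[k] - target))
                      for k, (target, w, sign) in goals.items())
        if best is None or penalty < best[0]:
            best = (penalty, x)
    return best[1]

# invented toy economy: one lever (a carbon-tax rate) lowers growth and emissions
def predict(tax):
    return {"growth": 4.0 - 3.0 * tax, "emissions": 10.0 - 8.0 * tax}

goals = {"growth":    (3.0, 1.0, -1),   # penalize growth falling below 3
         "emissions": (4.0, 1.0, +1)}   # penalize emissions exceeding 4
best_tax = weighted_gp(goals, predict, [i / 20 for i in range(21)])
```

Changing the weights, or replacing the weighted sum with a max (balance) or a lexicographic order, reproduces the different aggregation variants the article compares.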
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown through application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently many different applications in engineering areas are of interest.
The Society-Deciders Model and Fairness in Nations
NASA Astrophysics Data System (ADS)
Flomenbom, Ophir
2015-05-01
Modeling the dynamics in nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can predict and therefore help all the world's citizens. Yet recent years have shown that current models fall short. Here, we develop a dynamical society-deciders model that can explain the stability of a nation, based on concepts from dynamics, ecology and socio-econo-physics; a nation has two interconnected groups, the deciders and the society. We show that a nation is either stable or it collapses. This depends on just two coefficients, which we relate to sociological and economic indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability of a nation and how probable a change favoring the society is. We compute fairness for all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not with increased governmental spending or cuts, but with regulations that reduce the stability of the deciders, namely by increasing fairness, for example by shifting wealth in the direction of the people, thereby further increasing opportunities.
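Since stability in a two-group coupled system hinges on a small number of coefficients, the underlying linear-stability logic can be sketched generically; the coefficients below are abstract damping and coupling terms, not the paper's actual sociological mapping:

```python
def is_stable(a, b, c, d):
    """Stability of the linear two-group system x' = [[a, b], [c, d]] x
    (x = society and deciders deviations): both eigenvalues have negative
    real part iff trace < 0 and determinant > 0."""
    return (a + d) < 0 and (a * d - b * c) > 0
```

With weak cross-coupling the nation-like system returns to equilibrium; strong cross-coupling flips the determinant's sign and the system collapses, mirroring the stable-or-collapse dichotomy above.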
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowery, P.S.; Lessor, D.L.
Waste glass melter and in situ vitrification (ISV) processes combine electrical, thermal, and fluid flow phenomena to produce a stable waste-form product. Computational modeling of the thermal and fluid flow aspects of these processes provides a useful tool for assessing the potential performance of proposed system designs. These computations can be performed at a fraction of the cost of experiment. Consequently, computational modeling of vitrification systems can also provide an economical means for assessing the suitability of a proposed process application. The computational model described in this paper employs finite difference representations of the basic continuum conservation laws governing the thermal, fluid flow, and electrical aspects of the vitrification process, i.e., conservation of mass, momentum, energy, and electrical charge. The resulting code is a member of the TEMPEST family of codes developed at the Pacific Northwest Laboratory (operated by Battelle for the US Department of Energy). This paper provides an overview of the numerical approach employed in TEMPEST. In addition, results from several TEMPEST simulations of sample waste glass melter and ISV processes are provided to illustrate the insights to be gained from computational modeling of these processes. 3 refs., 13 figs.
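As a one-dimensional caricature of the finite-difference treatment of the energy conservation law, the sketch below advances an explicit heat-conduction update on a toy melt slab with hot boundaries; all values are invented, and this is not TEMPEST code:

```python
def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference update of dT/dt = alpha * d2T/dx2
    with fixed end temperatures (Dirichlet boundaries)."""
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    return new

# toy slab: cold interior, hot electrodes at both ends
T = [1000.0] + [20.0] * 8 + [1000.0]
for _ in range(200):
    # alpha*dt/dx^2 = 0.1, below the 0.5 explicit stability limit
    T = heat_step(T, alpha=1e-5, dx=0.01, dt=1.0)
```

The same update pattern, applied to mass, momentum, and charge as well, and in three dimensions, is the structure of a TEMPEST-style conservation-law solver.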
Statistical, economic and other tools for assessing natural aggregate
Bliss, J.D.; Moyle, P.R.; Bolm, K.S.
2003-01-01
Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers, and those who make decisions about land allocation, which may have long-term implications concerning the cost and availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classifications and statistical distributions or, in other words, models with geological, geotechnical and economic variables or interrelationships between these variables. These models can be used with subjective estimates, if needed, to estimate how much aggregate may be present in a region or country using distributions generated by Monte Carlo computer simulations.
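The Monte Carlo step described above can be sketched as follows; the deposit-count range and lognormal tonnage parameters are illustrative placeholders, not values from the assessment:

```python
import random

random.seed(42)

def simulate_aggregate(n_trials=10000):
    """Estimate total aggregate tonnage in a region by sampling the number
    of deposits and the tonnage of each from assumed distributions,
    then reading off quantiles of the simulated totals."""
    totals = []
    for _ in range(n_trials):
        n_deposits = random.randint(2, 8)              # subjective count estimate
        total = sum(random.lognormvariate(2.0, 0.8)    # tonnage per deposit
                    for _ in range(n_deposits))
        totals.append(total)
    totals.sort()
    return {"p10": totals[n_trials // 10],
            "p50": totals[n_trials // 2],
            "p90": totals[9 * n_trials // 10]}

result = simulate_aggregate()
```

Reporting the spread between quantiles, rather than a single number, is what lets the assessment carry subjective uncertainty through to the final estimate.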
Strategy and gaps for modeling, simulation, and control of hybrid systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob
2015-04-01
The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and the output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources needed are then established, including physical models where needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. This report first attempts to describe the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize grid and hybrid system behavior and market interactions. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for the evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of technical and economic performance are then discussed. This report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) system control modules.
Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability, and maintainability of simulation tools currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.
Study of short haul high-density V/STOL transportation systems. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Solomon, H. L.
1972-01-01
Essential supporting data to the short haul transportation study are presented. The specific appendices are arena characteristics, aerospace transportation analysis computer program, economics, model calibration, STOLport siting and services path selection, STOL schedule definition, tabulated California corridor results, and tabulated Midwest arena results.
21st century environmental problems are wicked and require holistic systems thinking and solutions that integrate social and economic knowledge with knowledge of the environment. Computer-based technologies are fundamental to our ability to research and understand the relevant sy...
ERIC Educational Resources Information Center
Kim, H. S.; Dixon, James P.
1993-01-01
Examines the lack of interdisciplinary communication in environmental education programs in U.S. graduate schools. Following comparative historical reviews of environmental protection activities, presents a computer-developed curriculum model base containing 15 subject areas: philosophy, politics, economics, architecture, sociology, biology,…
ERIC Educational Resources Information Center
Rice, Patricia Brisotti
2012-01-01
As the basis of a society undergoes a fundamental change, such as progression from the industrial age to the knowledge/information age, the massive change affects every aspect of life. Change causes stress in individuals that often manifest itself as anxiety. Using an economic model of the endogenous growth, which includes technology as input,…
Visualization of the Eastern Renewable Generation Integration Study: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron
The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.
Science literacy by technology by country: USA, Finland and Mexico. making sense of it all
NASA Astrophysics Data System (ADS)
Papanastasiou, Elena C.
2003-02-01
The purpose of this study was to examine how variables related to computer availability, computer comfort and educational software are associated with higher or lower levels of science literacy in the USA, Finland and Mexico, after controlling for the socio-economic status of the students. The analyses for this study were based on a series of multivariate regression models. The data were obtained from the Program for International Student Assessment. The results of this study showed that it was not computer use itself that had a positive or negative effect on the science achievement of the students, but the way in which the computers were used within the context of each country.
NASA Astrophysics Data System (ADS)
Gennett, Zachary Andrew
Millennial Generation students bring significant learning and teaching challenges to the classroom, because of their unique learning styles, breadth of interests related to social and environmental issues, and intimate experiences with technology. As a result, there has been an increased willingness at many universities to experiment with pedagogical strategies that depart from a traditional "learning by listening" model, and move toward more innovative methods involving active learning through computer games. In particular, current students typically express a strong interest in sustainability in which economic concerns must be weighed relative to environmental and social responsibilities. A game-based setting could prove very effective for fostering an operational understanding of these tradeoffs, and especially the social dimension which remains largely underdeveloped relative to the economic and environmental aspects. Through an examination of the educational potential of computer games, this study hypothesizes that to acquire the skills necessary to manage and understand the complexities of sustainability, Millennial Generation students must be engaged in active learning exercises that present dynamic problems and foster a high level of social interaction. This has led to the development of an educational computer game, entitled Shortfall, which simulates a business milieu for testing alternative paths regarding the principles of sustainability. This study examines the evolution of Shortfall from an educational board game that teaches the principles of environmentally benign manufacturing, to a completely networked computer game, entitled Shortfall Online that teaches the principles of sustainability. A capital-based theory of sustainability is adopted to more accurately convey the tradeoffs and opportunity costs among economic prosperity, environmental preservation, and societal responsibilities. 
While the economic and environmental aspects of sustainability have received considerable attention in traditional pedagogical approaches, specific focus is given to the social dimension of sustainability, as it has remained largely underdeveloped. To measure social sustainability and provide students with an understanding of its significance, a prospective metric utilizing a social capital peer-evaluation survey, unique to Shortfall, is developed.
Using Computers in Undergraduate Economics Courses.
ERIC Educational Resources Information Center
Barr, Saul Z.; Harmon, Oscar
Seven computer assignments for undergraduate economics students that concentrate on building a foundation for programming higher level mathematical calculations are described. The purpose of each assignment, the computer program for it, and the correct answers are provided. "Introduction to Text Editing" acquaints the student with some…
Tableau Economique: Teaching Economics with a Tablet Computer
ERIC Educational Resources Information Center
Scott, Robert H., III
2011-01-01
The typical method of instruction in economics is chalk and talk. Economics courses often require writing equations and drawing graphs and charts, which are all best done in freehand. Unlike static PowerPoint presentations, tablet computers create dynamic nonlinear presentations. Wireless technology allows professors to write on their tablets and…
Herd-Level Mastitis-Associated Costs on Canadian Dairy Farms
Aghamohammadi, Mahjoob; Haine, Denis; Kelton, David F.; Barkema, Herman W.; Hogeveen, Henk; Keefe, Gregory P.; Dufour, Simon
2018-01-01
Mastitis imposes considerable and recurring economic losses on the dairy industry worldwide. The main objective of this study was to estimate herd-level costs incurred by expenditures and production losses associated with mastitis on Canadian dairy farms in 2015, based on producer reports. Previously published mastitis economic frameworks were used to develop an economic model with the most important cost components. Components investigated were divided between clinical mastitis (CM), subclinical mastitis (SCM), and other cost components (i.e., preventive measures and product quality). A questionnaire was mailed to 374 dairy producers randomly selected from the Canadian National Dairy Study 2015 to collect data on these cost components, and 145 dairy producers returned a completed questionnaire. For each herd, costs due to the different mastitis-related components were computed by applying the values reported by the dairy producer to the developed economic model. Then, for each herd, the proportion of costs attributable to a specific component was computed by dividing the absolute costs for this component by the total herd mastitis-related costs. Median self-reported CM incidence was 19 cases/100 cow-years and mean self-reported bulk milk somatic cell count was 184,000 cells/mL. Most producers reported using post-milking teat disinfection (97%) and dry cow therapy (93%), and a substantial proportion of producers reported using pre-milking teat disinfection (79%) and wearing gloves during milking (77%). Mastitis costs were substantial (662 CAD per milking cow per year for a typical Canadian dairy farm), with a large portion of the costs (48%) attributed to SCM, and 34 and 15% due to CM and the implementation of preventive measures, respectively. For SCM, the two most important cost components were the subsequent milk yield reduction and culling (72 and 25% of SCM costs, respectively).
For CM, first, second, and third most important cost components were culling (48% of CM costs), milk yield reduction following the CM events (34%), and discarded milk (11%), respectively. This study is the first since 1990 to investigate costs of mastitis in Canada. The model developed in the current study can be used to compute mastitis costs at the herd and national level in Canada. PMID:29868620
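The per-component proportions described above amount to a simple normalization of absolute costs. A sketch with invented herd numbers (chosen so the subclinical share lands near the reported 48%):

```python
def mastitis_costs(components):
    """Given absolute annual costs per component (CAD), return the herd total
    and each component's share of total mastitis-related costs."""
    total = sum(components.values())
    return total, {k: v / total for k, v in components.items()}

# hypothetical herd, not data from the study
herd = {"subclinical": 31800.0, "clinical": 22500.0,
        "prevention": 9900.0, "product_quality": 2000.0}
total, shares = mastitis_costs(herd)
```

Summing such shares across surveyed herds is what lets the model report component breakdowns at the herd and national level.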
NASA Astrophysics Data System (ADS)
Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh
2013-04-01
Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At the moment, few deterministic models exist that can do all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass, and from the inability of models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply it to other sites, since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, intense rainfall, and a combination of both (spatial and temporal). The run-out module of the model treats the flow as a 2-D continuum medium, solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment; it is computationally efficient and transparent (understandable and comprehensible) for the end user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity.
The final outcome is an estimate of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No 265138, New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
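The fragility-curve step that converts run-out intensity into per-building loss can be sketched as follows; the piecewise-linear curve shape, thresholds, and building value are invented placeholders, not the model's calibrated curves:

```python
def fragility(intensity):
    """Mean damage ratio as a bounded, increasing function of flow intensity:
    no damage below 0.5, total damage above 2.5 (invented thresholds)."""
    return min(1.0, max(0.0, (intensity - 0.5) / 2.0))

def building_loss(intensity, value):
    """Expected economic loss for one building given the local run-out intensity."""
    return fragility(intensity) * value

# three hypothetical buildings of equal value under increasing intensity
losses = [building_loss(i, 200000.0) for i in (0.2, 1.5, 3.5)]
```

Summing such losses over every exposed building, for each Monte Carlo realisation, yields the distribution of total economic risk.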
Realization of planning design of mechanical manufacturing system by Petri net simulation model
NASA Astrophysics Data System (ADS)
Wu, Yanfang; Wan, Xin; Shi, Weixiang
1991-09-01
Planning design works out an overall long-term plan. To guarantee that a mechanical manufacturing system (MMS) is designed to obtain maximum economic benefit, it is necessary to carry out a reasonable planning design for the system. First, some principles of planning design for MMS are introduced, and problems of production scheduling and their decision rules for computer simulation are presented; methods for realizing each production scheduling decision rule in a Petri net model are discussed. Second, rules for resolving conflicts that arise while running the Petri net are given. Third, based on the Petri net model of the MMS, which includes part flow and tool flow, and according to the principle of minimum event time advance, a computer dynamic simulation of the Petri net model, that is, a computer dynamic simulation of the MMS, is realized. Finally, the simulation program is applied to an example, so that a planning design scheme for an MMS can be evaluated effectively.
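The minimum-event-time-advance principle can be illustrated with a toy scheduler. The FIFO dispatching rule and the two-machine shop below are hypothetical stand-ins for the paper's Petri net model, not its actual structure:

```python
import heapq

def simulate(jobs, n_machines=2):
    # Events are (time, machine); the clock always jumps to the earliest
    # pending event -- the "minimum event time advance" principle.
    events = [(0.0, m) for m in range(n_machines)]
    heapq.heapify(events)
    finish_times = []
    for duration in jobs:          # FIFO dispatching as a simple decision rule
        clock, machine = heapq.heappop(events)
        done = clock + duration
        finish_times.append(done)
        heapq.heappush(events, (done, machine))
    return max(finish_times)       # makespan of the schedule

print(simulate([3.0, 2.0, 4.0, 1.0]))  # -> 6.0
```

Replacing the FIFO rule with other dispatching decision rules changes only the order in which jobs are drawn.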
GASP- General Aviation Synthesis Program. Volume 1: Main program. Part 1: Theoretical development
NASA Technical Reports Server (NTRS)
Hague, D.
1978-01-01
The General Aviation Synthesis Program (GASP) performs tasks generally associated with aircraft preliminary design and gives an analyst the capability of performing parametric studies rapidly. GASP emphasizes small fixed-wing aircraft employing propulsion systems varying from a single piston engine with fixed-pitch propeller through twin turboprop/turbofan-powered business or transport type aircraft. The program, which may be operated from a computer terminal in either batch or interactive graphics mode, comprises modules representing the various technical disciplines, integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedure. The model is a useful tool for comparing configurations, assessing aircraft performance and economics, performing tradeoff and sensitivity studies, and assessing the impact of advanced technologies on aircraft performance and economics.
NASA Astrophysics Data System (ADS)
Kalyanapu, A. J.; Thames, B. A.
2013-12-01
Dam breach modeling often involves models that are sophisticated yet computationally intensive, computing flood propagation at high temporal and spatial resolutions. This creates a significant need for computational capacity and motivates the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in risk assessments. The four methodologies are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three modeling components were used. First, dam breach discharge hydrographs were developed using HEC-RAS v.4.1. These hydrographs were then provided as flow inputs to a two-dimensional flood model, Flood2D-GPU, which leverages the computer's graphics card for much improved computational performance.
Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, were input into HEC-FIA, which provides the consequence assessment for the solution to the problem statement. For the four breach methodologies, a sensitivity analysis of four breach parameters, breach side slope (SS), breach width (Wb), breach invert elevation (Elb), and time of failure (tf), was conducted. In total, 68 simulations were computed to produce breach hydrographs in HEC-RAS for input into Flood2D-GPU. The Flood2D-GPU simulation results were then post-processed in HEC-FIA to evaluate: Total Population at Risk (PAR), 14-yr and Under PAR (PAR14-), 65-yr and Over PAR (PAR65+), Loss of Life (LOL), and Direct Economic Impact (DEI). The MLM approach resulted in wide variability in simulated minimum and maximum values of PAR, PAR65+ and LOL estimates. For PAR14- and DEI, Froehlich (1995) resulted in lower values while MLM resulted in higher estimates. This preliminary study demonstrated the relative performance of four commonly used dam breach methodologies and their impacts on consequence estimation.
Mars Colony in situ resource utilization: An integrated architecture and economics model
NASA Astrophysics Data System (ADS)
Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff
2017-09-01
This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
Visualization of logistic algorithm in Wilson model
NASA Astrophysics Data System (ADS)
Glushchenko, A. S.; Rodin, V. A.; Sinegubov, S. V.
2018-05-01
The economic order quantity (EOQ), defined by Wilson's model, is widely used at different stages of the production and distribution of various products. It is useful for inventory management decisions, enabling more efficient business operation and thus greater economic benefit. There is a large amount of reference material, and extensive computer shells exist that help solve various logistics problems. However, the use of large computer environments is not always justified and requires special user training. A tense supply schedule in a logistics model is optimal if and only if the planning horizon coincides with the beginning of the next possible delivery. For all other possible planning horizons, this plan is not optimal. Significantly, when the planning horizon changes, the plan changes immediately throughout the entire supply chain. In this paper, an algorithm and a program for visualizing models of the optimal supply quantity and the number of supplies, depending on the length of the planning horizon, have been developed. The program allows one to trace, visually and quickly, all the main parameters of the optimal plan on charts. The results represent a part of the authors' research in the field of optimization of protection and support services of ports in the Russian North.
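Wilson's model gives the EOQ in closed form as Q* = sqrt(2DK/h), where D is annual demand, K the fixed cost per order, and h the holding cost per unit per year. A minimal computation, with illustrative demand and cost figures, is:

```python
import math

def eoq(demand, order_cost, holding_cost):
    # Wilson's economic order quantity: Q* = sqrt(2 * D * K / h)
    return math.sqrt(2.0 * demand * order_cost / holding_cost)

def total_cost(q, demand, order_cost, holding_cost):
    # Annual ordering cost + annual holding cost
    return demand / q * order_cost + q / 2.0 * holding_cost

D, K, h = 1000.0, 50.0, 4.0   # units/yr, cost per order, holding cost/unit/yr
q_star = eoq(D, K, h)
print(q_star)                        # about 158.1 units per order
print(total_cost(q_star, D, K, h))   # minimum: equals sqrt(2 * D * K * h)
```

At Q* the ordering and holding components are equal, which is why the minimum total cost simplifies to sqrt(2DKh).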
Computer simulation models of pre-diabetes populations: a systematic review protocol
Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha
2017-01-01
Introduction Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, assess their quality, and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. Methods and analysis A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status.
Ethics and dissemination This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. Review registration number CRD42016047228. PMID:28982807
Computer versus Paper Testing in Precollege Economics
ERIC Educational Resources Information Center
Butters, Roger B.; Walstad, William B.
2011-01-01
Interest is growing at the precollege level in computer testing (CT) instead of paper-and-pencil testing (PT) for subjects in the school curriculum, including economics. Before economic educators adopt CT, a better understanding of its likely effects on test-taking behavior and performance compared with PT is needed. Using two volunteer student…
Technological Change in Assessing Economics: A Cautionary Welcome
ERIC Educational Resources Information Center
Kennelly, Brendan; Considine, John; Flannery, Darragh
2009-01-01
The use of computer-based automated assignment systems in economics has expanded significantly in recent years. The most widely used system is Aplia which was developed by Paul Romer in 2000. Aplia is a computer application designed to replace traditional paper-based assignments in economics. The main features of Aplia are: (1) interactive content…
Confidence bands for measured economically optimal nitrogen rates
USDA-ARS?s Scientific Manuscript database
While numerous researchers have computed economically optimal N rate (EONR) values from measured yield – N rate data, nearly all have neglected to compute or estimate the statistical reliability of these EONR values. In this study, a simple method for computing EONR and its confidence bands is descr...
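Assuming the usual quadratic yield response, the EONR is the nitrogen rate at which the marginal value of additional yield equals the price of additional nitrogen. A sketch with hypothetical response coefficients and prices:

```python
def eonr(b1, b2, price_n, price_grain):
    # Quadratic yield response Y = b0 + b1*N + b2*N**2 (b2 < 0).
    # Economic optimum where marginal yield value equals N cost:
    # dY/dN = b1 + 2*b2*N = price_n / price_grain
    return (price_n / price_grain - b1) / (2.0 * b2)

# Hypothetical response coefficients and prices (per kg N, per t grain)
print(eonr(b1=0.05, b2=-0.0002, price_n=0.9, price_grain=180.0))  # -> 112.5
```

Confidence bands on EONR then follow from the sampling variances of b1 and b2, which is what makes reporting the statistical reliability feasible.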
Electric Composition Cost Comparison.
ERIC Educational Resources Information Center
Joint Committee on Printing, Washington, DC.
Experience of the U.S. Government Printing Office and others has shown that electronic composition of computer processed data is more economical than printing from camera copy produced by the line printers of digital computers. But electronic composition of data not already being processed by computer is not necessarily economical. This analysis…
Discrete event simulation: the preferred technique for health economic evaluations?
Caro, Jaime J; Möller, Jörgen; Getsios, Denis
2010-12-01
To argue that discrete event simulation should be preferred to cohort Markov models for economic evaluations in health care. The basis for the modeling techniques is reviewed. For many health-care decisions, existing data are insufficient to fully inform them, necessitating the use of modeling to estimate the consequences that are relevant to decision-makers. These models must reflect what is known about the problem at a level of detail sufficient to inform the questions. Oversimplification will result in estimates that are not only inaccurate, but potentially misleading. Markov cohort models, though currently popular, have so many limitations and inherent assumptions that they are inadequate to inform most health-care decisions. An event-based individual simulation offers an alternative much better suited to the problem. A properly designed discrete event simulation provides more accurate, relevant estimates without being computationally prohibitive. It does require more data and may be a challenge to convey transparently, but these are necessary trade-offs to provide meaningful and valid results. In our opinion, discrete event simulation should be the preferred technique for health economic evaluations today. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).
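A minimal example of the event-based individual simulation argued for here, with invented transition rates, utilities, and costs, could look like this. Time jumps from sampled event to sampled event rather than advancing in fixed Markov cycles:

```python
import random

random.seed(7)

def simulate_patient(horizon=10.0):
    # Hypothetical rates: stable -> progressed at 0.2/yr; death at
    # 0.05/yr (stable) or 0.3/yr (progressed). Utilities and costs
    # per state are likewise invented for illustration.
    t, qalys, cost, state = 0.0, 0.0, 0.0, "stable"
    while t < horizon and state != "dead":
        t_prog = random.expovariate(0.2) if state == "stable" else float("inf")
        t_death = random.expovariate(0.05 if state == "stable" else 0.3)
        dt = min(t_prog, t_death, horizon - t)
        qalys += dt * (0.8 if state == "stable" else 0.5)     # utility weights
        cost += dt * (1000.0 if state == "stable" else 5000.0)
        t += dt
        if t < horizon:
            state = "progressed" if t_prog < t_death else "dead"
    return qalys, cost

results = [simulate_patient() for _ in range(5000)]
print(sum(q for q, _ in results) / len(results))   # mean QALYs per patient
```

Because each patient carries a continuous event history, adding attributes (age, risk factors, treatment effects) does not require multiplying Markov states.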
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, La Tonya Nicole; Malczynski, Leonard A.
DYNAMO is a computer program for building and running 'continuous' simulation models. It was developed by the Industrial Dynamics Group at the Massachusetts Institute of Technology for simulating dynamic feedback models of business, economic, and social systems. The history of the system dynamics method since 1957 includes many classic models built in DYNAMO. DYNAMO was not supplanted until the late 1980s, when software was built to take advantage of the rise of personal computers and graphical user interfaces. There is much learning and insight to be gained from examining the DYNAMO models and their accompanying research papers. We believe that it is a worthwhile exercise to convert DYNAMO models to more recent software packages. We have made an attempt to make it easier to turn these models into a more current system dynamics software language, Powersim Studio, produced by Powersim AS of Bergen, Norway. This guide shows how to convert DYNAMO syntax into Studio syntax.
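Much of any such conversion reduces to mapping DYNAMO's level (L) and rate (R) equations onto explicit Euler updates. A schematic example (the population model shown is illustrative, not one of the classic DYNAMO models):

```python
# A DYNAMO level/rate pair such as
#   L POP.K = POP.J + DT * (BIRTHS.JK - DEATHS.JK)
#   R BIRTHS.KL = POP.K * BIRTH_RATE
# maps onto an explicit Euler update in any modern language:

def run(pop=100.0, birth_rate=0.03, death_rate=0.01, dt=0.25, years=50):
    steps = int(years / dt)
    for _ in range(steps):
        births = pop * birth_rate           # rate equations (R)
        deaths = pop * death_rate
        pop = pop + dt * (births - deaths)  # level equation (L)
    return pop

print(run())  # approaches 100 * exp(0.02 * 50) ~ 271.8 as dt -> 0
```

Auxiliary (A) equations become ordinary intermediate expressions, so the translation to a modern system dynamics tool is largely mechanical.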
NASA Astrophysics Data System (ADS)
Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.
2013-11-01
Disaster damages have negative effects on the economy, whereas reconstruction investments have positive effects. The aim of this study is to model the economic consequences of disasters and of recovery, including the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and thereby avoid double counting. To factor both shocks into the CGE model, direct loss is set as the amount of capital stock removed on the supply side of the economy; a portion of investment restores the capital stock in each period; an investment-driven dynamic model is formulated from the available reconstruction data, and the rest of the country's saving is set as an endogenous variable. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 is found to be closer to the real data than that from S2. S2 overestimates economic loss, yielding roughly twice the loss estimated under S1. The gap in economic aggregate between S1 and S0 is reduced to 3% in 2011, a level that would take another four years to reach under S2.
NASA Technical Reports Server (NTRS)
Jenkins, Jerald M.
1987-01-01
Temperature, thermal stresses, and residual creep stresses were studied by comparing laboratory values measured on a built-up titanium structure with values calculated from finite-element models. Several such models were used to examine the relationship between computational thermal stresses and thermal stresses measured on a built-up structure. Element suitability, element density, and computational temperature discrepancies were studied to determine their impact on measured and calculated thermal stress. The optimum number of elements is established from a balance between element density and suitable safety margins, such that the answer is acceptably safe yet is economical from a computational viewpoint. It is noted that situations exist where relatively small excursions of calculated temperatures from measured values result in far more than proportional increases in thermal stress values. Measured residual stresses due to creep significantly exceeded the values computed by the piecewise linear elastic strain analogy approach. The most important element in the computation is the correct definition of the creep law. Computational methodology advances in predicting residual stresses due to creep require significantly more viscoelastic material characterization.
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Srivastava, Akanksha
2013-01-01
This paper presents a survey of innovative and effective computational techniques for solving singularly perturbed partial differential equations, with emphasis on their numerical and computer realizations. Many applied problems appearing in semiconductor theory, biochemistry, kinetics, the theory of electrical chains, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other fields can be modelled as singularly perturbed systems. Here, we summarize a wide range of research articles published by numerous researchers during the last ten years to give a better view of the present scenario in this area of research.
Cloud computing for energy management in smart grid - an application survey
NASA Astrophysics Data System (ADS)
Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed
2016-03-01
The smart grid is an emerging energy system in which information technology, tools, and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To address these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
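Classic economic power dispatch, a likely building block of such a cloud-based service, allocates load so that all running units operate at the same incremental cost. A sketch with two hypothetical generators, quadratic cost curves, and no unit limits:

```python
def dispatch(gens, demand, tol=1e-6):
    # gens: list of (b, c) coefficients with cost C_i(P) = a + b*P + c*P**2.
    # At the optimum every unit has equal incremental cost lambda, so
    # P_i = (lambda - b_i) / (2 * c_i); bisect on lambda to meet demand.
    lo, hi = 0.0, 1000.0
    while hi - lo > tol:
        lam = (lo + hi) / 2.0
        total = sum(max(0.0, (lam - b) / (2.0 * c)) for b, c in gens)
        if total < demand:
            lo = lam
        else:
            hi = lam
    lam = (lo + hi) / 2.0
    return [max(0.0, (lam - b) / (2.0 * c)) for b, c in gens]

# Two hypothetical units; the cheaper one (lower b) carries more load.
loads = dispatch([(20.0, 0.05), (24.0, 0.05)], demand=500.0)
print(loads)  # roughly [270.0, 230.0]
```

In a cloud setting the same optimisation would simply run as a service over telemetered cost curves and demand forecasts; generator limits and losses would be added in practice.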
An optimization model for energy generation and distribution in a dynamic facility
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1981-01-01
An analytical model is described that uses linear programming for the optimum generation and distribution of energy among competing energy resources under different economic criteria. The model, which will be used as a general engineering tool in the analysis of the Deep Space Network ground facility, considers several essential decisions for better design and operation. The decisions sought for the particular energy application include: the optimum time to build an assembly of elements, the inclusion of a storage medium of some type, and the size or capacity of the elements that will minimize the total life-cycle cost over a given number of years. The model, which is structured in multiple time divisions, employs the decomposition principle for large matrices, the branch-and-bound method of mixed-integer programming, and the revised simplex technique for efficient and economical computer use.
Towal, R Blythe; Mormann, Milica; Koch, Christof
2013-10-01
Many decisions we make require visually identifying and evaluating numerous alternatives quickly. These usually vary in reward, or value, and in low-level visual properties, such as saliency. Both saliency and value influence the final decision. In particular, saliency affects fixation locations and durations, which are predictive of choices. However, it is unknown how saliency propagates to the final decision. Moreover, the relative influence of saliency and value is unclear. Here we address these questions with an integrated model that combines a perceptual decision process about where and when to look with an economic decision process about what to choose. The perceptual decision process is modeled as a drift-diffusion model (DDM) process for each alternative. Using psychophysical data from a multiple-alternative, forced-choice task, in which subjects have to pick one food item from a crowded display via eye movements, we test four models where each DDM process is driven by (i) saliency or (ii) value alone or (iii) an additive or (iv) a multiplicative combination of both. We find that models including both saliency and value weighted in a one-third to two-thirds ratio (saliency-to-value) significantly outperform models based on either quantity alone. These eye fixation patterns modulate an economic decision process, also described as a DDM process driven by value. Our combined model quantitatively explains fixation patterns and choices with similar or better accuracy than previous models, suggesting that visual saliency has a smaller, but significant, influence than value and that saliency affects choices indirectly through perceptual decisions that modulate economic decisions.
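The race among DDM accumulators can be sketched as follows. The drift mixes saliency and value in the one-third to two-thirds ratio reported above, while the noise level, threshold, step size, and item values are invented for illustration:

```python
import random

random.seed(3)

def race(saliencies, values, w=1/3, threshold=1.0, dt=0.01, noise=0.5):
    # One drift-diffusion accumulator per alternative; drift mixes
    # saliency and value with the reported 1/3-to-2/3 weighting.
    x = [0.0] * len(values)
    while True:
        for i in range(len(x)):
            drift = w * saliencies[i] + (1 - w) * values[i]
            x[i] += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
            if x[i] >= threshold:
                return i  # first accumulator to reach threshold wins

wins = [0, 0]
for _ in range(2000):
    wins[race(saliencies=[0.5, 0.5], values=[1.0, 0.2])] += 1
print(wins)  # the higher-value item wins more often
```

With equal saliencies the value difference drives choice; raising one item's saliency shifts the race in its favour even at equal values, which is the indirect pathway the paper describes.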
Physics and financial economics (1776-2014): puzzles, Ising and agent-based models
NASA Astrophysics Data System (ADS)
Sornette, Didier
2014-06-01
This short review presents a selected history of the mutual fertilization between physics and economics—from Isaac Newton and Adam Smith to the present. The fundamentally different perspectives embraced in theories developed in financial economics compared with physics are dissected with the examples of the volatility smile and of the excess volatility puzzle. The role of the Ising model of phase transitions to model social and financial systems is reviewed, with the concepts of random utilities and the logit model as the analog of the Boltzmann factor in statistical physics. Recent extensions in terms of quantum decision theory are also covered. A wealth of models are discussed briefly that build on the Ising model and generalize it to account for the many stylized facts of financial markets. A summary of the relevance of the Ising model and its extensions is provided to account for financial bubbles and crashes. The review would be incomplete if it did not cover the dynamical field of agent-based models (ABMs), also known as computational economic models, of which the Ising-type models are just special ABM implementations. We formulate the ‘Emerging Intelligence Market Hypothesis’ to reconcile the pervasive presence of ‘noise traders’ with the near efficiency of financial markets. Finally, we note that evolutionary biology, more than physics, is now playing a growing role to inspire models of financial markets.
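A minimal Metropolis simulation shows why the Ising model is a natural metaphor for herding versus noise trading: starting from a fully herded state, positions stay aligned at low noise (temperature) and randomize above the critical temperature. All parameters here are illustrative:

```python
import math
import random

random.seed(0)

def herding(n=20, temp=1.0, sweeps=400):
    # Spins are traders' positions (+1 buy, -1 sell); neighbours imitate,
    # and temperature plays the role of idiosyncratic "noise trader" randomness.
    s = [[1] * n for _ in range(n)]          # start fully herded
    for _ in range(sweeps * n * n):
        i, j = random.randrange(n), random.randrange(n)
        nb = s[(i+1) % n][j] + s[(i-1) % n][j] + s[i][(j+1) % n] + s[i][(j-1) % n]
        dE = 2 * s[i][j] * nb                # Metropolis energy change
        if dE <= 0 or random.random() < math.exp(-dE / temp):
            s[i][j] = -s[i][j]
    return abs(sum(sum(row) for row in s)) / (n * n)

print(herding(temp=1.5))  # below T_c ~ 2.27: herding persists, |m| near 1
print(herding(temp=4.0))  # above T_c: positions randomize, |m| near 0
```

The magnetization plays the role of aggregate market imbalance; the sharp change across the critical temperature is the phase-transition analogy invoked for bubbles and crashes.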
Prescribing Control in Mixed Conifer Stands Affected by Annosus Root Disease
Gary Petersen
1989-01-01
Tree mortality caused by root diseases constitutes a major drain on Forest productivity of mixed-conifer stands. Factors such as changes in species composition, selective harvesting, unfavorable economic climate, and optimizing of short-term benefits have contributed to current stand conditions. Computer simulation models, such as the "RRMOD Computerized Root...
A Noted Physicist's Contrarian View of Global Warming
ERIC Educational Resources Information Center
Goldstein, Evan R., Comp.
2008-01-01
According to Freeman Dyson, an emeritus professor of physics at the Institute for Advanced Study, the debate about global warming has become too narrow and opinions have become too entrenched. Relying on a computer model designed by the Yale University economist William D. Nordhaus, Dyson compared the effectiveness and economic feasibility of…
Economical Unsteady High-Fidelity Aerodynamics for Structural Optimization with a Flutter Constraint
NASA Technical Reports Server (NTRS)
Bartels, Robert E.; Stanford, Bret K.
2017-01-01
Structural optimization with a flutter constraint for a vehicle designed to fly in the transonic regime is a particularly difficult task. In this speed range, the flutter boundary is very sensitive to aerodynamic nonlinearities, typically requiring high-fidelity Navier-Stokes simulations. However, the repeated application of unsteady computational fluid dynamics to guide an aeroelastic optimization process is very computationally expensive. This expense has motivated the development of methods that incorporate aspects of the aerodynamic nonlinearity, classical tools of flutter analysis, and more recent methods of optimization. While it is possible to use doublet lattice method aerodynamics, this paper focuses on the use of an unsteady high-fidelity aerodynamic reduced order model combined with successive transformations that allows for an economical way of utilizing high-fidelity aerodynamics in the optimization process. This approach is applied to the Common Research Model wing structural design. As might be expected, the high-fidelity aerodynamics produces a heavier wing than that optimized with doublet lattice aerodynamics. It is found that the optimized lower skin of the wing using high-fidelity aerodynamics differs significantly from that using doublet lattice aerodynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potchen, E.J.; Harris, G.I.; Gift, D.A.
The report provides information on an assessment of the potential short- and long-term benefits of emission computed tomography (ECT) in biomedical research and patient care. Work during the past year has been augmented by the development and use of an opinion survey instrument to reach a wider representation of knowledgeable investigators and users of this technology. This survey instrument is reproduced in an appendix. Information derived from analysis of the opinion survey, used in conjunction with results of independent staff studies of available sources, provides the basis for the discussions given in the following sections of PET applications in the brain, of technical factors, and of economic implications. Projections of capital and operating costs on a per-study basis were obtained from a computerized, pro forma accounting model and are compared with the survey cost estimates for both research and clinical modes of application. The results of a cash-flow model analysis of the relationship between the projected economic benefit of PET research to disease management and the costs associated with such research are presented and discussed.
Danuso, Francesco
2017-12-22
A major bottleneck in improving the governance of complex systems is our ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.) and procedures for textual documentation. Following these views, a computer language for knowledge integration, SEMoLa (Simple, Easy Modelling Language), has been developed at the University of Udine. SEMoLa can handle models, data, metadata and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Jørgensen, 1994), in which systems are modelled through the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yield, oil seeds, milk, calves and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.
NASA Astrophysics Data System (ADS)
Mohanty, M. P.; Karmakar, S.; Ghosh, S.
2017-12-01
Many countries across the globe are victims of floods. To monitor them, the scientific community uses various sophisticated algorithms and flood models. However, a gap remains in efficiently mapping flood risk. The limitations are: (i) the scarcity of the extensive data inputs required for precise flood modeling, (ii) the poor performance of models in large and complex terrains, (iii) high computational cost and time, and (iv) civic bodies' lack of expertise in handling model simulations. These factors make it necessary to adopt uncomplicated and inexpensive, yet precise, approaches to identify areas at different levels of flood risk. The present study addresses this issue by utilizing various easily available, low-cost data in a GIS environment for a large, flood-prone, data-poor region. A set of geomorphic indicators derived from a digital elevation model (DEM) is analysed through linear binary classification and used to identify flood hazard. The performance of these indicators is then investigated using receiver operating characteristic (ROC) curves, whereas the calibration and validation of the derived flood maps are accomplished through comparison with dynamically coupled 1-D/2-D flood model outputs. A high degree of similarity in flood inundation demonstrates the reliability of the proposed approach in identifying flood hazard. In parallel, an extensive list of socio-economic indicators is selected to represent flood vulnerability at a finer forward sortation level using multivariate data envelopment analysis (DEA). A set of bivariate flood risk maps is derived by combining the flood hazard and socio-economic vulnerability maps. Given the acute problem of floods in developing countries, the proposed methodology, characterized by low computational cost, modest data requirements and limited modeling complexity, may help local authorities and planners derive effective flood management strategies.
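Evaluating a single geomorphic indicator as a linear binary classifier with ROC analysis reduces to a rank statistic. A sketch with made-up indicator scores and flood labels (not the study's data):

```python
def roc_auc(scores, labels):
    # AUC equals the probability that a randomly chosen flooded cell
    # receives a higher score than a randomly chosen dry cell
    # (the Mann-Whitney statistic), with half credit for ties.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical geomorphic index values and observed flood labels per cell
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2, 0.1]
flooded = [1,   1,   0,    1,   0,   0,   1,   0]
print(roc_auc(scores, flooded))  # -> 0.75
```

An AUC near 1 means the indicator separates flooded from dry cells well; near 0.5 means it carries no more information than chance.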
Banz, Kurt
2005-01-01
This article describes the framework of a comprehensive European model developed to assess clinical and economic outcomes of cardiac resynchronization therapy (CRT) versus optimal pharmacological therapy (OPT) alone in patients with heart failure. The model structure is based on information obtained from the literature, expert opinion, and a European CRT Steering Committee. The decision-analysis tool allows consideration of direct medical and indirect costs, and computes outcomes for distinct periods of time up to 5 years. Qualitative data can also be entered for cost-utility analysis. Model input data for a preliminary appraisal of the economic value of CRT in Germany were obtained from clinical trials, experts, health statistics, and medical tariff lists. The model offers comprehensive analysis capabilities and high flexibility, so it can easily be adapted to any European country or special setting. The illustrative analysis for Germany indicates that CRT is a cost-effective intervention. Although CRT is associated with average direct medical net costs of €5880 per patient, 22% of its upfront implantation cost is recouped within 1 year because of significantly decreased hospitalizations. At €36,600, the incremental cost per quality-adjusted life-year (QALY) gained is below the euro equivalent (€41,300, at €1 = US$1.21) of the commonly used threshold of US$50,000 considered to represent cost-effectiveness. The sensitivity analysis showed these preliminary results to be fairly robust to changes in key assumptions. The European CRT model is an important tool for assessing the economic value of CRT in patients with moderate to severe heart failure. In the light of the planned introduction of Diagnosis Related Group (DRG) based reimbursement in various European countries, the economic data generated by the model can play an important role in the decision-making process.
NASA Astrophysics Data System (ADS)
Straatsma, Menno; Droogers, Peter; Brandsma, Jaïrus; Buytaert, Wouter; Karssenberg, Derek; Van Beek, Rens; Wada, Yoshihide; Sutanudjaja, Edwin; Vitolo, Claudia; Schmitz, Oliver; Meijer, Karen; Van Aalst, Maaike; Bierkens, Marc
2014-05-01
Water scarcity affects large parts of the world. Over the course of the twenty-first century, water demand is likely to increase due to population growth, associated food production, and increased economic activity, while water supply is projected to decrease in many regions due to climate change. Despite recent studies that analyze the effect of climate change on water scarcity, e.g. using climate projections under the representative concentration pathways (RCPs) of the fifth assessment report of the IPCC (AR5), decision support for closing the water gap between now and 2100 does not exist at a meaningful scale and with global coverage. In this study, we aimed (i) to assess the joint impact of climatic and socio-economic change on water scarcity, (ii) to integrate impact and potential adaptation in one workflow, (iii) to prioritize adaptation options to counteract water scarcity based on their financial, regional socio-economic and environmental implications, and (iv) to deliver all this information in an integrated, user-friendly, web-based service. To combine global coverage with local relevance, we aggregated all results for the 1604 water provinces (food producing units) delineated in this study, which are five times smaller than previous food producing units. Water supply was computed using the PCR-GLOBWB hydrological and water resources model, parameterized at 5 arcminutes for the whole globe, excluding Antarctica and Greenland. We ran PCR-GLOBWB with daily forcing derived from five GCMs from CMIP5 (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M), bias-corrected using observation-based WATCH data for 1960-1999. Each model was run for all four RCPs (RCP 2.6, 4.5, 6.0, and 8.5), producing an ensemble of 20 future projections. The blue water supply was aggregated per month and per water province.
Industrial, domestic and irrigation water demands were computed for a limited number of realistic combinations of shared socio-economic pathways (SSPs) and RCPs. Our Water And Climate Adaptation Model (WatCAM) was used to compute the water gap based on reservoir capacity, water supply, and water demand. WatCAM is based on the existing ModSim water allocation model (Labadie, 2010) and facilitates the evaluation of nine technological and infrastructural adaptation measures to assess the investments needed to bridge the future water gap. Regional environmental and socio-economic effects of these investments, such as environmental flows or downstream effects, were evaluated. A scheme was developed to evaluate the strategies for robustness and flexibility under climate change and scenario uncertainty, and each measure was linked to possibilities for investment and financing mechanisms. WatCAM is available as a web modeling service at www.water2invest.com, and enables user-specified adaptation measures and the creation of an ensemble of water gap forecasts.
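At its core, the water-gap calculation described above reduces to demand minus supply, floored at zero, per water province and month, aggregated over the ensemble of projections. The sketch below uses invented shapes and numbers (5 provinces, 12 months, 20 ensemble members standing in for the 5 GCMs × 4 RCPs); the real system draws supply from PCR-GLOBWB output.

```python
import numpy as np

rng = np.random.default_rng(2)
n_runs, n_provinces, n_months = 20, 5, 12       # 20 = 5 GCMs x 4 RCPs
supply = rng.gamma(4.0, 25.0, (n_runs, n_provinces, n_months))  # blue water supply
demand = rng.gamma(5.0, 20.0, (n_provinces, n_months))          # sectoral demand, summed

gap = np.maximum(demand - supply, 0.0)          # demand broadcasts over the ensemble axis
mean_annual_gap = gap.sum(axis=2).mean(axis=0)  # ensemble-mean annual gap per province
print(np.round(mean_annual_gap, 1))
```

Keeping the full `(run, province, month)` array rather than averaging early is what allows the ensemble of water-gap forecasts mentioned in the abstract.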
Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C
2006-12-01
The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
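The Markov half of the two-stage structure described above can be illustrated with a toy annual cohort model. Everything in this sketch is invented for illustration (three states, transition probabilities, costs, and utilities are not taken from Framingham or the paper); it shows only the standard mechanics of discounting costs and QALYs over a lifetime horizon.

```python
import numpy as np

# Illustrative 3-state annual Markov cohort model.
# States: event-free, post-CHD-event, dead (absorbing). All numbers are invented.
P = np.array([[0.95, 0.03, 0.02],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
annual_cost = np.array([500.0, 3000.0, 0.0])   # cost per state-year
utility     = np.array([0.85, 0.65, 0.0])      # QALY weight per state-year
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0])             # everyone starts event-free
total_cost = total_qaly = 0.0
for year in range(60):                         # ~lifetime horizon
    d = (1 + discount) ** -year
    total_cost += d * cohort @ annual_cost
    total_qaly += d * cohort @ utility
    cohort = cohort @ P                        # advance the cohort one annual cycle

print(f"discounted cost per patient: {total_cost:.0f}")
print(f"discounted QALYs per patient: {total_qaly:.2f}")
```

Running two such cohorts, one per treatment arm with different transition probabilities, and differencing the totals gives the incremental cost per QALY that models of this kind report.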
Economic development evaluation based on science and patents
NASA Astrophysics Data System (ADS)
Jokanović, Bojana; Lalic, Bojan; Milovančević, Miloš; Simeunović, Nenad; Marković, Dusan
2017-09-01
Economic development can be driven by many factors, among which science and technology factors can have a drastic influence. Because economic analysis is a challenging, highly nonlinear task, the main aim of this study was to apply a computational intelligence methodology, the artificial neural network approach, to estimate economic development from different science and technology factors. Gross domestic product (GDP) was used as the measure of economic development, and patents in different fields were used as the science and technology factors. Patents in the electrical engineering field were found to have the highest influence on economic development, i.e. on GDP.
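A minimal version of the approach, a small feed-forward network trained by back-propagation to map patent counts to a GDP-like target, can be sketched as follows. The data are synthetic and the architecture (one tanh hidden layer) is a generic choice, not the paper's actual network.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: 3 patent-field counts per country -> GDP index (invented relationship).
X = rng.random((200, 3))
y = (1.5 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.2 * X[:, 2]).reshape(-1, 1)

# One hidden layer, tanh activation, trained by plain full-batch back-propagation.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                   # forward pass
    pred = H @ W2 + b2
    err = pred - y                             # gradient of 0.5*MSE w.r.t. pred
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)           # back-propagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(f"training MSE: {mse:.4f}")
```

In the study itself, the relative influence of each patent field would then be read off the fitted model (e.g. via input sensitivities), which is how the electrical engineering field was identified as most influential.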
CFD studies on biomass thermochemical conversion.
Wang, Yiqun; Yan, Lifeng
2008-06-01
Thermochemical conversion of biomass offers an efficient and economical process for providing gaseous, liquid and solid fuels and for preparing chemicals derived from biomass. Computational fluid dynamics (CFD) modeling applications for biomass thermochemical processes help to optimize the design and operation of thermochemical reactors. Recent progress in numerical techniques and computing power has established CFD as a widely used approach for providing efficient design solutions in industry. This paper introduces the fundamentals involved in developing a CFD solution. Mathematical equations governing the fluid flow, heat and mass transfer, and chemical reactions in thermochemical systems are described, and sub-models for individual processes are presented. The paper also reviews various applications of CFD in the biomass thermochemical conversion field.
Energy and life-cycle cost analysis of a six-story office building
NASA Astrophysics Data System (ADS)
Turiel, I.
1981-10-01
An energy analysis computer program, DOE-2, was used to compute annual energy use for a typical office building as originally designed and with several energy-conserving design modifications. The largest energy use reductions were obtained with the incorporation of daylighting techniques, the use of double-pane windows, night temperature setback, and the reduction of artificial lighting levels. A life-cycle cost model was developed to assess the cost-effectiveness of the design modifications discussed. The model incorporates features such as taxes, depreciation, and the financing of conservation investments. The energy-conserving strategies are ranked according to economic criteria such as net present benefit, discounted payback period, and benefit-to-cost ratio.
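The three ranking criteria named in this abstract are standard engineering-economics quantities. The sketch below ranks a few hypothetical conservation measures by net present value; the costs, savings, and 5% discount rate are invented numbers, not the study's data.

```python
# Illustrative ranking of energy-conservation measures by net present value,
# discounted payback period, and benefit-to-cost ratio. All figures are invented.
def npv(cost, annual_saving, rate, years):
    """Net present benefit: discounted savings minus first cost."""
    return sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1)) - cost

def discounted_payback(cost, annual_saving, rate):
    """First year in which cumulative discounted savings cover the first cost."""
    cum, t = 0.0, 0
    while cum < cost and t < 100:
        t += 1
        cum += annual_saving / (1 + rate) ** t
    return t if cum >= cost else None

def benefit_cost_ratio(cost, annual_saving, rate, years):
    """Discounted benefits divided by first cost."""
    return (npv(cost, annual_saving, rate, years) + cost) / cost

measures = {"daylighting": (40_000, 9_000),
            "double-pane windows": (25_000, 4_000),
            "night setback": (2_000, 1_500)}
for name, (cost, saving) in sorted(measures.items(),
                                   key=lambda kv: -npv(*kv[1], 0.05, 20)):
    print(name, round(npv(cost, saving, 0.05, 20)),
          discounted_payback(cost, saving, 0.05),
          round(benefit_cost_ratio(cost, saving, 0.05, 20), 2))
```

Note that the three criteria need not agree: a cheap measure like night setback can have the shortest payback and highest benefit-to-cost ratio while a larger investment has the greatest net present benefit.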
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
Incorporating time and spatial-temporal reasoning into situation management
NASA Astrophysics Data System (ADS)
Jakobson, Gabriel
2010-04-01
Spatio-temporal reasoning plays a significant role in situation management as performed by intelligent agents (human or machine), affecting how situations are recognized, interpreted, acted upon or predicted. Many definitions and formalisms for the notion of spatio-temporal reasoning have emerged in various research fields including psychology, economics and computer science (computational linguistics, data management, control theory, artificial intelligence and others). In this paper we examine the role of spatio-temporal reasoning in situation management, particularly how to resolve situations that are described using spatio-temporal relations among events and situations. We discuss a model for describing context-sensitive temporal relations and show how the model can be extended to spatial relations.
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr
2016-03-01
The overall objective of this project was to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems whose performance relies on underlying multiscale mathematics, and at developing rigorous mathematical techniques and computational algorithms to study such models. Our specific application lies at the heart of DOE's biofuels initiatives and entails modeling of catalytic systems to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals.
NASA Astrophysics Data System (ADS)
Little, J. C.; Filz, G. M.
2016-12-01
As modern societies become more complex, critical interdependent infrastructure systems become more likely to fail under stress unless they are designed and implemented to be resilient. Hurricane Katrina clearly demonstrated the catastrophic and as yet unpredictable consequences of such failures. Resilient infrastructure systems maintain the flow of goods and services in the face of a broad range of natural and manmade hazards. In this presentation, we illustrate a generic computational framework to facilitate high-level decision-making about how to invest scarce resources most effectively to enhance resilience in coastal protection, transportation, and the economy of a region. Coastal Louisiana, our study area, has experienced the catastrophic effects of several land-falling hurricanes in recent years. In this project, we implement and further refine three process models (a coastal protection model, a transportation model, and an economic model) for the coastal Louisiana region. We upscale essential mechanistic features of the three detailed process models to the systems level and integrate the three reduced-order systems models in a modular fashion. We also evaluate the proposed approach in annual workshops with input from stakeholders. Based on stakeholder inputs, we derive a suite of goals, targets, and indicators for evaluating resilience at the systems level, and assess and enhance resilience using several deterministic scenarios. The unifying framework will be able to accommodate the different spatial and temporal scales that are appropriate for each model. We combine our generic computational framework, which encompasses the entire system of systems, with the targets and indicators needed to systematically meet our chosen resilience goals.
We will start with targets that focus on technical and economic systems, but future work will ensure that targets and indicators are extended to other dimensions of resilience including those in the environmental and social systems. The overall model can be used to optimize decision making in a probabilistic risk-based framework.
Estimation of economic values for traits of dairy sheep: I. Model development.
Wolfová, M; Wolf, J; Krupová, Z; Kica, J
2009-05-01
A bioeconomic model was developed to estimate effects of change in production and functional traits on profit of dairy or dual-purpose milked sheep under alternative management systems. The flock structure was described in terms of animal categories and probabilities of transitions among them, and a Markov chain approach was used to calculate the stationary state of the resultant ewe flock. The model included both deterministic and stochastic components. Performance for most traits was simulated as the population average, but variation in several traits was taken into account. Management options included lambing intervals, mating system, and culling strategy for ewes, weaning and marketing strategy for progeny, and feeding system. The present value of profit computed as the difference between total revenues and total costs per ewe per year, both discounted to the birth date of the animals, was used as the criterion for economic efficiency of the production system in the stationary state. Economic values (change in system profit per unit change in the trait) of up to 35 milk production, growth, carcass, wool, and functional traits may be estimated.
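The Markov-chain step described above, computing the stationary state of the ewe flock from category-transition probabilities, can be sketched with a toy three-category example. The categories and transition probabilities below are invented for illustration; the actual model distinguishes many more animal categories.

```python
import numpy as np

# Toy category-transition matrix for a ewe flock (rows sum to 1).
# States: ewe lamb, yearling, mature ewe. All probabilities are invented.
P = np.array([[0.0, 1.0, 0.0],     # ewe lambs become yearlings
              [0.0, 0.0, 1.0],     # yearlings enter the mature flock
              [0.3, 0.0, 0.7]])    # mature ewes are replaced by lambs or retained

pi = np.full(3, 1.0 / 3.0)         # any starting flock structure
for _ in range(500):               # power iteration to the stationary state
    pi = pi @ P
print("stationary shares:", np.round(pi, 3))
```

The stationary shares give the steady-state flock composition, to which per-category revenues and costs can then be attached to obtain profit per ewe per year, and hence economic values by perturbing one trait at a time.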
[Comparison Analysis of Economic and Engineering Control of Industrial VOCs].
Wang, Yu-fei; Liu, Chang-xin; Cheng, Jie; Hao, Zheng-ping; Wang, Zheng
2015-04-01
Volatile organic compound (VOC) pollution has become, like sulfur dioxide, nitrogen oxides and particulate matter, one of China's major air pollution problems in key urban areas. VOCs are mainly produced by industry sectors, and engineering control is one of the most important reduction measures. During the 12th Five-Year Plan, China decided to invest 40 billion RMB in pollution control projects in key industry sectors, with an annual emission reduction of 605,000 t·a(-1). This shows that China attaches great importance to emission reduction through engineering projects and highlights the awareness of engineering reduction technologies. In this paper, a macroeconomic model, namely a computable general equilibrium (CGE) model, was employed to simulate engineering control and economic control (imposing an environmental tax), with the aim of comparing the pros and cons of the two reduction policies. Considering the economic loss to the whole country, the environmental tax has a larger impact on the economic system than engineering reduction measures. We suggest that the central government provide a subsidy of 7,500 RMB·t(-1) to enterprises in industry sectors to encourage engineering reduction.
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient and economical analyses for man-machine integration, flight operations development and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding data base of Space Shuttle elements, various payloads, experiments, crew equipment and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration and crew training. As OSDS is being applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab) and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of the current and future space programs for efficient, economical analyses.
Efficiency improvement of technological preparation of power equipment manufacturing
NASA Astrophysics Data System (ADS)
Milukov, I. A.; Rogalev, A. N.; Sokolov, V. P.; Shevchenko, I. V.
2017-11-01
Competitiveness of power equipment primarily depends on speeding up the development and mastering of new equipment samples and technologies and on enhancing the organisation and management of design, manufacturing and operation. Current political, technological and economic conditions create an acute need to change the strategy and tactics of process planning, while also addressing the maintenance of equipment together with improvement of its efficiency and compatibility with domestically produced components. To solve these problems, the use of computer-aided process planning systems for process design at all stages of the power equipment life cycle is economically viable. Computer-aided process planning is developed to improve process planning by using mathematical methods and by optimising design and management processes on the basis of CALS technologies, which allows for simultaneous process design, process planning organisation and management based on mathematical and physical modelling of interrelated design objects and the production system. An integration of computer-aided systems providing the interaction of informative and material processes at all stages of the product life cycle is proposed as an effective solution to the challenges in new equipment design and process planning.
Solar heating and cooling technical data and systems analysis
NASA Technical Reports Server (NTRS)
Christensen, D. L.
1976-01-01
The accomplishments of a project to study solar heating and air conditioning are outlined. Presentation materials (data packages, slides, charts, and visual aids) were developed. Bibliographies and source materials on materials and coatings, solar water heaters, systems analysis computer models, solar collectors and solar projects were developed. Detailed MIRADS computer formats for primary data parameters were developed and updated. The following data were included: climatic, architectural, topography, heating and cooling equipment, thermal loads, and economics. Data sources in each of these areas were identified as well as solar radiation data stations and instruments.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
A FORTRAN computer program is described for predicting the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
The FORTRAN computing program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuels of varying end point and hydrogen content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.
ERIC Educational Resources Information Center
WING, RICHARD L.; AND OTHERS
The purpose of the experiment was to produce and evaluate 3 computer-based economics games as a method of individualizing instruction for grade 6 students. 26 experimental subjects played 2 economics games, while a control group received conventional instruction on similar material. In the Sumerian game, students seated at the typewriter terminals…
HOMER Economic Models - US Navy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Jason William; Myers, Kurt Steven
This LETTER REPORT has been prepared by Idaho National Laboratory for US Navy NAVFAC EXWC to support testing of pre-commercial SIREN (Simulated Integration of Renewable Energy Networks) computer software models. In the logistics mode, SIREN software simulates the combination of renewable power sources (solar arrays, wind turbines, and energy storage systems) in supplying an electrical demand. NAVFAC EXWC will create SIREN software logistics models of existing or planned renewable energy projects at five Navy locations (San Nicolas Island, AUTEC, New London, & China Lake), and INL will deliver additional HOMER computer models for comparative analysis. In the transient mode, SIREN simulates the short time-scale variation of electrical parameters when a power outage or other destabilizing event occurs. In the HOMER model, a variety of inputs are entered, such as location coordinates, generators, PV arrays, wind turbines, batteries, converters, grid costs/usage, solar resources, wind resources, temperatures, fuels, and electric loads. HOMER's optimization and sensitivity analysis algorithms then evaluate the economic and technical feasibility of these technology options and account for variations in technology costs, electric load, and energy resource availability. The Navy can then use HOMER's optimization and sensitivity results to compare against those of the SIREN model. The U.S. Department of Energy (DOE) Idaho National Laboratory (INL) possesses unique expertise and experience in the software, hardware, and systems design for the integration of renewable energy into the electrical grid. NAVFAC EXWC will draw upon this expertise to complete mission requirements.
Eliciting expert opinion for economic models: an applied example.
Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward
2007-01-01
Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
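One common way to turn an elicited opinion about a Bernoulli-process parameter into an uncertainty distribution is to fit a Beta distribution from a most-likely value plus an effective sample size. The matching rule below (mode plus pseudo-observation strength) is a standard textbook choice used here for illustration; it is not necessarily the elicitation method of the paper, and the numbers are invented.

```python
# Sketch: convert an expert's "most likely value" for a probability parameter
# into a Beta(a, b) prior. `strength` = a + b - 2 acts as pseudo-observations.
def beta_from_mode(mode, strength):
    a = mode * strength + 1
    b = (1 - mode) * strength + 1
    return a, b

# Hypothetical elicitation: detection probability most likely 0.7,
# expert's confidence equivalent to about 20 observed cases.
a, b = beta_from_mode(0.7, 20)
mean = a / (a + b)
print(f"Beta({a:.0f}, {b:.0f}), mean = {mean:.3f}")
```

Distributions obtained this way can be sampled directly in probabilistic sensitivity analysis, which is how the elicited shapes feed into the decision-analytic model.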
20 CFR 901.11 - Enrollment procedures.
Code of Federal Regulations, 2011 CFR
2011-04-01
.... Examples include economics, computer programs, pension accounting, investment and finance, risk theory... Columbia responsible for the issuance of a license in the field of actuarial science, insurance, accounting... include economics, computer programming, pension accounting, investment and finance, risk theory...
Public health and economic risk assessment of waterborne contaminants and pathogens in Finland.
Juntunen, Janne; Meriläinen, Päivi; Simola, Antti
2017-12-01
This study shows that a variety of mathematical modeling techniques can be applied in a comprehensive assessment of the risks involved in drinking water production. To track the effects from water sources to the end consumers, we employed four models from different fields of study. First, two models of the physical environment track the movement of harmful substances from the sources to the water distribution system. Second, a statistical quantitative microbial risk assessment (QMRA) assesses the public health risks of consuming such water. Finally, a regional computable general equilibrium (CGE) model assesses the economic effects of increased illness. To substantiate our analysis, we used the illustrative case of a recently built artificial recharge system in Southern Finland that provides water for an area of 300,000 inhabitants. We examine the effects of various chemicals and microbes separately. Our economic calculations allow for direct effects on labor productivity due to absenteeism, increased health care expenditures, and indirect effects on local businesses. We found that even a considerable risk poses no notable threat to public health and thus has barely measurable economic consequences. Any epidemic is likely to spread widely in the urban setting we examined, but is also likely to be short-lived in both public health and economic terms. Our estimate for the ratio of total to direct effects is 1.4, which indicates the importance of general equilibrium effects. Furthermore, the total welfare loss is 2.4 times higher than the initial productivity loss. The major remaining uncertainty in the economic assessment is the indirect effects.
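The QMRA step in such assessments typically rests on a dose-response model. The sketch below uses the classic exponential dose-response form scaled from daily to annual risk; the infectivity parameter, concentration, and ingestion volume are illustrative assumptions, not values from this study.

```python
import math

# Minimal QMRA-style calculation with an exponential dose-response model.
# All numbers are invented for illustration.
r = 0.0042                 # hypothetical pathogen infectivity parameter
conc = 0.001               # organisms per litre in finished water (assumed)
volume = 1.0               # litres ingested per person per day (assumed)

dose = conc * volume
p_daily = 1 - math.exp(-r * dose)            # probability of infection per day
p_annual = 1 - (1 - p_daily) ** 365          # assuming independent daily exposures
print(f"annual infection risk: {p_annual:.2e}")
```

Multiplying such per-person risks by the exposed population and an illness probability yields the case counts that a CGE model can then translate into productivity and welfare losses.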
Computational Nanoelectronics and Nanotechnology at NASA ARC
NASA Technical Reports Server (NTRS)
Saini, Subhash; Kutler, Paul (Technical Monitor)
1998-01-01
Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high performance, low power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, IT Modeling and Simulation Group has been started at NASA Ames with a goal to develop an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. Overview of nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group including the applications of nanoelectronic based devices relevant to NASA missions.
Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain
Dai, Yonghui; Han, Dongmei; Dai, Weihui
2014-01-01
The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market due to the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted renewed attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, together with its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of the Markov state region, computation of the state transition probability matrix, and adjustment of the prediction. Results of the empirical study show that this method can achieve high accuracy in stock index prediction and could provide a good reference for investment in the stock market. PMID:24782659
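The Markov correction stage described in the abstract (state division, transition matrix, prediction adjustment) can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the authors' code; the three-state discretization, equal-width state boundaries, and toy data are assumptions:

```python
import numpy as np

def markov_adjust(actual, predicted, n_states=3):
    """Estimate the next-step correction factor for a forecast by
    modeling its historical relative errors as a Markov chain."""
    errors = (actual - predicted) / actual            # relative errors
    edges = np.linspace(errors.min(), errors.max(), n_states + 1)
    edges[-1] += 1e-9                                 # include the max value
    states = np.digitize(errors, edges) - 1           # state index per step

    # State transition probability matrix estimated from observed counts
    P = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        P[a, b] += 1
    P = P / np.maximum(P.sum(axis=1, keepdims=True), 1)

    # Expected relative error next step, given the most recent state
    centers = 0.5 * (edges[:-1] + edges[1:])
    return P[states[-1]] @ centers

# Toy index history and an (assumed) initial BP-network forecast of 108.0
actual = np.array([100.0, 102.0, 101.0, 105.0, 107.0])
predicted = np.array([99.0, 103.0, 100.0, 104.0, 106.0])
correction = markov_adjust(actual, predicted)
next_forecast = 108.0 * (1 + correction)              # adjusted prediction
```

In the paper's scheme the initial forecast would come from the improved BP network; here it is simply a hypothetical number, since the point is only the transition-matrix adjustment.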
An integrated communications demand model
NASA Astrophysics Data System (ADS)
Doubleday, C. F.
1980-11-01
A computer model of communications demand is being developed to permit dynamic simulations of the long-term evolution of demand for communications media in the U.K. under alternative assumptions about social, economic and technological trends in British Telecom's business environment. The context and objectives of the project and the potential uses of the model are reviewed, and four key concepts in the demand for communications media, around which the model is structured, are discussed: (1) the generation of communications demand; (2) substitution between media; (3) technological convergence; and (4) competition. Two outline perspectives on the model itself are given.
Pilot Study: Impact of Computer Simulation on Students' Economic Policy Performance. Pilot Study.
ERIC Educational Resources Information Center
Domazlicky, Bruce; France, Judith
Fiscal and monetary policies taught in macroeconomic principles courses are concepts that might require both lecture and simulation methods. The simulation models, which apply the principles gleaned from comparative statics to a dynamic world, may give students an appreciation for the problems facing policy makers. This paper is a report of a…
A comparison of dynamic and static economic models of uneven-aged stand management
Robert G. Haight
1985-01-01
Numerical techniques have been used to compute the discrete-time sequence of residual diameter distributions that maximize the present net worth (PNW) of harvestable volume from an uneven-aged stand. Results contradicted optimal steady-state diameter distributions determined with static analysis. In this paper, optimality conditions for solutions to dynamic and static...
Hydrogen from coal cost estimation guidebook
NASA Technical Reports Server (NTRS)
Billings, R. E.
1981-01-01
In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters which are typical of coal gasification applications was developed. Using these parameters a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal, under a variety of circumstances.
ERIC Educational Resources Information Center
Hohlfeld, Tina N.; Ritzhaupt, Albert D.; Barron, Ann E.; Kemker, Kate
2008-01-01
While there is evidence that access to computers in schools has increased, there remain questions about whether low socio-economic status (SES) schools provide students with equitable supports for achieving information communication technology (ICT) literacy. This research first presents a theoretical model to examine the digital divide within…
A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group
NASA Astrophysics Data System (ADS)
Graves, W. R.; Holliday, J. R.; Rundle, J. B.
2010-12-01
According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. The Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others.
Models for attenuation and structural damage are then used in a computationally efficient workflow to produce real-time estimates of damage and loss for individual structures. All models are based on techniques that either have been published in the literature or will soon be published. Using these results, members of the public can gain an appreciation of their risk of exposure to damage from destructive earthquakes, information that has heretofore only been available to a few members of the financial and insurance industries.
NASA Astrophysics Data System (ADS)
1980-03-01
The technical possibilities and economic limitations of solar heating systems for swimming pools, hot water preparation, space heating and air conditioning were investigated. This analysis was performed for dwellings, with special consideration of the climatic differences in each community. The computer program used for the solar system calculations and all mathematical models for the technical and economic analysis are explained. In the technical and economic analysis, the most suitable solar system size for each community was determined. Four types of solar collectors were investigated. The single-glass selective collector proved to be the most cost-effective collector in all the above applications, provided that the additional cost for the selective coating is not more than 20DM/cu. From the results of the analysis, recommendations were derived that can promote the rapid introduction of solar heating systems into the market.
Conditioned associations and economic decision biases.
Guitart-Masip, Marc; Talmi, Deborah; Dolan, Ray
2010-10-15
Humans show substantial deviation from rationality during economic decision making under uncertainty. A computational perspective suggests these deviations arise out of an interaction between distinct valuation systems in the brain. Here, we provide behavioural data showing that the incidental presentation of aversive and appetitive conditioned stimuli can alter subjects' preferences in an economic task, involving a choice between a safe or gamble option. These behavioural effects informed a model-based analysis of a functional magnetic resonance imaging (fMRI) experiment, involving an identical paradigm, where we demonstrate that this conditioned behavioral bias engages the amygdala, a brain structure associated with acquisition and expression of conditioned associations. Our findings suggest that a well known bias in human economic choice can arise from an influence of conditioned associations on goal-directed decision making, consistent with an architecture of choice that invokes distinct decision-making systems. Copyright 2010 Elsevier Inc. All rights reserved.
Physically-based Assessment of Tropical Cyclone Damage and Economic Losses
NASA Astrophysics Data System (ADS)
Lin, N.
2012-12-01
Estimating damage and economic losses caused by tropical cyclones (TC) is a topic of considerable research interest in many scientific fields, including meteorology, structural and coastal engineering, and actuarial sciences. One approach is based on the empirical relationship between TC characteristics and loss data. Another is to model the physical mechanism of TC-induced damage. In this talk we discuss the physically-based approach to predicting TC damage and losses due to extreme wind and storm surge. We first present an integrated vulnerability model, which, for the first time, explicitly models the essential mechanisms causing wind damage to residential areas during storm passage, including windborne-debris impact and the pressure-debris interaction that may lead, in a chain reaction, to structural failures (Lin and Vanmarcke 2010; Lin et al. 2010a). This model can be used to predict the economic losses in a residential neighborhood (with hundreds of buildings) during a specific TC (Yau et al. 2011) or applied jointly with a TC risk model (e.g., Emanuel et al. 2008) to estimate the expected losses over long time periods. Then we present a TC storm surge risk model that has been applied to New York City (Lin et al. 2010b; Lin et al. 2012; Aerts et al. 2012), Miami-Dade County, Florida (Klima et al. 2011), Galveston, Texas (Lickley, 2012), and other coastal areas around the world (e.g., Tampa, Florida; the Persian Gulf; Darwin, Australia; Shanghai, China). These physically-based models are applicable to various coastal areas and can account for changes in climate and coastal exposure over time. We also point out that, although made computationally efficient for risk assessment, these models are not suitable for regional or global analysis, which has been a focus of the empirically-based economic analysis (e.g., Hsiang and Narita 2012).
A future research direction is to simplify the physically-based models, possibly through parameterization, and make connections to the global loss data and economic analysis.
Economic impacts of a California tsunami
Rose, Adam; Wing, Ian Sue; Wei, Dan; Wein, Anne
2016-01-01
The economic consequences of a tsunami scenario for Southern California are estimated using computable general equilibrium analysis. The economy is modeled as a set of interconnected supply chains interacting through markets but with explicit constraints stemming from property damage and business downtime. Economic impacts are measured by the reduction of Gross Domestic Product for Southern California, Rest of California, and U.S. economies. For California, total economic impacts represent the general equilibrium (essentially quantity and price multiplier) effects of lost production in industries upstream and downstream in the supply-chain of sectors that are directly impacted by port cargo disruptions at Port of Los Angeles and Port of Long Beach (POLA/POLB), property damage along the coast, and evacuation of potentially inundated areas. These impacts are estimated to be $2.2 billion from port disruptions, $0.9 billion from property damages, and $2.8 billion from evacuations. Various economic-resilience tactics can potentially reduce the direct and total impacts by 80–85%.
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
A nonlinear optimal control approach to stabilization of a macroeconomic development model
NASA Astrophysics Data System (ADS)
Rigatos, G.; Siano, P.; Ghosh, T.; Sarno, D.
2017-11-01
A nonlinear optimal (H-infinity) control approach is proposed for the problem of stabilization of the dynamics of a macroeconomic development model that is known as the Grossman-Helpman model of endogenous product cycles. The dynamics of the macroeconomic development model is divided into two parts. The first describes economic activities in a developed country and the second describes the variation of economic activities in a developing country which tries to modify its production so as to serve the needs of the developed country. The article shows that through control of the macroeconomic model of the developed country, one can finally control the dynamics of the economy in the developing country. The control method through which this is achieved is nonlinear H-infinity control. The macroeconomic model for the developing country undergoes approximate linearization around a temporary operating point. This is defined at each time instant by the present value of the system's state vector and the last value of the control input vector that was exerted on it. The linearization is based on Taylor series expansion and the computation of the associated Jacobian matrices. For the linearized model an H-infinity feedback controller is computed. The controller's gain is calculated by solving an algebraic Riccati equation at each iteration of the control method. The asymptotic stability of the control approach is proven through Lyapunov analysis. This ensures that the state variables of the macroeconomic model of the developing country will finally converge to the designated reference values.
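The linearize-then-solve-Riccati loop described above can be illustrated on a toy two-state system. This is a sketch, not the paper's Grossman-Helpman model: the dynamics and weights are invented, the Jacobians are computed numerically, and for brevity the gain is formed from the standard LQR-style Riccati equation (the H-infinity version solves a related game-theoretic Riccati equation that also includes an attenuation level):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical 2-state nonlinear dynamics standing in for the economic model
def f(x, u):
    return np.array([x[1] - 0.1 * x[0]**2,
                     -0.5 * x[0] + u[0]])

def jacobians(x, u, eps=1e-6):
    """Numerical Jacobians A = df/dx, B = df/du at the operating point."""
    n, m = len(x), len(u)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x + dx, u) - f(x - dx, u)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x, u + du) - f(x, u - du)) / (2 * eps)
    return A, B

# Temporary operating point: current state and last applied control
x_op, u_op = np.array([0.2, 0.0]), np.array([0.0])
A, B = jacobians(x_op, u_op)
Q, R = np.eye(2), np.eye(1)          # state and control weights (assumed)

# Solve the algebraic Riccati equation for the linearized model,
# then form the state-feedback gain K = R^{-1} B^T P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
```

In a receding-horizon use of the method, the Jacobians and the Riccati solution would be recomputed at every iteration as the operating point moves.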
Effects of the 2008 flood on economic performance and food security in Yemen: a simulation analysis.
Breisinger, Clemens; Ecker, Olivier; Thiele, Rainer; Wiebelt, Manfred
2016-04-01
Extreme weather events such as floods and droughts can have devastating consequences for individual well being and economic development, in particular in poor societies with limited availability of coping mechanisms. Combining a dynamic computable general equilibrium model of the Yemeni economy with a household-level calorie consumption simulation model, this paper assesses the economy-wide, agricultural and food security effects of the 2008 tropical storm and flash flood that hit the Hadramout and Al-Mahrah governorates. The estimation results suggest that agricultural value added, farm household incomes and rural food security deteriorated long term in the flood-affected areas. Due to economic spillover effects, significant income losses and increases in food insecurity also occurred in areas that were unaffected by flooding. This finding suggests that while most relief efforts are typically concentrated in directly affected areas, future efforts should also consider surrounding areas and indirectly affected people. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Simplified and refined structural modeling for economical flutter analysis and design
NASA Technical Reports Server (NTRS)
Ricketts, R. H.; Sobieszczanski, J.
1977-01-01
A coordinated use of two finite-element models of different levels of refinement is presented to reduce the computer cost of the repetitive flutter analysis commonly encountered in structural resizing to meet flutter requirements. One model, termed a refined model (RM), represents a high degree of detail needed for strength-sizing and flutter analysis of an airframe. The other model, called a simplified model (SM), has a relatively much smaller number of elements and degrees-of-freedom. A systematic method of deriving an SM from a given RM is described. The method consists of judgmental and numerical operations to make the stiffness and mass of the SM elements equivalent to the corresponding substructures of RM. The structural data are automatically transferred between the two models. The bulk of analysis is performed on the SM with periodical verifications carried out by analysis of the RM. In a numerical example of a supersonic cruise aircraft with an arrow wing, this approach permitted substantial savings in computer costs and acceleration of the job turn-around.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James Francfort; Kevin Morrow; Dimitri Hochard
2007-02-01
This report documents efforts to develop a computer tool for modeling the economic payback for comparative airport ground support equipment (GSE) that are propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user’s manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P
2010-01-01
Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
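The antithetic-variates technique is independent of the diabetes model itself: each standard-normal draw is paired with its negation, and the pair's two payoffs are averaged into one low-variance sample. A minimal sketch, with a hypothetical monotone payoff standing in for the QALY simulation:

```python
import numpy as np

rng = np.random.default_rng(42)

def payoff(z):
    """Stand-in for an expensive simulation outcome driven by a normal draw."""
    return np.exp(0.1 * z)   # monotone in z, so antithetic pairing helps

n = 100_000

# Plain Monte Carlo: 2n independent draws
z = rng.standard_normal(2 * n)
plain = payoff(z)

# Antithetic variates: n draws, each paired with its negation,
# and the two payoffs averaged into one low-variance sample
z = rng.standard_normal(n)
anti = 0.5 * (payoff(z) + payoff(-z))

plain_se = plain.std(ddof=1) / np.sqrt(plain.size)
anti_se = anti.std(ddof=1) / np.sqrt(anti.size)
```

With the same total number of underlying draws, the antithetic standard error here is roughly an order of magnitude smaller because the payoff is monotone in the input; for non-monotone model outputs the gain can be smaller or vanish, which is consistent with the "typically" hedging in the abstract.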
NASA Astrophysics Data System (ADS)
Rougé, Charles; Harou, Julien J.; Pulido-Velazquez, Manuel; Matrosov, Evgenii S.
2017-04-01
The marginal opportunity cost of water refers to benefits forgone by not allocating an additional unit of water to its most economically productive use at a specific location in a river basin at a specific moment in time. Estimating the opportunity cost of water is an important contribution to water management, as it can be used for better water allocation or better system operation and can suggest where future water infrastructure could be most beneficial. Opportunity costs can be estimated using 'shadow values' provided by hydro-economic optimization models. Yet such models' reliance on optimization means they have difficulty accurately representing the impact of operating rules and regulatory and institutional mechanisms on actual water allocation. In this work we use more widely available river basin simulation models to estimate opportunity costs. This has been done before by adding a small quantity of water to the model at the place and time where the opportunity cost should be computed, then running a simulation and comparing the difference in system benefits. The added system benefits per unit of water added to the system then provide an approximation of the opportunity cost. This approximation can then be used to design efficient pricing policies that provide incentives for users to reduce their water consumption. Yet this method requires one simulation run per node and per time step, which is computationally demanding for large-scale systems and short time steps (e.g., a day or a week). Moreover, opportunity cost estimates are supposed to reflect the most productive use of an additional unit of water, yet the simulation rules do not necessarily use water that way. In this work, we propose an alternative approach, which computes the opportunity cost through a double backward induction, first recursively from outlet to headwaters within the river network at each time step, then recursively backwards in time.
Both backward inductions only require linear operations, and the resulting algorithm tracks the maximal benefit that can be obtained by having an additional unit of water at any node in the network and at any date in time. Results 1) can be obtained from the results of a rule-based simulation using a single post-processing run, and 2) are exactly the (gross) benefit forgone by not allocating an additional unit of water to its most productive use. The proposed method is applied to London's water resource system to track the value of storage in the city's water supply reservoirs on the Thames River throughout a weekly 85-year simulation. Results, obtained in 0.4 seconds on a single processor, reflect the environmental cost of water shortage. This fast computation allows visualizing the seasonal variations of the opportunity cost depending on reservoir levels, demonstrating the potential of this approach for exploring water values and its variations using simulation models with multiple runs (e.g. of stochastically generated plausible future river inflows).
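The double backward induction can be sketched on a toy three-node river network. This is an illustrative reading of the algorithm, not the authors' implementation: a marginal unit of water at a node is assumed to be worth the best of (a) its local marginal benefit, (b) its value at the downstream node in the same step, or (c) its carried-over value at the same node next step (storage). The data and the 0.95 carryover factor are invented:

```python
# Toy network: 0 -> 1 -> 2 (outlet); downstream[n] is node n's downstream node
downstream = {0: 1, 1: 2, 2: None}
order_up = [2, 1, 0]                  # process outlet first, then upstream
T = 4                                 # number of time steps (e.g., weeks)
# mb[t][n]: marginal benefit of one extra unit consumed at node n in step t
mb = [[0.5, 1.2, 0.8], [0.9, 0.4, 1.1], [0.3, 0.7, 0.2], [1.0, 0.6, 0.5]]
carry = 0.95                          # storage carryover efficiency

value = [[0.0] * 3 for _ in range(T)]
for t in reversed(range(T)):          # temporal pass: backwards in time
    for n in order_up:                # spatial pass: outlet -> headwaters
        best = mb[t][n]               # use the marginal unit locally
        down = downstream[n]
        if down is not None:          # or pass it downstream this step
            best = max(best, value[t][down])
        if t + 1 < T:                 # or store it to the next time step
            best = max(best, carry * value[t + 1][n])
        value[t][n] = best
```

Both passes are linear in the number of nodes and time steps, which is the source of the speed the abstract reports; `value[t][n]` is then the maximal benefit obtainable from an extra unit of water at node `n` in step `t`.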
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, M.V.
1989-01-01
A numerical model was developed to simulate the operation of an integrated system for the production of methane and single-cell algal protein from a variety of biomass energy crops or waste streams. Economic analysis was performed at the end of each simulation. The model was capable of assisting in the determination of design parameters by providing relative economic information for various strategies. Three configurations of anaerobic reactors were simulated. These included fed-bed reactors, conventional stirred tank reactors, and continuously expanding reactors. A generic anaerobic digestion process model, using lumped substrate parameters, was developed for use by type-specific reactor models. The generic anaerobic digestion model provided a tool for testing conversion efficiencies and kinetic parameters for a wide range of substrate types and reactor designs. Dynamic growth models were used to simulate the growth of algae and of Eichornia crassipes as a function of daily incident radiation and temperature; Eichornia crassipes was modeled as a source of biomass for use as a digestion substrate. Computer simulations with the system model indicated that tropical or subtropical locations offered the most promise for a viable system. The availability of large quantities of digestible waste and low land prices were found to be desirable in order to take advantage of economies of scale. Other simulations indicated that poultry and swine manure produced larger biogas yields than cattle manure. The model was created in a modular fashion to allow for testing of a wide variety of unit operations. Coding was performed in the Pascal language for use on personal computers.
Bernoulli substitution in the Ramsey model: Optimal trajectories under control constraints
NASA Astrophysics Data System (ADS)
Krasovskii, A. A.; Lebedev, P. D.; Tarasyev, A. M.
2017-05-01
We consider a neoclassical (economic) growth model. The nonlinear Ramsey equation modeling capital dynamics, in the case of a Cobb-Douglas production function, is reduced to a linear differential equation via a Bernoulli substitution. This considerably facilitates the search for a solution to the optimal growth problem with logarithmic preferences. The study deals with solving the corresponding infinite-horizon optimal control problem. We consider a vector field of the Hamiltonian system in the Pontryagin maximum principle, taking into account control constraints. We prove the existence of two alternative steady states, depending on the constraints. A proposed algorithm for constructing growth trajectories combines methods of open-loop control and closed-loop regulatory control. For some levels of constraints and initial conditions, a closed-form solution is obtained. We also demonstrate the impact of technological change on the economic equilibrium dynamics. Results are supported by computer calculations.
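The Bernoulli substitution can be illustrated on the simplest Cobb-Douglas capital-accumulation equation (a sketch of the idea only; the paper treats the controlled Ramsey problem with logarithmic preferences and control constraints). With saving rate s and depreciation delta, capital follows a Bernoulli equation:

```latex
\dot{k} = s\,k^{\alpha} - \delta k, \qquad 0 < \alpha < 1.
% Bernoulli substitution: z = k^{1-\alpha}
\dot{z} = (1-\alpha)\,k^{-\alpha}\,\dot{k} = (1-\alpha)\bigl(s - \delta z\bigr),
% a linear ODE with the closed-form solution
z(t) = \frac{s}{\delta} + \Bigl(z(0) - \frac{s}{\delta}\Bigr)\,e^{-(1-\alpha)\delta t}.
```

The nonlinear capital dynamics thus become linear in the transformed variable, with the steady state z* = s/delta recovered as k* = (s/delta)^{1/(1-alpha)}.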
Real-time economic nonlinear model predictive control for wind turbine control
NASA Astrophysics Data System (ADS)
Gros, Sebastien; Schild, Axel
2017-12-01
Nonlinear model predictive control (NMPC) is a strong candidate to handle the control challenges emerging in the modern wind energy industry. Recent research suggested that wind turbine (WT) control based on economic NMPC (ENMPC) can improve the closed-loop performance and simplify the task of controller design when compared to a classical NMPC approach. This paper establishes a formal relationship between the ENMPC controller and the classic NMPC approach, and compares empirically their closed-loop nominal behaviour and performance. The robustness of the performance is assessed for an inaccurate modelling of the tower fore-aft main frequency. Additionally, though a perfect wind preview is assumed here, the effect of having a limited horizon of preview of the wind speed via the LIght Detection And Ranging (LIDAR) sensor is investigated. Finally, this paper provides new algorithmic solutions for deploying ENMPC for WT control, and reports improved computational times.
Hira, A Y; Nebel de Mello, A; Faria, R A; Odone Filho, V; Lopes, R D; Zuffo, M K
2006-01-01
This article discusses a telemedicine model for emerging countries, through the description of ONCONET, a telemedicine initiative applied to pediatric oncology in Brazil. The ONCONET core technology is a Web-based system that offers health information and other services specialized in childhood cancer, such as electronic medical records and cooperative protocols for complex treatments. All Web-based services are supported by the use of a high-performance computing infrastructure based on clusters of commodity computers. The system was fully implemented on an open-source and free-software approach. Aspects of modeling, implementation and integration are covered. A model, both technologically and economically viable, was created through the research and development of in-house solutions adapted to the realities of emerging countries, with a focus on scalability both in the total number of patients and in the national infrastructure.
SECOND GENERATION MODEL | Science Inventory | US ...
One of the environmental and economic models that the U.S. EPA uses to assess climate change policies is the Second Generation Model (SGM). SGM is a 13-region, 24-sector computable general equilibrium (CGE) model of the world that can be used to estimate the domestic and international economic impacts of policies designed to reduce greenhouse gas emissions. SGM was developed by Jae Edmonds and others at the Joint Global Change Research Institute (JGCRI) of Pacific Northwest National Laboratory (PNNL) and the University of Maryland. One of SGM's primary purposes is to provide an integrated assessment of a portfolio of greenhouse gas mitigation strategies. The SGM projects economic activity, energy transformation and consumption, and greenhouse gas emissions for each region of the globe in five-year time steps from 1990 through 2050. The model has been used extensively over the last decade to assess U.S. policy options to achieve greenhouse gas mitigation goals. The SGM is one of EPA's primary tools for analyses of climate change policies. It was used extensively by the U.S. government to analyze the impact of the Kyoto Protocol. Moreover, the SGM has been used by EPA during the current Administration for analyses of the climate components of various multi-emissions bills.
Computational aeroelasticity using a pressure-based solver
NASA Astrophysics Data System (ADS)
Kamakoti, Ramji
A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well-validated k-ε turbulence model with wall function treatment for the near-wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economic for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between the fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved. This is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experiment and previous numerical results. The computational methodology exhibited capabilities to predict both qualitative and quantitative features of aeroelasticity.
NASA Astrophysics Data System (ADS)
Shorikov, A. F.; Butsenko, E. V.
2017-10-01
This paper discusses the problem of multicriteria adaptive optimization of investment project control when several technologies are available. On the basis of network modeling, a new economic-mathematical model and a method for solving this problem are proposed. Network economic-mathematical modeling makes it possible to determine the optimal timing and calendar schedule for implementing an investment project, and serves as an instrument for increasing the economic potential and competitiveness of an enterprise. Using a practical example, the processes of forming network models are shown, including determining the sequence of actions in a particular investment-planning process, and network-based work schedules are constructed. The parameters of the network models are calculated, optimal (critical) paths are formed, and the optimal time for implementing the chosen technologies of the investment project is computed. The selection of the optimal technology from a set of feasible technologies for project implementation is also shown, taking into account the time and cost of the work. The proposed model and solution method can serve as a basis for the development, creation and application of computer information systems that support managerial decision making.
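The critical-path calculation at the heart of such network scheduling can be sketched briefly; the activity durations and precedence relations below are invented for illustration and are not taken from the paper.

```python
# Critical-path computation over an activity-on-node network (assumed acyclic).
# Durations and precedence relations are hypothetical illustration data.

def critical_path(durations, predecessors):
    """Return (project duration, one critical path) for an acyclic activity network."""
    earliest = {}                      # earliest finish time of each activity
    via = {}                           # predecessor realizing that earliest finish
    remaining = dict(durations)
    while remaining:                   # schedule activities whose predecessors are done
        for act in list(remaining):
            preds = predecessors.get(act, [])
            if all(p in earliest for p in preds):
                start = max((earliest[p] for p in preds), default=0)
                earliest[act] = start + remaining.pop(act)
                via[act] = max(preds, key=lambda p: earliest[p]) if preds else None
    end = max(earliest, key=earliest.get)
    path, node = [], end               # walk back along binding predecessors
    while node is not None:
        path.append(node)
        node = via[node]
    return earliest[end], path[::-1]

durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"C": ["A", "B"], "D": ["C"]}
total_time, path = critical_path(durations, predecessors)   # 9, ["A", "C", "D"]
```

Production scheduling tools additionally compute latest start times and slack per activity; the critical path is exactly the zero-slack chain.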
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kreutz, Thomas G; Ogden, Joan M
2000-07-01
In this final report, we present results from a technical and economic assessment of residential-scale PEM fuel cell power systems. The objectives of our study are to conceptually design an inexpensive, small-scale PEMFC-based stationary power system that converts natural gas to both electricity and heat, and then to analyze the prospective performance and economics of various system configurations. We developed computer models for residential-scale PEMFC cogeneration systems to compare various system designs (e.g., steam reforming vs. partial oxidation, compressed vs. atmospheric pressure) and determine the most technically and economically attractive system configurations at various scales (e.g., single-family, residential, multi-dwelling, neighborhood).
Transport Equation Based Wall Distance Computations Aimed at Flows With Time-Dependent Geometry
NASA Technical Reports Server (NTRS)
Tucker, Paul G.; Rumsey, Christopher L.; Bartels, Robert E.; Biedron, Robert T.
2003-01-01
Eikonal, Hamilton-Jacobi and Poisson equations can be used for economical nearest wall distance computation and modification. Economical computations may be especially useful for aeroelastic and adaptive grid problems for which the grid deforms, and the nearest wall distance needs to be repeatedly computed. Modifications are directed at remedying turbulence model defects. For complex grid structures, implementation of the Eikonal and Hamilton-Jacobi approaches is not straightforward. This prohibits their use in industrial CFD solvers. However, both the Eikonal and Hamilton-Jacobi equations can be written in advection and advection-diffusion forms, respectively. These, like the Poisson's Laplacian, are commonly occurring industrial CFD solver elements. Use of the NASA CFL3D code to solve the Eikonal and Hamilton-Jacobi equations in advective-based forms is explored. The advection-based distance equations are found to have robust convergence. Geometries studied include single and two element airfoils, wing body and double delta configurations along with a complex electronics system. It is shown that for Eikonal accuracy, upwind metric differences are required. The Poisson approach is found effective and, since it does not require offset metric evaluations, easiest to implement. The sensitivity of flow solutions to wall distance assumptions is explored. Generally, results are not greatly affected by wall distance traits.
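The Poisson approach can be sketched in one dimension: solve Laplacian(phi) = -1 with phi = 0 on the walls, then recover the distance as d = -|grad phi| + sqrt(|grad phi|^2 + 2 phi). The grid size and iteration count below are arbitrary illustration choices, not values from the paper.

```python
# 1-D sketch of Poisson-based wall distance on the channel [0, 1]:
# solve phi'' = -1 with phi = 0 at both walls (Jacobi iteration), then
# d = -|grad phi| + sqrt(|grad phi|^2 + 2*phi). For this geometry the
# formula recovers the exact distance min(x, 1 - x).
import math

n = 51                         # grid points across the channel (illustrative)
h = 1.0 / (n - 1)
phi = [0.0] * n
for _ in range(8000):          # Jacobi sweeps for the discrete Poisson problem
    phi = [0.0] + [0.5 * (phi[i - 1] + phi[i + 1] + h * h)
                   for i in range(1, n - 1)] + [0.0]

def wall_distance(i):
    """Distance estimate at interior grid point i via the Poisson formula."""
    grad = (phi[i + 1] - phi[i - 1]) / (2.0 * h)
    return -abs(grad) + math.sqrt(grad * grad + 2.0 * phi[i])

d_mid = wall_distance(n // 2)          # exact answer at mid-channel: 0.5
```

Because the exact phi here is the quadratic x(1 - x)/2, the second-order stencil reproduces it at the nodes, so the only error left is the unconverged Jacobi residual.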
NASA Astrophysics Data System (ADS)
Inkoom, J. N.; Nyarko, B. K.
2014-12-01
The integration of geographic information systems (GIS) and agent-based modelling (ABM) can be an efficient tool to improve spatial planning practices. This paper utilizes GIS and ABM approaches to simulate spatial growth patterns of settlement structures in Shama. A preliminary household survey on residential location decision-making choices served as the behavioural rule for household agents in the model. Physical environment properties of the model were extracted from a 2005 image and implemented in NetLogo. The resulting growth pattern model was compared with empirical growth patterns to ascertain the model's accuracy. The paper establishes that the development of unplanned structures and their evolving structural pattern are a function of land price, proximity to economic centres, household economic status and location decision-making patterns. The application of the proposed model underlines its potential for integration into urban planning policies and practices, and for understanding residential decision-making processes in emerging cities in developing countries. Key Words: GIS; Agent-based modelling; Growth patterns; NetLogo; Location decision making; Computational Intelligence.
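As a hedged illustration of the kind of behavioural rule such household agents might follow (the weights, prices, and cells below are invented, not the survey-derived rule from the Shama model):

```python
# Toy residential-location choice rule: each household agent filters cells it
# can afford, then picks the one minimizing a weighted cost of land price and
# distance to the economic centre. All data and weights are hypothetical.

def choose_cell(agent_budget, cells, w_price=1.0, w_dist=0.5):
    """Pick the affordable cell with the lowest weighted cost score."""
    affordable = [c for c in cells if c["price"] <= agent_budget]
    return min(affordable, key=lambda c: w_price * c["price"] + w_dist * c["dist_to_centre"])

cells = [
    {"id": 1, "price": 30.0, "dist_to_centre": 2.0},
    {"id": 2, "price": 10.0, "dist_to_centre": 8.0},
    {"id": 3, "price": 50.0, "dist_to_centre": 0.5},
]
chosen = choose_cell(agent_budget=40.0, cells=cells)   # cell 2: cheap outweighs distance
```

Varying the weights shifts the trade-off: raising `w_dist` relative to `w_price` pushes wealthy agents toward the economic centre, which is the kind of emergent sorting such models study.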
NASA Astrophysics Data System (ADS)
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
Undecidability in macroeconomics
NASA Technical Reports Server (NTRS)
Chandra, Siddharth; Chandra, Tushar Deepak
1993-01-01
In this paper we study the difficulty of solving problems in economics. For this purpose, we adopt the notion of undecidability from recursion theory. We show that certain problems in economics are undecidable, i.e., cannot be solved by a Turing Machine, a device that is at least as powerful as any computational device that can be constructed. In particular, we prove that even in finite closed economies subject to a variable initial condition, in which a social planner knows the behavior of every agent in the economy, certain important social planning problems are undecidable. Thus, it may be impossible to make effective policy decisions. Philosophically, this result formally brings into question the Rational Expectations Hypothesis, which assumes that each agent is able to determine what it should do if it wishes to maximize its utility. We show that even when an optimal rational forecast exists for each agent (based on the information currently available to it), agents may lack the ability to make these forecasts. For example, Lucas describes economic models as 'mechanical, artificial world(s), populated by ... interacting robots'. Since any mechanical robot can be at most as computationally powerful as a Turing Machine, such economies are vulnerable to the phenomenon of undecidability.
Metrics for Uncertainty in Organizational Decision-Making
2006-06-01
NASA Technical Reports Server (NTRS)
Matsuda, Y.
1974-01-01
A low-noise plasma simulation model is developed and applied to a series of linear and nonlinear problems associated with electrostatic wave propagation in a one-dimensional, collisionless, Maxwellian plasma, in the absence of a magnetic field. It is demonstrated that use of the hybrid simulation model allows economical studies to be carried out in both the linear and nonlinear regimes with better quantitative results, for comparable computing time, than can be obtained by conventional particle simulation models or direct solution of the Vlasov equation. The characteristics of the hybrid simulation model itself are first investigated, and it is shown to be capable of verifying the theoretical linear dispersion relation at wave energy levels as low as 10^-6 of the plasma thermal energy. Having established the validity of the hybrid simulation model, it is then used to study the nonlinear dynamics of a monochromatic wave, sideband instability due to trapped particles, and satellite growth.
2009-10-09
Capability of the People's Republic of China to Conduct Cyber Warfare and Computer Network Exploitation. Prepared for the US-China Economic and Security Review Commission.
PREFACE: International Conference on Applied Sciences 2015 (ICAS2015)
NASA Astrophysics Data System (ADS)
Lemle, Ludovic Dan; Jiang, Yiwen
2016-02-01
The International Conference on Applied Sciences ICAS2015 took place in Wuhan, China on June 3-5, 2015 at the Military Economics Academy of Wuhan. The conference is regularly organized, alternately in Romania and in P.R. China, by Politehnica University of Timişoara, Romania, and the Military Economics Academy of Wuhan, P.R. China, with the joint aims of serving as a platform for the exchange of information between various areas of applied sciences, and of promoting communication between scientists of different nations, countries and continents. The topics of the conference cover a comprehensive spectrum of issues, from Economical Sciences and Defense (Management Sciences, Business Management, Financial Management, Logistics, Human Resources, Crisis Management, Risk Management, Quality Control, Analysis and Prediction, Government Expenditure, Computational Methods in Economics, Military Sciences, National Security, and others) to Fundamental Sciences and Engineering (Interdisciplinary applications of physics, Numerical approximation and analysis, Computational Methods in Engineering, Metallic Materials, Composite Materials, Metal Alloys, Metallurgy, Heat Transfer, Mechanical Engineering, Mechatronics, Reliability, Electrical Engineering, Circuits and Systems, Signal Processing, Software Engineering, Data Bases, Modeling and Simulation, and others). The conference gathered qualified researchers whose expertise can be used to develop new engineering knowledge with applicability potential in Engineering, Economics, Defense, etc. The number of participants was 120 from 11 countries (China, Romania, Taiwan, Korea, Denmark, France, Italy, Spain, USA, Jamaica, and Bosnia and Herzegovina). During the three days of the conference, four invited and 67 oral talks were delivered. Based on the work presented at the conference, 38 selected papers have been included in this volume of IOP Conference Series: Materials Science and Engineering.
These papers present new research in the various fields of Materials Engineering, Mechanical Engineering, Computers Engineering, and Electrical Engineering. It's our great pleasure to present this volume of IOP Conference Series: Materials Science and Engineering to the scientific community to promote further research in these areas. We sincerely hope that the papers published in this volume will contribute to the advancement of knowledge in the respective fields.
Energy, economic growth, and equity in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kannan, N.P.
1979-01-01
Decades of economic growth in the United States, although improving the lot of many, have failed to solve the problem of poverty. Islands of acute poverty persist amidst affluence even today, invalidating the conventional wisdom that a growing economy lifts everyone. For better or for worse, we have relied mainly on energy-dependent economic growth to solve the problem of poverty, and the insidious energy crisis that confronts us today threatens this economic growth and the dream of an equitable society. For this reason it is important to consider all the potential consequences of energy policies that are designed to help achieve energy self-sufficiency. In this study, alternate energy policies are identified and compared for their relative degrees of potential trade-offs. The evaluation of the policies is carried out with the aid of two computer simulation models, ECONOMY1 and FOSSIL1, which are designed to capture the interactions between the energy sector and the rest of the economy of the United States. The study proposes an alternate set of hypotheses that emphasize the dynamics of social conflict over distributive shares in the economy; the ECONOMY1 model is based on these hypotheses. 103 references, 79 figures, 16 tables.
Quantum Gauss-Jordan Elimination and Simulation of Accounting Principles on Quantum Computers
NASA Astrophysics Data System (ADS)
Diep, Do Ngoc; Giang, Do Hoang; Van Minh, Nguyen
2017-06-01
The paper is devoted to a version of Quantum Gauss-Jordan Elimination and its applications. In the first part, we construct the Quantum Gauss-Jordan Elimination (QGJE) algorithm and estimate the complexity of computing the Reduced Row Echelon Form (RREF) of N × N matrices. The main result asserts that QGJE has computation time of order 2^(N/2). The second part is devoted to a new idea of simulating accounting by quantum computing. We first express the actual accounting principles in purely mathematical language, and then simulate the accounting principles on quantum computers. We show that all accounting actions are exhausted by the described basic actions. The main problems of accounting reduce to systems of linear equations in Leontief's economic model. In this simulation, we use our Quantum Gauss-Jordan Elimination to solve the problems, with a complexity that is a square-root order faster than in classical computing.
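For reference, the classical (non-quantum) Gauss-Jordan elimination that QGJE is compared against can be written compactly; this is the standard floating-point procedure, not the quantum algorithm from the paper.

```python
# Classical Gauss-Jordan elimination to Reduced Row Echelon Form (RREF):
# for each column, pick a pivot row, scale the pivot to 1, and clear the
# column everywhere else. Runs in O(N^3) arithmetic operations.

def rref(matrix, eps=1e-12):
    m = [row[:] for row in matrix]               # work on a copy
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        # find a row at or below pivot_row with a usable entry in this column
        sel = next((r for r in range(pivot_row, rows) if abs(m[r][col]) > eps), None)
        if sel is None:
            continue
        m[pivot_row], m[sel] = m[sel], m[pivot_row]
        pivot = m[pivot_row][col]
        m[pivot_row] = [x / pivot for x in m[pivot_row]]     # scale pivot to 1
        for r in range(rows):                                # clear the column elsewhere
            if r != pivot_row and abs(m[r][col]) > eps:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return m

reduced = rref([[1.0, 2.0, 3.0],
                [2.0, 4.0, 7.0]])    # -> [[1.0, 2.0, 0.0], [0.0, 0.0, 1.0]]
```

Solving the Leontief-type linear systems mentioned in the abstract amounts to running this procedure on the augmented matrix of the system.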
2013-09-12
CAPE CANAVERAL, Fla. – Tracey Kickbusch, chief of computational sciences at NASA's Kennedy Space Center in Florida, discusses modeling and simulations with attendees at the Technology Transfer Forum of the Economic Development Commission of Florida's Space Coast. A goal of the session was to showcase ways commercial businesses can work with NASA to develop technology and apply existing technology to commercial uses. Photo credit: NASA/Glenn Benson
Finite-Length Line Source Superposition Model (FLLSSM)
NASA Astrophysics Data System (ADS)
1980-03-01
A linearized thermal conduction model was developed to economically determine media temperatures in geologic repositories for nuclear wastes. Individual canisters containing either high level waste or spent fuel assemblies were represented as finite length line sources in a continuous medium. The combined effects of multiple canisters in a representative storage pattern were established at selected points of interest by superposition of the temperature rises calculated for each canister. The methodology is outlined and the computer code FLLSSM, which performs the required numerical integrations and superposition operations, is described.
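The superposition step can be sketched with steady point sources in an infinite medium (dT = Q / (4 pi k r)); the real FLLSSM kernel is a transient finite-length line-source integral, so this is only a structural analogue, and all numbers below are invented.

```python
# Superposition of temperature rises from multiple canisters, using the
# steady point-source kernel dT = Q / (4*pi*k*r) as a stand-in for the
# finite-length line-source integral evaluated by FLLSSM.
import math

def temperature_rise(point, sources, k=2.5):
    """Sum the rise at `point` from each source tuple (x, y, z, watts)."""
    total = 0.0
    for x, y, z, q in sources:
        r = math.dist(point, (x, y, z))
        total += q / (4.0 * math.pi * k * r)
    return total

# hypothetical 3-canister row, 10 m apart, 500 W each; k in W/(m*K)
canisters = [(-10.0, 0.0, 0.0, 500.0),
             (0.0, 0.0, 0.0, 500.0),
             (10.0, 0.0, 0.0, 500.0)]
rise = temperature_rise((0.0, 5.0, 0.0), canisters)
```

Because the governing equation is linear, the rise from the full storage pattern is exactly the sum of the single-canister rises, which is what makes the superposition approach economical.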
Hamiltonian and potentials in derivative pricing models: exact results and lattice simulations
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Corianò, Claudio; Srikant, Marakani
2004-03-01
The pricing of options, warrants and other derivative securities is one of the great successes of financial economics. These financial products can be modeled and simulated using quantum mechanical instruments based on a Hamiltonian formulation. We show here some applications of these methods for various potentials, which we have simulated via lattice Langevin and Monte Carlo algorithms, to the pricing of options. We focus on barrier or path-dependent options, showing in some detail the computational strategies involved.
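A minimal Monte Carlo pricing of a path-dependent (down-and-out barrier) option under geometric Brownian motion gives a feel for the computations involved; the paper's Hamiltonian and lattice Langevin machinery is not reproduced here, and all parameters are illustrative.

```python
# Monte Carlo price of a down-and-out call: simulate GBM paths, knock the
# option out if the barrier is touched, discount the surviving payoffs.
import math
import random

def down_and_out_call(s0, strike, barrier, r, sigma, T, steps=100, paths=5000, seed=1):
    rng = random.Random(seed)
    dt = T / steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    payoff_sum = 0.0
    for _ in range(paths):
        s, alive = s0, True
        for _ in range(steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            if s <= barrier:           # path-dependent knock-out condition
                alive = False
                break
        if alive:
            payoff_sum += max(s - strike, 0.0)
    return math.exp(-r * T) * payoff_sum / paths

price = down_and_out_call(s0=100.0, strike=100.0, barrier=80.0,
                          r=0.05, sigma=0.2, T=1.0)
```

With these inputs the estimate lands near the Black-Scholes vanilla value (about 10.45) minus a small knock-out discount; raising `steps` moves the discretely monitored barrier toward the continuous-monitoring limit.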
Dynamic analysis using superelements for a large helicopter model
NASA Technical Reports Server (NTRS)
Patel, M. P.; Shah, L. C.
1978-01-01
Using superelements (substructures), modal and frequency response analyses were performed for a large model of the Advanced Attack Helicopter developed for the U.S. Army. A whiffletree concept was employed so that the residual structure, along with the various superelements, could be represented as beam-like structures for economical and accurate dynamic analysis. A very large DMAP alter to the rigid format was developed so that the modal analysis, the frequency response, and the strain energy in each component could be computed in the same run.
Computer programs for estimating civil aircraft economics
NASA Technical Reports Server (NTRS)
Maddalon, D. V.; Molloy, J. K.; Neubawer, M. J.
1980-01-01
Computer programs for calculating airline direct operating cost, indirect operating cost, and return on investment were developed to provide a means for determining commercial aircraft life cycle cost and economic performance. A representative wide body subsonic jet aircraft was evaluated to illustrate use of the programs.
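In the same spirit, a toy (undiscounted) return-on-investment calculation shows the shape of such a program's output; the cost categories and all numbers are invented, not NASA's actual method.

```python
# Toy airline economics: undiscounted ROI from revenue, direct operating
# cost (DOC), indirect operating cost (IOC), and the aircraft investment.

def return_on_investment(annual_revenue, annual_doc, annual_ioc, investment, years=15):
    """Cumulative profit over the evaluation period, per dollar invested."""
    annual_profit = annual_revenue - annual_doc - annual_ioc
    return annual_profit * years / investment

roi = return_on_investment(annual_revenue=40e6, annual_doc=22e6,
                           annual_ioc=12e6, investment=60e6)   # -> 1.5
```

Real life-cycle models discount future cash flows and split DOC into fuel, crew, maintenance, depreciation and insurance; the undiscounted version above only shows how the three cost streams combine.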
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1978-01-01
This FORTRAN program predicts the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end-point and hydrogen-content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case-study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables that change from the base case. The report provides sufficient detail for the information of most readers.
Computational Modeling in Plasma Processing for 300 mm Wafers
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1997-01-01
Migration toward 300 mm wafer size has been initiated recently due to process economics and to meet future demands for integrated circuits. A major issue facing the semiconductor community at this juncture is the development of suitable processing equipment, for example, plasma processing reactors that can accommodate 300 mm wafers. In this Invited Talk, scaling of reactors will be discussed with the aid of computational fluid dynamics results. We have undertaken reactor simulations using CFD with reactor geometry, pressure, and precursor flow rates as parameters in a systematic investigation. These simulations provide guidelines for scaling up in reactor design.
NASA Astrophysics Data System (ADS)
Aksenova, Olesya; Nikolaeva, Evgenia; Cehlár, Michal
2017-11-01
This work investigates the effectiveness of mathematical and three-dimensional computer modeling tools in planning the processes of fuel and energy complexes at the planning and design phase of a thermal power plant (TPP). A solution for the purification of gas emissions at the design phase of waste treatment systems is proposed, employing mathematical and three-dimensional computer modeling: the E-nets apparatus and the development of a 3D model of the future gas emission purification system. This makes it possible to visualize the designed result, to select and scientifically justify an economically feasible technology, and to ensure a high environmental and social effect of the developed waste treatment system. The authors present a treatment of the planned technological processes and the gas emission purification system in terms of E-nets, using mathematical modeling in the Simulink application, which allowed a model of the device to be created from the library of standard blocks and the calculations to be performed. A three-dimensional model of the gas emission purification system has been constructed; it makes it possible to visualize the technological processes, compare them with the theoretical calculations at the design phase of a TPP, and, if necessary, make adjustments.
The Laboratory-Based Economics Curriculum.
ERIC Educational Resources Information Center
King, Paul G.; LaRoe, Ross M.
1991-01-01
Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…
Modeling and optimization of a hybrid solar combined cycle (HYCS)
NASA Astrophysics Data System (ADS)
Eter, Ahmad Adel
2011-12-01
The main objective of this thesis is to investigate the feasibility of integrating concentrated solar power (CSP) technology with conventional combined cycle technology for electric generation in Saudi Arabia. The generated electricity can be used locally to meet the annually increasing demand; specifically, it can be utilized to meet demand during the hours of 10 am-3 pm and prevent blackout hours in some industrial sectors. The proposed CSP design gives flexibility in system operation, since it works as a conventional combined cycle during the night and switches to a hybrid solar combined cycle during the day. The first objective of the thesis is to develop a thermo-economic mathematical model that can simulate the performance of a hybrid solar-fossil fuel combined cycle. The second objective is to develop a computer simulation code that can solve this model using available software such as EES. The developed simulation code is used to analyze the thermo-economic performance of different configurations for integrating CSP with the conventional fossil fuel combined cycle, in order to identify the optimal integration configuration. This optimal configuration is investigated further to achieve the optimal design of the solar field and the optimal solar share. Thermo-economic performance metrics from the literature are used to assess the investigated configurations, and the economic and environmental impacts of integrating CSP with the conventional fossil fuel combined cycle are estimated and discussed. Finally, the optimal integration configuration is found to be solarization of the steam side of the conventional combined cycle with a solar multiple of 0.38, which requires 29 hectares; the LEC of the HYCS is 63.17 $/MWh under Dhahran weather conditions.
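The levelized electricity cost (LEC) metric quoted above is conventionally computed by annualizing capital with a capital recovery factor; the discount rate, lifetime, and cost figures below are invented, chosen only so the result lands in the same general range as the thesis value.

```python
# Levelized electricity cost in $/MWh: annualized capital plus yearly
# O&M and fuel, divided by annual energy output. All inputs hypothetical.

def capital_recovery_factor(rate, years):
    """Fraction of capital to charge each year at the given discount rate."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lec(capital, om_per_year, fuel_per_year, mwh_per_year, rate=0.08, years=25):
    annual_capital = capital * capital_recovery_factor(rate, years)
    return (annual_capital + om_per_year + fuel_per_year) / mwh_per_year

cost = lec(capital=150e6, om_per_year=3e6, fuel_per_year=12e6,
           mwh_per_year=450_000)       # about 64.6 $/MWh with these inputs
```

Solar hybridization shifts the balance in this formula: it raises the capital term (the solar field) while cutting the fuel term, which is why the optimal solar multiple is a trade-off rather than "as large as possible".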
Economic lot sizing in a production system with random demand
NASA Astrophysics Data System (ADS)
Lee, Shine-Der; Yang, Chin-Ming; Lan, Shu-Chuan
2016-04-01
An extended economic production quantity model that copes with random demand is developed in this paper. A unique feature of the proposed study is the consideration of transient shortage during the production stage, which has not been explicitly analysed in existing literature. The considered costs include set-up cost for the batch production, inventory carrying cost during the production and depletion stages in one replenishment cycle, and shortage cost when demand cannot be satisfied from the shop floor immediately. Based on renewal reward process, a per-unit-time expected cost model is developed and analysed. Under some mild condition, it can be shown that the approximate cost function is convex. Computational experiments have demonstrated that the average reduction in total cost is significant when the proposed lot sizing policy is compared with those with deterministic demand.
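For context, the deterministic economic production quantity that the paper generalizes has a closed form; the demand, production, and cost numbers below are illustrative only, and the paper's stochastic-demand and shortage terms are omitted.

```python
# Classical EPQ: demand rate d, production rate p > d, setup cost K per
# batch, holding cost h per unit per unit time. The optimal lot size
# balances setup cost against inventory carrying cost.
import math

def epq(d, p, K, h):
    """Return (optimal lot size, minimum cost per unit time)."""
    rho = 1.0 - d / p                      # fraction of demand not met straight off the line
    q_opt = math.sqrt(2.0 * K * d / (h * rho))
    cost_opt = math.sqrt(2.0 * K * d * h * rho)
    return q_opt, cost_opt

q_star, cost_star = epq(d=1000.0, p=4000.0, K=200.0, h=2.0)
```

In the paper's extension, demand is random and shortages can occur during the production stage itself, so the cost function is built from a renewal reward argument rather than this closed form; the deterministic EPQ is the benchmark its computational experiments compare against.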
GLANCE - calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE
NASA Astrophysics Data System (ADS)
Vogel, Leif; Faria, Sérgio; Markandya, Anil
2016-04-01
Current annual global estimates of premature deaths from poor air quality are in the range of 2.6-4.4 million, and 2050 projections are expected to double against 2010 levels. In Europe, annual economic burdens are estimated at around 750 bn €. Climate change will further exacerbate air pollution burdens; therefore, a better understanding of the economic impacts on human societies has become an area of intense investigation. European research efforts are being carried out within the MACC project series, which started in 2005. The outcome of this work has been integrated into a European capacity for Earth Observation, the Copernicus Atmosphere Monitoring Service (CAMS). In MACC/CAMS, key pollutant concentrations are computed at the European scale and globally by employing chemically-driven advanced transport models. The project GLANCE (calculatinG heaLth impActs of atmospheric pollutioN in a Changing climatE) aims at developing an integrated assessment model for calculating the health impacts and damage costs of air pollution at different physical scales. It combines MACC/CAMS (assimilated Earth Observations, an ensemble of chemical transport models and state-of-the-art ECMWF weather forecasting) with downscaling based on in-situ network measurements. The strengthening of modelled projections through integration with empirical evidence reduces errors and uncertainties in the health impact projections and the subsequent economic cost assessment. In addition, GLANCE will yield improved data accuracy at different time resolutions. The project is a multidisciplinary approach which brings together expertise from the natural sciences and socio-economic fields. Here, its general approach is presented together with first results for the years 2007-2012 on the European scale. The results on health impacts and economic burdens are compared to existing assessments.
Vogel, Ronald J; Ramachandran, Sulabha; Zachry, Woodie M
2003-01-01
The pharmaceutical industry employs a variety of marketing strategies that have previously been directed primarily toward physicians. However, mass media direct-to-consumer (DTC) advertising of prescription drugs has emerged as a ubiquitous promotional strategy. This article explores the economics of DTC advertising in greater depth than has been done in the past by using a 3-stage economic model to assess the pertinent literature and to show the probable effects of DTC advertising in the United States. Economics literature on the subject was searched using the Journal of Economic Literature. Health services literature was searched using computer callback devices. Spending on DTC advertising in the United States increased from $17 million in 1985 to $2.5 billion in 2000. Proponents of DTC advertising claim that it provides valuable product-related information to health care professionals and patients, may contribute to better use of medications, and helps patients take charge of their own health care. Opponents argue that DTC advertising provides misleading messages rather than well-balanced, evidence-based information. The literature is replete with opinions about the effects of prescription drug advertising on pharmaceutical drug prices and physician-prescribing patterns, but few studies have addressed the issues beyond opinion surveys. The economic literature on advertising effects in other markets, however, may provide insight. DTC advertising indirectly affects the price and the quantity of production of pharmaceuticals via its effect on changes in consumer demand.
Human systems dynamics: Toward a computational model
NASA Astrophysics Data System (ADS)
Eoyang, Glenda H.
2012-09-01
A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high dimension, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high dimension, and nonlinear conceptual model of the complex dynamics of human systems.
Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS
NASA Technical Reports Server (NTRS)
Callegari, Andres C.
1990-01-01
This paper addresses how to combine CLIPS with graphics and how to overcome the PC's memory limitations by using the extended memory available in the computer. By adding graphics and extended memory capabilities, CLIPS can be converted into a complete and powerful system development tool on the most economical and popular computer platform. New models of PCs have processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest. CLIPS is a powerful expert system development tool, but it is not complete without the support of a graphics package needed to create user interfaces and general-purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation of the PC is its use of real memory, which restricts CLIPS to only 640 KB; that problem can now be solved by developing a version of CLIPS that uses extended memory. The user then has access to up to 16 MB of memory on 80286-based computers and practically all available memory (4 GB) on computers that use the 80386 processor. Given a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, together with access to the computer's extended memory (with no special hardware needed), the user will be able to create more powerful systems at a fraction of the cost on the most popular, portable, and economical platform available: the PC.
The ultimatum game: Discrete vs. continuous offers
NASA Astrophysics Data System (ADS)
Dishon-Berkovits, Miriam; Berkovits, Richard
2014-09-01
In many experimental setups in the social sciences, psychology, and economics, subjects are asked to accept or dispense monetary compensation, which is usually given in discrete units. Using computational and mathematical modeling we show that, in the framework of studying the dynamics of acceptance of proposals in the ultimatum game, the long-time dynamics of acceptance are completely different for discrete versus continuous offers. For discrete values the dynamics follow an exponential behavior, whereas for continuous offers the dynamics are described by a power law. This is shown using an agent-based computer simulation as well as an analytical solution of a mean-field equation describing the model. These findings have implications for the design and interpretation of socio-economic experiments beyond the ultimatum game.
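A minimal agent-based sketch of the kind of acceptance dynamics studied here (a toy model with assumed threshold-update rules and parameters, not the authors' simulation):

```python
import random

def simulate_acceptance(discrete=True, agents=200, rounds=300, seed=1):
    """Toy ultimatum-game dynamic (illustrative only, not the paper's model).
    Responders keep an acceptance threshold; proposers draw random offers.
    Discrete offers are restricted to tenths of the pie; continuous offers
    may take any value in [0, 0.5]."""
    rng = random.Random(seed)
    thresholds = [rng.uniform(0.0, 0.5) for _ in range(agents)]
    acceptance_rate = []
    for _ in range(rounds):
        accepted = 0
        for i in range(agents):
            offer = rng.uniform(0.0, 0.5)
            if discrete:
                offer = round(offer * 10) / 10  # offers in 0.1 units
            if offer >= thresholds[i]:
                accepted += 1
                # accepted offers pull the threshold toward the offer
                thresholds[i] += 0.05 * (offer - thresholds[i])
            else:
                thresholds[i] *= 0.95  # rejected: demand a little less next time
        acceptance_rate.append(accepted / agents)
    return acceptance_rate

rates_discrete = simulate_acceptance(discrete=True)
rates_continuous = simulate_acceptance(discrete=False)
```

Comparing how `rates_discrete` and `rates_continuous` approach their long-run levels is the kind of relaxation-dynamics question (exponential versus power law) the abstract addresses.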
Sacramento's parking lot shading ordinance: environmental and economic costs of compliance
E.G. McPherson
2001-01-01
A survey of 15 Sacramento parking lots and computer modeling were used to evaluate parking capacity and compliance with the 1983 ordinance requiring 50% shade of paved areas (PA) 15 years after development. There were 6% more parking spaces than required by ordinance, and 36% were vacant during peak use periods. Current shade was 14% with 44% of this amount provided by...
Solar thermal heating and cooling. A bibliography with abstracts
NASA Technical Reports Server (NTRS)
Arenson, M.
1979-01-01
This bibliographic series cites and abstracts the literature and technical papers on the heating and cooling of buildings with solar thermal energy. Over 650 citations are arranged in the following categories: space heating and cooling systems; space heating and cooling models; building energy conservation; architectural considerations; thermal load computations; thermal load measurements; domestic hot water; solar and atmospheric radiation; swimming pools; and economics.
Yaoxiang Li; Chris B. LeDoux; Jingxin Wang
2006-01-01
The effects of variable width of streamside management zones (25, 50, 75, and 100 ft) (SMZs) and removal level of trees (10%, 30%, and 50% of basal area) on production and cost of implementing SMZs in central Appalachian hardwood forests were simulated by using a computer model. Harvesting operations were performed on an 80-year-old generated natural hardwood stand...
Economic evaluation of long-term impacts of universal newborn hearing screening.
Chiou, Shu-Ti; Lung, Hou-Ling; Chen, Li-Sheng; Yen, Amy Ming-Fang; Fann, Jean Ching-Yuan; Chiu, Sherry Yueh-Hsia; Chen, Hsiu-Hsi
2017-01-01
Little is known about the long-term efficacy and economic impacts of universal newborn hearing screening (UNHS). An analytical Markov decision model was framed with two screening strategies, UNHS with the transient evoked otoacoustic emission (TEOAE) test and the automated auditory brainstem response (aABR) test, against no screening. By estimating intervention costs, long-term treatment costs, productivity losses, and the utility of life years determined by hearing-loss status, we computed base-case estimates of the incremental cost-utility ratios (ICURs). The scatter plot of ICURs and the acceptability curve were used to assess the economic results of aABR versus TEOAE, and of both versus no screening, in a hypothetical cohort of 200,000 Taiwanese newborns. TEOAE and aABR dominated the no-screening strategy (ICUR = -$4,800.89 and -$4,111.23, indicating less cost and more utility). Given a willingness to pay (WTP) of $20,000, the probability that aABR is cost-effective against TEOAE was up to 90%. UNHS for hearing loss with aABR is the most economical option and is supported by evidence-based economic evaluation from a societal perspective.
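The ICUR arithmetic reported here is the cost difference divided by the utility difference between strategies; a sketch with hypothetical figures (not the study's inputs):

```python
def icur(cost_new, utility_new, cost_ref, utility_ref):
    """Incremental cost-utility ratio: extra cost per extra unit of utility.
    A negative ICUR with higher utility means the new strategy dominates
    (less cost, more utility)."""
    d_cost = cost_new - cost_ref
    d_utility = utility_new - utility_ref
    return d_cost / d_utility

# Hypothetical cohort totals for illustration only: screening costs less
# overall and yields 1,000 more utility units than no screening.
ratio = icur(cost_new=9.5e6, utility_new=5.600e6,
             cost_ref=1.2e7, utility_ref=5.599e6)
```

Here `d_cost` is -$2.5 million and `d_utility` is 1,000, so the ratio is negative: the screening strategy is dominant in the sense the abstract describes.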
Spruijt-Metz, Donna; Hekler, Eric; Saranummi, Niilo; Intille, Stephen; Korhonen, Ilkka; Nilsen, Wendy; Rivera, Daniel E; Spring, Bonnie; Michie, Susan; Asch, David A; Sanna, Alberto; Salcedo, Vicente Traver; Kukakfa, Rita; Pavel, Misha
2015-09-01
Adverse and suboptimal health behaviors and habits are responsible for approximately 40 % of preventable deaths, in addition to their unfavorable effects on quality of life and economics. Our current understanding of human behavior is largely based on static "snapshots" of human behavior, rather than ongoing, dynamic feedback loops of behavior in response to ever-changing biological, social, personal, and environmental states. This paper first discusses how new technologies (i.e., mobile sensors, smartphones, ubiquitous computing, and cloud-enabled processing/computing) and emerging systems modeling techniques enable the development of new, dynamic, and empirical models of human behavior that could facilitate just-in-time adaptive, scalable interventions. The paper then describes concrete steps to the creation of robust dynamic mathematical models of behavior including: (1) establishing "gold standard" measures, (2) the creation of a behavioral ontology for shared language and understanding tools that both enable dynamic theorizing across disciplines, (3) the development of data sharing resources, and (4) facilitating improved sharing of mathematical models and tools to support rapid aggregation of the models. We conclude with the discussion of what might be incorporated into a "knowledge commons," which could help to bring together these disparate activities into a unified system and structure for organizing knowledge about behavior.
Okeno, Tobias O; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J
2013-01-01
A bio-economic model was developed to evaluate the utilisation of indigenous chickens (IC) under different production systems accounting for the risk attitude of the farmers. The model classified the production systems into three categories based on the level of management: free-range system (FRS), where chickens were left to scavenge for feed resources with no supplementation and healthcare; intensive system (IS), where the chickens were permanently confined and supplied with rationed feed and healthcare; and semi-intensive system (SIS), a hybrid of FRS and IS, where the chickens were partially confined, supplemented with rationed feeds, provided with healthcare and allowed to scavenge within the homestead or in runs. The model allows prediction of the live weights and feed intake at different stages in the life cycle of the IC and can compute the profitability of each production system using both traditional and risk-rated profit models. The input parameters used in the model represent a typical IC production system in developing countries but are flexible and therefore can be modified to suit specific situations and simulate profitability and costs of other poultry species production systems. The model can derive economic values, as changes in the genetic merit of a biological parameter result in marginal changes in the profitability and costs of the production systems. The results suggested that utilisation of IC at their current genetic merit and in their current production environment is more profitable under FRS and SIS but not economically viable under IS.
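One common way to "risk-rate" profit is a mean-variance certainty equivalent; a sketch under that assumption (the paper's exact risk-rated profit model may differ, and all figures are hypothetical):

```python
def risk_rated_profit(expected_profit, profit_variance, risk_aversion):
    """Certainty-equivalent profit under mean-variance preferences:
    expected profit minus a risk premium proportional to variance.
    A generic sketch, not necessarily the authors' formulation."""
    return expected_profit - 0.5 * risk_aversion * profit_variance

# Hypothetical per-bird (expected profit, profit variance) for three systems
systems = {"FRS": (2.0, 1.5), "SIS": (3.0, 4.0), "IS": (2.5, 9.0)}
ranked = sorted(systems,
                key=lambda s: risk_rated_profit(*systems[s], risk_aversion=0.4),
                reverse=True)
```

With these invented numbers the high-variance intensive system ranks last once risk is priced in, illustrating how a risk-rated profit model can reorder production systems relative to expected profit alone.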
A generic hydroeconomic model to assess future water scarcity
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice
2015-04-01
We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Crop irrigation requirements are then computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield-comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Bondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists of building three-block inverse demand functions in which the volume limits of the blocks evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water cost data. Projected demands are then confronted with future water availability.
Operating rules of the reservoirs and water allocation between demands are based on the maximization of water benefits, over time and space. A parameterisation-simulation-optimisation approach is used. This gives a projection of future water scarcity in the different locations and an estimation of the associated direct economic losses from unsatisfied demands. This generic hydroeconomic model can be easily applied to large-scale regions, in particular developing regions where little reliable data is available. We will present an application to Algeria, up to the 2050 horizon.
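The Hargreaves method mentioned above estimates reference evapotranspiration from temperature and extraterrestrial radiation alone; a sketch (the input values are illustrative, not the study's data):

```python
import math

def hargreaves_et0(t_mean, t_max, t_min, ra):
    """Hargreaves reference evapotranspiration (mm/day).
    ra is extraterrestrial radiation expressed in mm/day of evaporation
    equivalent; temperatures are in degrees Celsius."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Illustrative mid-latitude summer day (assumed values)
et0 = hargreaves_et0(t_mean=25.0, t_max=32.0, t_min=18.0, ra=16.0)
```

Because it needs only temperature extremes and computable radiation, the method suits the globally available data constraint the abstract emphasizes.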
Assessment of environmental impacts following alternative agricultural policy scenarios.
Bárlund, I; Lehtonen, H; Tattari, S
2005-01-01
Finnish agriculture is likely to undergo major changes in the near and intermediate future. The future policy context can be examined at a general level by strategic scenario building. Computer-based modelling in combination with agricultural policy scenarios can in turn create a basis for assessments of changes in environmental quality following possible changes in Finnish agriculture. The analysis of economic consequences is based on the DREMFIA model, which is applied to study the effects of various agricultural policies on land use, animal production, and farmers' income. The model is suitable for an impact analysis covering an extended time span, here up to the year 2015. The changes in land use, obtained with the DREMFIA model assuming rational economic behaviour, form the basis for evaluating the environmental impacts of different agricultural policies. The environmental impact assessment is performed using the field-scale nutrient transport model ICECREAM. The modelled variables are nitrogen and phosphorus losses in surface runoff and percolation. In this paper the modelling strategy is presented and illustrated using two case study catchments with varying environmental conditions and land use. In addition, the paper identifies issues arising when connecting policy scenarios with impact modelling.
Cao, Qi; Buskens, Erik; Feenstra, Talitha; Jaarsma, Tiny; Hillege, Hans; Postmus, Douwe
2016-01-01
Continuous-time state transition models may end up with large, unwieldy structures when trying to represent all relevant stages of clinical disease processes by means of a standard Markov model. In such situations, a more parsimonious, and therefore easier-to-grasp, model of a patient's disease progression can often be obtained by assuming that future state transitions depend not only on the present state (the Markov assumption) but also on the past, through the time since entry into the present state. Although these so-called semi-Markov models are relatively straightforward to specify and implement, they are not yet routinely applied in health economic evaluation to assess the cost-effectiveness of alternative interventions. To facilitate a better understanding of this type of model among applied health economic analysts, the first part of this article provides a detailed discussion of what the semi-Markov model entails and how such models can be specified in an intuitive way by adopting an approach called vertical modeling. In the second part of the article, we use this approach to construct a semi-Markov model for assessing the long-term cost-effectiveness of 3 disease management programs for heart failure. Compared with a standard Markov model with the same disease states, our proposed semi-Markov model fitted the observed data much better. When subsequently extrapolating beyond the clinical trial period, these relatively large differences in goodness-of-fit translated into almost a doubling in mean total cost and a 60-day decrease in mean survival time when using the Markov model instead of the semi-Markov model. For the disease process considered in our case study, the semi-Markov model thus provided a sensible balance between model parsimoniousness and computational complexity. © The Author(s) 2015.
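The semi-Markov feature, exit hazards that depend on time since state entry, can be illustrated with a toy three-state simulation (states, rates, and distributions are assumptions for illustration, not the article's heart-failure model):

```python
import random

def simulate_semi_markov(n=2000, seed=7):
    """Toy three-state model (Well -> Sick -> Dead). The sojourn time in
    'Sick' follows a Weibull distribution with shape > 1, so the exit
    hazard rises with time since entry -- the duration dependence a
    standard (memoryless) Markov model cannot capture."""
    rng = random.Random(seed)
    survival = []
    for _ in range(n):
        t = rng.expovariate(1 / 5.0)        # memoryless sojourn in Well (mean 5)
        t += rng.weibullvariate(2.0, 1.5)   # duration-dependent sojourn in Sick
        survival.append(t)
    return sum(survival) / n

mean_survival = simulate_semi_markov()
```

Replacing the Weibull sojourn with an exponential one recovers an ordinary Markov model; comparing mean survival under both variants mimics, in miniature, the Markov-versus-semi-Markov extrapolation differences the article reports.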
Estimating HIV Prevalence in Zimbabwe Using Population-Based Survey Data
Chinomona, Amos; Mwambi, Henry Godwell
2015-01-01
Estimates of HIV prevalence computed using data obtained from sampling a subgroup of the national population may lack the representativeness of all the relevant domains of the population. These estimates are often computed on the assumption that HIV prevalence is uniform across all domains of the population. Use of appropriate statistical methods together with population-based survey data can enhance estimation of national and subgroup-level HIV prevalence and can provide improved explanations of the variation in HIV prevalence across different domains of the population. In this study we computed design-consistent estimates of HIV prevalence, and their respective 95% confidence intervals, at both the national and subgroup levels. In addition, we provided a multivariable survey logistic regression model, from a generalized linear modelling perspective, for explaining the variation in HIV prevalence using demographic, socio-economic, socio-cultural and behavioural factors. Essentially, this study borrows from the proximate determinants conceptual framework, in which socio-economic and socio-cultural variables affect HIV prevalence through proximate biological and behavioural factors. We utilize the 2010–11 Zimbabwe Demographic and Health Survey (2010–11 ZDHS) data (which are population based) to estimate HIV prevalence in different categories of the population and to construct the logistic regression model. It was established that HIV prevalence varies greatly with age, gender, marital status, place of residence, literacy level, belief about whether condom use can reduce the risk of contracting HIV, and level of recent sexual activity, whereas there was no marked variation in HIV prevalence with social status (measured using a wealth index), method of contraception, or an individual's level of education. PMID:26624280
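A design-weighted prevalence estimate of the kind described can be sketched as follows (a simplified weighted estimate with a Kish effective-sample-size approximation, not the full stratified and clustered variance estimation that survey software performs; the data are hypothetical):

```python
import math

def weighted_prevalence(cases, weights, z=1.96):
    """Design-weighted prevalence with an approximate 95% CI.
    cases: 0/1 outcomes; weights: sampling weights."""
    wsum = sum(weights)
    p = sum(w * y for w, y in zip(weights, cases)) / wsum
    # effective sample size under unequal weights (Kish approximation)
    n_eff = wsum ** 2 / sum(w * w for w in weights)
    se = math.sqrt(p * (1 - p) / n_eff)
    return p, (p - z * se, p + z * se)

# Hypothetical respondents: 1 = seropositive, with unequal sampling weights
cases = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
weights = [1.2, 0.8, 1.0, 1.5, 0.9, 1.1, 1.0, 0.7, 1.3, 0.5]
p, ci = weighted_prevalence(cases, weights)
```

Ignoring the weights here would bias the estimate whenever prevalence varies across the domains that drove the sampling design, which is precisely the abstract's point about design-consistent estimation.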
NASA Technical Reports Server (NTRS)
Forney, J. A.; Walker, D.; Lanier, M.
1979-01-01
The computer program SHCOST was used to perform economic analyses of operational test sites. The program allows consideration of the economic parameters that are important to the user of a solar heating system. A life-cycle cost and cash-flow comparison is made between a solar heating system and a conventional system. The program assists in sizing the solar heating system, and its sensitivity-study and plotting capabilities allow the user to select the most cost-effective system configuration.
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.
2008-12-01
Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinct from, and more difficult than, non-collaborative efforts because of the large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; and evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle of collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results-recording and presentation components, plus ways to couple and link to other models and tools; 2) explicitly structuring both input data and the metadata that describe data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model; 3) providing in-line documentation on model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others; and 4) flexible programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway.
The presentation concludes by identifying some future directions for collaborative modeling including geo-spatial display and analysis, real-time operations, and internet-based tools plus the design and programming needed to implement these capabilities.
ERIC Educational Resources Information Center
Henry, Mark
1979-01-01
Recounts statistical inaccuracies in an article on computer-aided instruction in economics courses on the college level. The article, published in the J. Econ. Ed (Fall 1978), erroneously placed one student in the TIPS group instead of the control group. Implications of this alteration are discussed. (DB)
Home Economics. Education for Technology Employment.
ERIC Educational Resources Information Center
Northern Illinois Univ., De Kalb. Dept. of Technology.
This guide was developed in an Illinois program to help home economics teachers integrate the use of computers and program-related software into existing programs. After students are taught the basic computer skills outlined in the beginning of the guide, 50 learning activities can be used as an integral part of the instructional program. (One or…
Computers in the Home Economics Classroom.
ERIC Educational Resources Information Center
Browning, Ruth; Durbin, Sandra
This guide for teachers focuses on how microcomputers may be used in the home economics classroom and how the computer is affecting and changing family life. A brief discussion of potential uses of the microcomputer in educational settings is followed by seven major sections. Sections 1 and 2 provide illustrations and definitions for microcomputer…
Ethical issues in engineering models: an operations researcher's reflections.
Kleijnen, J
2011-09-01
This article starts with an overview of the author's personal involvement, as an Operations Research consultant, in several engineering case studies that may raise ethical questions; e.g., case studies on nuclear waste, water management, sustainable ecology, military tactics, and animal welfare. All these case studies employ computer simulation models. In general, models are meant to solve practical problems, which may have ethical implications for the various stakeholders; namely, the modelers, the clients, and the public at large. The article further presents an overview of codes of ethics in a variety of disciplines. It discusses the role of mathematical models, focusing on the validation of these models' assumptions. Documentation of these model assumptions needs special attention. Some ethical norms and values may be quantified through the model's multiple performance measures, which might be optimized. The uncertainty about the validity of the model leads to risk or uncertainty analysis and to a search for robust models. Ethical questions may be pressing in military models, including war games. However, computer games and the related experimental economics may also provide a special tool to study ethical issues. Finally, the article briefly discusses whistleblowing. Its many references to publications and websites enable further study of ethical issues in modeling.
Carbon accounting and economic model uncertainty of emissions from biofuels-induced land use change.
Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'Hare, Michael
2015-03-03
Few of the numerous published studies of the emissions from biofuels-induced "indirect" land use change (ILUC) attempt to propagate and quantify uncertainty, and those that have done so have restricted their analysis to a portion of the modeling systems used. In this study, we pair a global, computable general equilibrium model with a model of greenhouse gas emissions from land-use change to quantify the parametric uncertainty in the paired modeling system's estimates of greenhouse gas emissions from ILUC induced by expanded production of three biofuels. We find that for the three fuel systems examined--US corn ethanol, Brazilian sugar cane ethanol, and US soybean biodiesel--95% of the results occurred within ±20 g CO2e MJ(-1) of the mean (coefficient of variation of 20-45%), with economic model parameters related to crop yield and the productivity of newly converted cropland (from forestry and pasture) contributing most of the variance in estimated ILUC emissions intensity. Although the experiments performed here allow us to characterize parametric uncertainty, changes to the model structure have the potential to shift the mean by tens of grams of CO2e per megajoule and further broaden distributions for ILUC emission intensities.
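The parametric-uncertainty approach, sampling uncertain model parameters and summarizing the spread of the resulting emissions intensity, can be sketched as follows (all parameter names, distributions, and values are invented for illustration; the study's coupled CGE and emissions models are far richer):

```python
import random
import statistics

def iluc_uncertainty(n=10000, seed=3):
    """Monte Carlo sketch of parametric uncertainty in an ILUC emissions
    intensity: converted-area carbon flux divided into extra fuel energy,
    with an uncertain yield response and land-conversion emission factor."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        yield_response = rng.uniform(0.1, 0.4)    # demand met by intensification (assumed)
        emis_factor = rng.gauss(80.0, 15.0)       # t CO2e per ha converted (assumed)
        area_per_mj = 2e-7 * (1 - yield_response) # ha converted per MJ of fuel (assumed)
        results.append(emis_factor * 1e6 * area_per_mj)  # g CO2e per MJ
    mean = statistics.mean(results)
    cv = statistics.stdev(results) / mean
    return mean, cv

mean_g, cv = iluc_uncertainty()
```

Even this toy version reproduces the study's qualitative point: the coefficient of variation of the output is driven by the relative spreads of the yield and land-conversion parameters.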
A Multi-Scale Energy Food Systems Modeling Framework For Climate Adaptation
NASA Astrophysics Data System (ADS)
Siddiqui, S.; Bakker, C.; Zaitchik, B. F.; Hobbs, B. F.; Broaddus, E.; Neff, R.; Haskett, J.; Parker, C.
2016-12-01
Our goal is to understand coupled system dynamics across scales in a manner that allows us to quantify the sensitivity of critical human outcomes (nutritional satisfaction, household economic well-being) to development strategies and to climate or market induced shocks in sub-Saharan Africa. We adopt both bottom-up and top-down multi-scale modeling approaches focusing our efforts on food, energy, water (FEW) dynamics to define, parameterize, and evaluate modeled processes nationally as well as across climate zones and communities. Our framework comprises three complementary modeling techniques spanning local, sub-national and national scales to capture interdependencies between sectors, across time scales, and on multiple levels of geographic aggregation. At the center is a multi-player micro-economic (MME) partial equilibrium model for the production, consumption, storage, and transportation of food, energy, and fuels, which is the focus of this presentation. We show why such models can be very useful for linking and integrating across time and spatial scales, as well as a wide variety of models including an agent-based model applied to rural villages and larger population centers, an optimization-based electricity infrastructure model at a regional scale, and a computable general equilibrium model, which is applied to understand FEW resources and economic patterns at national scale. The MME is based on aggregating individual optimization problems for relevant players in an energy, electricity, or food market and captures important food supply chain components of trade and food distribution accounting for infrastructure and geography. Second, our model considers food access and utilization by modeling food waste and disaggregating consumption by income and age. Third, the model is set up to evaluate the effects of seasonality and system shocks on supply, demand, infrastructure, and transportation in both energy and food.
A Decomposition Method for Security Constrained Economic Dispatch of a Three-Layer Power System
NASA Astrophysics Data System (ADS)
Yang, Junfeng; Luo, Zhiqiang; Dong, Cheng; Lai, Xiaowen; Wang, Yang
2018-01-01
This paper proposes a new decomposition method for the security-constrained economic dispatch in a three-layer large-scale power system. The decomposition is realized using two main techniques. The first is to use Ward equivalencing-based network reduction to reduce the number of variables and constraints in the high-layer model without sacrificing accuracy. The second is to develop a price response function to exchange signal information between neighboring layers, which significantly improves the information exchange efficiency of each iteration and results in fewer iterations and less computational time. Case studies based on the duplicated RTS-79 system demonstrate the effectiveness and robustness of the proposed method.
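The plain (unconstrained) economic dispatch underlying formulations like this one can be sketched with the classic equal-lambda condition for quadratic costs (the paper's security constraints, layering, and Ward reduction are not modeled here, and the generator data are hypothetical):

```python
def economic_dispatch(units, demand, tol=1e-6):
    """Equal-lambda dispatch for units with quadratic cost, i.e. marginal
    cost b_i + 2*c_i*p_i (constant cost terms do not affect the optimum).
    At the optimum every dispatched unit runs where its marginal cost equals
    the system price lambda; bisect on lambda until supply meets demand."""
    lo, hi = 0.0, 1000.0
    while hi - lo > tol:
        lam = (lo + hi) / 2
        # invert b + 2*c*p = lam  =>  p = (lam - b) / (2*c), clipped at 0
        supply = sum(max(0.0, (lam - b) / (2 * c)) for b, c in units)
        if supply < demand:
            lo = lam
        else:
            hi = lam
    lam = (lo + hi) / 2
    return lam, [max(0.0, (lam - b) / (2 * c)) for b, c in units]

# Three hypothetical generators (b, c) serving a 300 MW demand
lam, output = economic_dispatch([(10.0, 0.05), (12.0, 0.04), (15.0, 0.03)], 300.0)
```

The price response function the paper exchanges between layers plays a role analogous to this `lam`: a marginal-price signal that coordinates dispatch without sharing the full model.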
Economics of Employer-Sponsored Workplace Vaccination to Prevent Pandemic and Seasonal Influenza
Lee, Bruce Y.; Bailey, Rachel R.; Wiringa, Ann E.; Afriyie, Abena; Wateska, Angela R.; Smith, Kenneth J.; Zimmerman, Richard K.
2010-01-01
Employers may be loath to fund vaccination programs without understanding the economic consequences. We developed a decision-analytic computational simulation model, including dynamic transmission elements, that determined the cost-benefit of employer-sponsored workplace vaccination from the employer's perspective. Implementing such programs was relatively inexpensive (<$35/vaccinated employee) and, in many cases, cost saving across diverse occupational groups in all seasonal influenza scenarios. Such programs were cost saving for a 20% serologic attack rate pandemic scenario (range −$15 to −$995 per vaccinated employee) and a 30% serologic attack rate pandemic scenario (range −$39 to −$1,494 per vaccinated employee) across all age and major occupational groups. PMID:20620168
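The employer-perspective cost-benefit logic reduces to program cost minus averted illness costs; a static sketch (ignoring the transmission dynamics the authors model; all inputs are hypothetical):

```python
def net_cost_per_employee(program_cost, attack_rate, vaccine_efficacy,
                          cost_per_case):
    """Static employer-perspective cost-benefit sketch (no disease
    transmission dynamics, unlike the paper's model).
    Net cost = program cost minus illness costs averted per vaccinee;
    a negative result means the program is cost saving."""
    averted = attack_rate * vaccine_efficacy * cost_per_case
    return program_cost - averted

# A $30/employee program against a 20% attack-rate pandemic, with 40%
# vaccine efficacy and $1,500 of medical/productivity cost per case
net = net_cost_per_employee(30.0, 0.20, 0.40, 1500.0)
```

With these invented inputs the averted costs exceed the program cost, giving a negative net cost of the same sign as the pandemic-scenario ranges the abstract reports.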
Is there a metric for mineral deposit occurrence probabilities?
Drew, L.J.; Menzie, W.D.
1993-01-01
Traditionally, mineral resource assessments have been used to estimate the physical inventory of critical and strategic mineral commodities that occur in pieces of land and to assess the consequences of supply disruptions of these commodities. More recently, these assessments have been used to estimate the undiscovered mineral wealth in such pieces of land to assess the opportunity cost of using the land for purposes other than mineral production. The field of mineral resource assessment is an interdisciplinary field that draws elements from the disciplines of geology, economic geology (descriptive models), statistics and management science (grade and tonnage models), mineral economics, and operations research (computer simulation models). The purpose of this study is to assert that an occurrence-probability metric exists that is useful in "filling out" an assessment both for areas in which only a trivial probability exists that a new mining district could be present and for areas where nontrivial probabilities exist for such districts. © 1993 Oxford University Press.
The Impact of Alzheimer's Disease on the Chinese Economy.
Keogh-Brown, Marcus R; Jensen, Henning Tarp; Arrighi, H Michael; Smith, Richard D
2016-02-01
Recent increases in life expectancy may greatly expand future Alzheimer's Disease (AD) burdens. China's demographic profile, aging workforce and predicted increasing burden of AD-related care make its economy vulnerable to AD impacts. Previous economic estimates of AD predominantly focus on health system burdens and omit wider whole-economy effects, potentially underestimating the full economic benefit of effective treatment. AD-related prevalence, morbidity and mortality for 2011-2050 were simulated and were, together with associated caregiver time and costs, imposed on a dynamic Computable General Equilibrium model of the Chinese economy. Both economic and non-economic outcomes were analyzed. Simulated Chinese AD prevalence quadrupled during 2011-50 from 6-28 million. The cumulative discounted value of eliminating AD equates to China's 2012 GDP (US$8 trillion), and the annual predicted real value approaches US AD cost-of-illness (COI) estimates, exceeding US$1 trillion by 2050 (2011-prices). Lost labor contributes 62% of macroeconomic impacts. Only 10% derives from informal care, challenging previous COI-estimates of 56%. Health and macroeconomic models predict an unfolding 2011-2050 Chinese AD epidemic with serious macroeconomic consequences. Significant investment in research and development (medical and non-medical) is warranted and international researchers and national authorities should therefore target development of effective AD treatment and prevention strategies.
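The "cumulative discounted value" metric used here is a standard present-value sum over an annual cost stream; a sketch with an invented cost stream and discount rate (not the study's figures):

```python
def present_value(annual_costs, rate):
    """Cumulative discounted value of an annual cost stream, with year 0
    undiscounted. Illustrative of the metric only; the study's discount
    rate and 2011-2050 cost stream are not reproduced here."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(annual_costs))

# A hypothetical rising AD burden (US$bn/year) over five years at 3% discount
pv = present_value([100, 120, 140, 160, 180], 0.03)
```

Summing such discounted annual burdens over 2011-2050 is how a single headline figure like "equates to China's 2012 GDP" is obtained from a simulated cost trajectory.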
NASA Astrophysics Data System (ADS)
Davidsen, Claus; Liu, Suxia; Mo, Xingguo; Engelund Holm, Peter; Trapp, Stefan; Rosbjerg, Dan; Bauer-Gottwein, Peter
2015-04-01
Few studies address water quality in hydro-economic models, which often focus primarily on optimal allocation of water quantities. Water quality and water quantity are closely coupled, and optimal management with focus solely on either quantity or quality may cause large costs in terms of the other component. In this study, we couple water quality and water quantity in a joint hydro-economic catchment-scale optimization problem. Stochastic dynamic programming (SDP) is used to minimize the basin-wide total costs arising from water allocation, water curtailment and water treatment. The simple water quality module can handle conservative pollutants, first-order depletion and non-linear reactions. For demonstration purposes, we model pollutant releases as biochemical oxygen demand (BOD) and use the Streeter-Phelps equation for oxygen deficit to compute the resulting minimum dissolved oxygen concentrations. Inelastic water demands, fixed water allocation curtailment costs and fixed wastewater treatment costs (before and after use) are estimated for the water users (agriculture, industry and domestic). If the BOD concentration exceeds a given user pollution threshold, the user will need to pay for pre-treatment of the water before use. Similarly, treatment of the return flow can reduce the BOD load to the river. A traditional SDP approach is used to solve one-step-ahead sub-problems for all combinations of discrete reservoir storage, Markov chain inflow classes and monthly time steps. Pollution concentration nodes are introduced for each user group, and untreated return flow from the users contributes to increased BOD concentrations in the river. The pollutant concentrations in each node depend on multiple decision variables (allocation and wastewater treatment), rendering the objective function non-linear.
Therefore, the pollution concentration decisions are outsourced to a genetic algorithm, which calls a linear program to determine the remainder of the decision variables. This hybrid formulation keeps the optimization problem computationally feasible and represents a flexible and customizable method. The method has been applied to the Ziya River basin, an economic hotspot located on the North China Plain in Northern China. The basin is subject to severe water scarcity, and the rivers are heavily polluted with wastewater and nutrients from diffuse sources. The coupled hydro-economic optimization model can be used to assess the costs of meeting additional constraints, such as minimum water quality, or to prioritize investments in wastewater treatment facilities based on economic criteria.
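The Streeter-Phelps step named above can be sketched in a few lines. This is a minimal, illustrative implementation of the classic oxygen-deficit equations, not the authors' actual water-quality module; the saturation value and all example parameters are assumptions.

```python
import math

def streeter_phelps_deficit(t, L0, D0, kd, kr):
    """Oxygen deficit D(t) (mg/L) downstream of a BOD load, classic Streeter-Phelps.
    L0: initial BOD (mg/L), D0: initial deficit (mg/L),
    kd: deoxygenation rate (1/day), kr: reaeration rate (1/day), t in days."""
    return (kd * L0 / (kr - kd)) * (math.exp(-kd * t) - math.exp(-kr * t)) \
           + D0 * math.exp(-kr * t)

def min_dissolved_oxygen(L0, D0, kd, kr, do_sat=9.0):
    """Minimum DO along the reach: the deficit peaks at the critical time t_c."""
    tc = (1.0 / (kr - kd)) * math.log((kr / kd) * (1.0 - D0 * (kr - kd) / (kd * L0)))
    return do_sat - streeter_phelps_deficit(tc, L0, D0, kd, kr)
```

In an optimization loop like the one described, `min_dissolved_oxygen` would be evaluated per river node per time step, with `L0` driven by the (un)treated return flows.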
Hybrid and electric advanced vehicle systems (heavy) simulation
NASA Technical Reports Server (NTRS)
Hammond, R. A.; Mcgehee, R. K.
1981-01-01
A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three example models is discussed.
ExM:System Support for Extreme-Scale, Many-Task Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Daniel S
The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem solving methods and application classes. In this document, we report on combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011).
Welch, M C; Kwan, P W; Sajeev, A S M
2014-10-01
Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision support functions across a broad range of areas including biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.
Parallel approach to identifying the well-test interpretation model using a neurocomputer
NASA Astrophysics Data System (ADS)
May, Edward A., Jr.; Dagli, Cihan H.
1996-03-01
The well test is one of the primary diagnostic and predictive tools used in the analysis of oil and gas wells. In these tests, a pressure recording device is placed in the well and the pressure response is recorded over time under controlled flow conditions. The interpreted results are indicators of the well's ability to flow and the damage done to the formation surrounding the wellbore during drilling and completion. The results are used for many purposes, including reservoir modeling (simulation) and economic forecasting. The first step in the analysis is the identification of the Well-Test Interpretation (WTI) model, which determines the appropriate solution method. Mis-identification of the WTI model occurs due to noise and non-ideal reservoir conditions. Previous studies have shown that a feed-forward neural network using the backpropagation algorithm can be used to identify the WTI model. One of the drawbacks to this approach is, however, training time, which can run into days of CPU time on personal computers. In this paper a similar neural network is applied using both a personal computer and a neurocomputer. Input data processing, network design, and performance are discussed and compared. The results show that the neurocomputer greatly eases the burden of training and allows the network to outperform a similar network running on a personal computer.
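As a flavor of the kind of feed-forward, backpropagation-trained classifier described (not the authors' network, whose inputs were processed pressure-derivative features), here is a minimal one-hidden-layer implementation in pure Python, demonstrated on a toy binary task:

```python
import math
import random

def train_mlp(samples, labels, hidden=6, lr=0.5, epochs=2000, seed=0):
    """Tiny one-hidden-layer backprop classifier (sigmoid units, squared error).
    Returns a predict(x) function giving an output in (0, 1)."""
    rng = random.Random(seed)
    n_in = len(samples[0])
    W1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # forward pass
            h = [sig(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(W1, b1)]
            o = sig(sum(w * hi for w, hi in zip(W2, h)) + b2)
            # backward pass (gradient of squared error through sigmoids)
            do = (o - y) * o * (1 - o)
            dh = [do * W2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):
                W2[j] -= lr * do * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * dh[j] * x[i]
                b1[j] -= lr * dh[j]
            b2 -= lr * do

    def predict(x):
        h = [sig(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return sig(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return predict
```

A real WTI identifier would use more inputs and one output per candidate interpretation model; the training loop itself is unchanged.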
Prediction of resource volumes at untested locations using simple local prediction models
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
2006-01-01
This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
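The jackknife-replicate and bootstrap-resampling steps can be sketched as follows. This is illustrative only; the paper's full procedure also involves cross-validated model selection and spatial prediction, which are omitted here.

```python
import random
import statistics

def jackknife_totals(estimates):
    """Leave-one-out jackknife replicates of the total predicted volume."""
    n = len(estimates)
    total = sum(estimates)
    # each replicate rescales the remaining n-1 site estimates to a total
    return [(total - e) * n / (n - 1) for e in estimates]

def bootstrap_ci(replicates, n_boot=2000, alpha=0.10, seed=42):
    """Percentile bootstrap confidence bounds for the total, resampling the
    jackknife replicates with replacement."""
    rng = random.Random(seed)
    boots = sorted(
        statistics.mean(rng.choices(replicates, k=len(replicates)))
        for _ in range(n_boot)
    )
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

A convenient check: the mean of the jackknife replicates equals the original total, so the bootstrap interval is centered on it.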
A computer program for analyzing the energy consumption of automatically controlled lighting systems
NASA Astrophysics Data System (ADS)
1982-01-01
A computer code to predict the performance of controlled lighting systems with respect to their energy saving capabilities is presented. The computer program provides a mathematical model from which comparisons of control schemes can be made on an economic basis only. The program does not calculate daylighting, but uses daylighting values as input. The program can analyze any of three power input versus light output relationships, continuous dimming with a linear response, continuous dimming with a nonlinear response, or discrete stepped response. Any of these options can be used with or without daylighting, making six distinct modes of control system operation. These relationships are described in detail. The major components of the program are discussed and examples are included to explain how to run the program.
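The three power-versus-light-output relationships can be sketched as below. The abstract names the three response types but not their functional forms, so the linear, square-root and stepped curves here (and the 10% minimum-power floor) are assumptions for illustration.

```python
def required_fraction(setpoint_lux, daylight_lux):
    """Fraction of full electric light output still needed after the
    daylighting contribution (daylight values are inputs, as in the program)."""
    return max(0.0, min(1.0, (setpoint_lux - daylight_lux) / setpoint_lux))

def input_power(fraction, full_power_w, mode="linear",
                min_power_frac=0.1, steps=None):
    """Electric input power for a given light-output fraction under one of
    three control responses: continuous linear, continuous nonlinear
    (square-root assumed here), or discrete stepped (steps must include 1.0)."""
    if mode == "linear":
        return full_power_w * (min_power_frac + (1 - min_power_frac) * fraction)
    if mode == "nonlinear":
        return full_power_w * (min_power_frac + (1 - min_power_frac) * fraction ** 0.5)
    if mode == "stepped":
        level = min(s for s in steps if s >= fraction)  # next level up
        return full_power_w * level
    raise ValueError(mode)
```

Summing `input_power` over occupied hours, with and without the daylight credit, gives the kind of economic comparison between control schemes the program supports.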
Conservation Risks: When Will Rhinos be Extinct?
Haas, Timothy C; Ferreira, Sam M
2016-08-01
We develop a risk intelligence system for biodiversity enterprises. Such enterprises depend on a supply of endangered species for their revenue. Many of these enterprises, however, cannot purchase a supply of this resource and are largely unable to secure the resource against theft in the form of poaching. Because replacements are not available once a species becomes extinct, insurance products are not available to reduce the risk exposure of these enterprises to an extinction event. For many species, the dynamics of anthropogenic impacts driven by economic as well as noneconomic values of associated wildlife products along with their ecological stressors can help meaningfully predict extinction risks. We develop an agent/individual-based economic-ecological model that captures these effects and apply it to the case of South African rhinos. Our model uses observed rhino dynamics and poaching statistics. It seeks to predict rhino extinction under the present scenario. This scenario has no legal horn trade, but allows live African rhino trade and legal hunting. Present rhino populations are small and threatened by a rising onslaught of poaching. This present scenario and associated dynamics predicts continued decline in rhino population size with accelerated extinction risks of rhinos by 2036. Our model supports the computation of extinction risks at any future time point. This capability can be used to evaluate the effectiveness of proposed conservation strategies at reducing a species' extinction risk. Models used to compute risk predictions, however, need to be statistically estimated. We point out that statistically fitting such models to observations will involve massive numbers of observations on consumer behavior and time-stamped location observations on thousands of animals. Finally, we propose Big Data algorithms to perform such estimates and to interpret the fitted model's output.
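The core dynamic the abstract describes, a growing herd against a faster-growing poaching offtake, can be caricatured in a Monte-Carlo sketch. This is a toy, not the authors' agent-based economic-ecological model; every parameter value below is an illustrative assumption, not a calibrated one.

```python
import math
import random

def median_extinction_year(n0=20000, growth=0.07, poach0=1000,
                           poach_growth=0.12, n_sims=500, horizon=60, seed=7):
    """Toy simulation: population grows ~7%/yr (with Poisson-like noise
    approximated by a Gaussian) while annual poaching offtake grows ~12%/yr.
    Returns the median simulated year of extinction, or `horizon` if the
    population survives the whole window."""
    rng = random.Random(seed)
    years = []
    for _ in range(n_sims):
        n, poach = float(n0), float(poach0)
        year = horizon
        for t in range(1, horizon + 1):
            mean_births = n * growth
            births = max(0.0, rng.gauss(mean_births, math.sqrt(max(mean_births, 1.0))))
            n = n + births - poach
            poach *= 1.0 + poach_growth
            if n <= 0:
                year = t
                break
        years.append(year)
    years.sort()
    return years[n_sims // 2]
```

Even this caricature reproduces the qualitative prediction: the population first grows, then collapses once compounding offtake overtakes compounding recruitment.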
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lilienthal, P.
1997-12-01
This paper describes three different computer codes which have been written to model village power applications. The reasons which have driven the development of these codes include: the existence of limited field data; diverse applications can be modeled; models allow cost and performance comparisons; and simulations generate insights into cost structures. The models which are discussed are: Hybrid2, a public code which provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER, the hybrid optimization model for electric renewables, which provides economic screening for sensitivity analyses; and VIPOR, the village power model, which is a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.
The economics of bootstrapping space industries - Development of an analytic computer model
NASA Technical Reports Server (NTRS)
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost of off-Earth transport to a space industry and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
New approaches to investigating social gestures in autism spectrum disorder
2012-01-01
The combination of economic games and human neuroimaging presents the possibility of using economic probes to identify biomarkers for quantitative features of healthy and diseased cognition. These probes span a range of important cognitive functions, but one new use is in the domain of reciprocating social exchange with other humans - a capacity perturbed in a number of psychopathologies. We summarize the use of a reciprocating exchange game to elicit neural and behavioral signatures for subjects diagnosed with autism spectrum disorder (ASD). Furthermore, we outline early efforts to capture features of social exchange in computational models and use these to identify quantitative behavioral differences between subjects with ASD and matched controls. Lastly, we summarize a number of subsequent studies inspired by the modeling results, which suggest new neural and behavioral signatures that could be used to characterize subtle deficits in information processing during interactions with other humans. PMID:22958572
An economical semi-analytical orbit theory for micro-computer applications
NASA Technical Reports Server (NTRS)
Gordon, R. A.
1988-01-01
An economical algorithm is presented for predicting the position of a satellite perturbed by drag and zonal harmonics J2 through J4. Simplicity being of the essence, drag is modeled as a secular decay rate in the semimajor axis (retarded motion), with the zonal perturbations modeled from a modified version of Brouwer's formulas. The algorithm is developed as: an alternative on-board orbit predictor; a back-up propagator requiring low energy consumption; or a ground-based propagator for microcomputer applications (e.g., at the foot of an antenna). An O(J2) secular retarded state partial matrix (matrizant) is also given to employ with state estimation. The theory was implemented in BASIC on an inexpensive microcomputer, the program occupying under 8K bytes of memory. Simulated trajectory data and real tracking data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects.
An economical semi-analytical orbit theory for micro-computer applications
NASA Technical Reports Server (NTRS)
Gordon, R. A.
1986-01-01
An economical algorithm is presented for predicting the position of a satellite perturbed by drag and zonal harmonics J2 through J4. Simplicity being of the essence, drag is modeled as a secular decay rate in the semimajor axis (retarded motion) with the zonal perturbations modeled from a modified version of Brouwer's formulas. The algorithm is developed as an alternative on-board orbit predictor; a back-up propagator requiring low energy consumption; or a ground-based propagator for microcomputer applications (e.g., at the foot of an antenna). An O(J2) secular retarded state partial matrix (matrizant) is also given to employ with state estimation. The theory has been implemented in BASIC on an inexpensive microcomputer, the program occupying under 8K bytes of memory. Simulated trajectory data and real tracking data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects.
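The dominant secular terms such a propagator carries are the standard first-order J2 drift rates of the node and perigee. A sketch using the usual textbook expressions (constants and sign conventions are the conventional ones, not taken from the paper):

```python
import math

MU = 398600.4418   # km^3/s^2, Earth gravitational parameter
RE = 6378.137      # km, Earth equatorial radius
J2 = 1.08263e-3    # Earth oblateness coefficient

def j2_secular_rates(a_km, e, i_rad):
    """First-order secular drift rates (rad/s) of the ascending node and
    argument of perigee due to J2 alone."""
    n = math.sqrt(MU / a_km ** 3)      # mean motion
    p = a_km * (1.0 - e ** 2)          # semilatus rectum
    k = n * J2 * (RE / p) ** 2
    node_rate = -1.5 * k * math.cos(i_rad)
    perigee_rate = 0.75 * k * (5.0 * math.cos(i_rad) ** 2 - 1.0)
    return node_rate, perigee_rate
```

These rates, plus a secular decay rate in the semimajor axis for drag, are essentially all the physics a compact 8K-byte predictor of this kind needs to carry.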
Optimizing noise control strategy in a forging workshop.
Razavi, Hamideh; Ramazanifar, Ehsan; Bagherzadeh, Jalal
2014-01-01
In this paper, a computer program based on a genetic algorithm is developed to find an economic solution for noise control in a forging workshop. Initially, input data, including characteristics of sound sources, human exposure, abatement techniques, and production plans are inserted into the model. Using sound pressure levels at working locations, the operators who are at higher risk are identified and picked out for the next step. The program is devised in MATLAB such that the parameters can be easily defined and changed for comparison. The final results are structured into 4 sections that specify an appropriate abatement method for each operator and machine, minimum allowance time for high-risk operators, required damping material for enclosures, and minimum total cost of these treatments. The validity of input data in addition to proper settings in the optimization model ensures the final solution is practical and economically reasonable.
Some Results Bearing on the Value of Improvements of Membranes for Reverse Osmosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamont, A
2006-03-08
This analysis evaluates the potential economic benefits that could result from the improvements in the permeability of membranes for reverse osmosis. The discussion provides a simple model of the operation of a reverse osmosis plant. It examines the change in the operation that might result from improvements in the membrane and computes the cost of water as a function of the membrane permeability.
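The shape of that calculation can be sketched with a deliberately simple unit-cost model: flux rises linearly with permeability, so the membrane area (and amortised membrane cost) needed per cubic metre of product falls as 1/permeability, while pumping energy stays fixed. The functional form and every number below are assumptions, not the report's values.

```python
def water_cost(permeability_lmh_bar, dp_bar=55.0, osm_bar=27.0,
               membrane_price=30.0, life_years=5.0,
               energy_kwh_per_m3=3.5, elec_price=0.08):
    """Illustrative unit water cost ($/m^3) for seawater reverse osmosis:
    amortised membrane replacement cost plus pumping energy.
    permeability in L/(m^2.h.bar); driving pressure minus osmotic pressure
    gives the net pressure across the membrane."""
    flux_lmh = permeability_lmh_bar * (dp_bar - osm_bar)        # L/m^2/h
    m3_per_m2_life = flux_lmh / 1000.0 * 24 * 365 * life_years  # lifetime product per m^2
    membrane_cost = membrane_price / m3_per_m2_life
    energy_cost = energy_kwh_per_m3 * elec_price
    return membrane_cost + energy_cost
```

The diminishing-returns conclusion falls out directly: doubling permeability halves only the membrane component, so the benefit shrinks as energy and other fixed costs come to dominate.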
Johnson, T L; Keith, D W
2001-10-01
The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.
Smith, Richard D; Keogh-Brown, Marcus R
2013-11-01
Previous research has demonstrated the value of macroeconomic analysis of the impact of influenza pandemics. However, previous modelling applications focus on high-income countries and there is a lack of evidence concerning the potential impact of an influenza pandemic on lower- and middle-income countries. The aim of this study was to estimate the macroeconomic impact of pandemic influenza in Thailand, South Africa and Uganda, with particular reference to pandemic (H1N1) 2009. A single-country whole-economy computable general equilibrium (CGE) model was set up for each of the three countries in question and used to estimate the economic impact of declines in labour attributable to morbidity, mortality and school closure. Overall GDP impacts were less than 1% of GDP for all countries and scenarios. Uganda's losses were proportionally larger than those of Thailand and South Africa. Labour-intensive sectors suffer the largest losses. The economic cost of unavoidable absence in the event of an influenza pandemic could be proportionally larger for low-income countries. The cost of mild pandemics, such as pandemic (H1N1) 2009, appears to be small, but could increase for more severe pandemics and/or pandemics with greater behavioural change and avoidable absence. © 2013 John Wiley & Sons Ltd.
Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.
Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe
2016-08-01
Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used for the modelling of the process scale-up and the different configurations of lipid extraction to optimise this step, as it is the most expensive. The operational variables with a major influence in the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even price of biodiesel of 1232 $/t, being economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.
Vercelli, Marina; Lillini, Roberto; Capocaccia, Riccardo; Quaglia, Alberto
2012-12-01
The main aim of this work is to compute expected cancer survival for Italian provinces by Socio-Economic and health Resources and Technologic Supplies (SERTS) models, based on demographic and socioeconomic variables and information describing the health care system (SEH). Five-year age-standardised relative survival rates by gender for 11 cancer sites and all cancers combined, for patients diagnosed in 1995-1999, were obtained from the Italian Association of Cancer Registries (CRs) database. The SEH variables describe macro-economy, demography, the labour market and health resources at the provincial level in 1995-2005. A principal components factor analysis was applied to the SEH variables to control their strong mutual correlation. For every considered cancer site, linear regression models were estimated with the five-year relative survival as the dependent variable and the principal component factors of the SEH variables as independent variables. The model composition was correlated with the characteristics of patient care management. SEH factors were correlated with the observed survival for all cancers combined and colon-rectum in both sexes, prostate, kidney and non-Hodgkin lymphomas in men, and breast, corpus uteri and melanoma in women (R² from 40% to 85%). In the provinces without any CR, the expected survival was very similar to that of neighbouring provinces with analogous social, economic and health characteristics. The SERTS models allowed us to interpret the survival outcomes of oncologic patients with respect to the role of socio-economic and health-system characteristics, stressing how the peculiarities of patient management at the province level could inform decisions regarding the allocation of resources. Copyright © 2012 Elsevier Ltd. All rights reserved.
Geris, L.; Guyot, Y.; Schrooten, J.; Papantoniou, I.
2016-01-01
The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design. PMID:27051516
A Method For Assessing Economic Thresholds of Hardwood Competition
Steven A. Knowe
2002-01-01
A procedure was developed for computing economic thresholds for hardwood competition in pine plantations. The economic threshold represents the break-even level of competition above which hardwood control is a financially attractive treatment. Sensitivity analyses were conducted to examine the relative importance of biological and economic factors in determining...
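The break-even logic can be reduced to compounding the up-front treatment cost to harvest age and asking what competition level causes at least that much avoided loss. The paper's actual procedure rests on growth-and-yield projections, so the linear loss-per-unit-competition assumption below is purely illustrative.

```python
def breakeven_yield_gain(treatment_cost, rate, years):
    """Harvest-value gain needed for a hardwood-control treatment to break
    even: the treatment cost compounded at the discount rate to harvest age."""
    return treatment_cost * (1.0 + rate) ** years

def economic_threshold(loss_per_unit_competition, treatment_cost, rate, years):
    """Competition level above which control pays: where avoided harvest loss
    (assumed linear in competition level) equals the compounded cost."""
    return breakeven_yield_gain(treatment_cost, rate, years) / loss_per_unit_competition
```

Sensitivity analysis in this framing is direct: raising the discount rate or lengthening the rotation raises the threshold, while a higher loss per unit of competition lowers it.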
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
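The "reduced form" step, fitting a single regression to synthetic model output, can be sketched as follows. The explanatory variables (a threat "magnitude" and "duration") and coefficients below are invented for illustration; E-CAT's actual variable set is not reproduced here.

```python
import random

def ols(x_rows, y):
    """Ordinary least squares via normal equations, dependency-free.
    x_rows: list of feature lists (a leading 1.0 gives an intercept)."""
    k = len(x_rows[0])
    xtx = [[sum(r[i] * r[j] for r in x_rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(x_rows, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in reversed(range(k)):
        beta[r] = (xty[r] - sum(xtx[r][j] * beta[j]
                                for j in range(r + 1, k))) / xtx[r][r]
    return beta

# Synthetic "CGE runs": GDP loss as a function of threat magnitude and duration.
rng = random.Random(0)
runs = [[1.0, rng.uniform(0, 10), rng.uniform(1, 52)] for _ in range(200)]
y = [0.5 + 0.8 * m + 0.05 * d + rng.gauss(0, 0.1) for _, m, d in runs]
beta = ols(runs, y)   # should recover roughly [0.5, 0.8, 0.05]
```

Once estimated, the regression stands in for the CGE model: a user supplies threat characteristics and gets an approximate consequence estimate without ever running the complex model.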
ERIC Educational Resources Information Center
Birken, Marvin N.
1967-01-01
Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economic, physical, and esthetic characteristics and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…
Flat-plate solar array project. Volume 8: Project analysis and integration
NASA Technical Reports Server (NTRS)
Mcguire, P.; Henry, P.
1986-01-01
Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relationship to project direction are discussed, and important analytical models developed by PA&I for its analytical and assessment activities are reviewed.
A review on economic emission dispatch problems using quantum computational intelligence
NASA Astrophysics Data System (ADS)
Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.
2016-11-01
Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, limited natural resources and global warming have placed this topic at the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques such as the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper should encourage researchers to use more QCI-based algorithms to obtain better optimal results for EED problems.
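A minimal classical particle swarm baseline for a toy EED problem is sketched below; the three-unit cost and emission coefficients are invented for illustration, and the quantum variants (QGA, QPSO) reviewed in the paper differ mainly in their position-update rules:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-unit test system: fuel-cost and emission curves of the
# form a + b*P + c*P^2 (coefficients invented for illustration).
cost_c = np.array([[10.0, 2.0, 0.010], [15.0, 1.8, 0.012], [20.0, 2.2, 0.008]])
emis_c = np.array([[5.0, 0.5, 0.004], [4.0, 0.6, 0.005], [6.0, 0.4, 0.003]])
p_min, p_max, demand = 10.0, 100.0, 210.0  # MW

def objective(P, w=0.5):
    """Weighted cost/emission sum with a power-balance penalty term."""
    cost = sum(a + b * p + c * p * p for (a, b, c), p in zip(cost_c, P))
    emis = sum(a + b * p + c * p * p for (a, b, c), p in zip(emis_c, P))
    return w * cost + (1.0 - w) * emis + 1e4 * abs(P.sum() - demand)

# Plain PSO loop: inertia 0.7, cognitive/social weights 1.5.
n_particles, iters = 40, 300
pos = rng.uniform(p_min, p_max, (n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([objective(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, p_min, p_max)
    f = np.array([objective(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

dispatch = gbest  # generator set-points; total should balance demand closely
```

The penalty formulation keeps the sketch self-contained; production EED solvers typically handle the power-balance constraint explicitly.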
Guide to the economic analysis of community energy systems
NASA Astrophysics Data System (ADS)
Pferdehirt, W. P.; Croke, K. G.; Hurter, A. P.; Kennedy, A. S.; Lee, C.
1981-08-01
This guidebook provides a framework for the economic analysis of community energy systems. The analysis facilitates a comparison of competing configurations in community energy systems, as well as a comparison with conventional energy systems. Various components of costs and revenues to be considered are discussed in detail. Computational procedures and accompanying worksheets are provided for calculating the net present value, straight and discounted payback periods, the rate of return, and the savings to investment ratio for the proposed energy system alternatives. These computations are based on a projection of the system's costs and revenues over its economic lifetimes. The guidebook also discusses the sensitivity of the results of this economic analysis to changes in various parameters and assumptions.
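The guidebook's figures of merit can be illustrated with a small worked example; the retrofit numbers (investment, annual savings, lifetime, discount rate) are hypothetical:

```python
# Hypothetical community-energy retrofit: 100 k$ investment,
# 18 k$/yr net savings, 10-year economic lifetime, 7% discount rate.
rate, invest, saving, life = 0.07, 100_000.0, 18_000.0, 10

# Net present value: discounted savings minus the up-front investment.
cash = [-invest] + [saving] * life
npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash))

# Straight (simple) payback: years to recover the investment, undiscounted.
simple_payback = invest / saving

# Discounted payback: first year the cumulative discounted savings turn positive.
cum, discounted_payback = -invest, None
for t in range(1, life + 1):
    cum += saving / (1 + rate) ** t
    if cum >= 0 and discounted_payback is None:
        discounted_payback = t

# Savings-to-investment ratio: PV of savings over PV of costs.
sir = sum(saving / (1 + rate) ** t for t in range(1, life + 1)) / invest
```

For these inputs the NPV is about $26,400, simple payback about 5.6 years, discounted payback 8 years, and SIR about 1.26; the guidebook's worksheets walk through the same arithmetic by hand. (The rate of return would be found by solving for the discount rate at which NPV is zero.)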
NASA Astrophysics Data System (ADS)
Neradilová, Hana; Fedorko, Gabriel
2016-12-01
Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore or underestimate this aspect, which is a mistake. One reason the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining the data needed for a full-scale economic analysis.
Economic assessment of photovoltaic/battery systems
NASA Astrophysics Data System (ADS)
Day, J. T.; Hayes, T. P.; Hobbs, W. J.
1981-02-01
The economics of residential PV/battery systems were assessed from the utility perspective, using detailed computer simulation to determine marginal costs. Brief consideration is also given to the economics of customer ownership, utility distribution system impact, and the implications of PURPA.
NASA Astrophysics Data System (ADS)
Sarofim, M. C.
2007-12-01
Emissions of greenhouse gases and conventional pollutants are closely linked through shared generation processes; thus policies directed toward long-lived greenhouse gases affect emissions of conventional pollutants and, similarly, policies directed toward conventional pollutants affect emissions of greenhouse gases. Some conventional pollutants such as aerosols also have direct radiative effects. NOx and VOCs are precursors of ozone, another substance with both radiative and health impacts, and these ozone precursors also interact with the chemistry of the hydroxyl radical, which is the major methane sink. Realistic scenarios of future emissions and concentrations must therefore account for both air pollution and greenhouse gas policies and how they interact economically as well as atmospherically, including the regional pattern of emissions and regulation. We have modified a 16-region computable general equilibrium economic model (the MIT Emissions Prediction and Policy Analysis model) by including elasticities of substitution for ozone precursors and aerosols in order to examine these interactions between climate policy and air pollution policy on a global scale. Urban emissions are distributed based on population density and aged using a reduced-form urban model before release into an atmospheric chemistry/climate model (the earth systems component of the MIT Integrated Global Systems Model). This integrated approach enables examination of the direct impacts of air pollution on climate, the ancillary and complementary interactions between air pollution and climate policies, and the impact of different population distribution algorithms or urban emission aging schemes on global-scale properties. This modeling exercise shows that while ozone levels are reduced due to NOx and VOC reductions, these reductions lead to an increase in methane concentrations that eliminates the temperature effects of the ozone reductions.
However, black carbon reductions do have significant direct effects on global mean temperatures, as do ancillary reductions of greenhouse gases due to the pollution constraints imposed in the economic model. Finally, we show that the economic benefits of coordinating air pollution and climate policies rather than separate implementation are on the order of 20% of the total policy cost.
Sedentary behaviours and socio-economic status in Spanish adolescents: the AVENA study.
Rey-López, Juan P; Tomas, Concepción; Vicente-Rodriguez, German; Gracia-Marco, Luis; Jiménez-Pavón, David; Pérez-Llamas, Francisca; Redondo, Carlos; Bourdeaudhuij, Ilse De; Sjöström, Michael; Marcos, Ascensión; Chillón, Palma; Moreno, Luis A
2011-04-01
This study aimed to describe the influence of socio-economic status (SES) on the prevalence of sedentary behaviours among Spanish adolescents. Cross-sectional data were drawn from Spanish adolescents in the Alimentación y Valoración del Estado Nutricional de los Adolescentes (AVENA) Study (2002). A national representative sample of 1776 adolescents aged 13-18.5 years provided information about time spent watching television (TV), playing computer or video games and studying. Parental education and occupation were assessed as SES indicators. Participants were categorized by gender, age, parental education and occupation. Logistic regression models were used. No gender differences were found for TV viewing. For computer and video game use (weekdays), more boys played >3 h/day (P < 0.001), whereas a higher percentage of girls reported studying >3 h/day (P < 0.001). Among boys, parental education and occupation were inversely associated with TV viewing, parental occupation was directly associated with study time, and maternal education was inversely associated with computer and video game use during weekdays (all P < 0.05). For girls, parental occupation was inversely associated with TV viewing. Spanish adolescents presented different sedentary patterns according to age, gender and SES. Boys reported more time engaged in electronic games, whereas girls reported more time studying. Parental occupation had more influence than parental education on the time spent in sedentary behaviours.
Quick-start guide for version 3.0 of EMINERS - Economic Mineral Resource Simulator
Bawiec, Walter J.; Spanski, Gregory T.
2012-01-01
Quantitative mineral resource assessment, as developed by the U.S. Geological Survey (USGS), consists of three parts: (1) development of grade and tonnage mineral deposit models; (2) delineation of tracts permissive for each deposit type; and (3) probabilistic estimation of the numbers of undiscovered deposits for each deposit type (Singer and Menzie, 2010). The estimate of the number of undiscovered deposits at different levels of probability is the input to the EMINERS (Economic Mineral Resource Simulator) program. EMINERS uses a Monte Carlo statistical process to combine probabilistic estimates of undiscovered mineral deposits with models of mineral deposit grade and tonnage to estimate mineral resources. It is based upon a simulation program developed by Root and others (1992), who discussed many of the methods and algorithms of the program. Various versions of the original program (called "MARK3" and developed by David H. Root, William A. Scott, and Lawrence J. Drew of the USGS) have been published (Root, Scott, and Selner, 1996; Duval, 2000, 2012). The current version (3.0) of the EMINERS program is available as USGS Open-File Report 2004-1344 (Duval, 2012). Changes from version 2.0 include updating 87 grade and tonnage models, designing new templates to produce graphs showing cumulative distribution and summary tables, and disabling economic filters. The economic filters were disabled because embedded data for costs of labor and materials, mining techniques, and beneficiation methods are out of date. However, the cost algorithms used in the disabled economic filters are still in the program and available for reference for mining methods and milling techniques included in Camm (1991). EMINERS is written in C++ and depends upon the Microsoft Visual C++ 6.0 programming environment. The code depends heavily on the use of Microsoft Foundation Classes (MFC) for implementation of the Windows interface. 
The program works only on Microsoft Windows XP or newer personal computers. It does not work on Macintosh computers. This report demonstrates how to execute EMINERS software using default settings and existing deposit models. Many options are available when setting up the simulation. Information and explanations addressing these optional parameters can be found in the EMINERS Help files. Help files are available during execution of EMINERS by selecting EMINERS Help from the pull-down menu under Help on the EMINERS menu bar. There are four sections in this report. Part I describes the installation, setup, and application of the EMINERS program, and Part II illustrates how to interpret the text file that is produced. Part III describes the creation of tables and graphs by use of the provided Excel templates. Part IV summarizes grade and tonnage models used in version 3.0 of EMINERS.
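The core Monte Carlo step EMINERS performs (combining a probabilistic deposit-count estimate with grade-tonnage models) can be sketched as follows; the count probabilities and lognormal parameters are invented for illustration, not USGS model values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assessor's probabilistic estimate of undiscovered deposits in one tract,
# reduced here to a discrete probability mass function (hypothetical).
counts = np.array([0, 1, 2, 3, 5])
probs = np.array([0.20, 0.30, 0.25, 0.15, 0.10])

def simulate_total_metal(trials=20_000):
    """Monte Carlo combination of deposit counts with grade-tonnage draws."""
    totals = np.empty(trials)
    for i in range(trials):
        n = rng.choice(counts, p=probs)
        tonnage = rng.lognormal(mean=4.0, sigma=1.0, size=n)   # Mt of ore
        grade = rng.lognormal(mean=-0.7, sigma=0.4, size=n)    # % metal
        totals[i] = np.sum(tonnage * grade / 100.0)            # Mt of metal
    return totals

totals = simulate_total_metal()
# Assessment convention: P90 is the amount exceeded with 90% probability,
# i.e. the 10th percentile of the simulated distribution.
p90, p50, p10 = np.percentile(totals, [10, 50, 90])
```

Because 20% of trials draw zero deposits, the P90 estimate is zero here, which is typical of the heavily right-skewed distributions these assessments produce.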
Parallel computing in enterprise modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.
2008-08-01
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
ERIC Educational Resources Information Center
Groff, Warren H.
As our society evolves from an industrial society to a computer literate, high technology, information society, educational planners must reexamine the role of postsecondary education in economic development and in intellectual capital formation. In response to this need, a task force on high technology was established to examine the following…
The genetic and economic effect of preliminary culling in the seedling orchard
Don E. Riemenschneider
1977-01-01
The genetic and economic effects of two stages of truncation selection in a white spruce seedling orchard were investigated by computer simulation. Genetic effects were computed by assuming a bivariate distribution of juvenile and mature traits and volume was used as the selection criterion. Seed production was assumed to rise in a linear fashion to maturity and then...
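The two-stage truncation-selection simulation the abstract describes can be sketched as follows, assuming a bivariate normal distribution of juvenile and mature traits (the correlation and selection fractions here are hypothetical, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(3)

# Juvenile and mature volume as standardized bivariate normal traits
# with genetic correlation r (value invented for illustration).
r, n = 0.6, 100_000
cov = [[1.0, r], [r, 1.0]]
juvenile, mature = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Stage 1, preliminary culling: keep the top 50% on the juvenile trait.
stage1 = juvenile > np.quantile(juvenile, 0.5)
m1 = mature[stage1]

# Stage 2, final roguing: keep the top 20% of survivors on the mature trait
# (overall retained fraction 10%, matching the one-stage comparison below).
stage2 = m1 > np.quantile(m1, 0.8)

# Genetic gain = mean standardized mature volume of the selected trees.
gain_one_stage = mature[mature > np.quantile(mature, 0.9)].mean()
gain_two_stage = m1[stage2].mean()
```

At equal overall intensity, two-stage selection gives slightly less gain than selecting directly on the mature trait, but the early cull frees orchard space years sooner, which is where the economic effect enters.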
Eastern Renewable Generation Integration Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bloom, Aaron; Townsend, Aaron; Palchak, David
2016-08-01
The Eastern Interconnection (EI) is one of the largest power systems in the world, and its size and complexity have historically made it difficult to study in high levels of detail in a modeling environment. In order to understand how this system might be impacted by high penetrations (30% of total annual generation) of wind and solar photovoltaic (PV) during steady state operations, the National Renewable Energy Laboratory (NREL) and the U.S. Department of Energy (DOE) conducted the Eastern Renewable Generation Integration Study (ERGIS). This study investigates certain aspects of the reliability and economic efficiency problem faced by power system operators and planners. Specifically, the study models the ability to meet electricity demand at a 5-minute time interval by scheduling resources for known ramping events, while maintaining adequate reserves to meet random variation in supply and demand, and contingency events. To measure the ability to meet these requirements, a unit commitment and economic dispatch (UC&ED) model is employed to simulate power system operations. The economic costs of managing this system are presented using production costs, a traditional UC&ED metric that does not include any consideration of long-term fixed costs. ERGIS simulated one year of power system operations to understand regional and sub-hourly impacts of wind and PV by developing a comprehensive UC&ED model of the EI. In the analysis, it is shown that, under the study assumptions, generation from approximately 400 GW of combined wind and PV capacity can be balanced on the transmission system at a 5-minute level. In order to address the significant computational burdens associated with a model of this detail we apply novel computing techniques to dramatically reduce simulation solve time while simultaneously increasing the resolution and fidelity of the analysis.
Our results also indicate that high penetrations of wind and PV (collectively, variable generation (VG)) significantly impact the operation of traditional generating resources, causing these resources to be used less frequently and to operate across a broader output range, because wind and PV have lower operating costs and variable output levels.
Ford, Patrick; Santos, Eduardo; Ferrão, Paulo; Margarido, Fernanda; Van Vliet, Krystyn J; Olivetti, Elsa
2016-05-03
The challenges brought on by the increasing complexity of electronic products, and the criticality of the materials these devices contain, present an opportunity for maximizing the economic and societal benefits derived from recovery and recycling. Small appliances and computer devices (SACD), including mobile phones, contain significant amounts of precious metals including gold and platinum, the present value of which should serve as a key economic driver for many recycling decisions. However, a detailed analysis is required to estimate the economic value that is unrealized by incomplete recovery of these and other materials, and to ascertain how such value could be reinvested to improve recovery processes. We present a dynamic product flow analysis for SACD throughout Portugal, a European Union member, including annual data detailing product sales and industrial-scale preprocessing data for recovery of specific materials from devices. We employ preprocessing facility and metals pricing data to identify losses, and develop an economic framework around the value of recycling including uncertainty. We show that significant economic losses occur during preprocessing (over $70 M USD unrecovered in computers and mobile phones, 2006-2014) due to operations that fail to target high value materials, and characterize preprocessing operations according to material recovery and total costs.
Computational social network modeling of terrorist recruitment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Nina M.; Turnley, Jessica Glicken; Smrcka, Julianne D.
2004-10-01
The Seldon terrorist model represents a multi-disciplinary approach to developing organization software for the study of terrorist recruitment and group formation. The need to incorporate aspects of social science added a significant contribution to the vision of the resulting Seldon toolkit. The unique addition of an abstract agent category provided a means for capturing social concepts like cliques, mosques, etc. in a manner that represents their social conceptualization and not simply as physical or economic institutions. This paper provides an overview of the Seldon terrorist model developed to study the formation of cliques, which are used as the major recruitment entity for terrorist organizations.
Multiscaling Edge Effects in an Agent-based Money Emergence Model
NASA Astrophysics Data System (ADS)
Oświęcimka, P.; Drożdż, S.; Gębarowski, R.; Górski, A. Z.; Kwapień, J.
An agent-based computational economic toy model for the emergence of money from initial barter trading, inspired by Menger's postulate that money can spontaneously emerge in a commodity exchange economy, is extensively studied. The model, while manageable, is nevertheless significantly complex. It is already able to reveal phenomena that can be interpreted as the emergence and collapse of money, as well as the related competition effects. In particular, it is shown that - as an extra emerging effect - the money lifetimes near the critical threshold value develop multiscaling, which allows one to draw parallels to critical phenomena and, thus, to real financial markets.
A Novel Approach to Develop the Lower Order Model of Multi-Input Multi-Output System
NASA Astrophysics Data System (ADS)
Rajalakshmy, P.; Dharmalingam, S.; Jayakumar, J.
2017-10-01
A mathematical model is a virtual entity that uses mathematical language to describe the behavior of a system. Mathematical models are used particularly in the natural sciences and engineering disciplines like physics, biology, and electrical engineering, as well as in the social sciences like economics, sociology and political science. Physicists, engineers, computer scientists, and economists use mathematical models most extensively. With the advent of high performance processors and advanced mathematical computations, it is possible to develop high performing simulators for complicated Multi-Input Multi-Output (MIMO) systems like quadruple-tank systems, aircraft, boilers etc. This paper presents the development of the mathematical model of a 500 MW utility boiler which is a highly complex system. A synergistic combination of operational experience, system identification and lower order modeling philosophy has been effectively used to develop a simplified but accurate model of a circulation system of a utility boiler which is a MIMO system. The results obtained are found to be in good agreement with the physics of the process and with the results obtained through design procedure. The model obtained can be directly used for control system studies and to realize hardware simulators for boiler testing and operator training.
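The lower-order-modeling idea (fitting a simple model to input/output data from a complex plant) can be illustrated with a least-squares ARX sketch; the "plant" here is a hypothetical noisy second-order simulation, not the 500 MW boiler:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical complex plant: a stable second-order system with small noise.
u = rng.standard_normal(2000)
y = np.zeros(2000)
for k in range(2, 2000):
    y[k] = 1.30 * y[k - 1] - 0.40 * y[k - 2] + 0.10 * u[k - 1] \
           + 0.002 * rng.standard_normal()

# Reduced-order identification: fit a first-order ARX model
# y[k] = a*y[k-1] + b*u[k-1] by least squares.
Phi = np.column_stack([y[1:-1], u[1:-1]])
a, b = np.linalg.lstsq(Phi, y[2:], rcond=None)[0]

# One-step-ahead fit (coefficient of determination) of the reduced model.
y_hat = a * y[1:-1] + b * u[1:-1]
fit = 1.0 - np.sum((y[2:] - y_hat) ** 2) / np.sum((y[2:] - y[2:].mean()) ** 2)
```

Even though the reduced model drops one state, it captures most of the one-step-ahead behavior, which is the sense in which a "simplified but accurate" model can stand in for the full plant in control studies.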
Beuter, Anne
2017-05-01
Recent publications call for more animal models to be used and more experiments to be performed, in order to better understand the mechanisms of neurodegenerative disorders, to improve human health, and to develop new brain stimulation treatments. In response to these calls, some limitations of the current animal models are examined by using Deep Brain Stimulation (DBS) in Parkinson's disease as an illustrative example. Without focusing on the arguments for or against animal experimentation, or on the history of DBS, the present paper argues that given recent technological and theoretical advances, the time has come to consider bioinspired computational modelling as a valid alternative to animal models, in order to design the next generation of human brain stimulation treatments. However, before computational neuroscience is fully integrated in the translational process and used as a substitute for animal models, several obstacles need to be overcome. These obstacles are examined in the context of institutional, financial, technological and behavioural lock-in. Recommendations include encouraging agreement to change long-term habitual practices, explaining what alternative models can achieve, considering economic stakes, simplifying administrative and regulatory constraints, and carefully examining possible conflicts of interest. © 2017 FRAME.
Microcomputers in Vocational Home Economics Classrooms in USD #512.
ERIC Educational Resources Information Center
Shawnee Mission Public Schools, KS.
A project was conducted to identify software suitable for use in home economics classes and to train home economics teachers to use that software with an Apple II Plus microcomputer. During the project, home economics software was identified, evaluated, and catalogued. Teaching strategies were adapted to include using the computer in the…
NASA Astrophysics Data System (ADS)
Kotliar, Gabriel
2005-01-01
Dynamical mean field theory (DMFT) relates extended systems (bulk solids, surfaces and interfaces) to quantum impurity models (QIM) satisfying a self-consistency condition. This mapping provides an economic description of correlated electron materials. It is currently used in practical computations of physical properties of real materials. It has also great conceptual value, providing a simple picture of correlated electron phenomena on the lattice, using concepts derived from quantum impurity models such as the Kondo effect. DMFT can also be formulated as a first principles electronic structure method and is applicable to correlated materials.
Confidence Sharing: An Economic Strategy for Efficient Information Flows in Animal Groups
Korman, Amos; Greenwald, Efrat; Feinerman, Ofer
2014-01-01
Social animals may share information to obtain a more complete and accurate picture of their surroundings. However, physical constraints on communication limit the flow of information between interacting individuals in a way that can cause an accumulation of errors and deteriorated collective behaviors. Here, we theoretically study a general model of information sharing within animal groups. We take an algorithmic perspective to identify efficient communication schemes that are, nevertheless, economic in terms of communication, memory and individual internal computation. We present a simple and natural algorithm in which each agent compresses all information it has gathered into a single parameter that represents its confidence in its behavior. Confidence is communicated between agents by means of active signaling. We motivate this model by novel and existing empirical evidence for confidence sharing in animal groups. We rigorously show that this algorithm competes extremely well with the best possible algorithm that operates without any computational constraints. We also show that this algorithm is minimal, in the sense that further reduction in communication may significantly reduce performance. Our proofs rely on the Cramér-Rao bound and on our definition of a Fisher Channel Capacity. We use these concepts to quantify information flows within the group, which are then used to obtain lower bounds on collective performance. The abstract nature of our model makes it rigorously solvable and its conclusions highly general. Indeed, our results suggest confidence sharing as a central notion in the context of animal communication. PMID:25275649
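The inverse-variance intuition behind confidence sharing can be sketched as follows; this is an illustrative fusion example under Gaussian-noise assumptions, not the paper's algorithm verbatim:

```python
import numpy as np

rng = np.random.default_rng(5)

# Each agent holds a noisy estimate of an environmental quantity
# (e.g. direction to a food source) plus one scalar "confidence",
# modeled here as the inverse variance of its own observation noise.
theta = 3.0                                  # true quantity
sigmas = np.array([0.5, 1.0, 2.0, 4.0])      # per-agent noise levels
estimates = theta + sigmas * rng.standard_normal(4)
confidences = 1.0 / sigmas**2

# Fusing every estimate weighted by its confidence yields the
# inverse-variance (Cramér-Rao efficient) combined estimate; pairwise
# interactions that carry estimate + confidence reach the same result.
fused = np.sum(confidences * estimates) / np.sum(confidences)
fused_var = 1.0 / np.sum(confidences)        # never worse than the best agent
```

Compressing everything an agent knows into this single confidence scalar is the sense in which the scheme is "economic": it needs constant memory and one number per signal, yet loses essentially nothing relative to unconstrained fusion.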
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
The economics of optimal health and productivity in the commercial dairy.
Galligan, D T
1999-08-01
Dairy production practices are changing; in order to remain viable, producers must optimise the health and productivity of dairy herds in economic terms. Health care is important in economic terms because disease can substantially reduce the productivity of individual animals. Preventive disease control programmes can thus result in economic gains for the dairy producer. The author describes new approaches to preventing postpartum diseases and dealing with fertility problems which can result from these diseases. Other aspects of dairy production are also changing, employing new technologies where these are judged to be profitable. Innovations include: the use of bovine somatotropin; systematic breeding/culling programmes; new mathematical modelling techniques to determine optimum feed composition and to define optimal growth levels for accelerated heifer-rearing programmes; the use of computers to collect, store and analyse data on animal production and health; and semen selection programmes. Increasing awareness of bio-security is also vital, not least because of the large investment present in dairy herds. Whatever practices are employed, they must offer economic returns to producers that compete with alternative uses of capital. Optimal levels of disease control must be determined for a particular production situation, taking into account not only the economic health of the producer, but also the well-being of the animals.
Borders as membranes: metaphors and models for improved policy in border regions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malczynski, Leonard A.; Passell, Howard David; Forster, Craig B.
Political borders are controversial and contested spaces. In an attempt to better understand movement along and through political borders, this project applied the metaphor of a membrane to look at how people, ideas, and things 'move' through a border. More specifically, the research team employed this metaphor in a system dynamics framework to construct a computer model to assess legal and illegal migration on the US-Mexico border. Employing a metaphor can be helpful, as it was in this project, to gain different perspectives on a complex system. In addition to the metaphor, the multidisciplinary team utilized an array of methods to gather data, including traditional literature searches, an experts workshop, a focus group, interviews, and culling expertise from the individuals on the research team. Results from the qualitative efforts revealed strong social as well as economic drivers that motivate individuals to cross the border legally. Based on the information gathered, the team concluded that legal migration dynamics were of a scope we did not want to consider; hence, available demographic models sufficiently capture migration at the local level. Results from both the quantitative and qualitative data searches were used to modify a 1977 border model to demonstrate the dynamic nature of illegal migration. Model runs reveal that current U.S. policies based on neo-classical economic theory have proven ineffective in curbing illegal migration, and that proposed enforcement policies are also likely to be ineffective. We suggest, based on model results, that improvement in economic conditions within Mexico may have the biggest impact on illegal migration to the U.S. The modeling also supports the views expressed in the current literature suggesting that demographic and economic changes within Mexico are likely to slow illegal migration by 2060 with no special interventions made by either government.
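A stock-and-flow sketch in the spirit of the system dynamics model can illustrate the report's headline conclusion, that wage convergence dampens migration more than enforcement does; every parameter below is invented for illustration:

```python
# Minimal stock-and-flow sketch: the migrant stock accumulates a yearly
# flow driven by the Mexico/US wage ratio (economic driver) and damped
# only weakly by enforcement (hypothetical functional forms throughout).
def simulate(years=50, wage_ratio=0.2, wage_growth=0.02, enforcement=0.3):
    migrants = 0.0                                  # cumulative stock
    flows = []
    for _ in range(years):
        pressure = max(0.0, 1.0 - wage_ratio)       # economic driver
        flow = 0.5 * pressure * (1.0 - 0.2 * enforcement)
        migrants += flow
        wage_ratio = min(1.0, wage_ratio * (1.0 + wage_growth))
        flows.append(flow)
    return migrants, flows

base, base_flows = simulate()
converging, conv_flows = simulate(wage_growth=0.05)  # faster wage convergence
```

In this toy, raising wage growth from 2% to 5% shrinks cumulative migration far more than plausible enforcement changes would, mirroring the report's qualitative finding.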
Assessing the prospective resource base for enhanced geothermal systems in Europe
NASA Astrophysics Data System (ADS)
Limberger, J.; Calcagno, P.; Manzella, A.; Trumpy, E.; Boxem, T.; Pluymaekers, M. P. D.; van Wees, J.-D.
2014-12-01
In this study the resource base for EGS (enhanced geothermal systems) in Europe was quantified and economically constrained, applying a discounted cash-flow model to different techno-economic scenarios for future EGS in 2020, 2030, and 2050. Temperature is a critical parameter that controls the amount of thermal energy available in the subsurface. Therefore, the first step in assessing the European resource base for EGS is the construction of a subsurface temperature model of onshore Europe. Subsurface temperatures were computed to a depth of 10 km below ground level for a regular 3-D hexahedral grid with a horizontal resolution of 10 km and a vertical resolution of 250 m. Vertical conductive heat transport was considered as the main heat transfer mechanism. Surface temperature and basal heat flow were used as boundary conditions for the top and bottom of the model, respectively. Where publicly available, the most recent and comprehensive regional temperature models, based on data from wells, were incorporated. With the modeled subsurface temperatures and future technical and economic scenarios, the technical potential and minimum levelized cost of energy (LCOE) were calculated for each grid cell of the temperature model. Calculations for a typical EGS scenario yield costs of EUR 215 MWh-1 in 2020, EUR 127 MWh-1 in 2030, and EUR 70 MWh-1 in 2050. Cutoff values of EUR 200 MWh-1 in 2020, EUR 150 MWh-1 in 2030, and EUR 100 MWh-1 in 2050 are imposed on the calculated LCOE values in each grid cell to limit the technical potential, resulting in an economic potential for Europe of 19 GWe in 2020, 22 GWe in 2030, and 522 GWe in 2050. The results of our approach not only provide an indication of prospective areas for future EGS in Europe, but also show a more realistic, cost-determined and depth-dependent distribution of the technical potential, obtained by applying different well cost models for 2020, 2030, and 2050.
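The economic screen described above, a discounted cash-flow LCOE compared against a scenario cutoff in each grid cell, can be sketched as follows. The formula (discounted lifetime costs over discounted lifetime energy) is the standard one; all numeric inputs below are illustrative placeholders, not the study's parameters.

```python
def lcoe(capex, opex_per_year, energy_mwh_per_year, discount_rate, lifetime_years):
    """Levelized cost of energy: discounted lifetime costs divided by
    discounted lifetime energy (here EUR per MWh)."""
    disc_costs = capex + sum(opex_per_year / (1 + discount_rate) ** t
                             for t in range(1, lifetime_years + 1))
    disc_energy = sum(energy_mwh_per_year / (1 + discount_rate) ** t
                      for t in range(1, lifetime_years + 1))
    return disc_costs / disc_energy

# Hypothetical EGS doublet (illustrative numbers only).
cost = lcoe(capex=30e6, opex_per_year=0.5e6, energy_mwh_per_year=40_000,
            discount_rate=0.07, lifetime_years=30)
# Apply a scenario cutoff (e.g. EUR 200/MWh for 2020) to screen a grid cell.
economic = cost <= 200.0
```

Cells whose LCOE falls under the scenario cutoff contribute their capacity to the economic potential; raising the cutoff over time is what lets the potential grow from 2020 to 2050.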
A parabolic velocity-decomposition method for wind turbines
NASA Astrophysics Data System (ADS)
Mittal, Anshul; Briley, W. Roger; Sreenivas, Kidambi; Taylor, Lafayette K.
2017-02-01
An economical parabolized Navier-Stokes approximation for steady incompressible flow is combined with a compatible wind turbine model to simulate wind turbine flows, both upstream of the turbine and in downstream wake regions. The inviscid parabolizing approximation is based on a Helmholtz decomposition of the secondary velocity vector and physical order-of-magnitude estimates, rather than an axial pressure gradient approximation. The wind turbine is modeled by distributed source-term forces incorporating time-averaged aerodynamic forces generated by a blade-element momentum turbine model. A solution algorithm is given whose dependent variables are streamwise velocity, streamwise vorticity, and pressure, with secondary velocity determined by two-dimensional scalar and vector potentials. In addition to laminar and turbulent boundary-layer test cases, solutions for a streamwise vortex-convection test problem are assessed by mesh refinement and comparison with Navier-Stokes solutions using the same grid. Computed results for a single turbine and a three-turbine array are presented using the NREL offshore 5-MW baseline wind turbine. These are also compared with an unsteady Reynolds-averaged Navier-Stokes solution computed with full rotor resolution. On balance, the agreement in turbine wake predictions for these test cases is very encouraging given the substantial differences in physical modeling fidelity and computer resources required.
An Economical Semi-Analytical Orbit Theory for Retarded Satellite Motion About an Oblate Planet
NASA Technical Reports Server (NTRS)
Gordon, R. A.
1980-01-01
Brouwer's and Brouwer-Lyddane's use of the von Zeipel-Delaunay method is employed to develop an efficient analytical orbit theory suitable for microcomputers. A succinctly simple, pseudo-phenomenologically conceived algorithm is introduced which accurately and economically synthesizes the modeling of drag effects. The method lends itself to effortless and efficient computer mechanization. Simulated trajectory data are employed to illustrate the theory's ability to accurately accommodate oblateness and drag effects for microcomputer ground-based or onboard predicted orbital representation. Real tracking data are used to demonstrate that the theory's orbit determination and orbit prediction capabilities adapt favorably to, and are comparable with, results obtained using complex definitive Cowell method solutions on satellites experiencing significant drag effects.
Buffer thermal energy storage for an air Brayton solar engine
NASA Technical Reports Server (NTRS)
Strumpf, H. J.; Barr, K. P.
1981-01-01
The application of latent-heat buffer thermal energy storage to a point-focusing solar receiver equipped with an air Brayton engine was studied. To demonstrate the effect of buffer thermal energy storage on engine operation, a computer program was written which models the recuperator, receiver, and thermal storage device as finite-element thermal masses. Actual operating or predicted performance data are used for all components, including the rotating equipment. Based on insolation input and a specified control scheme, the program predicts the Brayton engine operation, including flows, temperatures, and pressures for the various components, along with the engine output power. An economic parametric study indicates that the economic viability of buffer thermal energy storage is largely a function of the achievable engine life.
[Metalworking industry management evolution].
Mattucci, Massimo
2011-01-01
Analysis of the drivers of the evolution of management systems in the metalworking industry, mainly characterized as "automotive", starting with the "mass production" model followed in the development of Italian industry in the 1950s. Through the socio-economic changes of the 1990s-2010s, metalworking plants were deeply restructured, first with the introduction of computers into production systems and then with the first global benchmarks such as "lean production", moving toward the operational flexibility needed to respond to market dynamics. Plants change radically, company networks become real, and ICT services are fundamental elements of integration. These trends help in visualizing a new "Factory of the Future" for the years 2020-30, in which competition will be based on the socio-economic, technological and environmental factors included in the "Competitive Sustainable Manufacturing" paradigm.
Economic analysis of wind-powered farmhouse and farm building heating systems
NASA Astrophysics Data System (ADS)
Stafford, R. W.; Greeb, F. J.; Smith, M. H.; Deschenes, C.; Weaver, N. L.
1981-01-01
The break-even values of wind energy for selected farmhouses and farm buildings were evaluated, focusing on the effects of thermal storage on the use of WECS production. Farmhouse structural models include three types derived from a national survey: an older structure, a more modern structure, and a passive solar structure. The eight farm building applications include: (1) poultry layers; (2) poultry brooding/layers; (3) poultry broilers; (4) poultry turkeys; (5) swine farrowing; (6) swine growing/finishing; (7) dairy; and (8) lambing. The farm buildings represent the spectrum of animal types, heating energy use, and major contributions to national agricultural economic values. All energy analyses are based on hour-by-hour computations which allow for growth of animals, sensible and latent heat production, and ventilation requirements.
SAHRA integrated modeling approach to address water resources management in semi-arid river basins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springer, E. P.; Gupta, Hoshin V.; Brookshire, David S.
Water resources decisions in the 21st century that will affect the allocation of water for economic and environmental uses will rely on simulations from integrated models of river basins. These models will not only couple natural systems such as surface and ground waters, but will include economic components that can assist in model assessments of river basins and bring the social dimension to the decision process. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated models to assess impacts of climate variability and land use change on water resources in semi-arid river basins. The objectives of this paper are to describe the SAHRA integrated modeling approach and the linkage between social and natural sciences in these models. Water resources issues that arise from climate variability or land use change may require models of different resolution to answer different questions. For example, a question related to streamflow may not need a high-resolution model, whereas a question concerning the source and nature of a pollutant will. SAHRA has taken a multiresolution approach to integrated model development because one cannot anticipate the questions in advance, and the computational and data resources may not always be available or needed for the issue to be addressed. The coarsest resolution model is based on dynamic simulation of subwatersheds or river reaches. This model resolution has the advantage of simplicity, and social factors are readily incorporated. Users can readily take this model (and they have) and examine the effects of various management strategies such as an increased cost of water. The medium resolution model is grid based and uses variable grid cells of 1-12 km.
The surface hydrology is more physically based, using basic equations for the energy and water balance terms, and modules are being incorporated that will simulate engineering components such as reservoirs or irrigation diversions and economic features such as variable demand. The fine resolution model is viewed as a tool to examine basin response using the best available process models. It operates on a grid cell size of 100 m or less, which is consistent with the scale at which our process knowledge has developed, and couples atmosphere, surface water and groundwater modules using high performance computing. Unlike the coarse resolution model, the medium and fine resolution models are not at this time expected to be operated by users. One of the objectives of the SAHRA integrated modeling task is to present results in a manner that can be used by those making decisions. The application of these models within SAHRA is driven by a scenario analysis and a specific place. The place is the Rio Grande from its headwaters in Colorado to the New Mexico-Texas border. This provides a focus for model development and an attempt to see how the results from the various models relate. The scenario selected by SAHRA is the impact of a 1950s-style drought, with 1990s population and land use, on Rio Grande water resources including surface water and groundwater. The same climate variables will be used to drive all three models so that the comparison will be based on how the three resolutions partition and route water through the river basin. Aspects of this scenario will be discussed and initial model simulations will be presented. The issue of linking economic modules into the modeling effort will be discussed, and the importance of feedback from the social and economic modules to the natural science modules will be reviewed.
NASA Astrophysics Data System (ADS)
Li, Guang
2017-01-01
This paper presents a fast constrained optimization approach tailored for nonlinear model predictive control of wave energy converters (WECs). The advantage of this approach lies in its exploitation of the differential flatness of the WEC model, which reduces the dimension of the nonlinear programming problem (NLP) derived from the continuous constrained optimal control of the WEC using a pseudospectral method. The reduced computational burden of this approach helps promote an economical implementation of the nonlinear model predictive control strategy for WEC control problems. The method is applicable to nonlinear WEC models, nonconvex objective functions and nonlinear constraints, which are commonly encountered in WEC control problems. Numerical simulations demonstrate the efficacy of this approach.
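A minimal illustration of the flatness idea, using a generic mass-spring-damper stand-in rather than the paper's WEC model: when position is a flat output, the input follows algebraically from the flat-output trajectory, so an NLP need only optimize over trajectory coefficients and check constraints at collocation points.

```python
import numpy as np

def flat_input(coeffs, t, m=1.0, b=0.5, k=2.0):
    """Recover the control u from a polynomial flat-output trajectory
    x(t) = sum_i coeffs[i] * t**i for the 1-DOF model m*x'' + b*x' + k*x = u.
    (Hypothetical mass-spring-damper stand-in, not the paper's WEC dynamics;
    m, b, k defaults are arbitrary.)"""
    p = np.polynomial.Polynomial(coeffs)
    x, v, a = p(t), p.deriv(1)(t), p.deriv(2)(t)
    return m * a + b * v + k * x

# Because u is determined by the coefficients, an NMPC solver can optimize
# over `coeffs` alone and enforce input bounds only at collocation nodes,
# shrinking the NLP relative to optimizing full state and input trajectories.
u_at_nodes = flat_input([0.0, 0.0, 1.0], np.linspace(0.0, 1.0, 5))
```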
Economic agents and markets as emergent phenomena
Tesfatsion, Leigh
2002-01-01
An overview of recent work in agent-based computational economics is provided, with a stress on the research areas highlighted in the National Academy of Sciences Sackler Colloquium session “Economic Agents and Markets as Emergent Phenomena” held in October 2001. PMID:12011395
NASA Astrophysics Data System (ADS)
Gu, Rui
Vapor compression cycles are widely used in heating, refrigeration and air-conditioning. A slight performance improvement in the components of a vapor compression cycle, such as the compressor, can play a significant role in saving energy. However, the complexity and cost of such improvements can block their adoption in the market. Modifying the conventional cycle configuration offers a less complex and less costly alternative. Economizing is a common modification for improving the performance of the refrigeration cycle, decreasing the work required to compress the gas per unit mass. Traditionally, economizing requires multi-stage compressors, whose cost has restrained the scope for practical implementation. Compressors with injection ports, which can be used to inject economized refrigerant during the compression process, introduce new possibilities for economization at lower cost. This work focuses on computationally investigating the performance of a refrigeration system with two-phase fluid injection, developing a better understanding of the impact of injected refrigerant quality on system performance, and evaluating the potential COP improvement that injection provides, based on refrigeration system performance data provided by Copeland.
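The COP bookkeeping behind such a comparison can be sketched from state enthalpies. Both functions and all enthalpy and work numbers below are illustrative assumptions, not the thesis's model or Copeland's data.

```python
def cop_simple(h_evap_in, h_evap_out, h_comp_in, h_comp_out):
    """COP of a basic vapor-compression cycle from state enthalpies (kJ/kg):
    cooling effect across the evaporator over compressor specific work."""
    return (h_evap_out - h_evap_in) / (h_comp_out - h_comp_in)

def cop_injected(h_evap_in, h_evap_out, w_stage1, w_stage2, inj_ratio):
    """Economized two-stage cycle, per kg of evaporator flow: the second
    stage compresses (1 + inj_ratio) kg, but injection cools the stream so
    the total specific work can still fall relative to single-stage."""
    w_total = w_stage1 + (1.0 + inj_ratio) * w_stage2
    return (h_evap_out - h_evap_in) / w_total

# Hypothetical enthalpy/work values (kJ/kg), for illustration only.
baseline = cop_simple(250.0, 420.0, 420.0, 470.0)       # 170 / 50 = 3.4
injected = cop_injected(250.0, 420.0, 22.0, 20.0, 0.2)  # 170 / 46
```

In the actual study the enthalpies come from refrigerant property data as functions of the injected refrigerant quality; the sketch only shows where the COP gain enters the arithmetic.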
NASA Technical Reports Server (NTRS)
DeChant, Lawrence Justin
1998-01-01
In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development is reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary-layer approaches but provides important two-dimensional information not available using quasi-1-d approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. The models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M.
These experiments were performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.
ENGINEERING ECONOMIC ANALYSIS OF A PROGRAM FOR ARTIFICIAL GROUNDWATER RECHARGE.
Reichard, Eric G.; Bredehoeft, John D.
1984-01-01
This study describes and demonstrates two alternate methods for evaluating the relative costs and benefits of artificial groundwater recharge using percolation ponds. The first analysis considers the benefits to be the reduction of pumping lifts and land subsidence; the second considers benefits as the alternative costs of a comparable surface delivery system. Example computations are carried out for an existing artificial recharge program in Santa Clara Valley in California. A computer groundwater model is used to estimate both the average long term and the drought period effects of artificial recharge in the study area. Results indicate that the costs of artificial recharge are considerably smaller than the alternative costs of an equivalent surface system.
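The pumping-lift benefit in the first analysis amounts to the avoided energy cost of lifting water over a smaller height. A hedged sketch of that arithmetic, with hypothetical basin numbers:

```python
def annual_pumping_cost(volume_m3, lift_m, pump_efficiency, price_per_kwh):
    """Energy cost of lifting groundwater: E = rho*g*V*h / efficiency,
    converted from joules to kWh and priced. Inputs below are illustrative."""
    rho_g = 1000.0 * 9.81  # specific weight of water, N/m^3
    energy_j = rho_g * volume_m3 * lift_m / pump_efficiency
    return energy_j / 3.6e6 * price_per_kwh

# Hypothetical basin: recharge reduces the average lift from 60 m to 45 m
# for 50 million m^3/yr of pumping; the difference is the annual benefit.
benefit = (annual_pumping_cost(50e6, 60.0, 0.65, 0.12)
           - annual_pumping_cost(50e6, 45.0, 0.65, 0.12))
```

The study's groundwater model supplies the lift reduction; the subsidence and surface-delivery comparisons add further benefit terms to the same ledger.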
ERIC Educational Resources Information Center
Hernandez Reyes, Christine M.
2013-01-01
Home computer ownership and Internet access have become essential to education, job security and economic opportunity. The digital divide, the gap between those who can afford and can use computer technologies remains greatest for ethnic/racial groups placing them at a disadvantage for economic and educational opportunities. The purpose of the…
Annual Report of the Metals and Ceramics Information Center, 1 May 1979-30 April 1980.
1980-07-01
MANAGEMENT AND ECONOMIC ANALYSIS DEPT.: Computer and Information Systems Operations; Battelle Technical Inputs to Planning; Computer Systems; Biomass Resources; Education; Business Planning; Information Systems; Economics, Planning and Policy Analysis; Statistical and Mathematical Modeling. The Metals and Ceramics Information Center (MCIC) is one of several technical information analysis centers (IACs) chartered and sponsored by the…
Study of Wind Effects on Unique Buildings
NASA Astrophysics Data System (ADS)
Olenkov, V.; Puzyrev, P.
2017-11-01
The article deals with a numerical simulation of wind effects on the building of the Church of the Intercession of the Holy Virgin in the village of Bulzi in the Chelyabinsk region. We present a calculation algorithm and the resulting pressure fields, velocity fields, fields of kinetic energy of the wind stream, and streamlines. Computational fluid dynamics (CFD) evolved three decades ago at the interface of computational mathematics and theoretical hydromechanics and has become a separate branch of science whose subject is the numerical simulation of fluid and gas flows, along with the solution of the associated problems using computer systems. This scientific field, which is of great practical value, is developing intensively. The growth in CFD calculations has been driven by improvements in computer technology and the creation of multipurpose, easy-to-use CFD packages that are available to a wide community of researchers and cope with a variety of tasks. Such programs are not only competitive with physical experiments; sometimes they provide the only opportunity to answer the research questions. The following advantages of computer simulation can be pointed out: a) Reduction in time spent on the design and development of a model in comparison with a real experiment (variation of boundary conditions). b) A numerical experiment allows for the simulation of conditions that are not reproducible in environmental tests (use of an ideal gas as the medium). c) Computational gas dynamics methods provide a researcher with the complete and ample information necessary to fully describe the processes in the experiment. d) The economic efficiency of computer calculations is more attractive than that of an experiment. e) The possibility of modifying a computational model, which ensures efficient timing (changing the sizes of wall-layer cells in accordance with the chosen turbulence model).
The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eden, H.F.; Mooers, C.N.K.
1990-06-01
The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.
NASA Astrophysics Data System (ADS)
Lauren, Ari; Kinnunen, Jyrki-Pekko; Sikanen, Lauri
2016-04-01
Bioenergy contributes 26 % of the total energy use in Finland, and 60 % of this is provided by solid forest fuel consisting of small stems and logging residues such as tops, branches, roots and stumps. Typically the logging residues are stored as piles on site before being transported to regional combined heat and power plants for combustion. The profitability of forest fuel use depends on smart control of the feedstock. Fuel moisture, dry matter loss, and the rate of interest during storage are the key variables affecting the economic value of the fuel. The value increases with drying, but decreases with wetting, dry matter loss and a positive rate of interest. We compiled a simple simulation model that computes the moisture change, dry matter loss, transportation costs and present value of feedstock piles. The model was used to predict the time of the maximum value of the stock, and to compose feedstock allocation strategies under the question: how should we choose the piles and the combustion time so that the total energy yield and the economic value of the energy production are maximized? The question was assessed with respect to the demand of the energy plant. The model parameterization was based on field-scale studies. The initial moisture and the rates of daily moisture change and dry matter loss in the feedstock piles depended on the day of the year according to empirical field measurements. The time step of the computation was one day. The effects of pile-use timing on the total energy yield and profitability were studied using combinatorial optimization. Results show that storing increases the pile's maximum value if natural drying begins soon after harvesting; otherwise dry matter loss and the capital cost of storing overcome the benefits gained by drying.
Optimized timing of pile use can slightly improve profitability, owing to the increased total energy yield and because transportation costs per unit of energy decrease as the water content of the biomass decreases.
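The daily pile-value bookkeeping described above might be sketched as follows. The wet-basis calorific-value expression is a standard approximation and every parameter is illustrative, not the paper's field-calibrated rates.

```python
def pile_present_value(days, dry_mass_kg, moisture0, dry_rate_per_day,
                       dm_loss_per_day, price_per_mwh, annual_rate):
    """Day-by-day discounted value of a residue pile; returns (best_day,
    best_present_value). Drying raises heating value; dry matter loss and
    discounting erode value. All parameters are illustrative assumptions."""
    best_day, best_pv = 0, float("-inf")
    m, dm = moisture0, dry_mass_kg
    for day in range(days + 1):
        # Net calorific value per kg of wet fuel, MJ/kg: ~19 MJ/kg for dry
        # wood, minus an evaporation penalty for the water fraction m.
        ncv = 19.0 * (1.0 - m) - 2.44 * m
        energy_mwh = dm / (1.0 - m) * ncv / 3600.0
        pv = energy_mwh * price_per_mwh / (1.0 + annual_rate) ** (day / 365.0)
        if pv > best_pv:
            best_day, best_pv = day, pv
        m = max(0.15, m - dry_rate_per_day)  # drying toward a ~15 % floor
        dm *= 1.0 - dm_loss_per_day          # daily dry matter loss
    return best_day, best_pv

# Hypothetical pile: 20 t dry mass, 50 % initial moisture, slow drying.
day, value = pile_present_value(180, 20_000.0, 0.50, 0.004, 0.0005, 50.0, 0.05)
```

Running this for every candidate pile and combustion date, subject to plant demand, reproduces in miniature the combinatorial allocation problem the paper solves.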
ERIC Educational Resources Information Center
Uthe, Elaine F.
1982-01-01
Describes the growing use of computers in our world and how their use will affect vocational education. Discusses recordkeeping and database functions, computer graphics, problem-solving simulations, satellite communications, home computers, and how they will affect office education, home economics education, marketing and distributive education,…
Computational fluid dynamics modelling in cardiovascular medicine
Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2016-01-01
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards ‘digital patient’ or ‘virtual physiological human’ representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. PMID:26512019
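For intuition on one such metric: in an idealized straight rigid vessel, wall shear stress has a closed form (Poiseuille flow), which is useful as an order-of-magnitude check on CFD output. The inputs below are hypothetical coronary-scale numbers, not values from the review.

```python
import math

def wall_shear_stress_poiseuille(flow_m3_s, radius_m, viscosity_pa_s):
    """Wall shear stress for fully developed laminar (Poiseuille) flow in a
    straight rigid tube: tau = 4*mu*Q / (pi*r^3). Real vessels violate these
    assumptions, which is exactly why CFD is needed, but the formula gives a
    sanity check on the magnitude of simulated WSS."""
    return 4.0 * viscosity_pa_s * flow_m3_s / (math.pi * radius_m ** 3)

# Hypothetical coronary-scale inputs: Q = 1 mL/s, r = 1.5 mm, mu = 3.5 mPa*s.
tau_pa = wall_shear_stress_poiseuille(1.0e-6, 1.5e-3, 3.5e-3)  # ~1.3 Pa
```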
Environmental and socio-economic risk modelling for Chagas disease in Bolivia.
Mischler, Paula; Kearney, Michael; McCarroll, Jennifer C; Scholte, Ronaldo G C; Vounatsou, Penelope; Malone, John B
2012-09-01
Accurately defining disease distributions and calculating disease risk is an important step in the control and prevention of diseases. Geographical information systems (GIS) and remote sensing technologies, with maximum entropy (Maxent) ecological niche modelling computer software, were used to create predictive risk maps for Chagas disease in Bolivia. Prevalence rates were calculated from 2007 to 2009 household infection survey data for Bolivia, while environmental data were compiled from the Worldclim database and MODIS satellite imagery. Socio-economic data were obtained from the Bolivian National Institute of Statistics. Disease models identified altitudes of 500-3,500 m above mean sea level (MSL), low annual precipitation (45-250 mm), and a higher diurnal range of temperature (10-19 °C; peak 16 °C) as compatible with the biological requirements of the insect vectors. Socio-economic analyses demonstrated the importance of improved housing materials and water source. Adobe home wall materials and having to fetch drinking water from rivers or wells without a pump were found to be highly related to the distribution of the disease, as measured by the receiver operating characteristic (ROC) area under the curve (AUC) (0.69 AUC, 0.67 AUC and 0.62 AUC, respectively), while areas with hardwood floors demonstrated a direct negative relationship (-0.71 AUC). This study demonstrates that Maxent modelling can be used in disease prevalence and incidence studies to provide governmental agencies with an easily learned, understandable method to define areas as being at high, moderate or low risk for the disease. This information may be used in resource planning, targeting and implementation. However, access to high-resolution, sub-municipality socio-economic data (e.g. census tracts) would facilitate elucidation of the relative influence of poverty-related factors on regional disease dynamics.
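The final high/moderate/low mapping step can be sketched as simple thresholding of a Maxent-style suitability score applied cell by cell over the predicted surface. The cutoff values here are illustrative, not the paper's.

```python
def classify_risk(suitability, low_cut=0.33, high_cut=0.66):
    """Bin a Maxent-style habitat-suitability score in [0, 1] into the
    high / moderate / low risk classes used for planning. The cutoffs
    are illustrative assumptions, not the study's calibrated values."""
    if not 0.0 <= suitability <= 1.0:
        raise ValueError("suitability must lie in [0, 1]")
    if suitability >= high_cut:
        return "high"
    if suitability >= low_cut:
        return "moderate"
    return "low"

# A categorical risk map is this function applied to every grid cell.
labels = [classify_risk(s) for s in (0.05, 0.40, 0.90)]
```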
Analysis of fuel system technology for broad property fuels
NASA Technical Reports Server (NTRS)
Coffinberry, G. A.
1984-01-01
An analytical study was performed in order to assess relative performance and economic factors involved with alternative advanced fuel systems for future commercial aircraft operating with broad property fuels. Significant results, with emphasis on design practicality from the engine manufacturer's standpoint, are highlighted. Several advanced fuel systems were modeled to determine as accurately as possible the relative merits of each system from the standpoint of compatibility with broad property fuel. Freezing point, thermal stability, and lubricity were key property issues. A computer model was formulated to determine the investment incentive for each system. Results are given.
Analyzing interaction of electricity markets and environmental policies using equilibrium models
NASA Astrophysics Data System (ADS)
Chen, Yihsu
Around the world, the electric sector is evolving from a system of regulated vertically-integrated monopolies to a complex system of competing generation companies, unregulated traders, and regulated transmission and distribution. One emerging challenge faced by environmental policymakers and the electricity industry is the interaction between electricity markets and environmental policies. The objective of this dissertation is to examine these interactions using large-scale computational models of electricity markets based on noncooperative game theory. In particular, this dissertation comprises four essays. The first essay studies the interaction of the United States Environmental Protection Agency's NOx Budget Program and the mid-Atlantic electricity market. This research quantifies emissions, economic inefficiencies, price distortions, and overall social welfare under various market assumptions using engineering-economic models. The models calculate equilibria for imperfectly competitive markets (Cournot oligopoly), considering the actual landscape of power plants and transmission lines, and including the possibility of market power in the NOx allowances market. The second essay extends the results of the first essay and models imperfectly competitive markets using a Stackelberg, or leader-follower, formulation. A leader in the power and NOx markets is assumed to have perfect foresight of its rivals' responses. The rivals' best-response functions are explicitly embedded in the leader's constraints. The solutions quantify the extent to which a leader in the markets can extract economic rents at the expense of its followers. The third essay investigates the effect of implementing the European Union (EU) CO2 Emissions Trading Scheme (ETS) on wholesale power prices in the Western European electricity market.
This research uses theoretical and computational modeling approaches to quantify the degree to which CO2 costs were passed on to power prices, and quantifies the windfall profits earned by generators under the current EU allowances allocation method. The results show that generators in the EU could earn substantial windfall profits from two sources: free emissions allowances and increased gross margins among inframarginal generating units. The fourth essay examines the effect of climate change on future pollution emissions from regional electricity markets, accounting for how climate influences demand profiles and generation efficiencies. This research illustrates that even when seasonal/annual pollution emissions are limited by regulatory caps, significant increases in emissions during high-demand hours could potentially lead to an increase in the occurrence of acute ozone episodes, which worsen public health during summer months. The major contributions of this dissertation are twofold. First, the methodological and computational framework developed in the research provides a basis for understanding complex interactions among several oligopolistic markets and climate policies. Second, the outcomes of the research reinforce the need for careful monitoring of market interactions and a thorough examination of the design of allowances and power markets.
The economic burden of Clostridium difficile
McGlone, S. M.; Bailey, R. R.; Zimmer, S. M.; Popovich, M. J.; Tian, Y.; Ufberg, P.; Muder, R. R.; Lee, B. Y.
2013-01-01
Although Clostridium difficile (C. difficile) is the leading cause of infectious diarrhoea in hospitalized patients, the economic burden of this major nosocomial pathogen for hospitals, third-party payers and society remains unclear. We developed an economic computer simulation model to determine the costs attributable to healthcare-acquired C. difficile infection (CDI) from the hospital, third-party payer and societal perspectives. Sensitivity analyses explored the effects of varying the cost of hospitalization, C. difficile-attributable length of stay, and the probability of initial and secondary recurrences. The median cost of a case ranged from $9179 to $11 456 from the hospital perspective, $8932 to $11 679 from the third-party payer perspective, and $13 310 to $16 464 from the societal perspective. Most of the costs incurred were accrued during a patient's primary CDI episode. Hospitals with an incidence of 4.1 CDI cases per 100 000 discharges would incur costs ≥$3.2 million (hospital perspective); an incidence of 10.5 would lead to costs ≥$30.6 million. Our model suggests that the annual US economic burden of CDI would be ≥$496 million (hospital perspective), ≥$547 million (third-party payer perspective) and ≥$796 million (societal perspective). Our results show that C. difficile infection is indeed costly, not only to third-party payers and the hospital, but to society as well. These results are consistent with current literature citing C. difficile as a costly disease. PMID:21668576
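The perspective-dependent cost estimates above come from a Monte Carlo style economic simulation. A minimal sketch of that idea, with entirely illustrative parameter ranges (not the study's inputs), might look like:

```python
import random

def simulate_cdi_cost(n_runs=10_000, seed=42):
    """Monte Carlo sketch of the per-case CDI cost from the hospital perspective.

    All parameter ranges are illustrative placeholders, not the study's inputs.
    """
    rng = random.Random(seed)
    costs = []
    for _ in range(n_runs):
        daily_cost = rng.uniform(1500, 2500)    # hypothetical cost per hospital day ($)
        extra_los = rng.uniform(3, 7)           # attributable length of stay (days)
        p_recurrence = rng.uniform(0.15, 0.30)  # probability of an initial recurrence
        cost = daily_cost * extra_los
        if rng.random() < p_recurrence:         # a recurrence adds a (cheaper) second episode
            cost += 0.8 * daily_cost * extra_los
        costs.append(cost)
    costs.sort()
    return costs[len(costs) // 2]               # median per-case cost

median_cost = simulate_cdi_cost()
```

Varying each input range in turn reproduces the spirit of the paper's sensitivity analyses.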
Economics of movable interior blankets for greenhouses
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G.B.; Fohner, G.R.; Albright, L.D.
1981-01-01
A model for evaluating the economic impact of investment in a movable interior blanket was formulated. The method of analysis was net present value (NPV), in which the discounted, after-tax cash flow of costs and benefits was computed for the useful life of the system. An added feature was a random number component which permitted any or all of the input parameters to be varied within a specified range. Results from 100 computer runs indicated that all of the NPV estimates generated were positive, showing that the investment was profitable. However, there was a wide range of NPV estimates, from $16.00/m² to $86.40/m², with a median value of $49.34/m². Key variables allowed to range in the analysis were: (1) the cost of fuel before the blanket is installed; (2) the percent fuel savings resulting from use of the blanket; (3) the annual real increase in the cost of fuel; and (4) the change in the annual value of the crop. The wide range in NPV estimates indicates the difficulty in making general recommendations regarding the economic feasibility of the investment when uncertainty exists as to the correct values for key variables in commercial settings. The results also point out needed research into the effect of the blanket on the crop, and on performance characteristics of the blanket.
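The study's approach of drawing input parameters at random and computing a discounted NPV for each run can be sketched as follows; the cost and savings figures here are illustrative placeholders, not the paper's data:

```python
import random

def npv(cash_flows, rate):
    """Net present value of a list of annual cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def blanket_npv(rng, life_years=10, rate=0.04):
    """One randomized NPV draw ($/m2); all figures are illustrative, not the study's."""
    install_cost = 20.0                        # up-front cost, $/m2
    fuel_cost = rng.uniform(8.0, 14.0)         # fuel cost before the blanket, $/m2/yr
    savings_pct = rng.uniform(0.3, 0.6)        # fraction of fuel saved by the blanket
    escalation = rng.uniform(0.0, 0.03)        # real annual increase in fuel cost
    flows = [-install_cost] + [
        fuel_cost * savings_pct * (1 + escalation) ** year
        for year in range(1, life_years + 1)
    ]
    return npv(flows, rate)

rng = random.Random(0)
estimates = sorted(blanket_npv(rng) for _ in range(100))
median_npv = estimates[50]
```

As in the study, repeating the draw 100 times yields a distribution of NPV estimates rather than a single point value.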
NASA Astrophysics Data System (ADS)
Hamza, Karim; Shalaby, Mohamed
2014-09-01
This article presents a framework for simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs at regions of the search space where there is high expectancy of improvement. This article attempts to address one of the limitations of EGO, where generation of infill samples can become a difficult optimization problem in its own right, as well as allow the generation of multiple samples at a time in order to take advantage of parallel computing in the evaluation of the new samples. The proposed approach is tested on analytical functions, and then applied to the vehicle crashworthiness design of a full Geo Metro model undergoing frontal crash conditions.
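The expected-improvement criterion at the heart of EGO can be written in closed form from the kriging model's predicted mean and standard deviation. A small self-contained sketch (for minimization, using the standard EI formula):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement (minimization) for a kriging prediction N(mu, sigma^2).

    EI(x) = (f_best - mu) * Phi(z) + sigma * phi(z), with z = (f_best - mu) / sigma.
    """
    if sigma <= 0.0:
        return 0.0
    z = (f_best - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_best - mu) * Phi + sigma * phi

# A candidate predicted well below the incumbent best has high EI;
# a confident prediction above it has essentially none.
ei_good = expected_improvement(mu=0.5, sigma=0.2, f_best=1.0)
ei_poor = expected_improvement(mu=2.0, sigma=0.01, f_best=1.0)
```

Maximizing this quantity over the search space is the "infill" subproblem that, as the article notes, can be a difficult optimization in its own right.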
Electric power and the global economy: Advances in database construction and sector representation
NASA Astrophysics Data System (ADS)
Peters, Jeffrey C.
The electricity sector plays a crucial role in the global economy. The sector is a major consumer of fossil fuel resources, producer of greenhouse gas emissions, and an important indicator and correlate of economic development. As such, the sector is a primary target for policy-makers seeking to address these issues. The sector is also experiencing rapid technological change in generation (e.g. renewables), primary inputs (e.g. horizontal drilling and hydraulic fracturing), and end-use efficiency. This dissertation seeks to further our understanding of the role of the electricity sector as part of the dynamic global energy-economy, which requires significant research advances in both database construction and modeling techniques. Chapter 2 identifies useful engineering-level data and presents a novel matrix balancing method for integrating these data in global economic databases. Chapter 3 demonstrates the relationship between matrix balancing method and modeling results, and Chapter 4 presents the full construction methodology for GTAP-Power, the foremost publicly available global computable general equilibrium database. Chapter 5 presents an electricity-detailed computational equilibrium model that explicitly and endogenously captures capacity utilization, capacity expansion, and their interdependency, important aspects of technological substitution in the electricity sector. The individual, but interrelated, research contributions to database construction and electricity modeling in computational equilibrium are placed in the context of analyzing the US EPA Clean Power Plan (CPP) target of a 32 percent reduction in CO2 emissions in the US electricity sector from a 2005 baseline by 2030. Assuming current fuel prices, the model predicts an almost 28 percent CO2 reduction without further policy intervention. Next, a carbon tax and investment subsidies for renewable technologies to meet the full CPP targets are imposed and compared (Chapter 6). 
The carbon tax achieves the target through both utilization and expansion, while the renewable investment subsidies lead to over-expansion and compromise some of the reductions attainable through utilization. In doing so, this dissertation furthers our understanding of the role of the electricity sector as part of the dynamic global energy-economy.
2011 Computation Directorate Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2012-04-11
From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. 
Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. 
industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global marketplace by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.
Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.
ERIC Educational Resources Information Center
Pierre, Samuel
2001-01-01
Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…
NASA Astrophysics Data System (ADS)
Park, Chan-Hee; Lee, Cholwoo
2016-04-01
The Raspberry Pi series comprises low-cost computers, smaller than a credit card, to which various operating systems such as Linux and, recently, even Windows 10 have been ported. Thanks to mass production and rapid technological development, the price of the various sensors that can be attached to a Raspberry Pi has been dropping at an increasing speed. The device can therefore be an economical choice as a small portable computer for monitoring temporal hydrogeological data in the field. In this study, we present a Raspberry Pi system that measures the flow rate and temperature of groundwater at sites, stores them in a MySQL database, and produces interactive figures and tables, such as Google Charts online or Bokeh offline, for further monitoring and analysis. Since all the data are monitored over the internet, any computer or mobile device can serve as a convenient monitoring tool. The measured data are further integrated with OpenGeoSys, a hydrogeological model that has also been ported to the Raspberry Pi series. This enables onsite hydrogeological modeling fed by temporal sensor data to meet various needs.
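The logging pipeline described, sensor readings appended to a database for later visualization, can be sketched in a few lines. SQLite stands in below for the MySQL database of the field setup, and the sensor values are fixed placeholders rather than actual GPIO reads:

```python
import sqlite3
import time

# SQLite stands in here for the MySQL database used in the field deployment.
def log_reading(conn, flow_rate_lpm, temperature_c):
    """Append one timestamped groundwater reading."""
    conn.execute(
        "INSERT INTO readings (ts, flow_lpm, temp_c) VALUES (?, ?, ?)",
        (time.time(), flow_rate_lpm, temperature_c),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts REAL, flow_lpm REAL, temp_c REAL)")

# In the field these values would come from sensors attached to the Pi;
# here they are fixed placeholders.
log_reading(conn, flow_rate_lpm=12.4, temperature_c=14.2)
log_reading(conn, flow_rate_lpm=12.1, temperature_c=14.3)
rows = conn.execute("SELECT COUNT(*), AVG(temp_c) FROM readings").fetchone()
```

A web front end (Google Charts or Bokeh, as in the paper) would then query this table to render the time series.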
Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model
NASA Astrophysics Data System (ADS)
Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr
2017-10-01
Issues concerning the advanced numerical analysis of concrete building structures in sophisticated computing systems currently require the involvement of nonlinear mechanics tools. Efforts to design safer, more durable and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly represents another step towards approximating the real behaviour of concrete building structures in computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models within computer simulations often becomes problematic because these models frequently contain parameters (material constants) whose values are difficult to obtain. Obtaining correct parameter values is nevertheless essential to ensure that the material model functions properly. One approach that permits a successful solution of this problem is the use of optimization algorithms for optimization-based inverse material parameter identification. Parameter identification goes hand in hand with experimental investigation: it seeks the parameter values of the chosen material model for which the data obtained from the computer simulation best approximate the experimental data. This paper focuses on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model. 
Within this paper, material parameters of the model are identified on the basis of interaction between nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter of which take the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading which may be further used for research involving dynamic and high-speed tensile loading. Based on the obtained results it can be concluded that the set goal has been reached.
The NASA Lewis Research Center: An Economic Impact Study
NASA Technical Reports Server (NTRS)
Austrian, Ziona
1996-01-01
The NASA Lewis Research Center (LeRC), established in 1941, is one of ten NASA research centers in the country. It is situated on 350 acres of land in Cuyahoga County and occupies more than 140 buildings and over 500 specialized research and test facilities. Most of LeRC's facilities are located in the City of Cleveland; some are located within the boundaries of the cities of Fairview Park and Brookpark. LeRC is a lead center for NASA's research, technology, and development in the areas of aeropropulsion and selected space applications. It is a center of excellence for turbomachinery, microgravity fluid and combustion research, and commercial communication. The base research and technology disciplines which serve both aeronautics and space areas include materials and structures, instrumentation and controls, fluid physics, electronics, and computational fluid dynamics. This study investigates LeRC's economic impact on Northeast Ohio's economy. It was conducted by The Urban Center's Economic Development Program in Cleveland State University's Levin College of Urban Affairs. The study measures LeRC's direct impact on the local economy in terms of jobs, output, payroll, and taxes, as well as the indirect impact of these economic activities when they 'ripple' throughout the economy. To fully explain LeRC's overall impact on the region, its contributions in the areas of technology transfer and education are also examined. The study uses a highly credible and widely accepted research methodology. First, regional economic multipliers based on input-output models were used to estimate the effect of LeRC spending on the Northeast Ohio economy. Second, the economic models were complemented by interviews with industrial, civic, and university leaders to qualitatively assess LeRC's impact in the areas of technology transfer and education.
Kotsopoulos, Nikolaos; Connolly, Mark P.
2014-01-01
Vaccination is an established intervention that reduces the burden and prevents the spread of infectious diseases. Investing in vaccination is known to offer a wide range of economic and intangible benefits that can potentiate gains for the individual and for society. The discipline of economics provides us with microeconomic and macroeconomic methods for evaluating the economic gains attributed to health status changes. However, the observed gap between micro and macro estimates attributed to health presents challenges to our understanding of health-related productivity changes and, consequently, economic benefits. The gap suggests that the manner in which health-related productive output is quantified in microeconomic models might not adequately reflect the broader economic benefit. We propose that there is a transitional domain that links the micro- and macroeconomic improvement attributed to health status changes. Currently available economic evaluation methods typically omit these consequences, however; they may be adjusted to integrate these transitional consequences. In practical terms, this may give rise to multipliers to apply toward indirect costs to account for the broader macroeconomic benefits linked to changes in health status. In addition, it is possible to consider that different medical conditions and health care interventions may pose different multiplying effects, suggesting that the manner in which resources are allocated within health services gives rise to variation in the amount of the micro–macro gap. An interesting way to move forward in integrating the micro- and macro-level assessment might be by integrating computable general equilibrium (CGE) models as part of the evaluation framework, as was recently performed for pandemic flu and malaria vaccination. PMID:27226842
NASA Astrophysics Data System (ADS)
Moore, F. C.; Baldos, U. L. C.; Hertel, T. W.; Diaz, D.
2016-12-01
Substantial advances have been made in recent years in understanding the effects of climate change on agriculture, but this is not currently represented in economic models used to quantify the benefits of reducing greenhouse gas emissions. In fact, the science regarding climate change impacts on agriculture in these models dates to the early 1990s or before. In this paper we derive new economic damage functions for the agricultural sector based on two methods for aggregating current scientific understanding of the impacts of warming on yields. We first present a new meta-analysis based on a review of the agronomic literature performed for the IPCC 5th Assessment Report and compare results from this approach with findings from the AgMIP Global Gridded Crop Model Intercomparison (GGCMI). We find yield impacts implied by the meta-analysis are generally more negative than those from the GGCMI, particularly at higher latitudes, but show substantial agreement in many areas. We then use both yield products as input to the Global Trade Analysis Project (GTAP) computable general equilibrium (CGE) model in order to estimate the welfare consequences of these yield shocks and to produce two new economic damage functions. These damage functions are consistently more negative than the current representation of agricultural damages in Integrated Assessment Models (IAMs), in some cases substantially so. Replacing the existing damage functions with those based on more recent science increases the social cost of carbon (SCC) by between 43% (GGCMI) and 143% (meta-analysis). In addition to presenting a new multi-crop, multi-model gridded yield impact product that complements the GGCMI, this is also the first end-to-end study that directly links the biophysical impacts of climate change to the SCC, something we believe essential to improving the integrity of IAMs going forward.
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice; Nassopoulos, Hypatia
2016-04-01
Global changes are expected to exacerbate water scarcity issues in the Mediterranean region in the next decades. In this work, we investigate the impacts of reservoir operating rules based on an economic criterion. We examine whether they can help reduce the costs of water scarcity, and whether they become more relevant under future climatic and socioeconomic conditions. We develop an original hydroeconomic model able to compare future water supply and demand on a large scale, while representing river basin heterogeneity. On the demand side, we focus on the two main sectors of water use: the irrigation and domestic sectors. Demands are projected in terms of both quantity and economic value. Irrigation requirements are computed for 12 types of crops, at 0.5° spatial resolution, under future climatic conditions (A1B scenario). The computation of the economic benefits of irrigation water is based on a yield comparison approach between rainfed and irrigated crops. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The economic value of domestic water is defined as the economic surplus. On the supply side, we evaluate the impacts of climate change on water inflows to the reservoirs. Operating rules of the reservoirs are set up using a parameterisation-simulation-optimisation approach. The objective is to maximise water benefits. We introduce prudential parametric rules in order to take into account spatial and temporal trade-offs. The methodology is applied to Algeria at the 2050 horizon. Overall, our results show that the supply-demand imbalance and its costs will increase in most basins under future climatic and socioeconomic conditions. 
Our results suggest that the benefits of operating rules based on economic criteria do not unequivocally increase with global changes: in some basins the positive impact of economic prioritisation is higher under future conditions, but in other basins it is higher under historical conditions. Global changes may be an incentive to use valuation in operating rules in some basins. In other basins, the benefits of reservoir management based on economic criteria are less pronounced; in this case, trade-offs could arise between implementing economics-based operation policies or not. Given its generic nature and low data requirements, the framework developed could be implemented in other regions concerned with water scarcity and its cost, or extended to a global coverage. Water policies at the country or regional level could be assessed.
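The parameterisation-simulation-optimisation loop can be illustrated with a toy single-reservoir model: choose a parametric release rule, simulate it over an inflow series, score the economic benefit, and search over the rule parameter. Everything below (benefit coefficients, release rule, inflow series) is illustrative, not the paper's model:

```python
def simulate(release_fraction, inflows, capacity=100.0, demand=30.0):
    """Simulate one parametric release rule and return total economic benefit.

    The rule releases a fixed fraction of current storage each period;
    supplied water earns a benefit, shortfalls incur a scarcity cost.
    """
    storage, benefit = 50.0, 0.0
    for inflow in inflows:
        storage = min(storage + inflow, capacity)       # water above capacity spills
        release = release_fraction * storage
        storage -= release
        supplied = min(release, demand)
        benefit += 2.0 * supplied - 1.0 * max(demand - supplied, 0.0)
    return benefit

inflows = [40.0, 10.0, 5.0, 60.0, 20.0, 5.0]            # synthetic seasonal inflows

# Parameterisation-simulation-optimisation: grid-search the rule parameter.
best_fraction = max((f / 20.0 for f in range(1, 21)),
                    key=lambda f: simulate(f, inflows))
```

The paper's approach is the same loop at scale, with prudential parametric rules and climate-driven inflow and demand projections.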
Persistence in a Random Bond Ising Model of Socio-Econo Dynamics
NASA Astrophysics Data System (ADS)
Jain, S.; Yamano, T.
We study the persistence phenomenon in a socio-econo dynamics model using computer simulations at a finite temperature on hypercubic lattices in dimensions up to five. The model includes a "social" local field which contains the magnetization at time t. The nearest neighbour quenched interactions are drawn from a binary distribution which is a function of the bond concentration, p. The decay of the persistence probability in the model depends on both the spatial dimension and p. We find no evidence of "blocking" in this model. We also discuss the implications of our results for possible applications in the social and economic fields. It is suggested that the absence, or otherwise, of blocking could be used as a criterion to decide on the validity of a given model in different scenarios.
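A minimal sketch of such a persistence measurement, here on a 1-D ring rather than the paper's hypercubic lattices and with illustrative parameters: spins evolve under Metropolis dynamics with random ±J bonds plus a global "social" field proportional to the magnetization, and persistence is the fraction of spins that have never flipped:

```python
import math
import random

def persistence(n=200, steps=50, p=0.5, T=0.5, h_social=0.1, seed=1):
    """Fraction of spins on a 1-D ring that never flip up to time t.

    Binary +/-J bonds are drawn with concentration p; spins evolve under
    Metropolis dynamics at temperature T plus a global "social" field
    coupling to the magnetization. All parameters are illustrative.
    """
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    bonds = [1 if rng.random() < p else -1 for _ in range(n)]  # bond i links sites i, i+1
    never_flipped = [True] * n
    for _ in range(steps):
        m = sum(spins) / n                      # magnetization entering the social field
        for _ in range(n):                      # one Monte Carlo sweep
            i = rng.randrange(n)
            local = (bonds[i - 1] * spins[i - 1]
                     + bonds[i] * spins[(i + 1) % n]
                     + h_social * m)
            dE = 2.0 * spins[i] * local
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i] = -spins[i]
                never_flipped[i] = False
    return sum(never_flipped) / n

P_t = persistence()
```

Tracking this fraction as a function of time, dimension, and p is the quantity whose decay the paper analyzes.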
Impact analysis of government investment on water projects in the arid Gansu Province of China
NASA Astrophysics Data System (ADS)
Wang, Zhan; Deng, Xiangzheng; Li, Xiubin; Zhou, Qing; Yan, Haiming
In this paper, we introduced a three-level nested Constant Elasticity of Substitution (CES) production function into a static Computable General Equilibrium (CGE) model. Through four levels of factor productivity, we constructed a three-nested production function of land use productivity in the conceptual modeling framework. The first level of factor productivity is generated by basic value-added land. On the second level, factor productivity in each sector is generated by human activities, representing human intervention in the first level. On the third level, water allocation reshapes the non-linear structure of transactions among the first and second levels. From the perspective of resource utilization, we examined the economic efficiency of water allocation. The scenario-based empirical analysis shows that the three-nested CES production function within the CGE model is well-behaved in representing the economic system of the case study area. First, water scarcity harmed economic production, so government investment in water projects in Gansu had impacts on economic outcomes. Second, heavy governmental financing of water projects brings a depreciation of the present value of social welfare; moreover, water use for environmental adaptation puts pressure on the water supply, and the theoretical water price can increase sharply owing to the rising costs of factor inputs. Third, water use efficiency can be improved by water projects, typically through the expansion of water-saving irrigation areas, even in the expanding dry areas of Gansu. Therefore, increased governmental financing of water projects can depreciate the present value of social welfare but benefit economic efficiency for future generations.
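The nested CES structure can be sketched directly: each level aggregates its inputs with its own substitution elasticity, and the levels compose. The share parameters and elasticities below are illustrative placeholders, not calibrated values from the Gansu model:

```python
def ces(inputs, shares, sigma):
    """CES aggregate Y = (sum_i a_i * x_i**rho) ** (1/rho), with rho = (sigma-1)/sigma."""
    rho = (sigma - 1.0) / sigma
    return sum(a * x ** rho for a, x in zip(shares, inputs)) ** (1.0 / rho)

def nested_output(land, labour, capital, water):
    """Three-level nesting in the spirit of the paper; all numbers are placeholders."""
    value_added = ces([land], [1.0], sigma=0.5)             # level 1: value-added land
    va_human = ces([value_added, labour, capital],
                   [0.4, 0.3, 0.3], sigma=0.8)              # level 2: human activities
    return ces([va_human, water], [0.7, 0.3], sigma=0.3)    # level 3: water allocation

y_base = nested_output(land=1.0, labour=1.0, capital=1.0, water=1.0)
y_scarce = nested_output(land=1.0, labour=1.0, capital=1.0, water=0.5)  # water scarcity
```

The low elasticity at the water nest makes output fall sharply when water is scarce, which is the mechanism behind the paper's first finding.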
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kistler, B.L.
DELSOL3 is a revised and updated version of the DELSOL2 computer program (SAND81-8237) for calculating collector field performance and layout and optimal system design for solar thermal central receiver plants. The code consists of a detailed model of the optical performance, a simpler model of the non-optical performance, an algorithm for field layout, and a searching algorithm to find the best system design based on energy cost. The latter two features are coupled to a cost model of central receiver components and an economic model for calculating energy costs. The code can handle flat, focused and/or canted heliostats, and external cylindrical, multi-aperture cavity, and flat plate receivers. The program optimizes the tower height, receiver size, field layout, heliostat spacings, and tower position at user specified power levels subject to flux limits on the receiver and land constraints for field layout. DELSOL3 maintains the advantages of speed and accuracy which are characteristics of DELSOL2.
Scherbaum, Stefan; Dshemuchadse, Maja; Goschke, Thomas
2012-01-01
Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economical theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice. PMID:23181048
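The mathematical-description strand (aim i) typically contrasts the normative exponential discount function with the descriptively superior hyperbolic one, whose hallmark is preference reversal under a common front-end delay. A minimal sketch with illustrative discount rates:

```python
import math

def exp_value(amount, delay, k=0.10):
    """Exponential (normative) discounted value: A * exp(-k * t)."""
    return amount * math.exp(-k * delay)

def hyp_value(amount, delay, k=0.20):
    """Hyperbolic discounted value: A / (1 + k * t); fits observed choices better."""
    return amount / (1.0 + k * delay)

# Preference reversal, a hallmark of hyperbolic discounting: the smaller-sooner
# reward wins when it is immediate, but adding a common 20-period front-end
# delay flips the preference to the larger-later reward.
now_small, now_large = hyp_value(50, 0), hyp_value(100, 10)
far_small, far_large = hyp_value(50, 20), hyp_value(100, 30)
```

Exponential discounting, by contrast, preserves the ranking under any common delay, which is why observed reversals motivate the hyperbolic form.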
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Wei-Chen; Maitra, Ranjan
2011-01-01
We propose a model-based approach for clustering time series regression data in an unsupervised machine learning framework to identify groups under the assumption that each mixture component follows a Gaussian autoregressive regression model of order p. Given the number of groups, the traditional maximum likelihood approach of estimating the parameters using the expectation-maximization (EM) algorithm can be employed, although it is computationally demanding. The somewhat faster variant of EM provided by the Alternating Expectation Conditional Maximization (AECM) algorithm can alleviate the problem to some extent. In this article, we develop an alternative partial expectation conditional maximization algorithm (APECM) that uses an additional data augmentation storage step to efficiently implement AECM for finite mixture models. Results on our simulation experiments show improved performance in both the number of iterations and computation time. The methodology is applied to the problem of clustering mutual funds data on the basis of their average annual per cent returns and in the presence of economic indicators.
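The E- and M-steps underlying all of these algorithms can be illustrated with a two-component one-dimensional Gaussian mixture, a deliberately simplified stand-in for the Gaussian autoregressive mixtures the article treats; the data and initialization are invented:

```python
import math

def em_gaussian_mixture(data, n_iter=50):
    """Plain EM for a two-component 1-D Gaussian mixture (illustrative only)."""
    mu = [min(data), max(data)]       # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: update mixing weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk + 1e-6
    return mu, var, pi

data = [0.1, -0.2, 0.05, 5.1, 4.9, 5.2]
mu, var, pi = em_gaussian_mixture(data)
print(abs(mu[0]) < 0.5 and abs(mu[1] - 5.0) < 0.5)  # components recovered near 0 and 5
```

AECM and APECM accelerate exactly this iteration by splitting the M-step into conditional maximizations and, in APECM's case, caching augmented data between steps.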
Brosch, Tobias; Coppin, Géraldine; Schwartz, Sophie; Sander, David
2012-06-01
Neuroeconomic research has delineated neural regions involved in the computation of value, referring to a currency for concrete choices and decisions ('economic value'). Research in psychology and sociology, on the other hand, uses the term 'value' to describe motivational constructs that guide choices and behaviors across situations ('core value'). As a first step towards an integration of these literatures, we compared the neural regions computing economic value and core value. Replicating previous work, economic value computations activated a network centered on medial orbitofrontal cortex. Core value computations activated medial prefrontal cortex, a region involved in the processing of self-relevant information, and dorsal striatum, which is involved in action selection. Core value ratings correlated with activity in precuneus and anterior prefrontal cortex, potentially reflecting the degree to which a core value is perceived as an internalized part of one's self-concept. Distributed activation patterns in insula and ACC allowed differentiation of individual core value types. These patterns may represent evaluation profiles reflecting prototypical fundamental concerns expressed in the core value types. Our findings suggest mechanisms by which core values, as motivationally important long-term goals anchored in the self-schema, may have the behavioral power to drive decisions and behaviors in the absence of immediately rewarding behavioral options.
A simulation study on Bayesian Ridge regression models for several collinearity levels
NASA Astrophysics Data System (ADS)
Efendi, Achmad; Effrihan
2017-12-01
When analyzing data with a multiple regression model, if collinearity is present, one or several predictor variables are usually omitted from the model. Sometimes, however, there are reasons, for instance medical or economic ones, why all predictors are important and should be included in the model. Ridge regression is commonly used to cope with collinearity: weights on the predictor variables are used in estimating the parameters. Estimation can follow the likelihood approach, and a Bayesian version is now an alternative. The Bayesian method has not matched the likelihood approach in popularity due to some difficulties, computation among them; nevertheless, with recent improvements in computational methodology, this caveat should no longer be a problem. This paper discusses a simulation study for evaluating the characteristics of Bayesian Ridge regression parameter estimates. The simulation settings span a variety of collinearity levels and sample sizes. The results show that the Bayesian method performs better for relatively small sample sizes, and for the other settings it performs similarly to the likelihood method.
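Ridge estimation can be read as Bayesian MAP estimation under a Gaussian prior on the coefficients, giving the closed form beta = (X'X + lam*I)^-1 X'y. A pure-Python sketch of that point estimate on invented, nearly collinear data; note this is the MAP shortcut, not the full posterior simulation the paper evaluates:

```python
# Ridge (Gaussian-prior MAP) fit on collinear data. `lam` is an assumed
# illustrative prior-strength value, not a tuned hyperparameter.

def ridge_fit(X, y, lam=1.0):
    n, p = len(X), len(X[0])
    # Build A = X'X + lam*I and b = X'y
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) + (lam if j == k else 0.0)
          for k in range(p)] for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    # Solve A beta = b by Gaussian elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, p))) / A[r][r]
    return beta

# Two nearly collinear predictors; OLS would produce wildly unstable coefficients.
X = [[1.0, 1.01], [2.0, 1.99], [3.0, 3.02], [4.0, 3.98]]
y = [2.0, 4.0, 6.0, 8.0]
beta = ridge_fit(X, y, lam=0.5)
print(all(abs(bi) < 2.0 for bi in beta))  # shrinkage keeps both coefficients bounded
```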
Ranking of Air Force Heating Plants Relative to the Economic Benefit of Coal Utilization
1989-11-01
HTlW Output Capacity ..................... 27 5.2.2 Combustion Technologies ......................... 31 5.3 COMPUTER MODEL FOR LCC ANALYSIS ...and field-erected units have been examined. The packaged units are factory-built, shell (fire-tube) boilers that are small enough to be shipped by...40 MBtu/h with a thermal energy capacity factor of about 65% if used as a baseload heating plant. A water-tube boiler with a steam rating of 1200
The aquatic animals' transcriptome resource for comparative functional analysis.
Chou, Chih-Hung; Huang, Hsi-Yuan; Huang, Wei-Chih; Hsu, Sheng-Da; Hsiao, Chung-Der; Liu, Chia-Yu; Chen, Yu-Hung; Liu, Yu-Chen; Huang, Wei-Yun; Lee, Meng-Lin; Chen, Yi-Chang; Huang, Hsien-Da
2018-05-09
Aquatic animals have great economic and ecological importance. Among them, non-model organisms have been studied regarding eco-toxicity, stress biology, and environmental adaptation. Due to recent advances in next-generation sequencing techniques, large amounts of RNA-seq data for aquatic animals are publicly available. However, no comprehensive resource currently exists for the analysis, unification, and integration of these datasets. This study utilizes computational approaches to build a new resource of transcriptomic maps for aquatic animals. This aquatic animal transcriptome map database, dbATM, provides de novo transcriptome assembly, gene annotation, and comparative analysis of more than twenty aquatic organisms without a draft genome. To improve assembly quality, three computational tools (Trinity, Oases and SOAPdenovo-Trans) were employed to enhance individual transcriptome assembly, and CAP3 and CD-HIT-EST software were then used to merge these three assembled transcriptomes. In addition, functional annotation analysis provides valuable clues to gene characteristics, including full-length transcript coding regions, conserved domains, gene ontology and KEGG pathways. Furthermore, all aquatic animal genes are available for comparative genomics tasks such as constructing homologous gene groups and BLAST databases and performing phylogenetic analysis. In conclusion, we establish a resource for non-model aquatic animals, which are of great economic and ecological importance, providing transcriptomic information including functional annotation and comparative transcriptome analysis. The database is publicly accessible at http://dbATM.mbc.nctu.edu.tw/.
Optimal CO2 mitigation under damage risk valuation
NASA Astrophysics Data System (ADS)
Crost, Benjamin; Traeger, Christian P.
2014-07-01
The current generation has to set mitigation policy under uncertainty about the economic consequences of climate change. This uncertainty governs both the level of damages for a given level of warming, and the steepness of the increase in damage per warming degree. Our model of climate and the economy is a stochastic version of a model employed in assessing the US Social Cost of Carbon (DICE). We compute the optimal carbon taxes and CO2 abatement levels that maximize welfare from economic consumption over time under different risk states. In accordance with recent developments in finance, we separate preferences about time and risk to improve the model's calibration of welfare to observed market interest rates. We show that introducing the modern asset pricing framework doubles optimal abatement and carbon taxation. Uncertainty over the level of damages at a given temperature increase can result in a slight increase of optimal emissions as compared to using expected damages. In contrast, uncertainty governing the steepness of the damage increase in temperature results in a substantially higher level of optimal mitigation.
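The asymmetry between level and steepness uncertainty can be illustrated with a DICE-style damage function D = a*T^b: damages are linear in the level coefficient a but convex in the exponent b, so averaging over steepness states raises expected damages (Jensen's inequality) while level risk averages out. All numbers below are invented for illustration:

```python
# Level vs. steepness uncertainty in a DICE-style damage function D = a * T**b.
# Coefficients, exponents, and probabilities are illustrative assumptions.

T = 3.0  # warming in degrees C

def damage(coeff, exponent):
    return coeff * T ** exponent

# Level uncertainty: coefficient is 0.005 or 0.015 with equal odds
level_mean = 0.5 * damage(0.005, 2) + 0.5 * damage(0.015, 2)
level_certain = damage(0.010, 2)          # damage at the expected coefficient

# Steepness uncertainty: exponent is 1.5 or 2.5 with equal odds
steep_mean = 0.5 * damage(0.010, 1.5) + 0.5 * damage(0.010, 2.5)
steep_certain = damage(0.010, 2.0)        # damage at the expected exponent

print(abs(level_mean - level_certain) < 1e-9)  # True: level risk averages out
print(steep_mean > steep_certain)              # True: steepness risk raises expected damages
```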
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general-purpose digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to perform analyses such as electrocardiography, generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
Evaluating Health Co-Benefits of Climate Change Mitigation in Urban Mobility
Wolkinger, Brigitte; Weisz, Ulli; Hutter, Hans-Peter; Delcour, Jennifer; Griebler, Robert; Mittelbach, Bernhard; Maier, Philipp; Reifeltshammer, Raphael
2018-01-01
There is growing recognition that implementation of low-carbon policies in urban passenger transport has near-term health co-benefits through increased physical activity and improved air quality. Nevertheless, co-benefits and related cost reductions are often not taken into account in decision processes, likely because they are not easy to capture. In an interdisciplinary multi-model approach we address this gap, investigating the co-benefits resulting from increased physical activity and improved air quality due to climate mitigation policies for three urban areas. Additionally we take a (macro-)economic perspective, since that is the ultimate interest of policy-makers. Methodologically, we link a transport modelling tool, a transport emission model, an emission dispersion model, a health model and a macroeconomic Computable General Equilibrium (CGE) model to analyze three climate change mitigation scenarios. We show that higher levels of physical exercise and reduced exposure to pollutants due to mitigation measures substantially decrease morbidity and mortality. Expenditures are mainly borne by the public sector but are mostly offset by the emerging co-benefits. Our macroeconomic results indicate a strong positive welfare effect, yet with slightly negative GDP and employment effects. We conclude that considering economic co-benefits of climate change mitigation policies in urban mobility can be put forward as a forceful argument for policy makers to take action. PMID:29710784
What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses
Doherty, Katie; Ciaranello, Andrea
2013-01-01
Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788
Research on monocentric model of urbanization by agent-based simulation
NASA Astrophysics Data System (ADS)
Xue, Ling; Yang, Kaizhong
2008-10-01
Over the past years, GIS has been widely used for modeling urbanization from a variety of perspectives, such as digital terrain representation and overlay analysis using a cell-based data platform. Similarly, simulation of urban dynamics has been achieved with the use of Cellular Automata. In contrast to these approaches, agent-based simulation provides a much more powerful set of tools, allowing researchers to set up a counterpart of real environmental and urban systems in the computer for experimentation and scenario analysis. This paper reviews research on the economic mechanisms of urbanization, and an agent-based monocentric model is set up for further understanding the urbanization process and its mechanisms in China. We build an endogenous growth model with dynamic interactions between spatial agglomeration and urban development by using agent-based simulation. It simulates the migration decisions of two main types of agents, namely rural and urban households, between rural and urban areas. The model contains multiple economic interactions that are crucial to understanding the urbanization and industrialization process in China. These adaptive agents can adjust their supply and demand according to the market situation via a learning algorithm. The simulation results show that this agent-based urban model is able to regenerate observed patterns and to produce plausible projections of reality.
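The core migration decision in such a monocentric model can be sketched in a few lines: each rural household moves when expected urban income net of congestion costs exceeds rural income, with agglomeration raising wages and congestion raising costs as the city grows. All functional forms and parameters below are invented, not the paper's calibration:

```python
import random

# Toy monocentric migration dynamics: agglomeration wage gains vs. congestion
# costs produce an interior equilibrium urbanization level. Illustrative only.

random.seed(0)
RURAL_INCOME = 1.0
N = 1000          # total households
urban = 100       # initial urban households

for step in range(50):
    share = urban / N
    urban_wage = 1.5 * share ** 0.2      # agglomeration economies
    congestion = 0.8 * share ** 1.5      # congestion/living costs
    movers = 0
    for _ in range(N - urban):
        noise = random.gauss(0, 0.05)    # heterogeneous perception of the payoff
        if urban_wage - congestion + noise > RURAL_INCOME:
            movers += 1
    urban += movers // 10                # only a fraction relocates each step

print(0 < urban <= N)  # an interior urbanization level emerges
```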
Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly
The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent of this effort is to test and assess the model's behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether a predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations about the predicted change. One purpose of this effort is to determine whether model changes are needed in order to improve its behavior qualitatively and quantitatively.
Monte Carlo simulation of single accident airport risk profile
NASA Technical Reports Server (NTRS)
1979-01-01
A computer simulation model was developed for estimating the potential economic impacts of a carbon fiber release upon facilities within an 80 kilometer radius of a major airport. The model simulated the possible range of release conditions and the resulting dispersion of the carbon fibers. Each iteration of the model generated a specific release scenario, which would cause a specific amount of dollar loss to the surrounding community. By repeated iterations, a risk profile was generated, showing the probability distribution of losses from one accident. Using accident probability estimates, the risk profile for annual losses was derived. The mechanics of the simulation model, the required input data, and the risk profiles generated for the 26 large hub airports are described.
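The iterate-and-collect structure described above is a standard Monte Carlo loop: draw a release scenario, compute its dollar loss, and read the risk profile off the sorted losses. The distributions and cost parameters below are invented placeholders, not the report's data:

```python
import random

# Monte Carlo sketch of a single-accident risk profile: each iteration draws a
# release scenario and computes a loss; the collection of losses approximates
# the loss probability distribution. All parameters are illustrative.

random.seed(7)

def simulate_loss():
    fibers_released = random.lognormvariate(10, 1)   # amount released (skewed)
    dispersion = random.uniform(0.1, 1.0)            # fraction reaching facilities
    loss_per_fiber = 0.001                           # assumed dollars of damage per fiber
    return fibers_released * dispersion * loss_per_fiber

losses = sorted(simulate_loss() for _ in range(10_000))

def prob_loss_exceeds(threshold):
    """Empirical exceedance probability read off the simulated risk profile."""
    return sum(1 for l in losses if l > threshold) / len(losses)

median_loss = losses[len(losses) // 2]
print(prob_loss_exceeds(median_loss))  # close to 0.5 by construction
```

Multiplying the profile by an annual accident probability, as the report does, converts the single-accident distribution into an annual-loss risk profile.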
Krieger, Nancy; Feldman, Justin M; Waterman, Pamela D; Chen, Jarvis T; Coull, Brent A; Hemenway, David
2017-04-01
Research on residential segregation and health, primarily conducted in the USA, has chiefly employed city or regional measures of racial segregation. To test our hypothesis that stronger associations would be observed using local measures, especially for racialized economic segregation, we analyzed risk of fatal and non-fatal assault in Massachusetts (1995-2010), since this outcome is strongly associated with residential segregation. The segregation metrics comprised the Index of Concentration at the Extremes (ICE), the Index of Dissimilarity, and poverty rate, with measures computed at both the census tract and city/town level. Key results were that larger associations between fatal and non-fatal assaults and residential segregation occurred for models using the census tract vs. city/town measures, with the greatest associations observed for racialized economic segregation. For fatal assaults, comparing the bottom vs. top quintiles, the incidence rate ratio (and 95% confidence interval (CI)) in models using the census tract measures equaled 3.96 (95% CI 3.10, 5.06) for the ICE for racialized economic segregation, 3.26 (95% CI 2.58, 4.14) for the ICE for income, 3.14 (95% CI 2.47, 3.99) for poverty, 2.90 (95% CI 2.21, 3.81) for the ICE for race/ethnicity, and only 0.93 (95% CI 0.79, 1.11) for the Index of Dissimilarity; in models that included both census tract and city/town ICE measures, this risk ratio for the ICE for racialized economic segregation was higher at the census tract (3.29; 95% CI 2.43, 4.46) vs. city/town level (1.61; 95% CI 1.12, 2.32). These results suggest that, at least in the case of fatal and non-fatal assaults, research on residential segregation should employ local measures, including of racialized economic segregation, to avoid underestimating the adverse impact of segregation on health.
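The Index of Concentration at the Extremes used in the study has a simple closed form, ICE = (privileged - deprived) / total, ranging from -1 (everyone in the deprived extreme) to +1 (everyone in the privileged extreme). A sketch with hypothetical tract counts, not the study's data:

```python
# Index of Concentration at the Extremes (ICE) for one area (e.g. census tract).
# Counts below are hypothetical illustrations.

def ice(privileged, deprived, total):
    """ICE in [-1, 1]: +1 all privileged extreme, -1 all deprived extreme."""
    return (privileged - deprived) / total

# e.g. for racialized economic segregation: counts in the two extreme groups
# within a tract of 4,000 residents
print(ice(privileged=500, deprived=1500, total=4000))  # -0.25
```

Computing this per census tract rather than per city/town is exactly the "local measure" choice the study finds produces the larger associations.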
Summerfield, Christopher; Tsetsos, Konstantinos
2012-01-01
Investigation into the neural and computational bases of decision-making has proceeded in two parallel but distinct streams. Perceptual decision-making (PDM) is concerned with how observers detect, discriminate, and categorize noisy sensory information. Economic decision-making (EDM) explores how options are selected on the basis of their reinforcement history. Traditionally, the sub-fields of PDM and EDM have employed different paradigms, proposed different mechanistic models, explored different brain regions, and disagreed about whether decisions approach optimality. Nevertheless, we argue that there is a common framework for understanding decisions made in both tasks, under which an agent has to combine sensory information (what is the stimulus) with value information (what is it worth). We review computational models of the decision process typically used in PDM, based around the idea that decisions involve a serial integration of evidence, and assess their applicability to decisions between goods and gambles. Subsequently, we consider the contribution of three key brain regions - the parietal cortex, the basal ganglia, and the orbitofrontal cortex (OFC) - to PDM and EDM, with a focus on the mechanisms by which sensory and reward information are integrated during choice. We find that although the parietal cortex is often implicated in the integration of sensory evidence, there is evidence for its role in encoding the expected value of a decision. Similarly, although much research has emphasized the role of the striatum and OFC in value-guided choices, they may play an important role in categorization of perceptual information. In conclusion, we consider how findings from the two fields might be brought together, in order to move toward a general framework for understanding decision-making in humans and other primates.
Writing Better Software for Economics Principles Textbooks.
ERIC Educational Resources Information Center
Walbert, Mark S.
1989-01-01
Examines computer software currently available with most introductory economics textbooks. Compares what is available with what should be available in order to meet the goal of effectively using the microcomputer to teach economic principles. Recommends 14 specific pedagogical changes that should be made in order to improve current designs. (LS)
Systems Analysis and Design for Decision Support Systems on Economic Feasibility of Projects
NASA Astrophysics Data System (ADS)
Balaji, S. Arun
2010-11-01
This paper discusses the need for development of Decision Support System (DSS) software for assessing the economic feasibility of projects in Rwanda, Africa. The economic theories needed and the corresponding formulae to compute the payback period, internal rate of return, and benefit-cost ratio of projects are clearly given. The paper also presents the systems flow chart for building the system in any high-level computing language. The input requirements from the projects and the output needed by the decision makers are also included, and the data dictionary used for the input and output data structures is explained.
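The three appraisal measures the paper names (payback period, internal rate of return, benefit-cost ratio) can be sketched directly. The cash flows below are invented, and the IRR is found here by bisection on the NPV, one of several standard root-finding choices:

```python
# Payback period, IRR, and benefit-cost ratio on an assumed cash-flow stream.
# Index 0 of `cash_flows` is the (negative) initial investment.

def payback_period(cash_flows):
    """Years until cumulative cash flow turns non-negative (None if never)."""
    cumulative = 0.0
    for year, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None

def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-8):
    """IRR by bisection on NPV(rate) = 0 (NPV decreases in rate for this profile)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def benefit_cost_ratio(benefits, costs, rate):
    """Present value of benefits over present value of costs, discounted at `rate`."""
    pv_b = sum(b / (1 + rate) ** t for t, b in enumerate(benefits))
    pv_c = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    return pv_b / pv_c

flows = [-1000.0, 400.0, 400.0, 400.0, 400.0]
print(payback_period(flows))   # 3
print(round(irr(flows), 3))    # 0.219
print(round(benefit_cost_ratio([0, 400, 400, 400, 400],
                               [1000, 0, 0, 0, 0], 0.10), 2))  # 1.27
```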
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.
The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.
Artificial Exo-Society Modeling: a New Tool for SETI Research
NASA Astrophysics Data System (ADS)
Gardner, James N.
2002-01-01
One of the newest fields of complexity research is artificial society modeling. Methodologically related to artificial life research, artificial society modeling utilizes agent-based computer simulation tools like SWARM and SUGARSCAPE developed by the Santa Fe Institute, Los Alamos National Laboratory and the Brookings Institution in an effort to introduce an unprecedented degree of rigor and quantitative sophistication into social science research. The broad aim of artificial society modeling is to begin the development of a more unified social science that embeds cultural evolutionary processes in a computational environment that simulates demographics, the transmission of culture, conflict, economics, disease, the emergence of groups and coadaptation with an environment in a bottom-up fashion. When an artificial society computer model is run, artificial societal patterns emerge from the interaction of autonomous software agents (the "inhabitants" of the artificial society). Artificial society modeling invites the interpretation of society as a distributed computational system and the interpretation of social dynamics as a specialized category of computation. Artificial society modeling techniques offer the potential of computational simulation of hypothetical alien societies in much the same way that artificial life modeling techniques offer the potential to model hypothetical exobiological phenomena. NASA recently announced its intention to begin exploring the possibility of including artificial life research within the broad portfolio of scientific fields encompassed by the interdisciplinary astrobiology research endeavor. It may be appropriate for SETI researchers to likewise commence an exploration of the possible inclusion of artificial exo-society modeling within the SETI research endeavor.
Artificial exo-society modeling might be particularly useful in a post-detection environment by (1) coherently organizing the set of data points derived from a detected ETI signal, (2) mapping trends in the data points over time (assuming receipt of an extended ETI signal), and (3) projecting such trends forward to derive alternative cultural evolutionary scenarios for the exo-society under analysis. The latter exercise might be particularly useful to compensate for the inevitable time lag between generation of an ETI signal and receipt of an ETI signal on Earth. For this reason, such an exercise might be a helpful adjunct to the decisional process contemplated by Paragraph 9 of the Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence.
Modeling and Simulation of the Economics of Mining in the Bitcoin Market
Marchesi, Michele
2016-01-01
On January 3, 2009, Satoshi Nakamoto gave rise to the “Bitcoin Blockchain”, creating the first block of the chain by hashing on his computer’s central processing unit (CPU). Since then, the hash calculations required to mine Bitcoin have grown more and more complex, and the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: GPUs, FPGAs, and ASICs. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some “stylized facts” found in real price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network. PMID:27768691
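A minimal sketch of the mining economics such a model captures: a miner's expected revenue scales with its share of network hash power, while electricity is the dominant recurring cost. The function name and every parameter value below are illustrative assumptions, not inputs of the paper's model.

```python
def daily_mining_profit(hash_rate, network_hash_rate, block_reward,
                        btc_price, power_watts, electricity_price_kwh):
    """Expected daily profit for one miner, in currency units.

    A miner's expected share of the ~144 blocks mined per day is
    proportional to its fraction of total network hash power.
    """
    blocks_per_day = 144  # one block roughly every 10 minutes
    share = hash_rate / network_hash_rate
    revenue = share * blocks_per_day * block_reward * btc_price
    energy_cost = power_watts / 1000.0 * 24 * electricity_price_kwh
    return revenue - energy_cost

# Illustrative numbers: a 1 TH/s rig on a 1 EH/s network.
profit = daily_mining_profit(1e12, 1e18, 6.25, 30000.0, 3000.0, 0.10)
```

Mining remains profitable only while revenue exceeds the energy cost, which is what drove the hardware generations described above toward ever more efficient designs.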
Chance-constrained economic dispatch with renewable energy and storage
Cheng, Jianqiang; Chen, Richard Li-Yang; Najm, Habib N.; ...
2018-04-19
Increased penetration of renewables, along with the uncertainties associated with them, has transformed how power systems are operated. High levels of uncertainty mean that it is no longer possible to guarantee operational feasibility with certainty; instead, constraints must be satisfied with high probability. We present a chance-constrained economic dispatch model that efficiently integrates energy storage and high renewable penetration to satisfy renewable portfolio requirements. Specifically, it is required that wind energy contribute at least a prespecified ratio of the total demand and that the scheduled wind energy be dispatchable with high probability. We develop an approximated partial sample average approximation (PSAA) framework to enable efficient solution of large-scale chance-constrained economic dispatch problems. Computational experiments on the IEEE-24 bus system show that the proposed PSAA approach is more accurate, closer to the prescribed tolerance, and about 100 times faster than sample average approximation. The improved efficiency of our PSAA approach enables solution of the WECC-240 system in minutes.
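The chance constraint described above ("scheduled wind is dispatchable with high probability") can be sketched with a plain sample-average check, the building block that SAA and PSAA methods refine. The wind distribution, tolerance, and variable names below are illustrative assumptions.

```python
import numpy as np

def chance_constraint_satisfied(scheduled_wind, wind_samples, epsilon=0.05):
    """True if realized wind covers the schedule in at least a
    (1 - epsilon) fraction of the sampled scenarios."""
    return np.mean(wind_samples >= scheduled_wind) >= 1.0 - epsilon

# Illustrative wind scenarios: 10,000 draws around a 100 MW forecast.
rng = np.random.default_rng(0)
samples = rng.normal(loc=100.0, scale=10.0, size=10_000)

# The largest dispatchable schedule is (approximately) the
# epsilon-quantile of the sampled wind distribution.
max_schedule = np.quantile(samples, 0.05)
```

A full dispatch model embeds this feasibility check inside a cost-minimization over all generators and the storage schedule.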
Distributed Economic Dispatch in Microgrids Based on Cooperative Reinforcement Learning.
Liu, Weirong; Zhuang, Peng; Liang, Hao; Peng, Jun; Huang, Zhiwu
2018-06-01
Microgrids incorporating distributed generation (DG) units and energy storage (ES) devices are expected to play increasingly important roles in future power systems. Yet achieving efficient distributed economic dispatch in microgrids is a challenging issue due to the randomness and nonlinear characteristics of DG units and loads. This paper proposes a cooperative reinforcement learning algorithm for distributed economic dispatch in microgrids. Using a learning algorithm avoids the difficulty of stochastic modeling and high computational complexity. In the cooperative reinforcement learning algorithm, function approximation is leveraged to deal with the large and continuous state spaces, and a diffusion strategy is incorporated to coordinate the actions of DG units and ES devices. Based on the proposed algorithm, each node in a microgrid needs to communicate only with its local neighbors, without relying on any centralized controller. Algorithm convergence is analyzed, and simulations based on real-world meteorological and load data are conducted to validate the performance of the proposed algorithm.
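The diffusion coordination step can be sketched as simple neighborhood averaging of each node's learned parameter vector, so that information spreads without a central controller. The uniform weights and the fully connected three-node topology below are illustrative assumptions, not the paper's exact combination rule.

```python
import numpy as np

def diffusion_step(params, neighbors):
    """One diffusion update: each node replaces its parameter vector
    with the mean of its own and its neighbors' vectors."""
    updated = []
    for i, p in enumerate(params):
        group = [p] + [params[j] for j in neighbors[i]]
        updated.append(np.mean(group, axis=0))
    return updated

# Three nodes, fully connected; parameters converge toward consensus.
params = [np.array([0.0]), np.array([3.0]), np.array([6.0])]
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
updated = diffusion_step(params, neighbors)
```

In the full algorithm each node would interleave such combination steps with local reinforcement-learning updates driven by its own cost signal.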
Keogh-Brown, Marcus Richard; Smith, Richard D; Edmunds, John W; Beutels, Philippe
2010-12-01
The 2003 outbreak of severe acute respiratory syndrome (SARS) showed that infectious disease outbreaks can have notable macroeconomic impacts. The current H1N1 and potential H5N1 flu pandemics could have a much greater impact. Using multi-sector single-country computable general equilibrium models of the United Kingdom, France, Belgium and The Netherlands, together with disease scenarios of varying severity, we examine the potential economic cost of a modern pandemic. Policies of school closure, vaccination and antivirals, together with prophylactic absence from work, are evaluated and their cost impacts are estimated. Results suggest GDP losses from the disease itself of approximately 0.5-2%, but school closure and prophylactic absenteeism more than triple these effects. Increasing school closures from 4 weeks at the peak to closure for the entire pandemic almost doubles the economic cost, but antivirals and vaccinations seem worthwhile. Careful planning is therefore important to ensure that expensive policies to mitigate the pandemic are effective in minimising illness and deaths.
Augmented Lagrange Hopfield network for solving economic dispatch problem in competitive environment
NASA Astrophysics Data System (ADS)
Vo, Dieu Ngoc; Ongsakul, Weerakorn; Nguyen, Khai Phuc
2012-11-01
This paper proposes an augmented Lagrange Hopfield network (ALHN) for solving the economic dispatch (ED) problem in a competitive environment. The proposed ALHN is a continuous Hopfield network whose energy function is based on an augmented Lagrange function, for efficiently dealing with constrained optimization problems. The ALHN method can overcome the drawbacks of the conventional Hopfield network such as local optima, long computational times, and the restriction to linear constraints. The proposed method is used for solving the ED problem with two revenue models: payment for power delivered and payment for reserve allocated. The proposed ALHN has been tested on two systems of 3 units and 10 units for the two considered revenue models. The results obtained from the proposed method are compared to those from the differential evolution (DE) and particle swarm optimization (PSO) methods. The comparison indicates that the proposed method is very efficient for solving the problem. Therefore, the proposed ALHN could be a favorable tool for the ED problem in a competitive environment.
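The augmented-Lagrange energy idea can be sketched on a toy two-unit dispatch: minimize a quadratic generation cost subject to a power balance, via gradient descent on the augmented Lagrangian plus a dual ascent step. This is a plain iterative sketch under assumed cost coefficients, not a Hopfield-network implementation, and the step sizes are illustrative.

```python
def solve_ed(a, b, demand, rho=10.0, lr=0.01, iters=5000):
    """Minimize sum_i (a_i*P_i**2 + b_i*P_i) subject to sum_i P_i = demand."""
    n = len(a)
    P = [demand / n] * n  # start from an even split
    lam = 0.0             # Lagrange multiplier for the power balance
    for _ in range(iters):
        g = demand - sum(P)  # power-balance violation
        for i in range(n):
            # Gradient of the augmented Lagrangian w.r.t. P_i
            grad = 2 * a[i] * P[i] + b[i] - lam - rho * g
            P[i] -= lr * grad
        lam += rho * lr * g  # dual ascent step
    return P

# Two units whose marginal costs equalize at P = (50, 50) for 100 MW demand.
P = solve_ed([0.01, 0.02], [2.0, 1.0], 100.0)
```

At the optimum both units run at equal marginal cost, the classic economic dispatch condition that the network formulation encodes in its energy function.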
Lee, Bruce Y.; Adam, Atif; Zenkov, Eli; Hertenstein, Daniel; Ferguson, Marie C.; Wang, Peggy I.; Wong, Michelle S.; Wedlock, Patrick; Nyathi, Sindiso; Gittelsohn, Joel; Falah-Fini, Saeideh; Bartsch, Sarah M.; Cheskin, Lawrence J.; Brown, Shawn T.
2017-01-01
Increasing physical activity among children is a potentially important public health intervention. Quantifying the economic and health effects of the intervention would help decision makers understand its impact and priority. Using a computational simulation model that we developed to represent all US children ages 8–11 years, we estimated that maintaining the current physical activity levels (only 31.9 percent of children get twenty-five minutes of high-calorie-burning physical activity three times a week) would result each year in a net present value of $1.1 trillion in direct medical costs and $1.7 trillion in lost productivity over the course of their lifetimes. If 50 percent of children would exercise, the number of obese and overweight youth would decrease by 4.18 percent, averting $8.1 billion in direct medical costs and $13.8 billion in lost productivity. Increasing the proportion of children who exercised to 75 percent would avert $16.6 billion and $23.6 billion, respectively. PMID:28461358
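Lifetime cost figures like these rest on discounting future medical costs and productivity losses to a net present value. A minimal discounting helper, with an assumed 3% annual rate (the rate and cash flows are illustrative, not the study's inputs), might look like:

```python
def npv(cash_flows, rate=0.03):
    """Present value of a list of annual costs; year 0 is undiscounted."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cash_flows))

# Illustrative: a flat $100/year cost stream over three years.
undiscounted = npv([100.0, 100.0, 100.0], rate=0.0)
discounted = npv([100.0, 100.0, 100.0], rate=0.03)
```

Summing such per-person present values over a simulated cohort yields the trillion-dollar aggregates reported above.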
Costa Rica Rainfall in Future Climate Change Scenarios
NASA Astrophysics Data System (ADS)
Castillo Rodriguez, R. A., Sr.; Amador, J. A.; Duran-Quesada, A. M.
2017-12-01
Studies of the intraseasonal and annual cycles of meteorological variables under climate change projections are extremely important for improving regional socio-economic planning. This is particularly true in Costa Rica, as Central America has been identified as a climate change hot spot. Today many of the economic activities in the region, especially those related to agriculture, tourism and hydroelectric power generation, are linked to the seasonal cycle of precipitation. Changes in rainfall (mm/day) and in the diurnal temperature range (°C) for the periods 1950-2005 and 2006-2100 were investigated using the NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP), constructed from CMIP5 (Coupled Model Intercomparison Project Phase 5) data. Differences between the multi-model ensembles of the two prospective scenarios (RCP 4.5 and RCP 8.5) and the retrospective baseline scenario were computed. This study highlights Costa Rica as an inflection point of climate change in the region and also suggests future drying conditions.
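The scenario-differencing computation described here amounts to subtracting ensemble means. A minimal sketch, with illustrative two-model, two-season arrays standing in for the NEX-GDDP fields:

```python
import numpy as np

def scenario_change(scenario_runs, baseline_runs):
    """Difference between multi-model ensemble means.

    Both inputs have shape (n_models, ...), e.g. per-season rainfall
    in mm/day; the result is the projected change."""
    return np.mean(scenario_runs, axis=0) - np.mean(baseline_runs, axis=0)

# Two models, two seasons (mm/day); values are purely illustrative.
rcp45 = np.array([[2.0, 4.0], [4.0, 6.0]])
baseline = np.array([[1.0, 1.0], [1.0, 3.0]])
change = scenario_change(rcp45, baseline)
```

The same operation applied per grid cell and per calendar month yields the change maps such studies report.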
Analysis of real-time reservoir monitoring : reservoirs, strategies, & modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mani, Seethambal S.; van Bloemen Waanders, Bart Gustaaf; Cooper, Scott Patrick
2006-11-01
The project objective was to detail better ways to assess and exploit intelligent oil and gas field information through improved modeling, sensor technology, and process control to increase ultimate recovery of domestic hydrocarbons. To meet this objective we investigated the use of permanent downhole sensor systems (Smart Wells) whose data is fed in real time into computational reservoir models that are integrated with optimized production control systems. The project utilized a three-pronged approach: (1) a value of information analysis to address the economic advantages, (2) reservoir simulation modeling and control optimization to prove the capability, and (3) evaluation of new generation sensor packaging to survive the borehole environment for long periods of time. The Value of Information (VOI) decision tree method was developed and used to assess the economic advantage of using the proposed technology; the VOI demonstrated the increased subsurface resolution through additional sensor data. Our findings show that the VOI studies are a practical means of ascertaining the value associated with a technology, in this case application of sensors to production. The procedure acknowledges the uncertainty in predictions but nevertheless assigns monetary value to the predictions. The best aspect of the procedure is that it builds consensus within interdisciplinary teams. The reservoir simulation and modeling aspect of the project was developed to show the capability of exploiting sensor information both for reservoir characterization and to optimize control of the production system. Our findings indicate history matching is improved as more information is added to the objective function, clearly indicating that sensor information can help in reducing the uncertainty associated with reservoir characterization. Additional findings and approaches used are described in detail within the report.
The next generation sensors aspect of the project evaluated sensors and packaging survivability issues. Our findings indicate that packaging represents the most significant technical challenge associated with application of sensors in the downhole environment for long periods (5+ years) of time. These issues are described in detail within the report. The impact of successful reservoir monitoring programs and coincident improved reservoir management is measured by the production of additional oil and gas volumes from existing reservoirs, revitalization of nearly depleted reservoirs, possible re-establishment of already abandoned reservoirs, and improved economics for all cases. Smart Well monitoring provides the means to understand how a reservoir process is developing and to provide active reservoir management. At the same time it also provides data for developing high-fidelity simulation models. This work has been a joint effort with Sandia National Laboratories and UT-Austin's Bureau of Economic Geology, Department of Petroleum and Geosystems Engineering, and the Institute of Computational and Engineering Mathematics.
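The VOI decision-tree logic described above can be sketched as the difference between the expected payoff when acting with perfect information and the expected payoff of the best single action without it. The payoff table and probabilities below are illustrative assumptions, not the project's figures.

```python
def expected_value_of_information(priors, payoffs):
    """VOI = E[best action per revealed state] - best action in expectation.

    priors: probability of each reservoir state.
    payoffs[action][state]: profit of an action in a given state.
    """
    n_actions = len(payoffs)
    # Without information: commit to the single best action in expectation.
    ev_without = max(sum(p * payoffs[a][s] for s, p in enumerate(priors))
                     for a in range(n_actions))
    # With perfect information: pick the best action for each revealed state.
    ev_with = sum(p * max(payoffs[a][s] for a in range(n_actions))
                  for s, p in enumerate(priors))
    return ev_with - ev_without

# Illustrative: drill (action 0) pays 100 in a good state, -50 in a bad one;
# doing nothing (action 1) pays 0 either way; states are equally likely.
voi = expected_value_of_information([0.5, 0.5], [[100.0, -50.0], [0.0, 0.0]])
```

The VOI is then compared with the cost of the sensor system: instrumentation is worthwhile only when VOI exceeds its cost.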
Economic impact of large public programs: The NASA experience
NASA Technical Reports Server (NTRS)
Ginzburg, E.; Kuhn, J. W.; Schnee, J.; Yavitz, B.
1976-01-01
The economic impact of NASA programs on weather forecasting and the computer and semiconductor industries is discussed. Contributions to the advancement of the science of astronomy are also considered.
HRST architecture modeling and assessments
NASA Astrophysics Data System (ADS)
Comstock, Douglas A.
1997-01-01
This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)
2002-01-01
The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions.
I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.
From the Patient Perspective: the Economic Value of Seasonal and H1N1 Influenza Vaccination
Lee, Bruce Y.; Bacon, Kristina; Donohue, Julie M.; Wiringa, Ann E.; Bailey, Rachel R.; Zimmerman, Richard K.
2011-01-01
Although studies have suggested that a patient’s perceived cost-benefit of a medical intervention could affect his or her utilization of the intervention, the economic value of influenza vaccine from the patient’s perspective remains unclear. Therefore, we developed a stochastic decision analytic computer model representing an adult’s decision of whether to get vaccinated. Different scenarios explored the impact of the patient being insured versus uninsured, the influenza attack rate, vaccine administration costs and vaccination time costs. Results indicated that the cost of avoiding influenza was fairly low, with one driver being the required vaccination time. To encourage vaccination, decision makers may want to focus on ways to reduce this time, such as vaccinating at workplaces, churches, or other normally frequented locations. PMID:21215340
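The patient-perspective trade-off the model evaluates can be sketched as a comparison of expected costs with and without vaccination. Every number below, including the efficacy parameter, is an illustrative assumption rather than one of the model's calibrated inputs.

```python
def expected_costs(attack_rate, illness_cost, vacc_cost, time_cost,
                   vaccine_efficacy=0.6):
    """Expected cost to the patient of vaccinating vs. declining.

    Vaccinating incurs the vaccine and time costs up front plus the
    residual risk of illness; declining incurs only the illness risk.
    """
    cost_vaccinate = (vacc_cost + time_cost
                      + (1 - vaccine_efficacy) * attack_rate * illness_cost)
    cost_decline = attack_rate * illness_cost
    return cost_vaccinate, cost_decline

# Illustrative: 20% attack rate, $500 illness cost, $25 vaccine, $20 of time.
cost_vacc, cost_decline = expected_costs(0.2, 500.0, 25.0, 20.0)
```

Vaccination is worthwhile from this perspective whenever the first value is below the second, which is why reducing the time cost shifts the decision.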
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy, L.; Rao, N.D.
1983-04-01
This paper presents a new method for optimal dispatch of real and reactive power generation which is based on a cartesian coordinate formulation of the economic dispatch problem and a reclassification of the state and control variables associated with generator buses. The voltage and power at these buses are classified as parametric and functional inequality constraints, and are handled by a reduced gradient technique and a penalty factor approach, respectively. The advantage of this classification is the reduction in the size of the equality constraint model, leading to lower storage requirements. The rectangular coordinate formulation results in an exact equality constraint model in which the coefficient matrix is real, sparse, diagonally dominant, smaller in size, and needs to be computed and factorized only once in each gradient step. In addition, Lagrangian multipliers are calculated using a new efficient procedure. A natural outcome of these features is a solution of the economic dispatch problem that is faster than other methods available to date in the literature. Rapid and reliable convergence is an additional desirable characteristic of the method. Digital simulation results are presented on several IEEE test systems to illustrate the range of application of the method vis-à-vis the popular Dommel-Tinney (DT) procedure. It is found that the proposed method is more reliable, 3-4 times faster and requires 20-30 percent less storage compared to the DT algorithm, while being just as general. Thus, owing to its exactness, robust mathematical model and lower computational requirements, the method developed in the paper is shown to be a practically feasible algorithm for on-line optimal power dispatch.
Separate valuation subsystems for delay and effort decision costs.
Prévost, Charlotte; Pessiglione, Mathias; Météreau, Elise; Cléry-Melin, Marie-Laure; Dreher, Jean-Claude
2010-10-20
Decision making consists of choosing among available options on the basis of a valuation of their potential costs and benefits. Most theoretical models of decision making in behavioral economics, psychology, and computer science propose that the desirability of outcomes expected from alternative options can be quantified by utility functions. These utility functions allow a decision maker to assign subjective values to each option under consideration by weighting the likely benefits and costs resulting from an action and to select the one with the highest subjective value. Here, we used model-based neuroimaging to test whether the human brain uses separate valuation systems for rewards (erotic stimuli) associated with different types of costs, namely, delay and effort. We show that humans devalue rewards associated with physical effort in a strikingly similar fashion to those associated with delays, and that a single computational model derived from economic theory can account for the behavior observed in both delay discounting and effort discounting. However, our neuroimaging data reveal that the human brain uses distinct valuation subsystems for different types of costs, reflecting delayed rewards and future energetic expenses in opposite fashion. The ventral striatum and the ventromedial prefrontal cortex represent the increasing subjective value of delayed rewards, whereas a distinct network, composed of the anterior cingulate cortex and the anterior insula, represents the decreasing value of the effortful option, coding the expected expense of energy. Together, these data demonstrate that the valuation processes underlying different types of costs can be fractionated at the cerebral level.
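The single computational model referred to above is commonly formalized as hyperbolic discounting, in which a reward's subjective value falls with the magnitude of the delay or effort cost. The functional form and the k values used here are a standard choice in this literature and an assumption on our part, not necessarily the paper's exact fit.

```python
def discounted_value(reward, cost, k):
    """Subjective value of a reward discounted hyperbolically by a
    delay or effort cost, with discount rate k."""
    return reward / (1.0 + k * cost)

# The same form serves both cost types; only k differs per individual.
immediate = discounted_value(10.0, 0.0, 0.5)   # no cost: full value
effortful = discounted_value(10.0, 2.0, 0.5)   # cost of 2 halves the value
```

Fitting k separately to delay and effort choices is what lets a study compare how steeply each cost type devalues rewards.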
NASA Astrophysics Data System (ADS)
Omar, R.; Rani, M. N. Abdul; Yunus, M. A.; Mirza, W. I. I. Wan Iskandar; Zin, M. S. Mohd
2018-04-01
A simple structure with bolted joints consists of structural components, bolts and nuts. There are several methods for modelling structures with bolted joints; however, no reliable, efficient and economical modelling method can accurately predict their dynamic behaviour. This paper explains an investigation conducted to obtain an appropriate modelling method for bolted joints. This was carried out by evaluating four different finite element (FE) models of the assembled plates and bolts, namely the solid plates-bolts model, the plates without bolts model, the hybrid plates-bolts model and the simplified plates-bolts model. FE modal analysis was conducted for all four initial FE models of the bolted joints. Results of the FE modal analysis were compared with experimental modal analysis (EMA) results. EMA was performed to extract the natural frequencies and mode shapes of the physical test structure with bolted joints. The evaluation compared the number of nodes, number of elements, elapsed central processing unit (CPU) time, and the total percentage error of each initial FE model relative to the EMA result. The evaluation showed that the simplified plates-bolts model could most accurately predict the dynamic behaviour of the structure with bolted joints. This study demonstrated that reliable, efficient and economical modelling of bolted joints, mainly the representation of the bolting, plays a crucial role in ensuring the accuracy of the dynamic behaviour prediction.
Three-dimensional vector modeling and restoration of flat finite wave tank radiometric measurements
NASA Technical Reports Server (NTRS)
Truman, W. M.; Balanis, C. A.; Holmes, J. J.
1977-01-01
In this paper, a three-dimensional Fourier transform inversion method describing the interaction between water surface emitted radiation from a flat finite wave tank and antenna radiation characteristics is reported. The transform technique represents the scanning of the antenna mathematically as a correlation. Computation time is reduced by using the efficient and economical fast Fourier transform algorithm. To verify the inversion method, computations have been made and compared with known data and other available results. The technique has been used to restore data of the finite wave tank system and other available antenna temperature measurements made at the Cape Cod Canal. The restored brightness temperatures serve as better representations of the emitted radiation than the measured antenna temperatures.
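The restoration idea can be sketched in one dimension: the measured antenna temperature is (approximately) the brightness distribution convolved with the antenna pattern, so a Fourier-domain division, with a small regularizer against noise amplification, recovers the brightness. The Gaussian pattern below is an illustrative stand-in for a real antenna pattern, and the regularized division is a common variant rather than the paper's exact inversion.

```python
import numpy as np

def restore(measured, pattern, eps=1e-6):
    """Deconvolve the antenna pattern from the measured profile via a
    regularized Fourier-domain division (circular convolution assumed)."""
    M = np.fft.fft(measured)
    P = np.fft.fft(pattern)
    return np.real(np.fft.ifft(M * np.conj(P) / (np.abs(P) ** 2 + eps)))

# Synthetic test: a top-hat brightness profile smeared by a Gaussian beam.
brightness = np.zeros(64)
brightness[20:30] = 1.0
x = np.arange(64)
pattern = np.exp(-0.5 * ((x - 32) / 1.0) ** 2)
pattern /= pattern.sum()  # unit-gain beam
measured = np.real(np.fft.ifft(np.fft.fft(brightness) * np.fft.fft(pattern)))
restored = restore(measured, pattern)
```

The FFT makes each restoration O(N log N), which is the efficiency advantage the abstract attributes to the transform approach.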
Evolutionary Paths to Corrupt Societies of Artificial Agents
NASA Astrophysics Data System (ADS)
Nasrallah, Walid
Virtual corrupt societies can be defined as groups of interacting computer-generated agents who predominantly choose behavior that gives short term personal gain at the expense of a higher aggregate cost to others. This paper focuses on corrupt societies that, unlike published models in which cooperation must evolve in order for the society to continue to survive, do not naturally die out as the corrupt class siphons off the resources. For example, a very computationally simple strategy of avoiding confrontation can allow a majority of "unethical" individuals to survive off the efforts of an "ethical" but productive minority. Analogies are drawn to actual human societies in which similar conditions gave rise to behavior traditionally defined as economic or political corruption.
Sorensen, Mark V; Snodgrass, James J; Leonard, William R; McDade, Thomas W; Tarskaya, Larissa A; Ivanov, Kiundiul I; Krivoshapkin, Vadim G; Alekseev, Vladimir P
2009-01-01
The purpose of this study was to investigate the impact of economic and cultural change on immune function and psychosocial stress in an indigenous Siberian population. We examined Epstein-Barr virus antibodies (EBV), an indirect biomarker of cell-mediated immune function, in venous whole blood samples collected from 143 Yakut (Sakha) herders (45 men and 98 women) in six communities using a cross-sectional study design. We modeled economic change through the analysis of lifestyle incongruity (LI), calculated as the disparity between socioeconomic status and material lifestyle, computed with two orthogonal scales: market and subsistence lifestyle. EBV antibody level was significantly negatively associated with both a market and a subsistence lifestyle, indicating higher cell-mediated immune function associated with higher material lifestyle scores. In contrast, LI was significantly positively associated with EBV antibodies indicating lower immune function, and suggesting higher psychosocial stress, among individuals with economic status in excess of material lifestyle. Individuals with lower incongruity scores (i.e., economic status at parity with material resources, or with material resources in excess of economic status) had significantly lower EBV antibodies. The findings suggest significant health impacts of changes in material well-being and shifting status and prestige markers on health during the transition to a market economy in Siberia. The findings also suggest that relative, as opposed to absolute, level of economic status or material wealth is more strongly related to stress in the Siberian context.
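Lifestyle incongruity as described above is the disparity between socioeconomic status and material lifestyle. A common operationalization, assumed here and not necessarily the paper's exact scoring, subtracts standardized scale scores so that positive values mean economic status in excess of material lifestyle:

```python
def zscores(xs):
    """Standardize a list of scores to mean 0, SD 1 (population SD)."""
    mean = sum(xs) / len(xs)
    sd = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / sd for x in xs]

def lifestyle_incongruity(ses, material):
    """Positive: economic status exceeds material lifestyle; negative:
    material resources exceed economic status."""
    return [s - m for s, m in zip(zscores(ses), zscores(material))]

# Illustrative scores for three individuals.
li = lifestyle_incongruity([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

In the study's framing, the high positive-LI individuals are the ones whose EBV antibody levels suggest elevated psychosocial stress.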
GASP- General Aviation Synthesis Program. Volume 7: Economics
NASA Technical Reports Server (NTRS)
1978-01-01
The economic analysis includes manufacturing costs, labor costs, parts costs, operating costs, markups, and the final consumer price. A user's manual for a computer program to calculate the final consumer price is included.
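The cost roll-up such a program performs can be sketched as a manufacturing cost built from labor and parts, with successive markups yielding the consumer price. The markup structure and all values below are illustrative assumptions, not GASP's actual cost equations.

```python
def consumer_price(labor_cost, parts_cost, manufacturer_markup, dealer_markup):
    """Final consumer price: manufacturing cost marked up twice,
    once by the manufacturer and once by the dealer."""
    manufacturing_cost = labor_cost + parts_cost
    return manufacturing_cost * (1 + manufacturer_markup) * (1 + dealer_markup)

# Illustrative: $100 labor, $200 parts, 20% then 10% markups.
price = consumer_price(100.0, 200.0, 0.2, 0.1)
```

Operating costs would be layered on separately, since they accrue to the owner after purchase rather than entering the sale price.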
Burton, Kirsteen R; Perlis, Nathan; Aviv, Richard I; Moody, Alan R; Kapral, Moira K; Krahn, Murray D; Laupacis, Andreas
2014-03-01
This study reviews the quality of economic evaluations of imaging after acute stroke and identifies areas for improvement. We performed full-text searches of electronic databases that included Medline, Econlit, the National Health Service Economic Evaluation Database, and the Tufts Cost Effectiveness Analysis Registry through July 2012. Search strategy terms included the following: stroke*; cost*; or cost-benefit analysis*; and imag*. Inclusion criteria were empirical studies published in any language that reported the results of economic evaluations of imaging interventions for patients with stroke symptoms. Study quality was assessed by a commonly used checklist (with a score range of 0% to 100%). Of 568 unique potential articles identified, 5 were included in the review. Four of 5 articles were explicit in their analysis perspectives, which included healthcare system payers, hospitals, and stroke services. Two studies reported results during a 5-year time horizon, and 3 studies reported lifetime results. All included the modified Rankin Scale score as an outcome measure. The median quality score was 84.4% (range=71.9%-93.5%). Most studies did not consider the possibility that patients could not tolerate contrast media or could incur contrast-induced nephropathy. Three studies compared perfusion computed tomography with unenhanced computed tomography but assumed that outcomes guided by the results of perfusion computed tomography were equivalent to outcomes guided by the results of magnetic resonance imaging or noncontrast computed tomography. Economic evaluations of imaging modalities after acute ischemic stroke were generally of high methodological quality. However, important radiology-specific clinical components were missing from all of these analyses.
NASA Astrophysics Data System (ADS)
Zhang, Y.; Sankaranarayanan, S.; Zaitchik, B. F.; Siddiqui, S.
2017-12-01
Africa is home to some of the most climate-vulnerable populations in the world. Energy and agricultural development have diverse impacts on the region's food security and economic well-being from the household to the national level, particularly considering climate variability and change. Our ultimate goal is to understand coupled Food-Energy-Water (FEW) dynamics across spatial scales in order to quantify the sensitivity of critical human outcomes to FEW development strategies in Ethiopia. We are developing bottom-up and top-down multi-scale models, spanning local, sub-national and national scales to capture the FEW linkages across communities and climatic adaptation zones. The focus of this presentation is the sub-national-scale multi-player micro-economic (MME) partial-equilibrium model with coupled food and energy sectors for Ethiopia. With fixed large-scale economic, demographic, and resource factors from the national-scale computable general equilibrium (CGE) model and inferences of behavior parameters from the local-scale agent-based model (ABM), the MME studies how shocks such as drought (crop failure) and development of resilience technologies would influence the FEW system at a sub-national scale. The MME model is based on aggregating individual optimization problems for relevant players. It includes production, storage, and consumption of food and energy at spatially disaggregated zones, and transportation in between with endogenously modeled infrastructure. The aggregated players for each zone have different roles such as crop producers, storage managers, and distributors, who make decisions according to their own but interdependent objective functions. The food and energy supply chain across zones is therefore captured. Ethiopia is dominated by rain-fed agriculture with only 2% irrigated farmland. 
Small-scale irrigation has been promoted as a resilience technology that could potentially play a critical role in food security and economic well-being in Ethiopia, but that also intersects with energy and water consumption. Here, we focus on the energy usage for small-scale irrigation and the collective impact on crop production and water resources across zones in the MME model.
NASA Astrophysics Data System (ADS)
Dai, H.; Xie, Y.; Zhang, Y.
2017-12-01
Context/Purpose: Power generation from renewable energy (RE) could substitute a huge amount of fossil energy in the power sector and have substantial co-benefits for air quality and human health. In 2016, the China National Renewable Energy Center (CNREC) released the China Renewable Energy Outlook reports CREO2016 and CREO2017, with horizons of 2030 and 2050, respectively, in which two scenarios are proposed: a conservative "Stated Policy" scenario and a more ambitious "High RE" scenario. This study, together with CNREC, aims to quantify the health and economic benefits of developing renewable energy at the provincial level in China up to 2030 and 2050. Methods: For this purpose, we developed an integrated approach that combines a power dispatch model at CNREC, an air pollutant emission projection model using energy consumption data from the Long-range Energy Alternatives Planning System (LEAP) model, an air quality model (GEOS-Chem at Harvard), a health model developed in-house, and a macroeconomic model (Computable General Equilibrium model). Results: Altogether, we attempt to quantify how developing RE could reduce the concentration of PM2.5 and ozone in 30 provinces of China, how human health could be improved in terms of mortality, morbidity and work hour loss, and what the economic value of the health improvement is in terms of increased GDP and the value of statistical life lost. The results show that developing RE as stated in CREO2016 could prevent the chronic mortality of 286 thousand people in China in 2030 alone; the value of the saved statistical lives is worth 1200 billion Yuan, equivalent to 1.2% of GDP. In addition, due to reduced mortality and morbidity, each person could on average work an additional 1.16 hours per year, which could contribute a 0.1% increase in GDP in 2030. The assessment up to 2050 is still underway. 
Interpretation: The results imply that when the external benefit of renewable energy is taken into account, RE could be cost competitive compared with fossil fuel use. In other words, fossil fuel combustion is not so cheap as it appears when considering its external cost in terms of human health damage. Conclusion: Our study finds that developing renewable energy could bring substantial health and economic benefits for China.
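The headline valuation above can be cross-checked with back-of-envelope arithmetic; the per-case value of statistical life is implied by the two reported numbers, not stated in the abstract:

```python
# Back-of-envelope check of the abstract's valuation. The implied per-case
# value of statistical life (VSL) is derived here, not reported directly.

prevented_deaths = 286_000   # chronic mortality avoided in China in 2030
total_value_yuan = 1_200e9   # reported value of saved statistical lives

implied_vsl = total_value_yuan / prevented_deaths
print(f"implied VSL ~ {implied_vsl / 1e6:.1f} million yuan per case")  # ~4.2
```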
Computational fluid dynamics modelling in cardiovascular medicine.
Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2016-01-01
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. 
Social economic decision-making across the lifespan: An fMRI investigation.
Harlé, Katia M; Sanfey, Alan G
2012-06-01
Recent research in neuroeconomics suggests that social economic decision-making may be best understood as a dual-systems process, integrating the influence of deliberative and affective subsystems. However, most of this research has focused on young adults and it remains unclear whether our current models extend to healthy aging. To address this question, we investigated the behavioral and neural basis of simple economic decisions in 18 young and 20 older healthy adults. Participants made decisions which involved accepting or rejecting monetary offers from human and non-human (computer) partners in an Ultimatum Game, while undergoing functional magnetic resonance imaging (fMRI). The partners' proposals involved splitting an amount of money between the two players, and ranged from $1 to $5 (from a $10 pot). Relative to young adults, older participants expected more equitable offers and rejected moderately unfair offers ($3) to a larger extent. Imaging results revealed that, relative to young participants, older adults had higher activations in the left dorsolateral prefrontal cortex (DLPFC) when receiving unfair offers ($1-$3). Age group moderated the relationship between left DLPFC activation and acceptance rates of unfair offers. In contrast, older adults showed lower activation of bilateral anterior insula in response to unfair offers. No age group difference was observed when participants received fair ($5) offers. These findings suggest that healthy aging may be associated with a stronger reliance on computational areas subserving goal maintenance and rule shifting (DLPFC) during interactive economic decision-making. Consistent with a well-documented "positivity effect", older age may also decrease recruitment of areas involved in emotion processing and integration (anterior insula) in the face of social norm violation. Published by Elsevier Ltd.
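The responder's accept/reject choice in the Ultimatum Game trials described above can be sketched with a simple inequity-aversion threshold rule; this is a stand-in illustration, not the study's actual decision model, and the envy weight is a hypothetical parameter:

```python
# Toy responder model for Ultimatum Game offers from a $10 pot. The rule
# (Fehr-Schmidt-style disadvantage aversion) and the envy weight are
# illustrative assumptions, not fitted values from the study.

def responder_decision(offer, pot=10, envy=0.5):
    """Accept if utility (own payoff minus weighted disadvantage) is positive."""
    disadvantage = max((pot - offer) - offer, 0)  # how much more the proposer keeps
    utility = offer - envy * disadvantage
    return "accept" if utility > 0 else "reject"

for offer in (1, 2, 3, 4, 5):  # the $1-$5 range used in the study
    print(offer, responder_decision(offer))  # rejects $1-$2, accepts $3-$5
```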
Legacy model integration for enhancing hydrologic interdisciplinary research
NASA Astrophysics Data System (ADS)
Dozier, A.; Arabi, M.; David, O.
2013-12-01
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. 
To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
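The publish-subscribe coupling idea can be illustrated in miniature. The actual system described above uses MPI bindings across languages; this stdlib-only sketch only shows the abstraction, with a hypothetical variable name and damage factor, so that one model can read another's variables mid-simulation without the two being compiled into one 'super-model':

```python
# Toy publish-subscribe broker illustrating the coupling abstraction.
# The real system uses MPI; the "streamflow" variable and the 0.1 damage
# factor below are hypothetical examples.

class Broker:
    def __init__(self):
        self.subscribers = {}  # variable name -> list of callbacks

    def subscribe(self, name, callback):
        self.subscribers.setdefault(name, []).append(callback)

    def publish(self, name, value):
        for cb in self.subscribers.get(name, []):
            cb(value)

broker = Broker()
received = []

# An "economic model" subscribes to streamflow from a "hydrologic model".
broker.subscribe("streamflow", lambda q: received.append(q * 0.1))

# The hydrologic model publishes a value at some timestep.
broker.publish("streamflow", 250.0)
print(received)  # [25.0]
```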
Agent-based approach for generation of a money-centered star network
NASA Astrophysics Data System (ADS)
Yang, Jae-Suk; Kwon, Okyu; Jung, Woo-Sung; Kim, In-mook
2008-09-01
The history of trade is a progression from a pure barter system. A medium of exchange emerges autonomously in the market, a position currently occupied by money. We investigate an agent-based computational economics model consisting of interacting agents considering distinguishable properties of commodities which represent salability. We also analyze the properties of the commodity network using a spanning tree. We find that the “storage fee” is more crucial than “demand” in determining which commodity is used as a medium of exchange.
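The paper's qualitative finding, that the storage fee matters more than demand in selecting an exchange medium, can be sketched with a toy scoring rule; the rule and all numbers below are hypothetical, not the paper's agent-based model:

```python
# Hedged sketch: agents pick as a medium of exchange the commodity with
# the lowest expected holding cost. A tiny storage fee can outweigh low
# demand. Commodity names and parameters are illustrative assumptions.

commodities = {
    # name: (storage_fee_per_period, demand_share)
    "grain":  (0.30, 0.50),
    "cattle": (0.20, 0.35),
    "metal":  (0.01, 0.15),
}

def holding_cost(fee, demand):
    """Expected cost of holding a commodity until someone wants it."""
    expected_wait = 1.0 / demand  # fewer buyers -> longer expected wait
    return fee * expected_wait

medium = min(commodities, key=lambda c: holding_cost(*commodities[c]))
print(medium)  # metal: its tiny storage fee dominates its low demand
```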
O-Charoen, Sirimon; Srivannavit, Onnop; Gulari, Erdogan
2008-01-01
Microfluidic microarrays have been developed for economical and rapid parallel synthesis of oligonucleotide and peptide libraries. For a synthesis system to be reproducible and uniform, it is crucial to have uniform reagent delivery throughout the system. Computational fluid dynamics (CFD) is used to model and simulate the microfluidic microarrays to study geometrical effects on flow patterns. With proper design of the geometry, flow uniformity could be obtained in every microreactor in the microarrays. PMID:17480053
Data Management System for the National Energy-Water System (NEWS) Assessment Framework
NASA Astrophysics Data System (ADS)
Corsi, F.; Prousevitch, A.; Glidden, S.; Piasecki, M.; Celicourt, P.; Miara, A.; Fekete, B. M.; Vorosmarty, C. J.; Macknick, J.; Cohen, S. M.
2015-12-01
Aiming at providing a comprehensive assessment of the water-energy nexus, the National Energy-Water System (NEWS) project requires the integration of data to support a modeling framework that links climate, hydrological, power production, transmission, and economic models. Large amounts of georeferenced data have to be streamed to the components of the interdisciplinary model to explore future challenges and tradeoffs in US power production, based on climate scenarios, power plant locations and technologies, available water resources, ecosystem sustainability, and economic demand. We used open-source and in-house-built software components to build a system that addresses two major data challenges: on-the-fly re-projection, re-gridding, interpolation, extrapolation, nodata patching, merging, and temporal and spatial aggregation of static and time-series datasets, in virtually any file format and file structure and any geographic extent, for the models' I/O directly at run time; and comprehensive data management based on metadata cataloguing and discovery in repositories utilizing the MAGIC Table (Manipulation and Geographic Inquiry Control database). This innovative concept allows models to access data on-the-fly by data ID, irrespective of file path, file structure, and file format, and regardless of its GIS specifications. In addition, a web-based information and computational system is being developed to control the I/O of spatially distributed Earth-system, climate, hydrological, power-grid, and economic data flow within the NEWS framework. The system allows scenario building, data exploration, visualization, querying, and manipulation of any loaded gridded, point, and vector-polygon dataset. The system has demonstrated its potential for applications in other fields of Earth science modeling, education, and outreach. Over time, this implementation of the system will provide near real-time assessment of various current and future scenarios of the water-energy nexus.
NASA Astrophysics Data System (ADS)
Saldarriaga Vargas, Clarita
For diseases affecting large populations in regions with significant social, economic, and cultural diversity, the biological parameters that govern the disease-dispersion analysis depend on which individuals are selected. Therefore, given the variety and magnitude of the communities at risk of contracting dengue around the world, we suggest defining differentiated populations with individual contributions to the results of the dengue-dispersion analysis. In this paper those conditions were taken into account when several epidemiologic models were analyzed. Initially, a stability analysis was performed for an SEIR mathematical model of dengue without differential susceptibility. Both disease-free and endemic equilibrium states were found in terms of the basic reproduction number and were defined in Theorem (3.1). Then a DSEIR model was solved in which a new susceptible group was introduced to consider the effects of important biological parameters of non-homogeneous populations in the spreading analysis. The results were compiled in Theorem (3.2). Finally, Theorems (3.3) and (3.4) summarized the basic reproduction numbers for three and n different susceptible groups, respectively, giving an idea of how differential susceptibility affects the equilibrium states. The computations were done using an algorithmic method implemented in Maple 11, a general-purpose computer algebra system.
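The threshold behavior of the basic reproduction number in an SEIR model of the kind analyzed above can be sketched numerically. This is a single-susceptible-group illustration with made-up parameter values, not the paper's dengue model; here R0 = beta/gamma, and the disease-free equilibrium is stable when R0 < 1:

```python
# Minimal forward-Euler SEIR integration. Parameters are illustrative,
# not fitted dengue values. Subcritical R0 -> outbreak dies out;
# supercritical R0 -> epidemic reaches a large fraction of the population.

def simulate_seir(beta, sigma, gamma, days=200, dt=0.1):
    s, e, i, r = 0.999, 0.0, 0.001, 0.0  # fractions of the population
    for _ in range(int(days / dt)):
        new_exposed   = beta * s * i * dt   # S -> E (transmission)
        new_infected  = sigma * e * dt      # E -> I (end of latency)
        new_recovered = gamma * i * dt      # I -> R (recovery)
        s -= new_exposed
        e += new_exposed - new_infected
        i += new_infected - new_recovered
        r += new_recovered
    return r  # final epidemic size (fraction ever infected)

final_low  = simulate_seir(beta=0.1, sigma=0.2, gamma=0.2)  # R0 = 0.5
final_high = simulate_seir(beta=0.5, sigma=0.2, gamma=0.2)  # R0 = 2.5
print(final_low < 0.05 < final_high)  # True
```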
Economic Comparison of Processes Using Spreadsheet Programs
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.
1986-01-01
Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid economic comparison of different processes for producing particular end products. Facilitates plant-design decisions without requiring large expenditures for powerful mainframe computers.
Pedercini, Matteo; Movilla Blanco, Santiago; Kopainsky, Birgit
2011-01-01
DDT is considered to be the most cost-effective insecticide for combating malaria. However, it is also the most environmentally persistent and can pose risks to human health when sprayed indoors. Therefore, the use of DDT for vector control remains controversial. In this paper we develop a computer-based simulation model to assess some of the costs and benefits of the continued use of DDT for Indoor Residual Spraying (IRS) versus its rapid phase-out. We apply the prototype model to the aggregated sub-Saharan African region. To put the question of the continued use of DDT for IRS versus its rapid phase-out into perspective, we calculate the same costs and benefits for alternative combinations of integrated vector management interventions. Our simulation results confirm that the current mix of integrated vector management interventions with DDT as the main insecticide is cheaper than the same mix with alternative insecticides when only direct costs are considered. However, combinations with a stronger focus on insecticide-treated bed nets and environmental management show higher levels of cost-effectiveness than interventions with a focus on IRS. Thus, this focus would also allow phasing out DDT in a cost-effective manner. Although a rapid phase-out of DDT for IRS is the most expensive of the tested intervention combinations, it can have important economic benefits in addition to health and environmental impacts that are difficult to assess in monetary terms. Those economic benefits captured by the model include the avoided risk of losses in agricultural exports. The prototype simulation model illustrates how a computer-based scenario analysis tool can inform debates on malaria control policies in general and on the continued use of DDT for IRS versus its rapid phase-out in particular. Simulation models create systematic mechanisms for analyzing alternative interventions and making informed trade-offs.
Economic optimization of operations for hybrid energy systems under variable markets
Chen, Jen; Garcia, Humberto E.
2016-05-21
We propose a hybrid energy system (HES) as an important element for enabling increased penetration of clean energy. Our paper investigates the operational flexibility of HES and develops a methodology for operations optimization that maximizes economic value based on predicted renewable generation and market information. A multi-environment computational platform for performing such operations optimization is also developed. To compensate for prediction error, a control strategy is designed to operate a standby energy storage element (ESE) to avoid energy imbalance within the HES. The proposed operations optimizer allows systematic control of energy conversion for maximal economic value. Simulation results of two specific HES configurations are included to illustrate the proposed methodology and computational capability. These results demonstrate the economic viability of HES under the proposed operations optimizer, suggesting the diversion of energy to alternative energy outputs while participating in the ancillary service market. The economic advantages of the optimizer and the associated flexible operations are illustrated by comparing the economic performance of flexible operations against that of constant operations. Sensitivity analyses with respect to market variability and prediction error are also performed.
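The hourly diversion decision the optimizer makes can be sketched with a greedy dispatch rule: send energy to an alternative product whenever the grid price falls below the product's marginal value. This is an illustrative simplification of the paper's optimization, and all prices and values below are hypothetical:

```python
# Toy per-period dispatch for a hybrid energy system: each MWh goes to
# whichever output (grid sale or alternative product) earns more that
# period. Prices and the product value are hypothetical assumptions.

def dispatch(generation_mwh, grid_price, product_value):
    """Send each MWh to whichever output earns more this period."""
    if grid_price >= product_value:
        return {"grid": generation_mwh, "product": 0.0}
    return {"grid": 0.0, "product": generation_mwh}

prices = (20.0, 50.0, 30.0)  # hypothetical grid prices over three periods
schedule = [dispatch(100.0, p, product_value=35.0) for p in prices]
revenue = sum(35.0 * h["product"] + p * h["grid"]
              for h, p in zip(schedule, prices))
print(revenue)  # 12000.0
```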
Numerical Package in Computer Supported Numeric Analysis Teaching
ERIC Educational Resources Information Center
Tezer, Murat
2007-01-01
At universities in the faculties of Engineering, Sciences, Business and Economics together with higher education in Computing, it is stated that because of the difficulty, calculators and computers can be used in Numerical Analysis (NA). In this study, the learning computer supported NA will be discussed together with important usage of the…
Self-concept in fairness and rule establishment during a competitive game: a computational approach
Lee, Sang Ho; Kim, Sung-Phil; Cho, Yang Seok
2015-01-01
People consider fairness as well as their own interest when making decisions in economic games. The present study proposes a model that encompasses the self-concept determined by one's own kindness as a factor of fairness. To observe behavioral patterns that reflect self-concept and fairness, a chicken game experiment was conducted. Behavioral data demonstrate four distinct patterns: “switching,” “mutual rush,” “mutual avoidance,” and “unfair” patterns. Model estimation of chicken game data shows that a model with self-concept predicts those behaviors better than previous models of fairness, suggesting that self-concept indeed affects human behavior in competitive economic games. Moreover, a non-stationary parameter analysis revealed the process of reaching consensus between the players in a game. When the models were fitted to a continuous time window, the parameters of the players in a pair with “switching” and “mutual avoidance” patterns became similar as the game proceeded, suggesting that the players gradually formed a shared rule during the game. In contrast, the difference of parameters between the players in the “unfair” and “mutual rush” patterns did not become stable. The outcomes of the present study showed that people are likely to change their strategy until they reach a mutually beneficial status. PMID:26441707
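The chicken game structure behind the behavioral patterns above can be sketched with a payoff matrix and best responses; the payoff numbers are illustrative, not the study's:

```python
# Chicken game sketch: "rush" beats "avoid", but mutual rush is worst for
# both. Payoff values are hypothetical examples, not the study's stakes.

# (my_move, other_move) -> my payoff
PAYOFF = {
    ("rush",  "avoid"): 3,
    ("avoid", "avoid"): 1,
    ("avoid", "rush"):  0,
    ("rush",  "rush"): -2,
}

def best_response(other_move):
    return max(("rush", "avoid"), key=lambda m: PAYOFF[(m, other_move)])

# Best responses are anti-coordinated, which is why the "switching" pattern
# (alternating who rushes) lets a pair share the high payoff over time.
print(best_response("rush"), best_response("avoid"))  # avoid rush
```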
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter
2012-01-01
The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…
The Cumberland River Flood of 2010 and Corps Reservoir Operations
NASA Astrophysics Data System (ADS)
Charley, W.; Hanbali, F.; Rohrbach, B.
2010-12-01
On Saturday, May 1, 2010, heavy rain began falling in the Cumberland River Valley and continued through the following day. A total of 13.5 inches was measured at Nashville, an unprecedented amount that doubled the previous 2-day record and exceeded the May monthly total record of 11 inches. Elsewhere in the valley, amounts of over 19 inches were measured. The frequency of this storm was estimated to exceed the one-thousand-year event. This historic rainfall brought large-scale flooding to the Cumberland-Ohio-Tennessee River Valleys and caused over 2 billion dollars in damages, despite the numerous flood control projects in the area, including eight U.S. Army Corps of Engineers projects. The vast majority of rainfall occurred in drainage areas that are uncontrolled by Corps flood control projects, which led to the wide-area flooding. However, preliminary analysis indicates that operations of the Corps projects reduced the Cumberland River flood crest in Nashville by approximately five feet. With funding from the American Recovery and Reinvestment Act (ARRA) of 2009, hydrologic, hydraulic and reservoir simulation models have just been completed for the Cumberland-Ohio-Tennessee River Valleys. These models are being implemented in the Corps Water Management System (CWMS), a comprehensive data acquisition and hydrologic modeling system for short-term decision support of water control operations in real time. The CWMS modeling component uses observed rainfall and forecasted rainfall to compute forecasts of river flows into and downstream of reservoirs, using HEC-HMS. Simulation of reservoir operations, utilizing either the HEC-ResSim or CADSWES RiverWare program, uses these flow scenarios to provide operational decision information for the engineer. The river hydraulics program, HEC-RAS, computes river stages and water surface profiles for these scenarios. An inundation boundary and depth map of water in the flood plain can be calculated from the HEC-RAS results using ArcInfo. 
The economic impacts of the different inundation depths are computed by HEC-FIA. The user-configurable sequence of modeling software allows engineers to evaluate operational decisions for reservoirs and other control structures, and view and compare hydraulic and economic impacts for various “what if?” scenarios. This paper reviews the Cumberland River May 2010 event, the impact of Corps reservoirs and reservoir operations and the expected future benefits and effects of the ARRA funded models and CWMS on future events for this area.
Non-equilibrium thermodynamics theory of econometric source discovery for large data analysis
NASA Astrophysics Data System (ADS)
van Bergem, Rutger; Jenkins, Jeffrey; Benachenhou, Dalila; Szu, Harold
2014-05-01
Almost all consumer and firm transactions are achieved using computers and, as a result, give rise to increasingly large amounts of data available to analysts. The gold-standard techniques of Economic data manipulation matured during a period of limited data access, and the new Large Data Analysis (LDA) paradigm we all face may quickly outstrip most tools used by Economists. When coupled with an increased availability of numerous unstructured, multi-modal data sets, the impending 'data tsunami' could have serious detrimental effects on Economic forecasting, analysis, and research in general. Given this reality we propose a decision-aid framework for Augmented-LDA (A-LDA): a synergistic approach to LDA which combines traditional supervised, rule-based Machine Learning (ML) strategies to iteratively uncover hidden sources in large data; artificial neural network (ANN) Unsupervised Learning (USL) strategies operating at the minimum Helmholtz free energy for isothermal dynamic equilibrium; and the Economic intuitions required to handle problems encountered when interpreting large amounts of Financial or Economic data. To make the ANN USL framework applicable to economics, we define the temperature, entropy, and energy concepts of Economics from the Boltzmann viewpoint of non-equilibrium molecular thermodynamics, and define an information geometry on which the ANN can operate using USL to reduce information saturation. An exemplar of such a system representation is given for firm industry equilibrium. We demonstrate the traditional ML methodology in the economics context and leverage firm financial data to explore a frontier concept known as behavioral heterogeneity. Behavioral heterogeneity on the firm level can be imagined as a firm's interactions with different types of Economic entities over time. These interactions could impose varying degrees of institutional constraints on a firm's business behavior.
We specifically look at behavioral heterogeneity for firms operating under the label of 'Going-Concern' and for firms labeled according to the institutional influence they may be experiencing, such as constraints on hiring and spending while in a Bankruptcy or Merger procedure. Uncovering invariant features, or behavioral data metrics, from observable firm data in an economy can greatly benefit the Fed, the World Bank, and similar institutions. We find that the ML/LDA communities can benefit from Economic intuitions just as much as Economists can benefit from generic data exploration tools. The future of successful Economic data understanding, modeling, simulation, and visualization can be amplified by new A-LDA models and approaches for new and analogous models of Economic system dynamics. The potential benefits of improved economic data analysis and real-time decision-aid tools are numerous for researchers, analysts, and federal agencies, all of whom deal with increasingly large amounts of complex data to support their decision making.
Methods of increasing efficiency and maintainability of pipeline systems
NASA Astrophysics Data System (ADS)
Ivanov, V. A.; Sokolov, S. M.; Ogudova, E. V.
2018-05-01
This study is dedicated to the issue of pipeline transportation system maintenance. The article identifies two classes of technical-and-economic indices, which are used to select an optimal pipeline transportation system structure. It then describes various system maintenance strategies and the criteria for selecting among them. In practice, these maintenance strategies turn out to be insufficiently effective because maintenance intervals are set to non-optimal values. This problem could be solved by running an adaptive maintenance system, which includes a pipeline transportation system reliability improvement algorithm and, in particular, an equipment degradation computer model. In conclusion, three model-building approaches for determining the optimal duration of technical-system verification inspections are considered.
Automated plant, production management system
NASA Astrophysics Data System (ADS)
Aksenova, V. I.; Belov, V. I.
1984-12-01
The development of a complex of tasks for the operational management of production (OUP) within the framework of an automated system for production management (ASUP) shows that effective computations are impossible without reliable initial information. The influence of many factors involving the production and economic activity of the entire enterprise upon the plan and course of production is considered. It is suggested that an adequate model should be available which covers all levels of the hierarchical system: workplace, section (brigade), shop, and enterprise; that the model should be incorporated into the technological sequence of performance; and that there should be provisions for an adequate man-machine system.
Mehrian, Mohammad; Guyot, Yann; Papantoniou, Ioannis; Olofsson, Simon; Sonnaert, Maarten; Misener, Ruth; Geris, Liesbet
2018-03-01
In regenerative medicine, computer models describing bioreactor processes can assist in designing optimal process conditions leading to robust and economically viable products. In this study, we started from a 3D mechanistic model describing the growth of neotissue, composed of cells and extracellular matrix, in a perfusion bioreactor set-up influenced by the scaffold geometry, flow-induced shear stress, and a number of metabolic factors. Subsequently, we applied model reduction by reformulating the problem from a set of partial differential equations into a set of ordinary differential equations. The quality of the reduction step was assessed by comparing the reduced-model results to the mechanistic-model results and to dedicated experimental results. The obtained homogenized model is 10^5-fold faster than the 3D version, allowing the application of rigorous optimization techniques. Bayesian optimization was applied to find the medium refreshment regime, in terms of frequency and percentage of medium replaced, that would maximize neotissue growth kinetics during 21 days of culture. The simulation results indicated that maximum neotissue growth will occur for a high frequency and medium replacement percentage, a finding that is corroborated by reports in the literature. This study demonstrates an in silico strategy for bioprocess optimization paying particular attention to the reduction of the associated computational cost. © 2017 Wiley Periodicals, Inc.
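The medium-refreshment optimization described above can be illustrated with a toy homogenized growth model; the dynamics, rate constants, and the exhaustive grid search below are illustrative stand-ins for the paper's mechanistic model and Bayesian optimization, assumed only for the sketch:

```python
def simulate(refresh_every, replace_frac, days=21, substeps=24):
    """Toy homogenized culture: tissue fraction v grows logistically at a
    rate limited by a nutrient level s, which is consumed by growth and
    partially restored at each medium refreshment (every `refresh_every`
    days, replacing `replace_frac` of the medium)."""
    v, s = 0.05, 1.0
    dt = 1.0 / substeps
    for day in range(1, days + 1):
        for _ in range(substeps):
            g = 0.8 * v * (1.0 - v) * s      # nutrient-limited logistic growth
            v += g * dt
            s = max(0.0, s - 2.0 * g * dt)   # nutrient consumed by growth
        if day % refresh_every == 0:
            s = s * (1.0 - replace_frac) + replace_frac  # medium refreshment
    return v

# Coarse grid search over the refreshment regime (frequency, fraction).
best = max(((f, r) for f in (1, 3, 7) for r in (0.3, 0.6, 1.0)),
           key=lambda p: simulate(*p))
```

With these assumed dynamics, the grid search favors frequent, full medium replacement, consistent with the paper's reported finding, though the real study used Bayesian optimization on the reduced mechanistic model rather than a grid.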
The economic burden of Clostridium difficile.
McGlone, S M; Bailey, R R; Zimmer, S M; Popovich, M J; Tian, Y; Ufberg, P; Muder, R R; Lee, B Y
2012-03-01
Although Clostridium difficile (C. difficile) is the leading cause of infectious diarrhoea in hospitalized patients, the economic burden of this major nosocomial pathogen for hospitals, third-party payers, and society remains unclear. We developed an economic computer simulation model to determine the costs attributable to healthcare-acquired C. difficile infection (CDI) from the hospital, third-party payer, and societal perspectives. Sensitivity analyses explored the effects of varying the cost of hospitalization, C. difficile-attributable length of stay, and the probability of initial and secondary recurrences. The median cost of a case ranged from $9,179 to $11,456 from the hospital perspective, $8,932 to $11,679 from the third-party payer perspective, and $13,310 to $16,464 from the societal perspective. Most of the costs incurred were accrued during a patient's primary CDI episode. Hospitals with an incidence of 4.1 CDI cases per 100,000 discharges would incur costs ≥$3.2 million (hospital perspective); an incidence of 10.5 would lead to costs ≥$30.6 million. Our model suggests that the annual US economic burden of CDI would be ≥$496 million (hospital perspective), ≥$547 million (third-party payer perspective), and ≥$796 million (societal perspective). Our results show that C. difficile infection is indeed costly, not only to third-party payers and the hospital, but to society as well. These results are consistent with current literature citing C. difficile as a costly disease. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.
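A minimal sketch of the kind of economic simulation the study describes: each case incurs a primary-episode cost, and recurrences add further cost probabilistically. All parameter values below are placeholders for illustration, not the paper's inputs:

```python
import random

def cdi_mean_cost(p_recur1=0.2, p_recur2=0.45, c_primary=9000.0,
                  c_recur=11000.0, trials=10_000, seed=0):
    """Monte Carlo estimate of mean per-case CDI cost under a toy model:
    every case pays c_primary; with probability p_recur1 a first
    recurrence adds c_recur, and with probability p_recur2 that
    recurrence is followed by a second (hypothetical parameters)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        cost = c_primary
        if rng.random() < p_recur1:
            cost += c_recur
            if rng.random() < p_recur2:
                cost += c_recur
        total += cost
    return total / trials
```

Varying `p_recur1` and `p_recur2` in such a model is the analogue of the sensitivity analyses the authors ran over recurrence probabilities.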
NASA Astrophysics Data System (ADS)
Wilson, Zakiya S.; Iyengar, Sitharama S.; Pang, Su-Seng; Warner, Isiah M.; Luces, Candace A.
2012-10-01
Increasing college degree attainment for students from disadvantaged backgrounds is a prominent component of numerous state and federal legislation focused on higher education. In 1999, the National Science Foundation (NSF) instituted the "Computer Science, Engineering, and Mathematics Scholarships" (CSEMS) program; this initiative was designed to provide greater access and support to academically talented students from economically disadvantaged backgrounds. Originally intended to provide financial support to lower-income students, this NSF program also advocated additional professional development and advising as strategies to increase undergraduate persistence to graduation. This innovative program for economically disadvantaged students was extended in 2004 to include students from other disciplines, including the physical and life sciences as well as the technology fields, and the program was renamed Scholarships for Science, Technology, Engineering and Mathematics (S-STEM). The implementation of these two programs at Louisiana State University (LSU) has shown significant and measurable success since 2000, making LSU a model university in providing support to economically disadvantaged students within the STEM disciplines. The success of these programs is evidenced by the graduation rates of their participants. This report provides details on the educational model employed through the CSEMS/S-STEM projects at LSU and provides a path to success for increasing student retention rates in STEM disciplines. While LSU's experience is presented as a case study, the potential relevance of this innovative mentoring program, in conjunction with the financial support system, is discussed in detail.
Research on application of intelligent computation based LUCC model in urbanization process
NASA Astrophysics Data System (ADS)
Chen, Zemin
2007-06-01
Global change study is an interdisciplinary and comprehensive research activity with international cooperation that arose in the 1980s and has the broadest scope. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as one of its most active research frontiers. It is necessary to develop research on land use and cover change in the urbanization process and to build a simulation model of urbanization in order to describe, simulate, and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategy. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and space structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the author systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model, and the Agent model; introduces complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for simulation research on land use and cover change in urbanization; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process.
The urbanization process is analyzed in combination with the contents of complexity science and the conception of complexity features to reveal the complexity of LUCC research in the urbanization process. The urban space system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy, and culture, and a complex spatial system formed by society, economy, and nature. It has dissipative-structure characteristics such as openness, dynamics, self-organization, and non-equilibrium. Traditional models cannot simulate these social, economic, and natural driving forces of LUCC, including the main feedback relation from LUCC to the driving forces. 2. Establishment of a Markov extended model for LUCC simulation research in the urbanization process. Firstly, the traditional LUCC research model is used to compute the change speed of regional land use by calculating the dynamic degree, exploitation degree, and consumption degree of land use; the theory of fuzzy sets is used to rewrite the traditional Markov model, establish the structure transfer matrix of land use, and forecast and analyze the dynamic change and development trend of land use; and noticeable problems and corresponding measures in the urbanization process are presented according to the research results. 3. Application of intelligent computation and complexity science research methods in the LUCC simulation model of the urbanization process. On the basis of a detailed elaboration of the theory and the model of LUCC research in the urbanization process, the problems of existing models used in LUCC research (namely, the difficulty of resolving many complexity phenomena in the complex urban space system) are analyzed, and possible structural realization forms of LUCC simulation research are discussed in combination with the theories of intelligent computation and complexity science.
Application analysis is performed on the BP artificial neural network and genetic algorithms of intelligent computation and on the CA model and MAS technology of complexity science research; their theoretical origins and characteristics are discussed in detail; their feasibility for LUCC simulation research is elaborated; and improvement methods and measures for the existing problems of this kind of model are brought forward. 4. Establishment of an LUCC simulation model of the urbanization process based on the theories of intelligent computation and complexity science. Based on the research on the abovementioned BP artificial neural network, genetic algorithms, CA model, and multi-agent technology, improvement methods and application assumptions for their extension to geography are put forward; an LUCC simulation model of the urbanization process is built based on the CA model and the Agent model, realizing the combination of the learning mechanism of the BP artificial neural network and fuzzy logic reasoning, expressing the regulation with explicit formulas, and amending the initial regulation through self-study; and the network structure of the LUCC simulation model and the methods and procedures for model parameters are optimized with genetic algorithms. In this paper, I introduce the research theory and methods of complexity science into LUCC simulation research and present an LUCC simulation model based upon the CA model and MAS theory. Meanwhile, I carry out a corresponding expansion of the traditional Markov model and introduce the theory of fuzzy sets into the data screening and parameter amendment of the improved model to improve the accuracy and feasibility of the Markov model in the research on land use/cover change.
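The Markov component of the extended model can be sketched in miniature: a land-use share vector is propagated through a transition (structure transfer) matrix. The classes and transition probabilities below are illustrative assumptions, not values from the paper's case study:

```python
# Hypothetical 3-class land-use Markov chain; rows are "from", columns "to".
CLASSES = ["urban", "agricultural", "forest"]
P = [
    [0.95, 0.03, 0.02],   # urban mostly stays urban
    [0.10, 0.85, 0.05],   # agricultural land converts toward urban
    [0.05, 0.05, 0.90],
]

def step(state, matrix):
    """One period of the Markov model: new_j = sum_i state_i * P[i][j]."""
    return [sum(state[i] * matrix[i][j] for i in range(len(state)))
            for j in range(len(matrix[0]))]

def forecast(state, matrix, periods):
    """Propagate the land-use share vector `periods` steps forward."""
    for _ in range(periods):
        state = step(state, matrix)
    return state

s0 = [0.20, 0.50, 0.30]        # initial shares: urban, agricultural, forest
s10 = forecast(s0, P, 10)      # forecast shares after 10 periods
```

In the paper's extension, fuzzy-set methods adjust such a transfer matrix before forecasting; here the matrix is simply fixed.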
NASA Astrophysics Data System (ADS)
Noffke, Benjamin W.
Carbon materials have the potential to replace some precious metals in renewable energy applications. These materials are particularly attractive because of the elemental abundance and relatively low nuclear mass of carbon, implying economically feasible and lightweight materials. Targeted design of carbon materials is hindered by the lack of the fundamental understanding required to tailor their properties for the desired application, and most available synthetic methods to create carbon materials involve harsh conditions that limit control of the resulting structure. Without a well-defined structure, the system is too complex for fundamental studies to be definitive. This work seeks to gain fundamental understanding through the development and application of efficient computational models for these systems, in conjunction with experiments performed on soluble, well-defined graphene nanostructures prepared by our group using a bottom-up synthetic approach. Theory is used to determine mechanistic details for well-defined carbon systems in applications of catalysis and electrochemical transformations. The resulting computational models explain previous observations of carbon materials well and provide suggestions for future directions. However, as the nanostructures get larger, the computational cost can become prohibitive. To reduce the computational scaling of quantum chemical calculations, a new fragmentation scheme has been developed that addresses the challenges of fragmenting conjugated molecules. By selecting fragments that retain important structural characteristics of graphene, a more efficient method is achieved. The new method paves the way for an automated, systematic fragmentation scheme for graphene molecules.
A model of the wall boundary layer for ducted propellers
NASA Technical Reports Server (NTRS)
Eversman, Walter; Moehring, Willi
1987-01-01
The objective of the present study is to include a representation of a wall boundary layer in an existing finite element model of the propeller in the wind tunnel environment. The major consideration is that the new formulation should introduce only modest alterations in the numerical model and should still be capable of producing economical predictions of the radiated acoustic field. This is accomplished by using a stepped approximation in which the velocity profile is piecewise constant in layers. In the limit of infinitesimally thin layers, the velocity profile of the stepped approximation coincides with that of the continuous profile. The approach described here could also be useful in modeling the boundary layer in other duct applications, particularly in the computation of the radiated acoustic field for sources contained in a duct.
Burgess, Colleen; Peace, Angela; Everett, Rebecca; Allegri, Buena; Garman, Patrick
2014-01-01
Military personnel are deployed abroad for missions ranging from humanitarian relief efforts to combat actions; delay or interruption in these activities due to disease transmission can cause operational disruptions, significant economic loss, and stressed or exceeded military medical resources. Deployed troops function in environments favorable to the rapid and efficient transmission of many viruses particularly when levels of protection are suboptimal. When immunity among deployed military populations is low, the risk of vaccine-preventable disease outbreaks increases, impacting troop readiness and achievement of mission objectives. However, targeted vaccination and the optimization of preexisting immunity among deployed populations can decrease the threat of outbreaks among deployed troops. Here we describe methods for the computational modeling of disease transmission to explore how preexisting immunity compares with vaccination at the time of deployment as a means of preventing outbreaks and protecting troops and mission objectives during extended military deployment actions. These methods are illustrated with five modeling case studies for separate diseases common in many parts of the world, to show different approaches required in varying epidemiological settings.
ERIC Educational Resources Information Center
McCain, Roger A.
1988-01-01
Reviews the economic theory of property rights and explores four applications of the transaction cost theory of property rights and free distribution in the economics of information: (1) copying technology; (2) computer software and copy protection; (3) satellite television and encryption; and (4) public libraries. (56 references) (MES)
Hypercompetitive Environments: An Agent-based model approach
NASA Astrophysics Data System (ADS)
Dias, Manuel; Araújo, Tanya
Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both for individual firms and for management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this attempt.
Economic growth, motorization, and road traffic injuries in the Sultanate of Oman, 1985-2009.
Al-Reesi, Hamed; Ganguly, Shyam Sunder; Al-Adawi, Samir; Laflamme, Lucie; Hasselberg, Marie; Al-Maniri, Abdullah
2013-01-01
Recent affluence, assisted by the exploitation of hydrocarbons, has sparked unprecedented economic growth and an influx of all facets of modernity in Oman. Different statistical models have examined the relationship between economic growth, motorization rates, and road traffic fatalities; however, such a relationship in Oman has never been described. This study describes and analyzes the trend of road traffic injuries (RTIs) in relation to motorization rates and economic growth during the period from 1985 to 2009 using Smeed's (1949) model and Koren and Borsos's (2010) model. The study is based on national data reported between 1985 and 2009. Data on the population and gross domestic product (GDP) per capita in U.S. dollars were gathered from the Ministry of National Economy reports. Data on the number of vehicles and road traffic crashes, fatalities, and injuries were gathered from the Royal Oman Police (ROP) reports. Crash, fatality, and injury rates per 1000 vehicles and per 100,000 population were computed. Linear regression analysis was carried out to estimate the average annual changes in the rates. Smeed's (1949) and Koren and Borsos's (2010) models were used to predict the relations between motorization and road traffic fatalities in Oman. In addition, a cross-sectional analysis of year 2007 data for a number of Arab countries was carried out. The GDP per capita increased from US$6551 in 1985 to US$25,110 in 2009, an annual increase of US$547 per capita. The motorization rates increased by 36 percent, from 1745 per 10,000 population in 1985 to 2382 per 10,000 population in 2009. Both Smeed's (1949) and Koren and Borsos's (2010) models had a high goodness of fit, with R(2) greater than 0.70. This indicated that road traffic fatalities in Oman may have a direct relationship with increased motorization.
The cross-sectional analysis showed that the relation between crash fatalities and motorization rates in Oman and the United Arab Emirates can be better explained by Koren and Borsos's (2010) model than it can in other countries. Recent economic growth in Oman was associated with an increase in motorization rates, which in turn has resulted in an increased burden of road traffic fatalities and injuries.
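Smeed's (1949) model referenced above has a simple closed form relating annual road deaths to vehicle and population counts. The sketch below applies it with an assumed population figure and the paper's 2009 motorization rate; the absolute counts are illustrative, since the paper reports rates rather than the raw inputs used here:

```python
def smeed_fatalities(vehicles, population):
    """Smeed's (1949) empirical formula for annual road deaths:
    D = 0.0003 * (n * p**2) ** (1/3),
    with n registered vehicles and p population."""
    return 3e-4 * (vehicles * population ** 2) ** (1.0 / 3.0)

# Illustrative inputs: the 2009 motorization rate from the paper
# (2382 vehicles per 10,000 population) applied to an assumed population.
population = 2_800_000                    # assumption for the sketch
vehicles = round(population * 0.2382)
deaths = smeed_fatalities(vehicles, population)
```

A notable property of the formula is that predicted deaths per vehicle fall as motorization rises, which is why rapidly motorizing countries can see fatality *rates* per vehicle decline even as the absolute burden grows.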
ERIC Educational Resources Information Center
Prom, Sukai; And Others
The District of Columbia's Nature Computer Camp program, described and evaluated in this paper, was designed to reduce the geographical isolation of economically disadvantaged urban sixth graders, and to provide them with increased knowledge of the environmental and computer sciences. The paper begins by giving details of the program's management,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klitsner, Tom
The recent Executive Order creating the National Strategic Computing Initiative (NSCI) recognizes the value of high performance computing for economic competitiveness and scientific discovery, and commits to accelerating delivery of exascale computing. The HPC programs at Sandia, the NNSA ASC program and Sandia's Institutional HPC Program, are focused on ensuring that Sandia has the resources necessary to deliver computation in the national interest.
Technical economics in the power industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dicks, J.B.
1990-01-01
This paper discusses technical economics, an emerging subject distinct from such areas as engineering management. The subject will be vital in the coming years as the world economy is about to undergo an economic explosion due to the effect of the computer on every area of technology and the simultaneous large expansion of the world market through the opening and democratization of the Communist bloc together with the European Common Market.
Characteristic analysis-1981: Final program and a possible discovery
McCammon, R.B.; Botbol, J.M.; Sinding-Larsen, R.; Bowen, R.W.
1983-01-01
The latest version of the characteristic analysis (NCHARAN) computer program offers the exploration geologist a wide variety of options for integrating regionalized multivariate data. The options include the selection of regional cells for characterizing deposit models, the selection of variables that constitute the models, and the choice of logical combinations of variables that best represent these models. Moreover, the program provides for the display of results which, in turn, makes possible review, reselection, and refinement of a model. Most important, the performance of the above-mentioned steps in an interactive computing mode can result in a timely and meaningful interpretation of the data available to the exploration geologist. The most recent application of characteristic analysis has resulted in the possible discovery of economic sulfide mineralization in the Grong area in central Norway. Exploration data for 27 geophysical, geological, and geochemical variables were used to construct a mineralized and a lithogeochemical model for an area that contained a known massive sulfide deposit. The models were applied to exploration data collected from the Gjersvik area in the Grong mining district and resulted in the identification of two localities of possible mineralization. Detailed field examination revealed the presence of a sulfide vein system and a partially inverted stratigraphic sequence indicating the possible presence of a massive sulfide deposit at depth. © 1983 Plenum Publishing Corporation.
Computer simulation studies of the growth of strained layers by molecular-beam epitaxy
NASA Astrophysics Data System (ADS)
Faux, D. A.; Gaynor, G.; Carson, C. L.; Hall, C. K.; Bernholc, J.
1990-08-01
Two new types of discrete-space Monte Carlo computer simulation are presented for the modeling of the early stages of strained-layer growth by molecular-beam epitaxy. The simulations are more economical of computer resources than continuous-space Monte Carlo or molecular dynamics. Each model is applied to the study of growth onto a substrate in two dimensions with use of Lennard-Jones interatomic potentials. Up to seven layers are deposited for a variety of lattice mismatches, temperatures, and growth rates. Both simulations give similar results. At small lattice mismatches (<~4%) the growth is in registry with the substrate, while at high mismatches (>~6%) the growth is incommensurate with the substrate. At intermediate mismatches, a transition from registered to incommensurate growth is observed which commences at the top of the crystal and propagates down to the first layer. Faster growth rates are seen to inhibit this transition. The growth mode is van der Merwe (layer-by-layer) at 2% lattice mismatch, but at larger mismatches Volmer-Weber (island) growth is preferred. The Monte Carlo simulations are assessed in the light of these results, and the ease with which they can be extended to three dimensions and to more sophisticated potentials is discussed.
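A minimal discrete-space Monte Carlo growth sketch in the spirit of the simulations described, reduced to a one-dimensional solid-on-solid surface with a simple step-energy cost in place of the paper's two-dimensional Lennard-Jones model; all parameters are illustrative:

```python
import math
import random

def energy(h):
    """Surface energy of a periodic 1D height profile, proportional to
    the total step height (a proxy for broken lateral bonds)."""
    return sum(abs(h[i] - h[(i + 1) % len(h)]) for i in range(len(h)))

def grow(width=32, layers=4, temperature=0.5, hops=8, seed=1):
    """Deposit atoms on random columns, then let surface atoms attempt
    Metropolis hops to a neighboring column with acceptance exp(-dE/T)."""
    rng = random.Random(seed)
    h = [0] * width
    for _ in range(width * layers):
        h[rng.randrange(width)] += 1              # deposition event
        for _ in range(hops):                     # surface diffusion
            i = rng.randrange(width)
            if h[i] == 0:
                continue
            j = (i + rng.choice((-1, 1))) % width
            e0 = energy(h)
            h[i] -= 1; h[j] += 1                  # trial hop
            dE = energy(h) - e0
            if dE > 0 and rng.random() >= math.exp(-dE / temperature):
                h[i] += 1; h[j] -= 1              # reject: undo the hop
    return h
```

Raising `hops` per deposition mimics slower growth rates (more diffusion between arrivals), the knob the paper reports as influencing the registered-to-incommensurate transition; this toy version has no substrate mismatch, so it only illustrates the sampling scheme, not the strain physics.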
The Child as Econometrician: A Rational Model of Preference Understanding in Children
Lucas, Christopher G.; Griffiths, Thomas L.; Xu, Fei; Fawcett, Christine; Gopnik, Alison; Kushnir, Tamar; Markson, Lori; Hu, Jane
2014-01-01
Recent work has shown that young children can learn about preferences by observing the choices and emotional reactions of other people, but there is no unified account of how this learning occurs. We show that a rational model, built on ideas from economics and computer science, explains the behavior of children in several experiments, and offers new predictions as well. First, we demonstrate that when children use statistical information to learn about preferences, their inferences match the predictions of a simple econometric model. Next, we show that this same model can explain children's ability to learn that other people have preferences similar to or different from their own and use that knowledge to reason about the desirability of hidden objects. Finally, we use the model to explain a developmental shift in preference understanding. PMID:24667309
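The statistical preference inference described can be sketched with a two-hypothesis Bayesian observer and softmax (Luce-choice) noise. This is an illustration in the spirit of the rational model, not the paper's actual specification; the noise parameter `beta` and the binary hypothesis space are assumptions:

```python
import math

def posterior_prefers_A(observed, beta=2.0):
    """observed: a sequence of 'A'/'B' choices by another agent.
    Two hypotheses under a uniform prior: the agent prefers A, or
    prefers B. Under softmax choice noise, the preferred option is
    chosen with probability p = exp(beta) / (exp(beta) + 1).
    Returns P(prefers A | observed) by Bayes' rule."""
    p = math.exp(beta) / (math.exp(beta) + 1.0)
    like_A = like_B = 1.0
    for choice in observed:
        like_A *= p if choice == "A" else 1.0 - p
        like_B *= (1.0 - p) if choice == "A" else p
    return like_A / (like_A + like_B)
```

The qualitative behavior matches the paper's account: a few consistent choices push the posterior sharply toward one preference, while mixed evidence leaves the observer uncertain.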
10 CFR 590.105 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-01-01
10 CFR 590.105: Computation of time. Title 10, Energy (2010 ed.), DEPARTMENT OF ENERGY (CONTINUED), NATURAL GAS (ECONOMIC REGULATORY ADMINISTRATION), ADMINISTRATIVE PROCEDURES WITH RESPECT TO THE IMPORT AND EXPORT OF NATURAL GAS, General Provisions, § 590.105 Computation of time...
Secondary Computer-Based Instruction in Microeconomics: Cognitive and Affective Issues.
ERIC Educational Resources Information Center
Lasnik, Vincent E.
This paper describes the general rationale, hypotheses, methodology, findings and implications of a recent dissertation research project conducted in the Columbus, Ohio, public schools. The computer-based study investigated the simultaneous relationship between achievement in microeconomics and attitude toward economics, level of computer anxiety,…
10 CFR 590.105 - Computation of time.
Code of Federal Regulations, 2011 CFR
2011-01-01
10 CFR 590.105: Computation of time. Title 10, Energy (2011 ed.), DEPARTMENT OF ENERGY (CONTINUED), NATURAL GAS (ECONOMIC REGULATORY ADMINISTRATION), ADMINISTRATIVE PROCEDURES WITH RESPECT TO THE IMPORT AND EXPORT OF NATURAL GAS, General Provisions, § 590.105 Computation of time...
Exploring the Issues: Humans and Computers.
ERIC Educational Resources Information Center
Walsh, Huber M.
This presentation addresses three basic social issues generated by the computer revolution. The first section, "Money Matters," focuses on the economic effects of computer technology. These include the replacement of workers by fully automated machines, the threat to professionals posed by expanded access to specialized information, and the…
Applications of physics to economics and finance: Money, income, wealth, and the stock market
NASA Astrophysics Data System (ADS)
Dragulescu, Adrian Antoniu
Several problems arising in economics and finance are analyzed using concepts and quantitative methods from physics. The dissertation is organized as follows. In the first chapter, it is argued that in a closed economic system money is conserved. Thus, by analogy with energy, the equilibrium probability distribution of money must follow the exponential Boltzmann-Gibbs law, characterized by an effective temperature equal to the average amount of money per economic agent. The emergence of the Boltzmann-Gibbs distribution is demonstrated through computer simulations of economic models. A thermal machine that extracts a monetary profit can be constructed between two economic systems with different temperatures. The role of debt, and of models with broken time-reversal symmetry for which the Boltzmann-Gibbs law does not hold, is also discussed. In the second chapter, using data from several sources, it is found that the distribution of income is described for the great majority of the population by an exponential distribution, whereas the high-end tail follows a power law. From the individual income distribution, the probability distribution of income for families with two earners is derived, and it is shown that it also agrees well with the data. Data on wealth are presented, and it is found that the distribution of wealth has a structure similar to that of the distribution of income. The Lorenz curve and Gini coefficient were calculated and are shown to be in good agreement with both the income and wealth data sets. In the third chapter, stock-market fluctuations at different time scales are investigated. A model is proposed in which stock-price dynamics is governed by geometric (multiplicative) Brownian motion with stochastic variance. The corresponding Fokker-Planck equation can be solved exactly. Integrating out the variance, an analytic formula for the time-dependent probability distribution of stock-price changes (returns) is found. The formula is in excellent agreement with the Dow-Jones index for time lags from 1 to 250 trading days. For time lags longer than the relaxation time of the variance, the probability distribution can be expressed in a scaling form using a Bessel function. The Dow-Jones data follow the scaling function over seven orders of magnitude.
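The conservation-of-money argument in the first chapter can be illustrated with a minimal simulation. The sketch below is an illustrative toy model, not the dissertation's actual code: two randomly chosen agents pool their money and split the pool at a uniformly random fraction. Because total money is conserved, the distribution relaxes toward the exponential Boltzmann-Gibbs form, with "temperature" equal to the average money per agent. All parameter values are arbitrary.

```python
import random

def simulate_money_exchange(n_agents=500, money_per_agent=1000.0,
                            n_steps=200_000, seed=0):
    """Toy closed economy: in each step two random agents pool their money
    and split the pool at a uniformly random fraction. Total money is
    conserved, so the distribution relaxes toward the exponential
    Boltzmann-Gibbs form."""
    rng = random.Random(seed)
    money = [money_per_agent] * n_agents      # everyone starts equal
    for _ in range(n_steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = money[i] + money[j]
        share = rng.random()                  # random repartition of the pool
        money[i], money[j] = share * pool, (1 - share) * pool
    return money

money = simulate_money_exchange()
avg = sum(money) / len(money)                 # the "effective temperature"
# For an exponential distribution, a fraction 1 - 1/e (about 0.63) of agents
# ends up holding less than the average amount of money.
frac_below_mean = sum(m < avg for m in money) / len(money)
```

Exchange rules that break time-reversal symmetry, such as proportional savings or the introduction of debt, push the stationary distribution away from the pure exponential, which is the caveat the abstract alludes to.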
Bioproducts and environmental quality: Biofuels, greenhouse gases, and water quality
NASA Astrophysics Data System (ADS)
Ren, Xiaolin
Promoting bio-based products is one oft-proposed way to reduce GHG emissions, because the feedstocks capture carbon, at least partially offsetting the carbon released when the products are used. However, several life-cycle analyses point out that while biofuels may have lower net life-cycle carbon emissions than fossil fuels, they may exacerbate other parts of biogeochemical cycles, notably nutrient loads in the aquatic environment. In three essays, this dissertation explores the tradeoff between GHG emissions and nitrogen leaching associated with biofuel production using general equilibrium models. The first essay develops a theoretical general equilibrium model to calculate the second-best GHG tax in the presence of a nitrogen-leaching distortion. The results indicate that the second-best GHG tax could be higher or lower than the first-best tax rate, depending largely on the elasticity of substitution between fossil fuel and biofuel. The second and third essays employ computable general equilibrium models to further explore the tradeoff between GHG emissions and nitrogen leaching. These models also incorporate multiple biofuel pathways, i.e., biofuels made from different feedstocks using different processes, to identify cost-effective combinations of biofuel pathways under different policies, and the corresponding economic and environmental impacts.
NASA Astrophysics Data System (ADS)
Mwakabuta, Ndaga Stanslaus
Electric power distribution systems play a significant role in providing continuous and "quality" electrical energy to different classes of customers. In the context of present restrictions on transmission-system expansion and the new paradigm of "open and shared" infrastructure, new approaches to distribution-system analysis and to economic and operational decision-making need investigation. This dissertation includes three layers of distribution-system investigation. At the basic level, improved linear models are shown to offer significant advantages over previous models for advanced analysis. At the intermediate level, the improved model is applied to the traditional problem of minimizing operating cost using capacitors and voltage regulators. At the advanced level, an artificial intelligence technique is applied to minimize cost under Distributed Generation injection from private vendors. Soft-computing techniques are finding increasing application in solving optimization problems in large, complex practical systems. The dissertation focuses on a Genetic Algorithm for investigating the economic aspects of distributed generation penetration without compromising the operational security of the distribution system. The work presents a methodology for determining the optimal pricing of distributed generation that would help utilities decide how to operate their systems economically. This would enable modular, flexible investments with real benefits to the electric distribution system: improved reliability for both customers and the distribution system in general, reduced environmental impacts, increased efficiency of energy use, and reduced costs of energy services.
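The Genetic Algorithm named in the abstract can be sketched generically. The example below is a minimal real-coded GA (tournament selection, blend crossover, Gaussian mutation, elitism) applied to a stand-in quadratic cost; the dissertation itself optimizes a distribution-system cost model, so the cost function, bounds, and parameters here are illustrative assumptions only.

```python
import random

def genetic_minimize(cost, bounds, pop_size=40, generations=120,
                     mutation_rate=0.1, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection, blend
    crossover, Gaussian mutation, and elitism. A generic sketch of the
    technique, not the dissertation's actual formulation."""
    rng = random.Random(seed)
    lo, hi = zip(*bounds)
    dim = len(bounds)
    pop = [[rng.uniform(lo[d], hi[d]) for d in range(dim)]
           for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)             # binary tournament selection
        return a if cost(a) < cost(b) else b

    for _ in range(generations):
        new_pop = [min(pop, key=cost)]        # elitism: carry over the best
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            alpha = rng.random()              # blend crossover
            child = [alpha * x + (1 - alpha) * y for x, y in zip(p1, p2)]
            for d in range(dim):              # Gaussian mutation, clipped
                if rng.random() < mutation_rate:
                    child[d] += rng.gauss(0, 0.1 * (hi[d] - lo[d]))
                    child[d] = min(max(child[d], lo[d]), hi[d])
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=cost)

# Hypothetical stand-in: pick a DG price and output level minimizing a toy
# quadratic cost with its minimum at (3.0, 1.5).
def toy_cost(x):
    return (x[0] - 3.0) ** 2 + (x[1] - 1.5) ** 2

best = genetic_minimize(toy_cost, bounds=[(0.0, 10.0), (0.0, 5.0)])
```

In a distribution-system setting, `cost` would be replaced by a power-flow-based operating-cost evaluation and the decision vector by DG prices or injections, with security limits handled as constraints or penalties.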