Sample records for economic distributed computing

  1. Pricing the Computing Resources: Reading Between the Lines and Beyond

    NASA Technical Reports Server (NTRS)

    Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)

    2001-01-01

    Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that is realized only when necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), which is a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not contribute to the assertion that markets provide the best framework for global scheduling. We examine well-known scheduling schemes that concern pricing and markets, using our criterion for what constitutes an application of economics. Our conclusion is that none of the schemes examined makes full use of economics.

  2. Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2018-02-01

    The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to computational difficulties associated with calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators, and it accurately estimates the EVSI. The computational time for this method is competitive with other methods. We have developed a new calculation method for the EVSI which is computationally efficient and accurate. This novel method relies on some additional simulation, so it can be expensive in models with a large computational cost.
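
    The moment-matching idea is compact enough to sketch: take probabilistic sensitivity analysis (PSA) samples of the incremental net benefit, rescale them about their mean so that their variance matches the (separately estimated) variance of the preposterior mean, and read the EVSI off the rescaled samples. The Python sketch below is a minimal illustration of that logic, not the authors' implementation; the PSA samples and the variance-reduction fraction are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical PSA samples of incremental net benefit (INB) from a
      # two-option decision model.
      inb = rng.normal(loc=500.0, scale=2000.0, size=10_000)

      # Assumed fraction of the uncertainty in E[INB] that the proposed study
      # would resolve; the moment-matching method estimates this from a small
      # number of nested simulations rather than assuming it.
      var_reduction = 0.6

      # Rescale the PSA samples about their mean so their variance matches the
      # estimated variance of the preposterior mean -- the "moment matching".
      mu = inb.mean()
      preposterior = mu + np.sqrt(var_reduction) * (inb - mu)

      # EVSI: value of deciding after the study minus value of deciding now.
      evsi = np.mean(np.maximum(preposterior, 0.0)) - max(mu, 0.0)
      print(f"EVSI per patient: {evsi:.1f}")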

  3. Economic assessment of photovoltaic/battery systems

    NASA Astrophysics Data System (ADS)

    Day, J. T.; Hayes, T. P.; Hobbs, W. J.

    1981-02-01

    The economics of residential PV/battery systems were determined from the utility perspective using detailed computer simulation to estimate marginal costs. Brief consideration is also given to the economics of customer ownership, utility distribution system impact, and the implications of PURPA (the Public Utility Regulatory Policies Act).

  4. The Computers in Our Lives.

    ERIC Educational Resources Information Center

    Uthe, Elaine F.

    1982-01-01

    Describes the growing use of computers in our world and how their use will affect vocational education. Discusses recordkeeping and database functions, computer graphics, problem-solving simulations, satellite communications, home computers, and how they will affect office education, home economics education, marketing and distributive education,…

  5. The genetic and economic effect of preliminary culling in the seedling orchard

    Treesearch

    Don E. Riemenschneider

    1977-01-01

    The genetic and economic effects of two stages of truncation selection in a white spruce seedling orchard were investigated by computer simulation. Genetic effects were computed by assuming a bivariate distribution of juvenile and mature traits, with volume used as the selection criterion. Seed production was assumed to rise in a linear fashion to maturity and then...
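
    Two-stage truncation selection of this kind is easy to explore by simulation, as the sketch below shows. It draws correlated juvenile and mature trait values from a bivariate normal distribution and compares the genetic gain after each culling stage; the correlation and culling intensities are invented, so this illustrates the general approach rather than the study's actual model.

      import numpy as np

      rng = np.random.default_rng(1)
      n, r = 100_000, 0.4                 # orchard size and juvenile-mature
      cov = [[1.0, r], [r, 1.0]]          # trait correlation (both assumed)
      juvenile, mature = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

      # Stage 1: preliminary culling on the juvenile trait (keep best 50%).
      survivors = mature[juvenile >= np.quantile(juvenile, 0.50)]
      # Stage 2: final roguing on the mature trait (keep best 20% of survivors).
      selected = survivors[survivors >= np.quantile(survivors, 0.80)]

      print(f"gain from stage 1 alone : {survivors.mean():.3f} sd")
      print(f"gain after both stages  : {selected.mean():.3f} sd")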

  6. Computationally intensive econometrics using a distributed matrix-programming language.

    PubMed

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.
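
    At level (i) the programmer splits the work into blocks and distributes them explicitly. The sketch below mimics that style in Python (an analogy to, not a transcription of, the Ox extensions): Monte Carlo replications are divided into blocks with fixed per-block seeds, so the result is deterministic no matter how the cluster schedules the workers.

      import multiprocessing as mp
      import numpy as np

      def simulate(task):
          """One block of replications with its own fixed seed, so results
          are reproducible regardless of worker scheduling."""
          seed, n = task
          rng = np.random.default_rng(seed)
          return rng.standard_normal(n).mean()

      if __name__ == "__main__":
          tasks = [(seed, 250_000) for seed in range(8)]   # explicit work split
          with mp.Pool(processes=4) as pool:
              block_means = pool.map(simulate, tasks)      # deterministic order
          print(np.mean(block_means))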

  7. Simulating Quantile Models with Applications to Economics and Management

    NASA Astrophysics Data System (ADS)

    Machado, José A. F.

    2010-05-01

    The massive increase in the speed of computers over the past forty years has changed the way that social scientists, applied economists and statisticians approach their trades, and also the very nature of the problems that they can feasibly tackle. The new methods that make intensive use of computer power go by the names of "computer-intensive" or "simulation" methods. My lecture will start with a bird's-eye view of the uses of simulation in Economics and Statistics. Then I will turn to my own research on uses of computer-intensive methods. From a methodological point of view, the question I address is how to infer marginal distributions having estimated a conditional quantile process ("Counterfactual Decomposition of Changes in Wage Distributions Using Quantile Regression," Journal of Applied Econometrics 20, 2005). Illustrations will be provided of the use of the method to perform counterfactual analysis in several different areas of knowledge.
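
    The counterfactual machinery can be illustrated with a Machado-Mata style resampling scheme: draw a random quantile u, draw covariates from the chosen population, and evaluate the estimated conditional quantile function to obtain one draw from the implied marginal distribution. In the sketch below the coefficient paths and the two populations are invented stand-ins for estimated ones.

      import numpy as np

      rng = np.random.default_rng(2)

      # Stand-in quantile-coefficient paths (in practice estimated by quantile
      # regression on a fine grid of u).
      def beta0(u): return np.log(u / (1.0 - u))   # intercept path
      def beta1(u): return 1.0 + u                 # slope path

      x_pop_a = rng.normal(2.0, 0.5, size=5_000)   # covariates, population A
      x_pop_b = rng.normal(2.6, 0.5, size=5_000)   # covariates, population B

      def simulate_marginal(x_sample, n=20_000):
          """Draw from the marginal implied by Q_y(u | x) = beta0(u) + beta1(u)*x."""
          u = rng.uniform(0.01, 0.99, size=n)
          x = rng.choice(x_sample, size=n)
          return beta0(u) + beta1(u) * x

      actual = simulate_marginal(x_pop_b)
      counterfactual = simulate_marginal(x_pop_a)  # B's "prices", A's covariates
      print(np.median(actual) - np.median(counterfactual))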

  8. Information as Property and as a Public Good: Perspectives from the Economic Theory of Property Rights.

    ERIC Educational Resources Information Center

    McCain, Roger A.

    1988-01-01

    Reviews the economic theory of property rights and explores four applications of the transaction cost theory of property rights and free distribution in the economics of information: (1) copying technology; (2) computer software and copy protection; (3) satellite television and encryption; and (4) public libraries. (56 references) (MES)

  9. Distributed Computer Networks in Support of Complex Group Practices

    PubMed Central

    Wess, Bernard P.

    1978-01-01

    The economics of medical computer networks are presented in context with the patient care and administrative goals of medical networks. Design alternatives and network topologies are discussed with an emphasis on medical network design requirements in distributed data base design, telecommunications, satellite systems, and software engineering. The success of the medical computer networking technology is predicated on the ability of medical and data processing professionals to design comprehensive, efficient, and virtually impenetrable security systems to protect data bases, network access and services, and patient confidentiality.

  10. The Development of a Distributive Interactive Computing Model in Consumer Economics, Utilizing Jerome S. Bruner's Theory of Instruction.

    ERIC Educational Resources Information Center

    Morrison, James L.

    A computerized delivery system in consumer economics developed at the University of Delaware uses the PLATO system to provide a basis for analyzing consumer behavior in the marketplace. The 16 sequential lessons, part of the Consumer in the Marketplace Series (CMS), demonstrate consumer economic theory in layman's terms and are structured to focus…

  11. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case study of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
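
    To give a concrete flavor of the scheduling-theory machinery, the classic Liu and Layland utilization bound is a sufficient schedulability test for periodic hard-real-time tasks under rate-monotonic priorities. The check below uses invented task parameters and illustrates that standard result, not the paper's own analysis method.

      # n periodic tasks are schedulable under rate-monotonic priorities if
      # total utilisation does not exceed n * (2**(1/n) - 1).
      tasks = [(5, 20), (10, 50), (20, 100)]     # (compute time, period) in ms

      n = len(tasks)
      utilisation = sum(c / t for c, t in tasks)
      bound = n * (2 ** (1 / n) - 1)
      print(f"U = {utilisation:.3f}, bound = {bound:.3f}, "
            f"schedulable: {utilisation <= bound}")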

  12. Some aspects of resource uncertainty and their economic consequences in assessment of the 1002 area of the Arctic National Wildlife Refuge

    USGS Publications Warehouse

    Attanasi, E.D.; Schuenemeyer, J.H.

    2002-01-01

    Exploration ventures in frontier areas have high risks. Before committing to them, firms prepare regional resource assessments to evaluate the potential payoffs. With no historical basis for directly estimating the size distribution of undiscovered accumulations, reservoir attribute probability distributions can be assessed subjectively and used to project undiscovered accumulation sizes. Three questions considered here are: (1) what distributions should be used to characterize the subjective assessments of reservoir attributes, (2) how parsimonious can the analyst be when eliciting subjective information from the assessment geologist, and (3) what are the consequences of ignoring dependencies among reservoir attributes? The standard or norm used for comparing outcomes is the computed cost function describing costs of finding, developing, and producing undiscovered oil accumulations. These questions are examined in the context of the US Geological Survey's recently published regional assessment of the 1002 Area of the Arctic National Wildlife Refuge, Alaska. We study the effects of using various common distributions to characterize the geologist's subjective distributions representing reservoir attributes. Specific findings show that triangular distributions result in substantial bias in economic forecasts when used to characterize skewed distributions. Moreover, some forms of the lognormal distribution also result in biased economic inferences. Alternatively, we generally determined four fractiles (100, 50, 5, 0) to be sufficient to capture essential economic characteristics of the underlying attribute distributions. Ignoring actual dependencies among reservoir attributes biases the economic evaluation. © 2002 International Association for Mathematical Geology.
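
    The triangular-distribution bias is easy to reproduce: fit a triangular surrogate to a few fractiles of a skewed lognormal attribute and compare means. The sketch below uses invented parameters and takes the assessor's median as the triangular mode, purely for illustration.

      import numpy as np

      rng = np.random.default_rng(3)

      # "True" skewed reservoir-attribute distribution (lognormal).
      true = rng.lognormal(mean=0.0, sigma=1.2, size=200_000)

      # Triangular surrogate matched to the assessed minimum, median and
      # near-maximum fractiles of the same attribute.
      lo, mid, hi = true.min(), np.median(true), np.percentile(true, 99.9)
      tri = rng.triangular(lo, mid, hi, size=200_000)

      print(f"true mean       : {true.mean():.2f}")
      print(f"triangular mean : {tri.mean():.2f}  (badly biased for skewed data)")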

  13. Secure or Insure: An Economic Analysis of Security Interdependencies and Investment Types

    ERIC Educational Resources Information Center

    Grossklags, Jens

    2009-01-01

    Computer users express a strong desire to prevent attacks, and to reduce the losses from computer and information security breaches. However, despite the widespread availability of various technologies, actual investments in security remain highly variable across the Internet population. As a result, attacks such as distributed denial-of-service…

  14. A comparison of dynamic and static economic models of uneven-aged stand management

    Treesearch

    Robert G. Haight

    1985-01-01

    Numerical techniques have been used to compute the discrete-time sequence of residual diameter distributions that maximize the present net worth (PNW) of harvestable volume from an uneven-aged stand. Results contradicted optimal steady-state diameter distributions determined with static analysis. In this paper, optimality conditions for solutions to dynamic and static...

  15. The Production of Software for Distribution in EEC Countries. Copyright and Contract Issues.

    ERIC Educational Resources Information Center

    Crabb, Geoffrey

    This pamphlet begins by discussing two legal issues to be considered when negotiating and formalizing the production of computer programs for distribution within EEC (European Economic Community) countries: protection of the program against unauthorized copying, and the nature of the contracts to be prepared. It is noted that all member states of…

  16. Distributed Economic Dispatch in Microgrids Based on Cooperative Reinforcement Learning.

    PubMed

    Liu, Weirong; Zhuang, Peng; Liang, Hao; Peng, Jun; Huang, Zhiwu

    2018-06-01

    Microgrids incorporated with distributed generation (DG) units and energy storage (ES) devices are expected to play more and more important roles in future power systems. Yet, achieving efficient distributed economic dispatch in microgrids is a challenging issue due to the randomness and nonlinear characteristics of DG units and loads. This paper proposes a cooperative reinforcement learning algorithm for distributed economic dispatch in microgrids. Utilizing the learning algorithm can avoid the difficulty of stochastic modeling and high computational complexity. In the cooperative reinforcement learning algorithm, function approximation is leveraged to deal with the large and continuous state spaces, and a diffusion strategy is incorporated to coordinate the actions of DG units and ES devices. Based on the proposed algorithm, each node in the microgrid only needs to communicate with its local neighbors, without relying on any centralized controllers. Algorithm convergence is analyzed, and simulations based on real-world meteorological and load data are conducted to validate the performance of the proposed algorithm.
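
    The paper's contribution is the reinforcement learning algorithm itself, but the neighbors-only coordination idea can be conveyed with a far simpler consensus sketch: each unit repeatedly averages its incremental-cost (price) estimate with its neighbors while a feedback term drives the supply-demand mismatch to zero. Generator costs, the ring communication graph and the gains below are all invented, and the mismatch is computed globally for brevity (fully distributed versions track it by consensus as well).

      import numpy as np

      # Quadratic generator costs C_i(p) = a_i p^2 + b_i p; at the optimum all
      # incremental costs 2*a_i*p_i + b_i equal a common price lambda.
      a = np.array([0.10, 0.08, 0.12, 0.09])
      b = np.array([2.0, 3.0, 2.5, 2.8])
      demand = 180.0

      n = len(a)
      W = np.zeros((n, n))            # ring graph: each unit talks only to
      for i in range(n):              # its two immediate neighbours
          W[i, i] = 0.5
          W[i, (i - 1) % n] = 0.25
          W[i, (i + 1) % n] = 0.25

      lam = b.copy()                  # initial local price estimates
      for _ in range(500):
          p = (lam - b) / (2 * a)     # each unit's locally optimal output
          mismatch = demand - p.sum()
          lam = W @ lam + 0.005 * mismatch
      print(np.round((lam - b) / (2 * a), 1), np.round(lam, 3))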

  17. Computation of convex bounds for present value functions with random payments

    NASA Astrophysics Data System (ADS)

    Ahcan, Ales; Darkiewicz, Grzegorz; Goovaerts, Marc; Hoedemakers, Tom

    2006-02-01

    In this contribution we study the distribution of the present value function of a series of random payments in a stochastic financial environment. Such distributions occur naturally in a wide range of applications within fields of insurance and finance. We obtain accurate approximations by developing upper and lower bounds in the convex-order sense for present value functions. Technically speaking, our methodology is an extension of the results of Dhaene et al. [Insur. Math. Econom. 31(1) (2002) 3-33, Insur. Math. Econom. 31(2) (2002) 133-161] to the case of scalar products of mutually independent random vectors.
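
    In that setting the bounds take the following standard form (written here in the style of Dhaene et al.; S is the present value, U a single uniform(0,1) variable and Λ a conditioning variable chosen to make the lower bound computable):

      S = \sum_{i=1}^{n} \alpha_i \, e^{-(Y_1 + \cdots + Y_i)},
      \qquad
      E[S \mid \Lambda] \;\preceq_{\mathrm{cx}}\; S \;\preceq_{\mathrm{cx}}\;
      S^{c} = \sum_{i=1}^{n} F_{X_i}^{-1}(U),
      \qquad X_i = \alpha_i \, e^{-(Y_1 + \cdots + Y_i)}.

    The comonotonic upper bound S^c is useful because its quantiles and stop-loss premiums are sums of the corresponding marginal quantities and hence available in closed form.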

  18. Economic models for management of resources in peer-to-peer and grid computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are complex undertakings. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. These include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
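
    As a toy illustration of deadline-and-cost brokering (a sketch of the general idea, not the actual Nimrod/G scheduler), a broker can walk through resources in order of price and acquire just enough aggregate throughput to finish the user's jobs before the deadline; all rates and prices below are invented.

      resources = [            # (name, jobs per hour, cost per job)
          ("cheap-cluster", 40, 0.5),
          ("campus-grid",   80, 1.0),
          ("commercial",   200, 3.0),
      ]

      def plan(jobs, deadline_h):
          """Cheapest-first resource selection meeting a throughput deadline."""
          chosen, throughput = [], 0.0
          for name, rate, price in sorted(resources, key=lambda r: r[2]):
              chosen.append(name)
              throughput += rate
              if throughput * deadline_h >= jobs:     # deadline satisfiable
                  return chosen, throughput
          raise ValueError("deadline infeasible even with all resources")

      print(plan(jobs=900, deadline_h=8))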

  19. Translations on Telecommunications Policy, Research and Development, Number 56

    DTIC Science & Technology

    1978-10-18

    conference to discuss the future of computer communications and their social impacts. The Fourth International Computer Communications Conference...standards, applications of light or satellite communications, international distribution of information and their political and social impacts...Berlin, September 25, TASS—A symposium devoted to the task of the mass media at the present stage of struggle for the economic independence, social

  20. Analysis of electrophoresis performance

    NASA Technical Reports Server (NTRS)

    Roberts, Glyn O.

    1988-01-01

    A flexible, efficient computer code is being developed to simulate electrophoretic separation phenomena in either a cylindrical or a rectangular geometry. The code will compute the evolution in time of the concentrations of an arbitrary number of chemical species, and of the temperature, pH distribution, conductivity, electric field, and fluid motion. Use of nonuniform meshes and fast, accurate implicit time-stepping will yield accurate answers at economical cost.

  1. Integrated Assessment of Health-related Economic Impacts of U.S. Air Pollution Policy

    NASA Astrophysics Data System (ADS)

    Saari, R. K.; Rausch, S.; Selin, N. E.

    2012-12-01

    We examine the environmental impacts, health-related economic benefits, and distributional effects of new US regulations to reduce smog from power plants, namely the Cross-State Air Pollution Rule. Using integrated assessment methods, linking atmospheric and economic models, we assess the magnitude of economy-wide effects and distributional consequences that are not captured by traditional regulatory impact assessment methods. We study the Cross-State Air Pollution Rule, a modified allowance trading scheme that caps emissions of nitrogen oxides and sulfur dioxide from power plants in the eastern United States and thus reduces ozone and particulate matter pollution. We use results from the regulatory regional air quality model, CAMx (the Comprehensive Air Quality Model with extensions), and epidemiologic studies in BenMAP (Environmental Benefits Mapping and Analysis Program), to quantify differences in morbidities and mortalities due to this policy. To assess the economy-wide and distributional consequences of these health impacts, we apply a recently developed economic and policy model, the US Regional Energy and Environmental Policy Model (USREP), a multi-region, multi-sector, multi-household, recursive dynamic computable general equilibrium economic model of the US that provides a detailed representation of the energy sector, and the ability to represent energy and environmental policies. We add to USREP a representation of air pollution impacts, including the estimation and valuation of health outcomes and their effects on health services, welfare, and factor markets. We find that the economic welfare benefits of the Rule are underestimated by traditional methods, which omit economy-wide impacts. We also quantify the distribution of benefits, which have varying effects across US regions, income groups, and pollutants, and we identify factors influencing this distribution, including the geographic variation of pollution and population as well as underlying economic conditions.

  2. SEASAT economic assessment. Volume 10: The SATIL 2 program (a program for the evaluation of the costs of an operational SEASAT system as a function of operational requirements and reliability). [Computer programs for economic analysis and systems analysis of SEASAT satellite systems]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The SATIL 2 computer program was developed to assist with the programmatic evaluation of alternative approaches to establishing and maintaining a specified mix of operational sensors on spacecraft in an operational SEASAT system. The program computes the probability distributions of events (e.g., number of launch attempts, number of spacecraft purchased), annual recurring cost, and present value of recurring cost. This is accomplished for the specific task of placing a desired mix of sensors in orbit in an optimal fashion in order to satisfy a specified sensor demand function. Flow charts are shown, and printouts of the programs are given.

  3. The importance of actions and the worth of an object: dissociable neural systems representing core value and economic value.

    PubMed

    Brosch, Tobias; Coppin, Géraldine; Schwartz, Sophie; Sander, David

    2012-06-01

    Neuroeconomic research has delineated neural regions involved in the computation of value, referring to a currency for concrete choices and decisions ('economic value'). Research in psychology and sociology, on the other hand, uses the term 'value' to describe motivational constructs that guide choices and behaviors across situations ('core value'). As a first step towards an integration of these literatures, we compared the neural regions computing economic value and core value. Replicating previous work, economic value computations activated a network centered on medial orbitofrontal cortex. Core value computations activated medial prefrontal cortex, a region involved in the processing of self-relevant information and dorsal striatum, involved in action selection. Core value ratings correlated with activity in precuneus and anterior prefrontal cortex, potentially reflecting the degree to which a core value is perceived as internalized part of one's self-concept. Distributed activation pattern in insula and ACC allowed differentiating individual core value types. These patterns may represent evaluation profiles reflecting prototypical fundamental concerns expressed in the core value types. Our findings suggest mechanisms by which core values, as motivationally important long-term goals anchored in the self-schema, may have the behavioral power to drive decisions and behaviors in the absence of immediately rewarding behavioral options.

  4. State of inequality in malaria intervention coverage in sub-Saharan African countries.

    PubMed

    Galactionova, Katya; Smith, Thomas A; de Savigny, Don; Penny, Melissa A

    2017-10-18

    Scale-up of malaria interventions over the last decade has yielded a significant reduction in malaria transmission and disease burden in sub-Saharan Africa. We estimated economic gradients in the distribution of these efforts and of their impacts within and across endemic countries. Using Demographic and Health Surveys, we computed equity metrics to characterize the distribution of malaria interventions in 30 endemic countries, proxying economic position with an asset-wealth index. Gradients were summarized in a concentration index, tabulated against level of coverage, and compared among interventions, across countries, and against respective trends over the period 2005-2015. There remain broad differences in coverage of malaria interventions and their distribution by wealth within and across countries. In most, economic gradients are lacking or favor the poorest for vector control; malaria services delivered through the formal healthcare sector are much less equitable. Scale-up of interventions in many countries improved access across the wealth continuum; in some, these efforts consistently prioritized the poorest. Expansions in control programs generally narrowed coverage gaps between economic strata; gradients persist in countries where growth was slower in the poorest quintile or where baseline inequality was large. Despite progress, malaria is consistently concentrated in the poorest, with the degree of inequality in burden far surpassing that expected given gradients in the distribution of interventions. Economic gradients in the distribution of interventions persist over time, limiting progress toward equity in malaria control. We found that, in countries with large baseline inequality in the distribution of interventions, even a small bias in expansion favoring the least poor yielded large gradients in intervention coverage while pro-poor growth failed to close the gap between the poorest and least poor. We demonstrated that dimensions of disadvantage compound for the poor; a lack of economic gradients in the distribution of malaria services does not translate to equity in coverage nor can it be interpreted to imply equity in distribution of risk or disease burden. Our analysis testifies to the progress made by countries in narrowing economic gradients in malaria interventions and highlights the scope for continued monitoring of programs with respect to equity.
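
    The workhorse equity metric here, the concentration index, can be computed directly from household data: rank households by wealth and take twice the covariance between coverage and fractional wealth rank, divided by mean coverage. The households below are simulated for illustration; negative values indicate a distribution concentrated among the poor.

      import numpy as np

      def concentration_index(coverage, wealth):
          """2*cov(h, r)/mean(h), with r the fractional wealth rank."""
          order = np.argsort(wealth)
          h = np.asarray(coverage, float)[order]
          n = len(h)
          r = (np.arange(1, n + 1) - 0.5) / n      # rank, poorest first
          return 2.0 * np.mean((h - h.mean()) * (r - r.mean())) / h.mean()

      rng = np.random.default_rng(4)
      wealth = rng.lognormal(size=2_000)
      # Bed-net coverage mildly favouring the wealthier half (invented).
      coverage = rng.binomial(1, 0.4 + 0.1 * (wealth > np.median(wealth)))
      print(round(concentration_index(coverage, wealth), 3))   # > 0: pro-rich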

  5. Applications of physics to economics and finance: Money, income, wealth, and the stock market

    NASA Astrophysics Data System (ADS)

    Dragulescu, Adrian Antoniu

    Several problems arising in Economics and Finance are analyzed using concepts and quantitative methods from Physics. The dissertation is organized as follows: In the first chapter it is argued that in a closed economic system, money is conserved. Thus, by analogy with energy, the equilibrium probability distribution of money must follow the exponential Boltzmann-Gibbs law characterized by an effective temperature equal to the average amount of money per economic agent. The emergence of the Boltzmann-Gibbs distribution is demonstrated through computer simulations of economic models. A thermal machine which extracts a monetary profit can be constructed between two economic systems with different temperatures. The role of debt, and models with broken time-reversal symmetry for which the Boltzmann-Gibbs law does not hold, are discussed. In the second chapter, using data from several sources, it is found that the distribution of income is described for the great majority of the population by an exponential distribution, whereas the high-end tail follows a power law. From the individual income distribution, the probability distribution of income for families with two earners is derived and it is shown that it also agrees well with the data. Data on wealth is presented and it is found that the distribution of wealth has a structure similar to the distribution of income. The Lorenz curve and Gini coefficient were calculated and are shown to be in good agreement with both income and wealth data sets. In the third chapter, the stock-market fluctuations at different time scales are investigated. A model where stock-price dynamics is governed by a geometrical (multiplicative) Brownian motion with stochastic variance is proposed. The corresponding Fokker-Planck equation can be solved exactly. Integrating out the variance, an analytic formula for the time-dependent probability distribution of stock price changes (returns) is found. The formula is in excellent agreement with the Dow-Jones index for the time lags from 1 to 250 trading days. For time lags longer than the relaxation time of variance, the probability distribution can be expressed in a scaling form using a Bessel function. The Dow-Jones data follow the scaling function for seven orders of magnitude.
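
    The first-chapter result is simple to reproduce with the kind of simulation described: agents exchange random amounts of money in pairwise transactions that conserve the total, and the money histogram relaxes toward exp(-m/T) with temperature T equal to the average money per agent. A minimal sketch with invented parameters:

      import numpy as np

      rng = np.random.default_rng(5)
      money = np.full(1_000, 100.0)          # total money is conserved

      # Random pairwise exchanges: a random amount moves from agent i to
      # agent j, refused if it would drive i's balance negative.
      for _ in range(200_000):
          i, j = rng.integers(len(money), size=2)
          dm = rng.uniform(0, 10)
          if money[i] >= dm:
              money[i] -= dm
              money[j] += dm

      # Exponential check: P(m < T) should approach 1 - exp(-1) = 0.632.
      T = money.mean()
      print(f"T = {T:.0f}, share below T = {(money < T).mean():.3f}")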

  6. Helping DE Keep Pace with Changes in Marketing

    ERIC Educational Resources Information Center

    Patchen, Frank M.

    1974-01-01

    A futuristic outlook on changes in retail business and marketing is given. Distributive educational needs in developing a person in the fields of marketing, retailing and economics will increase along with the use of computers for research in the next ten or fifteen years. (DS)

  7. Statistical, economic and other tools for assessing natural aggregate

    USGS Publications Warehouse

    Bliss, J.D.; Moyle, P.R.; Bolm, K.S.

    2003-01-01

    Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers and those who make decisions about land allocation, which may have long-term implications concerning cost and the availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classifications and statistical distributions or, in other words, models with geological, geotechnical and economic variables or interrelationships between these variables. These models can be used with subjective estimates, if needed, to estimate how much aggregate may be present in a region or country using distributions generated by Monte Carlo computer simulations.

  8. Determination of optimum allocation and pricing of distributed generation using genetic algorithm methodology

    NASA Astrophysics Data System (ADS)

    Mwakabuta, Ndaga Stanslaus

    Electric power distribution systems play a significant role in providing continuous and "quality" electrical energy to different classes of customers. In the context of the present restrictions on transmission system expansions and the new paradigm of "open and shared" infrastructure, new approaches to distribution system analysis and to economic and operational decision-making need investigation. This dissertation includes three layers of distribution system investigations. At the basic level, improved linear models are shown to offer significant advantages over previous models for advanced analysis. At the intermediate level, the improved model is applied to solve the traditional problem of operating cost minimization using capacitors and voltage regulators. At the advanced level, an artificial intelligence technique is applied to minimize cost under Distributed Generation injection from private vendors. Soft computing techniques are finding increasing applications in solving optimization problems in large and complex practical systems. The dissertation focuses on the Genetic Algorithm for investigating the economic aspects of distributed generation penetration without compromising the operational security of the distribution system. The work presents a methodology for determining the optimal pricing of distributed generation that would help utilities make a decision on how to operate their system economically. This would enable modular and flexible investments that have real benefits to the electric distribution system. Improved reliability for both customers and the distribution system in general, reduced environmental impacts, increased efficiency of energy use, and reduced costs of energy services are some advantages.

  9. A study of the feasibility of statistical analysis of airport performance simulation

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are the results of Monte Carlo techniques.

  10. Dimensioning appropriate technical and economic parameters of elements in urban distribution power nets based on discrete fast marching method

    NASA Astrophysics Data System (ADS)

    Afanasyev, A. P.; Bazhenov, R. I.; Luchaninov, D. V.

    2018-05-01

    The main purpose of the research is to develop techniques for defining the best technical and economic trajectories of cables in urban power systems. The proposed algorithms for calculating cable-laying routes take into consideration the topological, technical and economic features of the cabling. A discrete variant of the fast marching method is applied as the calculating tool. It has certain advantages compared to other approaches; in particular, it is computationally cheap because it is not iterative. The resulting cable-laying trajectories are optimal from the point of view of technical and economic criteria and correspond to the present rules of modern urban development.
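
    On a discrete grid the fast marching idea behaves much like Dijkstra's algorithm over per-cell laying costs, which is enough to convey the routing principle. The sketch below is a generic grid shortest-path illustration with invented costs, not the authors' exact algorithm; it returns the minimum cumulative technical-economic cost between two points.

      import heapq
      import numpy as np

      def cheapest_route(cost, start, goal):
          """Dijkstra over a grid of per-cell laying costs."""
          rows, cols = cost.shape
          dist = np.full(cost.shape, np.inf)
          dist[start] = cost[start]
          heap = [(cost[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == goal:
                  return d
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols and d + cost[nr, nc] < dist[nr, nc]:
                      dist[nr, nc] = d + cost[nr, nc]
                      heapq.heappush(heap, (dist[nr, nc], (nr, nc)))
          return np.inf

      rng = np.random.default_rng(6)
      city = rng.uniform(1, 5, size=(50, 50))   # invented per-cell laying costs
      print(cheapest_route(city, (0, 0), (49, 49)))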

  11. Information security: where computer science, economics and psychology meet.

    PubMed

    Anderson, Ross; Moore, Tyler

    2009-07-13

    Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems, one that is both principled and effective.

  12. Assessment of distributed solar power systems: Issues and impacts

    NASA Astrophysics Data System (ADS)

    Moyle, R. A.; Chernoff, H.; Schweizer, T. C.; Patton, J. B.

    1982-11-01

    The installation of distributed solar-power systems presents electric utilities with a host of questions. Some of the technical and economic impacts of these systems are discussed. Among the technical interconnect issues are isolated operation, power quality, line safety, and metering options. Economic issues include user purchase criteria, structures and installation costs, marketing and product distribution costs, and interconnect costs. An interactive computer program that allows easy calculation of allowable system prices and allowable generation-equipment prices was developed as part of this project. It is concluded that the technical problems raised by distributed solar systems are surmountable, but their resolution may be costly. The stringent purchase criteria likely to be imposed by many potential system users and the economies of large-scale systems make small systems (less than 10 to 20 kW) less attractive than larger systems. Utilities that consider life-cycle costs in making investment decisions and third-party investors who have tax and financial advantages are likely to place the highest value on solar-power systems.

  13. Intermediate quantum maps for quantum computation

    NASA Astrophysics Data System (ADS)

    Giraud, O.; Georgeot, B.

    2005-10-01

    We study quantum maps displaying spectral statistics intermediate between Poisson and Wigner-Dyson. It is shown that they can be simulated on a quantum computer with a small number of gates, and efficiently yield information about fidelity decay or spectral statistics. We study their matrix elements and entanglement production and show that they converge with time to distributions which differ from random matrix predictions. A randomized version of these maps can be implemented even more economically and yields pseudorandom operators with original properties, enabling, for example, one to produce fractal random vectors. These algorithms are within reach of present-day quantum computers.

  14. An innovative privacy preserving technique for incremental datasets on cloud computing.

    PubMed

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a magnificent service-based delivery model with gigantic computer processing power and data storage across connected communications channels. It has imparted an overwhelming technological impetus to the internet-mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services make it possible to deploy sundry applications economically. Meanwhile, simple data sharing has impelled various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, such as health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks became mandatory to protect overall data privacy. Typically, datasets from diverse applications are anonymized to give owners better privacy, but without extending all secrecy requirements to newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation.

  15. Effects of economic interactions on credit risk

    NASA Astrophysics Data System (ADS)

    Hatchett, J. P. L.; Kühn, R.

    2006-03-01

    We study a credit-risk model which captures effects of economic interactions on a firm's default probability. Economic interactions are represented as a functionally defined graph, and the existence of both cooperative and competitive business relations is taken into account. We provide an analytic solution of the model in a limit where the number of business relations of each company is large, but the overall fraction of the economy with which a given company interacts may be small. While the effects of economic interactions are relatively weak in typical (most probable) scenarios, they are pronounced in situations of economic stress, and thus lead to a substantial fattening of the tails of loss distributions in large loan portfolios. This manifests itself in a pronounced enhancement of the value at risk computed for interacting economies in comparison with their non-interacting counterparts.

  16. Economic weights of somatic cell score in dairy sheep.

    PubMed

    Legarra, A; Ramón, M; Ugarte, E; Pérez-Guzmán, M D; Arranz, J

    2007-03-01

    The economic weights for somatic cell score (SCS) have been calculated using profit functions. Economic data were collected in the Latxa breed. Three aspects have been considered: bulk tank milk payment, veterinary treatments due to high SCS, and culling. All of them are non-linear profit functions. Milk payment is based on the sum of the log-normal distributions of somatic cell count, and veterinary treatments on the probability of subclinical mastitis, which is inferred when individual SCS surpass some threshold. Both functions lead to non-standard distributions. The derivatives of the profit function were computed numerically. Culling was computed by assuming that a conceptual trait culled by mastitis (CBM) is genetically correlated to SCS. The economic weight considers the increase in the breeding value of CBM correlated to an increase in the breeding value of SCS, assuming genetic correlations ranging from 0 to 0.9. The relevance of the economic weights for selection purposes was checked by the estimation of genetic gains for milk yield and SCS under several scenarios of genetic parameters and economic weights. The overall economic weights for SCS range from −2.6 to −9.5 € per point of SCS, with an average of −4 € per point of SCS, depending on the expected average SCS of the flock. The economic weight is higher around the thresholds for payment policies. Economic weights did not change greatly with other assumptions. The estimated genetic gains with economic weights of 0.83 € per l of milk yield and −4 € per point of SCS, assuming a genetic correlation of −0.30, were 3.85 l and −0.031 SCS per year, with an associated increase in profit of 3.32 €. This represents a very small increase in profit (about 1%) relative to selecting only for milk yield. Other situations (increased economic weights, different genetic correlations) produced similar genetic gains and changes in profit. A desired-gains index reduced the increase in profit by 3%, although it could be greater depending on the genetic parameters. It is concluded that the inclusion of SCS in dairy sheep breeding programs is of low economic relevance and recommended only if recording is inexpensive or for animal welfare concerns.

  17. The importance of employing computational resources for the automation of drug discovery.

    PubMed

    Rosales-Hernández, Martha Cecilia; Correa-Basurto, José

    2015-03-01

    The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automatization for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug-screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds with a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that, after submitting one target, this computer software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.

  18. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
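
    The heart of such a model is a small event loop over a priority queue of timestamped events. The sketch below simulates a multi-server request queue with exponential arrivals and service times (invented rates, far simpler than the two-part framework described) and reports completed requests and mean queueing delay.

      import heapq
      import random

      def simulate(arrival_rate, service_rate, horizon, servers):
          """Minimal discrete event simulation of a multi-server queue."""
          random.seed(7)
          busy, queue, done, waits = 0, [], 0, []
          events = [(random.expovariate(arrival_rate), "arrival")]
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon:
                  break
              if kind == "arrival":
                  heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
                  if busy < servers:          # a server is free: start at once
                      busy += 1
                      waits.append(0.0)
                      heapq.heappush(events, (t + random.expovariate(service_rate), "done"))
                  else:                       # all servers busy: wait in queue
                      queue.append(t)
              else:                           # a service completes
                  done += 1
                  if queue:
                      waits.append(t - queue.pop(0))
                      heapq.heappush(events, (t + random.expovariate(service_rate), "done"))
                  else:
                      busy -= 1
          return done, sum(waits) / max(len(waits), 1)

      print(simulate(arrival_rate=9.0, service_rate=1.0, horizon=1_000.0, servers=12))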

  19. An economic model of friendship and enmity for measuring social balance in networks

    NASA Astrophysics Data System (ADS)

    Lee, Kyu-Min; Shin, Euncheol; You, Seungil

    2017-12-01

    We propose a dynamic economic model of networks where agents can be friends or enemies with one another. This is a decentralized relationship model in that agents decide whether to change their relationships so as to minimize their imbalanced triads. In this model, there is a single parameter, which we call social temperature, that captures the degree to which agents care about social balance in their relationships. We show that the global structure of relationship configuration converges to a unique stationary distribution. Using this stationary distribution, we characterize the maximum likelihood estimator of the social temperature parameter. Since the estimator is computationally challenging to calculate from real social network datasets, we provide a simple simulation algorithm and verify its performance with real social network datasets.

  20. Offdiagonal complexity: A computationally quick complexity measure for graphs and networks

    NASA Astrophysics Data System (ADS)

    Claussen, Jens Christian

    2007-02-01

    A vast variety of biological, social, and economic networks shows topologies drastically differing from random graphs; yet the quantitative characterization remains unsatisfactory from a conceptual point of view. Motivated by the discussion of small scale-free networks, a biased link-distribution entropy is defined, which takes an extremum for a power-law distribution. This approach is extended to the node-node link cross-distribution, whose nondiagonal elements characterize the graph structure beyond the link distribution, cluster coefficient and average path length. From here a simple (and computationally cheap) complexity measure can be defined. This offdiagonal complexity (OdC) is proposed as a novel measure to characterize the complexity of an undirected graph, or network. While OdC is zero both for regular lattices and for fully connected networks, it takes a moderately low value for a random graph and shows high values for apparently complex structures such as scale-free networks and hierarchical trees. The OdC approach is applied to the Helicobacter pylori protein interaction network and randomly rewired surrogates.
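
    A compact reading of OdC is the entropy of the normalized distribution of degree differences across links, which vanishes for regular structures and grows with structural heterogeneity. The sketch below implements that simplified reading (the published measure works with the full node-node link cross-distribution, so treat this as an approximation):

      import numpy as np

      def offdiagonal_complexity(edges):
          """Entropy of the normalised |k_i - k_j| distribution over edges."""
          deg = {}
          for u, v in edges:
              deg[u] = deg.get(u, 0) + 1
              deg[v] = deg.get(v, 0) + 1
          diffs = [abs(deg[u] - deg[v]) for u, v in edges]
          counts = np.bincount(diffs).astype(float)
          p = counts[counts > 0] / counts.sum()
          return float(-(p * np.log(p)).sum())

      ring = [(i, (i + 1) % 9) for i in range(9)]            # regular: OdC = 0
      rng = np.random.default_rng(8)
      rand = [tuple(rng.choice(30, 2, replace=False)) for _ in range(60)]
      print(offdiagonal_complexity(ring), offdiagonal_complexity(rand))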

  1. The ICT Laboratory: An Analysis of Computers in Public High Schools in Rural India

    ERIC Educational Resources Information Center

    Arora, Payal

    2007-01-01

    There has been a strong push towards e-literacy in India, particularly in the distribution and usage of information and communication technologies (ICT) in schools for economic and social growth. As a result, the Vidhya Vahini scheme was launched in Kuppam, a marginalized village constituency in Andhra Pradesh. This scheme strived to disseminate…

  2. Transonic flow theory of airfoils and wings

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1976-01-01

    There are plans to use the supercritical wing on the next generation of commercial aircraft so as to economize on fuel consumption by reducing drag. Computer codes have served well in meeting the consequent demand for new wing sections. The possibility of replacing wind tunnel tests by computational fluid dynamics is discussed. Another approach to the supercritical wing is through shockless airfoils. A novel boundary value problem in the hodograph plane is studied that enables one to design a shockless airfoil so that its pressure distribution very nearly takes on data that are prescribed.

  3. Multi-criteria decision analysis in conservation planning: Designing conservation area networks in San Diego County

    NASA Astrophysics Data System (ADS)

    MacDonald, Garrick Richard

    To limit biodiversity loss caused by human activity, conservation planning must protect biodiversity while considering socio-economic cost criteria. This research aimed to determine the effects of socio-economic criteria and spatial configurations on the development of conservation area networks (CANs) for three species with different distribution patterns, while simultaneously attempting to address the uncertainty and sensitivity of CANs produced by ConsNet. The socio-economic factors and spatial criteria included the cost of land, population density, agricultural output value, area, average cluster area, number of clusters, shape, and perimeter. Three sensitive mammal species with different distribution patterns were selected: the Bobcat, the Ringtail, and a custom-created mammal distribution. Forty problems and the corresponding number of CANs were formulated and computed by running each predicted-presence species model with and without the four different socio-economic threshold groups at two different resolutions. Thirty-two percent less area was conserved after considering multiple socio-economic constraints and spatial configurations in comparison to CANs that did not consider multiple socio-economic constraints and spatial configurations. Without including socio-economic costs, ConsNet's ALL_CELLS heuristic solution was the highest-ranking CAN. After considering multiple socio-economic costs, the top-ranking CAN was no longer the ALL_CELLS heuristic solution, but a spatially different meta-heuristic solution. The effects of multiple constraints and objectives on the design of CANs with different distribution patterns did not vary significantly across the criteria. The CANs produced by ConsNet appeared to demonstrate some uncertainty surrounding particular criteria, but did not demonstrate substantial uncertainty across all criteria used to rank the CANs. Similarly, the range of socio-economic criteria thresholds did not have a substantial impact. ConsNet was very applicable to the research project; however, it did exhibit a few limitations. Both the advantages and disadvantages of ConsNet should be considered before using ConsNet for future conservation planning projects. The research project is an example of a large-data scenario undertaken with a multiple-criteria decision analysis (MCDA) approach.

  4. Statistical mechanics of money and income

    NASA Astrophysics Data System (ADS)

    Dragulescu, Adrian; Yakovenko, Victor

    2001-03-01

    Money: In a closed economic system, money is conserved. Thus, by analogy with energy, the equilibrium probability distribution of money will assume the exponential Boltzmann-Gibbs form characterized by an effective temperature. We demonstrate how the Boltzmann-Gibbs distribution emerges in computer simulations of economic models. We discuss thermal machines, the role of debt, and models with broken time-reversal symmetry for which the Boltzmann-Gibbs law does not hold. Reference: A. Dragulescu and V. M. Yakovenko, "Statistical mechanics of money", Eur. Phys. J. B 17, 723-729 (2000), [cond-mat/0001432]. Income: Using tax and census data, we demonstrate that the distribution of individual income in the United States is exponential. Our calculated Lorenz curve without fitting parameters and Gini coefficient 1/2 agree well with the data. We derive the distribution function of income for families with two earners and show that it also agrees well with the data. The family data for the period 1947-1994 fit the Lorenz curve and Gini coefficient 3/8=0.375 calculated for two-earner families. Reference: A. Dragulescu and V. M. Yakovenko, "Evidence for the exponential distribution of income in the USA", cond-mat/0008305.
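
    The Lorenz-curve calculation behind these comparisons is short: sort incomes, accumulate their shares, and integrate. For an exponential distribution the Gini coefficient is exactly 1/2, which the sketch below (sample size and scale invented) reproduces numerically.

      import numpy as np

      def gini(incomes):
          """Gini coefficient from the empirical Lorenz curve."""
          x = np.sort(np.asarray(incomes, float))
          lorenz = np.cumsum(x) / x.sum()          # Lorenz curve ordinates
          n = len(x)
          return 1.0 - 2.0 * lorenz.sum() / n + 1.0 / n

      rng = np.random.default_rng(9)
      print(round(gini(rng.exponential(scale=30_000, size=200_000)), 3))  # ~0.5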

  5. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    PubMed

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up to a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economic business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported, and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  6. Online catalog access and distribution of remotely sensed information

    NASA Astrophysics Data System (ADS)

    Lutton, Stephen M.

    1997-09-01

    Remote sensing is providing voluminous data and value-added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making a distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new, exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.

  7. An economic evaluation of solar radiation management.

    PubMed

    Aaheim, Asbjørn; Romstad, Bård; Wei, Taoyuan; Kristjánsson, Jón Egill; Muri, Helene; Niemeier, Ulrike; Schmidt, Hauke

    2015-11-01

    Economic evaluations of solar radiation management (SRM) usually assume that the temperature will be stabilized, with no economic impacts of climate change, but with possible side-effects. We know from experiments with climate models, however, that, unlike under emission control, the spatial and temporal distributions of temperature, precipitation and wind conditions will change. Hence, SRM may have economic consequences under a stabilization of global mean temperature even if side-effects other than those related to the climatic responses are disregarded. This paper addresses the economic impacts of implementing two SRM technologies: stratospheric sulfur injection and marine cloud brightening. By the use of a computable general equilibrium model, we estimate the economic impacts of climatic responses based on the results from two earth system models, MPI-ESM and NorESM. We find that under a moderately increasing greenhouse-gas concentration path, RCP4.5, the economic benefits of implementing climate engineering are small, and may become negative. Global GDP increases in three of the four experiments, and all experiments include regions where the benefits from climate engineering are negative.

  8. Economic Analysis of Complex Nuclear Fuel Cycles with NE-COST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganda, Francesco; Dixon, Brent; Hoffman, Edward

    The purpose of this work is to present a new methodology, and associated computational tools, developed within the U.S. Department of Energy (U.S. DOE) Fuel Cycle Option Campaign to quantify the economic performance of complex nuclear fuel cycles. The levelized electricity cost at the busbar is generally chosen to quantify and compare the economic performance of different baseload generating technologies, including nuclear: it is the cost of electricity that renders the risk-adjusted discounted net present value of the investment cash flow equal to zero. The work presented here is focused on the calculation of the levelized cost of electricity of fuel cycles at mass-balance equilibrium, termed the LCAE (Levelized Cost of Electricity at Equilibrium). To alleviate the computational issues associated with calculating the LCAE for complex fuel cycles, a novel approach has been developed, called the "island approach" because of its logical structure: a generic complex fuel cycle is subdivided into subsets of fuel cycle facilities, called islands, each containing one and only one type of reactor or blanket and an arbitrary number of fuel cycle facilities. A nuclear economic software tool, NE-COST, written in the commercial programming software MATLAB®, has been developed to calculate the LCAE of complex fuel cycles with the "island" computational approach. NE-COST can also handle uncertainty: the input parameters (both unit costs and fuel cycle characteristics) can have uncertainty distributions associated with them, and the output can be computed in terms of probability density functions of the LCAE. In this paper NE-COST is used to quantify, as examples, the economic performance of (1) current Light Water Reactor (LWR) once-through systems; (2) continuous plutonium recycling in Fast Reactors (FR) with driver and blanket; and (3) recycling of plutonium bred in FRs into LWRs. For each fuel cycle, the contributions of the main cost components to the total LCAE are identified.
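
    As a point of reference for the levelized-cost definition above, the sketch below computes a single-technology levelized cost as total discounted cost divided by total discounted generation. This is the textbook quantity only; NE-COST's LCAE aggregates such calculations across islands at mass-balance equilibrium, which is not reproduced here, and all input numbers are invented.

```python
import numpy as np

def levelized_cost(costs, energy, rate):
    """Total discounted cost divided by total discounted generation;
    the cost level at which the discounted cash flow breaks even."""
    disc = (1.0 + rate) ** -np.arange(len(costs))
    return np.sum(costs * disc) / np.sum(energy * disc)

# Hypothetical single-reactor cash flow: overnight capital in year 0,
# then constant O&M + fuel against constant generation.
years = 60
costs = np.full(years, 120e6)          # $/yr O&M and fuel (assumed)
costs[0] += 5e9                        # capital outlay (assumed)
energy = np.full(years, 8.76e9)        # kWh/yr at about 1 GWe (assumed)
energy[0] = 0.0                        # no generation while building
print(f"LCOE ~ {100 * levelized_cost(costs, energy, 0.07):.1f} cents/kWh")
```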

  9. Reward and uncertainty in exploration programs

    NASA Technical Reports Server (NTRS)

    Kaufman, G. M.; Bradley, P. G.

    1971-01-01

    A set of variables which are crucial to the economic outcome of petroleum exploration are discussed. These are treated as random variables; the values they assume indicate the number of successes that occur in a drilling program and determine, for a particular discovery, the unit production cost and net economic return if that reservoir is developed. In specifying the joint probability law for these variables, extreme and probably unrealistic assumptions are made; in particular, the different random variables are assumed to be independently distributed. Using postulated probability functions and specified parameters, values are generated for selected random variables, such as reservoir size. From this set of values the economic magnitudes of interest, net return and unit production cost, are computed. This constitutes a single trial, and the procedure is repeated many times. The resulting histograms approximate the probability density functions of the variables which describe the economic outcomes of an exploratory drilling program.
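
    A minimal sketch of the trial-and-repeat procedure the abstract describes, under its independence assumptions: each trial draws a number of successes and lognormal reservoir sizes, then computes a net return; repeating many times yields a histogram approximating the density of program outcomes. All distributions and parameter values are invented placeholders, not the paper's.

```python
import numpy as np
rng = np.random.default_rng(0)

def one_trial(wells=20, p_success=0.1, price=3.0, dev_cost=20e6):
    """One simulated drilling program under the paper's deliberately
    strong independence assumptions; all values are illustrative."""
    successes = rng.binomial(wells, p_success)
    sizes = rng.lognormal(mean=15.0, sigma=1.5, size=successes)  # bbl
    return price * sizes.sum() - dev_cost * successes

returns = np.array([one_trial() for _ in range(10_000)])
# A histogram of `returns` approximates the probability density of the
# program's net economic return, as in the paper's repeated trials.
print(f"mean net return: ${returns.mean():,.0f}")
```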

  10. The Importance of Considering the Temporal Distribution of Climate Variables for Ecological-Economic Modeling to Calculate the Consequences of Climate Change for Agriculture

    NASA Astrophysics Data System (ADS)

    Plegnière, Sabrina; Casper, Markus; Hecker, Benjamin; Müller-Fürstenberger, Georg

    2014-05-01

    The basis of many models to calculate and assess climate change and its consequences are annual means of temperature and precipitation. This method leads to many uncertainties especially at the regional or local level: the results are not realistic or too coarse. Particularly in agriculture, single events and the distribution of precipitation and temperature during the growing season have enormous influences on plant growth. Therefore, the temporal distribution of climate variables should not be ignored. To reach this goal, a high-resolution ecological-economic model was developed which combines a complex plant growth model (STICS) and an economic model. In this context, input data of the plant growth model are daily climate values for a specific climate station calculated by the statistical climate model (WETTREG). The economic model is deduced from the results of the plant growth model STICS. The chosen plant is corn because corn is often cultivated and used in many different ways. First of all, a sensitivity analysis showed that the plant growth model STICS is suitable to calculate the influences of different cultivation methods and climate on plant growth or yield as well as on soil fertility, e.g. by nitrate leaching, in a realistic way. Additional simulations helped to assess a production function that is the key element of the economic model. Thereby the problems when using mean values of temperature and precipitation in order to compute a production function by linear regression are pointed out. Several examples show why a linear regression to assess a production function based on mean climate values or smoothed natural distribution leads to imperfect results and why it is not possible to deduce a unique climate factor in the production function. One solution for this problem is the additional consideration of stress indices that show the impairment of plants by water or nitrate shortage. Thus, the resulting model takes into account not only the ecological factors (e.g. the plant growth) or the economical factors as a simple monetary calculation, but also their mutual influences. Finally, the ecological-economic model enables us to make a risk assessment or evaluate adaptation strategies.

  11. An optimization model for energy generation and distribution in a dynamic facility

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1981-01-01

    An analytical model is described using linear programming for the optimum generation and distribution of energy among competing energy resources and different economic criteria. The model, which will be used as a general engineering tool in the analysis of the Deep Space Network ground facility, considers several essential decisions for better design and operation. The decisions sought for the particular energy application include: the optimum time to build an assembly of elements, the inclusion of a storage medium of some type, and the size or capacity of the elements that will minimize the total life-cycle cost over a given number of years. The model, which is structured in multiple time divisions, employs the decomposition principle for large matrices, the branch-and-bound method in mixed-integer programming, and the revised simplex technique for efficient and economical computer use.
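
    The dispatch decision at the core of such a model can be posed as a small linear program. The sketch below, a toy assuming two sources with made-up costs and capacities, meets a fixed demand at minimum cost; the actual model adds time divisions, storage, and integer build decisions.

```python
from scipy.optimize import linprog

# Meet 100 units of demand from two sources at minimum cost, subject
# to capacity limits; costs and capacities are invented.
c = [3.0, 5.0]                         # unit cost of each source
A_eq, b_eq = [[1.0, 1.0]], [100.0]     # generation equals demand
bounds = [(0, 70), (0, 60)]            # source capacities
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)                  # expect [70, 30] at cost 360
```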

  12. Documentary of MFENET, a national computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuttleworth, B.O.

    1977-06-01

    The national Magnetic Fusion Energy Computer Network (MFENET) is a newly operational star network of geographically separated heterogeneous hosts and a communications subnetwork of PDP-11 processors. Host processors interfaced to the subnetwork currently include a CDC 7600 at the Central Computer Center (CCC) and several DECsystem-10's at User Service Centers (USC's). The network was funded by a U.S. government agency (ERDA) to provide in an economical manner the needed computational resources to magnetic confinement fusion researchers. Phase I operation of MFENET distributed the processing power of the CDC 7600 among the USC's through the provision of file transport between any two hosts and remote job entry to the 7600. Extending the capabilities of Phase I, MFENET Phase II provided interactive terminal access to the CDC 7600 from the USC's. A file management system is maintained at the CCC for all network users. The history and development of MFENET are discussed, with emphasis on the protocols used to link the host computers and the USC software. Comparisons are made of MFENET versus ARPANET (Advanced Research Projects Agency Computer Network) and DECNET (Digital Distributed Network Architecture). DECNET and MFENET host-to-host, host-to-CCP, and link protocols are discussed in detail. The USC--CCP interface is described briefly. 43 figures, 2 tables.

  13. Eliciting expert opinion for economic models: an applied example.

    PubMed

    Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward

    2007-01-01

    Expert opinion is considered a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example, eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
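
    One way to turn an elicited judgment into a parameter distribution for a Bernoulli process is to fit a Beta distribution to elicited quantiles. The sketch below does this by coarse grid search; it illustrates the general idea only, as the paper's elicitation instrument and feedback interface were different, and the quartile values are hypothetical.

```python
from scipy import stats

def beta_from_quartiles(q1, q3):
    """Fit Beta shape parameters so the 25th/75th percentiles match
    the expert's elicited values, by coarse grid search (a sketch of
    the general idea, not the paper's instrument)."""
    best, best_err = (1.0, 1.0), float("inf")
    grid = [x / 5 for x in range(1, 100)]        # 0.2 .. 19.8
    for a in grid:
        for b in grid:
            err = (abs(stats.beta.ppf(0.25, a, b) - q1)
                   + abs(stats.beta.ppf(0.75, a, b) - q3))
            if err < best_err:
                best, best_err = (a, b), err
    return best

# Hypothetical elicited quartiles for an uncertain event probability.
a, b = beta_from_quartiles(0.10, 0.25)
print(f"elicited distribution ~ Beta({a:.1f}, {b:.1f})")
```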

  14. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE PAGES

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...

    2017-09-29

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  16. Physical Premium Principle: A New Way for Insurance Pricing

    NASA Astrophysics Data System (ADS)

    Darooneh, Amir H.

    2005-03-01

    In our previous work we suggested a way of computing the non-life insurance premium. The probable surplus of the insurer company is assumed to be distributed according to canonical ensemble theory. The Esscher premium principle appears as a special case of this approach. The difference between our method and traditional premium calculation principles was shown by simulation. Here we construct a theoretical foundation for the main assumption of our method; in this respect we present a new (physical) definition of economic equilibrium. This approach lets us apply the maximum entropy principle to economic systems. We also extend our method to the problem of premium calculation for correlated risk categories. Like the Bühlmann economic premium principle, our method considers the effect of the market on the premium, but in a different way.
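
    For reference, the Esscher premium that the abstract mentions as a special case has the standard textbook form (this is the standard definition, not the paper's canonical-ensemble derivation):

$$\pi_h(X) = \frac{\mathbb{E}\left[X\,e^{hX}\right]}{\mathbb{E}\left[e^{hX}\right]}, \qquad h > 0,$$

    which weights large losses exponentially more heavily and reduces to the net premium $\mathbb{E}[X]$ as $h \to 0$.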

  17. Learning to Love Your Computer: A Fourth Grade Study in the Use of Computers and Their Economic Impact on the World Today.

    ERIC Educational Resources Information Center

    McKeever, Barbara

    An award-winning fourth-grade unit combines computer and economics education by examining the impact of computer usage on various segments of the economy. Students spent one semester becoming familiar with a classroom computer and gaining a general understanding of basic economic concepts through class discussion, field trips, and bulletin boards.…

  18. Economic Value of Dispensing Home-Based Preoperative Chlorhexidine Bathing Cloths to Prevent Surgical Site Infection

    PubMed Central

    Bailey, Rachel R.; Stuckey, Dianna R.; Norman, Bryan A.; Duggan, Andrew P.; Bacon, Kristina M.; Connor, Diana L.; Lee, Ingi; Muder, Robert R.; Lee, Bruce Y.

    2012-01-01

    OBJECTIVE To estimate the economic value of dispensing preoperative home-based chlorhexidine bathing cloth kits to orthopedic patients to prevent surgical site infection (SSI). METHODS A stochastic decision-analytic computer simulation model was developed from the hospital’s perspective depicting the decision of whether to dispense the kits preoperatively to orthopedic patients. We varied patient age, cloth cost, SSI-attributable excess length of stay, cost per bed-day, patient compliance with the regimen, and cloth antimicrobial efficacy to determine which variables were the most significant drivers of the model’s outcomes. RESULTS When all other variables remained at baseline and cloth efficacy was at least 50%, patient compliance only had to be half of baseline (baseline mean, 15.3%; range, 8.23%–20.0%) for chlorhexidine cloths to remain the dominant strategy (ie, less costly and providing better health outcomes). When cloth efficacy fell to 10%, 1.5 times the baseline bathing compliance also afforded dominance of the preoperative bath. CONCLUSIONS The results of our study favor the routine distribution of bathing kits. Even with low patient compliance and cloth efficacy values, distribution of bathing kits is an economically beneficial strategy for the prevention of SSI. PMID:21515977
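
    A minimal sketch of the kind of stochastic cost comparison the abstract describes: expected cost per patient with and without the kit, with compliance and SSI cost drawn from assumed distributions. Every number below is invented for illustration and none is taken from the study.

```python
import numpy as np
rng = np.random.default_rng(6)

# Expected cost per patient with and without the kit; every number is
# an invented assumption, none is taken from the study.
n = 100_000
p_ssi = 0.02                                 # baseline SSI risk
compliance = rng.uniform(0.08, 0.20, n)      # bathing compliance
efficacy = 0.5                               # relative risk reduction
cost_ssi = rng.normal(20_000, 5_000, n)      # SSI-attributable cost, $
kit_cost = 10.0                              # $ per dispensed kit
no_kit = p_ssi * cost_ssi
with_kit = kit_cost + p_ssi * (1 - compliance * efficacy) * cost_ssi
print(f"mean saving per patient: ${np.mean(no_kit - with_kit):.2f}")
```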

  19. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
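
    A compact sketch of the two-step idea under normal-conjugate assumptions: a linear metamodel fitted to PSA draws stands in for the conditional expectation of incremental net benefit, and a Gaussian shrinkage factor mimics the preposterior mean for a study of size n. The data, the single-parameter metamodel, and n0 (the assumed prior effective sample size) are all illustrative; the paper's method handles multiple correlated parameters.

```python
import numpy as np
rng = np.random.default_rng(1)

# Step 0: stand-in PSA output: draws of one parameter (theta) and the
# incremental net benefit (INB). A real analysis reuses its own PSA set.
theta = rng.normal(0.6, 0.1, 5000)
inb = 20_000 * (theta - 0.55) + rng.normal(0, 500, 5000)

# Step 1 (linear metamodel): regress INB on theta to approximate the
# conditional expectation of INB given theta.
slope, intercept = np.polyfit(theta, inb, 1)
fitted = intercept + slope * theta

# Step 2 (Gaussian approximation): shrink the fitted values toward
# their mean to mimic the preposterior mean for a study of size n,
# using the n/(n0 + n) variance-reduction factor of conjugate normal
# updating; n0 is the assumed prior effective sample size.
n0, n = 25, 100
shrink = np.sqrt(n / (n0 + n))
prepost = fitted.mean() + shrink * (fitted - fitted.mean())

# Two-strategy EVSI: expected value of deciding after the study minus
# the value of deciding now.
evsi = np.mean(np.maximum(prepost, 0)) - max(float(np.mean(inb)), 0.0)
print(f"EVSI per person ~ {evsi:,.0f} (monetary units)")
```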

  20. Remote sensing of ferric iron minerals as guides for gold exploration

    NASA Technical Reports Server (NTRS)

    Taranik, Dan L.; Kruse, Fred A.; Goetz, Alexander F. H.; Atkinson, William W.

    1991-01-01

    The relationship between the surficial iron mineralogy and economic mineralization is investigated, using data from an airborne imaging spectrometer (the 63-channel Geophysical and Environmental Research Imaging Spectrometer) to map the distribution of iron minerals in the Cripple Creek mining district in Colorado. The airborne image data were coregistered with the field map data for the distribution of iron oxides in the district, in a geographic information computer system, in order to compare their information content. It is shown that the remote imagery was able to uniquely identify the mineral hematite, a mixture of goethite/jarosite, and a mixture of hematite/goethite.

  1. Prediction of resource volumes at untested locations using simple local prediction models

    USGS Publications Warehouse

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

    This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
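
    A minimal sketch of the bootstrap step for regional confidence bounds: resample per-cell predictions with replacement and read percentile bounds off the resampled totals. The per-cell volumes here are synthetic; in the paper they (and their jackknife errors) come from the local spatial prediction models.

```python
import numpy as np
rng = np.random.default_rng(2)

# Synthetic per-cell predicted volumes; the paper derives these (and
# their jackknife errors) from local spatial prediction models.
pred = rng.lognormal(mean=2.0, sigma=1.0, size=200)
totals = [pred[rng.integers(0, pred.size, pred.size)].sum()
          for _ in range(5_000)]
lo, hi = np.percentile(totals, [5, 95])
print(f"total {pred.sum():,.0f}; 90% bootstrap bounds [{lo:,.0f}, {hi:,.0f}]")
```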

  2. Gaming via Computer Simulation Techniques for Junior College Economics Education. Final Report.

    ERIC Educational Resources Information Center

    Thompson, Fred A.

    A study designed to answer the need for more attractive and effective economics education involved the teaching of one junior college economics class by the conventional (lecture) method and an experimental class by computer simulation techniques. Econometric models approximating the "real world" were computer programed to enable the experimental…

  3. Decentralized Optimal Dispatch of Photovoltaic Inverters in Residential Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Anese, Emiliano; Dhople, Sairaj V.; Johnson, Brian B.

    Decentralized methods for computing optimal real and reactive power setpoints for residential photovoltaic (PV) inverters are developed in this paper. It is known that conventional PV inverter controllers, which are designed to extract maximum power at unity power factor, cannot address secondary performance objectives such as voltage regulation and network loss minimization. Optimal power flow techniques can be utilized to select which inverters will provide ancillary services, and to compute their optimal real and reactive power setpoints according to well-defined performance criteria and economic objectives. Leveraging advances in sparsity-promoting regularization techniques and semidefinite relaxation, this paper shows how such problems can be solved with reduced computational burden and optimality guarantees. To enable large-scale implementation, a novel algorithmic framework is introduced - based on the so-called alternating direction method of multipliers - by which optimal power flow-type problems in this setting can be systematically decomposed into sub-problems that can be solved in a decentralized fashion by the utility and customer-owned PV systems with limited exchanges of information. Since the computational burden is shared among multiple devices and the requirement of all-to-all communication can be circumvented, the proposed optimization approach scales favorably to large distribution networks.
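
    The flavor of the ADMM decomposition can be seen in a toy sharing problem: N inverters split a required total curtailment so that the sum of assumed quadratic discomfort costs is minimized, with each device solving only a local update against shared averages. This is a generic consensus/sharing ADMM sketch, not the paper's OPF formulation, and all parameters are invented.

```python
import numpy as np

# Toy sharing problem solved with ADMM: N inverters split a required
# total curtailment d so that the sum of assumed quadratic discomfort
# costs a_i * x_i^2 is minimized. Each device's update uses only its
# own state plus shared averages, so no all-to-all communication is
# needed. All parameters are invented for illustration.
N, d, rho = 5, 10.0, 1.0
a = np.array([1.0, 2.0, 0.5, 1.5, 3.0])   # local cost curvatures
x = np.zeros(N)                            # local curtailments
u = 0.0                                    # scaled dual variable
for _ in range(200):
    xbar = x.mean()
    x = rho * (x - xbar + d / N - u) / (2 * a + rho)   # local steps
    u += x.mean() - d / N                              # dual update
print(np.round(x, 3), "sum =", round(float(x.sum()), 3))
```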

  4. An oscillatory kernel function method for lifting surfaces in mixed transonic flow

    NASA Technical Reports Server (NTRS)

    Cunningham, A. M., Jr.

    1974-01-01

    A study was conducted on the use of combined subsonic and supersonic linear theory to obtain economical and yet realistic solutions to unsteady transonic flow problems. With some modification, existing linear theory methods were combined into a single computer program. The method was applied to problems for which measured steady Mach number distributions and unsteady pressure distributions were available. By comparing theory and experiment, the transonic method showed a significant improvement over uniform flow methods. The results also indicated that more exact local Mach number effects and normal shock boundary conditions on the perturbation potential were needed. The validity of these improvements was demonstrated by application to steady flow.

  5. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    ERIC Educational Resources Information Center

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economic computer simulation and to ascertain how a lecture-computer simulation game compared as a teaching method with a more traditional lecture and case study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  6. Distributed energy storage systems on the basis of electric-vehicle fleets

    NASA Astrophysics Data System (ADS)

    Zhuk, A. Z.; Buzoverov, E. A.; Sheindlin, A. E.

    2015-01-01

    Several power technologies aimed at covering nonuniform loads in power systems are being developed at the Joint Institute for High Temperatures, Russian Academy of Sciences (JIHT RAS). One line of investigation is the use of the storage batteries of electric vehicles to compensate load peaks in the power system (V2G, vehicle-to-grid technology). In this article, the efficiency of energy storage systems based on electric vehicles is compared, by computation, with that of traditional energy technologies, using the criterion of minimum cost of peak energy supplied to the system. The computations show that distributed storage systems based on fleets of electric cars are economically efficient when used up to about 1 h/day. In contrast to traditional methods, the prime cost of load regulation in a power system based on V2G technology is independent of the duration of the load-compensation period (the duration of the consumption peak).
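
    A back-of-envelope version of the minimum-cost comparison, with every input invented: the V2G cost per peak kWh (energy plus battery wear) is flat in hours of use, while a peaker's capital recovery per kWh falls as daily use grows, which is the structural effect the abstract reports.

```python
# Flat V2G cost per peak kWh vs. a peaker whose capital recovery per
# kWh falls with daily hours of use. All inputs are invented.
def v2g_cost(batt_cost=150.0, cycles=3000, eff=0.85, energy=0.08):
    wear = batt_cost / cycles          # $/kWh of battery throughput
    return energy / eff + wear

def peaker_cost(capex_kw=800.0, years=20, h_per_day=1.0, fuel=0.09):
    capital = capex_kw / (years * 365 * h_per_day)   # $/kWh served
    return capital + fuel

for h in (0.5, 1.0, 2.0, 4.0):
    print(f"{h:3.1f} h/day: V2G {v2g_cost():.3f} $/kWh,"
          f" peaker {peaker_cost(h_per_day=h):.3f} $/kWh")
```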

  7. Robustness of disaggregate oil and gas discovery forecasting models

    USGS Publications Warehouse

    Attanasi, E.D.; Schuenemeyer, J.H.

    1989-01-01

    The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive log normal distribution associated with the creaming model methods appears to understate the importance of the potential contribution of small fields. © 1989.

  8. Lognormal field size distributions as a consequence of economic truncation

    USGS Publications Warehouse

    Attanasi, E.D.; Drew, L.J.

    1985-01-01

    The assumption of lognormal (parent) field size distributions has long been applied to resource appraisal and to the evaluation of exploration strategy by the petroleum industry. However, the frequency distributions estimated with observed data and used to justify this hypothesis are conditional. Examination of various observed field size distributions across basins and over time shows that such distributions should be regarded as the end result of an economic filtering process. Commercial discoveries depend on oil and gas prices and field development costs; some new fields are eliminated due to location, depth, or water depth. This filtering process is called economic truncation. Economic truncation may occur when predictions of a discovery process are passed through an economic appraisal model. We demonstrate that (1) economic resource appraisals, (2) forecasts of levels of petroleum industry activity, and (3) expected benefits of developing and implementing cost-reducing technology are sensitive to assumptions made about the nature of the portion of the (parent) field size distribution subject to economic truncation. © 1985 Plenum Publishing Corporation.
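
    Economic truncation is easy to see numerically: draw a lognormal parent population and keep only fields above a minimum commercial size. The sketch below, with invented parameters, shows how the observed (conditional) distribution differs from the parent.

```python
import numpy as np
rng = np.random.default_rng(3)

# Parent lognormal field sizes filtered by a minimum commercial size;
# all numbers are invented.
parent = rng.lognormal(mean=2.0, sigma=1.2, size=100_000)
cutoff = 5.0                                    # economic cutoff
observed = parent[parent >= cutoff]
print(f"{observed.size / parent.size:.1%} of fields survive truncation; "
      f"observed mean {observed.mean():.1f} vs parent mean {parent.mean():.1f}")
```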

  9. Development of an integrated economic and ecological framework for ecosystem-based fisheries management in New England

    NASA Astrophysics Data System (ADS)

    Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.

    2012-09-01

    We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of and modifications and limitations to the framework.

  10. Design of material management system of mining group based on Hadoop

    NASA Astrophysics Data System (ADS)

    Xia, Zhiyuan; Tan, Zhuoying; Qi, Kuan; Li, Wen

    2018-01-01

    Against the background of the current persistent slowdown in the mining market, improving the level of management in a mining group has become the key to improving the economic benefit of a mine. Guided by the practical requirements of material management in a mining group, three core components of Hadoop are applied: the distributed file system HDFS, the distributed computing framework Map/Reduce, and the distributed database HBase. A material management system for the mining group is constructed from these three core components together with the SSH framework. The system strengthens collaboration between the mining group and its affiliated companies; solves the inefficient management, server pressure, and hardware performance problems of traditional mining material-management systems; optimizes the group's materials management; reduces management cost; and increases enterprise profit.
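
    The Map/Reduce pattern the system builds on can be shown in miniature: map emits (material, quantity) pairs from distributed consumption records and reduce sums them per material. The record layout and names below are hypothetical, and a real deployment would run this through Hadoop rather than in-process Python.

```python
from collections import defaultdict

# map emits (material, quantity) pairs from per-site consumption
# records; reduce sums quantities per material. Record layout and
# names are hypothetical.
records = [
    ("mine-A", "drill-bit", 120), ("mine-B", "drill-bit", 80),
    ("mine-A", "explosive", 40), ("mine-C", "explosive", 25),
]

def mapper(record):
    _site, material, qty = record
    yield material, qty

def reducer(pairs):
    totals = defaultdict(int)
    for material, qty in pairs:
        totals[material] += qty
    return dict(totals)

print(reducer(kv for rec in records for kv in mapper(rec)))
```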

  11. Interactive Computer Lessons for Introductory Economics: Guided Inquiry-From Supply and Demand to Women in the Economy.

    ERIC Educational Resources Information Center

    Miller, John; Weil, Gordon

    1986-01-01

    The interactive feature of computers is used to incorporate a guided inquiry method of learning introductory economics, extending the Computer Assisted Instruction (CAI) method beyond drills. (Author/JDH)

  12. The Gradual Shift of Overweight, Obesity, and Abdominal Obesity Towards the Poor in a Multi-ethnic Developing Country: Findings From the Malaysian National Health and Morbidity Surveys

    PubMed Central

    Mariapun, Jeevitha; Ng, Chiu-Wan; Hairi, Noran N.

    2018-01-01

    Background Economic development is known to shift the distribution of obesity from the socioeconomically more advantaged to the less advantaged. We assessed the socioeconomic trends in overweight, obesity, and abdominal obesity across a period of significant economic growth. Methods We used the Malaysian National Health and Morbidity Survey data sets for the years 1996, 2006, and 2011 to analyze the trends among adults aged 30 years and above. The World Health Organization’s Asian body mass index cut-off points of ≥23.0 kg/m2 and ≥27.5 kg/m2 were used to define overweight and obesity, respectively. Abdominal obesity was defined as having a waist circumference of ≥90 cm for men and ≥80 cm for women. Household per-capita income was used as a measure of socioeconomic position. As a summary measure of inequality, we computed the concentration index. Results Women in Peninsular Malaysia demonstrated patterns that were similar to that of developed countries in which the distributions for overweight, obesity, and abdominal obesity became concentrated among the poor. For women in East Malaysia, distributions became neither concentrated among the rich nor poor, while distributions for men were still concentrated among the rich. Chinese women, particularly from the richest quintile, had the lowest rates and lowest increase in overweight and obesity. All distributions of Chinese women were concentrated among the poor. The distributions of Malay men were still concentrated among the rich, while distributions for Chinese and Indian men and Malay and Indian women were neither concentrated among the rich nor poor. Conclusion As the country continues to progress, increasing risks of overweight and obesity among the socioeconomically less advantaged is expected. PMID:29657257
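
    The concentration index the study uses as its summary measure of inequality can be computed with the standard covariance formula CI = 2·cov(h, r)/mean(h), where r is the fractional income rank; negative values indicate concentration among the poor. The sketch below applies it to invented data, not the survey's.

```python
import numpy as np
rng = np.random.default_rng(4)

def concentration_index(health, income):
    """CI = 2 * cov(h, r) / mean(h), with r the fractional income
    rank; negative values mean concentration among the poor."""
    n = len(health)
    rank = (np.argsort(np.argsort(income)) + 0.5) / n
    cov = np.mean((health - health.mean()) * (rank - rank.mean()))
    return 2 * cov / health.mean()

# Invented data: a binary obesity indicator more common at low income.
income = rng.lognormal(8.0, 1.0, 1000)
obese = (rng.random(1000) < 0.3 / (1 + income / income.mean())).astype(float)
print(f"CI = {concentration_index(obese, income):+.3f}")
```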

  14. The Economics of Educational Software Portability.

    ERIC Educational Resources Information Center

    Oliveira, Joao Batista Araujo e

    1990-01-01

    Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…

  15. Social inequality in adolescents' healthy food intake: the interplay between economic, social and cultural capital.

    PubMed

    De Clercq, Bart; Abel, Thomas; Moor, Irene; Elgar, Frank J; Lievens, John; Sioen, Isabelle; Braeckman, Lutgart; Deforche, Benedicte

    2017-04-01

    Current explanations of health inequalities in adolescents focus on behavioural and economic determinants and rarely include more meaningful forms of economic, cultural, and social capital. The aim of the study was to investigate how the interplay between capitals constitutes social inequalities in adolescent healthy food intake. Data were collected in the 2013/14 Flemish Health Behaviour in School-aged Children (HBSC) survey, which is part of the international WHO HBSC survey. The total sample included 7266 adolescents aged 12-18. A comprehensive set of 58 capital indicators was used to measure economic, cultural and social capital, and a healthy food index was computed from a 17-item food frequency questionnaire (FFQ) to assess the consumption frequency of healthy food within the overall food intake. The different forms of capital were unequally distributed in accordance with the subdivisions within the education system. Only half of the capital indicators were positively related to healthy food intake, and 17 interactions were found that either increased or reduced inequalities. Cultural capital was a crucial component for explaining inequalities: social gradients in healthy food intake increased when adolescents participated in elite cultural practices (P < 0.05) and were reduced when adolescents reported having a high number of books at home (P < 0.05). A combination of selected resources in the form of economic, cultural and social capital may either increase or reduce inequalities in adolescents' healthy food intake. Policy action needs to take into account the unequal distribution of these resources within the education system. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  16. Introduction to Computers for Home Economics Teachers.

    ERIC Educational Resources Information Center

    Thompson, Cecelia; And Others

    Written in simple language and designed in a large-print format, this short guide is aimed at teaching home economics teachers to use computers in their classrooms. The guide is organized in six sections. The first section covers the basics of computer equipment and explains how computers work while the second section outlines how to use…

  17. Representation-Independent Iteration of Sparse Data Arrays

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    An approach is described for iterating over massively large sparse arrays in a way that is independent of how the array contents are laid out in memory. The key idea is the decoupling of iteration over the sparse set of array elements from their internal memory representation, which makes the approach backward compatible with existing schemes for representing sparse arrays as well as with new ones. A functional interface is defined for implementing sparse arrays in any modern programming language, with a particular focus on the Chapel programming language. Examples are provided that show the translation of a loop computing a matrix-vector product into this representation for both the distributed and non-distributed cases. This work is directly applicable to NASA's High Productivity Computing Systems (HPCS) program, in which JPL and our current program are engaged. The goal of this program is to create powerful, scalable, and economically viable high-powered computer systems suitable for use in national security and industry by 2010. This is important to NASA because of its computationally intensive requirements for analyzing and understanding the volumes of science data from returned missions.
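
    The decoupling idea can be illustrated outside Chapel: the consumer below iterates over (index, value) pairs through a small functional interface, so the same matrix-vector loop works for a dictionary-of-keys layout and a CSR-style layout alike. Class and method names are invented for illustration.

```python
# Two sparse layouts behind one iteration interface; names invented.
class DokSparse:                       # dictionary-of-keys layout
    def __init__(self, data):          # data: {(i, j): value}
        self.data = data
    def nonzeros(self):
        yield from self.data.items()

class CsrSparse:                       # CSR-style parallel arrays
    def __init__(self, indptr, indices, values):
        self.indptr, self.indices, self.values = indptr, indices, values
    def nonzeros(self):
        for i in range(len(self.indptr) - 1):
            for k in range(self.indptr[i], self.indptr[i + 1]):
                yield (i, self.indices[k]), self.values[k]

def matvec(sparse, x):
    """Matrix-vector product written only against the iterator, so it
    never sees the underlying memory layout."""
    y = {}
    for (i, j), v in sparse.nonzeros():
        y[i] = y.get(i, 0.0) + v * x[j]
    return y

a = DokSparse({(0, 1): 2.0, (1, 0): 3.0})
b = CsrSparse([0, 1, 2], [1, 0], [2.0, 3.0])
print(matvec(a, [1.0, 1.0]), matvec(b, [1.0, 1.0]))  # identical results
```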

  18. Version 3.0 of EMINERS - Economic Mineral Resource Simulator

    USGS Publications Warehouse

    Duval, Joseph S.

    2012-01-01

    Quantitative mineral resource assessment, as developed by the U.S. Geological Survey (USGS), consists of three parts: (1) development of grade and tonnage mineral deposit models; (2) delineation of tracts permissive for each deposit type; and (3) probabilistic estimation of the numbers of undiscovered deposits for each deposit type. The estimate of the number of undiscovered deposits at different levels of probability is the input to the EMINERS (Economic Mineral Resource Simulator) program. EMINERS uses a Monte Carlo statistical process to combine probabilistic estimates of undiscovered mineral deposits with models of mineral deposit grade and tonnage to estimate mineral resources. Version 3.0 of the EMINERS program is available as this USGS Open-File Report 2004-1344. Changes from version 2.0 include updating 87 grade and tonnage models, designing new templates to produce graphs showing cumulative distribution and summary tables, and disabling economic filters. The economic filters were disabled because embedded data for costs of labor and materials, mining techniques, and beneficiation methods are out of date. However, the cost algorithms used in the disabled economic filters are still in the program and available for reference for mining methods and milling techniques. The release notes included with this report give more details on changes in EMINERS over the years. EMINERS is written in C++ and depends upon the Microsoft Visual C++ 6.0 programming environment. The code depends heavily on the use of Microsoft Foundation Classes (MFC) for implementation of the Windows interface. The program works only on Microsoft Windows XP or newer personal computers. It does not work on Macintosh computers. For help in using the program in this report, see the "Quick-Start Guide for Version 3.0 of EMINERS-Economic Mineral Resource Simulator" (W.J. Bawiec and G.T. Spanski, 2012, USGS Open-File Report 2009-1057). It demonstrates how to execute EMINERS software using default settings and existing deposit models.
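
    The Monte Carlo combination step is simple to sketch: draw a number of undiscovered deposits from the assessors' probabilistic estimate, then a tonnage for each from a grade/tonnage model, and accumulate totals. The distributions below are invented placeholders, not USGS models.

```python
import numpy as np
rng = np.random.default_rng(7)

# Invented elicited estimate of the number of undiscovered deposits,
# and an invented lognormal tonnage model.
n_dep_pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}
totals = []
for _ in range(10_000):
    n = rng.choice(list(n_dep_pmf), p=list(n_dep_pmf.values()))
    tons = rng.lognormal(mean=16.0, sigma=1.0, size=n)   # tonnage draws
    totals.append(tons.sum())
print(np.percentile(totals, [10, 50, 90]))   # resource quantiles
```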

  19. Design of shared unit-dose drug distribution network using multi-level particle swarm optimization.

    PubMed

    Chen, Linjie; Monteiro, Thibaud; Wang, Tao; Marcon, Eric

    2018-03-01

    Unit-dose drug distribution systems provide optimal choices in terms of medication security and efficiency for organizing the drug-use process in large hospitals. As small hospitals have to share such automatic systems for economic reasons, the structure of their logistic organization becomes a very sensitive issue. In the research reported here, we develop a generalized multi-level optimization method - multi-level particle swarm optimization (MLPSO) - to design a shared unit-dose drug distribution network. Structurally, the problem studied can be considered as a type of capacitated location-routing problem (CLRP) with new constraints related to specific production planning. This kind of problem implies that a multi-level optimization should be performed in order to minimize logistic operating costs. Our results show that with the proposed algorithm, a more suitable modeling framework, as well as computational time savings and better optimization performance are obtained than that reported in the literature on this subject.
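
    The particle-swarm building block underneath MLPSO is a standard velocity/position update toward personal and global bests. The sketch below runs plain single-level PSO on a stand-in cost function; the paper's multi-level scheme and its routing constraints are not reproduced, and all settings here are illustrative.

```python
import numpy as np
rng = np.random.default_rng(5)

# Plain single-level PSO on a stand-in cost function; the paper's
# multi-level scheme and routing constraints are not reproduced.
def cost(x):                           # hypothetical cost surrogate
    return np.sum(x ** 2, axis=1)

n, dim, w, c1, c2 = 30, 4, 0.7, 1.5, 1.5
x = rng.uniform(-10, 10, (n, dim))     # particle positions
v = np.zeros((n, dim))                 # particle velocities
pbest, pcost = x.copy(), cost(x)       # personal bests
for _ in range(100):
    gbest = pbest[pcost.argmin()]      # global best
    v = (w * v + c1 * rng.random((n, dim)) * (pbest - x)
               + c2 * rng.random((n, dim)) * (gbest - x))
    x = x + v
    c = cost(x)
    improved = c < pcost
    pbest[improved], pcost[improved] = x[improved], c[improved]
print(f"best cost found: {pcost.min():.6f}")
```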

  20. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_{i,1}, m_{i,2}, ..., m_{i,10}) denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled as dependent on m_i via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability P_{kl}(i, j) of getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of P_{kl}(i, j) as i varies gives an indication of possible transitions of the credit rating of the j-th company in the near future.
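
    For contrast with the paper's model-based approach, the simplest estimator of a rating transition matrix just counts observed moves and normalizes rows. The sketch below shows that counting estimator on a hypothetical rating sequence; the paper instead derives P_{kl}(i, j) from the fitted power-normal mixture.

```python
import numpy as np

def transition_matrix(ratings, k):
    """Row-normalized counts of observed moves between k rating
    states (0..k-1). Rows with no observations stay all-zero."""
    counts = np.zeros((k, k))
    for a, b in zip(ratings[:-1], ratings[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts),
                     where=rows > 0)

ratings = [3, 3, 2, 2, 3, 4, 4, 3, 2, 1, 1, 2]   # hypothetical quarters
print(transition_matrix(ratings, k=5).round(2))
```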

  1. Organization of the secure distributed computing based on multi-agent system

    NASA Astrophysics Data System (ADS)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Methods for organizing distributed computing are currently receiving much attention; one approach is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats posed by the computational processes themselves. The authors have developed a unified agent algorithm for controlling the operation of computing-network nodes, with ordinary networked PCs serving as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks. Agents running on the networked computers can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the system according to the computing power of the machines on the network. The number of computers can be increased by connecting new machines to the system, raising the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computation. This organization reduces problem-solving time and increases the fault tolerance (vitality) of the computing processes in a changing environment (dynamic changes in the number of computers on the network). The developed multi-agent system also detects falsification of the results of the distributed computation, which could otherwise lead to wrong decisions, and checks and corrects erroneous results.

  2. Los Alamos National Laboratory Economic Analysis Capability Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith; Pasqualini, Donatella

    Los Alamos National Laboratory has developed two types of models to compute the economic impact of infrastructure disruptions. FastEcon is a fast running model that estimates first-order economic impacts of large scale events such as hurricanes and floods and can be used to identify the amount of economic activity that occurs in a specific area. LANL's Computable General Equilibrium (CGE) model estimates more comprehensive static and dynamic economic impacts of a broader array of events and captures the interactions between sectors and industries when estimating economic impacts.

  3. Using Computers in Undergraduate Economics Courses.

    ERIC Educational Resources Information Center

    Barr, Saul Z.; Harmon, Oscar

    Seven computer assignments for undergraduate economics students that concentrate on building a foundation for programming higher level mathematical calculations are described. The purpose of each assignment, the computer program for it, and the correct answers are provided. "Introduction to Text Editing" acquaints the student with some…

  4. Macromod: Computer Simulation For Introductory Economics

    ERIC Educational Resources Information Center

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  5. Tableau Economique: Teaching Economics with a Tablet Computer

    ERIC Educational Resources Information Center

    Scott, Robert H., III

    2011-01-01

    The typical method of instruction in economics is chalk and talk. Economics courses often require writing equations and drawing graphs and charts, which are all best done in freehand. Unlike static PowerPoint presentations, tablet computers create dynamic nonlinear presentations. Wireless technology allows professors to write on their tablets and…

  6. The relationship between venture capital investment and macro economic variables via statistical computation method

    NASA Astrophysics Data System (ADS)

    Aygunes, Gunes

    2017-07-01

    The objective of this paper is to survey and determine the macroeconomic factors affecting the level of venture capital (VC) investment in a country. The literature relates venture capital investment to venture capitalists' quality and to country-level conditions. The aim of this paper is to characterize the relationship between venture capital investment and macroeconomic variables via a statistical computation method. We investigate a set of countries and macroeconomic variables and derive correlations between venture capital investments and those variables. According to a logistic regression model (logit model), the macroeconomic variables are correlated with each other in three groups; venture capitalists can regard these correlations as an indicator. Finally, we give the correlation matrix of our results.

  7. Modern Methods for fast generation of digital holograms

    NASA Astrophysics Data System (ADS)

    Tsang, P. W. M.; Liu, J. P.; Cheung, K. W. K.; Poon, T.-C.

    2010-06-01

    With the advancement of computers, digital holography (DH) has become an area of interest that has gained much popularity. Research findings derived from this technology enable holograms representing three-dimensional (3-D) scenes to be acquired with optical means, or generated with numerical computation. In both cases, the holograms are in the form of numerical data that can be recorded, transmitted, and processed with digital techniques. On top of that, the availability of high-capacity digital storage and wide-band communication technologies also casts light on the emergence of real-time video holographic systems, enabling animated 3-D content to be encoded as holographic data and distributed via existing media. At present, development in DH has reached a reasonable degree of maturity, but the heavy computation involved still imposes difficulty in practical applications. In this paper, a summary of a number of recent accomplishments in overcoming this problem is presented. Subsequently, we propose an economical framework that is suitable for real-time generation and transmission of holographic video signals over existing distribution media. The proposed framework includes an aspect of extending the depth range of the object scene, which is important for the display of large-scale objects.

  8. GENIE(++): A Multi-Block Structured Grid System

    NASA Technical Reports Server (NTRS)

    Williams, Tonya; Nadenthiran, Naren; Thornburg, Hugh; Soni, Bharat K.

    1996-01-01

    The computer code GENIE++ is a continuously evolving grid system containing a multitude of proven geometry/grid techniques. The generation process in GENIE++ is based on an earlier version of the code. The process uses several techniques, either separately or in combination, to quickly and economically generate sculptured geometry descriptions and grids for arbitrary geometries. The computational mesh is formed by using an appropriate algebraic method. Grid clustering is accomplished with either exponential or hyperbolic tangent routines which allow the user to specify a desired point distribution. Grid smoothing can be accomplished by using an elliptic solver with proper forcing functions. B-spline and Non-Uniform Rational B-spline (NURBS) algorithms are used for surface definition and redistribution. The built-in sculptured geometry definition with desired distribution of points, automatic Bezier curve/surface generation for interior boundaries/surfaces, and surface redistribution are based on NURBS. Weighted Lagrange/Hermite transfinite interpolation methods, interactive geometry/grid manipulation modules, and on-line graphical visualization of the generation process are salient features of this system, which result in significant time savings for a given geometry/grid application.
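
    Hyperbolic-tangent clustering of grid points can be written in a few lines. The function below is a generic one-sided textbook form (points crowd toward s = 0 as the stretching parameter grows), not GENIE++'s exact routine.

```python
import numpy as np

def tanh_cluster(n, beta):
    """One-sided hyperbolic-tangent clustering on [0, 1]: points
    crowd toward s = 0 as the stretching parameter beta grows."""
    s = np.linspace(0.0, 1.0, n)
    return 1.0 + np.tanh(beta * (s - 1.0)) / np.tanh(beta)

print(np.round(tanh_cluster(11, beta=2.5), 3))  # tight spacing near 0
```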

  9. The Treatment of Wealth Distribution by High School Economics Textbooks

    ERIC Educational Resources Information Center

    Neumann, Richard

    2014-01-01

    This article presents findings from an investigation of the treatment of wealth distribution by high school economics textbooks. The eight leading high school economics texts in the United States were examined.

  10. Energy, economic growth, and equity in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kannan, N.P.

    1979-01-01

    Decades of economic growth in the United States, although improving the lot of many, have failed to solve the problem of poverty. Islands of acute poverty persist amidst affluence even today, invalidating the conventional wisdom that a growing economy lifts everyone. For better or for worse, economic growth, which has been mainly dependent upon energy, has been relied upon to solve the problem of poverty, and the insidious energy crisis that confronts us today threatens this economic growth and the dream of an equitable society. For this reason it is important to consider all the potential consequences of energy policies that are designed to help achieve energy self-sufficiency. In this study alternate energy policies are identified and compared for their relative degrees of potential trade-offs. The evaluation of the policies is carried out with the aid of two computer simulation models, ECONOMY1 and FOSSIL1, which are designed to capture the interactions between the energy sector and the rest of the economy of the United States. The study proposes an alternate set of hypotheses that emphasize the dynamics of social conflict over the distributive shares in the economy. The ECONOMY1 model is based on these hypotheses. 103 references, 79 figures, 16 tables.

  11. Computer versus Paper Testing in Precollege Economics

    ERIC Educational Resources Information Center

    Butters, Roger B.; Walstad, William B.

    2011-01-01

    Interest is growing at the precollege level in computer testing (CT) instead of paper-and-pencil testing (PT) for subjects in the school curriculum, including economics. Before economic educators adopt CT, a better understanding of its likely effects on test-taking behavior and performance compared with PT is needed. Using two volunteer student…

  12. Technological Change in Assessing Economics: A Cautionary Welcome

    ERIC Educational Resources Information Center

    Kennelly, Brendan; Considine, John; Flannery, Darragh

    2009-01-01

    The use of computer-based automated assignment systems in economics has expanded significantly in recent years. The most widely used system is Aplia which was developed by Paul Romer in 2000. Aplia is a computer application designed to replace traditional paper-based assignments in economics. The main features of Aplia are: (1) interactive content…

  13. Confidence bands for measured economically optimal nitrogen rates

    USDA-ARS?s Scientific Manuscript database

    While numerous researchers have computed economically optimal N rate (EONR) values from measured yield versus N-rate data, nearly all have neglected to compute or estimate the statistical reliability of these EONR values. In this study, a simple method for computing EONR and its confidence bands is described...
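
    For a quadratic yield response y = b0 + b1·N + b2·N², setting the marginal value of yield equal to the fertilizer price gives EONR = (r/p - b1)/(2·b2), with grain price p and N price r. The numbers below are illustrative, not from the study, which additionally derives confidence bands for the estimate.

```python
# Quadratic response fit and prices, all invented for illustration.
b0, b1, b2 = 5.0, 0.04, -1.0e-4   # Mg/ha, Mg/ha per kg N, curvature
p, r = 180.0, 0.9                 # grain price $/Mg, N price $/kg
eonr = (r / p - b1) / (2 * b2)    # marginal value product = N price
print(f"EONR ~ {eonr:.0f} kg N/ha")
```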

  14. Electric Composition Cost Comparison.

    ERIC Educational Resources Information Center

    Joint Committee on Printing, Washington, DC.

    Experience of the U.S. Government Printing Office and others has shown that electronic composition of computer processed data is more economical than printing from camera copy produced by the line printers of digital computers. But electronic composition of data not already being processed by computer is not necessarily economical. This analysis…

  15. The Impact of Economic Policies on Poverty and Income Distribution: Evaluation Techniques and Tools.

    ERIC Educational Resources Information Center

    Bourguignon, Francois, Ed.; Pereira da Silva, Luiz A., Ed.

    This book, a collection of articles and papers, reviews techniques and tools that can be used to evaluate the poverty and distributional impact of economic policy choices. Following are its contents: "Evaluating the Poverty and Distributional Impact of Economic Policies: A Compendium of Existing Techniques" (Francois Bourguignon and Luiz A.…

  16. Developing an Index to Measure Health System Performance: Measurement for Districts of Nepal.

    PubMed

    Kandel, N; Fric, A; Lamichhane, J

    2014-01-01

    Various frameworks for measuring health system performance have been proposed and discussed. The scope for using performance indicators is broad, ranging from examining a national health system down to individual patients at various levels of the health system. Development of an innovative and easy-to-use index is essential to capture the multidimensionality of health systems. We used indicators that also serve as proxies for the set of activities whose primary goal is to maintain and improve health. We used eleven MDG indicators, which represent all dimensions of health, to develop the index; these indicators are combined with a methodology similar to that of the human development index. We used published data for Nepal to compute the index for its districts as an illustration. To validate our findings, we compared the indices of these districts with other development indices for Nepal. An index for each district was computed from the eleven indicators. The indices were then compared with the human development index and with socio-economic and infrastructure development indices, and the findings showed a similar distribution of districts: districts performing low or high on health system performance also have correspondingly low or high human development, socio-economic, and infrastructure indices. This methodology of computing an index from various indicators could assist policy makers and program managers in prioritizing activities based on performance. Validation of the findings against other development indicators shows that this can be one of the tools for assessing health system performance for policy makers, program managers, and others.
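
    The index construction described mirrors the human development index recipe: rescale each indicator to [0, 1] between fixed goalposts, invert indicators where lower is better, and combine the normalized scores. The sketch below illustrates that generic recipe; the indicator names, goalposts, and the plain arithmetic mean are illustrative assumptions, not the study's exact procedure.

```python
# Minimal sketch of an HDI-style composite index (illustrative only).
import numpy as np

def composite_index(values, mins, maxs, lower_is_better=None):
    """Min-max normalize each indicator to [0, 1] and average.

    values, mins, maxs: raw indicator values and their goalposts.
    lower_is_better: boolean mask for indicators such as mortality,
    where a lower raw value should yield a higher score.
    """
    values, mins, maxs = map(np.asarray, (values, mins, maxs))
    norm = (values - mins) / (maxs - mins)
    if lower_is_better is not None:
        norm = np.where(lower_is_better, 1.0 - norm, norm)
    return float(norm.mean())

# Hypothetical district: skilled birth attendance (%), under-five
# mortality (per 1,000 live births), measles immunization (%).
score = composite_index(values=[55.0, 61.0, 88.0],
                        mins=[0, 20, 0], maxs=[100, 200, 100],
                        lower_is_better=[False, True, False])
print(round(score, 3))
```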

  17. High Performance Computing for Modeling Wind Farms and Their Impact

    NASA Astrophysics Data System (ADS)

    Mavriplis, D.; Naughton, J. W.; Stoellinger, M. K.

    2016-12-01

    As energy generated by wind penetrates further into our electrical system, modeling of power production, power distribution, and the economic impact of wind-generated electricity is growing in importance. The models used for this work can range in fidelity from simple codes that run on a single computer to those that require high performance computing capabilities. Over the past several years, high fidelity models have been developed and deployed on the NCAR-Wyoming Supercomputing Center's Yellowstone machine. One of the primary modeling efforts focuses on developing the capability to compute the behavior of a wind farm in complex terrain under realistic atmospheric conditions. Fully modeling this system requires simulating everything from continental-scale flows down to the flow over a wind turbine blade, including the blade boundary layer, fully 10 orders of magnitude in scale. To accomplish this, the simulations are broken up by scale, with information from the larger scales being passed to the smaller-scale models. In the code being developed, four scale levels are included: the continental weather scale, the local atmospheric flow in complex terrain, the wind plant scale, and the turbine scale. The current state of the models in the latter three scales will be discussed. These simulations are based on a high-order accurate dynamic overset and adaptive mesh approach, which runs at large scale on the NWSC Yellowstone machine. A second effort on modeling the economic impact of new wind development, as well as improvements in wind plant performance and enhancements to the transmission infrastructure, will also be discussed.

  18. An economic and financial exploratory

    NASA Astrophysics Data System (ADS)

    Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.

    2012-11-01

    This paper describes the vision of a European Exploratory for economics and finance using an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers, who will combine their expertise to address the enormous challenges of the 21st century. This academic public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists to collaborate in investigating major issues in economics and finance. It is also intended as a cradle for training and for collaboration with the private sector, to spur spin-offs and job creation in Europe in the finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of the economies of countries in their various components, (ii) use, extend and develop a large variety of methods, including data mining, process mining, computational and artificial intelligence and other computer science and complexity science techniques, coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analysis, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.

  19. Economic development evaluation based on science and patents

    NASA Astrophysics Data System (ADS)

    Jokanović, Bojana; Lalic, Bojan; Milovančević, Miloš; Simeunović, Nenad; Marković, Dusan

    2017-09-01

    Economic development can be achieved through many factors, and science and technology factors can influence it drastically. Because economic analysis can be a very challenging task due to high nonlinearity, in this study a computational intelligence methodology, the artificial neural network approach, was applied to estimate economic development from different science and technology factors. Gross domestic product (GDP) was used as the measure of economic development, and patents in different fields were used as the science and technology factors. It was found that patents in the electrical engineering field have the highest influence on economic development, or GDP.
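
    As a rough illustration of the approach described, the sketch below fits a small feed-forward network (scikit-learn's MLPRegressor) to synthetic patent counts with GDP as the target. The field names, data, and network size are placeholders, not the study's dataset or architecture.

```python
# Minimal sketch: neural-network regression of GDP on patent counts.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
# Hypothetical fields: electrical engineering, chemistry, mechanical.
X = rng.poisson(lam=[500, 300, 200], size=(n, 3)).astype(float)
# Synthetic GDP, weighted most heavily on electrical engineering to
# mirror the study's qualitative finding.
y = 2.0 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 50, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print("training R^2:", round(model.score(X, y), 3))
```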

  20. Social media fingerprints of unemployment.

    PubMed

    Llorente, Alejandro; Garcia-Herranz, Manuel; Cebrian, Manuel; Moro, Esteban

    2015-01-01

    Recent widespread adoption of electronic and pervasive technologies has enabled the study of human behavior at an unprecedented level, uncovering universal patterns underlying human activity, mobility, and interpersonal communication. In the present work, we investigate whether deviations from these universal patterns may reveal information about the socio-economic status of geographical regions. We quantify the extent to which deviations in diurnal rhythm, mobility patterns, and communication styles across regions relate to their unemployment incidence. For this we examine a country-scale publicly articulated social media dataset, where we quantify individual behavioral features from over 19 million geo-located messages distributed among more than 340 different Spanish economic regions, inferred by computing communities of cohesive mobility fluxes. We find that regions exhibiting more diverse mobility fluxes, earlier diurnal rhythms, and more correct grammatical styles display lower unemployment rates. As a result, we provide a simple model able to produce accurate, easily interpretable reconstructions of regional unemployment incidence from their social-media digital fingerprints alone. Our results show that cost-effective economic indicators can be built based on publicly available social media datasets.
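
    The reconstruction step the abstract describes, mapping regional behavioral features to unemployment, amounts to a simple regression. The sketch below shows that idea on synthetic data; the features and coefficients are placeholders, not the paper's fitted model.

```python
# Minimal sketch: linear reconstruction of regional unemployment from
# behavioral features (synthetic stand-ins for the paper's data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_regions = 340
mobility_diversity = rng.normal(0, 1, n_regions)
early_rhythm = rng.normal(0, 1, n_regions)
correct_grammar = rng.normal(0, 1, n_regions)
X = np.column_stack([mobility_diversity, early_rhythm, correct_grammar])
# Synthetic target: unemployment is lower where the three features are
# higher, mirroring the paper's qualitative finding.
unemployment = (20 - 2.0 * mobility_diversity - 1.5 * early_rhythm
                - 1.0 * correct_grammar + rng.normal(0, 2, n_regions))

model = LinearRegression().fit(X, unemployment)
print("fitted coefficients:", model.coef_.round(2))
```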

  1. Social Media Fingerprints of Unemployment

    PubMed Central

    Llorente, Alejandro; Garcia-Herranz, Manuel; Cebrian, Manuel; Moro, Esteban

    2015-01-01

    Recent widespread adoption of electronic and pervasive technologies has enabled the study of human behavior at an unprecedented level, uncovering universal patterns underlying human activity, mobility, and interpersonal communication. In the present work, we investigate whether deviations from these universal patterns may reveal information about the socio-economic status of geographical regions. We quantify the extent to which deviations in diurnal rhythm, mobility patterns, and communication styles across regions relate to their unemployment incidence. For this we examine a country-scale publicly articulated social media dataset, where we quantify individual behavioral features from over 19 million geo-located messages distributed among more than 340 different Spanish economic regions, inferred by computing communities of cohesive mobility fluxes. We find that regions exhibiting more diverse mobility fluxes, earlier diurnal rhythms, and more correct grammatical styles display lower unemployment rates. As a result, we provide a simple model able to produce accurate, easily interpretable reconstructions of regional unemployment incidence from their social-media digital fingerprints alone. Our results show that cost-effective economic indicators can be built based on publicly available social media datasets. PMID:26020628

  2. Measuring direct and indirect costs of land retirement in an irrigated river basin: A budgeting regional multiplier approach

    NASA Astrophysics Data System (ADS)

    Hamilton, Joel; Whittlesey, Norman K.; Robison, M. Henry; Willis, David

    2002-08-01

    This analysis addresses three important conceptual problems in the measurement of direct and indirect costs and benefits: (1) the distribution of impacts between a regional economy and the encompassing state economy; (2) the distinction between indirect impacts and indirect costs (IC), focusing on the dynamic time path unemployed resources follow to find alternative employment; and (3) the distinction among the affected firms' microeconomic categories of fixed and variable costs as they are used to compute regional direct and indirect costs. It uses empirical procedures that reconcile the usual measures of economic impact provided by input/output models with the estimates of economic costs and benefits required for analysis of welfare changes. The paper illustrates the relationships and magnitudes involved in the context of water policy issues facing the Pecos River Basin of New Mexico.

  3. THE PRODUCTION AND EVALUATION OF THREE COMPUTER-BASED ECONOMICS GAMES FOR THE SIXTH GRADE. FINAL REPORT.

    ERIC Educational Resources Information Center

    WING, RICHARD L.; AND OTHERS

    The purpose of the experiment was to produce and evaluate 3 computer-based economics games as a method of individualizing instruction for grade 6 students. 26 experimental subjects played 2 economics games, while a control group received conventional instruction on similar material. In the Sumerian game, students seated at the typewriter terminals…

  4. Computational investigation of fluid flow and heat transfer of an economizer by porous medium approach

    NASA Astrophysics Data System (ADS)

    Babu, C. Rajesh; Kumar, P.; Rajamohan, G.

    2017-07-01

    Computation of fluid flow and heat transfer in an economizer is simulated by a porous medium approach, with plain tubes in a horizontal in-line arrangement and cross-flow arrangement in a coal-fired thermal power plant. The economizer is a thermo-mechanical device that captures waste heat from the exhaust flue gases through heat transfer surfaces to preheat boiler feed water. In order to evaluate the fluid flow and heat transfer over the tubes, a numerical analysis of heat transfer performance is carried out on a 110 t/h MCR (maximum continuous rating) boiler unit. In this study, thermal performance is investigated using computational fluid dynamics (CFD) simulation with ANSYS FLUENT. The fouling factor ε and the overall heat transfer coefficient ψ are employed to evaluate the fluid flow and heat transfer. The model demands significant computational detail in geometric modeling, grid generation, and numerical calculation to evaluate the thermal performance of an economizer. The simulation results show that the overall heat transfer coefficient of 37.76 W/(m²K) and the economizer coil-side pressure drop of 0.2 kg/cm² are in conformity with tolerable limits when compared with existing industrial economizer data.

  5. Counting Jobs and Economic Impacts from Distributed Wind in the United States (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tegen, S.

    This conference poster describes the distributed wind Jobs and Economic Development Impacts (JEDI) model. The goal of this work is to provide a model that estimates jobs and other economic effects associated with the domestic distributed wind industry. The distributed wind JEDI model is a free input-output model that estimates employment and other impacts resulting from an investment in distributed wind installations. Default inputs come from installers and industry experts and are based on existing projects. User input can be minimal (use defaults) or very detailed for more precise results. JEDI can help evaluate potential scenarios, current or future; inform stakeholders and decision-makers; assist businesses in evaluating economic development impacts and estimating jobs; and assist government organizations with planning, evaluation, and community development.

  6. Large-scale-system effectiveness analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Foster, J.W.

    1979-11-01

    The objective of the research project has been the investigation and development of methods for calculating system reliability indices that have absolute, and measurable, significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization that includes the economic consequences of consumer service interruptions. A further area of investigation has been the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance have been developed, and the application of modern Monte Carlo simulation methods to compute reliability indices in generating systems has been studied.
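
    As a concrete illustration of the Monte Carlo computation of a generation-system reliability index, the sketch below estimates loss-of-load probability (LOLP) by sampling unit outages and load. The unit sizes, forced outage rates, and load model are illustrative assumptions, not the report's system data.

```python
# Minimal sketch: Monte Carlo estimate of loss-of-load probability.
import numpy as np

rng = np.random.default_rng(42)
capacities = np.array([200, 200, 150, 100, 100])          # MW per unit
forced_outage_rate = np.array([0.05, 0.05, 0.08, 0.10, 0.10])

n_trials = 100_000
# Each trial: units are independently available or on forced outage,
# and the system load is drawn from a simple normal load model.
available = rng.random((n_trials, capacities.size)) > forced_outage_rate
capacity = (available * capacities).sum(axis=1)
load = rng.normal(550, 60, n_trials)                      # MW

lolp = (capacity < load).mean()
print(f"estimated LOLP: {lolp:.4f}")
```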

  7. 20 CFR 901.11 - Enrollment procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Examples include economics, computer programming, pension accounting, investment and finance, risk theory ... Columbia responsible for the issuance of a license in the field of actuarial science, insurance, accounting ...

  8. Environmental and socio-economic risk modelling for Chagas disease in Bolivia.

    PubMed

    Mischler, Paula; Kearney, Michael; McCarroll, Jennifer C; Scholte, Ronaldo G C; Vounatsou, Penelope; Malone, John B

    2012-09-01

    Accurately defining disease distributions and calculating disease risk is an important step in the control and prevention of diseases. Geographical information systems (GIS) and remote sensing technologies, with maximum entropy (Maxent) ecological niche modelling computer software, were used to create predictive risk maps for Chagas disease in Bolivia. Prevalence rates were calculated from 2007 to 2009 household infection survey data for Bolivia, while environmental data were compiled from the Worldclim database and MODIS satellite imagery. Socio-economic data were obtained from the Bolivian National Institute of Statistics. Disease models identified altitudes at 500-3,500 m above the mean sea level (MSL), low annual precipitation (45-250 mm), and a higher diurnal range of temperature (10-19 °C; peak 16 °C) as compatible with the biological requirements of the insect vectors. Socio-economic analyses demonstrated the importance of improved housing materials and water source. Adobe home wall materials and fetching drinking water from rivers or from wells without a pump were found to be highly related to the distribution of the disease, as measured by the receiver operating characteristic (ROC) area under the curve (AUC) (0.69, 0.67 and 0.62, respectively), while areas with hardwood floors demonstrated a direct negative relationship (-0.71). This study demonstrates that Maxent modelling can be used in disease prevalence and incidence studies to provide governmental agencies with an easily learned, understandable method to define areas as either high, moderate or low risk for the disease. This information may be used in resource planning, targeting and implementation. However, access to high-resolution, sub-municipality socio-economic data (e.g. census tracts) would facilitate elucidation of the relative influence of poverty-related factors on regional disease dynamics.

  9. Using a Simple Neural Network to Delineate Some Principles of Distributed Economic Choice.

    PubMed

    Balasubramani, Pragathi P; Moreno-Bote, Rubén; Hayden, Benjamin Y

    2018-01-01

    The brain uses a mixture of distributed and modular organization to perform computations and generate appropriate actions. While the principles under which the brain might perform computations using modular systems have been more amenable to modeling, the principles by which the brain might make choices using distributed principles have not been explored. Our goal in this perspective is to delineate some of those distributed principles using a neural network method and use its results as a lens through which to reconsider some previously published neurophysiological data. To allow for direct comparison with our own data, we trained the neural network to perform binary risky choices. We find that value correlates are ubiquitous and are always accompanied by non-value information, including spatial information (i.e., no pure value signals). Evaluation, comparison, and selection were not distinct processes; indeed, value signals even in the earliest stages contributed directly, albeit weakly, to action selection. There was no place, other than at the level of action selection, at which dimensions were fully integrated. No units were specialized for specific offers; rather, all units encoded the values of both offers in an anti-correlated format, thus contributing to comparison. Individual network layers corresponded to stages in a continuous rotation from input to output space rather than to functionally distinct modules. While our network is likely to not be a direct reflection of brain processes, we propose that these principles should serve as hypotheses to be tested and evaluated for future studies.

  10. Using a Simple Neural Network to Delineate Some Principles of Distributed Economic Choice

    PubMed Central

    Balasubramani, Pragathi P.; Moreno-Bote, Rubén; Hayden, Benjamin Y.

    2018-01-01

    The brain uses a mixture of distributed and modular organization to perform computations and generate appropriate actions. While the principles under which the brain might perform computations using modular systems have been more amenable to modeling, the principles by which the brain might make choices using distributed principles have not been explored. Our goal in this perspective is to delineate some of those distributed principles using a neural network method and use its results as a lens through which to reconsider some previously published neurophysiological data. To allow for direct comparison with our own data, we trained the neural network to perform binary risky choices. We find that value correlates are ubiquitous and are always accompanied by non-value information, including spatial information (i.e., no pure value signals). Evaluation, comparison, and selection were not distinct processes; indeed, value signals even in the earliest stages contributed directly, albeit weakly, to action selection. There was no place, other than at the level of action selection, at which dimensions were fully integrated. No units were specialized for specific offers; rather, all units encoded the values of both offers in an anti-correlated format, thus contributing to comparison. Individual network layers corresponded to stages in a continuous rotation from input to output space rather than to functionally distinct modules. While our network is likely to not be a direct reflection of brain processes, we propose that these principles should serve as hypotheses to be tested and evaluated for future studies. PMID:29643773

  11. Benford's law and the FSD distribution of economic behavioral micro data

    NASA Astrophysics Data System (ADS)

    Villas-Boas, Sofia B.; Fu, Qiuzi; Judge, George

    2017-11-01

    In this paper, we focus on the first significant digit (FSD) distribution of European micro income data and use information-theoretic, entropy-based methods to investigate the degree to which Benford's FSD law is consistent with the nature of these economic behavioral systems. We demonstrate that Benford's law is not an empirical phenomenon that occurs only in important distributions in physical statistics, but that it also arises in self-organizing dynamic economic behavioral systems. The empirical likelihood member of the minimum divergence-entropy family is used to recover country-based income FSD probability density functions and to demonstrate the implications of using a Benford prior reference distribution in economic behavioral system information recovery.
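
    Benford's law predicts first-significant-digit frequencies P(d) = log10(1 + 1/d). The sketch below compares a heavy-tailed synthetic sample against that benchmark; the lognormal draw is a stand-in for the European micro income data, and the paper's entropy-based estimators are not reproduced here.

```python
# Minimal sketch: empirical FSD frequencies versus Benford's law.
import numpy as np

def first_digits(x):
    """First significant digit of each positive value."""
    x = np.abs(np.asarray(x, dtype=float))
    return (x / 10.0 ** np.floor(np.log10(x))).astype(int)

benford = np.log10(1 + 1 / np.arange(1, 10))    # P(d) = log10(1 + 1/d)
sample = np.random.default_rng(0).lognormal(mean=10, sigma=1.2,
                                            size=50_000)
observed = np.bincount(first_digits(sample), minlength=10)[1:] / sample.size

for d in range(1, 10):
    print(d, f"benford={benford[d - 1]:.3f}",
          f"observed={observed[d - 1]:.3f}")
```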

  12. A study using a Monte Carlo method of the optimal configuration of a distribution network in terms of power loss sensing.

    PubMed

    Moon, Hyun Ho; Lee, Jong Joo; Choi, Sang Yule; Cha, Jae Sang; Kang, Jang Mook; Kim, Jong Tae; Shin, Myong Chul

    2011-01-01

    Recently there have been many studies of power systems with a focus on "New and Renewable Energy" as part of the "New Growth Engine Industry" promoted by the Korean government. "New and Renewable Energy," especially wind energy, solar energy, and fuel cells that will replace conventional fossil fuels, is part of the Power-IT sector, which is the basis of the SmartGrid. A SmartGrid is a form of highly efficient intelligent electricity network that allows interactivity (two-way communications) between suppliers and consumers by utilizing information technology in electricity production, transmission, distribution, and consumption. The New and Renewable Energy Program has been driven, through intensive studies by public and private institutions, with the goal of developing and spreading new and renewable energy sources which, unlike conventional systems, are operated through connections with various kinds of distributed power generation systems. Considerable research on smart grids has been pursued in the United States and Europe. In the United States, a variety of research activities on the smart power grid have been conducted within EPRI's IntelliGrid research program. The European Union (EU), which represents Europe's Smart Grid policy, has focused on an expansion of distributed (decentralized) generation and power trade between countries with improved environmental protection. Thus, there is current emphasis on the need for studies that assess the economic efficiency of such distributed generation systems. In this paper, based on the cost of distributed power generation capacity, the best obtainable profits were calculated by a Monte Carlo simulation. The Monte Carlo simulations, which rely on repeated random sampling, take into account the cost of electricity production, daily loads, and the cost of sales, and generate a result faster than closed-form mathematical computation. In addition, we suggest an optimal design that considers the distribution loss associated with power distribution systems, with a focus on power-loss sensing, and distributed power generation.
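
    A minimal Monte Carlo profit calculation in the spirit described, sampling daily loads and production costs against a fixed capacity cost, is sketched below. All prices, costs, and load statistics are illustrative assumptions, not the paper's system data.

```python
# Minimal sketch: Monte Carlo annual-profit estimate for a distributed
# generator (all numbers illustrative).
import numpy as np

rng = np.random.default_rng(7)
n_runs, n_days = 5_000, 365

capacity_kw = 500
capacity_cost_per_day = 120.0               # amortized capital, $/day
price_per_kwh = 0.15                        # sales price, $/kWh
prod_cost_per_kwh = rng.normal(0.08, 0.01, (n_runs, n_days))
daily_energy_kwh = rng.normal(8_000, 1_500, (n_runs, n_days))
daily_energy_kwh = daily_energy_kwh.clip(0, 24 * capacity_kw)

daily_profit = ((price_per_kwh - prod_cost_per_kwh) * daily_energy_kwh
                - capacity_cost_per_day)
annual_profit = daily_profit.sum(axis=1)
print(f"mean annual profit: ${annual_profit.mean():,.0f}")
print(f"5th-95th percentile: ${np.percentile(annual_profit, 5):,.0f}"
      f" to ${np.percentile(annual_profit, 95):,.0f}")
```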

  13. What might we learn from climate forecasts?

    PubMed Central

    Smith, Leonard A.

    2002-01-01

    Most climate models are large dynamical systems involving a million (or more) variables on big computers. Given that they are nonlinear and not perfect, what can we expect to learn from them about the earth's climate? How can we determine which aspects of their output might be useful and which are noise? And how should we distribute resources between making them “better,” estimating variables of true social and economic interest, and quantifying how good they are at the moment? Just as “chaos” prevents accurate weather forecasts, so model error precludes accurate forecasts of the distributions that define climate, yielding uncertainty of the second kind. Can we estimate the uncertainty in our uncertainty estimates? These questions are discussed. Ultimately, all uncertainty is quantified within a given modeling paradigm; our forecasts need never reflect the uncertainty in a physical system. PMID:11875200

  14. Two statistical mechanics aspects of complex networks

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Biely, Christoly

    2006-12-01

    By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes' linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., 'phase transitions' and to compute entropies through thermodynamic relations.

  15. Improving Search Algorithms by Using Intelligent Coordinates

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Tumer, Kagan; Bandari, Esfandiar

    2004-01-01

    We consider algorithms that maximize a global function G in a distributed manner, using a different adaptive computational agent to set each variable of the underlying space. Each agent eta is self-interested; it sets its variable to maximize its own function g_eta. Three factors govern such a distributed algorithm's performance, related to exploration/exploitation, game theory, and machine learning. We demonstrate how to exploit all three factors by modifying a search algorithm's exploration stage: rather than random exploration, each coordinate of the search space is now controlled by a separate machine-learning-based player engaged in a noncooperative game. Experiments demonstrate that this modification improves simulated annealing (SA) by up to an order of magnitude for bin packing and for a model of an economic process run over an underlying network. These experiments also reveal interesting small-world phenomena.

  16. Improving search algorithms by using intelligent coordinates

    NASA Astrophysics Data System (ADS)

    Wolpert, David; Tumer, Kagan; Bandari, Esfandiar

    2004-01-01

    We consider algorithms that maximize a global function G in a distributed manner, using a different adaptive computational agent to set each variable of the underlying space. Each agent η is self-interested; it sets its variable to maximize its own function gη. Three factors govern such a distributed algorithm’s performance, related to exploration/exploitation, game theory, and machine learning. We demonstrate how to exploit all three factors by modifying a search algorithm’s exploration stage: rather than random exploration, each coordinate of the search space is now controlled by a separate machine-learning-based “player” engaged in a noncooperative game. Experiments demonstrate that this modification improves simulated annealing (SA) by up to an order of magnitude for bin packing and for a model of an economic process run over an underlying network. These experiments also reveal interesting small-world phenomena.
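
    For reference, the sketch below implements the plain simulated-annealing baseline for bin packing with random exploration moves; the paper's actual contribution, replacing that random exploration with machine-learning-based players in a noncooperative game, is not reproduced here. Instance data and the cooling schedule are illustrative.

```python
# Minimal sketch: simulated-annealing baseline for bin packing.
import numpy as np

rng = np.random.default_rng(2)
items = rng.uniform(0.1, 0.7, 60)           # item sizes
capacity, n_bins = 1.0, 40

def cost(assign):
    """Bins used plus a stiff penalty for overfilled bins."""
    loads = np.bincount(assign, weights=items, minlength=n_bins)
    overflow = np.clip(loads - capacity, 0, None).sum()
    return (loads > 0).sum() + 50.0 * overflow

assign = rng.integers(n_bins, size=items.size)
c, temp = cost(assign), 2.0
for _ in range(20_000):
    i = rng.integers(items.size)            # random exploration move
    old = assign[i]
    assign[i] = rng.integers(n_bins)
    c_new = cost(assign)
    if c_new > c and rng.random() > np.exp((c - c_new) / temp):
        assign[i] = old                     # Metropolis rejection
    else:
        c = c_new
    temp *= 0.9997                          # geometric cooling
loads = np.bincount(assign, weights=items, minlength=n_bins)
print("bins used:", int((loads > 0).sum()))
```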

  17. Theory and Experiment of Multielement Airfoils: A Comparison

    NASA Technical Reports Server (NTRS)

    Czerwiec, Ryan; Edwards, J. R.; Rumsey, C. L.; Hassan, H. A.

    2000-01-01

    A detailed comparison of computed and measured pressure distributions, velocity profiles, transition onset, and Reynolds shear stresses for multi-element airfoils is presented. It is shown that the transitional k-zeta model, which is implemented in CFL3D, does a good job of predicting pressure distributions, transition onset, and velocity profiles, with the exception of velocities in the slat wake region. Considering the fact that the hot wire used was not fine enough to resolve Reynolds stresses in the boundary layer, comparisons of turbulence stresses varied from good to fair. It is suggested that the effects of unsteadiness be thoroughly evaluated before more complicated transition/turbulence models are used. Further, it is concluded that the present work presents a viable and economical method for calculating laminar/transitional/turbulent flows over complex shapes without user interaction.

  18. NEUROBIOLOGY OF ECONOMIC CHOICE: A GOOD-BASED MODEL

    PubMed Central

    Padoa-Schioppa, Camillo

    2012-01-01

    Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and I propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensori-motor contingencies of choice. Evidence from neurophysiology, imaging and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions. PMID:21456961

  19. A Survey of Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Wolpert, David

    2004-01-01

    Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing/designing such systems is in determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey of the emerging science of collectives.

  20. Modeling of Subsurface Lagrangian Sensor Swarms for Spatially Distributed Current Measurements in High Energy Coastal Environments

    NASA Astrophysics Data System (ADS)

    Harrison, T. W.; Polagye, B. L.

    2016-02-01

    Coastal ecosystems are characterized by spatially and temporally varying hydrodynamics. In marine renewable energy applications, these variations strongly influence project economics, and in oceanographic studies they impact the accuracy of biological transport and pollutant dispersion models. While stationary point or profile measurements are relatively straightforward, the spatial representativeness of point measurements can be poor due to strong gradients. Moving platforms, such as AUVs or surface vessels, offer better coverage, but suffer from energetic constraints (AUVs) and resolvable scales (vessels). A system of sub-surface, drifting sensor packages is being developed to provide spatially distributed, synoptic data sets of coastal hydrodynamics with meter-scale resolution over a regional extent of a kilometer. Computational investigation has informed system parameters such as drifter size and shape, necessary position accuracy, number of drifters, and deployment methods. A hydrodynamic domain with complex flow features was created using a computational fluid dynamics code, and a simple model of drifter dynamics propagates the drifters through the domain in post-processing. System parameters are evaluated relative to their ability to accurately recreate the domain hydrodynamics. Implications of these results for an inexpensive, depth-controlled Lagrangian drifter system are presented.

  1. Metrics for Uncertainty in Organizational Decision-Making

    DTIC Science & Technology

    2006-06-01

    [Record text is garbled; only citation fragments survive, including Chen, Jain & Tai (eds.), Computational Economics: A Perspective from Computational Intelligence, and Von Neumann & Morgenstern (1953), Theory of Games and Economic Behavior.]

  2. Capability of the People’s Republic of China to Conduct Cyber Warfare and Computer Network Exploitation

    DTIC Science & Technology

    2009-10-09

    Report prepared for the US-China Economic and Security Review Commission. [Remainder of record consists of report-form boilerplate repeating the title.]

  3. Visualization of logistic algorithm in Wilson model

    NASA Astrophysics Data System (ADS)

    Glushchenko, A. S.; Rodin, V. A.; Sinegubov, S. V.

    2018-05-01

    Economic order quantity (EOQ), defined by Wilson's model, is widely used at different stages of the production and distribution of different products. It is useful for making inventory-management decisions, providing more efficient business operation and thus bringing more economic benefits. There is a large amount of reference material, and extensive software environments exist that help solve various logistics problems. However, the use of large computing environments is not always justified and requires special user training. A tight supply schedule in a logistics model is optimal if, and only if, the planning horizon coincides with the beginning of the next possible delivery. For all other possible planning horizons, this plan is not optimal. Significantly, when the planning horizon changes, the plan changes immediately throughout the entire supply chain. In this paper, an algorithm and a program for visualizing models of the optimal supply quantity and the number of deliveries, depending on the length of the planning horizon, have been developed. The program allows one to trace (visually and quickly) all main parameters of the optimal plan on charts. The results of the paper represent a part of the authors' research work in the field of optimization of protection and support services of ports in the Russian North.
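
    The underlying Wilson model gives the economic order quantity in closed form, Q* = sqrt(2DS/H), where D is demand per period, S the fixed cost per order, and H the holding cost per unit per period. A minimal sketch with illustrative numbers:

```python
# Minimal sketch: the classical Wilson EOQ formula.
from math import sqrt

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Order size minimizing ordering plus holding cost under
    constant demand: Q* = sqrt(2 * D * S / H)."""
    return sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

D, S, H = 12_000, 80.0, 2.5     # units/year, $/order, $/unit/year
q = eoq(D, S, H)
print(f"optimal order quantity: {q:.0f} units")
print(f"orders per year: {D / q:.1f}")
```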

  4. Economic preparation of the environment: A selective empirical analysis of chinese investment in the Philippines

    DTIC Science & Technology

    2017-06-01

    Naval Postgraduate School, Monterey, California, master's thesis (June 2017); approved for public release, distribution unlimited. From the abstract: "Over the past decade, the People's Republic of China has increasingly used its economic might..." [Remainder of record consists of report-form boilerplate.]

  5. Computer programs for estimating civil aircraft economics

    NASA Technical Reports Server (NTRS)

    Maddalon, D. V.; Molloy, J. K.; Neubawer, M. J.

    1980-01-01

    Computer programs for calculating airline direct operating cost, indirect operating cost, and return on investment were developed to provide a means for determining commercial aircraft life cycle cost and economic performance. A representative wide body subsonic jet aircraft was evaluated to illustrate use of the programs.

  6. The development of computer networks: First results from a microeconomic model

    NASA Astrophysics Data System (ADS)

    Maier, Gunther; Kaufmann, Alexander

    Computer networks like the Internet are gaining importance in social and economic life. The accelerating pace of the adoption of network technologies for business purposes is a rather recent phenomenon, and many applications are still in an early, sometimes even experimental, phase. Nevertheless, it seems certain that networks will change the socioeconomic structures we know today. This is the background for our special interest in the development of networks, in the role of spatial factors influencing the formation of networks and the consequences of networks for spatial structures, and in the role of externalities. This paper discusses a simple economic model, based on a microeconomic calculus, that incorporates the main factors generating the growth of computer networks. It provides analytic results about the generation of computer networks, discussing (1) under what conditions economic factors will initiate the process of network formation, (2) the relationship between individual and social evaluation, and (3) the efficiency of a network that is generated based on economic mechanisms.

  7. Distributed Network, Wireless and Cloud Computing Enabled 3-D Ultrasound; a New Medical Technology Paradigm

    PubMed Central

    Meir, Arie; Rubinsky, Boris

    2009-01-01

    Medical technologies are indispensable to modern medicine. However, they have become exceedingly expensive and complex and are not available to the economically disadvantaged majority of the world population in underdeveloped as well as developed parts of the world. For example, according to the World Health Organization about two thirds of the world population does not have access to medical imaging. In this paper we introduce a new medical technology paradigm centered on wireless technology and cloud computing that was designed to overcome the problems of increasing health technology costs. We demonstrate the value of the concept with an example; the design of a wireless, distributed network and central (cloud) computing enabled three-dimensional (3-D) ultrasound system. Specifically, we demonstrate the feasibility of producing a 3-D high end ultrasound scan at a central computing facility using the raw data acquired at the remote patient site with an inexpensive low end ultrasound transducer designed for 2-D, through a mobile device and wireless connection link between them. Producing high-end 3D ultrasound images with simple low-end transducers reduces the cost of imaging by orders of magnitude. It also removes the requirement of having a highly trained imaging expert at the patient site, since the need for hand-eye coordination and the ability to reconstruct a 3-D mental image from 2-D scans, which is a necessity for high quality ultrasound imaging, is eliminated. This could enable relatively untrained medical workers in developing nations to administer imaging and a more accurate diagnosis, effectively saving the lives of people. PMID:19936236

  8. Distributed network, wireless and cloud computing enabled 3-D ultrasound; a new medical technology paradigm.

    PubMed

    Meir, Arie; Rubinsky, Boris

    2009-11-19

    Medical technologies are indispensable to modern medicine. However, they have become exceedingly expensive and complex and are not available to the economically disadvantaged majority of the world population in underdeveloped as well as developed parts of the world. For example, according to the World Health Organization about two thirds of the world population does not have access to medical imaging. In this paper we introduce a new medical technology paradigm centered on wireless technology and cloud computing that was designed to overcome the problems of increasing health technology costs. We demonstrate the value of the concept with an example; the design of a wireless, distributed network and central (cloud) computing enabled three-dimensional (3-D) ultrasound system. Specifically, we demonstrate the feasibility of producing a 3-D high end ultrasound scan at a central computing facility using the raw data acquired at the remote patient site with an inexpensive low end ultrasound transducer designed for 2-D, through a mobile device and wireless connection link between them. Producing high-end 3D ultrasound images with simple low-end transducers reduces the cost of imaging by orders of magnitude. It also removes the requirement of having a highly trained imaging expert at the patient site, since the need for hand-eye coordination and the ability to reconstruct a 3-D mental image from 2-D scans, which is a necessity for high quality ultrasound imaging, is eliminated. This could enable relatively untrained medical workers in developing nations to administer imaging and a more accurate diagnosis, effectively saving the lives of people.

  9. Worth of data and natural disaster insurance

    USGS Publications Warehouse

    Attanasi, E.D.; Karlinger, M.R.

    1979-01-01

    The Federal Government in the past has provided medical and economic aid to victims of earthquakes and floods. However, regulating the use of hazard-prone areas would probably be more efficient. One way to implement such land use regulation is through the national flood and earthquake insurance program. Because insurance firms base their premium rates on available information, the benefit of additional data used to improve parameter estimates of the probability distribution governing actual disaster events can be measured through the resulting changes in premiums. An insurance firm is assumed to set rates so as to trade off the penalties of overestimating and underestimating expected damages. A Bayesian preposterior analysis is applied to determine the worth of additional data, as measured by changes in consumers' surplus, by examining the effects of changes in premiums as a function of a longer hydrologic record.
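
    The preposterior logic can be illustrated with a toy model: place a Beta prior on the annual flood probability, simulate hypothetical additional years of record, and measure how much the premium (here simply expected damages) is expected to move. The prior, damage value, and premium rule below are illustrative assumptions, not the authors' model.

```python
# Minimal sketch: expected premium revision from n extra years of
# record, under a Beta-Binomial flood model (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
a0, b0 = 2.0, 38.0            # prior: mean flood probability 0.05
damage = 1_000_000.0          # loss if a flood occurs in a year

def expected_premium_shift(n_extra, n_draws=100_000):
    """Mean absolute premium change after observing n_extra years."""
    p = rng.beta(a0, b0, n_draws)        # candidate true probabilities
    floods = rng.binomial(n_extra, p)    # hypothetical new records
    post_mean = (a0 + floods) / (a0 + b0 + n_extra)
    prior_mean = a0 / (a0 + b0)
    return damage * np.abs(post_mean - prior_mean).mean()

for n in (10, 25, 50):
    print(f"{n} extra years: expected shift "
          f"${expected_premium_shift(n):,.0f}")
```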

  10. ASCR Cybersecurity for Scientific Computing Integrity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piesert, Sean

    The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE's enterprise involves distributed, collaborative teams; a significant fraction involves "open science," which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.

  11. Alexander Disease

    MedlinePlus

    ... there are no ethnic, racial, geographic, or cultural/economic differences in its distribution. Alexander disease is a ...

  12. The Laboratory-Based Economics Curriculum.

    ERIC Educational Resources Information Center

    King, Paul G.; LaRoe, Ross M.

    1991-01-01

    Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…

  13. Economy Over Security: Why Crises Fail to Impact Economic Behavior in East Asia

    DTIC Science & Technology

    2017-12-01

    Naval Postgraduate School master's thesis by Aaron R. Sipos (December 2017; thesis advisor: Michael Glosny); approved for public release, distribution unlimited. From the abstract: "This study examines changes in economic behavior in..." [Remainder of record consists of report-form boilerplate.]

  14. Rigorous Results for the Distribution of Money on Connected Graphs

    NASA Astrophysics Data System (ADS)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
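
    The first conjecture is easy to probe numerically. The sketch below simulates the global-interaction uniform reshuffling model, in which two random agents repeatedly pool their money and split it uniformly at random, and checks the exponential signature that the sample mean and standard deviation coincide. Parameters are illustrative; the simulation is no substitute for the paper's proofs.

```python
# Minimal sketch: uniform reshuffling model of money exchange.
import numpy as np

rng = np.random.default_rng(0)
n_agents, mean_money, n_steps = 10_000, 100.0, 500_000

money = np.full(n_agents, mean_money)
for _ in range(n_steps):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    pot = money[i] + money[j]
    u = rng.random()                    # uniform split of the pot
    money[i], money[j] = u * pot, (1 - u) * pot

# For an exponential distribution, mean and standard deviation agree.
print("mean:", round(money.mean(), 1), "std:", round(money.std(), 1))
```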

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S; Bugbee, Bruce; Gotseff, Peter

    Capturing technical and economic impacts of solar photovoltaics (PV) and other distributed energy resources (DERs) on electric distribution systems can require high-time-resolution (e.g., 1 minute), long-duration (e.g., 1 year) simulations. However, such simulations can be computationally prohibitive, particularly when including complex control schemes in quasi-steady-state time series (QSTS) simulation. Various approaches have been used in the literature to down-select representative time segments (e.g., days), but typically these are best suited for lower time resolutions or consider only a single data stream (e.g., PV production) for selection. We present a statistical approach that combines stratified sampling and bootstrapping to select representative days while also providing a simple method to reassemble annual results. We describe the approach in the context of a recent study with a utility partner. This approach enables much faster QSTS analysis by simulating only a subset of days, while maintaining accurate annual estimates.
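
    A minimal sketch of the stratified-sampling idea follows, assuming a single stratification metric (daily PV energy), equal samples per stratum, and weighted reassembly of the annual total; the study's actual strata, metrics, and bootstrap procedure may differ.

```python
# Minimal sketch: pick representative days by stratified sampling and
# reassemble an annual estimate with stratum weights.
import numpy as np

rng = np.random.default_rng(3)
daily_pv_energy = rng.gamma(shape=4, scale=10, size=365)  # proxy metric

# Stratify the year into quartiles of the metric.
edges = np.quantile(daily_pv_energy, [0.25, 0.5, 0.75])
strata = np.digitize(daily_pv_energy, edges)              # values 0..3

days_per_stratum = 5
sampled, weights = [], []
for s in range(4):
    members = np.flatnonzero(strata == s)
    pick = rng.choice(members, size=days_per_stratum, replace=False)
    sampled.extend(pick)
    weights.extend([len(members) / days_per_stratum] * days_per_stratum)

# Simulate only the sampled days (stand-in for QSTS output), then
# weight back up to an annual total.
simulated = daily_pv_energy[np.array(sampled)]
annual_estimate = float(np.dot(simulated, weights))
print("true annual:", round(daily_pv_energy.sum(), 1),
      "estimate:", round(annual_estimate, 1))
```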

  16. Economic decision making and the application of nonparametric prediction models

    USGS Publications Warehouse

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2008-01-01

    Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright © 2008 Society of Petroleum Engineers.
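
    As a toy version of the prediction step, the sketch below uses a nearest-neighbor local average, a crude stand-in for the paper's nonparametric local regression, to estimate recoverable volume at an untested site from nearby tested sites. Coordinates and volumes are synthetic.

```python
# Minimal sketch: local (k-nearest-neighbor) prediction of recoverable
# volume at an untested site (synthetic data).
import numpy as np

rng = np.random.default_rng(5)
sites = rng.uniform(0, 100, (300, 2))      # tested-site coordinates
volumes = 50 + 0.5 * sites[:, 0] + rng.normal(0, 5, 300)

def local_estimate(xy, k=10):
    """Average volume of the k nearest tested sites."""
    d = np.linalg.norm(sites - xy, axis=1)
    return volumes[np.argsort(d)[:k]].mean()

print(round(local_estimate(np.array([70.0, 40.0])), 1))
```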

  17. A Study of Economical Incentives for Voltage Profile Control Method in Future Distribution Network

    NASA Astrophysics Data System (ADS)

    Tsuji, Takao; Sato, Noriyuki; Hashiguchi, Takuhei; Goda, Tadahiro; Tange, Seiji; Nomura, Toshio

    In a future distribution network, it will be difficult to maintain system voltage because a large number of distributed generators will be introduced to the system. The authors have proposed a "voltage profile control method" using power factor control of distributed generators in previous work. However, an economic disbenefit is caused by the decrease in active power when the power factor is controlled in order to increase the reactive power. Therefore, proper incentives must be given to the customers that cooperate with the voltage profile control method. Thus, in this paper, we develop a new rule that can determine the economic incentives given to those customers. The method is tested on a one-feeder distribution network model and its effectiveness is shown.

  18. Economics of Mass Media Health Campaigns with Health-Related Product Distribution: A Community Guide Systematic Review

    PubMed Central

    Jacob, Verughese; Chattopadhyay, Sajal K.; Elder, Randy W.; Robinson, Maren N.; Tansil, Kristin A.; Soler, Robin E.; Labre, Magdala P.; Mercer, Shawna L.

    2015-01-01

    Context: The objective of this systematic review was to determine the costs, benefits, and overall economic value of communication campaigns that included mass media and distribution of specified health-related products at reduced price or free of charge. Evidence Acquisition: Economic evaluation studies from a literature search from January 1980–December 2009 were screened and abstracted following systematic economic review methods developed by The Community Guide. Data were analyzed in 2011. Evidence Synthesis: The economic evidence was grouped and assessed by type of product distributed and health risk addressed. A total of 15 evaluation studies were included in the economic review, involving campaigns promoting the use of child car seats or booster seats, pedometers, condoms, recreational safety helmets, and nicotine replacement therapy (NRT). Conclusion: Economic merits of the intervention could not be determined for health communication campaigns associated with use of recreational helmets, child car seats, and pedometers, primarily because available economic information and analyses were incomplete. There is some evidence that campaigns with free condom distribution to promote safer sex practices were cost-effective among high-risk populations and the cost per quit achieved in campaigns promoting tobacco cessation with NRT products may translate to a cost per quality-adjusted life year (QALY) less than $50,000. Many interventions were publicly funded trials or programs, and the failure to properly evaluate their economic cost and benefit is a serious gap in the science and practice of public health. PMID:25145619

  19. Deep uncertainty and broad heterogeneity in country-level social cost of carbon

    NASA Astrophysics Data System (ADS)

    Ricke, K.; Drouet, L.; Caldeira, K.; Tavoni, M.

    2017-12-01

    The social cost of carbon (SCC) is a commonly employed metric of the economic damages expected from carbon dioxide (CO2) emissions. Recent estimates of SCC range from approximately $10/tonne of CO2 to as much as $1000/tCO2, but these have been computed at the global level. While useful in an optimal policy context, a world-level approach obscures the heterogeneous geography of climate damages and vast differences in country-level contributions to global SCC, as well as climate and socio-economic uncertainties, which are much larger at the regional level. For the first time, we estimate country-level contributions to SCC using recent climate and carbon-cycle model projections, empirical climate-driven economic damage estimations, and information from the Shared Socio-economic Pathways. Central specifications show high global SCC values (median: $417/tCO2; 66% confidence interval: $168-$793/tCO2) with country-level contributions ranging from -$11 (-$8 to -$14) /tCO2 to $86 ($50-$158) /tCO2. We quantify the climate-, scenario-, and economic-damage-driven uncertainties associated with the calculated values of SCC. We find that while the magnitude of the country-level social cost of carbon is highly uncertain, the relative positioning among countries is consistent. Countries incurring large fractions of the global cost include India, China, and the United States. The share of SCC distributed among countries is robust, indicating climate change winners and losers from a geopolitical perspective.

  20. Profit--The Key. Principles of the American Free Enterprise System for Distributive Education Students. Student Guide.

    ERIC Educational Resources Information Center

    Hook, Sallie A.; And Others

    In this distributive education student's guide on the free enterprise system, principles of the American economic system are introduced as they relate to marketing and distribution. This twenty-four-page document includes directions and an introduction to the material. Twelve "Principle Sheets" on various economic concepts contain…

  1. Computer Program for Assessing the Economic Feasibility of Solar Energy for Single Family Residences and Light Commercial Applications

    NASA Technical Reports Server (NTRS)

    Forney, J. A.; Walker, D.; Lanier, M.

    1979-01-01

    The computer program SHCOST was used to perform economic analyses of operational test sites. The program allows consideration of the economic parameters that are important to the solar system user. A life-cycle cost and cash-flow comparison is made between a solar heating system and a conventional system. The program also assists in sizing the solar heating system, and a sensitivity study and plot capability allow the user to select the most cost-effective system configuration.
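
    SHCOST itself is not reproduced here, but the core of a life-cycle cost and cash-flow comparison of the kind it performs can be sketched as follows; all dollar inputs and rates are hypothetical.

    ```python
    # Minimal life-cycle cost comparison between a solar heating system and a
    # conventional system, in the spirit of (but much simpler than) SHCOST.

    def life_cycle_cost(capital, annual_fuel, fuel_escalation, discount, years):
        """Present value of capital plus escalating annual fuel costs."""
        pv_fuel = sum(annual_fuel * (1 + fuel_escalation) ** t / (1 + discount) ** t
                      for t in range(1, years + 1))
        return capital + pv_fuel

    solar = life_cycle_cost(capital=9000, annual_fuel=250,
                            fuel_escalation=0.08, discount=0.06, years=20)
    conventional = life_cycle_cost(capital=2500, annual_fuel=1200,
                                   fuel_escalation=0.08, discount=0.06, years=20)
    print(f"solar LCC: ${solar:,.0f}  conventional LCC: ${conventional:,.0f}")
    ```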

  2. Some advanced parametric methods for assessing waveform distortion in a smart grid with renewable generation

    NASA Astrophysics Data System (ADS)

    Alfieri, Luisa

    2015-12-01

    Power quality (PQ) disturbances are becoming an important issue in smart grids (SGs) due to the significant economic consequences that they can have on sensitive loads. SGs include several distributed energy resources (DERs) that can be interconnected to the grid through static converters, which can reduce PQ levels. Among DERs, wind turbines and photovoltaic systems are expected to be used extensively due to the forecasted reduction in investment costs and other economic incentives. These systems can introduce significant time-varying voltage and current waveform distortions that require advanced spectral analysis methods. This paper provides an application of advanced parametric methods for assessing waveform distortions in SGs with dispersed generation. In particular, the standard International Electrotechnical Commission (IEC) method, some parametric methods (such as Prony and the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT)), and some hybrid methods are critically compared on the basis of their accuracy and the required computational effort.
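
    Of the parametric methods compared, ESPRIT is compact enough to sketch. The following is a minimal subspace implementation for a synthetic distorted waveform (50 Hz fundamental plus 5th and 7th harmonics); it illustrates the technique and is not the paper's code.

    ```python
    import numpy as np

    def esprit_freqs(x, p, fs, m=None):
        """Estimate p sinusoid frequencies in x (sample rate fs) via ESPRIT."""
        m = m or len(x) // 2                     # subspace (window) size
        # Hankel data matrix whose column space contains the signal subspace.
        H = np.lib.stride_tricks.sliding_window_view(x, m).T
        U, _, _ = np.linalg.svd(H, full_matrices=False)
        Us = U[:, :p]                            # signal subspace
        # Rotational invariance: Us without last row vs. without first row.
        psi = np.linalg.pinv(Us[:-1]) @ Us[1:]
        eigs = np.linalg.eigvals(psi)
        return np.sort(np.abs(np.angle(eigs))) * fs / (2 * np.pi)

    # Synthetic waveform: 50 Hz fundamental plus 5th and 7th harmonics.
    fs = 3200
    t = np.arange(0, 0.2, 1 / fs)
    x = (np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 250 * t)
         + 0.1 * np.sin(2 * np.pi * 350 * t))
    print(esprit_freqs(x, p=6, fs=fs))   # +/- frequency pairs for 3 tones
    ```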

  3. The Effects of Computer-Aided Instruction on Learning and Attitudes in Economic Principles Courses: Revised Results.

    ERIC Educational Resources Information Center

    Henry, Mark

    1979-01-01

    Recounts statistical inaccuracies in an article on computer-aided instruction in economics courses on the college level. The article, published in the J. Econ. Ed (Fall 1978), erroneously placed one student in the TIPS group instead of the control group. Implications of this alteration are discussed. (DB)

  4. Home Economics. Education for Technology Employment.

    ERIC Educational Resources Information Center

    Northern Illinois Univ., De Kalb. Dept. of Technology.

    This guide was developed in an Illinois program to help home economics teachers integrate the use of computers and program-related software into existing programs. After students are taught the basic computer skills outlined in the beginning of the guide, 50 learning activities can be used as an integral part of the instructional program. (One or…

  5. Computers in the Home Economics Classroom.

    ERIC Educational Resources Information Center

    Browning, Ruth; Durbin, Sandra

    This guide for teachers focuses on how microcomputers may be used in the home economics classroom and how the computer is affecting and changing family life. A brief discussion of potential uses of the microcomputer in educational settings is followed by seven major sections. Sections 1 and 2 provide illustrations and definitions for microcomputer…

  6. Evaluation of trade influence on economic growth rate by computational intelligence approach

    NASA Astrophysics Data System (ADS)

    Sokolov-Mladenović, Svetlana; Milovančević, Milos; Mladenović, Igor

    2017-01-01

    This study analyzed the influence of trade parameters on economic growth forecasting accuracy. A computational intelligence method was used for the analysis, since such methods can handle highly nonlinear data. It is known that economic growth can be modeled from different trade parameters. Five input parameters were considered: trade in services, exports of goods and services, imports of goods and services, trade, and merchandise trade, each expressed as a percentage of gross domestic product (GDP). The main goal was to select the parameters with the greatest impact on the economic growth forecast; GDP was used as the economic growth indicator. The results show that imports of goods and services have the highest influence on economic growth forecasting accuracy.
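
    The abstract does not specify which computational intelligence method was used, so the sketch below substitutes a generic nonlinear learner (a random forest) on synthetic data, merely to illustrate how the influence of the five trade predictors on growth can be ranked.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the five trade predictors (% of GDP) and for GDP
    # growth; the response is built so that imports dominate, mirroring the
    # paper's finding. This only illustrates how predictors can be ranked.
    names = ["trade in services", "exports", "imports", "trade",
             "merchandise trade"]
    X = rng.uniform(10, 80, size=(200, 5))
    y = 0.05 * X[:, 2] - 0.02 * X[:, 1] + rng.normal(0, 0.3, 200)

    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
    ranking = sorted(zip(names, model.feature_importances_), key=lambda p: -p[1])
    for name, importance in ranking:
        print(f"{name:18s} {importance:.3f}")
    ```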

  7. Socio-economics effect of the use of space distribution in the coastal of Kampung Nelayan Belawan Medan

    NASA Astrophysics Data System (ADS)

    O. Y Marpaung, Beny; Widyasari, Mutiara

    2018-03-01

    Kampung Nelayan Belawan Medan is an unplanned settlement located in a coastal area. The social life of its community, including livelihoods, activities, and behavior, can still be described as traditional, a situation reinforced by the lack of public facilities in the area. In addition, the economic life of the people in this region is far below standard: incomes are low, which in turn affects the use of space in the coastal area. This study aims to examine and identify the socio-economic impacts on communities of the distribution of land use and of activities in coastal areas. The research uses quantitative and qualitative descriptive methods. The researchers collected data through observation and distributed questionnaires, and then related the interpreted data to theory. The study characterizes the social and economic situation and its effect on the distribution of space usage in Kampung Nelayan Belawan Medan.

  8. A Study Using a Monte Carlo Method of the Optimal Configuration of a Distribution Network in Terms of Power Loss Sensing

    PubMed Central

    Moon, Hyun Ho; Lee, Jong Joo; Choi, Sang Yule; Cha, Jae Sang; Kang, Jang Mook; Kim, Jong Tae; Shin, Myong Chul

    2011-01-01

    Recently there have been many studies of power systems with a focus on "New and Renewable Energy" as part of the "New Growth Engine Industry" promoted by the Korean government. "New and Renewable Energy", especially wind energy, solar energy and fuel cells that will replace conventional fossil fuels, is part of the Power-IT sector, which is the basis of the SmartGrid. A SmartGrid is a form of highly efficient intelligent electricity network that allows interactivity (two-way communications) between suppliers and consumers by utilizing information technology in electricity production, transmission, distribution and consumption. The New and Renewable Energy Program has been driven, through intensive studies by public and private institutions, with the goal of developing and spreading new and renewable energy, which, unlike conventional systems, is operated through connections with various kinds of distributed power generation systems. Considerable research on smart grids has been pursued in the United States and Europe. In the United States, a variety of research activities on the smart power grid have been conducted within EPRI's IntelliGrid research program. The European Union (EU), which represents Europe's Smart Grid policy, has focused on an expansion of distributed generation (decentralized generation) and power trade between countries with improved environmental protection. Thus, there is a current need for studies that assess the economic efficiency of such distributed generation systems. In this paper, based on the cost of distributed power generation capacity, the best obtainable profits were calculated by Monte Carlo simulation. Monte Carlo simulations, which rely on repeated random sampling to compute their results, take into account the cost of electricity production, daily loads and the cost of sales, and generate a result faster than closed-form mathematical computation. In addition, we suggest an optimal design that considers the distribution losses associated with power distribution systems, with a focus on the sensing aspect, and distributed power generation. PMID:22164047
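
    The profit estimation described above can be illustrated with a bare-bones Monte Carlo loop; the load, cost and price figures below are hypothetical stand-ins, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000                                  # Monte Carlo trials (days)

    # Hypothetical distributed-generation economics: random daily energy sold
    # and production cost, fixed sales price and capacity cost.
    energy_sold = rng.normal(24.0, 4.0, n).clip(min=0)   # MWh/day
    price = 95.0                                         # $/MWh sales price
    production_cost = rng.normal(60.0, 8.0, n)           # $/MWh
    fixed_cost = 500.0                                   # $/day capacity cost

    profit = energy_sold * (price - production_cost) - fixed_cost
    print(f"expected daily profit: ${profit.mean():,.0f}")
    print(f"5th-95th percentile:   ${np.percentile(profit, 5):,.0f} "
          f"to ${np.percentile(profit, 95):,.0f}")
    ```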

  9. An economic growth model based on financial credits distribution to the government economy priority sectors of each regency in Indonesia using hierarchical Bayesian method

    NASA Astrophysics Data System (ADS)

    Yasmirullah, Septia Devi Prihastuti; Iriawan, Nur; Sipayung, Feronika Rosalinda

    2017-11-01

    The success of regional economic development can be measured by economic growth. Since Act No. 32 of 2004 was implemented, economic imbalance among regencies in Indonesia has been increasing. This condition runs contrary to the government's goal of building social welfare through economic development in each region. This research examines economic growth through the distribution of bank credits to each of Indonesia's regencies. The data analyzed are hierarchically structured and follow a normal distribution at the first level. Two modeling approaches are employed: a global one-level Bayesian approach and a two-level hierarchical Bayesian approach. The results show that the hierarchical Bayesian approach yields better estimates than the global one-level Bayesian approach. This indicates that the different economic growth of each province is significantly influenced by variations in micro-level characteristics within the province, which are in turn significantly affected by city and province characteristics at the second level.
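
    A minimal two-level (regency-within-province) model of the kind described can be sketched in PyMC; the data are synthetic and the specification is illustrative rather than the authors' exact model.

    ```python
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(1)

    # Synthetic stand-in data: regencies nested in 3 provinces; y is economic
    # growth, x is credit distributed. Structure and values are illustrative.
    province = np.repeat([0, 1, 2], 30)
    x = rng.normal(0, 1, 90)
    y = 0.5 + np.array([0.2, 0.8, 1.4])[province] * x + rng.normal(0, 0.3, 90)

    with pm.Model():
        # Second level: province-specific credit effects from a common prior.
        mu_beta = pm.Normal("mu_beta", 0, 1)
        sigma_beta = pm.HalfNormal("sigma_beta", 1)
        beta = pm.Normal("beta", mu_beta, sigma_beta, shape=3)
        alpha = pm.Normal("alpha", 0, 1)
        sigma = pm.HalfNormal("sigma", 1)
        # First level: regency growth, normal around a province-specific line.
        pm.Normal("y", alpha + beta[province] * x, sigma, observed=y)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
    ```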

  10. Flood Risk Due to Hurricane Flooding

    NASA Astrophysics Data System (ADS)

    Olivera, Francisco; Hsu, Chih-Hung; Irish, Jennifer

    2015-04-01

    In this study, we evaluated the expected economic losses caused by hurricane inundation. We used surge response functions, which are physics-based dimensionless scaling laws that give surge elevation as a function of the hurricane's parameters (i.e., central pressure, radius, forward speed, approach angle and landfall location) at specified locations along the coast. These locations were close enough to avoid significant changes in surge elevations between consecutive points, and distant enough to minimize calculations. The probability of occurrence of a surge elevation value at a given location was estimated using a joint probability distribution of the hurricane parameters. The surge elevation, at the shoreline, was assumed to project horizontally inland within a polygon of influence. Individual parcel damage was calculated based on flood water depth and damage vs. depth curves available for different building types from the HAZUS computer application developed by the Federal Emergency Management Agency (FEMA). Parcel data, including property value and building type, were obtained from the county appraisal district offices. The expected economic losses were calculated as the sum of the products of the estimated parcel damages and their probability of occurrence for the different storms considered. Anticipated changes for future climate scenarios were considered by accounting for projected hurricane intensification, as indicated by sea surface temperature rise, and sea level rise, which modify the probability distribution of hurricane central pressure and change the baseline of the damage calculation, respectively. Maps of expected economic losses have been developed for Corpus Christi in Texas, Gulfport in Mississippi and Panama City in Florida. Specifically, for Port Aransas, in the Corpus Christi area, it was found that the expected economic losses were in the range of 1% to 4% of the property value for current climate conditions, of 1% to 8% for the 2030's and of 1% to 14% for the 2080's.
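
    The expected-loss calculation reduces to probability-weighting depth-dependent damages. The sketch below uses a hypothetical depth-damage curve and storm set, not HAZUS curves or the study's surge response functions.

    ```python
    import numpy as np

    # For each simulated storm: look up parcel flood depth, convert depth to
    # damage with a depth-damage curve, and probability-weight the result.

    depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 3.0])      # flood depth (m)
    damage_frac = np.array([0.0, 0.1, 0.25, 0.55, 0.8])  # fraction of value lost

    parcel_value = 250_000.0
    storm_depths = np.array([0.0, 0.3, 0.9, 1.8, 2.6])   # depth per storm (m)
    storm_probs = np.array([0.90, 0.05, 0.03, 0.015, 0.005])  # annual probs

    losses = parcel_value * np.interp(storm_depths, depth_pts, damage_frac)
    expected_loss = (losses * storm_probs).sum()
    print(f"expected annual loss: ${expected_loss:,.0f} "
          f"({expected_loss / parcel_value:.1%} of value)")
    ```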

  11. A Method For Assessing Economic Thresholds of Hardwood Competition

    Treesearch

    Steven A. Knowe

    2002-01-01

    A procedure was developed for computing economic thresholds for hardwood competition in pine plantations. The economic threshold represents the break-even level of competition above which hardwood control is a financially attractive treatment. Sensitivity analyses were conducted to examine the relative importance of biological and economic factors in determining...
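
    The break-even logic of an economic threshold can be made concrete with a small calculation; the yield-loss response, discount rate, and treatment cost below are invented for illustration.

    ```python
    # Hardwood control pays when the discounted pine value it recovers exceeds
    # the treatment cost. The linear yield-loss response is hypothetical.

    def pv_harvest(hardwood_pct, base_value=6000.0, loss_per_pct=0.008,
                   discount=0.05, years=25):
        """Present value of harvest, reduced linearly by hardwood cover (%)."""
        value = base_value * max(0.0, 1.0 - loss_per_pct * hardwood_pct)
        return value / (1 + discount) ** years

    treatment_cost = 120.0   # $/ha, applied today
    for pct in range(0, 60, 5):
        gain = pv_harvest(0) - pv_harvest(pct)   # value recovered by control
        flag = "treat" if gain > treatment_cost else "do not treat"
        print(f"{pct:2d}% hardwood cover: gain ${gain:6.0f}/ha -> {flag}")
    ```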

  12. Xcas as a Programming Environment for Stability Conditions for a Class of Differential Equation Models in Economics

    NASA Astrophysics Data System (ADS)

    Halkos, George E.; Tsilika, Kyriaki D.

    2011-09-01

    In this paper we examine the property of asymptotic stability in several dynamic economic systems modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of what the initial conditions happen to be. The existence of economic equilibrium in continuous-time models is checked via a symbolic language, the Xcas program editor. Using stability theorems for differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results of ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity and brevity of the input and output of our computer codes.
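
    The paper works in Xcas; an analogous asymptotic stability check can be sketched in SymPy, using the standard criterion that all eigenvalues of the system matrix have negative real parts. The 2x2 system below is a hypothetical example.

    ```python
    import sympy as sp

    # Asymptotic stability check analogous to the paper's Xcas routines: an
    # equilibrium of x' = A x is asymptotically stable iff every eigenvalue
    # of A has a negative real part.

    a, b = sp.symbols("a b", positive=True)
    A = sp.Matrix([[-a, 1],
                   [0, -b]])

    eigenvalues = list(A.eigenvals())            # triangular A: -a and -b
    stable = all(sp.re(lam) < 0 for lam in eigenvalues)
    print("eigenvalues:", eigenvalues, "| asymptotically stable:", stable)
    ```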

  13. Technical and Economic Assessment of the Implementation of Measures for Reducing Energy Losses in Distribution Systems

    NASA Astrophysics Data System (ADS)

    Aguila, Alexander; Wilson, Jorge

    2017-07-01

    This paper develops a methodology to assess a group of electrical improvement measures in distribution systems, combining technical and economic criteria. To address the problem of energy losses in distribution systems, a technical and economic analysis was performed based on a mathematical model that establishes a direct relationship between the energy saved through minimized losses and the costs of implementing the proposed measures. The paper analyses the feasibility of reducing energy losses in distribution systems by replacing existing network conductors with larger cross-section conductors and by raising the distribution voltage. The methodology provides a highly efficient mathematical tool for analysing the feasibility of improvement projects based on their costs, which is very useful for distribution companies and can serve as a starting point for the analysis of this type of project in distribution systems.
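
    Both measures act through the I²R loss term: a larger cross-section lowers resistance, and a higher voltage lowers current at the same power. A back-of-envelope sketch with illustrative single-phase numbers:

    ```python
    # I^2*R line losses fall when conductor resistance drops (larger
    # cross-section) and when current drops (higher voltage at the same power).

    power_kw = 400.0          # feeder load
    voltage_kv = 13.2         # existing distribution voltage
    resistance_ohm = 1.8      # existing conductor resistance

    def line_loss_kw(p_kw, v_kv, r_ohm):
        i_amp = p_kw / v_kv   # current (three-phase detail omitted)
        return i_amp ** 2 * r_ohm / 1000.0

    base = line_loss_kw(power_kw, voltage_kv, resistance_ohm)
    bigger_wire = line_loss_kw(power_kw, voltage_kv, resistance_ohm * 0.6)
    higher_volts = line_loss_kw(power_kw, voltage_kv * 2, resistance_ohm)

    for name, loss in [("base", base), ("larger conductor", bigger_wire),
                       ("double voltage", higher_volts)]:
        print(f"{name:17s} {loss:5.2f} kW lost ({loss / power_kw:.1%})")
    ```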

  14. An Integrated Approach to Economic and Environmental Aspects of Air Pollution and Climate Interactions

    NASA Astrophysics Data System (ADS)

    Sarofim, M. C.

    2007-12-01

    Emissions of greenhouse gases and conventional pollutants are closely linked through shared generation processes; thus policies directed toward long-lived greenhouse gases affect emissions of conventional pollutants and, similarly, policies directed toward conventional pollutants affect emissions of greenhouse gases. Some conventional pollutants, such as aerosols, also have direct radiative effects. NOx and VOCs are precursors of ozone, another substance with both radiative and health impacts, and these precursors also interact with the chemistry of the hydroxyl radical, the major methane sink. Realistic scenarios of future emissions and concentrations must therefore account for both air pollution and greenhouse gas policies and for how they interact economically as well as atmospherically, including the regional pattern of emissions and regulation. We have modified a 16-region computable general equilibrium economic model (the MIT Emissions Prediction and Policy Analysis model) by including elasticities of substitution for ozone precursors and aerosols in order to examine these interactions between climate policy and air pollution policy on a global scale. Urban emissions are distributed based on population density and aged using a reduced-form urban model before release into an atmospheric chemistry/climate model (the earth systems component of the MIT Integrated Global Systems Model). This integrated approach enables examination of the direct impacts of air pollution on climate, the ancillary and complementary interactions between air pollution and climate policies, and the impact of different population distribution algorithms or urban emission aging schemes on global-scale properties. This modeling exercise shows that while ozone levels are reduced by NOx and VOC reductions, these reductions lead to an increase in methane concentrations that eliminates the temperature effects of the ozone reductions. However, black carbon reductions do have significant direct effects on global mean temperatures, as do ancillary reductions of greenhouse gases due to the pollution constraints imposed in the economic model. Finally, we show that the economic benefits of coordinating air pollution and climate policies, rather than implementing them separately, are on the order of 20% of the total policy cost.

  15. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and to use them for running large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language, JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their pages so that visitors can volunteer their computing resources toward running advanced hydrological models and simulations. A web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks, and a relational database system manages data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
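
    The platform itself is JavaScript in the browser; the Python sketch below only illustrates the underlying pattern the paragraph describes: a queue of small chunks, volunteer workers pulling from it, and results merged afterward.

    ```python
    import queue
    import threading

    # Split a simulation into small chunks, let volunteer workers pull chunks
    # from a shared queue, and merge the partial results at the end.

    tasks: "queue.Queue[int]" = queue.Queue()
    results: dict = {}
    lock = threading.Lock()

    for chunk_id in range(20):            # 20 small computational chunks
        tasks.put(chunk_id)

    def volunteer() -> None:
        while True:
            try:
                chunk = tasks.get_nowait()
            except queue.Empty:
                return
            partial = sum(i * i for i in range(chunk * 1000, (chunk + 1) * 1000))
            with lock:
                results[chunk] = partial  # stand-in for a model sub-result
            tasks.task_done()

    workers = [threading.Thread(target=volunteer) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("chunks done:", len(results), "| merged result:", sum(results.values()))
    ```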

  16. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economic, physical, and esthetic characteristics and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  17. A review on economic emission dispatch problems using quantum computational intelligence

    NASA Astrophysics Data System (ADS)

    Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.

    2016-11-01

    Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, the limitation of natural resources, and global warming place this topic at the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques such as the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed. This paper should encourage researchers to use more QCI-based algorithms to obtain better optimal results for solving EED problems.
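
    A minimal QPSO loop is easy to state. The sketch below applies it to a toy three-generator dispatch with a weighted fuel-plus-emission objective and a power-balance penalty; all coefficients are illustrative, not taken from the reviewed papers.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Quantum-behaved PSO on a toy economic emission dispatch: quadratic fuel
    # cost plus weighted quadratic emission, with a power-balance penalty.

    a = np.array([0.010, 0.012, 0.008])   # fuel cost curvature ($/MW^2)
    b = np.array([2.0, 1.8, 2.2])         # fuel cost slope ($/MW)
    e = np.array([0.004, 0.006, 0.003])   # emission curvature
    demand, w = 300.0, 0.5                # demand (MW), emission weight

    def cost(P):
        return ((a * P**2 + b * P).sum() + w * (e * P**2).sum()
                + 1e3 * (P.sum() - demand) ** 2)

    n, dim, iters = 30, 3, 200
    X = rng.uniform(0, 200, (n, dim))     # positions: MW per generator
    pbest = X.copy()
    pcost = np.array([cost(p) for p in pbest])

    for it in range(iters):
        gbest = pbest[pcost.argmin()]
        mbest = pbest.mean(axis=0)                   # mean best position
        beta = 1.0 - 0.5 * it / iters                # contraction-expansion
        phi = rng.random((n, dim))
        attractor = phi * pbest + (1 - phi) * gbest  # local attractors
        u = rng.random((n, dim)) + 1e-12
        sign = np.where(rng.random((n, dim)) < 0.5, -1.0, 1.0)
        X = (attractor + sign * beta * np.abs(mbest - X) * np.log(1 / u))
        X = X.clip(0, 200)                           # generator limits
        c = np.array([cost(p) for p in X])
        better = c < pcost
        pbest[better], pcost[better] = X[better], c[better]

    best = pbest[pcost.argmin()]
    print("dispatch (MW):", best.round(1), "| total:", round(best.sum(), 1))
    ```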

  18. Networks In Real Space: Characteristics and Analysis for Biology and Mechanics

    NASA Astrophysics Data System (ADS)

    Modes, Carl; Magnasco, Marcelo; Katifori, Eleni

    Functional networks embedded in physical space play a crucial role in countless biological and physical systems, from the efficient dissemination of oxygen, blood sugars, and hormonal signals in vascular systems to the complex relaying of informational signals in the brain to the distribution of stress and strain in architecture or static sand piles. Unlike their more-studied abstract cousins, such as the hyperlinked internet, social networks, or economic and financial connections, these networks are both constrained by and intimately connected to the physicality of their real, embedding space. We report on the results of new computational and analytic approaches tailored to these physical networks with particular implications and insights for mammalian organ vasculature.

  19. Guide to the economic analysis of community energy systems

    NASA Astrophysics Data System (ADS)

    Pferdehirt, W. P.; Croke, K. G.; Hurter, A. P.; Kennedy, A. S.; Lee, C.

    1981-08-01

    This guidebook provides a framework for the economic analysis of community energy systems. The analysis facilitates a comparison of competing configurations of community energy systems, as well as a comparison with conventional energy systems. The various components of costs and revenues to be considered are discussed in detail. Computational procedures and accompanying worksheets are provided for calculating the net present value, the straight and discounted payback periods, the rate of return, and the savings-to-investment ratio for the proposed energy system alternatives. These computations are based on a projection of the system's costs and revenues over its economic lifetime. The guidebook also discusses the sensitivity of the results of this economic analysis to changes in various parameters and assumptions.
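
    The guidebook's figures of merit are standard discounted cash-flow quantities; a compact sketch with hypothetical inputs (the rate-of-return search is omitted for brevity):

    ```python
    # Net present value, simple and discounted payback, and the
    # savings-to-investment ratio for a hypothetical community energy system.

    def evaluate(investment, annual_savings, discount, years):
        disc = [annual_savings / (1 + discount) ** t for t in range(1, years + 1)]
        npv = sum(disc) - investment
        sir = sum(disc) / investment
        simple_payback = investment / annual_savings
        cum, discounted_payback = 0.0, None
        for t, cash in enumerate(disc, start=1):
            cum += cash
            if cum >= investment and discounted_payback is None:
                discounted_payback = t
        return npv, sir, simple_payback, discounted_payback

    npv, sir, spb, dpb = evaluate(1_000_000, 140_000, 0.07, 20)
    print(f"NPV ${npv:,.0f}  SIR {sir:.2f}  payback {spb:.1f} yr "
          f"(discounted: {dpb} yr)")
    ```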

  20. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process; however, many users ignore or underestimate this area, which is a mistake. One reason the economic aspect is overlooked is that obtaining the information needed for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining the data needed for a full-scale economic analysis.

  1. Economic decision making and the application of nonparametric prediction models

    USGS Publications Warehouse

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2007-01-01

    Sustained increases in energy prices have focused attention on gas resources in low permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are large. Planning and development decisions for extraction of such resources must be area-wide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm the decision to enter such plays depends on reconnaissance level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional scale cost functions. The context of the worked example is the Devonian Antrim shale gas play, Michigan Basin. One finding relates to selection of the resource prediction model to be used with economic models. Models which can best predict aggregate volume over larger areas (many hundreds of sites) may lose granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined by extraneous factors. The paper also shows that when these simple prediction models are used to strategically order drilling prospects, the gain in gas volume over volumes associated with simple random site selection amounts to 15 to 20 percent. It also discusses why the observed benefit of updating predictions from results of new drilling, as opposed to following static predictions, is somewhat smaller. Copyright 2007, Society of Petroleum Engineers.
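
    The paper's exact nonparametric specification is not given in the abstract; the sketch below uses a simple distance-weighted k-nearest-neighbour regressor on synthetic sites to illustrate the idea of predicting volumes at untested locations and ordering drilling prospects.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(3)

    # Predict recoverable volume at untested sites from nearby drilled sites,
    # then order prospects by predicted volume. All data are synthetic.

    drilled_xy = rng.uniform(0, 50, (300, 2))           # site coords (km)
    def field(xy):                                      # hidden "geology"
        return 2.0 + np.sin(xy[:, 0] / 8) + np.cos(xy[:, 1] / 10)
    volumes = field(drilled_xy) + rng.normal(0, 0.2, 300)

    model = KNeighborsRegressor(n_neighbors=15, weights="distance")
    model.fit(drilled_xy, volumes)

    untested = rng.uniform(0, 50, (1000, 2))
    pred = model.predict(untested)
    best = np.argsort(-pred)[:100]                      # drill best sites first
    random_pick = rng.choice(1000, 100, replace=False)
    print("mean volume, best 100 sites  :", pred[best].mean().round(2))
    print("mean volume, random 100 sites:", pred[random_pick].mean().round(2))
    ```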

  2. Public health and economic risk assessment of waterborne contaminants and pathogens in Finland.

    PubMed

    Juntunen, Janne; Meriläinen, Päivi; Simola, Antti

    2017-12-01

    This study shows that a variety of mathematical modeling techniques can be applied in a comprehensive assessment of the risks involved in drinking water production. In order to track the effects from water sources to the end consumers, we employed four models from different fields of study: first, two models of the physical environment, which track the movement of harmful substances from the sources to the water distribution; second, a statistical quantitative microbial risk assessment (QMRA) to assess the public health risks of the consumption of such water; and finally, a regional computable general equilibrium (CGE) model to assess the economic effects of increased illnesses. To substantiate our analysis, we used the illustrative case of a recently built artificial recharge system in Southern Finland that provides water for an area of 300,000 inhabitants. We examine the effects of various chemicals and microbes separately. Our economic calculations allow for direct effects on labor productivity due to absenteeism, increased health care expenditures, and indirect effects on local businesses. We found that even a considerable risk poses no notable threat to public health and thus has barely measurable economic consequences. Any epidemic is likely to spread widely in the urban setting we examined, but is also going to be short-lived in both public health and economic terms. Our estimate for the ratio of total to direct effects is 1.4, which indicates the importance of general equilibrium effects. Furthermore, the total welfare loss is 2.4 times higher than the initial productivity loss. The major remaining uncertainty in the economic assessment concerns the indirect effects. Copyright © 2017 Elsevier B.V. All rights reserved.
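
    The QMRA component of such a chain typically rests on a dose-response model. The sketch below shows the standard exponential form scaled to annual risk; the parameter values are illustrative, not taken from the Finnish study.

    ```python
    import numpy as np

    # Standard QMRA building block: exponential dose-response,
    # P(infection) = 1 - exp(-r * dose), scaled up to annual risk.

    r = 0.0199                    # pathogen-specific parameter (illustrative)
    concentration = 0.001         # organisms per litre in finished water
    consumption = 1.0             # litres ingested per person per day

    daily_dose = concentration * consumption
    p_daily = 1 - np.exp(-r * daily_dose)
    p_annual = 1 - (1 - p_daily) ** 365

    print(f"daily infection risk : {p_daily:.2e}")
    print(f"annual infection risk: {p_annual:.2e}")
    ```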

  3. Public census data on CD-ROM at Lawrence Berkeley Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, D.W.

    The Comprehensive Epidemiologic Data Resource (CEDR) and Populations at Risk to Environmental Pollution (PAREP) projects, of the Information and Computing Sciences Division (ICSD) at Lawrence Berkeley Laboratory (LBL), are using public socio-economic and geographic data files which are available to CEDR and PAREP collaborators via LBL's computing network. At this time 70 CD-ROM diskettes (approximately 36 gigabytes) are on line via the Unix file server cedrcd.lbl.gov. Most of the files are from the US Bureau of the Census, and most pertain to the 1990 Census of Population and Housing. All the CD-ROM diskettes contain documentation in the form of ASCII text files. Printed documentation for most files is available for inspection at University of California Data and Technical Assistance (UC DATA), or the UC Documents Library. Many of the CD-ROM diskettes distributed by the Census Bureau contain software for PC-compatible computers for easily accessing the data. Shared access to the data is maintained through a collaboration among the CEDR and PAREP projects at LBL, UC DATA, and the UC Documents Library. Via the Sun Network File System (NFS), these data can be exported to Internet computers for direct access by the user's application program(s).

  5. Multi-time Scale Coordination of Distributed Energy Resources in Isolated Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayhorn, Ebony; Xie, Le; Butler-Purry, Karen

    2016-03-31

    In isolated power systems, including microgrids, distributed assets such as renewable energy resources (e.g. wind, solar) and energy storage can be actively coordinated to reduce dependency on fossil fuel generation. The key challenge of such coordination arises from the significant uncertainty and variability occurring at small time scales associated with increased penetration of renewables. Specifically, the problem is to ensure economic and efficient utilization of DERs while also meeting operational objectives such as adequate frequency performance. One possible solution is to reduce the time step at which tertiary controls are implemented and to incorporate feedback and look-ahead capability to handle variability and uncertainty. However, reducing the time step of tertiary controls necessitates investigating time-scale coupling with primary controls so as not to exacerbate system stability issues. In this paper, an optimal coordination (OC) strategy that considers multiple time scales is proposed for isolated microgrid systems with a mix of DERs. This coordination strategy is based on an online moving horizon optimization approach. The effectiveness of the strategy was evaluated in terms of economics, technical performance, and computation time by varying key parameters that significantly impact performance. An illustrative example with realistic scenarios on a simulated isolated microgrid test system suggests that the proposed approach is generalizable toward designing multi-time-scale optimal coordination strategies for isolated power systems.
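
    An online moving-horizon dispatch can be sketched with a small convex program that is re-solved each step; the following illustrates the receding-horizon idea under invented system parameters, not the paper's controller (cvxpy is assumed available).

    ```python
    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(7)

    # At each step, solve a small dispatch over the next H steps for one
    # generator plus a battery serving net load (load minus forecast
    # renewables), apply only the first decision, then roll forward.

    T, H = 24, 6
    net_load = 5 + 2 * np.sin(np.arange(T + H) / 3) + rng.normal(0, 0.3, T + H)
    fuel_cost, soc, soc_max, p_batt = 40.0, 5.0, 10.0, 2.0

    applied = []
    for t in range(T):
        gen = cp.Variable(H, nonneg=True)    # fossil generation (MW)
        batt = cp.Variable(H)                # battery power, + = discharge
        energy = soc + cp.cumsum(-batt)      # stored energy along the horizon
        constraints = [cp.abs(batt) <= p_batt,
                       energy >= 0, energy <= soc_max,
                       gen + batt == net_load[t:t + H]]
        cp.Problem(cp.Minimize(fuel_cost * cp.sum(gen)), constraints).solve()
        applied.append((float(gen.value[0]), float(batt.value[0])))
        soc -= float(batt.value[0])          # commit only the first step

    print("first steps (gen MW, batt MW):",
          [(round(g, 2), round(b, 2)) for g, b in applied[:4]])
    ```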

  6. Young adolescents' engagement in dietary behaviour - the impact of gender, socio-economic status, self-efficacy and scientific literacy. Methodological aspects of constructing measures in nutrition literacy research using the Rasch model.

    PubMed

    Guttersrud, Øystein; Petterson, Kjell Sverre

    2015-10-01

    The present study validates a revised scale measuring individuals' level of the 'engagement in dietary behaviour' aspect of 'critical nutrition literacy' and describes how background factors affect this aspect of Norwegian tenth-grade students' nutrition literacy. Data were gathered electronically during a field trial of a standardised sample test in science. Test items and questionnaire constructs were distributed evenly across four electronic field-test booklets. Data management and analysis were performed using the RUMM2030 item analysis package and the IBM SPSS Statistics 20 statistical software package. Students responded on computers at school. Seven hundred and forty tenth-grade students at twenty-seven randomly sampled public schools were enrolled in the field-test study; the engagement in dietary behaviour scale and the self-efficacy in science scale were distributed to 178 of these students. Both scales proved to be valid, reliable and well-targeted instruments usable for the construction of measurements. Girls and students with high self-efficacy reported higher engagement in dietary behaviour than other students. Socio-economic status and scientific literacy (measured as ability in science on an achievement test) showed no correlation significantly different from zero with students' engagement in dietary behaviour.

  7. The Economic and Social Impacts of the Transition from the Industrial Society to a Computer Literate, High Technology, Information Society.

    ERIC Educational Resources Information Center

    Groff, Warren H.

    As our society evolves from an industrial society to a computer literate, high technology, information society, educational planners must reexamine the role of postsecondary education in economic development and in intellectual capital formation. In response to this need, a task force on high technology was established to examine the following…

  8. Economic Analysis of Alternatives for PC Upgrade of OR Department Laboratory

    DTIC Science & Technology

    1990-09-01

    Thesis: Economic Analysis of Alternatives for PC Upgrade of OR Department Laboratory, by Lung-Shan Chen, September 1990. Thesis advisor: Thomas E. Halwachs. Approved for public release; distribution is unlimited.

  9. Economics of End-of-Life Materials Recovery: A Study of Small Appliances and Computer Devices in Portugal.

    PubMed

    Ford, Patrick; Santos, Eduardo; Ferrão, Paulo; Margarido, Fernanda; Van Vliet, Krystyn J; Olivetti, Elsa

    2016-05-03

    The challenges brought on by the increasing complexity of electronic products, and the criticality of the materials these devices contain, present an opportunity for maximizing the economic and societal benefits derived from recovery and recycling. Small appliances and computer devices (SACD), including mobile phones, contain significant amounts of precious metals including gold and platinum, the present value of which should serve as a key economic driver for many recycling decisions. However, a detailed analysis is required to estimate the economic value that is unrealized by incomplete recovery of these and other materials, and to ascertain how such value could be reinvested to improve recovery processes. We present a dynamic product flow analysis for SACD throughout Portugal, a European Union member, including annual data detailing product sales and industrial-scale preprocessing data for recovery of specific materials from devices. We employ preprocessing facility and metals pricing data to identify losses, and develop an economic framework around the value of recycling including uncertainty. We show that significant economic losses occur during preprocessing (over $70 M USD unrecovered in computers and mobile phones, 2006-2014) due to operations that fail to target high value materials, and characterize preprocessing operations according to material recovery and total costs.

  10. The economic contribution of the Northern Ontario School of Medicine to communities participating in distributed medical education.

    PubMed

    Hogenbirk, John C; Robinson, David R; Hill, Mary Ellen; Pong, Raymond W; Minore, Bruce; Adams, Ken; Strasser, Roger P; Lipinski, Joe

    2015-01-01

    The economic contribution of medical schools to major urban centres can be substantial, but there is little information on the contribution to the economy of participating communities made by schools that provide education and training away from major cities and academic health science centres. We sought to assess the economic contribution of the Northern Ontario School of Medicine (NOSM) to northern Ontario communities participating in NOSM's distributed medical education programs. We developed a local economic model and used actual expenditures from 2007/08 to assess the economic contribution of NOSM to communities in northern Ontario. We also estimated the economic contribution of medical students or residents participating in different programs in communities away from the university campuses. To explore broader economic effects, we conducted semistructured interviews with leaders in education, health care and politics in northern Ontario. The total economic contribution to northern Ontario was $67.1 million based on $36.3 million in spending by NOSM and $1.0 million spent by students. Economic contributions were greatest in the university campus cities of Thunder Bay ($26.7 million) and Sudbury ($30.4 million), and $0.8-$1.2 million accrued to the next 3 largest population centres. Communities might realize an economic contribution of $7300-$103 900 per pair of medical learners per placement. Several of the 59 interviewees remarked that the dollar amount could be small to moderate but had broader economic implications. Distributed medical education at the NOSM resulted in a substantial economic contribution to participating communities.

  11. Microcomputers in Vocational Home Economics Classrooms in USD #512.

    ERIC Educational Resources Information Center

    Shawnee Mission Public Schools, KS.

    A project was conducted to identify software suitable for use in home economics classes and to train home economics teachers to use that software with an Apple II Plus microcomputer. During the project, home economics software was identified, evaluated, and catalogued. Teaching strategies were adapted to include using the computer in the…

  12. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  13. Computational Models of Consumer Confidence from Large-Scale Online Attention Data: Crowd-Sourcing Econometrics

    PubMed Central

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting. PMID:25826692

  14. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    PubMed

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  15. Entropies of negative incomes, Pareto-distributed loss, and financial crises.

    PubMed

    Gao, Jianbo; Hu, Jing; Mao, Xiang; Zhou, Mi; Gurbaxani, Brian; Lin, Johnny

    2011-01-01

    Health monitoring of the world economy is an important issue, especially in a time of profound economic difficulty world-wide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy, together with distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19-year period from 1990-2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. They also provide evidence of phase-transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions shift from a thin-tailed exponential distribution to the higher-entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis-initiating sector of the economy to other sectors.
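
    The distributional comparison at the heart of the method can be sketched with standard fits: estimate exponential and Pareto models on loss data and compare likelihoods and entropies. The loss sample below is synthetic.

    ```python
    import numpy as np
    from scipy import stats

    # Fit exponential and Pareto models to loss (negative-income) magnitudes
    # and compare log-likelihoods and differential entropies.

    losses = stats.pareto.rvs(b=1.5, scale=1.0, size=2000,
                              random_state=np.random.default_rng(5))

    loc_e, scale_e = stats.expon.fit(losses, floc=0)
    b_p, loc_p, scale_p = stats.pareto.fit(losses, floc=0)

    ll_e = stats.expon.logpdf(losses, loc_e, scale_e).sum()
    ll_p = stats.pareto.logpdf(losses, b_p, loc_p, scale_p).sum()

    print(f"exponential: loglik {ll_e:9.1f}  "
          f"entropy {stats.expon.entropy(loc_e, scale_e):.2f}")
    print(f"pareto     : loglik {ll_p:9.1f}  "
          f"entropy {stats.pareto.entropy(b_p, loc_p, scale_p):.2f}")
    ```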

  16. High-Efficiency High-Resolution Global Model Developments at the NASA Goddard Data Assimilation Office

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)

    2002-01-01

    The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions. I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.

  18. Distributed Maritime Capability: Optimized U.S. Navy-U.S. Coast Guard Interoperability, a Case in the South China Sea

    DTIC Science & Technology

    2017-12-01

    A dominant aggressor in the South China Sea poses a threat to regional security and economic stability, both major U.S. national interests. Distributed maritime capability is demonstrated by applying optimized U.S. Navy-U.S. Coast Guard interoperability to this case. Keywords: regional security, economic stability, fisheries enforcement.

  19. Computer Applications in the Design Process.

    ERIC Educational Resources Information Center

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  20. Analysis and assessment of STES technologies

    NASA Astrophysics Data System (ADS)

    Brown, D. R.; Blahnik, D. E.; Huber, H. D.

    1982-12-01

    Technical and economic assessments completed in FY 1982 in support of the Seasonal Thermal Energy Storage (STES) segment of the Underground Energy Storage Program included: (1) a detailed economic investigation of the cost of heat storage in aquifers, (2) documentation for AQUASTOR, a computer model for analyzing aquifer thermal energy storage (ATES) coupled with district heating or cooling, and (3) a technical and economic evaluation of several ice storage concepts. This paper summarizes the research efforts and main results of each of these three activities. In addition, a detailed economic investigation of the cost of chill storage in aquifers is currently in progress. The work parallels that done for ATES heat storage with technical and economic assumptions being varied in a parametric analysis of the cost of ATES delivered chill. The computer model AQUASTOR is the principal analytical tool being employed.

  1. An Overview of Cloud Computing in Distributed Systems

    NASA Astrophysics Data System (ADS)

    Divakarla, Usha; Kumari, Geetha

    2010-11-01

    Cloud computing is an emerging trend in the field of distributed computing, having evolved from grid computing and distributed computing. The cloud plays an important role in large organizations by maintaining large volumes of data with limited resources, and it also supports resource sharing through virtual machines provided by the cloud service provider. This paper gives an overview of cloud organization and of some basic security issues pertaining to the cloud.

  2. Evaluation of a Compact Hybrid Brain-Computer Interface System

    PubMed Central

    Müller, Klaus-Robert; Schmitz, Christoph H.

    2017-01-01

    We realized a compact hybrid brain-computer interface (BCI) system by integrating a portable near-infrared spectroscopy (NIRS) device with an economical electroencephalography (EEG) system. The NIRS array was located on the subjects' forehead, covering the prefrontal area. The EEG electrodes were distributed over the frontal, motor/temporal, and parietal areas. The experimental paradigm involved a Stroop word-picture matching test in combination with mental arithmetic (MA) and baseline (BL) tasks, in which the subjects were asked to perform either MA or BL in response to congruent or incongruent conditions, respectively. We compared the classification accuracies of each of the modalities (NIRS or EEG) with that of the hybrid system. We showed that the hybrid system outperforms the unimodal EEG and NIRS systems by 6.2% and 2.5%, respectively. Since the proposed hybrid system is based on portable platforms, it is not confined to a laboratory environment and has the potential to be used in real-life situations, such as in neurorehabilitation. PMID:28373984

  3. Evaluation of a Compact Hybrid Brain-Computer Interface System.

    PubMed

    Shin, Jaeyoung; Müller, Klaus-Robert; Schmitz, Christoph H; Kim, Do-Won; Hwang, Han-Jeong

    2017-01-01

    We realized a compact hybrid brain-computer interface (BCI) system by integrating a portable near-infrared spectroscopy (NIRS) device with an economical electroencephalography (EEG) system. The NIRS array was located on the subjects' forehead, covering the prefrontal area. The EEG electrodes were distributed over the frontal, motor/temporal, and parietal areas. The experimental paradigm involved a Stroop word-picture matching test in combination with mental arithmetic (MA) and baseline (BL) tasks, in which the subjects were asked to perform either MA or BL in response to congruent or incongruent conditions, respectively. We compared the classification accuracies of each of the modalities (NIRS or EEG) with that of the hybrid system. We showed that the hybrid system outperforms the unimodal EEG and NIRS systems by 6.2% and 2.5%, respectively. Since the proposed hybrid system is based on portable platforms, it is not confined to a laboratory environment and has the potential to be used in real-life situations, such as in neurorehabilitation.

  4. Price schedules coordination for electricity pool markets

    NASA Astrophysics Data System (ADS)

    Legbedji, Alexis Motto

    2002-04-01

    We consider the optimal coordination of a class of mathematical programs with equilibrium constraints, which is formally interpreted as a resource-allocation problem. Many decomposition techniques have been proposed to circumvent the difficulty of solving large systems with limited computer resources. The considerable improvement in computer architecture has allowed the solution of large-scale problems with increasing speed; consequently, interest in decomposition techniques has waned. Nonetheless, there is an important class of applications for which decomposition techniques remain relevant, among them distributed systems (the Internet, perhaps, being the most conspicuous example) and competitive economic systems. Conceptually, a competitive economic system is a collection of agents that have similar or different objectives while sharing the same system resources. In theory, such systems of agents can be optimized by constructing a large-scale mathematical program and solving it centrally using currently available computing power. In practice, however, because agents are self-interested and unwilling to reveal sensitive corporate data, one cannot solve these kinds of coordination problems by simply maximizing the sum of the agents' objective functions subject to their constraints. An iterative price decomposition or Lagrangian dual method is considered best suited because it can operate with limited information. A price-directed strategy, however, can only work successfully when coordinating or equilibrium prices exist, which is not generally the case when a duality gap is unavoidable. Showing when such prices exist and how to compute them is the main subject of this thesis. Among our results, we show that if the Lagrangian function of a primal program is additively separable, price schedules coordination may be attained. The prices are Lagrange multipliers and are also the decision variables of a dual program. In addition, we propose a new form of augmented or nonlinear pricing, which is an example of the use of penalty functions in mathematical programming. Applications are drawn from mathematical programming problems of the form arising in electric power system scheduling under competition.
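
    The price-coordination mechanism analyzed in the thesis can be illustrated by the classical Lagrangian dual iteration: post a price, let agents respond privately, and move the price along the constraint subgradient. The quadratic utilities and capacity below are invented for illustration.

    ```python
    import numpy as np

    # Toy price coordination: a coordinator posts a price, self-interested
    # agents respond with their individually optimal consumption, and the
    # price moves by a subgradient step until the resource constraint clears.

    a = np.array([10.0, 8.0, 6.0])   # agents' marginal-utility intercepts
    b = np.array([1.0, 0.5, 0.8])    # agents' marginal-utility slopes
    capacity = 12.0                  # shared resource limit

    price, step = 0.0, 0.05
    for _ in range(2000):
        # Each agent privately maximizes a_i*x - 0.5*b_i*x^2 - price*x.
        x = np.clip((a - price) / b, 0.0, None)
        price = max(0.0, price + step * (x.sum() - capacity))

    print(f"clearing price: {price:.2f}, allocations: {x.round(2)}, "
          f"total: {x.sum():.2f} (capacity {capacity})")
    ```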

  5. Computer Series, 13: Bits and Pieces, 11.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…

  6. Econo-Thermodynamics: The Nature of Economic Interactions

    NASA Astrophysics Data System (ADS)

    Mimkes, Juergen

    2006-03-01

    Physicists often model economic interactions like collisions of atoms in gases: in an interaction one agent gains, the other loses. This leads to a Boltzmann distribution of capital, which has been observed in the wealth distributions of different countries. However, economists object: no economic agent will attend a market in which he gets robbed! This conflict may be resolved by writing the basic laws of economics in the terms of calculus. In these terms, the daily struggle for survival of all economic systems turns out to be a Carnot cycle that is driven by energy: heat pumps and economic production both depend on oil, and GNP and oil consumption run parallel for all countries. Motors and markets are based on the same laws of calculus (macro-economics) and statistics (micro-economics). Economic interactions mean exploiting a third party (nature) and are indeed close to robbing! A baker sells bread to his customers, but the flour comes from nature. Banks sell loans to investors, but the money comes from savers. Econo-thermodynamics is a thrilling new interdisciplinary field.

  7. Simulation on an optimal combustion control strategy for 3-D temperature distributions in tangentially pc-fired utility boiler furnaces.

    PubMed

    Wang, Xi-fen; Zhou, Huai-chun

    2005-01-01

    The control of the 3-D temperature distribution in a utility boiler furnace is essential for the safe, economic and clean operation of a pc-fired furnace with a multi-burner system. The development of visualization of 3-D temperature distributions in pc-fired furnaces makes possible a new combustion control strategy that takes the furnace temperature directly as its control goal, improving control quality for the combustion processes. This paper studies such a strategy: the furnace is divided into several parts in the vertical direction, and the average temperature and its bias from the center of every cross section are extracted from the visualized 3-D temperature distributions. In the simulation stage, a computational fluid dynamics (CFD) code served to calculate the 3-D temperature distributions in a furnace; a linear model was then set up to relate the features of the temperature distributions to the inputs of the combustion processes, such as the flow rates of fuel and air fed into the furnace through all the burners. An adaptive genetic algorithm was adopted to find the optimal combination of input parameters that forms the optimal 3-D temperature field desired for boiler operation. Simulation results showed that the strategy could quickly find the factors driving the temperature distribution away from the optimal state and give correct adjustment suggestions.
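    The optimization step lends itself to a compact illustration. The sketch below is a generic genetic algorithm driving a hypothetical linear temperature model toward a target profile; the matrix, targets, and GA settings are invented stand-ins for the paper's CFD-derived model and adaptive GA.

    ```python
    # Hedged sketch: GA searching burner fuel/air inputs x so that a linear
    # model of section temperatures, T = A x, matches a desired profile.
    import random

    A = [[0.8, 0.2, 0.1], [0.3, 0.7, 0.2], [0.1, 0.3, 0.9]]  # hypothetical model
    target = [1200.0, 1250.0, 1150.0]                        # desired section temps

    def features(x):
        return [sum(a * xi for a, xi in zip(row, x)) for row in A]

    def fitness(x):
        # Negative squared deviation from the target temperature profile.
        return -sum((f - t) ** 2 for f, t in zip(features(x), target))

    pop = [[random.uniform(0, 2000) for _ in range(3)] for _ in range(40)]
    for gen in range(200):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]                       # keep the best candidates
        children = []
        while len(children) < 30:
            p1, p2 = random.sample(elite, 2)   # crossover by averaging
            child = [(a + b) / 2 + random.gauss(0, 20) for a, b in zip(p1, p2)]
            children.append([max(0.0, c) for c in child])
        pop = elite + children

    best = max(pop, key=fitness)
    print([round(v, 1) for v in best], [round(f, 1) for f in features(best)])
    ```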

  8. GIS-based poverty and population distribution analysis in China

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  9. Analysis and Application of Microgrids

    NASA Astrophysics Data System (ADS)

    Yue, Lu

    New trends toward generating electricity locally and utilizing non-conventional or renewable energy sources have attracted increasing interest due to the gradual depletion of conventional fossil fuel energy sources. This new type of power generation is called Distributed Generation (DG), and the energy sources it utilizes are termed Distributed Energy Resources (DERs). With DGs embedded in them, distribution networks evolve from passive into active networks enabling bidirectional power flows. Further incorporating flexible and intelligent controllers and employing future technologies, active distribution networks will evolve into Microgrids. A Microgrid is a small-scale, low-voltage Combined Heat and Power (CHP) supply network designed to supply the electrical and heat loads of a small community. To further implement Microgrids, a sophisticated Microgrid Management System must be integrated. However, because a Microgrid integrates multiple DERs and is likely to be deregulated, the ability to perform real-time OPF and economic dispatch over a fast, advanced communication network is necessary. In this thesis, first, problems such as power system modelling, power flow solution, and power system optimization are studied. Then, Distributed Generation and Microgrids are studied and reviewed, including a comprehensive review of current distributed generation technologies, Microgrid Management Systems, etc. Finally, a computer-based AC optimization method which minimizes the total transmission loss and generation cost of a Microgrid is proposed, along with a wireless communication scheme based on synchronized Code Division Multiple Access (sCDMA). The algorithm is tested with a 6-bus power system and a 9-bus power system.
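    The economic-dispatch core behind such management systems can be sketched briefly. The following is an illustrative lambda iteration (equal incremental cost) dispatch with hypothetical quadratic cost coefficients; the thesis's AC optimization, transmission-loss term, and sCDMA communication scheme are not reproduced here.

    ```python
    # Sketch of classic economic dispatch: minimize sum of quadratic generator
    # costs Cost_i(P) = a_i*P^2 + b_i*P subject to total demand, by bisecting
    # on the common marginal cost lambda (equal incremental cost condition).
    gens = [(0.004, 8.0, 50, 300), (0.006, 7.0, 50, 250)]  # (a, b, Pmin, Pmax)
    demand = 400.0

    lo, hi = 0.0, 50.0
    for _ in range(60):                      # bisection on marginal cost lambda
        lam = 0.5 * (lo + hi)
        # Each unit runs where marginal cost 2aP + b equals lambda, clipped
        # to its operating limits.
        P = [min(pmax, max(pmin, (lam - b) / (2 * a)))
             for a, b, pmin, pmax in gens]
        if sum(P) < demand:
            lo = lam
        else:
            hi = lam

    print("lambda:", round(lam, 3), "dispatch:", [round(p, 1) for p in P])
    ```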

  10. A Weibull distribution accrual failure detector for cloud computing.

    PubMed

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
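    The accrual idea can be sketched as follows: suspicion is the negative log of the Weibull survival probability of the time elapsed since the last heartbeat. The fixed-shape moment fit below is a simplifying assumption for illustration, not the paper's estimation procedure.

    ```python
    # Hedged sketch of an accrual failure detector in the spirit described:
    # suspicion grows with time since the last heartbeat, scored against a
    # Weibull model of heartbeat inter-arrival times.
    import math
    from collections import deque

    class WeibullAccrualDetector:
        def __init__(self, shape=1.5, window=100):
            self.k = shape                  # assumed Weibull shape parameter
            self.arrivals = deque(maxlen=window)
            self.last = None

        def heartbeat(self, t):
            if self.last is not None:
                self.arrivals.append(t - self.last)
            self.last = t

        def phi(self, now):
            """Suspicion level: -log10 of the Weibull survival probability."""
            if self.last is None or not self.arrivals:
                return 0.0
            mean = sum(self.arrivals) / len(self.arrivals)
            lam = mean / math.gamma(1.0 + 1.0 / self.k)  # scale from the mean
            dt = now - self.last
            survival = math.exp(-((dt / lam) ** self.k))
            return -math.log10(max(survival, 1e-300))

    det = WeibullAccrualDetector()
    for t in [0.0, 1.0, 2.1, 2.9, 4.0]:
        det.heartbeat(t)
    print(round(det.phi(4.5), 2), round(det.phi(9.0), 2))  # suspicion rises
    ```

    An application declares the peer failed when phi crosses a threshold of its own choosing, which is what lets one accrual detector serve multiple applications with different timeliness requirements.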

  11. The Continental Margins Program in Georgia

    USGS Publications Warehouse

    Cocker, M.D.; Shapiro, E.A.

    1999-01-01

    From 1984 to 1993, the Georgia Geologic Survey (GGS) participated in the Minerals Management Service-funded Continental Margins Program. Geological and geophysical data acquisition focused on offshore stratigraphic framework studies, phosphate-bearing Miocene-age strata, distribution of heavy minerals, near-surface alternative sources of groundwater, and development of a PC-based Coastal Geographic Information System (GIS). Seven GGS publications document results of those investigations. In addition to those publications, direct benefits of the GGS's participation include an impetus to the GGS's investigations of economic minerals on the Georgia coast, establishment of a GIS that includes computer hardware and software, and seeds for additional investigations through the information and training acquired as a result of the Continental Margins Program. These additional investigations are quite varied in scope, and many were made possible because of GIS expertise gained as a result of the Continental Margins Program. Future investigations will also reap the benefits of the Continental Margins Program.

  12. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Tests were designed to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role, respectively. Calculation times of up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when the uncertainty requirement was relaxed to 4%. Advantages such as high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward in solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
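    The map/reduce pattern described can be illustrated with a toy Monte Carlo estimate. The sketch below splits independent histories across worker processes and reduces the partial tallies into a mean with a statistical uncertainty; the integrand is a stand-in for a dose calculation, and nothing here is CloudMC's actual API.

    ```python
    # Hedged sketch of the map/reduce Monte Carlo pattern: independent chunks
    # are simulated in parallel ("map"), then partial sums are combined into
    # an estimate and its statistical error ("reduce").
    import math
    import random
    from multiprocessing import Pool

    def map_chunk(args):
        seed, n = args
        rng = random.Random(seed)            # independent stream per chunk
        s = s2 = 0.0
        for _ in range(n):
            x = rng.random()
            v = math.sqrt(1.0 - x * x)       # quarter-circle integrand
            s += v
            s2 += v * v
        return s, s2, n

    def reduce_chunks(parts):
        S = sum(p[0] for p in parts)
        S2 = sum(p[1] for p in parts)
        N = sum(p[2] for p in parts)
        mean = S / N
        stderr = math.sqrt(max(S2 / N - mean * mean, 0.0) / N)
        return 4 * mean, 4 * stderr          # estimate of pi with 1-sigma error

    if __name__ == "__main__":
        with Pool(4) as pool:
            parts = pool.map(map_chunk, [(seed, 250_000) for seed in range(4)])
        est, err = reduce_chunks(parts)
        print(f"pi ≈ {est:.4f} ± {err:.4f}")
    ```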

  13. Economic agents and markets as emergent phenomena

    PubMed Central

    Tesfatsion, Leigh

    2002-01-01

    An overview of recent work in agent-based computational economics is provided, with a stress on the research areas highlighted in the National Academy of Sciences Sackler Colloquium session “Economic Agents and Markets as Emergent Phenomena” held in October 2001. PMID:12011395

  14. The Sociocultural Factors That Influenced the Success of Non-Traditional, Latina, Pre-Service Teachers in a Required Online Instructional Media and Technology Course

    ERIC Educational Resources Information Center

    Hernandez Reyes, Christine M.

    2013-01-01

    Home computer ownership and Internet access have become essential to education, job security and economic opportunity. The digital divide, the gap between those who can afford and use computer technologies and those who cannot, remains greatest for ethnic/racial groups, placing them at a disadvantage for economic and educational opportunities. The purpose of the…

  15. Annual Report of the Metals and Ceramics Information Center, 1 May 1979-30 April 1980.

    DTIC Science & Technology

    1980-07-01

    The Metals and Ceramics Information Center (MCIC) is one of several technical information analysis centers (IAC's) chartered and sponsored by the…

  16. Assessment of distributed photovoltaic electric-power systems

    NASA Astrophysics Data System (ADS)

    Neal, R. W.; Deduck, P. F.; Marshall, R. N.

    1982-10-01

    A methodology was developed to assess the potential impacts of distributed photovoltaic (PV) systems on electric utility systems, including subtransmission and distribution networks, and was applied to several illustrative examples. The investigations focused on five specific utilities. Impacts upon utility system operations and generation mix were assessed using accepted utility planning methods in combination with models that simulate PV system performance and life-cycle economics. Impacts on the utility subtransmission and distribution systems were also investigated. The economic potential of distributed PV systems was investigated for ownership by the utility as well as by the individual utility customer.

  17. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes at local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes a computing network in which each node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing and their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climate and environmental monitoring.
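    The third approach, caching and reuse of previously obtained products, might look like the following sketch. The request-key scheme and the stand-in computation are assumptions for illustration, not the VRE's interfaces.

    ```python
    # Illustrative sketch: derived climate products are keyed by their request
    # parameters, computed once, and served from disk thereafter.
    import hashlib
    import json
    import os
    import pickle

    CACHE_DIR = "product_cache"

    def cached_product(request, compute):
        """Return a derived product, computing it only on a cache miss."""
        key = hashlib.sha256(
            json.dumps(request, sort_keys=True).encode()).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".pkl")
        if os.path.exists(path):
            with open(path, "rb") as fh:
                return pickle.load(fh)       # reuse the stored product
        result = compute(request)            # expensive step on first request
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "wb") as fh:
            pickle.dump(result, fh)
        return result

    # Hypothetical usage: a mean-temperature product for a region and period.
    req = {"dataset": "era_interim", "var": "t2m", "region": [50, 60, 80, 90],
           "period": ["2000-01", "2010-12"], "op": "mean"}
    print(cached_product(req, lambda r: 42.0))   # stand-in computation
    ```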

  18. Next Generation Distributed Computing for Cancer Research

    PubMed Central

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539

  19. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  20. Next generation distributed computing for cancer research.

    PubMed

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing.

  1. Multilinear Computing and Multilinear Algebraic Geometry

    DTIC Science & Technology

    2016-08-10

    Subject terms: tensors, multilinearity, algebraic geometry, numerical computations, computational tractability. DISTRIBUTION A: Distribution approved for public release.

  2. Sharing out NASA's spoils. [economic benefits of U.S. space program

    NASA Technical Reports Server (NTRS)

    Bezdek, Roger H.; Wendling, Robert M.

    1992-01-01

    The economic benefits of NASA programs are discussed. Emphasis is given to an analysis of indirect economic benefits which estimates the effect of NASA programs on employment, personal income, corporate sales and profits, and government tax revenues in the U.S. and in each state. Data are presented that show that NASA programs have widely varying multipliers by industry and that illustrate the distribution of jobs by industry as well as the distribution of sales.

  3. Equity from an Economic Perspective. Research and Development Series No. 214B.

    ERIC Educational Resources Information Center

    Cardenas, Gilbert

    Although the distribution of income has become more equitable for some groups, inequitable distribution has affected the poor, minorities, and women most adversely. Income inequality and poverty may be attributed to ability differences, education and training, job tastes, property ownership, market power, and discrimination. In economics, the…

  4. Distributive Education. Economics of Marketing. Instructor's Curriculum.

    ERIC Educational Resources Information Center

    House, John; Bruns, Joe

    Twelve lesson plans on economics of marketing are presented in this performance-based curriculum unit for distributive education. This unit is self-contained and consists of the following components: introduction (provides overview of unit content and describes why mastery of the objectives is important); performance objectives; and unit outline…

  5. Defining conservation priorities using fragmentation forecasts

    Treesearch

    David Wear; John Pye; Kurt H. Riitters

    2004-01-01

    Methods are developed for forecasting the effects of population and economic growth on the distribution of interior forest habitat. An application to the southeastern United States shows that models provide significant explanatory power with regard to the observed distribution of interior forest. Estimates for economic and biophysical variables are significant and...

  6. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  7. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  8. Integrated Multi-Scale Data Analytics and Machine Learning for the Distribution Grid and Building-to-Grid Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma M.; Hendrix, Val; Chertkov, Michael

    This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is both useful, and not useful, for the particular field of the distribution grid and buildings interface. While analytics, in general, is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure, or lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data and make predictions and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid, which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors – such as grid and building operators, at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals – such as total carbon reduction or other economic benefit to customers. While some basic analysis into these data streams can provide a wealth of information, computational and human boundaries on performing the analysis are becoming significant, with more data and multi-objective concerns. Efficient applications of analysis and the machine learning field are being considered in the loop.

  9. Multi-Criteria Optimization of the Deployment of a Grid for Rural Electrification Based on a Heuristic Method

    NASA Astrophysics Data System (ADS)

    Ortiz-Matos, L.; Aguila-Tellez, A.; Hincapié-Reyes, R. C.; González-Sanchez, J. W.

    2017-07-01

    In order to design electrification systems, recent mathematical models solve the problem of component location, the type of electrification components, and the design of possible distribution microgrids. However, as the number of points to be electrified increases, solving these models requires high computational times, making them unviable in practice. This study poses a new heuristic method for the electrification of rural areas to solve this problem. The heuristic algorithm addresses the deployment of rural electrification microgrids by finding routes for the optimal placement of lines and transformers in transmission and distribution microgrids. The challenge is to obtain a deployment with equity in losses, considering the capacity constraints of the devices and the topology of the terrain, at minimal economic cost. An optimal scenario ensures the electrification of all neighbourhoods at a minimum investment cost in terms of the distance between electric conductors and the number of transformation devices.

  10. Laser SRS tracker for reverse prototyping tasks

    NASA Astrophysics Data System (ADS)

    Kolmakov, Egor; Redka, Dmitriy; Grishkanich, Aleksandr; Tsvetkov, Konstantin

    2017-10-01

    According to the current great interest in Large-Scale Metrology applications in many different fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease of use, logistic and economic issues, as well as metrological performance, are assuming an increasingly important role among system requirements. The project plans experimental studies aimed at identifying the impact of applying the basic laws of chip lasers and microlasers, used as radiators, on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load.

  11. DC Microgrids Scoping Study. Estimate of Technical and Economic Benefits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backhaus, Scott N.; Swift, Gregory William; Chatzivasileiadis, Spyridon

    Microgrid demonstrations and deployments are expanding in US power systems and around the world. Although goals are specific to each site, these microgrids have demonstrated the ability to provide higher reliability and higher power quality than utility power systems, as well as improved energy utilization. The vast majority of these microgrids are based on AC power transfer because this has been the traditionally dominant power delivery scheme. Independently, manufacturers, power system designers and researchers are demonstrating and deploying DC power distribution systems for applications where the end-use loads are natively DC, e.g., computers, solid-state lighting, and building networks. These early DC applications may provide higher efficiency, added flexibility, and reduced capital costs over their AC counterparts. Further, when onsite renewable generation, electric vehicles and storage systems are present, DC-based microgrids may offer additional benefits. Early successes from these efforts raise a question: can a combination of microgrid concepts and DC distribution systems provide added benefits beyond what has been achieved individually?

  12. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB, and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. In production since LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services continuously evolve and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore service integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declarations required for PanDA Pilot site movers. Improvements of the information model and general updates are also shown; in particular, we explain how collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  13. Study on Karst Information Identification of Qiandongnan Prefecture Based on RS and GIS Technology

    NASA Astrophysics Data System (ADS)

    Yao, M.; Zhou, G.; Wang, W.; Wu, Z.; Huang, Y.; Huang, X.

    2018-04-01

    Karst areas are natural resource bases, but, because of their special geological environment, they suffer alternating droughts and floods along with frequent karst collapse, rocky desertification and other resource and environment problems, which seriously restrict sustainable economic and social development. This paper therefore identifies and studies the karst and clarifies its distribution, providing basic data for the rational development of resources in the karst region and the control of desertification. Because of the uniqueness of the karst landscape, it cannot be directly recognized and extracted by computer from remote sensing images. This paper therefore adopts an "RS + DEM" approach. Based on Landsat-5 TM imagery from 2010 and DEM data, it proposes methods to identify karst information using slope vector maps, vegetation distribution maps, karst rocky desertification distribution maps and other auxiliary data, combined with interpretation signs, for the human-computer interactive interpretation, identification and extraction of peak forests, peak clusters and isolated peaks, and the further extraction of karst depressions. Experiments show that this method realizes the "RS + DEM" mode through a reasonable combination of remote sensing images and DEM data. It not only effectively extracts karst areas covered with vegetation, but also quickly and accurately locks onto karst areas, greatly improving the efficiency and precision of visual interpretation. The accurate interpretation rate of karst information in the study area is 86.73%.

  14. A Weibull distribution accrual failure detector for cloud computing

    PubMed Central

    Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229

  15. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    ERIC Educational Resources Information Center

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  16. Parallel computing method for simulating hydrological processesof large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

    Climate change is one of the most widely recognized global environmental problems. It has altered watershed hydrological processes in their distribution over time and space, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, hydrological process simulation involves a large amount of computation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Current parallel methods mostly parallelize in the space and time dimensions, calculating the natural features of a distributed hydrological model in order, grid by grid (unit by unit, basin by basin) from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of a distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility: it makes full use of computing and storage resources under limited-resource conditions, and its computing efficiency improves linearly as computing resources increase. The method can satisfy the parallel computing requirements of hydrological process simulation in small, medium and large rivers.

  17. SocioEconomic Characteristics of the All Volunteer Force: Evidence from the National Longitudinal Survey, 1979

    DTIC Science & Technology

    1982-02-01

    By J. Eric Fredland and Roger D. Little, Economics Department, U.S. Naval Academy, Annapolis, Maryland 21402. DISTRIBUTION STATEMENT A. The authors acknowledge Steven Lerch for technical assistance and Doris Keating for typing support.

  18. Computer Graphics Simulations of Sampling Distributions.

    ERIC Educational Resources Information Center

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…

  19. Directed Random Markets: Connectivity Determines Money

    NASA Astrophysics Data System (ADS)

    Martínez-Martínez, Ismael; López-Ruiz, Ricardo

    2013-12-01

    The Boltzmann-Gibbs (BG) distribution arises as the statistical equilibrium probability distribution of money among the agents of a closed economic system where random and undirected exchanges are allowed. When considering a model with uniform savings in the exchanges, the final distribution is close to the gamma family. In this paper, we implement these exchange rules on networks and we find that the stationary probability distributions are robust and are not affected by the topology of the underlying network. We introduce a new family of interactions: random but directed ones. In this case, the topology is found to be determinant, and the mean money per economic agent is related to the degree of the node representing the agent in the network. The relation between the mean money per economic agent and its degree is shown to be linear.
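    A minimal simulation of the undirected random-exchange model behind the Boltzmann-Gibbs result might look like this; the agent count, iteration count, and uniform split rule are generic textbook choices rather than the paper's exact network setup.

    ```python
    # Sketch of a closed-economy random-exchange money model: N agents,
    # random pairwise exchanges, total money conserved. With undirected
    # exchanges and no saving, the stationary distribution of money is
    # close to exponential (Boltzmann-Gibbs).
    import random

    N, T, m0 = 1000, 200000, 100.0
    money = [m0] * N
    for _ in range(T):
        i, j = random.randrange(N), random.randrange(N)
        if i == j:
            continue
        pot = money[i] + money[j]
        share = random.random()          # undirected random split of the pot
        money[i], money[j] = share * pot, (1 - share) * pot

    money.sort()
    print("median:", round(money[N // 2], 1), "mean:", round(sum(money) / N, 1))
    # For an exponential distribution, median ≈ mean * ln 2 ≈ 69.3 here.
    ```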

  20. Simulation study of entropy production in the one-dimensional Vlasov system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Zongliang, E-mail: liangliang1223@gmail.com; Wang, Shaojie

    2016-07-15

    The coarse-grain averaged distribution function of the one-dimensional Vlasov system is obtained by numerical simulation. The entropy production in the cases of a random field, linear Landau damping, and the bump-on-tail instability is computed with the coarse-grain averaged distribution function. The computed entropy production converges with increasing coarse-grain averaging length. When the distribution function differs only slightly from a Maxwellian distribution, the converged value agrees with the result computed using the definition of thermodynamic entropy. The averaging length used to compute the coarse-grain averaged distribution function is discussed.
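    The coarse-grain averaging step can be sketched directly: block-average the phase-space distribution, then evaluate S = -Σ f̄ ln f̄ ΔxΔv. The toy grid below stands in for simulation output; block size and grid are illustrative.

    ```python
    # Sketch of coarse-grain entropy: average f over blocks of phase-space
    # cells, then sum -f̄ ln f̄ over the grid weighted by cell area.
    import math

    def coarse_grain_entropy(f, dx, dv, block):
        nx, nv = len(f), len(f[0])
        S = 0.0
        for i in range(0, nx, block):
            for j in range(0, nv, block):
                cells = [f[a][b]
                         for a in range(i, min(i + block, nx))
                         for b in range(j, min(j + block, nv))]
                fbar = sum(cells) / len(cells)   # block-averaged value
                if fbar > 0:
                    S -= fbar * math.log(fbar) * len(cells) * dx * dv
        return S

    # Toy Maxwellian-like grid; entropy production would be the change in S
    # between successive simulation snapshots.
    f = [[math.exp(-((i - 8) ** 2 + (j - 8) ** 2) / 16.0) for j in range(16)]
         for i in range(16)]
    print(round(coarse_grain_entropy(f, 0.1, 0.1, block=4), 4))
    ```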

  1. Assessment of the Economic Potential of Distributed Wind in Colorado, Minnesota, and New York

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Kevin; Sigrin, Benjamin O.; Lantz, Eric J.

    This work seeks to identify current and future spatial distributions of economic potential for behind-the-meter distributed wind, serving primarily rural or suburban homes, farms, and manufacturing facilities in Colorado, Minnesota, and New York. These states were identified by technical experts based on their current favorability for distributed wind deployment. We use NREL's Distributed Wind Market Demand Model (dWind) (Lantz et al. 2017; Sigrin et al. 2016) to identify and rank counties in each of the states by their overall and per capita potential. From this baseline assessment, we also explore how and where improvements in cost, performance, and other market sensitivities affect distributed wind potential.

  2. A comparison of queueing, cluster and distributed computing systems

    NASA Technical Reports Server (NTRS)

    Kaplan, Joseph A.; Nelson, Michael L.

    1993-01-01

    Using workstation clusters for distributed computing has become popular with the proliferation of inexpensive, powerful workstations. Workstation clusters offer both a cost-effective alternative to batch processing and an easy entry into parallel computing. However, a number of workstations on a network does not constitute a cluster; cluster management software is necessary to harness the collective computing power. A variety of cluster management and queueing systems are compared: Distributed Queueing System (DQS), Condor, Load Leveler, Load Balancer, Load Sharing Facility (LSF - formerly Utopia), Distributed Job Manager (DJM), Computing in Distributed Networked Environments (CODINE), and NQS/Exec. The systems differ in their design philosophy and implementation. Based on published reports on the different systems and conversations with the systems' developers and vendors, a comparison of the systems is made on the integral issues of clustered computing.

  3. Fault Tolerant Software Technology for Distributed Computer Systems

    DTIC Science & Technology

    1989-03-01

    Final technical report on "Fault Tolerant Software Technology for Distributed Computer Systems," a two-year effort performed at Georgia Institute of Technology as part of the Clouds Project.

  4. Clusters of poverty and disease emerge from feedbacks on an epidemiological network.

    PubMed

    Pluciński, Mateusz M; Ngonghala, Calistus N; Getz, Wayne M; Bonds, Matthew H

    2013-03-06

    The distribution of health conditions is characterized by extreme inequality. These disparities have been alternately attributed to disease ecology and the economics of poverty. Here, we provide a novel framework that integrates epidemiological and economic growth theory on an individual-based hierarchically structured network. Our model indicates that, under certain parameter regimes, feedbacks between disease ecology and economics create clusters of low income and high disease that can stably persist in populations that become otherwise predominantly rich and free of disease. Surprisingly, unlike traditional poverty trap models, these localized disease-driven poverty traps can arise despite homogeneity of parameters and evenly distributed initial economic conditions.
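    A toy version of the disease-economics feedback can make the mechanism concrete: infection risk falls with income, and illness drags income growth. All parameters and functional forms below are invented for illustration; the paper works on a hierarchically structured network with formal epidemiological and growth dynamics.

    ```python
    # Hedged sketch of a coupled disease-income feedback loop: sick agents
    # lose income; poorer agents face higher infection risk.
    import random

    N, steps = 200, 500
    income = [1.0] * N
    sick = [random.random() < 0.3 for _ in range(N)]
    for _ in range(steps):
        prevalence = sum(sick) / N
        for i in range(N):
            if sick[i]:
                sick[i] = random.random() > 0.3          # recovery prob 0.3
                income[i] *= 0.995                        # sickness drags income
            else:
                # Infection risk rises with prevalence, falls with income.
                risk = 0.5 * prevalence / (1.0 + income[i])
                sick[i] = random.random() < risk
                income[i] *= 1.005                        # healthy agents grow

    print("final prevalence:", round(sum(sick) / N, 2),
          "mean income:", round(sum(income) / N, 2))
    ```

    Depending on the parameters, such a loop either dies out (rich and healthy) or locks subpopulations into the low-income, high-disease regime the abstract describes.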

  5. Control of dispatch dynamics for lowering the cost of distributed generation in the built environment

    NASA Astrophysics Data System (ADS)

    Flores, Robert Joseph

    Distributed generation can provide many benefits over traditional central generation, such as increased reliability and efficiency and reduced emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed so that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and the corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and the parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building, and any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being served has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) because of the value of demand reduction. As load factor increases, lower-operating-cost generators are desired because a larger portion of the building load is met in an effort to reduce demand. In addition, buildings with large thermal demand have access to the least expensive natural gas, lowering the cost of operating distributed generation. Recovery of exhaust heat from DG reduces cost only if the building's thermal demand coincides with its electrical demand. Capacity limits exist beyond which annual savings from operation of distributed generation decrease if further generation is installed. For low-operating-cost generators, the approximate limit is the average building load; this limit decreases as operating costs increase. In addition, a high capital cost of distributed generation can be accepted if generator operating costs are low. As generator operating costs increase, capital cost must decrease if positive economic performance is desired.
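    The simplest dispatch rule studied in such work can be sketched in a few lines: run the generator in any hour where its marginal cost undercuts the utility energy rate. Demand-charge logic, heat recovery, and the actual SCE/SoCalGas tariffs are omitted, and all numbers are hypothetical.

    ```python
    # Hedged sketch of an hourly economic dispatch rule for on-site DG.
    def dispatch(load_kw, utility_rate, fuel_cost_per_kwh_in, efficiency,
                 om_cost, capacity_kw):
        """Return generator output (kW) for one hour."""
        # Marginal cost of a generated kWh: fuel input per output kWh plus O&M.
        marginal_cost = fuel_cost_per_kwh_in / efficiency + om_cost
        if marginal_cost < utility_rate:
            return min(load_kw, capacity_kw)   # serve as much load as possible
        return 0.0

    # Hypothetical numbers: $0.03/kWh-input gas, 26% electric efficiency,
    # $0.015/kWh O&M, 60 kW microturbine, $0.14/kWh utility energy rate.
    print(dispatch(load_kw=45, utility_rate=0.14, fuel_cost_per_kwh_in=0.03,
                   efficiency=0.26, om_cost=0.015, capacity_kw=60))
    ```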

  6. Colloidal micelles of block copolymers as nanoreactors, templates for gold nanoparticles, and vehicles for biomedical applications.

    PubMed

    Bakshi, Mandeep Singh

    2014-11-01

    Targeted drug delivery methodology is becoming increasingly important for overcoming the shortcomings of the conventional absorption method of drug delivery. It improves action time with uniform distribution and poses minimal side effects, but is usually difficult to design so as to achieve the desired results. Economically favorable, environmentally friendly, multifunctional, and easy to design, hybrid nanomaterials have demonstrated enormous potential as targeted drug delivery vehicles. A combination of both micelles and nanoparticles makes them fine targeted delivery vehicles in a variety of biological applications where precision is primarily required to achieve the desired results, as in the cytotoxicity of cancer cells, chemotherapy, and computed tomography guided radiation therapy. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Monte Carlo simulation of single accident airport risk profile

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A computer simulation model was developed for estimating the potential economic impacts of a carbon fiber release upon facilities within an 80-kilometer radius of a major airport. The model simulated the possible range of release conditions and the resulting dispersion of the carbon fibers. Each iteration of the model generated a specific release scenario, which would cause a specific amount of dollar loss to the surrounding community. Repeated iterations generated a risk profile showing the probability distribution of losses from one accident. Using accident probability estimates, the risk profile for annual losses was derived. The mechanics of the simulation model, the required input data, and the risk profiles generated for the 26 large hub airports are described.
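    The risk-profile construction can be sketched as a plain Monte Carlo loop: draw a release scenario, map it to a dollar loss, and read quantiles off the empirical loss distribution. The loss model below is a placeholder, not the study's dispersion model.

    ```python
    # Hedged sketch: each iteration draws one release scenario and a loss;
    # the sorted losses form the empirical risk profile.
    import random

    def one_accident_loss():
        fibers_released = random.lognormvariate(10.0, 1.0)   # hypothetical
        unit_damage = random.uniform(0.001, 0.01)            # $ per fiber
        return fibers_released * unit_damage

    losses = sorted(one_accident_loss() for _ in range(10000))
    for p in (0.5, 0.9, 0.99):
        print(f"P(loss <= ${losses[int(p * len(losses)) - 1]:,.0f}) ≈ {p}")
    ```

    Scaling such a single-accident profile by an annual accident-probability estimate gives the annual-loss profile the abstract mentions.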

  8. Stimulating the Manufacturing and Distribution of Rehabilitation Products: Economic and Policy Incentives and Disincentives.

    ERIC Educational Resources Information Center

    Scadden, Lawrence A.

    Personal interviews and written correspondence were used to obtain information from 39 officers of companies involved in the manufacture and distribution of rehabilitation-related products, regarding their perceptions of the potential effects of various economic factors and governmental policies. An attempt was made to identify disincentives to…

  9. Multiphasic Health Testing in the Clinic Setting

    PubMed Central

    LaDou, Joseph

    1971-01-01

    The economy of automated multiphasic health testing (amht) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. amht units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general-purpose digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to perform analyses such as electrocardiography, generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community, with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771

  10. Computer program MCAP-TOSS calculates steady-state fluid dynamics of coolant in parallel channels and temperature distribution in surrounding heat-generating solid

    NASA Technical Reports Server (NTRS)

    Lee, A. Y.

    1967-01-01

    Computer program calculates the steady state fluid distribution, temperature rise, and pressure drop of a coolant, the material temperature distribution of a heat generating solid, and the heat flux distributions at the fluid-solid interfaces. It performs the necessary iterations automatically within the computer, in one machine run.

  11. Constructing probabilistic scenarios for wide-area solar power generation

    DOE PAGES

    Woodruff, David L.; Deride, Julio; Staid, Andrea; ...

    2017-12-22

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.
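    Stripped of the epi-spline density fitting and data segmentation, the underlying idea of turning historical forecast errors into quantile scenarios can be sketched as below; the data and quantile choices are hypothetical.

    ```python
    # Hedged sketch: empirical quantiles of past forecast errors, added to
    # today's forecast, yield a small set of day-ahead scenarios.
    def error_quantile_scenarios(hist_forecasts, hist_actuals, todays_forecast,
                                 quantiles=(0.1, 0.5, 0.9), cap=1.0):
        """Build power scenarios from empirical quantiles of past errors."""
        errors = sorted(a - f for f, a in zip(hist_forecasts, hist_actuals))
        scenarios = []
        for q in quantiles:
            idx = min(int(q * len(errors)), len(errors) - 1)
            # Clamp to physical limits: output in [0, cap] of plant capacity.
            scenarios.append(min(cap, max(0.0, todays_forecast + errors[idx])))
        return scenarios

    hist_f = [0.5, 0.6, 0.4, 0.7, 0.55, 0.65]   # hypothetical normalized data
    hist_a = [0.45, 0.7, 0.35, 0.6, 0.6, 0.72]
    print(error_quantile_scenarios(hist_f, hist_a, todays_forecast=0.6))
    ```

    Each returned scenario carries the probability weight of its quantile band, which is the form stochastic unit commitment and economic dispatch consume.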

  12. Computer-based mechanical design of overhead lines

    NASA Astrophysics Data System (ADS)

    Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.

    2016-02-01

    Beside performance, a safety level that meets current standards is a compulsory condition for distribution grid operation. Some of the measures for improving overhead line reliability call for modernization of the installations. The constraints imposed on new line components address technical aspects such as thermal stress and voltage drop, but also economic efficiency. The mechanical sizing of overhead lines is, after all, an optimization problem: the task in designing the overhead line profile is to size poles, cross-arms and stays and to locate poles along a line route so that the total cost of the line structure is minimized while the technical and safety constraints are fulfilled. The authors present in this paper an application for the computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adjusted to distribution lines. The constraints of the optimization problem are adjusted to the weather and loading conditions of Romania. The outputs of the software application are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces on each pole, cross-arm and stay under different weather conditions; and the line profile drawings. The main features of the software are interactivity, a local optimization function and a high-level user interface.

  13. Writing Better Software for Economics Principles Textbooks.

    ERIC Educational Resources Information Center

    Walbert, Mark S.

    1989-01-01

    Examines computer software currently available with most introductory economics textbooks. Compares what is available with what should be available in order to meet the goal of effectively using the microcomputer to teach economic principles. Recommends 14 specific pedagogical changes that should be made in order to improve current designs. (LS)

  14. The planning and implementation of a demand-side management/distribution automation system at Taiwan Power Company

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, S.S.; Chen, Yun-Wu

    1994-12-31

    This paper describes Taipower's experience of DSM/DAS development. For the past 5 years, the demand for electricity has maintained a high annual growth rate of 8.45% due to economic prosperity in Taiwan. Because environmental protection consciousness has recently made it difficult for Taipower to develop and construct new power plants, substations, and transmission and distribution lines, and because its power grid is an independent system, Taipower needs to apply DSM to manage its load problems. Since 1984, Taipower has established two pilot systems, and these systems performed fault detection and isolation functions well for distribution automation. With the rapid development of computer, communication and control technology, the concept of the DAS has gradually been implemented in real cases. Taipower organized an engineering task group to study DAS several years ago and, based on the operating experience of the existing systems, is now planning to launch a new DAS project for the Tai-Chung area. According to Taipower requirements, the DAS will provide feeder automation, automatic meter reading, load management and distribution system analysis functions.

  15. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries, to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, as well as a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, is presented to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
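
    The idea of an 'Equivalent Distributed Program', a global query rewritten as local queries whose merged results answer the original, can be illustrated with a minimal sketch. The peers, table, and SQL below are invented stand-ins, not the paper's transformation rules.

      import sqlite3

      # Two autonomous 'peers', each holding a local fragment of a feature table.
      peers = []
      for rows in ([("river", 12.5)], [("river", 7.1), ("road", 3.3)]):
          db = sqlite3.connect(":memory:")
          db.execute("CREATE TABLE features (kind TEXT, length_km REAL)")
          db.executemany("INSERT INTO features VALUES (?, ?)", rows)
          peers.append(db)

      # Global query: total river length. The equivalent distributed program is
      # the same aggregate pushed to every peer plus a merge step at the origin.
      local_sql = "SELECT SUM(length_km) FROM features WHERE kind = 'river'"
      partials = [db.execute(local_sql).fetchone()[0] or 0.0 for db in peers]
      print("total river length:", sum(partials), "km")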

  16. Systems Analysis and Design for Decision Support Systems on Economic Feasibility of Projects

    NASA Astrophysics Data System (ADS)

    Balaji, S. Arun

    2010-11-01

    This paper discuss about need for development of the Decision Support System (DSS) software for economic feasibility of projects in Rwanda, Africa. The various economic theories needed and the corresponding formulae to compute payback period, internal rate of return and benefit cost ratio of projects are clearly given in this paper. This paper is also deals with the systems flow chart to fabricate the system in any higher level computing language. The various input requirements from the projects and the output needed for the decision makers are also included in this paper. The data dictionary used for input and output data structure is also explained.

  17. [Are the "elderly" living at the expense of the "young"? Remarks on the burden distribution between "generations" in an aging population from the economic perspective].

    PubMed

    Schmähl, W

    2002-08-01

    Public and scientific discussion on the effects of an aging population is often biased: aging is primarily seen as an economic burden, and increasing contribution rates in pension schemes and in health and long-term care insurance are highlighted. This paper tries to provide a more balanced view. The distinction between a cross-sectional and a longitudinal view already gives different information on distributional effects. Labeling older people as "economically inactive" is a much too narrow perspective, focused only on labor market activity. Other types of work are neglected, such as caring for children or frail elderly people, as are economic activities arising from wealth and consumption, as well as the payment of taxes to finance public expenditure. The approach of "generational accounting" is also narrow, focusing on public expenditure, social insurance contributions and only some types of taxes, but not dealing with private, especially intrafamily, transfers. In economic terms, a comprehensive approach is needed regarding the effect of institutions and measures on the economic situation of cohorts. The role of investment in human capital is mentioned as a decisive factor for productivity in a country; further education and retraining of older workers is one important element. An integrative approach dealing with the different fields of activities is needed when analyzing intergenerational as well as intragenerational distribution. This requires elaborated and differentiated reporting of distributional effects. This important precondition, however, does not exist in Germany.

  18. Computer models for economic and silvicultural decisions

    Treesearch

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  19. Systems analysis of the space shuttle. [communication systems, computer systems, and power distribution

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Oh, S. J.; Thau, F.

    1975-01-01

    Developments in communications systems, computer systems, and power distribution systems for the space shuttle are described. The use of high speed delta modulation for bit rate compression in the transmission of television signals is discussed. Simultaneous Multiprocessor Organization, an approach to computer organization, is presented. Methods of computer simulation and automatic malfunction detection for the shuttle power distribution system are also described.

  1. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, a distributed, load-balanced intrusion detection system can be realized. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then shows how the characteristics of grid computing apply to a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security model that can reduce processing delay and ensure adequate detection rates.

  2. Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation

    NASA Astrophysics Data System (ADS)

    Downey, W. T.; Hendrick, P. L.

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.

  3. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault tolerance systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  4. Economic impact of large public programs: The NASA experience

    NASA Technical Reports Server (NTRS)

    Ginzburg, E.; Kuhn, J. W.; Schnee, J.; Yavitz, B.

    1976-01-01

    The economic impact of NASA programs on weather forecasting and the computer and semiconductor industries is discussed. Contributions to the advancement of the science of astronomy are also considered.

  5. PLIF Temperature and Velocity Distributions in Laminar Hypersonic Flat-plate Flow

    NASA Technical Reports Server (NTRS)

    OByrne, S.; Danehy, P. M.; Houwing, A. F. P.

    2003-01-01

    Rotational temperature and velocity distributions have been measured across a hypersonic laminar flat-plate boundary layer using planar laser-induced fluorescence. The measurements are compared to a finite-volume computation and a first-order boundary-layer computation assuming local similarity. Both computations produced similar temperature distributions and nearly identical velocity distributions; the remaining disagreement between the calculations is ascribed to the similarity solution not accounting for leading-edge displacement effects. The velocity measurements agreed with both calculated distributions to within the measurement uncertainty of 2%. The peak measured temperature was 200 K lower than the computed values, a discrepancy tentatively ascribed to vibrational relaxation in the boundary layer.

  6. Distributed metadata in a high performance computing environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and, upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
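
    A minimal sketch of the lookup the claim describes: determine which burst buffer stores a metadata key, then fetch the key-value from that node's portion of the distributed store. The hash-based placement is an assumption made for illustration; the abstract does not specify one.

      import hashlib

      BUFFERS = ["bb0", "bb1", "bb2"]        # burst buffer nodes
      stores = {b: {} for b in BUFFERS}      # each node's slice of the KV store

      def home(key: str) -> str:
          # Deterministically map a metadata key to the burst buffer owning it.
          h = int(hashlib.sha256(key.encode()).hexdigest(), 16)
          return BUFFERS[h % len(BUFFERS)]

      def put_meta(key, value):
          stores[home(key)][key] = value

      def get_meta(key):
          # Determine which burst buffer stores the requested metadata, then
          # locate the key-value in that node's portion of the store.
          return stores[home(key)].get(key)

      put_meta("/file.dat/block0", {"offset": 0, "len": 4096})
      print(get_meta("/file.dat/block0"))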

  7. Can cognitive science create a cognitive economics?

    PubMed

    Chater, Nick

    2015-02-01

    Cognitive science can intersect with economics in at least three productive ways: by providing richer models of individual behaviour for use in economic analysis; by drawing from economic theory in order to model distributed cognition; and jointly to create more powerful 'rational' models of cognitive processes and social interaction. There is the prospect of moving from behavioural economics to a genuinely cognitive economics. Copyright © 2014. Published by Elsevier B.V.

  8. Summary of the First Network-Centric Sensing Community Workshop, ’Netted Sensors: A Government, Industry and Academia Dialogue’

    DTIC Science & Technology

    2006-04-01

    and Scalability, (2) Sensors and Platforms, (3) Distributed Computing and Processing, (4) Information Management, (5) Fusion and Resource Management...use of the deployed system. 3.3 Distributed Computing and Processing Session The Distributed Computing and Processing Session consisted of three

  9. Economic design of control charts considering process shift distributions

    NASA Astrophysics Data System (ADS)

    Vommi, Vijayababu; Kasarapu, Rukmini V.

    2014-09-01

    Process shift is an important input parameter in the economic design of control charts. Earlier control chart designs assumed that a given assignable cause produces a constant shift in the process mean. This assumption has been criticized by many researchers, since it may not be realistic for an assignable cause to produce the same shift every time it occurs. To overcome this difficulty, in the present work, a distribution for the shift parameter has been considered instead of a single value for a given assignable cause. Duncan's economic design model for the control chart has been extended to incorporate the distribution of the process shift parameter, and it is proposed to minimize the total expected loss-cost to obtain the control chart parameters. Further, three types of process shift distributions, namely positively skewed, uniform and negatively skewed, are considered, and the situations where it is appropriate to use the suggested methodology are recommended.
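
    The extension the abstract describes, replacing a single shift value with a shift distribution and minimizing the total expected loss-cost, can be sketched as follows. The loss-cost function is a deliberately simplified placeholder rather than Duncan's full expression, and all cost figures are invented.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)

      def loss_cost(n, h, k, delta):
          # Simplified hourly loss-cost for a control chart with sample size n,
          # sampling interval h and control limit k: sampling cost plus the
          # cost of running with a shift of size delta until the chart signals.
          power = norm.sf(k - delta * np.sqrt(n)) + norm.cdf(-k - delta * np.sqrt(n))
          hours_to_signal = h / max(power, 1e-9)
          return 0.5 * n / h + 0.1 * hours_to_signal

      # Expected loss-cost: average the per-shift cost over a shift distribution
      # (positively skewed here, one of the three cases the paper considers).
      deltas = rng.gamma(shape=2.0, scale=0.5, size=5_000)
      for n, h, k in [(4, 1.0, 3.0), (5, 0.5, 2.5)]:
          e_cost = np.mean([loss_cost(n, h, k, d) for d in deltas])
          print(f"n={n}, h={h}, k={k}: expected loss-cost {e_cost:.2f}")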

  10. Heuristic Scheduling in Grid Environments: Reducing the Operational Energy Demand

    NASA Astrophysics Data System (ADS)

    Bodenstein, Christian

    In a world where more and more business is transacted in online markets, the supply of online services could quickly reach its capacity limits in the face of ever-growing demand. Online service providers may find themselves maxed out during high-traffic timeslots yet facing too little demand during low-traffic timeslots, although the latter is becoming less frequent. At this point, deciding which user is allocated what level of service becomes essential. The concept of Grid computing offers a meaningful alternative to conventional supercomputing centres. Not only can Grids reach the same computing speeds as some of the fastest supercomputers, but distributed computing also harbors great energy-saving potential. When scheduling projects in such a Grid environment, however, simply assigning processes to systems becomes so computationally complex that schedules are often ready too late to execute, rendering their optimizations useless. Current schedulers attempt to maximize utility under some constraint, often reverting to heuristics. This optimization often comes at the cost of environmental impact, in this case CO2 emissions. This work proposes an alternative model of energy-efficient scheduling that keeps a respectable amount of economic incentives intact. Using this model, it is possible to reduce the total energy consumed by a Grid environment using 'just-in-time' flowtime management, paired with ranking nodes by efficiency.
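
    The proposed heuristic, ranking nodes by energy efficiency and placing work 'just in time' against deadlines, might look roughly like the sketch below. The node model, ranking key and job format are invented for illustration; this is not the paper's exact algorithm.

      from dataclasses import dataclass

      @dataclass
      class Node:
          name: str
          flops: float    # compute rate
          watts: float    # power draw under load

          @property
          def efficiency(self):   # work per joule, the ranking key
              return self.flops / self.watts

      def schedule(jobs, nodes):
          # Greedy energy-aware placement: most efficient node first, and a
          # job is placed only if its deadline still allows completion.
          nodes = sorted(nodes, key=lambda n: n.efficiency, reverse=True)
          placements = []
          for work, deadline in sorted(jobs, key=lambda j: j[1]):
              for node in nodes:
                  if work / node.flops <= deadline:
                      placements.append((work, node.name))
                      break
          return placements

      nodes = [Node("a", 100.0, 250.0), Node("b", 80.0, 120.0)]
      jobs = [(400.0, 6.0), (90.0, 1.0)]   # (work units, hours to deadline)
      print(schedule(jobs, nodes))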

  11. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz-Ram model

    NASA Astrophysics Data System (ADS)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications, including increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables, such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, built on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact rock and joint properties, the type and properties of explosives, and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
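
    A minimal sketch of such a Monte Carlo fragmentation predictor, using the textbook Kuznetsov mean-size and Rosin-Rammler forms of the Kuz-Ram model; the input ranges are invented, not site data.

      import numpy as np

      rng = np.random.default_rng(2)
      N = 10_000

      # Monte Carlo over uncertain blast inputs (illustrative ranges only).
      A  = rng.normal(7.0, 1.0, N)       # rock factor
      Q  = rng.normal(180.0, 15.0, N)    # explosive mass per hole, kg
      V0 = rng.normal(120.0, 10.0, N)    # rock volume per hole, m^3
      E  = 100.0                         # relative weight strength (ANFO = 100)
      n  = rng.uniform(1.2, 2.2, N)      # Rosin-Rammler uniformity index

      # Kuznetsov mean fragment size (textbook form of the Kuz-Ram model), cm.
      Xm = A * (V0 / Q) ** 0.8 * Q ** (1 / 6) * (115.0 / E) ** (19 / 20)

      # Fraction passing a 50 cm screen under the Rosin-Rammler distribution.
      x = 50.0
      passing = 1.0 - np.exp(-0.693 * (x / Xm) ** n)
      print(f"mean size {Xm.mean():.1f} cm; P(<50 cm): "
            f"median {np.median(passing):.2f}, 5-95% "
            f"[{np.quantile(passing, 0.05):.2f}, {np.quantile(passing, 0.95):.2f}]")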

  12. Families at the Century's Turn: The Troubling Economic Trends. Family Review.

    ERIC Educational Resources Information Center

    Lindjord, Denise

    2000-01-01

    Discusses U.S. economic trends for the past century. Notes that distribution of wealth is more concentrated at top than is distribution of income, with income inequality growing worse in the 1990s. Maintains that wealth disparity explains achievement test score gaps between white and minority students. Presents proposals for asset-building,…

  13. GASP- General Aviation Synthesis Program. Volume 7: Economics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The economic analysis includes: manufacturing costs; labor costs; parts costs; operating costs; markups and consumer price. A user's manual for a computer program to calculate the final consumer price is included.

  14. Systematic review, critical appraisal, and analysis of the quality of economic evaluations in stroke imaging.

    PubMed

    Burton, Kirsteen R; Perlis, Nathan; Aviv, Richard I; Moody, Alan R; Kapral, Moira K; Krahn, Murray D; Laupacis, Andreas

    2014-03-01

    This study reviews the quality of economic evaluations of imaging after acute stroke and identifies areas for improvement. We performed full-text searches of electronic databases that included Medline, Econlit, the National Health Service Economic Evaluation Database, and the Tufts Cost Effectiveness Analysis Registry through July 2012. Search strategy terms included the following: stroke*; cost*; or cost-benefit analysis*; and imag*. Inclusion criteria were empirical studies published in any language that reported the results of economic evaluations of imaging interventions for patients with stroke symptoms. Study quality was assessed by a commonly used checklist (with a score range of 0% to 100%). Of 568 unique potential articles identified, 5 were included in the review. Four of 5 articles were explicit in their analysis perspectives, which included healthcare system payers, hospitals, and stroke services. Two studies reported results during a 5-year time horizon, and 3 studies reported lifetime results. All included the modified Rankin Scale score as an outcome measure. The median quality score was 84.4% (range=71.9%-93.5%). Most studies did not consider the possibility that patients could not tolerate contrast media or could incur contrast-induced nephropathy. Three studies compared perfusion computed tomography with unenhanced computed tomography but assumed that outcomes guided by the results of perfusion computed tomography were equivalent to outcomes guided by the results of magnetic resonance imaging or noncontrast computed tomography. Economic evaluations of imaging modalities after acute ischemic stroke were generally of high methodological quality. However, important radiology-specific clinical components were missing from all of these analyses.

  15. Guest Editor's Introduction: Special section on dependable distributed systems

    NASA Astrophysics Data System (ADS)

    Fetzer, Christof

    1999-09-01

    We rely more and more on computers. For example, the Internet is reshaping the way we do business. A 'computer outage' can cost a company a substantial amount of money, not only in business lost during the outage but also in the negative publicity the company receives. This is especially true for Internet companies: after recent computer outages of Internet companies, we have seen a drastic fall in the shares of the affected companies. There are multiple causes of computer outages. Although computer hardware is becoming more reliable, hardware-related outages remain an important issue. For example, some recent computer outages of companies were caused by failed memory and system boards, and even by crashed disks, a failure type which can easily be masked using disk mirroring. Transient hardware failures might also look like software failures and, hence, might be incorrectly classified as such. However, many outages are software related: faulty system software, middleware, and application software can crash a system. Dependable computing systems are systems we can rely on. Dependable systems are, by definition, reliable, available, safe and secure [3]. This special section focuses on issues related to dependable distributed systems. Distributed systems have the potential to be more dependable than a single computer because the probability that all computers in a distributed system fail is smaller than the probability that a single computer fails. However, if a distributed system is not built well, it is potentially less dependable than a single computer, since the probability that at least one computer in a distributed system fails is higher than the probability that one computer fails. For example, if the crash of any computer in a distributed system can bring the complete system to a halt, the system is less dependable than a single-computer system. Building dependable distributed systems is an extremely difficult task, and there is no silver-bullet solution. Instead one has to apply a variety of engineering techniques [2]: fault-avoidance (minimize the occurrence of faults, e.g. by using a proper design process), fault-removal (remove faults before they occur, e.g. by testing), fault-evasion (predict faults by monitoring and reconfigure the system before failures occur), and fault-tolerance (mask and/or contain failures). Building a system from scratch is an expensive and time-consuming effort. To reduce the cost of building dependable distributed systems, one would choose to use commercial off-the-shelf (COTS) components whenever possible. The use of COTS components has several potential advantages beyond minimizing costs. For example, through the widespread use of a COTS component, design failures might be detected and fixed before the component is used in a dependable system; custom-designed components have to mature without the widespread in-field testing of COTS components. COTS components also have potential disadvantages when used in dependable systems. For example, minimizing time to market might lead to the release of components with inherent design faults (e.g. use of 'shortcuts' that only work most of the time). In addition, the components might be more complex than needed and, hence, potentially have more design faults than simpler components.
However, given economic constraints and the ability to cope with some of the problems using fault-evasion and fault-tolerance, only for a small percentage of systems can one justify not using COTS components. Distributed systems built from current COTS components are asynchronous systems in the sense that there exists no a priori known bound on the transmission delay of messages or the execution time of processes. When designing a distributed algorithm, one would like to make sure (e.g. by testing or verification) that it is correct, i.e. satisfies its specification. Many distributed algorithms make use of consensus (eventually all non-crashed processes have to agree on a value), leader election (a crashed leader is eventually replaced by a new leader, but at any time there is at most one leader) or a group membership detection service (a crashed process is eventually suspected to have crashed, but only crashed processes are suspected). From a theoretical point of view, the service specifications given for such services are not implementable in asynchronous systems; in particular, for each implementation one can derive a counterexample in which the service violates its specification. From a practical point of view, the consensus, leader election, and membership detection problems are solvable in asynchronous distributed systems. In this special section, Raynal and Tronel bridge this difference by showing how to implement the group membership detection problem with a negligible probability of failure [1] in an asynchronous system. The group membership detection problem is specified by a liveness condition (L) and a safety property (S): (L) if a process p crashes, then eventually every non-crashed process q has to suspect that p has crashed; and (S) if a process q suspects p, then p has indeed crashed. One can show that either (L) or (S) is implementable, but one cannot implement both (L) and (S) at the same time in an asynchronous system. In practice, one only needs to implement (L) and (S) such that the probability that (L) or (S) is violated becomes negligible. Raynal and Tronel propose and analyse a protocol that implements (L) with certainty and that can be tuned such that the probability that (S) is violated becomes negligible. Designing and implementing distributed fault-tolerant protocols for asynchronous systems is a difficult but not an impossible task. A fault-tolerant protocol has to detect and mask certain failure classes, e.g. crash failures and message omission failures. There is a trade-off between the performance of a fault-tolerant protocol and the failure classes the protocol can tolerate. One wants to tolerate as many failure classes as needed to satisfy the stochastic requirements of the protocol [1] while still maintaining sufficient performance. Since clients of a protocol have different requirements with respect to the performance/fault-tolerance trade-off, one would like to be able to customize protocols so that an appropriate performance/fault-tolerance trade-off can be selected. In this special section, Hiltunen et al describe how protocols can be composed from micro-protocols in their Cactus system. They show how a group RPC system can be tailored to the needs of a client; in particular, they show how considering additional failure classes affects the performance of a group RPC system.
References
[1] Cristian F 1991 Understanding fault-tolerant distributed systems Communications of the ACM 34 (2) 56-78
[2] Heimerdinger W L and Weinstock C B 1992 A conceptual framework for system fault tolerance Technical Report 92-TR-33, CMU/SEI
[3] Laprie J C (ed) 1992 Dependability: Basic Concepts and Terminology (Vienna: Springer)
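
    The probabilistic trade-off exploited by Raynal and Tronel can be sketched with a heartbeat-style detector: property (L) holds because any finite timeout eventually fires for a crashed peer, while the timeout is tuned so that the probability of violating (S), i.e. of falsely suspecting a live peer, is negligible. The delay model below is an invented illustration, not their protocol.

      import numpy as np

      rng = np.random.default_rng(3)

      # Observed heartbeat inter-arrival times on an asynchronous network
      # (invented: exponential jitter around a 1 s period; no a priori bound).
      samples = 1.0 + rng.exponential(0.2, size=5_000)

      # Pick the timeout as an extreme quantile of the observed delays, so a
      # live peer is falsely suspected on any one period only with negligible
      # probability; a crashed peer is still suspected after one timeout.
      timeout = np.quantile(samples, 1 - 1e-3)
      print(f"timeout = {timeout:.2f} s, "
            f"P(false suspicion per heartbeat) ~ {np.mean(samples > timeout):.4f}")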

  16. Economic Comparison of Processes Using Spreadsheet Programs

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Pappano, A. W.; Jennings, C. N.

    1986-01-01

    Inexpensive approach aids plant-design decisions. Commercially available electronic spreadsheet programs aid the economic comparison of different processes for producing particular end products, facilitating plant-design decisions without requiring large expenditures for powerful mainframe computers.

  17. The economic impact of public resource supply constraints in northeast Oregon.

    Treesearch

    Edward C Waters; David W. Holland; Richard W. Haynes

    1977-01-01

    Traditional, fixed-price (input-output) economic models provide a useful framework for conceptualizing links in a regional economy. Apparent shortcomings in these models, however, can severely restrict our ability to deduce valid prescriptions for public policy and economic development. A more efficient approach using regional computable general equilibrium (CGE)...

  18. The Impacts and Economic Costs of Climate Change in Agriculture and the Costs and Benefits of Adaptation

    NASA Astrophysics Data System (ADS)

    Iglesias, A.; Quiroga, S.; Garrote, L.; Cunningham, R.

    2012-04-01

    This paper provides monetary estimates of the effects of agricultural adaptation to climate change in Europe. The model computes spatial crop productivity changes as a response to climate change, linking biophysical and socioeconomic components. It combines available datasets of crop productivity changes under climate change (Iglesias et al 2011, Ciscar et al 2011), statistical functions of productivity response to water and nitrogen inputs, catchment-level water availability, and environmental policy scenarios. Future global change scenarios are derived from several socio-economic futures of representative concentration pathways and regional climate models. The economic valuation is conducted using the GTAP general equilibrium model. The marginal productivity changes have been used as an input for the economic general equilibrium model in order to analyse the worldwide economic impact of the agricultural changes induced by climate change. The study also includes the analysis of an adaptive capacity index computed using the socio-economic results of GTAP. The results are combined to prioritize agricultural adaptation policy needs in Europe.

  19. Economic optimization of operations for hybrid energy systems under variable markets

    DOE PAGES

    Chen, Jen; Garcia, Humberto E.

    2016-05-21

    We propose a hybrid energy system (HES) as an important element for enabling increasing penetration of clean energy. This paper investigates the operational flexibility of HES and develops a methodology for operations optimization that maximizes economic value based on predicted renewable generation and market information. A multi-environment computational platform for performing such operations optimization is also developed. To compensate for prediction error, a control strategy is designed to operate a standby energy storage element (ESE) to avoid energy imbalance within the HES. The proposed operations optimizer allows systematic control of energy conversion for maximal economic value. Simulation results for two specific HES configurations are included to illustrate the proposed methodology and computational capability. These results demonstrate the economic viability of HES under the proposed operations optimizer, suggesting the diversion of energy to alternative energy outputs while participating in the ancillary service market. The economic advantages of the optimizer and the associated flexible operations are illustrated by comparing the economic performance of flexible operations against that of constant operations. Sensitivity analyses with respect to market variability and prediction error are also performed.

  1. The Production, Consumption and Distribution of Economic Podcasts

    ERIC Educational Resources Information Center

    Swan, Kathy; Hofer, Mark; Swan, Gerry; Mazur, Joan

    2010-01-01

    Podcasting remains one of the hot "buzz words" in technology today--both in and out of schools. For high school economics teachers, there is a growing number of podcasts targeted for instruction on economics. While the podcasts can be useful resources for teachers trying to enhance students' economic thinking, they are not produced…

  2. Economic growth, climate change, biodiversity loss: distributive justice for the global north and south.

    PubMed

    Rosales, Jon

    2008-12-01

    Economic growth (the increase in production and consumption of goods and services) must be considered within its biophysical context. Economic growth is fueled by biophysical inputs and its outputs degrade ecological processes, such as the global climate system. Economic growth is currently the principal cause of increased climate change, and climate change is a primary mechanism of biodiversity loss. Therefore, economic growth is a prime catalyst of biodiversity loss. Because people desire economic growth for dissimilar reasons (some for the increased accumulation of wealth, others for basic needs), how we limit economic growth becomes an ethical problem. Principles of distributive justice can help construct an international climate-change regime based on principles of equity. An equity-based framework that caps economic growth in the most polluting economies will lessen human impact on biodiversity. When coupled with a cap-and-trade mechanism, the framework can also provide a powerful tool for redistribution of wealth. Such an equity-based framework promises to be more inclusive and therefore more effective because it accounts for the disparate developmental conditions of the global north and south.

  3. Current Global Pricing For Human Papillomavirus Vaccines Brings The Greatest Economic Benefits To Rich Countries.

    PubMed

    Herlihy, Niamh; Hutubessy, Raymond; Jit, Mark

    2016-02-01

    Vaccinating females against human papillomavirus (HPV) prior to the debut of sexual activity is an effective way to prevent cervical cancer, yet vaccine uptake in low- and middle-income countries has been hindered by high vaccine prices. We created an economic model to estimate the distribution of the economic surplus (the sum of all health and economic benefits of a vaccine, minus the costs of development, production, and distribution) among different country income groups and manufacturers for a cohort of twelve-year-old females in 2012. We found that manufacturers may have received economic returns worth five times their original investment in HPV vaccine development. High-income countries gained the greatest economic surplus of any income category, realizing over five times more economic value per vaccinated female than low-income countries did. Subsidizing vaccine prices in low- and middle-income countries could both reduce financial barriers to vaccine adoption and still allow high-income countries to retain their economic surpluses and manufacturers to retain their profits. Project HOPE—The People-to-People Health Foundation, Inc.

  4. Scaling and universality in urban economic diversification.

    PubMed

    Youn, Hyejin; Bettencourt, Luís M A; Lobo, José; Strumsky, Deborah; Samaniego, Horacio; West, Geoffrey B

    2016-01-01

    Understanding cities is central to addressing major global challenges from climate change to economic resilience. Although increasingly perceived as fundamental socio-economic units, the detailed fabric of urban economic activities is only recently accessible to comprehensive analyses with the availability of large datasets. Here, we study abundances of business categories across US metropolitan statistical areas, and provide a framework for measuring the intrinsic diversity of economic activities that transcends scales of the classification scheme. A universal structure common to all cities is revealed, manifesting self-similarity in internal economic structure as well as aggregated metrics (GDP, patents, crime). We present a simple mathematical derivation of the universality, and provide a model, together with its economic implications of open-ended diversity created by urbanization, for understanding the observed empirical distribution. Given the universal distribution, scaling analyses for individual business categories enable us to determine their relative abundances as a function of city size. These results shed light on the processes of economic differentiation with scale, suggesting a general structure for the growth of national economies as integrated urban systems. © 2016 The Authors.
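
    The scaling analysis mentioned, relating a category's abundance Y to city size N, is in essence a power-law fit Y = c*N^beta; a minimal sketch on invented data follows.

      import numpy as np

      rng = np.random.default_rng(4)

      # Invented data: city populations and counts for one business category,
      # generated around a power law Y = c * N**beta with beta = 1.1.
      N = rng.uniform(1e5, 1e7, size=200)
      Y = 1e-3 * N ** 1.1 * rng.lognormal(0.0, 0.2, size=200)

      # Scaling analysis as a least-squares fit in log-log space.
      slope, intercept = np.polyfit(np.log(N), np.log(Y), 1)
      print(f"estimated exponent beta ~ {slope:.2f}, prefactor c ~ {np.exp(intercept):.2e}")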

  5. Improving Conceptual Design for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1998-01-01

    This report summarizes activities performed during the second year of a three-year cooperative agreement between NASA - Langley Research Center and Georgia Tech. Year 1 of the project resulted in the creation of a new Cost and Business Assessment Model (CABAM) for estimating the economic performance of advanced reusable launch vehicles, including non-recurring costs, recurring costs, and revenue. The current (second) year activities focused on the evaluation of automated, collaborative design frameworks (computational architectures or frameworks) for automating the design process in advanced space vehicle design. Consistent with NASA's new thrust area in developing and understanding Intelligent Synthesis Environments (ISE), the goals of this year's research were to develop and apply computer integration techniques and near-term computational frameworks for conducting advanced space vehicle design. NASA - Langley (VAB) has taken a lead role in developing a web-based computing architecture within which the designer can interact with disciplinary analysis tools through a flexible web interface. The advantages of this approach are: 1) flexible access to the designer interface through a simple web browser (e.g. Netscape Navigator); 2) the ability to include existing 'legacy' codes; and 3) the ability to include distributed analysis tools running on remote computers. To date, VAB's internal emphasis has been on developing this test system for the planetary entry mission under the joint Integrated Design System (IDS) program with NASA - Ames and JPL. Georgia Tech's complementary goals this year were to: 1) examine an alternate 'custom' computational architecture for the three-discipline IDS planetary entry problem, to assess its advantages and disadvantages relative to the web-based approach; and 2) develop and examine a web-based interface and framework for a typical launch vehicle design problem.

  6. A new taxonomy for distributed computer systems based upon operating system structure

    NASA Technical Reports Server (NTRS)

    Foudriat, E. C.

    1985-01-01

    Characteristics of the resource structure found in the operating system are considered as a mechanism for classifying distributed computer systems. Since the operating system resources, themselves, are too diversified to provide a consistent classification, the structure upon which resources are built and shared are examined. The location and control character of this indivisibility provides the taxonomy for separating uniprocessors, computer networks, network computers (fully distributed processing systems or decentralized computers) and algorithm and/or data control multiprocessors. The taxonomy is important because it divides machines into a classification that is relevant or important to the client and not the hardware architect. It also defines the character of the kernel O/S structure needed for future computer systems. What constitutes an operating system for a fully distributed processor is discussed in detail.

  7. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, and issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly force scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  8. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities, in the faculties of Engineering, Sciences, Business and Economics, together with higher education in Computing, it is stated that, because of the difficulty of the subject, calculators and computers can be used in Numerical Analysis (NA). In this study, computer-supported learning of NA will be discussed together with important usage of the…

  9. Distributed processor allocation for launching applications in a massively connected processors complex

    DOEpatents

    Pedretti, Kevin

    2008-11-18

    A compute processor allocator architecture for allocating compute processors to run applications in a multiple processor computing apparatus is distributed among a subset of processors within the computing apparatus. Each processor of the subset includes a compute processor allocator. The compute processor allocators can share a common database of information pertinent to compute processor allocation. A communication path permits retrieval of information from the database independently of the compute processor allocators.

  10. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  11. Evidence for the Gompertz curve in the income distribution of Brazil 1978-2005

    NASA Astrophysics Data System (ADS)

    Moura, N. J., Jr.; Ribeiro, M. B.

    2009-01-01

    This work presents an empirical study of the evolution of the personal income distribution in Brazil. Yearly samples available from 1978 to 2005 were studied and evidence was found that the complementary cumulative distribution of personal income for 99% of the economically less favorable population is well represented by a Gompertz curve of the form G(x) = exp[exp(A - Bx)], where x is the normalized individual income. The complementary cumulative distribution of the remaining 1% richest part of the population is well represented by a Pareto power law distribution P(x) = βx^(-α). This result means that similarly to other countries, Brazil's income distribution is characterized by a well defined two class system. The parameters A, B, α, β were determined by a mixture of boundary conditions, normalization and fitting methods for every year in the time span of this study. Since the Gompertz curve is characteristic of growth models, its presence here suggests that these patterns in income distribution could be a consequence of the growth dynamics of the underlying economic system. In addition, we found out that the percentage share of both the Gompertzian and Paretian components relative to the total income shows an approximate cycling pattern with periods of about 4 years and whose maximum and minimum peaks in each component alternate at about every 2 years. This finding suggests that the growth dynamics of Brazil's economic system might possibly follow a Goodwin-type class model dynamics based on the application of the Lotka-Volterra equation to economic growth and cycle.
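
    To make the two-branch form concrete, the sketch below evaluates a complementary cumulative distribution (in percent) that follows the Gompertz curve up to a crossover income and the Pareto power law beyond it. The parameter values are hypothetical, not the paper's fitted values.

      import numpy as np

      # G(0) = 100 (percent) forces A = ln(ln 100); B, alpha, x_c are invented.
      A = np.log(np.log(100.0))    # ~1.53
      B = 0.7                      # Gompertz decay rate
      alpha = 2.6                  # Pareto exponent for the top ~1%
      x_c = 6.0                    # normalized income where the Pareto tail starts

      def G(x):                    # Gompertz branch, in percent
          return np.exp(np.exp(A - B * x))

      beta = G(x_c) * x_c**alpha   # continuity at x_c fixes the Pareto scale

      def ccdf_percent(x):
          # Share of the population (%) with normalized income above x.
          return G(x) if x < x_c else beta * x ** (-alpha)

      for x in [0.0, 1.0, 3.0, 6.0, 12.0]:
          print(f"x = {x:4.1f}: {ccdf_percent(x):6.2f}% earn more")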

  12. 7 CFR 1942.17 - Community facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... economic purposes a single community having a contiguous boundary. (2) Project selection process. The... efficient management and economical service; and/or enlarge, extend, or otherwise modify existing facilities... account for items such as geographic distribution of funds and emergency conditions caused by economic...

  13. 7 CFR 1942.17 - Community facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... economic purposes a single community having a contiguous boundary. (2) Project selection process. The... efficient management and economical service; and/or enlarge, extend, or otherwise modify existing facilities... account for items such as geographic distribution of funds and emergency conditions caused by economic...

  14. 7 CFR 1942.17 - Community facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... economic purposes a single community having a contiguous boundary. (2) Project selection process. The... efficient management and economical service; and/or enlarge, extend, or otherwise modify existing facilities... account for items such as geographic distribution of funds and emergency conditions caused by economic...

  15. 7 CFR 1942.17 - Community facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... economic purposes a single community having a contiguous boundary. (2) Project selection process. The... efficient management and economical service; and/or enlarge, extend, or otherwise modify existing facilities... account for items such as geographic distribution of funds and emergency conditions caused by economic...

  16. Temporal growth and spatial distribution of the fast food industry and its relationship with economic development in China - 2005-2012.

    PubMed

    Xue, Hong; Cheng, Xi; Zhang, Qi; Wang, Huijun; Zhang, Bing; Qu, Weidong; Wang, Youfa

    2017-09-01

    The fast food (FF) industry has expanded rapidly in China during the past two decades, in parallel with an increase in the prevalence of obesity. Using government-reported longitudinal data from 21 provinces and cities in China, this study examined the growth over time and the spatial distribution patterns of the FF industry, as well as the key socioeconomic factors involved. We visualized the temporal and geographic distributions of FF industry development and conducted cross-sectional and longitudinal spatial analyses to assess associations between macroeconomic conditions, population dynamics, and the growth and distributional changes of the industry. The industry has grown faster since 2005 in the southeast coastal (more economically developed) areas than in other regions. The industry was: 1) highly correlated with Gross Domestic Product; 2) highly correlated with per capita disposable income for urban residents; 3) moderately correlated with urban population; and 4) not correlated with an increase in population size. The mean center of the FF industry shifted westward as the mean center of the GDP moved in the same direction, while the mean center of the population shifted eastward. The results suggest that the rapid FF industry expansion in China was closely associated with economic growth and that improving the food environment should be a major component of local economic development planning. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demeure, I.M.

    The research presented here is concerned with representation techniques and tools to support the design, prototyping, simulation, and evaluation of message-based parallel, distributed computations. The author describes ParaDiGM (Parallel, Distributed computation Graph Model), a visual representation technique for parallel, message-based distributed computations. ParaDiGM provides several views of a computation depending on the aspect of concern. It is made of two complementary submodels: the DCPG (Distributed Computing Precedence Graph) model and the PAM (Process Architecture Model) model. DCPGs are precedence graphs used to express the functionality of a computation in terms of tasks, message-passing, and data. PAM graphs are used to represent the partitioning of a computation into schedulable units or processes, and the pattern of communication among those units. There is a natural mapping between the two models. The author illustrates the utility of ParaDiGM as a representation technique by applying it to various computations (e.g., an adaptive global optimization algorithm, the client-server model). ParaDiGM representations are concise. They can be used in documenting the design and implementation of parallel, distributed computations, in describing such computations to colleagues, and in comparing and contrasting various implementations of the same computation. The author then describes VISA (VISual Assistant), a software tool to support the design, prototyping, and simulation of message-based parallel, distributed computations. VISA is based on the ParaDiGM model. In particular, it supports the editing of ParaDiGM graphs to describe the computations of interest, and the animation of these graphs to provide visual feedback during simulations. The graphs are supplemented with various attributes, simulation parameters, and interpretations, which are procedures that can be executed by VISA.
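
    A minimal sketch of the two submodels and their 'natural mapping': a DCPG records task precedence through message/data links, a PAM assigns tasks to schedulable processes, and any DCPG edge that crosses a process boundary becomes inter-process communication. The class and field names are invented stand-ins, not ParaDiGM's actual notation.

      from dataclasses import dataclass, field

      @dataclass
      class DCPG:
          tasks: set = field(default_factory=set)
          edges: set = field(default_factory=set)   # (producer, consumer) links

          def add(self, producer, consumer):
              self.tasks |= {producer, consumer}
              self.edges.add((producer, consumer))

      @dataclass
      class PAM:
          placement: dict = field(default_factory=dict)  # task -> process

          def channels(self, dcpg):
              # The natural mapping between the models: a precedence edge that
              # crosses process boundaries implies inter-process messaging.
              return {(self.placement[a], self.placement[b])
                      for a, b in dcpg.edges
                      if self.placement[a] != self.placement[b]}

      g = DCPG()
      g.add("read", "transform")
      g.add("transform", "write")
      pam = PAM({"read": "p0", "transform": "p0", "write": "p1"})
      print(pam.channels(g))   # {('p0', 'p1')}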

  19. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.
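
    A purely digital sketch of the same measurement, with Gaussian noise standing in for the random signal (illustrative only, not the hybrid system itself): the instantaneous-amplitude distribution comes from a histogram of all samples, and the peak-amplitude distribution from block maxima.

        # Estimate instantaneous- and peak-amplitude distributions of a random signal.
        import numpy as np

        rng = np.random.default_rng(0)
        signal = rng.normal(0.0, 1.0, size=100_000)   # stand-in random signal

        # Instantaneous amplitudes: histogram of every sample.
        inst_pdf, inst_edges = np.histogram(signal, bins=50, density=True)

        # Peak amplitudes: maxima over successive blocks of samples.
        block = 100
        peaks = signal[: len(signal) // block * block].reshape(-1, block).max(axis=1)
        peak_pdf, peak_edges = np.histogram(peaks, bins=50, density=True)

        print("mode of instantaneous amplitudes:", inst_edges[np.argmax(inst_pdf)])
        print("mode of peak amplitudes:", peak_edges[np.argmax(peak_pdf)])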

  20. A distributed computing approach to mission operations support. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1975-01-01

    Computing support for mission operations includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.
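
    The process-to-process communication underpinning this approach can be sketched with Python's multiprocessing queue standing in for the high-bandwidth network (task names are illustrative, not the actual mission software).

        # Two specialized "nodes" exchanging a message through a shared channel.
        from multiprocessing import Process, Queue

        def orbit_determination(out_q):
            out_q.put({"task": "orbit", "state_vector": [7000.0, 0.0, 0.0]})  # toy result

        def attitude_processing(in_q):
            msg = in_q.get()               # blocks until the orbit node reports
            print("attitude node received:", msg)

        if __name__ == "__main__":
            q = Queue()
            procs = [Process(target=orbit_determination, args=(q,)),
                     Process(target=attitude_processing, args=(q,))]
            for p in procs:
                p.start()
            for p in procs:
                p.join()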

  1. Economic Crisis, Technology and the Management of Education: The Case of Distributed Leadership

    ERIC Educational Resources Information Center

    Hartley, David

    2016-01-01

    The 2008 crash has been likened to that of 1929. Does it have consequences for the management of education, and in particular for distributed leadership? Informed by evolutionary economics, it is argued that 2008 marked the end of the installation period of a major technological innovation, namely ICT. In the aftermath of the crash, a period of…

  2. Analysing Relationships Between Urban Land Use Fragmentation Metrics and Socio-Economic Variables

    NASA Astrophysics Data System (ADS)

    Sapena, M.; Ruiz, L. A.; Goerlich, F. J.

    2016-06-01

    Analysing urban regions is essential for their correct monitoring and planning. This is mainly due to the sharp increase in the number of people living in urban areas and, consequently, the need to manage them. At the same time, there has been a rise in the use of spatial and statistical datasets, such as the Urban Atlas, which offers high-resolution urban land use maps obtained from satellite imagery, and the Urban Audit, which provides statistics of European cities and their surroundings. In this study, we analyse the relations between urban fragmentation metrics derived from Land Use and Land Cover (LULC) data from the Urban Atlas dataset and socio-economic data from the Urban Audit for the reference years 2006 and 2012. We conducted the analysis on a sample of sixty-eight Functional Urban Areas (FUAs). One-date and two-date fragmentation indices were computed for each FUA, land use class, and date. Correlation tests and principal component analysis were then applied to select the most representative indices. Finally, multiple regression models were tested to explore the prediction of socio-economic variables, using different combinations of land use metrics as explanatory variables, both at a given date and in a dynamic context. The outcomes show that demography, living conditions, labour, and transportation variables have a clear relation with the morphology of the FUAs. This methodology allows us to compare European FUAs in terms of the spatial distribution of the land use classes, their complexity, and their structural changes, as well as to preview and model different growth patterns and socio-economic indicators.
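
    A compressed sketch of the pipeline just described, with synthetic data standing in for the Urban Atlas fragmentation indices and Urban Audit variables (all sizes and names illustrative): correlated indices are reduced by principal component analysis and a socio-economic variable is regressed on the leading components.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(42)
        n_fuas, n_indices = 68, 12                 # 68 FUAs, 12 fragmentation indices
        X = rng.normal(size=(n_fuas, n_indices))   # stand-in land use metrics
        # Stand-in socio-economic variable driven by a few of the indices:
        y = 2.0 * X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=n_fuas)

        pca = PCA(n_components=4).fit(X)           # keep the most representative axes
        components = pca.transform(X)
        model = LinearRegression().fit(components, y)

        print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
        print("R^2 of the socio-economic model:", round(model.score(components, y), 2))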

  3. A distributed system for fast alignment of next-generation sequencing data.

    PubMed

    Srimani, Jaydeep K; Wu, Po-Yen; Phan, John H; Wang, May D

    2010-12-01

    We developed a scalable distributed computing system using the Berkeley Open Interface for Network Computing (BOINC) to align next-generation sequencing (NGS) data quickly and accurately. NGS technology is emerging as a promising platform for gene expression analysis due to its high sensitivity compared to traditional genomic microarray technology. However, despite the benefits, NGS datasets can be prohibitively large, requiring significant computing resources to obtain sequence alignment results. Moreover, as the data and alignment algorithms become more prevalent, it will become necessary to examine the effect of the multitude of alignment parameters on various NGS systems. We validate the distributed software system by (1) computing simple timing results to show the speed-up gained by using multiple computers, (2) optimizing alignment parameters using simulated NGS data, and (3) computing NGS expression levels for a single biological sample using optimal parameters and comparing these expression levels to those of a microarray sample. Results indicate that the distributed alignment system achieves an approximately linear speed-up and correctly distributes sequence data to and gathers alignment results from multiple compute clients.
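
    The timing results in step (1) rest on the classic speed-up and parallel-efficiency definitions; a small sketch with hypothetical wall-clock times (not the paper's measurements):

        def speedup_and_efficiency(t_serial, t_parallel, n_clients):
            """S = T1 / Tn and E = S / n; E near 1 means near-linear speed-up."""
            s = t_serial / t_parallel
            return s, s / n_clients

        for n, t in [(1, 120.0), (4, 32.0), (8, 17.0)]:   # hypothetical hours
            s, e = speedup_and_efficiency(120.0, t, n)
            print(f"{n:2d} clients: speed-up {s:4.1f}x, efficiency {e:.2f}")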

  4. Application of computational statistical physics to scale invariance and universality in economic phenomena

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Amaral, L. A. N.; Gopikrishnan, P.; Plerou, V.; Salinger, M. A.

    2002-06-01

    This paper discusses some of the similarities between work being done by economists and by computational physicists seeking to contribute to economics. We also mention some of the differences in the approaches taken and seek to justify these different approaches by developing the argument that by approaching the same problem from different points of view, new results might emerge. In particular, we review two such new results. Specifically, we discuss two newly-discovered scaling results that appear to be "universal", in the sense that they hold for widely different economies as well as for different time periods: (i) the fluctuation of price changes of any stock market is characterized by a probability density function (PDF) which is a simple power law with exponent -4, extending over 10^2 standard deviations (a factor of 10^8 on the y-axis); this result is analogous to the Gutenberg-Richter power law describing the histogram of earthquakes of a given strength; (ii) for a wide range of economic organizations, fluctuations in organization size are inversely correlated with size, with an exponent ≈0.2. Neither of these two new empirical laws has a firm theoretical foundation. We also discuss results that are reminiscent of phase transitions in spin systems, where the divergent behavior of the response function at the critical point (zero magnetic field) leads to large fluctuations. We discuss a curious "symmetry breaking" for values of Σ above a certain threshold value Σc; here Σ is defined to be the local first moment of the probability distribution of demand Ω, the difference between the number of shares traded in buyer-initiated and seller-initiated trades. This feature is qualitatively identical to the behavior of the probability density of the magnetization for fixed values of the inverse temperature.
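
    The tail exponent in result (i) is the kind of quantity a Hill estimator recovers from normalized price changes; a sketch with synthetic Student-t returns (tail index near 3, hence a PDF exponent near -4) standing in for market data:

        import numpy as np

        rng = np.random.default_rng(1)
        returns = rng.standard_t(df=3, size=200_000)
        g = np.abs(returns - returns.mean()) / returns.std()   # normalized fluctuations

        # Hill estimator over the k largest order statistics: alpha is the tail
        # index of P(|g| > x) ~ x^-alpha; the PDF exponent is then -(alpha + 1).
        k = 2000
        xs = np.sort(g)
        threshold = xs[-(k + 1)]
        alpha = 1.0 / np.mean(np.log(xs[-k:] / threshold))
        print(f"tail index alpha = {alpha:.2f}, PDF exponent = {-(alpha + 1):.2f}")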

  5. Karlynn Cory | NREL

    Science.gov Websites

    Science.gov staff-profile residue; recoverable content: research interests in clean energy project financing, renewable energy techno-economic analysis, and distributed energy, plus the report Distributed Energy Future: Volume II, A Case Study of Integrated Distributed Energy Resource Planning.

  6. The Newcomb-Benford law in its relation to some common distributions.

    PubMed

    Formann, Anton K

    2010-05-07

    An often reported, but nevertheless persistently striking observation, formalized as the Newcomb-Benford law (NBL), is that the frequencies with which the leading digits of numbers occur in a large variety of data are far from uniform. Most spectacular seems to be the fact that in many data the leading digit 1 occurs in nearly one third of all cases. Explanations for this uneven distribution of the leading digits were, among others, scale- and base-invariance. Little attention, however, has been paid to the interrelation between the distribution of the significant digits and the distribution of the observed variable. It is shown here by simulation that long right-tailed distributions of a random variable are compatible with the NBL, and that for distributions of the ratio of two random variables the fit generally improves. Distributions not putting most mass on small values of the random variable (e.g. symmetric distributions) fail to fit. Hence, the validity of the NBL requires the predominance of small values and, when thinking of real-world data, a majority of small entities. Analyses of data on stock prices, the areas and numbers of inhabitants of countries, and the starting page numbers of papers from a bibliography sustain this conclusion. In all, these findings may help to understand the mechanisms behind the NBL and the conditions needed for its validity. That this law is not only of scientific interest per se but also has substantial implications can be seen from the fields where it has been suggested to be put into practice. These fields range from the detection of irregularities in data (e.g. economic fraud) to optimizing the architecture of computers regarding number representation, storage, and round-off errors.
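
    The simulation argument is easy to reproduce: the leading digits of a long right-tailed variable (here lognormal) approach the Newcomb-Benford frequencies log10(1 + 1/d), while a variable without a predominance of small values does not. A sketch with illustrative parameters:

        import numpy as np

        def leading_digit_freq(x):
            digits = np.array([int(str(abs(v)).lstrip("0.")[0]) for v in x if v != 0])
            return np.bincount(digits, minlength=10)[1:10] / len(digits)

        rng = np.random.default_rng(7)
        benford = np.log10(1 + 1 / np.arange(1, 10))
        long_tailed = leading_digit_freq(rng.lognormal(mean=0, sigma=2, size=50_000))
        no_small = leading_digit_freq(rng.uniform(1, 10, size=50_000))

        for d in range(9):
            print(f"digit {d + 1}: Benford {benford[d]:.3f}  "
                  f"lognormal {long_tailed[d]:.3f}  uniform {no_small[d]:.3f}")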

  7. Generalization of the Lord-Wingersky Algorithm to Computing the Distribution of Summed Test Scores Based on Real-Number Item Scores

    ERIC Educational Resources Information Center

    Kim, Seonghoon

    2013-01-01

    With known item response theory (IRT) item parameters, Lord and Wingersky provided a recursive algorithm for computing the conditional frequency distribution of number-correct test scores, given proficiency. This article presents a generalized algorithm for computing the conditional distribution of summed test scores involving real-number item…
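
    For dichotomous items, the original Lord-Wingersky recursion (which the article above generalizes to real-number item scores) builds the conditional number-correct distribution one item at a time; a minimal sketch with illustrative item probabilities:

        def lord_wingersky(p_correct):
            """Distribution of the number-correct score at a fixed proficiency,
            given each item's probability of a correct response."""
            dist = [1.0]                            # after zero items: score 0
            for p in p_correct:
                new = [0.0] * (len(dist) + 1)
                for score, prob in enumerate(dist):
                    new[score] += prob * (1 - p)    # item answered incorrectly
                    new[score + 1] += prob * p      # item answered correctly
                dist = new
            return dist

        print(lord_wingersky([0.8, 0.6, 0.4]))      # P(score = 0..3), sums to 1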

  8. ELINT Signal Processing Using Choi-Williams Distribution on Reconfigurable Computers for Detection and Classification of LPI Emitters

    DTIC Science & Technology

    2008-03-01

    Front-matter extraction residue (acronym list, acknowledgments, and table of contents); the recoverable entries cover the Wigner-Ville distribution, the Choi-Williams distribution, and a table of C code output for the Wigner-Ville distribution, with acknowledgments to David Caliga of SRC Computer.

  9. Measuring Changes in Economic Well-Being. Changing Domestic Priorities Discussion Paper.

    ERIC Educational Resources Information Center

    Moon, Marilyn

    This paper examines the distribution of economic well-being from 1980 to 1984, and compares economic changes during that period with those of other periods. The following indicators of economic change are used: (1) money income consistent with a Census definition; (2) money income net of direct taxes--i.e., disposable income; and (3) disposable…

  10. Modeling the Economic Impacts of Large Deployments on Local Communities

    DTIC Science & Technology

    2008-12-01

    Title-page extraction residue for an AFIT master's thesis (report number AFIT/GCA/ENV/08-D01) by Aaron L..., presented to the faculty of the Department of Systems Engineering; approved for public release, distribution unlimited.

  11. Project Economic Stew: A Study of Poultry and Rice. A Third-grade Economics Project [and] A Bird's Eye View of an Economic Stew: A Study of Poultry and Rice Production in Arkansas.

    ERIC Educational Resources Information Center

    Fox, Penny

    An economics project for third grade children is described and lessons for teaching basic economic concepts are provided. In the first semester, students studied basic economic concepts; in the second semester, they learned about the origin, production, and distribution of rice and poultry and how these products affect the local and state…

  12. Rigorous Proof of the Boltzmann-Gibbs Distribution of Money on Connected Graphs

    NASA Astrophysics Data System (ADS)

    Lanchier, Nicolas

    2017-04-01

    Models in econophysics, i.e., the emerging field of statistical physics that applies the main concepts of traditional physics to economics, typically consist of large systems of economic agents who are characterized by the amount of money they have. In the simplest model, at each time step, one agent gives one dollar to another agent, with both agents being chosen independently and uniformly at random from the system. Numerical simulations of this model suggest that, at least when the number of agents and the average amount of money per agent are large, the distribution of money converges to an exponential distribution reminiscent of the Boltzmann-Gibbs distribution of energy in physics. The main objective of this paper is to give a rigorous proof of this result and show that the convergence to the exponential distribution holds more generally when the economic agents are located on the vertices of a connected graph and interact locally with their neighbors rather than globally with all the other agents. We also study a closely related model where, at each time step, agents buy with a probability proportional to the amount of money they have, and prove that in this case the limiting distribution of money is Poissonian.
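
    The simplest model above is straightforward to simulate; a sketch (illustrative sizes) comparing the empirical money distribution against the exponential Boltzmann-Gibbs form with temperature equal to the average money per agent:

        import numpy as np

        rng = np.random.default_rng(3)
        n_agents, avg_money, steps = 5_000, 10, 1_000_000
        money = np.full(n_agents, avg_money)

        pairs = rng.integers(n_agents, size=(steps, 2))
        for i, j in pairs:                     # agent i gives one dollar to agent j
            if i != j and money[i] > 0:
                money[i] -= 1
                money[j] += 1

        counts = np.bincount(money, minlength=31)
        print("P(m) empirical vs (1/T) exp(-m/T), T = average money:")
        for m in range(0, 30, 5):
            print(f"m={m:2d}: {counts[m] / n_agents:.4f} vs "
                  f"{np.exp(-m / avg_money) / avg_money:.4f}")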

  13. The distribution of common construction materials at risk to acid deposition in the United States

    NASA Astrophysics Data System (ADS)

    Lipfert, Frederick W.; Daum, Mary L.

    Information on the geographic distribution of various types of exposed materials is required to estimate the economic costs of damage to construction materials from acid deposition. This paper focuses on the identification, evaluation and interpretation of data describing the distributions of exterior construction materials, primarily in the United States. This information could provide guidance on how data needed for future economic assessments might be acquired in the most cost-effective ways. Materials distribution surveys from 16 cities in the U.S. and Canada and five related databases from government agencies and trade organizations were examined. Data on residential buildings are more commonly available than on nonresidential buildings; little geographically resolved information on distributions of materials in infrastructure was found. Survey results generally agree with the appropriate ancillary databases, but the usefulness of the databases is often limited by their coarse spatial resolution. Information on those materials which are most sensitive to acid deposition is especially scarce. Since a comprehensive error analysis has never been performed on the data required for an economic assessment, it is not possible to specify the corresponding detailed requirements for data on the distributions of materials.

  14. Sea/Lake Water Air Conditioning at Naval Facilities.

    DTIC Science & Technology

    1980-05-01

    Table-of-contents extraction residue precedes the recoverable abstract fragment: the report covers economics at two facilities (including computer models), an operational test at Naval Security Group Activity (NSGA) Winter Harbor, Me., and the economics of Navywide application. In FY76 an assessment of the economics of Navywide application of sea/lake water AC indicated that cost and energy savings at the sites of some Naval facilities are possible, depending...

  15. A quantum–quantum Metropolis algorithm

    PubMed Central

    Yung, Man-Hong; Aspuru-Guzik, Alán

    2012-01-01

    The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
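
    For reference, the classical Metropolis step that the quantum algorithm generalizes takes only a few lines; a sketch sampling the thermal distribution of a toy double-well energy (all parameters illustrative):

        import math
        import random

        def metropolis(energy, beta, steps=100_000, step_size=0.5):
            x, samples = 0.0, []
            for _ in range(steps):
                proposal = x + random.uniform(-step_size, step_size)
                d_e = energy(proposal) - energy(x)
                # Accept with probability min(1, exp(-beta * dE)).
                if d_e <= 0 or random.random() < math.exp(-beta * d_e):
                    x = proposal
                samples.append(x)
            return samples

        random.seed(0)
        double_well = lambda x: (x**2 - 1.0) ** 2
        samples = metropolis(double_well, beta=3.0)
        print("mean energy:", sum(double_well(x) for x in samples) / len(samples))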

  16. A parabolic velocity-decomposition method for wind turbines

    NASA Astrophysics Data System (ADS)

    Mittal, Anshul; Briley, W. Roger; Sreenivas, Kidambi; Taylor, Lafayette K.

    2017-02-01

    An economical parabolized Navier-Stokes approximation for steady incompressible flow is combined with a compatible wind turbine model to simulate wind turbine flows, both upstream of the turbine and in downstream wake regions. The inviscid parabolizing approximation is based on a Helmholtz decomposition of the secondary velocity vector and physical order-of-magnitude estimates, rather than an axial pressure gradient approximation. The wind turbine is modeled by distributed source-term forces incorporating time-averaged aerodynamic forces generated by a blade-element momentum turbine model. A solution algorithm is given whose dependent variables are streamwise velocity, streamwise vorticity, and pressure, with secondary velocity determined by two-dimensional scalar and vector potentials. In addition to laminar and turbulent boundary-layer test cases, solutions for a streamwise vortex-convection test problem are assessed by mesh refinement and comparison with Navier-Stokes solutions using the same grid. Computed results for a single turbine and a three-turbine array are presented using the NREL offshore 5-MW baseline wind turbine. These are also compared with an unsteady Reynolds-averaged Navier-Stokes solution computed with full rotor resolution. On balance, the agreement in turbine wake predictions for these test cases is very encouraging given the substantial differences in physical modeling fidelity and computer resources required.

  17. Computers in the Forest: A Summer Alternative. A Description and Evaluation of the Nature Computer Camp.

    ERIC Educational Resources Information Center

    Prom, Sukai; And Others

    The District of Columbia's Nature Computer Camp program, described and evaluated in this paper, was designed to reduce the geographical isolation of economically disadvantaged urban sixth graders, and to provide them with increased knowledge of the environmental and computer sciences. The paper begins by giving details of the program's management,…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klitsner, Tom

    The recent Executive Order creating the National Strategic Computing Initiative (NSCI) recognizes the value of high performance computing for economic competitiveness and scientific discovery and commits to accelerate delivery of exascale computing. The HPC programs at Sandia – the NNSA ASC program and Sandia’s Institutional HPC Program – are focused on ensuring that Sandia has the resources necessary to deliver computation in the national interest.

  19. Technical economics in the power industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dicks, J.B.

    1990-01-01

    This paper discusses technical economics, an emerging subject distinct from such areas as engineering management. The subject of engineering economics will be vital in the coming years, as the world economy is about to undergo an economic expansion driven by the effect of the computer on every area of technology, together with the simultaneous enlargement of the world market through the opening and democratization of the Communist bloc and the European Common Market.

  20. China Pakistan Economic Corridor (CPEC): Challenges and the Way Forward

    DTIC Science & Technology

    2017-06-01

    Title-page and report-documentation extraction residue. Recoverable details: Master's thesis by Muzaffar Hussain, June 2017; thesis advisor Robert E. Looney; distribution is unlimited. Recoverable abstract opening: "The China-Pakistan Economic Corridor (CPEC)—the latest venture in a history of bilateral economic..."

  1. Beyond input-output computings: error-driven emergence with parallel non-distributed slime mold computer.

    PubMed

    Aono, Masashi; Gunji, Yukio-Pegio

    2003-10-01

    Emergence derived from errors is of key importance for both novel computing and novel usage of the computer. In this paper, we propose an implementable experimental plan for biological computing that elicits the emergent properties of complex systems. An individual plasmodium of the true slime mold Physarum polycephalum acts as the slime mold computer. Modifying the Elementary Cellular Automaton so that it entails the global synchronization problem of parallel computing yields the NP-complete problem to be solved by the slime mold computer. We discuss the possibility of solving the problem without supplying either all possible results or an explicit prescription for solution-seeking. In slime mold computing, the distributivity of the local computing logic can change dynamically, and its parallel non-distributed computing cannot be reduced to the spatial addition of multiple serial computations. A computing system premised on the exhaustive absence of a super-system may produce something more than filling the vacancy.

  2. Ethics, economics, and public financing of health care

    PubMed Central

    Hurley, J.

    2001-01-01

    There is a wide variety of ethical arguments for public financing of health care that share a common structure built on a series of four logically related propositions regarding: (1) the ultimate purpose of a human life or human society; (2) the role of health and its distribution in society in advancing this ultimate purpose; (3) the role of access to or utilisation of health care in maintaining or improving the desired level and distribution of health among members of society, and (4) the role of public financing in ensuring the ethically justified access to and utilisation of health care by members of society. This paper argues that economics has much to contribute to the development of the ethical foundations for publicly financed health care. It focuses in particular on recent economic work to clarify the concepts of access and need and their role in analyses of the just distribution of health care resources, and on the importance of economic analysis of health care and health care insurance markets in demonstrating why public financing is necessary to achieve broad access to and utilisation of health care services. Key Words: Ethics • economics • health care financing PMID:11479353

  3. [Socio-economic aspects of epidemiology of helicobateriosis].

    PubMed

    Fedichkina, T P; Solenova, L G; Zykova, I E; German, S V; Modestova, A V; Kislitsyn, V A; Rakhmanin, Yu A; Bobrovnitsky, I P

    This article considers social and economic aspects of the epidemiology of Helicobacter pylori. These aspects have acquired particular importance in recent years because the provision of people with pure water has become a focus of geopolitical and socio-economic interest in a number of countries. The availability of pure drinking water serves as a marker of the socio-economic state of a territory and of the population living there. In Russia, where diverse climatic conditions are compounded by considerable regional differences in communal services, caused by varying levels of socio-economic development, the supply of pure drinking water serves as a social determinant of the ecological conditions of the population's life. This applies particularly to the unsound technical state of water distribution systems, whose microbial ecology can substantially affect public health. A screening performed by the authors of current data presented on the official web site of the joint-stock company «Mosvodokanal», covering the drinking water consumed by 2,500 Muscovites tested for Helicobacter pylori infection, revealed no deviations from the sanitary standards in the water received by consumers. At the same time, a comparison of maps of the distribution of Helicobacter pylori infection in Moscow with the distribution of citizens' complaints about declining tap water quality revealed a territorial coincidence between high rates of H. pylori infection and the urban sites with the greatest number of complaints. The microbial ecology of water distribution systems is tightly bound up with their epidemiological safety, their technical state, and the economic damage caused by corrosion resulting from microbiotic activity. In contrast to acute bacterial and viral infections, which are deemed most important when assessing the sanitary condition of water sources and distribution systems, the consequences of infection with H. pylori may not be manifested for a long time; years later they may appear as serious chronic diseases (from gastritis to adenocarcinoma of the stomach and a wide range of extraintestinal pathologies), causing great social and economic losses. Thus, the socio-economic aspect of the epidemiology of helicobacteriosis includes at least two components: the technical, maintaining a sound technical and sanitary state of the water distribution systems, and the medico-social, expenditures for screening and treatment of infected patients. Together they are an inseparable part of the prevention of socially important diseases in the public health system.

  4. Persistence in a Random Bond Ising Model of Socio-Econo Dynamics

    NASA Astrophysics Data System (ADS)

    Jain, S.; Yamano, T.

    We study the persistence phenomenon in a socio-econo dynamics model using computer simulations at a finite temperature on hypercubic lattices in dimensions up to five. The model includes a "social" local field which contains the magnetization at time t. The nearest neighbour quenched interactions are drawn from a binary distribution which is a function of the bond concentration, p. The decay of the persistence probability in the model depends on both the spatial dimension and p. We find no evidence of "blocking" in this model. We also discuss the implications of our results for possible applications in the social and economic fields. It is suggested that the absence, or otherwise, of blocking could be used as a criterion to decide on the validity of a given model in different scenarios.
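
    Persistence here is the fraction of spins that have never flipped up to time t. A stripped-down sketch (two-dimensional Glauber dynamics with quenched binary bonds; the study's global "social" field and higher dimensions are omitted) shows how the quantity is measured:

        import numpy as np

        rng = np.random.default_rng(5)
        L, p, beta, sweeps = 64, 0.5, 1.0, 50
        spins = rng.choice([-1, 1], size=(L, L))
        # Quenched bonds to the right and downward neighbours: +1 with probability p.
        Jr = rng.choice([-1, 1], p=[1 - p, p], size=(L, L))
        Jd = rng.choice([-1, 1], p=[1 - p, p], size=(L, L))
        never_flipped = np.ones((L, L), dtype=bool)

        for _ in range(sweeps):
            for i in range(L):
                for j in range(L):
                    h = (Jr[i, j] * spins[i, (j + 1) % L] + Jr[i, j - 1] * spins[i, j - 1]
                         + Jd[i, j] * spins[(i + 1) % L, j] + Jd[i - 1, j] * spins[i - 1, j])
                    d_e = 2.0 * spins[i, j] * h
                    if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
                        spins[i, j] = -spins[i, j]
                        never_flipped[i, j] = False

        print(f"persistence probability after {sweeps} sweeps: {never_flipped.mean():.3f}")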

  5. Estimation of finite mixtures using the empirical characteristic function

    NASA Technical Reports Server (NTRS)

    Anderson, C.; Boullion, T.

    1985-01-01

    A problem which occurs in analyzing LANDSAT scenes is that of separating the components of a finite mixture of several distinct probability distributions. A review of the literature indicates this is a problem which occurs in many disciplines, such as engineering, biology, physiology and economics. Many approaches to this problem have appeared in the literature; however, most are very restrictive in their assumptions or have met with only a limited degree of success when applied to realistic situations. A procedure is investigated which combines the k-L procedure of Feuerverger and McDunnough (1981) with the MAICE procedure of Akaike (1974). The feasibility of this approach is investigated numerically via the development of a computer software package enabling a simulation study and comparison with other procedures.
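
    The empirical characteristic function at the core of the k-L procedure is a one-liner; a sketch (synthetic two-component normal mixture with illustrative parameters) comparing it against the model characteristic function:

        import numpy as np

        def ecf(x, t_grid):
            """phi_hat(t) = (1/n) * sum_j exp(i * t * x_j) on a grid of t values."""
            return np.exp(1j * np.outer(t_grid, x)).mean(axis=1)

        rng = np.random.default_rng(11)
        # Stand-in for two spectral classes in a LANDSAT band:
        x = np.concatenate([rng.normal(0, 1, 700), rng.normal(4, 0.5, 300)])

        t = np.linspace(-3, 3, 13)
        phi_hat = ecf(x, t)
        # Model CF of the true mixture; CF of N(mu, s^2) is exp(i*mu*t - s^2 t^2 / 2).
        phi = 0.7 * np.exp(-0.5 * t**2) + 0.3 * np.exp(1j * 4 * t - 0.125 * t**2)
        print("max |phi_hat - phi|:", np.abs(phi_hat - phi).max())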

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letsoalo, A.; Blignaut, J.; de Wet, T.

    The South African government is exploring ways to address water scarcity problems by introducing a water resource management charge on the quantity of water used in sectors such as irrigated agriculture, mining, and forestry. It is expected that a more efficient water allocation, lower use, and a positive impact on poverty can be achieved. This paper reports on the validity of these claims by applying a computable general equilibrium model to analyze the triple dividend of water consumption charges in South Africa: reduced water use, more rapid economic growth, and a more equal income distribution. It is shown that an appropriate budget-neutral combination of water charges, particularly on irrigated agriculture and coal mining, and reduced indirect taxes, particularly on food, would yield triple dividends, that is, less water use, more growth, and less poverty.

  7. A quantitative assessment of the Hadoop framework for analyzing massively parallel DNA sequencing data.

    PubMed

    Siretskiy, Alexey; Sundqvist, Tore; Voznesenskiy, Mikhail; Spjuth, Ola

    2015-01-01

    New high-throughput technologies, such as massively parallel sequencing, have transformed the life sciences into a data-intensive field. The most common e-infrastructure for analyzing this data consists of batch systems that are based on high-performance computing resources; however, the bioinformatics software that is built on this platform does not scale well in the general case. Recently, the Hadoop platform has emerged as an interesting option to address the challenges of increasingly large datasets with distributed storage, distributed processing, built-in data locality, fault tolerance, and an appealing programming methodology. In this work we introduce metrics and report on a quantitative comparison between Hadoop and a single node of conventional high-performance computing resources for the tasks of short read mapping and variant calling. We calculate efficiency as a function of data size and observe that the Hadoop platform is more efficient for biologically relevant data sizes in terms of computing hours for both split and un-split data files. We also quantify the advantages of the data locality provided by Hadoop for NGS problems, and show that a classical architecture with network-attached storage will not scale when computing resources increase in number. Measurements were performed using ten datasets of different sizes, up to 100 gigabases, using the pipeline implemented in Crossbow. To make a fair comparison, we implemented an improved preprocessor for Hadoop with better performance for splittable data files. For improved usability, we implemented a graphical user interface for Crossbow in a private cloud environment using the CloudGene platform. All of the code and data in this study are freely available as open source in public repositories. From our experiments we can conclude that the improved Hadoop pipeline scales better than the same pipeline on high-performance computing resources; we also conclude that Hadoop is an economically viable option for the common data sizes that are currently used in massively parallel sequencing. Given that datasets are expected to increase over time, Hadoop is a framework that we envision will have an increasingly important role in future biological data analysis.

  8. 10 CFR 590.105 - Computation of time.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 10 (Energy), vol. 4 (2010-01-01), § 590.105 Computation of time. DEPARTMENT OF ENERGY (CONTINUED), NATURAL GAS (ECONOMIC REGULATORY ADMINISTRATION), ADMINISTRATIVE PROCEDURES WITH RESPECT TO THE IMPORT AND EXPORT OF NATURAL GAS, General Provisions, § 590.105 Computation of time...

  9. Secondary Computer-Based Instruction in Microeconomics: Cognitive and Affective Issues.

    ERIC Educational Resources Information Center

    Lasnik, Vincent E.

    This paper describes the general rationale, hypotheses, methodology, findings and implications of a recent dissertation research project conducted in the Columbus, Ohio, public schools. The computer-based study investigated the simultaneous relationship between achievement in microeconomics and attitude toward economics, level of computer anxiety,…

  10. 10 CFR 590.105 - Computation of time.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 10 (Energy), vol. 4 (2011-01-01), § 590.105 Computation of time. DEPARTMENT OF ENERGY (CONTINUED), NATURAL GAS (ECONOMIC REGULATORY ADMINISTRATION), ADMINISTRATIVE PROCEDURES WITH RESPECT TO THE IMPORT AND EXPORT OF NATURAL GAS, General Provisions, § 590.105 Computation of time...

  11. Exploring the Issues: Humans and Computers.

    ERIC Educational Resources Information Center

    Walsh, Huber M.

    This presentation addresses three basic social issues generated by the computer revolution. The first section, "Money Matters," focuses on the economic effects of computer technology. These include the replacement of workers by fully automated machines, the threat to professionals posed by expanded access to specialized information, and the…

  12. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  13. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  14. Solution of quadratic matrix equations for free vibration analysis of structures.

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1973-01-01

    An efficient digital computer procedure and the related numerical algorithm are presented herein for the solution of quadratic matrix equations associated with free vibration analysis of structures. Such a procedure enables accurate and economical analysis of natural frequencies and associated modes of discretized structures. The numerically stable algorithm is based on the Sturm sequence method, which fully exploits the banded form of associated stiffness and mass matrices. The related computer program written in FORTRAN V for the JPL UNIVAC 1108 computer proves to be substantially more accurate and economical than other existing procedures of such analysis. Numerical examples are presented for two structures - a cantilever beam and a semicircular arch.
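
    The Sturm sequence property exploited above says that the number of negative pivots in the LDL^T factorization of (A - shift*I) equals the number of eigenvalues below the shift. A sketch for a symmetric tridiagonal (banded) matrix, checked against a dense eigensolver (toy matrix, not the paper's structures):

        import numpy as np

        def count_eigenvalues_below(diag, off, shift):
            """Eigenvalues of the symmetric tridiagonal matrix below `shift`."""
            count, q = 0, 1.0
            for i in range(len(diag)):
                e2 = off[i - 1] ** 2 if i > 0 else 0.0
                q = (diag[i] - shift) - e2 / q
                if q == 0.0:
                    q = 1e-300      # guard against an exact eigenvalue of a minor
                if q < 0.0:
                    count += 1
            return count

        d = np.full(6, 2.0)         # toy stiffness-like matrix, 2 on the diagonal
        e = np.full(5, -1.0)        # -1 on the off-diagonals
        print(count_eigenvalues_below(d, e, 1.0))   # -> 2 eigenvalues below 1.0
        print(np.sort(np.linalg.eigvalsh(np.diag(d) + np.diag(e, 1) + np.diag(e, -1))))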

  15. The price of alcohol: a consideration of contextual factors.

    PubMed

    Treno, Andrew J; Gruenewald, Paul J; Wood, Darryl S; Ponicki, William R

    2006-10-01

    The current study considers the determinants of prices charged for alcoholic beverages by on-premise and off-premise outlets in Alaska. Alcohol outlet densities, a surrogate measure for local retail competition, are expected to be negatively associated with prices while costs associated with distribution are expected to be positively related to prices. Community demographic and economic characteristics may affect observed local prices via the level of demand, retail costs borne by retailers, or the quality of brands offered for sale. The core data for these analyses came from a telephone survey of Alaskan retail establishments licensed to serve alcohol. This survey utilized computer-assisted telephone interviewing (CATI) techniques to collect alcohol-pricing information from on-premise (i.e., establishments where alcohol is consumed at the point of purchase such as bars and restaurants) and off-premise (i.e., establishments such as grocery stores and convenience markets where consumption occurs in other locations) alcohol retailers throughout the state of Alaska. Price estimates were developed for each beverage-type based on alcohol content. Separate regression analyses were used to model each of the 8 price indices (on-premise and off-premise measures for beer, spirits, wine, and the average price across beverage types). All regressions also controlled for a set of zip-code level indicators of community economic and demographic characteristics based on census data. Outlet density per roadway mile was unrelated to price for both on- and off-premise establishments, either across or between beverage types. In contrast, overall distribution costs did appear to be related to alcohol price. The demographic and economic variables, as a group, were significantly related to observed prices. More attention needs to be directed to the manner in which sellers and buyers behave relative to alcoholic beverages. Alcohol demand remains responsive to prices; yet, consumers have considerable latitude in determining the price that they pay for alcohol.

  16. Computed tomography imaging in the management of headache in the emergency department: cost efficacy and policy implications.

    PubMed

    Jordan, Yusef J; Lightfoote, Johnson B; Jordan, John E

    2009-04-01

    To evaluate the economic impact and diagnostic utility of computed tomography (CT) in the management of emergency department (ED) patients presenting with headache and nonfocal physical examinations. Computerized medical records from 2 major community hospitals were retrospectively reviewed of patients presenting with headache over a 2.5-year period (2003-2006). A model was developed to assess test outcomes, CT result costs, and average institutional costs of the ED visit. The binomial probabilistic distribution of expected maximum cases was also calculated. Of the 5510 patient records queried, 882 (16%) met the above criteria. Two hundred eighty-one patients demonstrated positive CT findings (31.8%), but only 9 (1.02%) demonstrated clinically significant results (requiring a change in management). Most positive studies were incidental, including old infarcts, chronic ischemic changes, encephalomalacia, and sinusitis. The average cost of the head CT exam and ED visit was $764 (2006 dollars). This was approximately 3 times the cost of a routine outpatient visit (plus CT) for headache ($253). The incremental cost per clinically significant case detected in the ED was $50,078. The calculated expected maximum number of clinically significant positive cases was almost 50% lower than what was actually detected. Our results indicate that emergent CT imaging of nonfocal headache yields a low percentage of positive clinically significant results, and has limited cost efficacy. Since the use of CT for imaging patients with headache in the ED is widespread, the economic implications are considerable. Health policy reforms are indicated to better direct utilization in these patients.
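
    The quoted incremental cost is consistent with spreading the ED-versus-outpatient cost difference over the clinically significant cases; a back-of-the-envelope reconstruction (our reading of the figures, not necessarily the paper's exact model):

        patients = 882
        significant = 9
        ed_visit_with_ct = 764.0       # average ED visit plus head CT, 2006 dollars
        outpatient_with_ct = 253.0     # routine outpatient visit plus CT

        incremental = (ed_visit_with_ct - outpatient_with_ct) * patients / significant
        print(f"incremental cost per significant case: ${incremental:,.0f}")   # $50,078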

  17. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    PubMed

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
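
    A minimal sketch of the first (non-parametric bootstrap) approach, with synthetic Weibull time-to-event data standing in for individual patient data (all parameters illustrative): each resample yields a correlated draw of the distribution's parameters.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        times = stats.weibull_min.rvs(c=1.4, scale=12.0, size=100, random_state=rng)

        boot = []
        for _ in range(500):
            resample = rng.choice(times, size=len(times), replace=True)
            shape, _, scale = stats.weibull_min.fit(resample, floc=0)
            boot.append((shape, scale))            # correlated parameter draws

        shapes, scales = np.array(boot).T
        for name, v in (("shape", shapes), ("scale", scales)):
            lo, hi = np.percentile(v, [2.5, 97.5])
            print(f"{name}: {v.mean():.2f} (95% interval {lo:.2f} to {hi:.2f})")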

  18. LXtoo: an integrated live Linux distribution for the bioinformatics community

    PubMed Central

    2012-01-01

    Background Recent advances in high-throughput technologies dramatically increase biological data generation. However, many research groups lack computing facilities and specialists. This is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Findings Unlike most of the existing live Linux distributions for bioinformatics limiting their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high performance computing. Conclusions LXtoo aims to provide well-supported computing environment tailored for bioinformatics research, reducing duplication of efforts in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo. PMID:22813356

  19. LXtoo: an integrated live Linux distribution for the bioinformatics community.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Meng, Xiao-Hua; He, Qing-Yu

    2012-07-19

    Recent advances in high-throughput technologies dramatically increase biological data generation. However, many research groups lack computing facilities and specialists. This is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Unlike most of the existing live Linux distributions for bioinformatics limiting their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high performance computing. LXtoo aims to provide well-supported computing environment tailored for bioinformatics research, reducing duplication of efforts in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo.

  20. Regional scale landslide risk assessment with a dynamic physical model - development, application and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh

    2013-04-01

    Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At the moment, few deterministic models exist that can do all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of the models to compute the displacement with a large number of individual initiation areas (computationally exhaustive). This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply it to other sites, since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, intense rainfall and a combination of both (spatial and temporal). The run-out module of the model considers the flow as a 2-D continuum medium solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment; it is computationally efficient and transparent (understandable and comprehensible) for the end-user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity. The obtained final outcome is the estimation of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No 265138 New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
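
    The Monte Carlo scheme can be illustrated with a deliberately simplified initiation criterion, an infinite-slope factor of safety, standing in for the paper's much richer model (all distributions illustrative): sample the site properties and collect the failure probability.

        import numpy as np

        rng = np.random.default_rng(13)
        n = 100_000
        phi = np.radians(rng.normal(30.0, 3.0, n))          # friction angle
        cohesion = rng.lognormal(np.log(5e3), 0.3, n)       # cohesion, Pa
        slope = np.radians(35.0)                            # slope angle
        gamma, depth = 19e3, 2.0                            # unit weight N/m^3, depth m

        # Infinite-slope factor of safety (dry case):
        # FoS = [c + gamma*z*cos^2(beta)*tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
        fos = (cohesion + gamma * depth * np.cos(slope) ** 2 * np.tan(phi)) / (
            gamma * depth * np.sin(slope) * np.cos(slope))
        print(f"P(FoS < 1) = {(fos < 1.0).mean():.3f}")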

  1. Active and Cooperative Learning Using Web-Based Simulations.

    ERIC Educational Resources Information Center

    Schmidt, Stephen J.

    2003-01-01

    Cites advantages of using computers and the World Wide Web in classroom simulations. Provides a sample simulation that teaches the basic economic principles of trade, investment, and public goods in the context of U.S. economic history. (JEH)

  2. The Political Economy of Education.

    ERIC Educational Resources Information Center

    Carnoy, Martin

    1985-01-01

    The political economy of education treats education as a factor shaped by the power relations between different economic, political, and social groups. Specific topics discussed include the economic value of education, education as an allocator of economic roles, education and social class, education and income distribution, and education and…

  3. The Nature of Introductory Economics Courses

    ERIC Educational Resources Information Center

    Koscielniak, James

    1975-01-01

    A questionnaire was developed to determine the content, mode of instruction, approach, and textbook selection of instructors of introductory economics courses. The survey was distributed in 1974 to 143 economics instructors at two- and four-year colleges in Illinois. Results are presented here, and recommendations are made. (Author/NHM)

  4. Strategic Positioning of the Web in a Multi-Channel Market Approach.

    ERIC Educational Resources Information Center

    Simons, Luuk P. A.; Steinfield, Charles; Bouwman, Harry

    2002-01-01

    Discusses channel economics in retail activities and trends toward unbundling due to the emergence of the Web channel. Highlights include sales processes and physical distribution processes; transaction costs; hybrid electronic commerce strategies; channel management and customer support; information economics, thing economics, and service…

  5. Micro-Level Adaptation, Macro-Level Selection, and the Dynamics of Market Partitioning

    PubMed Central

    García-Díaz, César; van Witteloostuijn, Arjen; Péli, Gábor

    2015-01-01

    This paper provides a micro-foundation for dual market structure formation through partitioning processes in marketplaces by developing a computational model of interacting economic agents. We propose an agent-based modeling approach, where firms are adaptive and profit-seeking agents entering into and exiting from the market according to their (lack of) profitability. Our firms are characterized by large and small sunk costs, respectively. They locate their offerings along a unimodal demand distribution over a one-dimensional product variety, with the distribution peak constituting the center and the tails standing for the peripheries. We found that large firms may first advance toward the most abundant demand spot, the market center, and release peripheral positions as predicted by extant dual market explanations. However, we also observed that large firms may then move back toward the market fringes to reduce competitive niche overlap in the center, triggering nonlinear resource occupation behavior. Novel results indicate that resource release dynamics depend on firm-level adaptive capabilities, and that a minimum scale of production for low sunk cost firms is key to the formation of the dual structure. PMID:26656107

  6. Micro-Level Adaptation, Macro-Level Selection, and the Dynamics of Market Partitioning.

    PubMed

    García-Díaz, César; van Witteloostuijn, Arjen; Péli, Gábor

    2015-01-01

    This paper provides a micro-foundation for dual market structure formation through partitioning processes in marketplaces by developing a computational model of interacting economic agents. We propose an agent-based modeling approach, where firms are adaptive and profit-seeking agents entering into and exiting from the market according to their (lack of) profitability. Our firms are characterized by large and small sunk costs, respectively. They locate their offerings along a unimodal demand distribution over a one-dimensional product variety, with the distribution peak constituting the center and the tails standing for the peripheries. We found that large firms may first advance toward the most abundant demand spot, the market center, and release peripheral positions as predicted by extant dual market explanations. However, we also observed that large firms may then move back toward the market fringes to reduce competitive niche overlap in the center, triggering nonlinear resource occupation behavior. Novel results indicate that resource release dynamics depend on firm-level adaptive capabilities, and that a minimum scale of production for low sunk cost firms is key to the formation of the dual structure.
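
    A heavily stripped-down sketch of the partitioning mechanism (illustrative rules and parameters, not the authors' calibration): firms on a one-dimensional variety axis split local demand with nearby rivals, exit when unprofitable, and new entrants keep arriving.

        import math
        import random

        random.seed(0)
        positions = [random.gauss(0.0, 1.0) for _ in range(20)]   # firm locations
        is_large = [i < 5 for i in range(20)]                     # large sunk costs
        FIXED_COST = {True: 0.08, False: 0.02}

        def demand(x):                       # unimodal demand over product variety
            return math.exp(-x * x / 2.0)

        for step in range(200):
            profits = []
            for i, x in enumerate(positions):
                rivals = sum(1 for y in positions if abs(y - x) < 0.2)  # incl. self
                profits.append(demand(x) / rivals - FIXED_COST[is_large[i]])
            survivors = [i for i, pr in enumerate(profits) if pr > 0]   # exit rule
            positions = [positions[i] for i in survivors]
            is_large = [is_large[i] for i in survivors]
            if random.random() < 0.5:                                   # entry
                positions.append(random.gauss(0.0, 1.0))
                is_large.append(random.random() < 0.25)

        center = sum(1 for x in positions if abs(x) < 0.5)
        print(f"{len(positions)} firms survive; {center} near the market center")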

  7. Speed scanning system based on solid-state microchip laser for architectural planning

    NASA Astrophysics Data System (ADS)

    Redka, Dmitriy; Grishkanich, Alexsandr S.; Kolmakov, Egor; Tsvetkov, Konstantin

    2017-10-01

    Given the current great interest in Large-Scale Metrology applications across many fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease of use, logistic and economic issues, as well as metrological performance, are assuming an increasingly important role among system requirements. The project plans experimental studies aimed at identifying the impact of applying the basic laws of microlasers as radiators on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Unlike existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities in order to share the overall computational load.

  8. Coordinate measuring system based on microchip lasers for reverse prototyping

    NASA Astrophysics Data System (ADS)

    Iakovlev, Alexey; Grishkanich, Alexsandr S.; Redka, Dmitriy; Tsvetkov, Konstantin

    2017-02-01

    Given the current great interest in Large-Scale Metrology applications across many fields of the manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease-of-use, logistic, and economic issues, as well as metrological performance, are assuming an increasingly important role among system requirements. The project will conduct experimental studies aimed at identifying how the basic operating laws of chip and microlasers used as radiators affect the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout whose modularity allows it to fit differently sized and shaped working volumes by increasing the number of sensing units as needed. Unlike existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data-elaboration capabilities in order to share the overall computational load.

  9. CAL--ERDA program manual. [Building Design Language; LOADS, SYSTEMS, PLANT, ECONOMICS, REPORT, EXECUTIVE, CAL-ERDA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunn, B. D.; Diamond, S. C.; Bennett, G. A.

    1977-10-01

    A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand the Cal-ERDA set of computer programs in detail. The computer programs described include: an EXECUTIVE processor to create computer system control commands; a BDL processor to analyze input instructions, execute computer system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effects of ambient weather conditions; internal occupancy, lighting, and equipment within the building; and variations in the size, location, orientation, construction, walls, roofs, floors, fenestration, attachments (awnings, balconies), and shape of a building; a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components (fans, coils, economizers, humidifiers, etc.) in 16 standard configurations operated according to various temperature and humidity control schedules; a PLANT equipment program that models the operation of boilers, chillers, electrical generation equipment (diesel or turbine), heat storage apparatus (chilled or heated water), and solar heating and/or cooling systems; an ECONOMICS analysis program that calculates life-cycle costs; and a REPORT program that produces tables of user-selected variables arranged according to user-specified formats. A set of WEATHER ANALYSIS programs manipulates, summarizes, and plots weather data. Libraries of weather data, schedule data, and building data were prepared.

  10. Aircraft requirements for low/medium density markets

    NASA Technical Reports Server (NTRS)

    Ausrotas, R.; Dodge, S.; Faulkner, H.; Glendinning, I.; Hays, A.; Simpson, R.; Swan, W.; Taneja, N.; Vittek, J.

    1973-01-01

    A study was conducted to determine the demand for and the economic factors involved in air transportation in a low and medium density market. The subjects investigated are as follows: (1) industry and market structure, (2) aircraft analysis, (3) economic analysis, (4) field surveys, and (5) computer network analysis. Graphs are included to show the economic requirements and the aircraft performance characteristics.

  11. Offshore Wind Jobs and Economic Development Impact: Four Regional Scenarios (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tegen, S.

    NREL's Jobs and Economic Development Impact (JEDI) Model for Offshore Wind is a computer tool for studying the economic impacts of fixed-bottom offshore wind projects in the United States. This presentation provides the results of an analysis of four offshore wind development scenarios in the Southeast Atlantic, Great Lakes, Mid-Atlantic, and Gulf of Mexico regions.

  12. A study of electricity planning in Thailand: An integrated top-down and bottom-up Computable General Equilibrium (CGE) modeling analysis

    NASA Astrophysics Data System (ADS)

    Srisamran, Supree

    This dissertation examines the potential impacts of three electricity policies on the economy of Thailand in terms of macroeconomic performance, income distribution, and unemployment. The three policies concern responses to potential disruption of the imported natural gas used in electricity generation, alternative combinations (portfolios) of fuel feedstock for electricity generation, and increases in investment and local electricity consumption. The evaluation employs a Computable General Equilibrium (CGE) approach, extended with an electricity generation and transmission module, to simulate the counterfactual scenario for each policy. The dissertation consists of five chapters. Chapter one begins with a discussion of Thailand's economic condition, followed by the current state of electricity generation and consumption and current issues in power generation. The security of imported natural gas in power generation is then briefly discussed: disruptions of imported natural gas have repeatedly caused trouble for the country, yet the economic consequences of such disruptions have not been evaluated. The current portfolio of power generation, which relies heavily on natural gas and therefore needs to be diversified, is then presented along with the concerns it raises. Lastly, the anticipated increase in investment and electricity consumption as a consequence of regional integration is discussed. Chapter two introduces the CGE model, its background, and its limitations. Chapter three reviews the relevant literature on the CGE method and its application to electricity policies; in addition, the submodule characterizing the network of electricity generation and distribution, and the method of its integration with the CGE model, are explained. Chapter four presents the findings of the policy simulations. The first simulation illustrates the consequences of responses to disruptions in natural gas imports; the results indicate that the induced response to a complete loss of natural gas imports would cause RGDP to drop by almost 0.1%. The second set of simulations examines alternative portfolios of power generation; the results indicate that promoting hydro power would be the most economical solution, although the associated mix of power generation would have some adverse effects on RGDP, so the second-best alternative, in which domestic natural gas dominates the portfolio, is recommended. The last simulation suggests that two power plants, South Bangkok and Siam Energy, should be upgraded to cope with an expected 30% spike in power consumption due to anticipated increases in regional trade and domestic investment. Chapter five concludes the dissertation and suggests possibilities for future research.

  13. Distribution of Economic Benefits from Ecotourism: A Case Study of Wolong Nature Reserve for Giant Pandas in China

    NASA Astrophysics Data System (ADS)

    He, Guangming; Chen, Xiaodong; Liu, Wei; Bearer, Scott; Zhou, Shiqiang; Cheng, Lily Yeqing; Zhang, Hemin; Ouyang, Zhiyun; Liu, Jianguo

    2008-12-01

    Ecotourism is widely promoted as a conservation tool and actively practiced in protected areas worldwide. Theoretically, support for conservation from the various types of stakeholder inside and outside protected areas is maximized if stakeholders benefit proportionally to the opportunity costs they bear. The disproportional benefit distribution among stakeholders can erode their support for or lead to the failure of ecotourism and conservation. Using Wolong Nature Reserve for Giant Pandas (China) as an example, we demonstrate two types of uneven distribution of economic benefits among four major groups of stakeholders. First, a significant inequality exists between the local rural residents and the other types of stakeholder. The rural residents are the primary bearers of the cost of conservation, but the majority of economic benefits (investment, employment, and goods) in three key ecotourism sectors (infrastructural construction, hotels/restaurants, and souvenir sales) go to other stakeholders. Second, results show that the distribution of economic benefits is unequal among the rural residents inside the reserve. Most rural households that benefit from ecotourism are located near the main road and potentially have less impact on panda habitat than households far from the road and closer to panda habitats. This distribution gap is likely to discourage conservation support from the latter households, whose activities are the main forces degrading panda habitats. We suggest that the unequal distribution of the benefits from ecotourism can be lessened by enhancing local participation, increasing the use of local goods, and encouraging relocation of rural households closer to ecotourism facilities.

  14. Distribution of economic benefits from ecotourism: a case study of Wolong Nature Reserve For Giant Pandas in China.

    PubMed

    He, Guangming; Chen, Xiaodong; Liu, Wei; Bearer, Scott; Zhou, Shiqiang; Cheng, Lily Yeqing; Zhang, Hemin; Ouyang, Zhiyun; Liu, Jianguo

    2008-12-01

    Ecotourism is widely promoted as a conservation tool and actively practiced in protected areas worldwide. Theoretically, support for conservation from the various types of stakeholder inside and outside protected areas is maximized if stakeholders benefit proportionally to the opportunity costs they bear. The disproportional benefit distribution among stakeholders can erode their support for or lead to the failure of ecotourism and conservation. Using Wolong Nature Reserve for Giant Pandas (China) as an example, we demonstrate two types of uneven distribution of economic benefits among four major groups of stakeholders. First, a significant inequality exists between the local rural residents and the other types of stakeholder. The rural residents are the primary bearers of the cost of conservation, but the majority of economic benefits (investment, employment, and goods) in three key ecotourism sectors (infrastructural construction, hotels/restaurants, and souvenir sales) go to other stakeholders. Second, results show that the distribution of economic benefits is unequal among the rural residents inside the reserve. Most rural households that benefit from ecotourism are located near the main road and potentially have less impact on panda habitat than households far from the road and closer to panda habitats. This distribution gap is likely to discourage conservation support from the latter households, whose activities are the main forces degrading panda habitats. We suggest that the unequal distribution of the benefits from ecotourism can be lessened by enhancing local participation, increasing the use of local goods, and encouraging relocation of rural households closer to ecotourism facilities.

  15. Universal laws of human society's income distribution

    NASA Astrophysics Data System (ADS)

    Tao, Yong

    2015-10-01

    General equilibrium equations in economics play the same role as many-body Newtonian equations in physics. Accordingly, each solution of the general equilibrium equations can be regarded as a possible microstate of the economic system. Because Arrow's Impossibility Theorem and Rawls' principle of social fairness provide powerful support for the hypothesis of equal probability, the principle of maximum entropy applies in a just, equilibrium economy, so that a particular income distribution occurs spontaneously (with the largest probability). Remarkably, scholars have observed such an income distribution in some democratic countries, e.g., the USA. This result implies that the hypothesis of equal probability may only be suitable for "fair" systems (economic or physical). In this sense, non-equilibrium systems may be "unfair", so that the hypothesis of equal probability is unavailable.
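
    The maximum-entropy argument has a concrete numerical face: with mean income fixed, the entropy-maximizing distribution is exponential (the Boltzmann-Gibbs form). The sketch below illustrates this textbook consequence; it is not a computation from the paper.

      import numpy as np

      # With mean income fixed, maximum entropy gives the exponential law
      # P(m) = (1/<m>) * exp(-m/<m>).
      mean_income = 40_000.0
      incomes = np.random.default_rng(0).exponential(mean_income, size=100_000)

      print(round(incomes.mean()))              # ~40,000, the fixed mean
      print((incomes < mean_income).mean())     # ~0.632 = 1 - 1/e of agents earn below the mean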

  16. A view of Kanerva's sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Pentti Kanerva is working on a new class of computers called pattern computers. Pattern computers may close the gap between the ability of biological organisms to recognize and act on patterns (visual, auditory, tactile, or olfactory) and the capabilities of modern computers. Combinations of numeric, symbolic, and pattern computers may one day be capable of sustaining robots. An overview of the requirements for a pattern computer, a summary of Kanerva's Sparse Distributed Memory (SDM), and examples of tasks this computer can be expected to perform well are given.
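
    A toy rendering of the SDM idea, with address length, number of hard locations, and activation radius chosen purely for illustration: writes add data into counters at every hard location within a Hamming radius of the cue, and reads take a majority vote over the activated locations.

      import numpy as np

      rng = np.random.default_rng(1)
      N, M, RADIUS = 256, 1000, 111                     # address bits, hard locations, Hamming radius

      hard_addresses = rng.integers(0, 2, size=(M, N))  # fixed random hard locations
      counters = np.zeros((M, N), dtype=int)            # one counter vector per location

      def activated(address):
          return (hard_addresses != address).sum(axis=1) <= RADIUS

      def write(address, data):
          counters[activated(address)] += 2 * data - 1  # +1 for 1-bits, -1 for 0-bits

      def read(address):
          return (counters[activated(address)].sum(axis=0) > 0).astype(int)

      pattern = rng.integers(0, 2, size=N)
      write(pattern, pattern)                           # autoassociative store
      noisy = pattern.copy()
      noisy[:20] ^= 1                                   # flip 20 bits of the cue
      print((read(noisy) == pattern).mean())            # recovers most of the stored bits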

  17. Assessing Tax Form Distribution Costs: A Proposed Method for Computing the Dollar Value of Tax Form Distribution in a Public Library.

    ERIC Educational Resources Information Center

    Casey, James B.

    1998-01-01

    Explains how a public library can compute the actual cost of distributing tax forms to the public by listing all direct and indirect costs and demonstrating the formulae and necessary computations. Supplies directions for calculating costs involved for all levels of staff as well as associated public relations efforts, space, and utility costs.…
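
    The record's approach, enumerating direct and indirect costs and summing them, fits in a few lines; every category and figure below is hypothetical.

      # Hypothetical figures for illustration; the ERIC record itself
      # supplies the actual formulae and cost categories.
      staff_hours = {"librarian": 120, "clerical": 300, "page": 80}
      hourly_cost = {"librarian": 28.0, "clerical": 16.5, "page": 9.0}   # wages plus benefits

      staff_cost = sum(staff_hours[r] * hourly_cost[r] for r in staff_hours)
      space_cost = 150 * 0.90 * 4            # sq ft * monthly cost per sq ft * months of tax season
      pr_and_utilities = 400.0

      total = staff_cost + space_cost + pr_and_utilities
      print(f"Estimated tax-form distribution cost: ${total:,.2f}")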

  18. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    This project investigated the requirements for supporting distributed computing of scientific computations over a cluster of cooperating workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared-memory communication paradigm, was developed and evaluated. The distributed shared-memory model facilitates porting existing parallel algorithms designed for shared-memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Individually, workstations generally lack the computing power to tackle complex scientific applications and are primarily useful for visualization, data reduction, and filtering; a tremendous amount of computing power is left unused in a network of workstations, where very often a workstation simply sits idle on a desk. A set of tools can be developed to take advantage of this potential computing power and create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative computing stations presents an alternative to shared-memory multiprocessor systems. In this project we designed and evaluated such a system.
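
    DICE's own interface is not shown in the record; as a stand-in, here is a minimal row-partitioned Jacobi solver for simultaneous linear equations, with Python worker processes playing the role of networked workstations:

      import numpy as np
      from multiprocessing import Pool

      def jacobi_rows(args):
          # One "workstation" updates its block of rows from the current global x.
          A_rows, b_rows, x, offset = args
          out = np.empty(len(b_rows))
          for i, (a, bi) in enumerate(zip(A_rows, b_rows)):
              j = offset + i
              out[i] = (bi - a @ x + a[j] * x[j]) / a[j]
          return out

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n = 400
          A = rng.random((n, n)) + n * np.eye(n)     # diagonally dominant, so Jacobi converges
          b = rng.random(n)
          x = np.zeros(n)
          chunks = np.array_split(np.arange(n), 4)   # one chunk per "workstation"
          with Pool(4) as pool:
              for _ in range(50):
                  args = [(A[c], b[c], x, int(c[0])) for c in chunks]
                  x = np.concatenate(pool.map(jacobi_rows, args))
          print(np.abs(A @ x - b).max())             # residual is tiny after 50 sweeps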

  19. Carbon accounting and economic model uncertainty of emissions from biofuels-induced land use change.

    PubMed

    Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'Hare, Michael

    2015-03-03

    Few of the numerous published studies of the emissions from biofuels-induced "indirect" land use change (ILUC) attempt to propagate and quantify uncertainty, and those that have done so have restricted their analysis to a portion of the modeling systems used. In this study, we pair a global, computable general equilibrium model with a model of greenhouse gas emissions from land-use change to quantify the parametric uncertainty in the paired modeling system's estimates of greenhouse gas emissions from ILUC induced by expanded production of three biofuels. We find that for the three fuel systems examined--US corn ethanol, Brazilian sugar cane ethanol, and US soybean biodiesel--95% of the results occurred within ±20 g CO2e MJ(-1) of the mean (coefficient of variation of 20-45%), with economic model parameters related to crop yield and the productivity of newly converted cropland (from forestry and pasture) contributing most of the variance in estimated ILUC emissions intensity. Although the experiments performed here allow us to characterize parametric uncertainty, changes to the model structure have the potential to shift the mean by tens of grams of CO2e per megajoule and further broaden distributions for ILUC emission intensities.
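
    The propagation recipe, drawing each uncertain parameter from its distribution, running the model, and summarizing the spread, looks like this in miniature; the toy response function and parameter ranges are ours, not the paper's:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000

      # Hypothetical parameter distributions standing in for the real model inputs.
      crop_yield_elasticity = rng.normal(0.25, 0.05, n)
      new_land_productivity = rng.uniform(0.5, 0.9, n)
      emission_factor       = rng.normal(30.0, 5.0, n)      # g CO2e/MJ per unit land response

      def iluc_intensity(yld, prod, ef):
          land_response = (1 - yld) / prod                  # toy stand-in for the CGE step
          return ef * land_response

      samples = iluc_intensity(crop_yield_elasticity, new_land_productivity, emission_factor)
      lo, hi = np.percentile(samples, [2.5, 97.5])
      print(f"mean {samples.mean():.1f} g CO2e/MJ, 95% interval [{lo:.1f}, {hi:.1f}]")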

  20. School Choice: Economic and Fiscal Perspectives. Policy Report PR-B12.

    ERIC Educational Resources Information Center

    Addonizio, Michael

    This paper applies economic concepts to several school choice issues, identifying various market and public school choice proposals as alternative mechanisms for generating and distributing the economic benefits of education. Private benefits redound directly to those educated or their parents; external, or public, benefits redound to other…

  1. 78 FR 67103 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    ... analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and computer science... following disciplines: demography, economics, geography, psychology, statistics, survey methodology, social... expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology...

  2. Intergenerational mobility for women and minorities in the United States.

    PubMed

    Kearney, Melissa S

    2006-01-01

    Now that some of the historic barriers to economic success for U.S. women and minorities have begun to fall, women and blacks, in particular, are moving upward on the nation's socioeconomic ladder. Melissa Kearney reviews evidence that improved economic opportunities for these two groups make sex and race less important than they once were in determining economic status. But sex- and race-based differences in wages and income persist, and interactions between sex and class and between race and class continue to play a role in the intergenerational transmission of income status. Kearney surveys studies and data showing that marriage remains important in determining women's economic status, even though marriage rates among women aged eighteen to thirty-four have been falling--from 73 percent in 1960 to 44 percent in 2000. Not only do spousal earnings continue to dominate family income for married women, but also women tend to marry men whose position in the income distribution resembles their fathers' position. Marriage thus facilitates the transmission of economic status from parents to daughters. Racial wage gaps persist, says Kearney, largely because of differences in education, occupation, and skill. It also appears likely that the effects of discrimination, both current and past, continue to impede racial economic convergence. Kearney notes that the transmission of income class from parents to children among blacks differs noticeably from that among whites. Black parents and white parents pass their economic standing along to children at similar rates. But because mean income is lower among blacks than among whites, the likelihood of upward mobility in the overall income distribution is substantially lower among blacks. Black children are much more likely than white children to remain in the lower percentiles of the income distribution, and white children are more likely to remain in the upper reaches of the income distribution. Downward mobility from the top quartile to the bottom quartile is nearly four times as great for blacks as for whites.

  3. Quantitative Mineral Resource Assessment of Copper, Molybdenum, Gold, and Silver in Undiscovered Porphyry Copper Deposits in the Andes Mountains of South America

    USGS Publications Warehouse

    Cunningham, Charles G.; Zappettini, Eduardo O.; Vivallo S., Waldo; Celada, Carlos Mario; Quispe, Jorge; Singer, Donald A.; Briskey, Joseph A.; Sutphin, David M.; Gajardo M., Mariano; Diaz, Alejandro; Portigliati, Carlos; Berger, Vladimir I.; Carrasco, Rodrigo; Schulz, Klaus J.

    2008-01-01

    Quantitative information on the general locations and amounts of undiscovered porphyry copper resources of the world is important to exploration managers, land-use and environmental planners, economists, and policy makers. This publication contains the results of probabilistic estimates of the amounts of copper (Cu), molybdenum (Mo), gold (Au), and silver (Ag) in undiscovered porphyry copper deposits in the Andes Mountains of South America. The methodology used to make these estimates is called the 'Three-Part Form'. It was developed to explicitly express estimates of undiscovered resources and associated uncertainty in a form that allows economic analysis and is useful to decisionmakers. The three-part form of assessment includes: (1) delineation of tracts of land where the geology is permissive for porphyry copper deposits to form; (2) selection of grade and tonnage models appropriate for estimating grades and tonnages of the undiscovered porphyry copper deposits in each tract; and (3) estimation of the number of undiscovered porphyry copper deposits in each tract consistent with the grade and tonnage model. A Monte Carlo simulation computer program (EMINERS) was used to combine the probability distributions of the estimated number of undiscovered deposits, the grades, and the tonnages of the selected model to obtain the probability distributions for undiscovered metals in each tract. These distributions of grades and tonnages then can be used to conduct economic evaluations of undiscovered resources in a format usable by decisionmakers. Economic evaluations are not part of this report. The results of this assessment are presented in two principal parts. The first part identifies 26 regional tracts of land where the geology is permissive for the occurrence of undiscovered porphyry copper deposits of Phanerozoic age to a depth of 1 km below the Earth's surface. These tracts are believed to contain most of South America's undiscovered resources of copper. The second part presents probabilistic estimates of the amounts of copper, molybdenum, gold, and silver in undiscovered porphyry copper deposits in each tract. The study also provides tables showing the location, tract number, and age (if available) of discovered deposits and prospects. For each of the 26 permissive tracts delineated in this study, summary information is provided on: (1) the rationale for delineating the tract; (2) the rationale for choosing the mineral deposit model used to assess the tract; (3) discovered deposits and prospects; (4) exploration history; and (5) the distribution of undiscovered deposits in the tract. The scale used to evaluate geologic information and draw tracts is 1:1,000,000.
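
    The combination step performed by EMINERS, a Monte Carlo over the number of undiscovered deposits with a grade and tonnage draw for each, can be sketched as follows with entirely hypothetical distributions:

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical elicited probabilities for the number of undiscovered deposits in a tract.
      deposit_counts = np.array([0, 1, 2, 5, 10])
      count_probs    = np.array([0.2, 0.3, 0.3, 0.15, 0.05])

      def simulate_tract_copper(n_trials=20_000):
          totals = np.empty(n_trials)
          for t in range(n_trials):
              n = rng.choice(deposit_counts, p=count_probs)
              tonnage = rng.lognormal(mean=18.0, sigma=1.5, size=n)   # tonnes of ore per deposit
              grade   = rng.lognormal(mean=-5.0, sigma=0.5, size=n)   # Cu fraction of ore
              totals[t] = (tonnage * grade).sum()                     # contained copper, tonnes
          return totals

      cu = simulate_tract_copper()
      print(np.percentile(cu, [10, 50, 90]))   # probabilistic contained-copper estimates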

  4. Country level economic disparities in child injury mortality.

    PubMed

    Khan, Uzma Rahim; Sengoelge, Mathilde; Zia, Nukhba; Razzak, Junaid Abdul; Hasselberg, Marie; Laflamme, Lucie

    2015-02-01

    Injuries are a neglected cause of child mortality globally, and the burden is unequally distributed in resource-poor settings. The aim of this study is to explore the share and distribution of child injury mortality across country economic levels and the correlation between country economic level and injuries. All-cause and injury mortality rates per 100,000 were extracted for 187 countries for the 1-4 age group and under 5s from the Global Burden of Disease Study 2010. Countries were grouped into four economic levels. Gross domestic product (GDP) per capita was used to determine correlation with injury mortality. For all regions and country economic levels, the share of injuries in all-cause mortality was greater for the 1-4 age group than for under 5s, ranging from 36.6% in Organization for Economic Cooperation and Development countries to 10.6% in Sub-Saharan Africa. Except for Sub-Saharan Africa, there is a graded association between country economic level and 1-4 injury mortality across regions, with all low-income countries having the highest rates. Except for the two regions with the highest overall injury mortality rates, there is a significant negative correlation between GDP and injury mortality in Latin America and the Caribbean, Eastern Europe/Central Asia, Asia East/South-East and Pacific, and North Africa/Middle East. Child injury mortality is unevenly distributed across regions and country economic levels to the detriment of poorer countries. A significant negative correlation exists between GDP and injury in all regions, except for the most resource-poor, where the burden of injuries is highest.

  5. Electricity distribution networks: Changing regulatory approaches

    NASA Astrophysics Data System (ADS)

    Cambini, Carlo

    2016-09-01

    Increasing the penetration of distributed generation and smart grid technologies requires substantial investments. A study proposes an innovative approach that combines four regulatory tools to provide economic incentives for distribution system operators to facilitate these innovative practices.

  6. An Alternative Method for Computing Mean and Covariance Matrix of Some Multivariate Distributions

    ERIC Educational Resources Information Center

    Radhakrishnan, R.; Choudhury, Askar

    2009-01-01

    Computing the mean and covariance matrix of some multivariate distributions, in particular, multivariate normal distribution and Wishart distribution are considered in this article. It involves a matrix transformation of the normal random vector into a random vector whose components are independent normal random variables, and then integrating…
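
    The transformation in question is conveniently realized with a Cholesky factor: X = mu + L Z maps a vector Z of independent standard normals to the target multivariate normal, so E[X] = mu and Cov[X] = L L^T = Sigma. A short numerical check:

      import numpy as np

      mu = np.array([1.0, -2.0])
      Sigma = np.array([[2.0, 0.6],
                        [0.6, 1.0]])
      L = np.linalg.cholesky(Sigma)            # Sigma = L @ L.T

      Z = np.random.default_rng(0).standard_normal((100_000, 2))
      X = mu + Z @ L.T                         # affine map of independent normals

      print(X.mean(axis=0))                    # ~ mu
      print(np.cov(X, rowvar=False))           # ~ Sigma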

  7. Breaking the Back of Economic and Financial (Il)Literacy in South Africa: A Critical Reflection of the Role of Economic Education

    ERIC Educational Resources Information Center

    Maistry, S. M.

    2010-01-01

    South African society is characterized by high levels of poverty and unemployment. South Africa has an embarrassingly uneven distribution of income as reflected by the Gini-coefficient. While much of the country's economic ailments can be attributed to poor and selective application of economic policies during the apartheid era, there is a growing…

  8. Association between HIV infection and socio-economic status: evidence from a semirural area of southern Mozambique.

    PubMed

    Pons-Duran, Clara; González, Raquel; Quintó, Llorenç; Munguambe, Khatia; Tallada, Joan; Naniche, Denise; Sacoor, Charfudin; Sicuri, Elisa

    2016-12-01

    To analyse the association between socio-economic status (SES) and HIV in Manhiça, a district of Southern Mozambique with one of the highest HIV prevalences in the world. Data were gathered from two cross-sectional surveys performed in 2010 and 2012 among 1511 adults and from the household census of the district's population. Fractional polynomial logit models were used to analyse the association between HIV and SES, controlling for age and sex and taking into account the nonlinearity of covariates. The inequality of the distribution of HIV infection with regard to SES was computed through a concentration index. Fourth and fifth wealth quintiles, the least poor, were associated with a reduced probability of HIV infection compared to the first quintile (OR = 0.595, P-value = 0.009 and OR = 0.474, P-value < 0.001, respectively). Probability of HIV infection peaked at 36 years and then fell, and was always higher for women regardless of age and SES. HIV infection was unequally distributed across the SES strata. Despite the high HIV prevalence across the entire population of Manhiça, the poorest are at greatest risk of being HIV infected. While women have a higher probability of being HIV positive than men, both sexes showed the same infection reduction at higher levels of SES. HIV interventions in the area should particularly focus on the poorest and on women without neglecting anyone else, as the HIV risk is high for everyone.
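
    The concentration index used in the study admits a compact computation via the standard covariance formula C = 2 cov(h, r) / mean(h), where r is the fractional SES rank; the data below are simulated for illustration, not drawn from the Manhiça survey:

      import numpy as np

      def concentration_index(health, ses):
          # C = 2*cov(h, r)/mean(h), with r the fractional SES rank (poorest first).
          # Negative C means the outcome is concentrated among the poor.
          order = np.argsort(ses)                    # rank individuals from poorest to richest
          h = np.asarray(health, float)[order]
          n = len(h)
          r = (np.arange(1, n + 1) - 0.5) / n        # fractional rank
          return 2 * np.cov(h, r, bias=True)[0, 1] / h.mean()

      # Toy data: an infection indicator that is more common at low SES.
      rng = np.random.default_rng(0)
      ses = rng.random(1000)
      hiv = (rng.random(1000) < 0.4 * (1 - ses)).astype(float)
      print(concentration_index(hiv, ses))           # < 0, as in the study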

  9. Adaptive grid based multi-objective Cauchy differential evolution for stochastic dynamic economic emission dispatch with wind power uncertainty

    PubMed Central

    Lei, Xiaohui; Wang, Chao; Yue, Dong; Xie, Xiangpeng

    2017-01-01

    Since wind power is integrated into the thermal power operation system, dynamic economic emission dispatch (DEED) has become a new challenge due to the uncertain characteristics of wind. This paper proposes an adaptive grid based multi-objective Cauchy differential evolution (AGB-MOCDE) for solving stochastic DEED with wind power uncertainty. To deal with wind power uncertainty properly, scenarios are generated to represent the possible situations: the uncertainty domain is divided into intervals, the probability of each interval is calculated using the cumulative distribution function, and a stochastic DEED model is formulated over the resulting scenarios. To enhance optimization efficiency, a Cauchy mutation operation improves differential evolution by adjusting population diversity during the evolution process, and an adaptive grid is constructed to retain the diversity distribution of the Pareto front. Because a large number of scenarios is generated, a reduction mechanism based on covariance relationships is applied to decrease the number of scenarios, which greatly reduces the computational complexity. Moreover, a constraint-handling technique deals with the system load balance while considering transmission losses among thermal units and wind farms; all constraint limits can be satisfied within the permitted accuracy. After the proposed method is simulated on three test systems, the obtained results reveal that, in comparison with other alternatives, the proposed AGB-MOCDE can optimize the DEED problem while handling all constraint limits, and the optimal scheme of stochastic DEED can decrease the conservatism of interval optimization, providing a more valuable optimal scheme for real-world applications. PMID:28961262
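
    The interval-based scenario construction can be sketched as follows, assuming (purely for illustration) a normal forecast-error distribution for wind output:

      import numpy as np
      from scipy.stats import norm

      forecast = 120.0           # MW, point forecast of wind power
      sigma = 15.0               # assumed forecast-error spread

      # Partition the uncertainty domain into intervals; each scenario is an
      # interval midpoint weighted by the probability mass the CDF assigns to it.
      edges = np.linspace(forecast - 3 * sigma, forecast + 3 * sigma, 11)
      midpoints = (edges[:-1] + edges[1:]) / 2
      probs = norm.cdf(edges[1:], forecast, sigma) - norm.cdf(edges[:-1], forecast, sigma)
      probs /= probs.sum()       # renormalize over the truncated domain

      for w, p in zip(midpoints, probs):
          print(f"scenario {w:6.1f} MW with probability {p:.3f}")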

  10. Research in Functionally Distributed Computer Systems Development. Volume III. Evaluation of Conversion to a Back-End Data Base Management System.

    DTIC Science & Technology

    1976-03-01

    Research in Functionally Distributed Computer Systems Development, Kansas State University; P. S. Fisher and F. Maryanski; Virgil Wallentine, principal investigator; report CS-76-08, March 1976. Prepared for the U.S. Army Computer Systems Command, Ft. Belvoir, VA. Approved for public release; distribution unlimited.

  11. How Reliable Are Michigan High School Economics Textbooks? Sixteen Commonly Used Michigan High School Textbooks Are Graded for Balance, Accuracy, Clarity, and Instruction in the "Economic Way of Thinking." A Mackinac Center Report.

    ERIC Educational Resources Information Center

    Folsom, Burton; Leef, George; Mateer, Dirk

    This study examined 16 high school economics textbooks commonly used in Michigan. The textbooks were graded for 12 criteria that form the basis for the sound study of economics: (1) the price system and production; (2) competition and monopoly; (3) comparative economic systems; (4) the distribution of income and poverty; (5) the role of…

  12. Ethics, economics, and public financing of health care.

    PubMed

    Hurley, J

    2001-08-01

    There is a wide variety of ethical arguments for public financing of health care that share a common structure built on a series of four logically related propositions regarding: (1) the ultimate purpose of a human life or human society; (2) the role of health and its distribution in society in advancing this ultimate purpose; (3) the role of access to or utilisation of health care in maintaining or improving the desired level and distribution of health among members of society, and (4) the role of public financing in ensuring the ethically justified access to and utilisation of health care by members of society. This paper argues that economics has much to contribute to the development of the ethical foundations for publicly financed health care. It focuses in particular on recent economic work to clarify the concepts of access and need and their role in analyses of the just distribution of health care resources, and on the importance of economic analysis of health care and health care insurance markets in demonstrating why public financing is necessary to achieve broad access to and utilisation of health care services.

  13. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework, the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  14. Playable Serious Games for Studying and Programming Computational STEM and Informatics Applications of Distributed and Parallel Computer Architectures

    ERIC Educational Resources Information Center

    Amenyo, John-Thones

    2012-01-01

    Carefully engineered playable games can serve as vehicles for students and practitioners to learn and explore the programming of advanced computer architectures to execute applications, such as high performance computing (HPC) and complex, inter-networked, distributed systems. The article presents families of playable games that are grounded in…

  15. NASA Exhibits

    NASA Technical Reports Server (NTRS)

    Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick; and others

    2001-01-01

    A series of NASA presentations for the Supercomputing 2001 conference are summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) DeBakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.

  16. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    PubMed

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
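
    The heart of the methodology, partitioning the training set, evaluating gradients on each partition in parallel, and aggregating, is shown below with multiprocessing standing in for the parallel virtual machine and a least-squares model standing in for the neural network:

      import numpy as np
      from multiprocessing import Pool

      def partial_gradient(args):
          # Least-squares gradient evaluated on one partition of the training set.
          X, y, w = args
          return X.T @ (X @ w - y)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          X, true_w = rng.random((4000, 10)), rng.random(10)
          y = X @ true_w
          w = np.zeros(10)
          parts = np.array_split(np.arange(4000), 4)        # training set across 4 workers
          with Pool(4) as pool:
              for _ in range(300):
                  grads = pool.map(partial_gradient, [(X[p], y[p], w) for p in parts])
                  w -= 1e-4 * sum(grads)                     # aggregate partial gradients, update
          print(np.abs(w - true_w).max())                    # close to zero after training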

  17. The economic mobility in money transfer models

    NASA Astrophysics Data System (ADS)

    Ding, Ning; Xi, Ning; Wang, Yougui

    2006-07-01

    In this paper, we investigate economic mobility in four money transfer models that have been applied to research on wealth distribution. We demonstrate the mobility by recording the time series of agents' ranks and observing their volatility. We also compare the mobility quantitatively by employing an index proposed by economists, "the per capita aggregate change in log-income". Like the shape of the distribution, the character of mobility is determined by the trading rule in these transfer models. It is worth noting that even though two models may have the same type of distribution, their mobility characteristics may be quite different.
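
    The quoted index, the per capita aggregate change in log-income, is straightforward to compute; the income shocks below are illustrative:

      import numpy as np

      def mobility(incomes_t0, incomes_t1):
          # Per capita aggregate change in log-income between two periods.
          y0, y1 = np.asarray(incomes_t0, float), np.asarray(incomes_t1, float)
          return np.abs(np.log(y1) - np.log(y0)).mean()

      rng = np.random.default_rng(0)
      y0 = rng.lognormal(10, 1, 1000)
      y1 = y0 * rng.lognormal(0, 0.3, 1000)      # idiosyncratic income shocks
      print(mobility(y0, y1))                    # ~0.24 under these toy shocks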

  18. Fast distributed large-pixel-count hologram computation using a GPU cluster.

    PubMed

    Pan, Yuechao; Xu, Xuewu; Liang, Xinan

    2013-09-10

    Large-pixel-count holograms are an essential part of large-size holographic three-dimensional (3D) display, but the generation of such holograms is computationally demanding. To address this issue, we built a graphics processing unit (GPU) cluster with 32.5 Tflop/s of computing power and implemented distributed hologram computation on it with speed-improvement techniques such as shared memory on the GPU, GPU-level adaptive load balancing, and node-level load distribution. Using these techniques on the GPU cluster, we achieved a 71.4-fold computation speedup for 186M-pixel holograms. Furthermore, we used diffraction limits and subdivision of holograms to overcome the GPU memory limit in computing large-pixel-count holograms. 745M-pixel and 1.80G-pixel holograms were computed in 343 and 3326 s, respectively, for more than 2 million object points with RGB colors. Color 3D objects with 1.02M points were successfully reconstructed from a 186M-pixel hologram computed in 8.82 s with all three speed-improvement techniques. Distributed hologram computation using a GPU cluster is thus a promising approach to increasing the computation speed of large-pixel-count holograms for large-size holographic display.

  19. Actors: A Model of Concurrent Computation in Distributed Systems.

    DTIC Science & Technology

    1985-06-01

    Actors: A Model of Concurrent Computation in Distributed Systems; Gul A. Agha, MIT Artificial Intelligence Laboratory. This work was performed at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, with support for the laboratory's artificial intelligence research. Approved for public release and sale; distribution unlimited.

  20. Optimized distributed computing environment for mask data preparation

    NASA Astrophysics Data System (ADS)

    Ahn, Byoung-Sup; Bang, Ju-Mi; Ji, Min-Kyu; Kang, Sun; Jang, Sung-Hoon; Choi, Yo-Han; Ki, Won-Tai; Choi, Seong-Woon; Han, Woo-Sung

    2005-11-01

    As the critical dimension (CD) becomes smaller, various resolution enhancement techniques (RET) are widely adopted. In developing sub-100nm devices, the complexity of optical proximity correction (OPC) increases severely, and OPC is applied beyond critical layers to non-critical ones. The transformation of designed pattern data by OPC operations introduces complexity that causes runtime overhead in subsequent steps such as mask data preparation (MDP) and collapses the existing design hierarchy. Therefore, many mask shops exploit distributed computing to reduce the runtime of mask data preparation rather than exploit the design hierarchy. Distributed computing uses a cluster of computers connected to a local network. However, two things limit the benefit of distributed computing in MDP. First, running every MDP job sequentially with the maximum number of available CPUs is inefficient compared to parallel job execution, owing to the characteristics of the input data. Second, the runtime improvement per added CPU is insufficient because the scalability of fracturing tools is limited. In this paper, we discuss an optimal load-balancing environment that increases the uptime of a distributed computing system by assigning an appropriate number of CPUs to each input design, and we describe distributed processing (DP) parameter optimization to obtain maximum throughput in MDP job processing.
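
    A sketch of the node-level idea: give each MDP job a CPU share proportional to its estimated work so that jobs finish together. The heuristic and the shape counts are illustrative assumptions, not the paper's tool:

      def assign_cpus(job_sizes, total_cpus):
          # One CPU per job as a floor; the remainder is split in proportion to
          # estimated work so that all jobs finish at roughly the same time.
          floor = {job: 1 for job in job_sizes}
          spare = total_cpus - len(job_sizes)
          total = sum(job_sizes.values())
          extra = {job: int(spare * size / total) for job, size in job_sizes.items()}
          leftover = spare - sum(extra.values())
          # Hand out any leftover CPUs to the largest jobs first.
          for job in sorted(job_sizes, key=job_sizes.get, reverse=True)[:leftover]:
              extra[job] += 1
          return {job: floor[job] + extra[job] for job in job_sizes}

      # Hypothetical per-layer fractured-shape counts for three input layouts.
      print(assign_cpus({"metal1": 8_000_000, "via1": 2_000_000, "implant": 500_000}, 32))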

  1. One approach for evaluating the Distributed Computing Design System (DCDS)

    NASA Technical Reports Server (NTRS)

    Ellis, J. T.

    1985-01-01

    The Distributed Computing Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.

  2. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
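
    PyURDME's own API is not reproduced in this record; instead, the sketch below shows the classic Gillespie direct method that underlies this family of stochastic simulations, for a single well-mixed reaction A + B -> C:

      import random

      def gillespie(a0, b0, c0, rate, t_end):
          # Gillespie direct method for the single reaction A + B -> C.
          t, a, b, c = 0.0, a0, b0, c0
          while t < t_end and a > 0 and b > 0:
              propensity = rate * a * b
              t += random.expovariate(propensity)   # exponential waiting time to next event
              a, b, c = a - 1, b - 1, c + 1         # fire the reaction once
          return a, b, c

      random.seed(0)
      print(gillespie(100, 80, 0, rate=0.01, t_end=50.0))   # B is exhausted: (20, 0, 80)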

  3. Investment Justification of Robotic Technology in Aerospace Manufacturing. User’s Manual

    DTIC Science & Technology

    1984-10-01

    assessing the economic attractiveness of investments in robotics and/or flexible manufacturing systems (FMS). It models the cash flows...relative. 5. RIDM assesses the inherent economic attractiveness of robotic/FMS implementation. The model is based on real economic events and not...provided for an optional analysis of state and local tax impacts, to be custom designed by the user. (2) Computation of Depreciation

  4. Assessment of health and economic effects by PM2.5 pollution in Beijing: a combined exposure-response and computable general equilibrium analysis.

    PubMed

    Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun

    2016-12-01

    Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage from PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects attributable to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, departing from conventional valuation methods, this paper introduces the two conducting variables into a computable general equilibrium (CGE) model to assess the impacts on individual sectors and on the whole economic system. The results show that PM2.5 pollution caused substantial health effects among Beijing residents in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, Beijing's gross domestic product loss due to the health impact of PM2.5 pollution is estimated at 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects but also brings huge economic losses.
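
    Exposure-response functions of the kind used in the first step are commonly log-linear in concentration; the sketch below uses hypothetical coefficients, not the paper's:

      import math

      def excess_cases(pop, baseline_incidence, beta, delta_c):
          # Log-linear exposure-response: cases attributable to a PM2.5 increase delta_c.
          rr = math.exp(beta * delta_c)              # relative risk at the observed concentration
          attributable_fraction = (rr - 1) / rr
          return pop * baseline_incidence * attributable_fraction

      # Hypothetical inputs: 20M residents, baseline mortality 6 per 1000,
      # beta per ug/m3, and a 60 ug/m3 concentration increment.
      print(round(excess_cases(20_000_000, 0.006, 0.0004, 60.0)))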

  5. An Information and Technical Manual for the Computer-Assisted Teacher Training System (CATTS).

    ERIC Educational Resources Information Center

    Semmel, Melvyn I.; And Others

    The manual presents technical information on the computer assisted teacher training system (CATTS) which aims at developing a versatile and economical computer based teacher training system with the capability of providing immediate analysis and feedback of data relevant to teacher pupil transactions in a classroom setting. The physical…

  6. Computing in Higher Education: A Planning Perspective for Administrators. CAUSE Monograph Series.

    ERIC Educational Resources Information Center

    Chachra, Vinod; Heterick, Robert C.

    Options for using computers in managing higher education institutions and technological questions are considered in a collection of nine essays developed by the authors for this monograph. An introduction considers historical developments and provides an overview of computing modes and languages. After considering some of the economic and…

  7. THE DEVELOPMENT AND PRESENTATION OF FOUR COLLEGE COURSES BY COMPUTER TELEPROCESSING. FINAL REPORT.

    ERIC Educational Resources Information Center

    MITZEL, HAROLD E.

    This is the final report on the development and presentation of four college courses by computer teleprocessing from April 1964 to June 1967. It outlines the progress made toward the preparation, development, and evaluation of materials for computer presentation of courses in audiology, management accounting, engineering economics, and modern…

  8. Mental Computation: Evidence from Fifth Graders

    ERIC Educational Resources Information Center

    Erdem, Emrullah

    2017-01-01

    This study examines the mental computation performance of fifth graders. It was carried out with 118 fifth graders (11-12 years old) attending 3 randomly selected primary schools serving low and middle socio-economic areas in a city in Turkey. The "Mental Computation Test (MCT)" was used to determine how…

  9. Japanese supercomputer technology.

    PubMed

    Buzbee, B L; Ewald, R H; Worlton, W J

    1982-12-17

    Under the auspices of the Ministry for International Trade and Industry the Japanese have launched a National Superspeed Computer Project intended to produce high-performance computers for scientific computation and a Fifth-Generation Computer Project intended to incorporate and exploit concepts of artificial intelligence. If these projects are successful, which appears likely, advanced economic and military research in the United States may become dependent on access to supercomputers of foreign manufacture.

  10. Efficient implementation of multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2012-01-10

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network, thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
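
    In miniature, the decompose-transform-redistribute-transform pattern of the claim is the familiar row-FFT / transpose / row-FFT factorization of a 2-D FFT; here NumPy's transpose stands in for the "all-to-all" exchange across nodes:

      import numpy as np

      def fft2_by_redistribution(a):
          # 2-D FFT as: 1-D FFTs along rows, redistribute (transpose), 1-D FFTs again.
          # In the patented scheme the transpose is an "all-to-all" exchange across nodes.
          step1 = np.fft.fft(a, axis=1)      # first one-dimensional FFT, data local by rows
          step2 = step1.T                    # re-distribution of elements across "nodes"
          step3 = np.fft.fft(step2, axis=1)  # second one-dimensional FFT on redistributed data
          return step3.T

      a = np.random.default_rng(0).random((64, 64))
      print(np.allclose(fft2_by_redistribution(a), np.fft.fft2(a)))   # True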

  11. Efficient implementation of a multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2008-01-01

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network, thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.

  12. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover the principles and patterns hidden inside ever-growing Big Data for scientific discovery. To achieve this objective, more capable computing resources are required to process, analyze, and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing volume of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013), while CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared with a traditional microprocessor, the modern GPU, as a compelling alternative, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, (1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; (2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; and (3) GPUs, as graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Keywords: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D environmental data using many-core graphics processing units (GPUs) and multi-core central processing units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.

  13. The microeconomics of residential photovoltaics: Tariffs, network operation and maintenance, and ancillary services in distribution-level electricity markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Backhaus, Scott N.; Edwards, Brian K.

    Here, we develop a microeconomic model of a distribution-level electricity market that takes explicit account of residential photovoltaics (PV) adoption. The model allows us to study the consequences of most tariffs on PV adoption and the consequences of increased residential PV adoption under the assumption of economic sustainability for electric utilities. We also validate the model using U.S. data and extend it to consider different pricing schemes for operation and maintenance costs of the distribution network and for ancillary services. Results show that net metering promotes more environmental benefits and social welfare than other tariffs. But if costs to operate the distribution network increase, net metering will amplify the unequal distribution of surplus among households. In conclusion, maintaining the economic sustainability of electric utilities under net metering may become extremely difficult unless the uneven distribution of surplus is legitimated by environmental benefits.

  14. The microeconomics of residential photovoltaics: Tariffs, network operation and maintenance, and ancillary services in distribution-level electricity markets

    DOE PAGES

    Boero, Riccardo; Backhaus, Scott N.; Edwards, Brian K.

    2016-11-12

    Here, we develop a microeconomic model of a distribution-level electricity market that takes explicit account of residential photovoltaics (PV) adoption. The model allows us to study the consequences of most tariffs on PV adoption and the consequences of increased residential PV adoption under the assumption of economic sustainability for electric utilities. We also validate the model using U.S. data and extend it to consider different pricing schemes for operation and maintenance costs of the distribution network and for ancillary services. Results show that net metering promotes more environmental benefits and social welfare than other tariffs. But, if costs to operate the distribution network increase, net metering will amplify the unequal distribution of surplus among households. In conclusion, maintaining the economic sustainability of electric utilities under net metering may become extremely difficult unless the uneven distribution of surplus is legitimated by environmental benefits.
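
    The two records above turn on how a tariff values PV exports. As a rough illustration of that mechanism only (the bill structure, rates, and loads below are illustrative assumptions, not values from the paper), a household's annual bill under full net metering versus a lower export rate can be sketched as:

        # Hypothetical sketch: annual bill under net metering vs. a lower
        # export rate. All rates, loads, and charges are made-up values.
        def annual_bill(load_kwh, pv_kwh, retail_rate, export_rate, fixed_charge):
            """Fixed charge plus net consumption at the retail rate; net
            exports are compensated at export_rate (equal to retail_rate
            under full net metering)."""
            net_kwh = load_kwh - pv_kwh
            if net_kwh >= 0:
                energy_cost = net_kwh * retail_rate
            else:
                energy_cost = net_kwh * export_rate  # negative: export credit
            return fixed_charge + energy_cost

        # Full net metering: exports valued at the retail rate.
        print(annual_bill(10000, 6000, 0.13, 0.13, 120.0))   # 640.0
        # Avoided-cost compensation: exports valued well below retail.
        print(annual_bill(10000, 12000, 0.13, 0.04, 120.0))  # 40.0

    Under the second scheme the exporting household keeps a smaller share of the surplus, which is the distributional effect the abstract describes.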

  15. China’s Economic Conditions

    DTIC Science & Technology

    2008-06-27

    billion), Dell Computer ($2.1 billion), Hewlett Packard ($1.3 billion), and Kodak ($0.6 billion). 11 Communications equipment, computers, and other...of Southeast Asian Nations (ASEAN) member countries are Indonesia, Malaysia, the Philippines, Singapore, Thailand, Brunei, Cambodia, Laos, Myanmar

  16. China’s Economic Conditions

    DTIC Science & Technology

    2008-03-11

    billion), Dell Computer ($2.1 billion), Hewlett Packard ($1.3 billion), and Kodak ($0.6 billion). 11 Communications equipment, computers, and other...Southeast Asian Nations (ASEAN) member countries are Indonesia, Malaysia, the Philippines, Singapore, Thailand, Brunei, Cambodia, Laos, Myanmar (Burma

  17. A Pilot Computer-Aided Design and Manufacturing Curriculum that Promotes Engineering

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Elizabeth City State University (ECSU) is located in a community that is mostly rural in nature. The area is economically deprived when compared to the rest of the state. Many businesses lack the computerized equipment and skills needed to advance in today's technologically sophisticated society. This project will help close the ever-widening gap between advantaged and disadvantaged workers as well as increase their participation with industry, NASA, and other governmental agencies. Computer technology is widely recognized as the catalyst for advances in design, prototyping, and manufacturing, or the art of machining. Unprecedented quality control and cost-efficiency improvements are realized through the use of computer technology. This technology has changed the manufacturing industry with the advanced high-tech capabilities needed by NASA. With the ever-widening digital divide, we must continue to provide computer technology to those who are socio-economically disadvantaged.

  18. Performance and economics of residential solar space heating

    NASA Astrophysics Data System (ADS)

    Zehr, F. J.; Vineyard, T. A.; Barnes, R. W.; Oneal, D. L.

    1982-11-01

    The performance and economics of residential solar space heating were studied for various locations in the contiguous United States. Common types of active and passive solar heating systems were analyzed with respect to an average-size, single-family house designed to meet or exceed the thermal requirements of the Department of Housing and Urban Development Minimum Property Standards (HUD-MPS). The solar systems were evaluated in seventeen cities to provide a broad range of climatic conditions. Active systems evaluated consist of air and liquid flat plate collectors with single- and double-glazing; passive systems include Trombe wall, water wall, direct gain, and sunspace systems. The active system solar heating performance was computed using the University of Wisconsin's F-CHART computer program. The Los Alamos Scientific Laboratory's Solar Load Ratio (SLR) method was employed to compute solar heating performance for the passive systems. Heating costs were computed with gas, oil, and electricity as backups and as conventional heating system fuels.

  19. Efficient ecologic and economic operational rules for dammed systems by means of nondominated sorting genetic algorithm II

    NASA Astrophysics Data System (ADS)

    Niayifar, A.; Perona, P.

    2015-12-01

    River impoundment by dams is known to strongly affect the natural flow regime and, in turn, the river attributes and the related ecosystem biodiversity. Making hydropower sustainable implies seeking innovative operational policies able to generate dynamic environmental flows while maintaining economic efficiency. For dammed systems, we build the ecologic-economic efficiency plot for non-proportional flow redistribution operational rules compared to minimal flow operational rules. As for the case of small hydropower plants (e.g., see the companion paper by Gorla et al., this session), we use a four-parameter Fermi-Dirac statistical distribution to mathematically formulate non-proportional redistribution rules, as sketched below. These rules allocate a fraction of water to the riverine environment depending on current reservoir inflows and storage. Riverine ecological benefits associated with dynamic environmental flows are computed by integrating the Weighted Usable Area (WUA) for fishes with Richter's hydrological indicators. Then, we apply the nondominated sorting genetic algorithm II (NSGA-II) to an ensemble of non-proportional and minimal flow redistribution rules in order to generate the Pareto frontier showing the system performances in the ecologic and economic space. This fast and elitist multiobjective optimization method is eventually applied to a case study. It is found that non-proportional dynamic flow releases ensure maximal power production on the one hand, while conciliating ecological sustainability on the other. Much of the improvement in the environmental indicator is seen to arise from a better use of the reservoir storage dynamics, which allows capturing and attenuating flood events while recovering part of them for energy production. In conclusion, adopting such new operational policies would unravel a spectrum of globally efficient performances of the dammed system when compared with those resulting from policies based on constant minimum flow releases.
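
    As a minimal sketch of the kind of rule referred to above (the exact four-parameter formulation and how storage enters are assumptions here, not the authors' equations), a Fermi-Dirac-shaped redistribution rule can be written as:

        import math

        # Illustrative Fermi-Dirac-shaped redistribution rule: the fraction
        # of inflow released to the river rises smoothly from f_min to f_max
        # as the storage-discounted inflow crosses q_half. Parameter values
        # and the storage discount are assumptions for illustration.
        def env_fraction(inflow, storage, f_min=0.1, f_max=0.9,
                         q_half=20.0, width=5.0):
            """storage is the relative reservoir filling in [0, 1]."""
            q_eff = inflow * (1.0 - 0.5 * storage)
            return f_min + (f_max - f_min) / (1.0 + math.exp((q_half - q_eff) / width))

        print(round(env_fraction(5.0, 1.0), 3))   # low inflow, full reservoir: ~f_min
        print(round(env_fraction(60.0, 0.0), 3))  # flood inflow, empty reservoir: ~f_max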

  20. Soliton sustainable socio-economic distribution

    NASA Astrophysics Data System (ADS)

    Dresvyannikov, M. A.; Petrova, M. V.; Tshovrebov, A. M.

    2017-11-01

    In this work we consider, from closely related standpoints: 1) the stability of socio-economic distributions; 2) a possible mechanism for the formation of fractional power-law dependences in the Cobb-Douglas production function; 3) the introduction of a fractional-order derivative for the general analysis of fractional power functions; and 4) bringing the interest rate and the Cobb-Douglas production function into a state of mutual matching.
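
    For orientation, the standard textbook forms behind items 2) and 3) (these are not the authors' notation) are the Cobb-Douglas production function and the Riemann-Liouville derivative of a power function:

        % Cobb-Douglas production function and the Riemann-Liouville
        % derivative of a power function (standard forms, for reference).
        \[
          Y = A\,K^{\alpha}L^{\beta}, \qquad
          D^{\nu} x^{k} = \frac{\Gamma(k+1)}{\Gamma(k-\nu+1)}\,x^{k-\nu},
          \quad 0 < \nu < 1,
        \]
        % so a fractional-order derivative maps one fractional power law
        % (such as K^{\alpha} with non-integer \alpha) into another.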

  1. Developing a Distributed Computing Architecture at Arizona State University.

    ERIC Educational Resources Information Center

    Armann, Neil; And Others

    1994-01-01

    Development of Arizona State University's computing architecture, designed to ensure that all new distributed computing pieces will work together, is described. Aspects discussed include the business rationale, the general architectural approach, characteristics and objectives of the architecture, specific services, and impact on the university…

  2. Distributed Multimedia Computing: An Assessment of the State of the Art.

    ERIC Educational Resources Information Center

    Williams, Neil; And Others

    1991-01-01

    Describes multimedia computing and the characteristics of multimedia information. Trends in information technology are reviewed; distributed multimedia computing is explained; media types are described, including digital media; and multimedia applications are examined, including office systems, documents, information storage and retrieval,…

  3. NREL Solar Technical Assistance Team to Partner with Illinois, Nevada, and

    Science.gov Websites

    solar PV deployment can stimulate economic development in Illinois. The STAT Network and Illinois will explore solar policy scenarios and their impact on solar deployment and economic development. This new analysis will employ NREL's Distributed Generation and Market Demand (dGen) and Jobs and Economic

  4. Elementary School Economics: A Guide for Teachers (Revised).

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond.

    GRADES OR AGES: Grades K-7. SUBJECT MATTER: Elementary school economics. ORGANIZATION AND PHYSICAL APPEARANCE: The guide has a preliminary chapter on economic understandings and a chapter for each grade. Each chapter has eight subdivisions: 1) natural resources; 2) human resources; 3) production of goods and services; 4) distribution of goods and…

  5. Mixture Distributions for Modeling Lead Time Demand in Coordinated Supply Chains

    DTIC Science & Technology

    2014-04-30

    International Journal of Production Economics, 101...backorder price discount. International Journal of Production Economics, 111, 118–128. McClain, J. O., & Thomas, L. J. (1985). Operations management...2008). Using the inventory-theoretic framework to determine cost-minimizing supply strategies in a stochastic setting. International Journal of Production Economics,

  6. Characterizing Crowd Participation and Productivity of Foldit Through Web Scraping

    DTIC Science & Technology

    2016-03-01

    Berkeley Open Infrastructure for Network Computing; CDF Cumulative Distribution Function; CPU Central Processing Unit; CSSG Crowdsourced Serious Game...computers at once can create a similar capacity. According to Anderson [6], principal investigator for the Berkeley Open Infrastructure for Network...extraterrestrial life. From this project, a software-based distributed computing platform called the Berkeley Open Infrastructure for Network Computing

  7. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    DTIC Science & Technology

    1991-06-01

    Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh...Intermediary Resource: Intelligent Executive Computer Communication John Lyman and Carla J. Conaway University of California at Los Angeles for Contracting...Include Security Classification) Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent

  8. On fair, effective and efficient REDD mechanism design

    PubMed Central

    2009-01-01

    The issues surrounding 'Reduced Emissions from Deforestation and Forest Degradation' (REDD) have become a major component of continuing negotiations under the United Nations Framework Convention on Climate Change (UNFCCC). This paper aims to address two key requirements of any potential REDD mechanism: first, the generation of measurable, reportable and verifiable (MRV) REDD credits; and second, the sustainable and efficient provision of emission reductions under a robust financing regime. To ensure the supply of MRV credits, we advocate the establishment of an 'International Emission Reference Scenario Coordination Centre' (IERSCC). The IERSCC would act as a global clearing house for harmonized data to be used in implementing reference level methodologies. It would be tasked with the collection, reporting and subsequent processing of earth observation and deforestation- and degradation-driver information in a globally consistent manner. The IERSCC would also assist, coordinate and supervise the computation of national reference scenarios according to rules negotiated under the UNFCCC. To overcome the threats of "market flooding" on the one hand and insufficient economic incentives for REDD on the other, we suggest an 'International Investment Reserve' (IIR) as a REDD financing framework. In order to distribute the resources of the IIR we propose adopting an auctioning mechanism. Auctioning not only reveals the true emission reduction costs, but might also allow for incentivizing the protection of biodiversity and socio-economic values. The introduced concepts will be vital to ensure robustness, environmental integrity and economic efficiency of the future REDD mechanism. PMID:19943927

  9. Student Activities to Accompany the People on Market Street Film Series--A Series of Seven Economics Films Distributed by the Walt Disney Educational Media Company.

    ERIC Educational Resources Information Center

    Watts, Michael, Ed.

    Designed to help students in grades 9-12 understand economic terms, fundamental economic principles of the free enterprise system, and economic forces that influence activities in everyone's life, this teacher's guide provides over 50 reproducible activity sheets on the following topics: (1) scarcity and planning, (2) cost, (3) demand, (4) supply,…

  10. New security infrastructure model for distributed computing systems

    NASA Astrophysics Data System (ADS)

    Dubenskaya, J.; Kryukov, A.; Demichev, A.; Prikhodko, N.

    2016-02-01

    In this paper we propose a new approach to setting up a user-friendly and yet secure authentication and authorization procedure in a distributed computing system. The security concept of most heterogeneous distributed computing systems is based on a public key infrastructure along with proxy certificates which are used for rights delegation. In practice, the contradiction between the limited lifetime of the proxy certificates and the unpredictable time of request processing is a big issue for the end users of the system. We propose to use hashes with unlimited lifetime, individual for each request, instead of proxy certificates. Our approach avoids the use of proxy certificates altogether. Thus the security infrastructure of a distributed computing system becomes easier to develop, support and use.
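
    A minimal sketch of the per-request hash idea, assuming an HMAC construction (the paper does not specify the hash scheme; the token format and checks below are illustrative):

        import hashlib
        import hmac
        import os

        # Per-request tokens with no expiry: each token is bound to one
        # (user, request) pair, so a long-running request can be completed
        # whenever it finishes, unlike a proxy certificate that may lapse.
        SERVER_KEY = os.urandom(32)  # secret held by the authorization service

        def issue_token(user_id: str, request_id: str) -> str:
            msg = f"{user_id}:{request_id}".encode()
            return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()

        def verify_token(user_id: str, request_id: str, token: str) -> bool:
            return hmac.compare_digest(issue_token(user_id, request_id), token)

        token = issue_token("alice", "job-42")
        assert verify_token("alice", "job-42", token)
        assert not verify_token("alice", "job-43", token)  # bound to one request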

  11. Foraging optimally for home ranges

    USGS Publications Warehouse

    Mitchell, Michael S.; Powell, Roger A.

    2012-01-01

    Economic models predict behavior of animals based on the presumption that natural selection has shaped behaviors important to an animal's fitness to maximize benefits over costs. Economic analyses have shown that territories of animals are structured by trade-offs between benefits gained from resources and costs of defending them. Intuitively, home ranges should be similarly structured, but trade-offs are difficult to assess because there are no costs of defense; thus, economic models of home-range behavior are rare. We present economic models that predict how home ranges can be efficient with respect to spatially distributed resources, discounted for travel costs, under 2 strategies of optimization, resource maximization and area minimization. We show how constraints such as competitors can influence structure of home ranges through resource depression, ultimately structuring density of animals within a population and their distribution on a landscape. We present simulations based on these models to show how they can be generally predictive of home-range behavior and the mechanisms that structure the spatial distribution of animals. We also show how contiguous home ranges estimated statistically from location data can be misleading for animals that optimize home ranges on landscapes with patchily distributed resources. We conclude with a summary of how we applied our models to nonterritorial black bears (Ursus americanus) living in the mountains of North Carolina, where we found their home ranges were best predicted by an area-minimization strategy constrained by intraspecific competition within a social hierarchy. Economic models can provide strong inference about home-range behavior and the resources that structure home ranges by offering falsifiable, a priori hypotheses that can be tested with field observations.

  12. The economics of biobanking and pharmacogenetics databasing: the case of an adaptive platform on breast cancer.

    PubMed

    Huttin, Christine C; Liebman, Michael N

    2013-01-01

    This paper aims to discuss the economics of biobanking. Among the critical issues in evaluating the potential ROI for creation of a biobank are: scale (e.g. local, national, international), centralized versus virtual/distributed architecture, degree of sample annotation/QC procedures, targeted end-users and uses, types of samples, and the potential characterization of both samples and annotations. The paper presents a review of cost models for an economic analysis of biobanking at its different steps: data collection (e.g. biospecimens in different types of sites), storage, transport and distribution, and information management for the different types of information (e.g. biological information such as cell, gene, and protein). It also provides additional concepts for processing biospecimens from laboratory to clinical practice and will help to identify how changing paradigms in translational medicine affect the economic modeling.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The model is designed to enable decision makers to compare the economics of geothermal projects with the economics of alternative energy systems at an early stage in the decision process. The geothermal engineering and economic feasibility computer model (GEEF) is written in FORTRAN IV language and can be run on a mainframe or a mini-computer system. An abbreviated version of the model is being developed for usage in conjunction with a programmable desk calculator. The GEEF model has two main segments, namely (i) the engineering design/cost segment and (ii) the economic analysis segment. In the engineering segment, the model determines the numbers of production and injection wells, heat exchanger design, operating parameters for the system, requirement of supplementary system (to augment the working fluid temperature if the resource temperature is not sufficiently high), and the fluid flow rates. The model can handle single stage systems as well as two stage cascaded systems in which the second stage may involve a space heating application after a process heat application in the first stage.

  14. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Earl C.; Conway, Steve; Dekate, Chirag

    This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: 1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; 2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and 3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

  15. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  16. Exploring community health through the Sustainable Livelihoods framework.

    PubMed

    Barnidge, Ellen K; Baker, Elizabeth A; Motton, Freda; Fitzgerald, Teresa; Rose, Frank

    2011-02-01

    Health disparities are a major concern in the United States. Research suggests that inequitable distribution of money, power, and resources shape the circumstances for daily life and create and exacerbate health disparities. In rural communities, inequitable distribution of these structural factors seems to limit employment opportunities. The Sustainable Livelihoods framework, an economic development model, provides a conceptual framework to understand how distribution of these social, economic, and political structural factors affect employment opportunities and community health in rural America. This study uses photo-elicitation interviews, a qualitative, participatory method, to understand community members' perceptions of how distribution of structural factors through creation and maintenance of institutional practices and policies influence employment opportunities and, ultimately, community health for African Americans living in rural Missouri.

  17. Parallel Newton-Krylov-Schwarz algorithms for the transonic full potential equation

    NASA Technical Reports Server (NTRS)

    Cai, Xiao-Chuan; Gropp, William D.; Keyes, David E.; Melvin, Robin G.; Young, David P.

    1996-01-01

    We study parallel two-level overlapping Schwarz algorithms for solving nonlinear finite element problems, in particular, for the full potential equation of aerodynamics discretized in two dimensions with bilinear elements. The overall algorithm, Newton-Krylov-Schwarz (NKS), employs an inexact finite-difference Newton method and a Krylov space iterative method, with a two-level overlapping Schwarz method as a preconditioner. We demonstrate that NKS, combined with a density upwinding continuation strategy for problems with weak shocks, is robust and economical for this class of mixed elliptic-hyperbolic nonlinear partial differential equations, with proper specification of several parameters. We study upwinding parameters, inner convergence tolerance, coarse grid density, subdomain overlap, and the level of fill-in in the incomplete factorization, and report their effect on numerical convergence rate, overall execution time, and parallel efficiency on a distributed-memory parallel computer.
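
    As a small illustration of the Newton-Krylov idea only (without the Schwarz preconditioner or the full potential equation; a toy 1-D nonlinear problem solved with SciPy's built-in routine):

        import numpy as np
        from scipy.optimize import newton_krylov

        # Solve -u'' + u^3 = 1 on (0, 1) with u(0) = u(1) = 0: an inexact
        # Newton outer iteration with a GMRES (Krylov) inner solve.
        def residual(u):
            n = len(u)
            h = 1.0 / (n + 1)
            d2 = np.empty_like(u)
            d2[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
            d2[0] = (-2.0 * u[0] + u[1]) / h**2    # u(0) = 0 boundary
            d2[-1] = (u[-2] - 2.0 * u[-1]) / h**2  # u(1) = 0 boundary
            return -d2 + u**3 - 1.0

        u = newton_krylov(residual, np.zeros(50), method="gmres")
        print(float(np.abs(residual(u)).max()))  # small residual at convergence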

  18. Triple dividends of water consumption charges in South Africa

    NASA Astrophysics Data System (ADS)

    Letsoalo, Anthony; Blignaut, James; de Wet, Theuns; de Wit, Martin; Hess, Sebastiaan; Tol, Richard S. J.; van Heerden, Jan

    2007-05-01

    The South African government is exploring ways to address water scarcity problems by introducing a water resource management charge on the quantity of water used in sectors such as irrigated agriculture, mining, and forestry. It is expected that a more efficient water allocation, lower use, and a positive impact on poverty can be achieved. This paper reports on the validity of these claims by applying a computable general equilibrium model to analyze the triple dividend of water consumption charges in South Africa: reduced water use, more rapid economic growth, and a more equal income distribution. It is shown that an appropriate budget-neutral combination of water charges, particularly on irrigated agriculture and coal mining, and reduced indirect taxes, particularly on food, would yield triple dividends, that is, less water use, more growth, and less poverty.

  19. NDEx: A Community Resource for Sharing and Publishing of Biological Networks.

    PubMed

    Pillich, Rudolf T; Chen, Jing; Rynkov, Vladimir; Welker, David; Pratt, Dexter

    2017-01-01

    Networks are a powerful and flexible paradigm that facilitate communication and computation about interactions of any type, whether social, economic, or biological. NDEx, the Network Data Exchange, is an online commons to enable new modes of collaboration and publication using biological networks. NDEx creates an access point and interface to a broad range of networks, whether they express molecular interactions, curated relationships from literature, or the outputs of systematic analysis of big data. Research organizations can use NDEx as a distribution channel for networks they generate or curate. Developers of bioinformatic applications can store and query NDEx networks via a common programmatic interface. NDEx can also facilitate the integration of networks as data in electronic publications, thus making a step toward an ecosystem in which networks bearing data, hypotheses, and findings flow seamlessly between scientists.

  20. Network and data security design for telemedicine applications.

    PubMed

    Makris, L; Argiriou, N; Strintzis, M G

    1997-01-01

    The maturing of telecommunication technologies has ushered in a whole new era of applications and services in the health care environment. Teleworking, teleconsultation, multimedia conferencing and medical data distribution are rapidly becoming commonplace in clinical practice. As a result, a set of problems arises concerning data confidentiality and integrity. Public computer networks, such as the emerging ISDN technology, are vulnerable to eavesdropping. Therefore it is important for telemedicine applications to employ end-to-end encryption mechanisms securing the data channel from unauthorized access or modification. We propose a network access and encryption system that is both economical and easily implemented for integration in developing or existing applications, using well-known and thoroughly tested encryption algorithms. Public-key cryptography is used for session-key exchange, while symmetric algorithms are used for bulk encryption. Mechanisms for session-key generation and exchange are also provided.

  1. The value of improved (ERS) information based on domestic distribution effects of U.S. agriculture crops

    NASA Technical Reports Server (NTRS)

    Bradford, D. F.; Kelejian, H. H.; Brusch, R.; Gross, J.; Fishman, H.; Feenberg, D.

    1974-01-01

    The value of improving information for forecasting future crop harvests was investigated. Emphasis was placed upon establishing practical evaluation procedures firmly based in economic theory. The analysis was applied to the case of U.S. domestic wheat consumption. Estimates for a cost of storage function and a demand function for wheat were calculated. A model of market determinations of wheat inventories was developed for inventory adjustment. The carry-over horizon is computed by the solution of a nonlinear programming problem, and related variables such as spot and future price at each stage are determined. The model is adaptable to other markets. Results are shown to depend critically on the accuracy of current and proposed measurement techniques. The quantitative results are presented parametrically, in terms of various possible values of current and future accuracies.

  2. Computer modeling of dendritic web growth processes and characterization of the material

    NASA Technical Reports Server (NTRS)

    Seidensticker, R. G.; Kothmann, R. E.; Mchugh, J. P.; Duncan, C. S.; Hopkins, R. H.; Blais, P. D.; Davis, J. R.; Rohatgi, A.

    1978-01-01

    High area throughput rate will be required for the economical production of silicon dendritic web for solar cells. Web width depends largely on the temperature distribution on the melt surface while growth speed is controlled by the dissipation of the latent heat of fusion. Thermal models were developed to investigate each of these aspects, and were used to engineer the design of laboratory equipment capable of producing crystals over 4 cm wide; growth speeds up to 10 cm/min were achieved. The web crystals were characterized by resistivity, lifetime and etch pit density data as well as by detailed solar cell I-V data. Solar cells ranged in efficiency from about 10 to 14.5% (AM-1) depending on growth conditions. Cells with lower efficiency displayed lowered bulk lifetime believed to be due to surface contamination.

  3. A Latency-Tolerant Partitioner for Distributed Computing on the Information Power Grid

    NASA Technical Reports Server (NTRS)

    Das, Sajal K.; Harvey, Daniel J.; Biwas, Rupak; Kwak, Dochan (Technical Monitor)

    2001-01-01

    NASA's Information Power Grid (IPG) is an infrastructure designed to harness the power of geographically distributed computers, databases, and human expertise, in order to solve large-scale realistic computational problems. This type of meta-computing environment is necessary to present a unified virtual machine to application developers that hides the intricacies of a highly heterogeneous environment and yet maintains adequate security. In this paper, we present a novel partitioning scheme, called MinEX, that dynamically balances processor workloads while minimizing data movement and runtime communication, for applications that are executed in a parallel distributed fashion on the IPG. We also analyze the conditions that are required for the IPG to be an effective tool for such distributed computations. Our results show that MinEX is a viable load balancer provided the nodes of the IPG are connected by a high-speed asynchronous interconnection network.

  4. Uncertainties in obtaining high reliability from stress-strength models

    NASA Technical Reports Server (NTRS)

    Neal, Donald M.; Matthews, William T.; Vangel, Mark G.

    1992-01-01

    There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences are identified of incorrectly assuming a particular statistical distribution for stress or strength data used in obtaining the high reliability values. The computation of the reliability is defined as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
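
    The sensitivity the abstract reports is easy to reproduce in a Monte Carlo sketch (the distributions and parameters below are arbitrary illustrations, not the paper's data):

        import numpy as np

        # Stress-strength reliability R = P(strength > stress) under two
        # strength assumptions with similar means: the estimate shifts even
        # though the distributions are hard to tell apart from samples.
        rng = np.random.default_rng(1)
        n = 1_000_000
        stress = rng.normal(loc=400.0, scale=30.0, size=n)
        strength_normal = rng.normal(loc=600.0, scale=40.0, size=n)
        strength_weibull = 630.0 * rng.weibull(20.0, size=n)  # similar mean

        print((strength_normal > stress).mean())   # reliability, normal model
        print((strength_weibull > stress).mean())  # reliability, Weibull model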

  5. Assessing major ecosystem types and the challenge of sustainability in Turkey.

    PubMed

    Evrendilek, F; Doygun, H

    2000-11-01

    In recent years, Turkey has experienced rapid economic and population growth coupled with both an equally rapid increase in energy consumption and a vast disparity in welfare between socioeconomic groups and regions. In turn, these pressures have accelerated the destruction of productive, assimilative, and regenerative capacities of the ecosystems, which are essential for the well-being of the people and the economy. This paper describes the structure and function of major ecosystem types in Turkey and discusses the underlying causes of environmental degradation in the framework of economy, energy, environment, and ethics. From a national perspective, this paper suggests three sustainability-based policies necessary for Turkey's long-term interests that balance economic, environmental, and energy goals: (1) decoupling economic growth from energy consumption growth through the development of energy-efficient and renewable energy technologies; (2) linking economic efficiency and distributive justice of wealth and power through distributive and participatory public policies; and (3) integrating the economic and ecological systems through the internalization of externalities and ecosystem rehabilitation.

  6. Solar Plus: A Holistic Approach to Distributed Solar PV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OShaughnessy, Eric J.; Ardani, Kristen B.; Cutler, Dylan S.

    Solar 'plus' refers to an emerging approach to distributed solar photovoltaic (PV) deployment that uses energy storage and controllable devices to optimize customer economics. The solar plus approach increases customer system value through technologies such as electric batteries, smart domestic water heaters, smart air-conditioner (AC) units, and electric vehicles. We use an NREL optimization model to explore the customer-side economics of solar plus under various utility rate structures and net metering rates. We explore optimal solar plus applications in five case studies with different net metering rates and rate structures. The model deploys different configurations of PV, batteries, smart domestic water heaters, and smart AC units in response to different rate structures and customer load profiles. The results indicate that solar plus improves the customer economics of PV and may mitigate some of the negative impacts of evolving rate structures on PV economics. Solar plus may become an increasingly viable model for optimizing PV customer economics in an evolving rate environment.

  7. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 2: Residual-fired noncogeneration process boiler

    NASA Technical Reports Server (NTRS)

    Knightly, W. F.

    1980-01-01

    Computer generated data on the performance of the cogeneration energy conversion system are presented. Performance parameters included fuel consumption and savings, capital costs, economics, and emissions of residual-fired process boilers.

  8. Computer program for discounted cash flow/rate of return evaluations

    NASA Technical Reports Server (NTRS)

    Robson, W. D.

    1971-01-01

    A technique, incorporated into a set of three computer programs, provides an economic methodology for reducing all parameters to the financially sound common denominator of present worth, and calculates the resultant rate of return on investments in new equipment, processes, or systems.
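
    The underlying arithmetic is the standard present-worth and rate-of-return calculation; a compact sketch with hypothetical cash flows (not the program's actual code):

        # Present worth of a cash-flow stream and its internal rate of
        # return, found by bisection on the present-worth function.
        def present_worth(cash_flows, rate):
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        def internal_rate_of_return(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if present_worth(cash_flows, mid) > 0:
                    lo = mid  # rate too low: present worth still positive
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        flows = [-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]
        print(round(present_worth(flows, 0.10), 2))      # 137.24 at a 10% rate
        print(round(internal_rate_of_return(flows), 4))  # about 0.1524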

  9. COMPUTER TECHNOLOGY AND SOCIAL CHANGE,

    DTIC Science & Technology

    This paper presents a discussion of the social, political, economic and psychological problems associated with the rapid growth and development of...public officials and responsible groups is required to increase public understanding of the computer as a powerful tool, to select appropriate

  10. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  11. The impact of distributed computing on education

    NASA Technical Reports Server (NTRS)

    Utku, S.; Lestingi, J.; Salama, M.

    1982-01-01

    In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.

  12. Online social activity reflects economic status

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Hu; Wang, Jun; Shao, Junming; Zhou, Tao

    2016-09-01

    To characterize economic development and diagnose the economic health condition, several popular indices such as gross domestic product (GDP), industrial structure and income growth are widely applied. However, computing these indices based on a traditional economic census is usually costly and resource-consuming and, more importantly, subject to a long time delay. In this paper, we analyzed nearly 200 million users' activities over four consecutive years in the largest social network (Sina Microblog) in China, aiming to explore latent relationships between online social activities and local economic status. Results indicate that online social activity has a strong correlation with local economic development and industrial structure and, more interestingly, allows revealing the macro-economic structure instantaneously at nearly no cost. Beyond that, this work also provides a new venue for identifying risky signals in local economic structure.

  13. DEP : a computer program for evaluating lumber drying costs and investments

    Treesearch

    Stewart Holmes; George B. Harpole; Edward Bilek

    1983-01-01

    The DEP computer program is a modified discounted cash flow program designed for economic analysis of wood drying processes. Wood drying processes differ from other processes because of the large amounts of working capital required to finance inventories, and because of the relatively large shares of costs charged to inventory...

  14. Saving Energy and Money: A Lesson in Computer Power Management

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Hua, David

    2012-01-01

    In this activity, students will develop an understanding of the economic impact of technology by estimating the cost savings of power management strategies in the classroom. Students will learn how to adjust computer display settings to influence the impact that the computer has on the financial burden to the school. They will use mathematics to…

  15. Proceedings of NECC/2 National Educational Computing Conference 1980 (Norfolk, Virginia, June 23-25, 1980).

    ERIC Educational Resources Information Center

    Harris, Diana, Ed.; Collison, Beth, Ed.

    This proceedings, which includes 52 papers and abstracts of 13 invited and nine tutorial sessions, provides an overview of the current status of computer usage in education and offers substantive forecasts for academic computing. Papers are presented under the following headings: Business--Economics, Tools and Techniques for Instruction, Computers…

  16. Implications of Computer Technology. Harvard University Program on Technology and Society.

    ERIC Educational Resources Information Center

    Taviss, Irene; Burbank, Judith

    Lengthy abstracts of a small number of selected books and articles on the implications of computer technology are presented, preceded by a brief state-of-the-art survey which traces the impact of computers on the structure of economic and political organizations and socio-cultural patterns. A summary statement introduces each of the three abstract…

  17. Numerical Optimization Using Desktop Computers

    DTIC Science & Technology

    1980-09-11

    concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle...third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical...concentrating compound parabolic trough Solar Collector. C. OBJECTIVE The objective of this thesis was to develop a system of interactive programs for the Hewlett

  18. Group Communication Through Computers. Volume 4: Social, Managerial, and Economic Issues.

    ERIC Educational Resources Information Center

    Vallee, Jacques; And Others

    This study is the first assessment of the long term effects of computer conferencing. The use of PLANET and FORUM are described, and major users and conference characteristics are presented through excerpts from conference transcripts. Part I of the report focuses on the ways in which organizations used computer conferencing. Conference size and…

  19. APL: An Alternative to the Multi-Language Environment for Education. Systems Research Memo Number Four.

    ERIC Educational Resources Information Center

    Lippert, Henry T.; Harris, Edward V.

    The diverse requirements for computing facilities in education place heavy demands upon available resources. Although multiple or very large computers can supply such diverse needs, their cost makes them impractical for many institutions. Small computers which serve a few specific needs may be an economical answer. However, to serve operationally…

  20. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.

  1. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    PubMed

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
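
    The polynomial view behind the two records above is simple to illustrate: each element contributes a small polynomial in nominal mass, and the molecular distribution is their product. A coarse-grained sketch with an abbreviated isotope table (real implementations use full tables and smarter pruning; everything below is illustrative, not MIDAs's code):

        from collections import defaultdict

        # Coarse-grained isotopic distribution by repeated convolution
        # (polynomial multiplication) of per-element isotope patterns.
        ISOTOPES = {  # (nominal mass shift, natural abundance), abbreviated
            "C": [(0, 0.9893), (1, 0.0107)],
            "H": [(0, 0.999885), (1, 0.000115)],
        }

        def convolve(dist_a, dist_b, prune=1e-12):
            out = defaultdict(float)
            for ma, pa in dist_a.items():
                for mb, pb in dist_b.items():
                    out[ma + mb] += pa * pb
            return {m: p for m, p in out.items() if p > prune}

        def isotopic_distribution(formula):
            """formula: element -> atom count, e.g. {"C": 6, "H": 12}."""
            dist = {0: 1.0}
            for element, count in formula.items():
                for _ in range(count):
                    dist = convolve(dist, dict(ISOTOPES[element]))
            return dist

        for shift, prob in sorted(isotopic_distribution({"C": 6, "H": 12}).items())[:3]:
            print(shift, round(prob, 5))  # monoisotopic peak ~0.936, then +1, +2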

  2. Algorithm Calculates Cumulative Poisson Distribution

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Nolty, Robert C.; Scheuer, Ernest M.

    1992-01-01

    Algorithm calculates accurate values of cumulative Poisson distribution under conditions where other algorithms fail because numbers are so small (underflow) or so large (overflow) that computer cannot process them. Factors inserted temporarily to prevent underflow and overflow. Implemented in CUMPOIS computer program described in "Cumulative Poisson Distribution Program" (NPO-17714).
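
    The underflow/overflow issue and its standard remedy can be sketched in a few lines (this is the generic log-space technique, not CUMPOIS itself):

        import math

        # Cumulative Poisson P(X <= k_max) computed in log space: the term
        # recurrence log p_k = log p_{k-1} + log(lam) - log(k) plus a
        # log-sum-exp accumulator avoid evaluating exp(-lam), lam**k, or k!
        # individually, any of which can under/overflow for extreme inputs.
        def cumulative_poisson(k_max, lam):
            log_p = -lam           # log of p_0 = exp(-lam)
            log_cdf = log_p
            for k in range(1, k_max + 1):
                log_p += math.log(lam) - math.log(k)
                hi, lo = max(log_cdf, log_p), min(log_cdf, log_p)
                log_cdf = hi + math.log1p(math.exp(lo - hi))
            return math.exp(log_cdf)

        # exp(-1000) underflows a double, yet the tail probability is fine:
        print(cumulative_poisson(800, 1000.0))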

  3. A Framework for a Computer System to Support Distributed Cooperative Learning

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui

    2004-01-01

    To develop a computer system that supports cooperative learning among distributed students, developers should consider the foundations of cooperative learning. This article examines the basic elements that make cooperation work and proposes a framework for such computer-supported cooperative learning (CSCL) systems. This framework is constituted of…

  4. Computer Power: Part 1: Distribution of Power (and Communications).

    ERIC Educational Resources Information Center

    Price, Bennett J.

    1988-01-01

    Discussion of the distribution of power to personal computers and computer terminals addresses options such as extension cords, perimeter raceways, and interior raceways. Sidebars explain: (1) the National Electrical Code; (2) volts, amps, and watts; (3) transformers, circuit breakers, and circuits; and (4) power vs. data wiring. (MES)

  5. Evoking Knowledge and Information Awareness for Enhancing Computer-Supported Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Engelmann, Tanja; Tergan, Sigmar-Olaf; Hesse, Friedrich W.

    2010-01-01

    Computer-supported collaboration by spatially distributed group members still involves interaction problems within the group. This article presents an empirical study investigating the question of whether computer-supported collaborative problem solving by spatially distributed group members can be fostered by evoking knowledge and information…

  6. JPRS Report, Science & Technology, USSR: Computers.

    DTIC Science & Technology

    1988-07-08

    JPRS-UCC-88-002, 8 July 1988. SCIENCE & TECHNOLOGY, USSR: COMPUTERS. CONTENTS, GENERAL: Computers: Steps to the World Level (V. Kovalenko; SOTSIALISTICHESKAYA INDUSTRIYA, No 178, 4 Aug 87)

  7. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  8. Determinants of Distribution Logistics in the Construction Industry

    NASA Astrophysics Data System (ADS)

    Bukova, Bibiana; Brumercikova, Eva; Kondek, Pavol

    2017-03-01

    Global business is still influenced by the economic crisis and the economic development in each country of the EU. The construction sector is among the most affected sectors of the national economies. The production of building materials is a part of the construction industry. Several companies in this sector in the European Union use business logistics effectively. The overall efficiency of a company is influenced by many external and internal determinants, especially distribution logistics.

  9. Guidelines for Preparing Economic Analysis (2000)

    EPA Pesticide Factsheets

    The Guidelines provide guidance on analyzing the economic impacts of regulations and policies, and assessing the distribution of costs and benefits among various segments of the population, with a particular focus on disadvantaged and vulnerable groups.

  10. Western Europe--A Trading Game.

    ERIC Educational Resources Information Center

    Cox, Ann Curtis

    1991-01-01

    Presents a geography program to show students why the European Community was formed. Involves student research of economic data, creation of a computer database on the European Community, and simulation of trading. Emphasizes geographic themes of movement, region formation, and change in response to economic forces. Includes game rules, sample…

  11. Distributing an executable job load file to compute nodes in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gooding, Thomas M.

    Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
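
    The participation logic reads more clearly as a toy tree walk (the tree shape, job membership, and function below are made-up illustrations of the quoted steps, not the patented implementation):

        # A node joins the class route if it participates in the job or any
        # descendant does; joining means it will relay the load file downward.
        def build_class_route(tree, node, participants, route):
            in_route = node in participants
            for child in tree.get(node, []):
                if build_class_route(tree, child, participants, route):
                    in_route = True  # a descendant participates
            if in_route:
                route.add(node)  # node reports its uplink to its parent
            return in_route

        tree = {0: [1, 2], 1: [3, 4], 2: [5]}  # node -> children
        route = set()
        build_class_route(tree, 0, participants={3, 5}, route=route)
        print(sorted(route))  # [0, 1, 2, 3, 5]: nodes 1 and 2 relay only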

  12. ScipionCloud: An integrative and interactive gateway for large scale cryo electron microscopy image processing on commercial and academic clouds.

    PubMed

    Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María

    2017-10-01

    New instrumentation for cryo electron microscopy (cryoEM) has significantly increased the data collection rate as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments. In this way, the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled, requiring just a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economics of different scenarios, so cryoEM scientists have a clearer picture of the setup that is best suited for their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    NASA Astrophysics Data System (ADS)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point at which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.

  14. Visual Analytics for Power Grid Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Huang, Zhenyu; Chen, Yousu

    2014-01-20

    Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and the society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.

  15. Why the changing American economy calls for twenty-first century learning: answers to educators' questions.

    PubMed

    Levy, Frank; Murnane, Richard J

    2006-01-01

    While struggling with the current pressures of educational reform, some educators will ask whether their efforts make economic sense. Questioning the future makeup of the nation's workforce, many wonder how the educational system should be adapted to better prepare today's youth. This chapter answers educators' and parents' questions about the effect of fluctuations in the American economy on the future of education. The authors offer reassurance that good jobs will always be available, but warn that those jobs will require a new level of skills: expert thinking and complex communication. Schools need to go beyond their current curriculum and prepare students to use reading, math, and communication skills to build a deeper and more thoughtful understanding of subject matter. To explain the implications of the nation's changing economy on jobs, technology, and therefore education, the authors address a range of vital questions. Citing occupational distribution data, the chapter explores the supply and range of jobs in the future, as well as why changes in the U.S. job distribution have taken place. As much of the explanation for the shift in job distribution over the past several decades is due to the computerization of the workforce, the authors discuss how computers will affect the future composition of the workforce. The chapter also addresses the consequences of educational improvement on earnings distribution. The authors conclude that beyond workforce preparedness, students need to learn how to be contributing members of a democracy.

  16. CAD/CAE Integration Enhanced by New CAD Services Standard

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2002-01-01

    A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing (CAM) software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.

  17. Power-law distribution in Japanese racetrack betting

    NASA Astrophysics Data System (ADS)

    Ichinomiya, Takashi

    2006-08-01

    Gambling is one of the basic economic activities that humans indulge in. An investigation of gambling activities provides deep insights into the economic actions of people and sheds light on the study of econophysics. In this paper we present an analysis of the distribution of the final odds of the races organized by the Japan Racing Association. The distribution of the final odds Po(x) indicates a clear power law, Po(x) ∝ 1/x, where x represents the final odds. This power law can be explained on the basis of the assumption that every bettor bets his money on the horse that appears to be the strongest in a race.
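
    One standard way to check such a power-law claim on odds data is a maximum-likelihood fit of the tail exponent. The sketch below uses the classic Hill estimator on synthetic data; the exponent, cutoff, and sample size are invented, and since a pure 1/x density needs an upper cutoff to normalize, the demo uses a steeper, properly normalizable tail.

```python
import math
import random

# Minimal sketch: estimate the exponent alpha of a power law P(x) ~ x^(-alpha)
# above a cutoff x_min, using the Hill/maximum-likelihood estimator.
# Synthetic Pareto samples stand in for real final-odds data.

def sample_power_law(alpha, x_min, n, seed=0):
    rng = random.Random(seed)
    # Inverse-CDF sampling for a density ~ x^(-alpha) with x >= x_min.
    return [x_min * (1 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def hill_estimate(xs, x_min):
    tail = [x for x in xs if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

if __name__ == "__main__":
    odds = sample_power_law(alpha=2.0, x_min=1.0, n=100_000)
    print(f"estimated exponent: {hill_estimate(odds, x_min=1.0):.3f}")
```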

  18. 76 FR 37763 - Fisheries of the Exclusive Economic Zone Off Alaska; Pacific Cod Allocations in the Gulf of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ...-AY53 Fisheries of the Exclusive Economic Zone Off Alaska; Pacific Cod Allocations in the Gulf of Alaska... the uncertainty regarding the distribution of Pacific cod catch, enhance stability among the sectors... available for public review and comment. The groundfish fisheries in the exclusive economic zone of the Gulf...

  19. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme; and (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
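
    The manager-worker communication strategy in task (2) can be illustrated in miniature. In the sketch below, Python's multiprocessing pool stands in for PVM and a trivial computation stands in for a flow-solver block; handing out blocks as workers go idle also gives a simple form of dynamic load balancing.

```python
from multiprocessing import Pool

# Minimal manager-worker sketch: a manager process farms independent work
# units out to a pool of workers and collects results as workers free up.
# multiprocessing stands in for PVM; the arithmetic stands in for solving
# one grid block of a flow solution.

def solve_block(block_id):
    # Placeholder for computing one grid block.
    return block_id, sum(i * i for i in range(10_000 + block_id))

if __name__ == "__main__":
    blocks = range(32)                   # work units (grid blocks)
    with Pool(processes=4) as pool:      # the "workers"
        # imap_unordered hands out blocks as workers become idle.
        for block_id, result in pool.imap_unordered(solve_block, blocks):
            print(f"block {block_id}: {result}")
```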

  20. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme; and (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  1. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  2. The Future of School Library Media Centers.

    ERIC Educational Resources Information Center

    Craver, Kathleen W.

    1984-01-01

    Examines impact of technology on school library media program development and role of school librarian. Technological trends (computerized record keeping, computer-assisted instruction, networking, home computers, videodiscs), employment and economic trends, education of school librarians, social and behavioral trends, and organizational and…

  3. An efficient parallel termination detection algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A. H.; Crivelli, S.; Jessup, E. R.

    2004-05-27

    Information local to any one processor is insufficient to monitor the overall progress of most distributed computations. Typically, a second distributed computation for detecting termination of the main computation is necessary. In order to be a useful computational tool, the termination detection routine must operate concurrently with the main computation, adding minimal overhead, and it must promptly and correctly detect termination when it occurs. In this paper, we present a new algorithm for detecting the termination of a parallel computation on distributed-memory MIMD computers that satisfies all of those criteria. A variety of termination detection algorithms have been devised. Of these, the algorithm presented by Sinha, Kale, and Ramkumar (henceforth, the SKR algorithm) is unique in its ability to adapt to the load conditions of the system on which it runs, thereby minimizing the impact of termination detection on performance. Because their algorithm also detects termination quickly, we consider it to be the most efficient practical algorithm presently available. The termination detection algorithm presented here was developed for use in the PMESC programming library for distributed-memory MIMD computers. Like the SKR algorithm, our algorithm adapts to system loads and imposes little overhead. Also like the SKR algorithm, ours is tree-based, and it does not depend on any assumptions about the physical interconnection topology of the processors or the specifics of the distributed computation. In addition, our algorithm is easier to implement and requires only half as many tree traversals as does the SKR algorithm. This paper is organized as follows. In section 2, we define our computational model. In section 3, we review the SKR algorithm. We introduce our new algorithm in section 4, and prove its correctness in section 5. We discuss its efficiency and present experimental results in section 6.
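
    For readers unfamiliar with termination detection, a much simpler classic scheme than the tree-based SKR algorithm, weight throwing, conveys the flavor of the problem. The sketch below simulates it with a single work queue; the task counts and spawning probabilities are invented.

```python
import random
from collections import deque
from fractions import Fraction

# Minimal sketch of *weight-throwing* termination detection (a simpler
# classic scheme, not the SKR algorithm discussed above): the controller
# attaches weight 1 to the initial task; spawning a task splits the
# sender's weight, and finishing a task returns its weight. Weight is
# conserved, so the collected weight equals 1 exactly when no tasks
# remain anywhere. Fractions keep the bookkeeping exact.

def run(seed=0):
    rng = random.Random(seed)
    queue = deque([Fraction(1)])   # pending tasks, each carrying some weight
    collected = Fraction(0)        # weight returned to the controller
    processed = 0
    while queue:
        weight = queue.popleft()
        processed += 1
        children = rng.randint(0, 2) if processed < 50 else 0
        if children:
            share = weight / (children + 1)
            for _ in range(children):
                queue.append(share)   # each spawned task takes a share
            collected += share        # the finished task returns its share
        else:
            collected += weight
    return processed, collected

if __name__ == "__main__":
    n, w = run()
    print(f"tasks processed: {n}, terminated: {w == 1}")
```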

  4. Other Cosmic Ray Links

    Science.gov Websites

    curriculum for its course Physics In and Through Cosmology. The Distributed Observatory aims to become the world's largest cosmic ray telescope, using the distributed sensing and computing power of the world's cell phones. Modeled after the distributed computing efforts of SETI@Home and Folding@Home, the

  5. Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions.

    ERIC Educational Resources Information Center

    Holland, Paul W.; Thayer, Dorothy T.

    2000-01-01

    Applied the theory of exponential families of distributions to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. Considers efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and computationally efficient…
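
    A loglinear fit of this kind can be sketched compactly: take log-probabilities polynomial in the score and maximize the multinomial likelihood with Newton's method. The histogram and polynomial degree below are invented, and the sketch is not the authors' implementation.

```python
import numpy as np

# Minimal sketch: maximum-likelihood fit of a loglinear model for a discrete
# test-score distribution, p_j proportional to exp(sum_k beta_k * j**k),
# via Newton's method on the multinomial log-likelihood.

def fit_loglinear(counts, degree=2, iters=20):
    scores = np.arange(len(counts), dtype=float)
    B = np.vander(scores, degree + 1, increasing=True)[:, 1:]  # j, j^2, ...
    n, N = np.asarray(counts, float), float(sum(counts))
    beta = np.zeros(degree)
    for _ in range(iters):
        logits = B @ beta
        p = np.exp(logits - logits.max())
        p /= p.sum()
        grad = B.T @ (n - N * p)               # score vector
        W = np.diag(p) - np.outer(p, p)        # multinomial covariance
        hess = -N * B.T @ W @ B
        beta -= np.linalg.solve(hess, grad)    # Newton step
    return beta, p

if __name__ == "__main__":
    observed = [2, 5, 14, 30, 42, 38, 25, 11, 4, 1]   # invented histogram
    beta, fitted = fit_loglinear(observed)
    print("beta:", np.round(beta, 4))
    print("fitted probabilities:", np.round(fitted, 3))
```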

  6. Distriblets: Java-Based Distributed Computing on the Web.

    ERIC Educational Resources Information Center

    Finkel, David; Wills, Craig E.; Brennan, Brian; Brennan, Chris

    1999-01-01

    Describes a system, written in the Java programming language, for using the World Wide Web to distribute computational tasks to multiple hosts on the Web. Describes the programs written to carry out the load distribution, the structure of a "distriblet" class, and experiences in using this system. (Author/LRW)

  7. A Hybrid Computer Simulation to Generate the DNA Distribution of a Cell Population.

    ERIC Educational Resources Information Center

    Griebling, John L.; Adams, William S.

    1981-01-01

    Described is a method of simulating the formation of a DNA distribution, on which statistical results and experimentally measured parameters from DNA distribution and percent-labeled mitosis studies are combined. An EAI-680 and DECSystem-10 Hybrid Computer configuration are used. (Author/CS)

  8. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  9. Direct coal liquefaction baseline design and system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  10. A computational analysis of lower bounds for the economic lot sizing problem in remanufacturing with separate setups

    NASA Astrophysics Data System (ADS)

    Aishah Syed Ali, Sharifah

    2017-09-01

    This paper considers the economic lot sizing problem in remanufacturing with separate setups (ELSRs), where remanufactured and new products are produced on dedicated production lines. Since this problem is NP-hard in general, and standard formulations lead to computationally inefficient, low-quality solutions, we present (a) a multicommodity formulation and (b) a strengthened formulation based on a priori addition of valid inequalities in the space of original variables, which are then compared with the Wagner-Whitin based formulation available in the literature. Computational experiments on a large number of test data sets are performed to evaluate the different approaches. The numerical results show that our strengthened formulation outperforms all the other tested approaches in terms of linear relaxation bounds. Finally, we conclude with future research directions.
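
    The Wagner-Whitin formulation used as the baseline rests on a classic dynamic program for single-item lot sizing; a minimal sketch follows. The demands and costs are invented, and the remanufacturing extension with separate setups is not modeled here.

```python
# Minimal sketch of the classic Wagner-Whitin dynamic program for
# single-item economic lot sizing: choose the periods in which to set up
# production so that total setup plus holding cost is minimized.

def wagner_whitin(demand, setup_cost, holding_cost):
    T = len(demand)
    best = [0.0] * (T + 1)     # best[t] = min cost to cover periods 0..t-1
    choice = [0] * (T + 1)
    for t in range(1, T + 1):
        best[t] = float("inf")
        for s in range(t):     # the last setup occurs in period s
            # Holding cost for demand of periods s..t-1 produced in period s.
            hold = sum(holding_cost * (j - s) * demand[j] for j in range(s, t))
            cost = best[s] + setup_cost + hold
            if cost < best[t]:
                best[t], choice[t] = cost, s
    # Recover the setup periods by walking back through the choices.
    setups, t = [], T
    while t > 0:
        setups.append(choice[t])
        t = choice[t]
    return best[T], sorted(setups)

if __name__ == "__main__":
    cost, setups = wagner_whitin([20, 50, 10, 50, 50, 10],
                                 setup_cost=100, holding_cost=1)
    print(f"optimal cost: {cost}, setup periods: {setups}")
```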

  11. Outline of cost-benefit analysis and a case study

    NASA Technical Reports Server (NTRS)

    Kellizy, A.

    1978-01-01

    The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
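
    The computational core of such a study is typically a discounted cash-flow calculation. The sketch below is a generic net-present-value comparison with invented cash flows and discount rate; it is not the appendix program.

```python
# Minimal sketch of the arithmetic at the core of a cost-benefit study:
# discount each alternative's yearly net benefits (benefits minus costs)
# to present value and compare. All numbers are invented.

def npv(cash_flows, rate):
    """NPV of cash flows; the first flow occurs now (year 0), the rest yearly."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

if __name__ == "__main__":
    rate = 0.07
    solar = [-10_000] + [1_800] * 14          # up-front cost, then yearly savings
    conventional = [-2_000] + [600] * 14
    for name, flows in [("solar", solar), ("conventional", conventional)]:
        print(f"{name}: NPV = ${npv(flows, rate):,.0f}")
```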

  12. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  13. Sputnik: ad hoc distributed computation.

    PubMed

    Völkel, Gunnar; Lausser, Ludwig; Schmid, Florian; Kraus, Johann M; Kestler, Hans A

    2015-04-15

    In bioinformatics applications, computationally demanding algorithms are often parallelized to speed up computation. Nevertheless, setting up computational environments for distributed computation is often tedious. The aim of this project was a lightweight, ad hoc setup for fault-tolerant computation requiring only a Java runtime and no administrator rights, while utilizing all CPU cores as effectively as possible. The Sputnik framework provides ad hoc distributed computation on the Java Virtual Machine and fully uses all supplied CPU cores. It provides a graphical user interface for deployment setup and a web user interface displaying the status of current computation jobs. Neither a permanent setup nor administrator privileges are required. We demonstrate the utility of our approach on feature selection of microarray data. The Sputnik framework is available on Github http://github.com/sysbio-bioinf/sputnik under the Eclipse Public License. hkestler@fli-leibniz.de or hans.kestler@uni-ulm.de Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Distributed Computing Architecture for Image-Based Wavefront Sensing and 2 D FFTs

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-01-01

    Image-based wavefront sensing (WFS) provides significant advantages over interferometer-based wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing it. The development and testing of these high-performance computing architectures are essential to such missions as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and Spherical Primary Optical Telescope (SPOT). These specialized computing architectures require numerous two-dimensional Fourier transforms, which necessitate an all-to-all communication when applied on a distributed computational architecture. Several solutions for distributed computing are presented, with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis will be presented. The solutions offered could be applied to other all-to-all communication and scientifically computationally complex problems.
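
    The all-to-all pattern arises because a 2D FFT factors into row FFTs, a global transpose, and column FFTs; when the array is distributed by rows, the transpose is the all-to-all exchange. A single-process numpy sketch of that factorization:

```python
import numpy as np

# Minimal sketch of the row-column factorization behind distributed 2D FFTs:
# 1D FFTs along rows, a transpose, then 1D FFTs along the former columns.
# On a cluster with the array distributed by rows, the transpose step is the
# all-to-all communication discussed above; here everything is one process.

def fft2_row_column(a):
    step1 = np.fft.fft(a, axis=1)        # FFT each row (local to its owner)
    step2 = step1.T                      # transpose = the all-to-all exchange
    step3 = np.fft.fft(step2, axis=1)    # FFT each former column
    return step3.T                       # undo the transpose

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.standard_normal((64, 64))
    assert np.allclose(fft2_row_column(a), np.fft.fft2(a))
    print("row-column decomposition matches np.fft.fft2")
```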

  15. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    PubMed

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and the efficiency gains achieved using variance reduction and a compiled programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA; the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output, can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
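
    Antithetic variates can be demonstrated in a few lines: pair each uniform draw u with 1-u so the paired outcomes are negatively correlated, lowering the variance of their average. The toy model below is an invented monotone function, not the diabetes model.

```python
import random
from statistics import mean, stdev

# Minimal sketch of antithetic variates, the variance-reduction technique
# described above. For a monotone model, outcomes at u and 1-u are
# negatively correlated, so averaging each pair reduces estimator variance
# at the same total sampling budget.

def model(u):
    return u ** 2 + 0.5 * u          # stand-in for one simulated outcome

def plain(n, rng):
    return mean(model(rng.random()) for _ in range(n))

def antithetic(n, rng):
    vals = []
    for _ in range(n // 2):
        u = rng.random()
        vals.append(0.5 * (model(u) + model(1 - u)))   # antithetic pair
    return mean(vals)

if __name__ == "__main__":
    rng = random.Random(42)
    reps = 200
    sd_plain = stdev(plain(1000, rng) for _ in range(reps))
    sd_anti = stdev(antithetic(1000, rng) for _ in range(reps))
    print(f"std of estimate, plain: {sd_plain:.5f}, antithetic: {sd_anti:.5f}")
```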

  16. Techno-economic requirements for automotive composites

    NASA Technical Reports Server (NTRS)

    Arnold, Scot

    1993-01-01

    New technology generally serves two main goals of the automotive industry: one is to enable vehicles to comply with various governmental regulations and the other is to provide a competitive edge in the market. The latter goal can either be served through improved manufacturing and design capabilities, such as computer aided design and computer aided manufacturing, or through improved product performance, such as anti-lock braking (ABS). Although safety features are sometimes customer driven, such as the increasing use of airbags and ABS, most are determined by regulations as outlined by the Federal Motor Vehicle Safety Standards (FMVSS). Other standards, set by the Environmental Protection Agency, determine acceptable levels of emissions and fuel consumption. State governments, such as in California, are also setting precedent standards, such as requiring manufacturers to offer zero-emission vehicles as a certain fraction of their sales in the state. The drive to apply new materials in the automobile stems from the need to reduce weight and improve fuel efficiency. Topics discussed include: new lightweight materials; types of automotive materials; automotive composite applications; the role for composite materials in automotive applications; advantages and disadvantages of composite materials; material substitution economics; economic perspective; production economics; and composite materials production economics.

  17. Annual economic impacts of seasonal influenza on US counties: Spatial heterogeneity and patterns

    PubMed Central

    2012-01-01

    Economic impacts of seasonal influenza vary across US counties, but little estimation has been conducted at the county level. This research computed annual economic costs of seasonal influenza for 3143 US counties based on Census 2010, identified inherent spatial patterns, and investigated cost-benefits of vaccination strategies. The computing model modified existing methods for national level estimation, and further emphasized spatial variations between counties, in terms of population size, age structure, influenza activity, and income level. Upon such a model, four vaccination strategies that prioritize different types of counties were simulated and their net returns were examined. The results indicate that the annual economic costs of influenza varied from $13.9 thousand to $957.5 million across US counties, with a median of $2.47 million. Prioritizing vaccines to counties with high influenza attack rates produces the lowest influenza cases and highest net returns. This research fills the current knowledge gap by downscaling the estimation to a county level, and adds spatial variability into studies of influenza economics and interventions. Compared to the national estimates, the presented statistics and maps will offer detailed guidance for local health agencies to fight against influenza. PMID:22594494

  18. Economic impact of public resource supply constraints in northeast Oregon. Forest Service general technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, E.C.; Holland, D.W.; Haynes, R.W.

    1997-04-01

    Traditional, fixed-price (input-output) economic models provide a useful framework for conceptualizing links in a regional economy. Apparent shortcomings in these models, however, severely restrict our ability to deduce valid prescriptions for public policy and economic development. A more efficient approach using regional computable general equilibrium (CGE) models, as well as a brief survey of relevant literature, is presented. Computable general equilibrium results under several different resource policy scenarios are examined and contrasted with a fixed-price analysis. In the most severe CGE scenario, elimination of Federal range programs caused the loss of 1,371 jobs (2.3 percent of regional employment) and $29 million (1.6 percent) of household income; and an 80-percent reduction in Federal log supplies resulted in the loss of 3,329 jobs (5.5 percent of regional employment) and $76 million (4.2 percent) of household income. These results do not include positive economic impacts associated with improvement in salmon runs. Economic counter scenarios indicate that increases in tourism and high-technology manufacturing and growth in the population of retirees can largely offset total employment and income losses.

  19. Socio-economic variation in CT scanning in Northern England, 1990-2002

    PubMed Central

    2012-01-01

    Background Socio-economic status is known to influence health throughout life. In childhood, studies have shown increased injury rates in more deprived settings. Socio-economic status may therefore be related to rates of certain medical procedures, such as computed tomography (CT) scans. This study aimed to assess socio-economic variation among young people having CT scans in Northern England between 1990 and 2002 inclusive. Methods Electronic data were obtained from Radiology Information Systems of all nine National Health Service hospital Trusts in the region. CT scan data, including sex, date of scan, age at scan, number and type of scans were assessed in relation to quintiles of Townsend deprivation scores, obtained from linkage of postcodes with census data, using χ2 tests and Spearman rank correlations. Results During the study period, 39,676 scans were recorded on 21,089 patients, with 38,007 scans and 19,485 patients (11344 male and 8132 female) linkable to Townsend scores. The overall distributions of both scans and patients by quintile of Townsend deprivation scores were significantly different to the distributions of Townsend scores from the census wards included in the study (p < 0.0001). There was a significant association between type of scan and deprivation quintile (p < 0.0001), primarily due to the higher proportions of head scans in the three most deprived quintiles, and slightly higher proportions of chest scans and abdomen and pelvis scans in the least deprived groups. There was also a significant association (p < 0.0001) between the patient's age at the time of the CT scan and Townsend deprivation quintiles, with slightly increasing proportions of younger children with increasing deprivation. A similar association with age (p < 0.0001) was seen when restricting the data to include only the first scan of each patient. The number of scans per patient was also associated with Townsend deprivation quintiles (p = 0.014). Conclusions Social inequalities exist in the numbers of young people undergoing CT scans with those from deprived areas more likely to do so. This may reflect the rates of injuries in these individuals and implies that certain groups within the population may receive higher radiation doses than others due to medical procedures. PMID:22283843

  20. Comparative analysis of economic models in selected solar energy computer programs

    NASA Astrophysics Data System (ADS)

    Powell, J. W.; Barnes, K. A.

    1982-01-01

    The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single family residence and a low rise office building.

  1. Airline return-on-investment model for technology evaluation. [computer program to measure economic value of advanced technology applied to passenger aircraft

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This report presents the derivation, description, and operating instructions for a computer program (TEKVAL) which measures the economic value of advanced technology features applied to long-range commercial passenger aircraft. The program consists of three modules: an airplane sizing routine, a direct operating cost routine, and an airline return-on-investment routine. These modules are linked such that they may be operated sequentially or individually, with one routine generating the input for the next or with the option of externally specifying the input for either of the economic routines. A very simple airplane sizing technique was previously developed, based on the Breguet range equation. For this program, that sizing technique has been greatly expanded and combined with the formerly separate DOC and ROI programs to produce TEKVAL.
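
    The Breguet range equation at the heart of such a sizing routine relates range to speed, specific fuel consumption, lift-to-drag ratio, and the weight fraction; inverting it gives the fuel fraction required for a design range. The sketch below uses invented, round jet-transport numbers and is not the TEKVAL routine.

```python
import math

# Minimal sketch of the jet-aircraft Breguet range equation that a simple
# sizing routine can be built around:
#   R = (V / TSFC) * (L/D) * ln(W_initial / W_final)
# Inverting it gives the fuel fraction needed for a target range.

def breguet_range(v_mps, tsfc_per_s, l_over_d, w_init, w_final):
    return (v_mps / tsfc_per_s) * l_over_d * math.log(w_init / w_final)

def fuel_fraction_for_range(range_m, v_mps, tsfc_per_s, l_over_d):
    """Fraction of initial weight burned as fuel to fly the given range."""
    return 1.0 - math.exp(-range_m * tsfc_per_s / (v_mps * l_over_d))

if __name__ == "__main__":
    v, tsfc, l_d = 250.0, 1.5e-4, 17.0   # m/s; 1/s (~0.54 lb/lbf/hr); cruise L/D
    r = 9_000_000.0                      # 9,000 km design range, in meters
    ff = fuel_fraction_for_range(r, v, tsfc, l_d)
    print(f"required fuel fraction: {ff:.3f}")
    print(f"check range: {breguet_range(v, tsfc, l_d, 1.0, 1.0 - ff) / 1000:.0f} km")
```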

  2. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    NASA Astrophysics Data System (ADS)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  3. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  4. System-wide power management control via clock distribution network

    DOEpatents

    Coteus, Paul W.; Gara, Alan; Gooding, Thomas M.; Haring, Rudolf A.; Kopcsay, Gerard V.; Liebsch, Thomas A.; Reed, Don D.

    2015-05-19

    An apparatus, method and computer program product for automatically controlling power dissipation of a parallel computing system that includes a plurality of processors. A computing device issues a command to the parallel computing system. A clock pulse-width modulator encodes the command in a system clock signal to be distributed to the plurality of processors. The plurality of processors in the parallel computing system receive the system clock signal including the encoded command, and adjusts power dissipation according to the encoded command.
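
    The idea of encoding a command in the distributed clock can be illustrated bit by bit: keep the period fixed, widen a pulse for a 1 and narrow it for a 0, and let every node classify duty cycles. The toy sketch below uses invented pulse widths and framing and is not the patented mechanism.

```python
# Toy sketch of encoding a command in a clock signal via pulse-width
# modulation, in the spirit of the apparatus described above: the period is
# fixed, a wide pulse carries a 1 and a narrow pulse a 0, and every node on
# the clock tree can decode the same broadcast command. The duty-cycle
# values and the 8-bit framing are invented for illustration.

PERIOD = 100                 # clock period in arbitrary time units
WIDE, NARROW = 70, 30        # pulse widths encoding 1 and 0

def encode(command, bits=8):
    return [WIDE if (command >> i) & 1 else NARROW
            for i in reversed(range(bits))]

def decode(pulse_widths):
    value = 0
    for width in pulse_widths:
        value = (value << 1) | (1 if width > PERIOD // 2 else 0)
    return value

if __name__ == "__main__":
    for cmd in (0x00, 0x5A, 0xFF):    # e.g., power-dissipation set points
        assert decode(encode(cmd)) == cmd
    print("all commands decoded correctly")
```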

  5. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.

  6. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2000-01-01

    The p2d2 project at NAS has built a debugger for applications running on heterogeneous computational grids. It employs a client-server architecture to simplify the implementation. Its user interface has been designed to provide process control and state examination functions on a computation containing a large number of processes. It can find processes participating in distributed computations even when those processes were not created under debugger control. These process identification techniques work both on conventional distributed executions as well as those on a computational grid.

  7. Understanding determinants of unequal distribution of stillbirth in Tehran, Iran: a concentration index decomposition approach.

    PubMed

    Almasi-Hashiani, Amir; Sepidarkish, Mahdi; Safiri, Saeid; Khedmati Morasae, Esmaeil; Shadi, Yahya; Omani-Samani, Reza

    2017-05-17

    The present inquiry set out to determine the economic inequality in history of stillbirth and to understand the determinants of the unequal distribution of stillbirth in Tehran, Iran. A population-based cross-sectional study was conducted on 5170 pregnancies in Tehran, Iran, since 2015. Principal component analysis (PCA) was applied to measure asset-based economic status. The concentration index was used to measure socioeconomic inequality in stillbirth and then decomposed into its determinants. The concentration index and its 95% CI for stillbirth was -0.121 (-0.235 to -0.002). Decomposition of the concentration index showed that mother's education (50%), mother's occupation (30%), economic status (26%) and father's age (12%) had the highest positive contributions to measured inequality in stillbirth history in Tehran. Mother's age (17%) had the highest negative contribution to inequality. Stillbirth is unequally distributed among Iranian women and is mostly concentrated among people of low economic status. Mother-related factors had the highest positive and negative contributions to inequality, highlighting specific interventions for mothers to redress inequality. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
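
    The concentration index used here has a convenient covariance form, C = 2 cov(h, r) / mean(h), where h is the health variable and r the fractional economic rank. A sketch on invented data (the risk model and sample are synthetic, not the study's data):

```python
import numpy as np

# Minimal sketch of the concentration index via its covariance form
# C = 2 * cov(h, r) / mean(h), where h is the health outcome (here a 0/1
# stillbirth indicator) and r is each woman's fractional rank in the
# economic-status distribution. A negative C means the outcome is
# concentrated among the poor.

def concentration_index(health, ses):
    order = np.argsort(ses)                 # rank by economic status
    h = np.asarray(health, float)[order]
    n = len(h)
    r = (np.arange(1, n + 1) - 0.5) / n     # fractional ranks
    return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ses = rng.normal(size=5000)                   # asset-based score (invented)
    p = 1 / (1 + np.exp(1.5 + 0.6 * ses))         # poorer => higher risk
    stillbirth = rng.binomial(1, p)
    print(f"concentration index: {concentration_index(stillbirth, ses):.3f}")
```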

  8. A summary analysis of the 3rd inquiry.

    PubMed

    1977-01-01

    20 ESCAP member countries responded to the "Third Population Inquiry among Governments: Population policies in the context of development in 1976." The questionnaire sent to the member countries covered economic and social development and population growth, mortality, fertility and family formation, population distribution and internal migration, international migration, population data collection and research, training, and institutional arrangements for the formulation of population policies within development. Most of the governments in the ESCAP region that responded indicate that the present rate of population growth constrains their social and economic development. Among the governments that consider the present rate of population growth to constrain economic and social development, 13 countries regarded the most appropriate response to the constraint would include an adjustment of both socioeconomic and demographic factors. 11 of the governments regarded their present levels of average life expectancy at birth "acceptable" and 7 identified their levels as "unacceptable." Most of the governments who responded consider that, in general, their present level of fertility is too high and constrains family well-being. Internal migration and population distribution are coming to be seen as concerns for government population policy. The most popular approaches to distributing economic and social activities are rural development, urban and regional development and industrial dispersion. There was much less concern among the governments returning the questionnaire about the effect of international migration than internal migration on social and economic development.

  9. Watershed and Economic Data InterOperability (WEDO) System

    EPA Science Inventory

    Hydrologic modeling is essential for environmental, economic, and human health decision-making. However, sharing of modeling studies is limited within the watershed modeling community. Distribution of hydrologic modeling research typically involves publishing summarized data in p...

  10. Watershed and Economic Data InterOperability (WEDO) System (presentation)

    EPA Science Inventory

    Hydrologic modeling is essential for environmental, economic, and human health decision- making. However, sharing of modeling studies is limited within the watershed modeling community. Distribution of hydrologic modeling research typically involves publishing summarized data in ...

  11. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    PubMed

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high-temporal-resolution, real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
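
    The distribution pattern itself is simple to sketch: independent pieces of an acquisition are farmed out to workers and reconstructed in parallel. Below, a plain inverse FFT stands in for the nonlinear l1-SPIRiT solve and a local process pool stands in for multiple Gadgetron instances; this illustrates the pattern only and is not the Gadgetron API.

```python
import numpy as np
from multiprocessing import Pool

# Minimal sketch of a distributed reconstruction pipeline: independent
# pieces of an acquisition (here, fully sampled 2D k-space slices) are
# handed to worker processes and reconstructed in parallel. A plain
# inverse FFT stands in for the nonlinear solve.

def reconstruct_slice(kspace):
    return np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    slices = [rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
              for _ in range(16)]                  # synthetic k-space data
    with Pool(processes=4) as pool:                # the "reconstruction nodes"
        images = pool.map(reconstruct_slice, slices)
    print(f"reconstructed {len(images)} slices of shape {images[0].shape}")
```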

  12. 43 CFR 4130.8-1 - Payment of fees.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... economic value of livestock grazing, defined by the Congress as fair market value (FMV) of the forage; $1.23=The base economic value of grazing on public rangeland established by the 1966 Western Livestock... multiplied by the result of the Forage Value Index (computed annually from data supplied by the National...

  13. 43 CFR 4130.8-1 - Payment of fees.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... economic value of livestock grazing, defined by the Congress as fair market value (FMV) of the forage; $1.23=The base economic value of grazing on public rangeland established by the 1966 Western Livestock... multiplied by the result of the Forage Value Index (computed annually from data supplied by the National...

  14. 43 CFR 4130.8-1 - Payment of fees.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... economic value of livestock grazing, defined by the Congress as fair market value (FMV) of the forage; $1.23=The base economic value of grazing on public rangeland established by the 1966 Western Livestock... multiplied by the result of the Forage Value Index (computed annually from data supplied by the National...

  15. 43 CFR 4130.8-1 - Payment of fees.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... economic value of livestock grazing, defined by the Congress as fair market value (FMV) of the forage; $1.23=The base economic value of grazing on public rangeland established by the 1966 Western Livestock... multiplied by the result of the Forage Value Index (computed annually from data supplied by the National...

  16. Economic Modeling as a Component of Academic Strategic Planning.

    ERIC Educational Resources Information Center

    MacKinnon, Joyce; Sothmann, Mark; Johnson, James

    2001-01-01

    Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)

  17. Computer Assisted Learning Feature--Using Databases in Economics and Business Studies.

    ERIC Educational Resources Information Center

    Davies, Peter; Allison, Ron.

    1989-01-01

    Describes ways in which databases can be used in economics and business education classes. Explores arguments put forth by advocates for the use of databases in the classroom. Offers information on British software and discusses six online database systems listing the features of each. (KO)

  18. Assessing the benefits and economic values of trees

    Treesearch

    David J. Nowak

    2017-01-01

    Understanding the environmental, economic, and social/community benefits of nature, in particular trees and forests, can lead to better vegetation management and designs to optimize environmental quality and human health for current and future generations. Computer models have been developed to assess forest composition and its associated effects on environmental...

  19. Evaluation of Computer-Assisted Instruction in Principles of Economics.

    ERIC Educational Resources Information Center

    Coates, Dennis; Humphreys, Brad R.

    2001-01-01

    Assesses the effectiveness of supplementary Web-based materials and activities in traditional introductory college economics courses. Results suggest that faculty should focus more on developing self-test quizzes and effective bulletin board discussion projects as opposed to generating online content related to text or lecture notes. (Author/LRW)

  20. Teaching Practices in Principles of Economics Courses at Michigan Community Colleges.

    ERIC Educational Resources Information Center

    Utech, Claudia J.; Mosti, Patricia A.

    1995-01-01

    Presents findings from a study of teaching practices in Principles of Economics courses at Michigan's 29 community colleges. Describes course prerequisites; textbooks used; lecture supplements; and the use of experiential learning tools, such as computers and field trips. Presents three recommendations for improving student preparation in…

  1. 45 CFR 34.8 - Computation of award and settlement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... amount normally payable on property damaged beyond economical repair shall not exceed its depreciated value. If the cost of repairs is less than the depreciated value it shall be considered economically repairable and the costs of repairs shall be the amount payable. (b) Depreciation in value of an item shall...

  2. 45 CFR 34.8 - Computation of award and settlement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... amount normally payable on property damaged beyond economical repair shall not exceed its depreciated value. If the cost of repairs is less than the depreciated value it shall be considered economically repairable and the costs of repairs shall be the amount payable. (b) Depreciation in value of an item shall...

  3. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in the Fortran language and can be executed on almost all computers. All components of PHITS, such as its source, executable and data-library files, are assembled in one package and then distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.

  4. Efficiency Evaluation of Handling of Geologic-Geophysical Information by Means of Computer Systems

    NASA Astrophysics Data System (ADS)

    Nuriyahmetova, S. M.; Demyanova, O. V.; Zabirova, L. M.; Gataullin, I. I.; Fathutdinova, O. A.; Kaptelinina, E. A.

    2018-05-01

    Developing oil and gas resources under difficult geological, geographical, and economic conditions requires considerable financial outlay; such costs must therefore be carefully justified, and the most promising approaches and modern technologies must be applied with an eye to the cost efficiency of the planned activities. Ensuring high precision in regional and local forecasts and in the modeling of hydrocarbon reservoirs requires analyzing huge arrays of spatially distributed, constantly changing information. Solving this task calls for modern remote methods of investigating prospective oil-and-gas territories, for the combined use of remote, nondestructive geologic-geophysical data and space-based Earth-sounding data, and for the most advanced technologies for processing them. In this article, the authors review the experience of Russian and foreign companies in processing geologic-geophysical information with computer systems. They conclude that multidimensional analysis of the geologic-geophysical information space and effective planning and monitoring of exploration work require the broad use of geoinformation technologies, one of the most promising directions for achieving high profitability in the oil and gas industry.

  5. Modeling Microalgae Productivity in Industrial-Scale Vertical Flat Panel Photobioreactors.

    PubMed

    Endres, Christian H; Roth, Arne; Brück, Thomas B

    2018-05-01

    Potentially achievable biomass yields are a decisive performance indicator for the economic viability of mass cultivation of microalgae. In this study, a computer model has been developed and applied to estimate the productivity of microalgae for large-scale outdoor cultivation in vertical flat panel photobioreactors. Algae growth is determined based on simulations of the reactor temperature and light distribution. Site-specific weather and irradiation data are used for annual yield estimations in six climate zones. Shading and reflections between opposing panels, and between panels and the ground, are dynamically computed based on the reactor geometry and the position of the sun. The results indicate that thin panels (≤0.05 m) are best suited for the assumed cell density of 2 g L-1 and that reactor panels should face in the north-south direction. Panel spacings of 0.4-0.75 m at a panel height of 1 m appear most suitable for commercial applications. Under these preconditions, yields of around 10 kg m-2 a-1 are possible for most locations in the U.S. Significantly lower yields are to be expected only in hot climates, where extreme reactor temperatures limit overall productivity.

  6. An information system for epidemiology based on a computer-based medical record.

    PubMed

    Verdier, C; Flory, A

    1994-12-01

    A new way of building an information system for epidemiology is presented. Based on our analysis of current and future requirements, a system is proposed which allows for the collection, organization and distribution of data within a computer network. In this application, two broad communities of users (physicians and epidemiologists) can be identified, each with their own perspectives and goals. The different requirements of each community lead us to a client-service centered architecture which provides the functionality required by the two groups. The resulting physician workstation provides help for recording and querying medical information about patients and from a pharmacological database. All information is classified and coded in order to be retrieved for pharmaco-economic studies. The service center receives information from physician workstations and permits organizations that are in charge of statistical studies to work with "real" data recorded during patient encounters. This leads to a new approach in epidemiology: studies can be carried out with more efficient data acquisition. For modelling the information system, we use an object-oriented approach. We have observed that the object-oriented representation, particularly its concepts of generalization, aggregation and encapsulation, is well suited to our problem.

  7. Task allocation in a distributed computing system

    NASA Technical Reports Server (NTRS)

    Seward, Walter D.

    1987-01-01

    A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.

  8. Distributed Training for the Reserve Component: Course Conversion and Implementation Guidelines for Computer Conferencing

    DTIC Science & Technology

    1990-08-01

    August 1990. Field Element at Boise, Idaho; Field Unit at Fort Knox, Kentucky. Report descriptors: asynchronous computer conferencing; training technology; Reserve Component; distributed training.

  9. Design and implementation of a UNIX based distributed computing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, J.S.; Michael, M.W.

    1994-12-31

    We have designed, implemented, and are running a corporate-wide distributed processing batch queue on a large number of networked workstations using the UNIX® operating system. Atlas Wireline researchers and scientists have used the system for over a year. The large increase in available computer power has greatly reduced the time required for nuclear and electromagnetic tool modeling. Use of remote distributed computing has simultaneously reduced computation costs and increased usable computer time. The system integrates equipment from different manufacturers, using various CPU architectures, distinct operating system revisions, and even multiple processors per machine. Various differences between the machines have to be accounted for in the master scheduler. These differences include shells, command sets, swap spaces, memory sizes, CPU sizes, and OS revision levels. Remote processing across a network must be performed in a manner that is seamless from the users' perspective. The system currently uses IBM RISC System/6000®, SPARCstation™, HP9000s700, HP9000s800, and DEC Alpha AXP™ machines. Each CPU in the network has its own speed rating, allowed working hours, and workload parameters. The system is designed so that all of the computers in the network can be optimally scheduled without adversely impacting the primary users of the machines. The increase in the total usable computational capacity by means of distributed batch computing can change corporate computing strategy. The integration of disparate computer platforms eliminates the need to buy one type of computer for computations, another for graphics, and yet another for day-to-day operations. It might be possible, for example, to meet all research and engineering computing needs with existing networked computers.
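
    A master scheduler of the kind described, weighing per-machine speed ratings, allowed working hours, and workload caps, can be sketched as follows. The host names, table layout, and scoring rule are hypothetical; the point is only how heterogeneous machine parameters might enter the placement decision.

        from datetime import datetime

        # Hypothetical machine table; fields mirror the parameters named in
        # the abstract (speed rating, allowed working hours, workload cap).
        MACHINES = [
            {"host": "rs6000-1", "speed": 1.0, "hours": (18, 8), "cap": 2, "jobs": 1},
            {"host": "sparc-7",  "speed": 0.6, "hours": (0, 24), "cap": 4, "jobs": 4},
            {"host": "alpha-3",  "speed": 1.8, "hours": (18, 8), "cap": 2, "jobs": 0},
        ]

        def available(m, now=None):
            # Usable outside the owner's working hours and below the job cap
            hour = (now or datetime.now()).hour
            start, end = m["hours"]
            in_window = start <= hour < end if start < end \
                        else (hour >= start or hour < end)
            return in_window and m["jobs"] < m["cap"]

        def pick_host(machines, now=None):
            # Prefer fast machines, discounted by how busy they already are
            usable = [m for m in machines if available(m, now)]
            return max(usable, key=lambda m: m["speed"] / (1 + m["jobs"]),
                       default=None)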

  10. Managing Watersheds as Coupled Human-Natural Systems: A Review of Research Opportunities

    NASA Astrophysics Data System (ADS)

    Cai, X.

    2011-12-01

    Many watersheds around the world are impaired with severe social and environmental problems due to heavy anthropogenic stresses. Humans have transformed hydrological and biochemical processes in watersheds from a stationary to a non-stationary status through direct (e.g., water withdrawals) and indirect (e.g., altering vegetation and land cover) interferences. It has been found that in many watersheds, socio-economic drivers, which have caused increasingly intensive alteration of natural processes, have even overcome natural variability to become the dominant factor affecting the behavior of watershed systems. Reversing this trend requires an understanding of the drivers of this intensification trajectory, and needs tremendous policy reform and investment. As stressed by several recent National Research Council (NRC) reports, watershed management will pose an enormous challenge in the coming decades. Correspondingly, the focus of research has begun to evolve from the management of reservoir, stormwater and aquifer systems to the management of integrated watershed systems, to which policy instruments designed to make more rational economic use of water resources are likely to be applied. To provide a few examples: reservoir operation studies have moved from a local to a watershed scale in order to consider upstream best management practices in soil conservation and erosion control and downstream ecological flow requirements and water rights; watersheds have been modeled as integrated hydrologic-economic systems with multidisciplinary modeling efforts, instead of as traditional isolated physical systems. Today's watershed management calls for a re-definition of watersheds from isolated natural systems to coupled human-natural systems (CHNS), which are characterized by the interactions between human activities and natural processes, crossing various spatial and temporal scales within the context of a watershed. The importance of this conceptual innovation is evidenced by 1) institutional innovation for integrated watershed management; 2) real-world management practices involving multidisciplinary expertise; 3) the growing role of economics in systems analysis; and 4) enhanced research programs such as the CHNS program and the Water, Sustainability and Climate (WSC) program at the US National Science Foundation (NSF). Furthermore, recent scientific and technological developments are expected to support integrated watershed system analysis approaches, such as: 1) increasing availability of distributed digital datasets, especially from remote sensing products (e.g., digital watersheds); 2) distributed and semi-distributed watershed hydrologic modeling; 3) enhanced hydroclimatic monitoring and forecasting; 4) identified evidence of vulnerability and threshold behavior of watersheds; and 5) continuing improvements in computational and optimization algorithms. Managing watersheds as CHNS will be critical for watershed sustainability, which ensures that human societies will continue to benefit from the watershed through the development of harmonious relationships between human and natural systems. This presentation provides a review of the research opportunities that take advantage of the concept of CHNS and the associated scientific, technological and institutional innovations.

  11. Rural poverty and environmental degradation in the Philippines: A system dynamics approach

    NASA Astrophysics Data System (ADS)

    Parayno, Phares Penuliar

    Poverty among the small cultivators in the Philippines remains widespread despite a general increase in per capita income during the last three decades. At the same time, the degradation of agricultural land resources, as sources of daily subsistence for the rural workers, is progressing. Past policy studies on the alleviation of rural poverty in the developing countries have centered on the issue of increasing food production and expanding economic growth, but gave little attention to the constraints imposed by degradation of agricultural land resources. Only in recent years has there been increasing focus on the relationship between rural poverty and environmental degradation. Such inquiry, however, is often framed in terms of simplistic one-way causal relationships which, although often illuminating, do not provide a comprehensive understanding of the different interacting processes that create rural poverty and land degradation. Thus, policies ensuing from such analyses provide only short-term gains without effecting lasting improvement in the living conditions of the small cultivators. This dissertation examines the complex interrelationships between rural poverty and land degradation and attempts to explain the inefficacy of broad development programs in alleviating rural poverty and reversing the deterioration of land resources. The study uses the case of the Philippines for empirical validation. The analysis employs computer simulation experiments with a system dynamics model of a developing economy consisting of an agricultural sector whose microstructure incorporates processes influencing: agricultural production; disbursement of income; changes in the quality of agricultural land resources; demographic behavior; and rural-urban transfer of real and monetary resources. The system dynamics model used in this study extends the wage and income distribution model of Saeed (1988) by adding to it decision structures concerning changes in the quality of agricultural land resources and rural-urban interaction. The study concludes that development programs advancing growth in agricultural production and providing technological, organizational, and financial assistance to target poor groups would not deliver long-term improvement in the economic conditions of the poor peasants unless the distribution of land is altered. Similarly, policies promoting land improvement and conservation measures in an economic environment where land ownership remains skewed do not produce lasting betterment of agricultural land quality. It is shown that a policy discouraging the separation of land ownership from cultivatorship, by imposing a tax on income accrued from absentee ownership, is therefore critical in promoting land ownership among small cultivators and changing the unequal land and income distribution. However, in order to sustain the improvement in the economic and environmental conditions of the small cultivators, this policy of taxing rent income must be complemented by policies that: (1) promote increases in agricultural production; (2) provide technological, organizational, and financial assistance to the small farmers; and (3) promote land improvement measures.
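
    The stock-and-flow feedback at the heart of such a system dynamics model can be suggested with a deliberately tiny sketch: land quality is a stock depleted by cultivation and replenished by conservation investment, and production depends on the stock. All parameter values are illustrative and are not calibrated to the Philippine case or to Saeed's model.

        def simulate(years=50, dt=0.25, conservation=0.02):
            # Land quality is a stock in [0, 1]; production depends on it.
            quality, output_path = 1.0, []
            for _ in range(int(years / dt)):
                output = 100.0 * quality        # production scales with quality
                degradation = 0.03 * quality    # wear from intensive cultivation
                restoration = conservation      # land-improvement measures
                quality = min(1.0, max(0.0,
                              quality + (restoration - degradation) * dt))
                output_path.append(output)
            return output_path

        # Without conservation investment, output decays toward a lower level
        baseline = simulate(conservation=0.0)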

  12. Economic Inequality in Presenting Vision in Shahroud, Iran: Two Decomposition Methods.

    PubMed

    Mansouri, Asieh; Emamian, Mohammad Hassan; Zeraati, Hojjat; Hashemi, Hasan; Fotouhi, Akbar

    2017-04-22

    Visual acuity, like many other health-related outcomes, is not equally distributed across socio-economic groups. We conducted this study to estimate and decompose economic inequality in presenting visual acuity using two methods, and to compare their results, in a population aged 40-64 years in Shahroud, Iran. The data of 5188 participants in the first phase of the Shahroud Cohort Eye Study, performed in 2009, were used for this study. Our outcome variable was presenting visual acuity (PVA), measured using LogMAR (logarithm of the minimum angle of resolution). The living standard variable used for estimation of inequality was economic status, constructed by principal component analysis on home assets. The inequality indices were the concentration index and the gap between low and high economic groups. We decomposed these indices by the concentration index and Blinder-Oaxaca decomposition approaches, respectively, and compared the results. The concentration index of PVA was -0.245 (95% CI: -0.278, -0.212). The PVA gap between groups with a high and low economic status was 0.0705 and was in favor of the high economic group. Education, economic status, and age were the most important contributors to inequality in both the concentration index and the Blinder-Oaxaca decomposition. The percent contributions of these three factors in the concentration index and Blinder-Oaxaca decomposition were 41.1% vs. 43.4%, 25.4% vs. 19.1%, and 15.2% vs. 16.2%, respectively. Other factors, including gender, marital status, employment status and diabetes, had minor contributions. This study showed that individuals with poorer visual acuity were more concentrated among people with a lower economic status. The main contributors to this inequality were similar in the concentration index and Blinder-Oaxaca decomposition. It can therefore be concluded that appropriate interventions to promote literacy and income levels among people with low economic status, policies to address economic problems in the elderly, and more attention to their vision problems can help to alleviate economic inequality in visual acuity.
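
    The concentration index reported here (C = -0.245) follows the standard covariance formula C = 2 cov(y, r) / mean(y), where r is the fractional rank by economic status. A minimal sketch with an invented function name, not the authors' code:

        import numpy as np

        def concentration_index(outcome, economic_status):
            # C = 2 * cov(y, fractional economic rank) / mean(y); a negative C
            # means the outcome is concentrated among the worse-off.
            y = np.asarray(outcome, dtype=float)
            order = np.argsort(economic_status)     # rank by economic status
            ranks = np.empty(len(y))
            ranks[order] = (np.arange(1, len(y) + 1) - 0.5) / len(y)
            return 2.0 * np.cov(y, ranks, bias=True)[0, 1] / y.mean()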

  13. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed-circuit television experiment are included.
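
    The rectangular-region probability in such a routine reduces to inclusion-exclusion over the joint CDF. A modern re-expression using SciPy is sketched below; the original program predates these libraries, so this is an equivalent computation, not the program itself.

        import numpy as np
        from scipy.stats import multivariate_normal

        def rectangle_probability(mean, cov, lower, upper):
            # P(a1 < X < b1, a2 < Y < b2) by inclusion-exclusion on the CDF
            mvn = multivariate_normal(mean=mean, cov=cov)
            (a1, a2), (b1, b2) = lower, upper
            return (mvn.cdf([b1, b2]) - mvn.cdf([a1, b2])
                    - mvn.cdf([b1, a2]) + mvn.cdf([a1, a2]))

        # Example: standard bivariate normal with correlation 0.5
        cov = np.array([[1.0, 0.5], [0.5, 1.0]])
        p = rectangle_probability([0.0, 0.0], cov, (-1.0, -1.0), (1.0, 1.0))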

  14. Collaborative Strategic Board Games as a Site for Distributed Computational Thinking

    ERIC Educational Resources Information Center

    Berland, Matthew; Lee, Victor R.

    2011-01-01

    This paper examines the idea that contemporary strategic board games represent an informal, interactional context in which complex computational thinking takes place. When games are collaborative (that is, when a game requires that players work in joint pursuit of a shared goal), the computational thinking is easily observed as distributed across…

  15. VLab: A Science Gateway for Distributed First Principles Calculations in Heterogeneous High Performance Computing Systems

    ERIC Educational Resources Information Center

    da Silveira, Pedro Rodrigo Castro

    2014-01-01

    This thesis describes the development and deployment of a cyberinfrastructure for distributed high-throughput computations of materials properties at high pressures and/or temperatures: the Virtual Laboratory for Earth and Planetary Materials (VLab). VLab was developed to leverage the aggregated computational power of grid systems to solve…

  16. Bringing the CMS distributed computing system into scalable operations

    NASA Astrophysics Data System (ADS)

    Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.

    2010-04-01

    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.

  17. Distributed Optimal Consensus Over Resource Allocation Network and Its Application to Dynamical Economic Dispatch.

    PubMed

    Li, Chaojie; Yu, Xinghuo; Huang, Tingwen; He, Xing

    2018-06-01

    The resource allocation problem is studied and reformulated by a distributed interior point method via a logarithmic barrier. With the facilitation of the graph Laplacian, a fully distributed continuous-time multiagent system is developed for solving the problem. Specifically, to avoid the high singularity of the logarithmic barrier at the boundary, an adaptive parameter switching strategy is introduced into this dynamical multiagent system. The convergence rate of the distributed algorithm is obtained. Moreover, a novel distributed primal-dual dynamical multiagent system is designed in a smart grid scenario to seek the saddle point of dynamical economic dispatch, which coincides with the optimal solution. The dual decomposition technique is applied to transform the optimization problem into easily solvable resource allocation subproblems with local inequality constraints. The good performance of the new dynamical systems is verified by a numerical example and by simulations based on the IEEE six-bus test system, respectively.
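
    To make the dispatch objective concrete: with quadratic generation costs, the optimum equalizes every unit's incremental cost at a common shadow price, which is the quantity the distributed agents reach consensus on. The sketch below finds that price by simple centralized bisection as a point of reference under illustrative coefficients; it is not the paper's distributed primal-dual dynamics.

        import numpy as np

        # Quadratic costs C_i(x) = a_i x^2 + b_i x. At the optimum all units
        # share one incremental cost lambda, so x_i = (lambda - b_i)/(2 a_i).
        a = np.array([0.04, 0.06, 0.05])    # illustrative coefficients
        b = np.array([2.00, 1.50, 1.80])
        demand = 150.0

        def output_at(lam):
            return (lam - b) / (2.0 * a)

        lo, hi = 0.0, 100.0
        for _ in range(60):                 # bisect on supply-demand mismatch
            lam = 0.5 * (lo + hi)
            if output_at(lam).sum() < demand:
                lo = lam
            else:
                hi = lam

        x = output_at(lam)                  # dispatch; x.sum() ~= demand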

  18. Observed oil and gas field size distributions: A consequence of the discovery process and prices of oil and gas

    USGS Publications Warehouse

    Drew, L.J.; Attanasi, E.D.; Schuenemeyer, J.H.

    1988-01-01

    If observed oil and gas field size distributions were obtained by random sampling, the fitted distributions should approximate that of the parent population of oil and gas fields. However, empirical evidence strongly suggests that larger fields tend to be discovered earlier in the discovery process than they would be by random sampling. Economic factors can also limit the number of small fields that are developed and reported. This paper examines observed size distributions in state and federal waters of offshore Texas. Results of the analysis demonstrate how the shape of the observed size distributions changes with significant hydrocarbon price changes. Comparison of state and federal observed size distributions in the offshore area shows how production cost differences also affect the shape of the observed size distribution. Methods for modifying the discovery rate estimation procedures when economic factors significantly affect the discovery sequence are presented. A primary conclusion of the analysis is that, because hydrocarbon price changes can significantly affect the observed discovery size distribution, one should not be confident about inferring the form and specific parameters of the parent field size distribution from the observed distributions. © 1988 International Association for Mathematical Geology.
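
    The size-biased discovery process described here is commonly modeled as successive sampling without replacement with probability proportional to (a power of) field size. The small simulation below, with purely illustrative parameters and an assumed lognormal parent, shows why the observed distribution misrepresents the parent population.

        import numpy as np

        rng = np.random.default_rng(7)
        parent = rng.lognormal(mean=2.0, sigma=1.5, size=2000)  # field sizes

        def discovery_sequence(fields, power=1.0):
            # Successive sampling without replacement, probability
            # proportional to size**power: large fields get found first.
            remaining = list(fields)
            found = []
            while remaining:
                w = np.array(remaining) ** power
                idx = rng.choice(len(remaining), p=w / w.sum())
                found.append(remaining.pop(idx))
            return np.array(found)

        found = discovery_sequence(parent)
        economic_cutoff = 1.0               # small fields go undeveloped
        observed = found[found > economic_cutoff]
        # found[:200].mean() far exceeds found[-200:].mean(), so early
        # discoveries overstate the parent population's typical field size.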

  19. Rapid Analysis of Mass Distribution of Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Zapp, Edward

    2007-01-01

    Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.
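
    The core computation, binning a structure's mass about a user-specified dose point by direction, can be sketched as follows. The random "mass elements" stand in for tessellated CAD geometry, and the binning scheme is an assumption for illustration; RADSET's actual data exchange with the CAD package is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-ins for CAD output: element positions (m) and masses (kg)
        positions = rng.uniform(-2.0, 2.0, size=(10000, 3))
        masses = rng.uniform(0.01, 0.1, size=10000)

        def mass_by_direction(dose_point, positions, masses,
                              n_theta=12, n_phi=24):
            # Total mass in each angular bin about the dose point; a crude
            # analogue of the per-ray mass distributions used in shielding.
            rel = positions - dose_point
            r = np.linalg.norm(rel, axis=1)
            theta = np.arccos(np.clip(rel[:, 2] / np.maximum(r, 1e-12),
                                      -1.0, 1.0))
            phi = np.arctan2(rel[:, 1], rel[:, 0]) + np.pi
            ti = np.minimum((theta / np.pi * n_theta).astype(int), n_theta - 1)
            pj = np.minimum((phi / (2 * np.pi) * n_phi).astype(int), n_phi - 1)
            grid = np.zeros((n_theta, n_phi))
            np.add.at(grid, (ti, pj), masses)
            return grid

        shield_map = mass_by_direction(np.zeros(3), positions, masses)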

  20. Shortfall online: The development of an educational computer game for teaching sustainable engineering to Millennial Generation students

    NASA Astrophysics Data System (ADS)

    Gennett, Zachary Andrew

    Millennial Generation students bring significant learning and teaching challenges to the classroom because of their unique learning styles, breadth of interests related to social and environmental issues, and intimate experiences with technology. As a result, there has been an increased willingness at many universities to experiment with pedagogical strategies that depart from a traditional "learning by listening" model, and move toward more innovative methods involving active learning through computer games. In particular, current students typically express a strong interest in sustainability, in which economic concerns must be weighed relative to environmental and social responsibilities. A game-based setting could prove very effective for fostering an operational understanding of these tradeoffs, and especially the social dimension, which remains largely underdeveloped relative to the economic and environmental aspects. Through an examination of the educational potential of computer games, this study hypothesizes that to acquire the skills necessary to manage and understand the complexities of sustainability, Millennial Generation students must be engaged in active learning exercises that present dynamic problems and foster a high level of social interaction. This has led to the development of an educational computer game, entitled Shortfall, which simulates a business milieu for testing alternative paths regarding the principles of sustainability. This study examines the evolution of Shortfall from an educational board game that teaches the principles of environmentally benign manufacturing to a completely networked computer game, entitled Shortfall Online, that teaches the principles of sustainability. A capital-based theory of sustainability is adopted to more accurately convey the tradeoffs and opportunity costs among economic prosperity, environmental preservation, and societal responsibilities. While the economic and environmental aspects of sustainability have received considerable attention in traditional pedagogical approaches, particular focus is placed on the social dimension of sustainability, as it has remained largely underdeveloped. To measure social sustainability and provide students with an understanding of its significance, a prospective metric utilizing a social capital peer-evaluation survey, unique to Shortfall, is developed.
