NASA Astrophysics Data System (ADS)
Doyle, Chris
2014-01-01
The Vancouver 2010 Winter Olympics were held from 12 to 28 February 2010, and the Paralympic events followed 2 weeks later. During the Games, the weather posed a grave threat to the viability of one venue and created significant complications for the event schedule at others. Forecasts of weather with lead times ranging from minutes to days helped organizers minimize disruptions to sporting events and helped ensure all medal events were successfully completed. Of comparable importance, however, were the scenarios and forecasts of probable weather for the winter in advance of the Games. Forecasts of mild conditions at the time of the Games helped the Games' organizers mitigate what would have been very serious potential consequences for at least one venue. Snowmaking was one strategy employed well in advance of the Games to prepare for the expected conditions. This short study will focus on how operational decisions were made by the Games' organizers on the basis of both climatological and snowmaking forecasts during the pre-Games winter. An attempt will be made to quantify, economically, the value of some of the snowmaking forecasts made for the Games' operators. The results obtained indicate that although the economic value of the snowmaking forecast was difficult to determine, the Games' organizers valued the forecast information greatly. This suggests that further development of probabilistic forecasts for applications like pre-Games snowmaking would be worthwhile.
Quantum Bayesian networks with application to games displaying Parrondo's paradox
NASA Astrophysics Data System (ADS)
Pejic, Michael
Bayesian networks and their accompanying graphical models are widely used for prediction and analysis across many disciplines. We will reformulate these in terms of linear maps. This reformulation will suggest a natural extension, which we will show is equivalent to standard textbook quantum mechanics. Therefore, this extension will be termed quantum. However, the term quantum should not be taken to imply that this extension is necessarily only of utility in situations traditionally thought of as in the domain of quantum mechanics. In principle, it may be employed in any modelling situation, say forecasting the weather or the stock market---it is up to experiment to determine if this extension is useful in practice. Even restricting to the domain of quantum mechanics, with this new formulation the advantages of Bayesian networks can be maintained for models incorporating quantum and mixed classical-quantum behavior. The use of these will be illustrated by various basic examples. Parrondo's paradox refers to the situation where two multi-round games with a fixed winning criterion, both with probability greater than one-half for one player to win, are combined. Using a possibly biased coin to determine the rule to employ for each round, paradoxically, the previously losing player now wins the combined game with probability greater than one-half. Using the extended Bayesian networks, we will formulate and analyze classical observed, classical hidden, and quantum versions of a game that displays this paradox, finding bounds for the discrepancy from naive expectations for the occurrence of the paradox. A quantum paradox inspired by Parrondo's paradox will also be analyzed. We will prove a bound for the discrepancy from naive expectations for this paradox as well. Games involving quantum walks that achieve this bound will be presented.
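A minimal simulation of the canonical capital-dependent Parrondo games (the Harmer-Abbott construction, shown here as a generic illustration; the thesis above analyzes its own game variants and quantum versions, which this sketch does not reproduce):

```python
import random

def play_a(capital, eps=0.005):
    # Game A: a slightly losing biased coin.
    return 1 if random.random() < 0.5 - eps else -1

def play_b(capital, eps=0.005):
    # Game B: branch on capital mod 3; both branches make B losing on its own.
    p = 0.10 - eps if capital % 3 == 0 else 0.75 - eps
    return 1 if random.random() < p else -1

def run(choose_game, rounds=100_000):
    capital = 0
    for _ in range(rounds):
        game = choose_game()
        capital += game(capital)
    return capital

random.seed(1)
print("A only:    ", run(lambda: play_a))                            # drifts negative
print("B only:    ", run(lambda: play_b))                            # drifts negative
print("random mix:", run(lambda: random.choice([play_a, play_b])))   # drifts positive
```

Playing A or B alone loses on average, yet randomly alternating between them wins; this violation of the naive expectation is exactly what the paradox refers to.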
Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game
NASA Astrophysics Data System (ADS)
Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian
2016-08-01
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
Beat the Instructor: An Introductory Forecasting Game
ERIC Educational Resources Information Center
Snider, Brent R.; Eliasson, Janice B.
2013-01-01
This teaching brief describes a 30-minute game where student groups compete in-class in an introductory time-series forecasting exercise. The students are challenged to "beat the instructor" who competes using forecasting techniques that will be subsequently taught. All forecasts are graphed prior to revealing the randomly generated…
What Information Theory Says about Bounded Rational Best Response
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
Probability Collectives (PC) provides the information-theoretic extension of conventional full-rationality game theory to bounded rational games. Here an explicit solution to the equations giving the bounded rationality equilibrium of a game is presented. Then PC is used to investigate games in which the players use bounded rational best-response strategies. Next it is shown that in the continuum-time limit, bounded rational best response games result in a variant of the replicator dynamics of evolutionary game theory. It is then shown that for team (shared-payoff) games, this variant of replicator dynamics is identical to Newton-Raphson iterative optimization of the shared utility function.
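For reference, the textbook replicator dynamics that the continuum-time limit above varies can be written as follows (the paper derives a variant of this equation, not the equation itself):

```latex
\[
  \dot{x}_i = x_i\left(f_i(x) - \bar{f}(x)\right),
  \qquad \bar{f}(x) = \sum_j x_j f_j(x)
\]
% x_i : fraction of the population playing strategy i
% f_i : expected payoff of strategy i; a strategy grows in proportion
%       to its payoff advantage over the population mean.
```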
Improved Lower Bounds on the Price of Stability of Undirected Network Design Games
NASA Astrophysics Data System (ADS)
Bilò, Vittorio; Caragiannis, Ioannis; Fanelli, Angelo; Monaco, Gianpiero
Bounding the price of stability of undirected network design games with fair cost allocation is a challenging open problem on the Algorithmic Game Theory research agenda. Even though the generalization of such games to directed networks is well understood in terms of the price of stability (it is exactly H_n, the n-th harmonic number, for games with n players), far less is known for network design games in undirected networks. The upper bound carries over to this case as well, while the best known lower bound is 42/23 ≈ 1.826. For more restricted but interesting variants of such games, such as broadcast and multicast games, sublogarithmic upper bounds are known, while the best known lower bound is 12/7 ≈ 1.714. In the current paper, we improve the lower bounds as follows. We break the psychological barrier of 2 by showing that the price of stability of undirected network design games is at least 348/155 ≈ 2.245. Our proof uses a recursive construction of a network design game with a simple gadget as the main building block. For broadcast and multicast games, we present new lower bounds of 20/11 ≈ 1.818 and 1.862, respectively.
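For readers outside algorithmic game theory, the two quantities compared above have standard definitions (not specific to this paper):

```latex
\[
  \mathrm{PoS} = \frac{\min_{s \in \mathrm{NE}} \operatorname{cost}(s)}{\min_{s} \operatorname{cost}(s)},
  \qquad
  H_n = \sum_{k=1}^{n} \frac{1}{k}
\]
% PoS: cost of the cheapest Nash equilibrium relative to the social optimum.
% For directed fair-cost-sharing network design games with n players, PoS = H_n.
```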
Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game
NASA Astrophysics Data System (ADS)
Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian
2016-04-01
Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
How much are you prepared to PAY for a forecast?
NASA Astrophysics Data System (ADS)
Arnal, Louise; Coughlan, Erin; Ramos, Maria-Helena; Pappenberger, Florian; Wetterhall, Fredrik; Bachofen, Carina; van Andel, Schalk Jan
2015-04-01
Probabilistic hydro-meteorological forecasts are a crucial element of the decision-making chain in the field of flood prevention. The operational use of probabilistic forecasts is increasingly promoted through the development of novel state-of-the-art forecasting methods, and numerical skill is continuously increasing. However, the value of such forecasts for flood early-warning systems is a topic of diverging opinions. Indeed, the word value, when applied to flood forecasting, is multifaceted. It refers not only to the raw cost of acquiring and maintaining a probabilistic forecasting system (in terms of human and financial resources, data volume and computational time), but also, perhaps most importantly, to the use of such products. This game aims at investigating this point. It is a willingness-to-pay game, embedded in a risk-based decision-making experiment. Based on a Red Cross/Red Crescent Climate Centre game, it is a contribution to the international Hydrologic Ensemble Prediction Experiment (HEPEX). A limited number of probabilistic forecasts will be auctioned to the participants, with the price of these forecasts being market-driven. All participants (irrespective of whether or not they bought a forecast set) will then be taken through a decision-making process to issue warnings for extreme rainfall. This game will promote discussions around the topic of the value of forecasts for decision-making in the field of flood prevention.
Pathways to designing and running an operational flood forecasting system: an adventure game!
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Ramos, Maria-Helena; Cloke, Hannah; Crochemore, Louise; Giuliani, Matteo; Aalbers, Emma
2017-04-01
In the design and building of an operational flood forecasting system, a large number of decisions have to be taken. These include technical decisions related to the choice of the meteorological forecasts to be used as input to the hydrological model, the choice of the hydrological model itself (its structure and parameters), the selection of a data assimilation procedure to run in real time, the use (or not) of a post-processor, and the computing environment to run the models and display the outputs. Additionally, a number of trans-disciplinary decisions are also involved in the process, such as the way the needs of the users will be considered in the modelling setup and how the forecasts (and their quality) will be efficiently communicated to ensure usefulness and build confidence in the forecasting system. We propose to reflect on the numerous alternative pathways to designing and running an operational flood forecasting system through an adventure game. In this game, the player is the protagonist of an interactive story driven by challenges, exploration and problem-solving. For this presentation, you will have a chance to play this game, acting as the leader of a forecasting team at an operational centre. Your role is to manage the actions of your team and make sequential decisions that impact the design and running of the system in preparation for and during a flood event, and that deal with the consequences of the forecasts issued. Your actions are evaluated by how much they cost you in time, money and credibility. Your aim is to take decisions that will ultimately lead to a good balance between time and money spent, while keeping your credibility high over the whole process. This game was designed to highlight the complexities behind decision-making in an operational forecasting and emergency response context, in terms of the variety of pathways that can be selected as well as the timescale, cost and timing of effective actions.
The Emergence of Simulation and Gaming.
ERIC Educational Resources Information Center
Becker, Henk A.
1980-01-01
Describes the historical and international development of simulation and gaming in terms of simulation as analytical models, and games as communicative models; and forecasts possible futures of simulation and gaming. (CMV)
Complexity, Heuristic, and Search Analysis for the Games of Crossings and Epaminondas
2014-03-27
research in Artificial Intelligence (Section 2.1) and why games are studied (Section 2.2). Section 2.3 discusses how games are played and solved. An … [The remainder of this excerpt is table-of-contents and acronym-list residue; the acronyms defined include UCT (Upper Confidence bounds applied to Trees), HUCT (Heuristic Guided UCT), LOA (Lines of Action), UCB (Upper Confidence Bound) and RAVE (Rapid Action Value Estimation).]
NASA Astrophysics Data System (ADS)
Seabra, M.; Gonçalves, P.; Braga, A.; Raposo, R.; Ito, E.; Gadelha, A.; Dallantonia, A.
2008-05-01
The XV Pan-American Games were held in Rio de Janeiro from 13 to 29 July 2007, with 5,662 athletes from 42 countries participating. The Ministry of Sports requested that INMET provide meteorological support to the Games, with the exception of the water sports, which fell under the responsibility of the Brazilian Navy. The meteorological activities followed the same pattern as those of the Olympic Games of Sydney, Australia, in 2000 and of Athens, Greece, in 2004, with a forecast center entirely dedicated to the event. INMET developed a website with detailed information oriented to the athletes, the organizing committee and the general public. The homepage was offered in three languages (Portuguese, English and Spanish). After choosing a language, the user could consult the meteorological data for each competition venue and for the Pan-American Village, updated every 15 minutes, including a weather forecast bulletin, a daily synoptic analysis, the last 10 satellite images and meteograms. Besides observed data verified in situ, INMET supplied forecasts generated by a high-resolution model (MBAR) with 7-km grid resolution especially set up for the Games. INMET installed 7 automatic meteorological stations near the competition venues, which reported temperature, relative humidity, atmospheric pressure, wind (direction and intensity), radiation and precipitation every 15 minutes. This information was relayed by satellite to INMET headquarters in Brasília and published on the website shortly afterwards. To help the Brazilian Olympic Committee (COB), the athletes, their technical commissions and the public in general, meteorological bulletins were issued daily. The forecast was prepared together with the Navy and with INMET's 6th District, located in Rio de Janeiro and responsible for the statewide forecast, and was then placed on INMET's website. Both the 3-day weather forecast and Meteorological Alerts were issued in Portuguese, English and Spanish and sent to the INMET homepage, the organizing committee, and a restricted intranet area accessed only by the athletes and technical commissions. Direct interaction with the Games' organizers allowed for a more efficient and precise decision-making process regarding meteorological effects on some sports. As an example, during the women's marathon a low-humidity alert was forecast and the organizers took care to increase hydration to prevent problems. INMET's participation in the XV Pan-American Games represented a good opportunity for the institute to provide tailor-made short-range forecasts for a specific application. INMET's performance was recognized by the organizing committee, and the occasion helped to publicize the products and services provided by the institution.
Communicating Uncertainty in Volcanic Ash Forecasts: Decision-Making and Information Preferences
NASA Astrophysics Data System (ADS)
Mulder, Kelsey; Black, Alison; Charlton-Perez, Andrew; McCloy, Rachel; Lickiss, Matthew
2016-04-01
The Robust Assessment and Communication of Environmental Risk (RACER) consortium, an interdisciplinary research team focusing on the communication of uncertainty with respect to natural hazards, hosted a Volcanic Ash Workshop to discuss issues related to volcanic ash forecasting, especially forecast uncertainty. Part of the workshop was a decision game in which participants, including forecasters, academics, and members of the aviation industry, were given hypothetical volcanic ash concentration forecasts and asked whether they would approve a given flight path. The uncertainty information was presented in different formats, including hazard maps, line graphs, and percent probabilities. Results from the decision game will be presented with a focus on information preferences, understanding of the forecasts, and whether different formats of the same volcanic ash forecast resulted in different flight decisions. Implications of this research will help the design and presentation of volcanic ash plume decision tools and can also help inform the design of other natural hazard information.
What Information Theory Says About Best Response and About Binding Contracts
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2004-01-01
Product Distribution (PD) theory is the information-theoretic extension of conventional full-rationality game theory to bounded rational games. Here PD theory is used to investigate games in which the players use bounded rational best-response strategies. This investigation illuminates how to determine the optimal organization chart for a corporation, or more generally how to order the sequence of moves of the players/employees so as to optimize an overall objective function. It is then shown that in the continuum-time limit, bounded rational best response games result in a variant of the replicator dynamics of evolutionary game theory. This variant is then investigated for team games, in which the players share the same utility function, by showing that such continuum-limit bounded rational best response is identical to Newton-Raphson iterative optimization of the shared utility function. Next, PD theory is used to investigate changing the coordinate system of the game, i.e., changing the mapping from the joint move of the players to the arguments of the utility functions. Such a change couples those arguments, essentially by making each player's move be an offered binding contract.
The game of making decisions under uncertainty: How sure must one be?
NASA Astrophysics Data System (ADS)
Werner, Micha; Verkade, Jan; Wetterhall, Fredrik; van Andel, Schalk-Jan; Ramos, Maria-Helena
2016-04-01
Probabilistic hydrometeorological forecasting is now widely accepted to be more skillful than deterministic forecasting and is increasingly being integrated into operational practice. Provided they are reliable and unbiased, probabilistic forecasts have the advantage that they give the decision maker not only the forecast value but also the uncertainty associated with that prediction. Though that information provides more insight, it leaves the forecaster/decision maker with the challenge of deciding at what probability of a threshold being exceeded the decision to act should be taken. According to cost-loss theory, that probability should be related to the impact of the threshold being exceeded. However, it is not entirely clear how easy it is for decision makers to follow that rule, even when the impact of a threshold being exceeded and the actions to choose from are known. To continue the tradition of the "Ensemble Hydrometeorological Forecast" session, we will address the challenge of making decisions based on probabilistic forecasts through a game to be played with the audience. We will explore how the decisions made differ depending on the known impacts of the forecasted events. Participants will be divided into a number of groups with differing levels of impact and will be faced with a number of forecast situations. They will be asked to make decisions and record the consequences of those decisions. A discussion of the differences in the decisions made will be presented at the end of the game, with a fuller analysis later posted on the HEPEX website blog (www.hepex.org).
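The cost-loss rule invoked above can be stated in one line; with C the cost of taking protective action and L the loss incurred if the event occurs unprotected (textbook form, not a result of this game):

```latex
% Expected expense:  act -> C;  do nothing -> pL.
% Acting is optimal whenever C < pL, so the optimal probability threshold is
\[
  \text{act} \iff p > \frac{C}{L}
\]
```

The game then asks whether real decision makers actually act at a threshold anywhere near their C/L ratio.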
Wind Forecasting for Yacht Racing at the 1991 Pan American Games.
NASA Astrophysics Data System (ADS)
Powell, Mark D.
1993-01-01
The U.S. Sailing Team competed successfully at the 1991 Pan American Games despite having no previous experience with the sailing conditions off Havana, Cuba. One of the key factors in the team's success was meteorological support in the form of wind climate analysis; application of sea breeze forecasting typical of the south Florida area, modified by tropical weather systems; and effective preregatta briefing.
Algorithms for Differential Games with Bounded Control and States.
1982-03-01
AD-A124 642: Algorithms for Differential Games with Bounded Control and States (U). California Univ., Los Angeles, School of Engineering and Applied Science. Final report, 11/29/79-11/28/… [Report documentation page residue omitted.] … problems are probably the most natural application of differential game theory and have been treated by many authors as such. Very few problems of this …
Weighing costs and losses: A decision making game using probabilistic forecasts
NASA Astrophysics Data System (ADS)
Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan
2017-04-01
Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic work, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the forecast event becomes reality. In this paper, we present the results of a simple game that explores how decision makers are influenced by the costs required for taking a response and the potential losses they face should the forecast flood event occur. Participants play the role of one of three different shop owners. Each type of shop faces losses of quite different magnitude should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring that would inundate their shop and lead to those losses. In response, they have to decide whether to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU General Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take action based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.
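A minimal sketch of the cost-loss structure the game describes. The shop types, costs and losses below are hypothetical placeholders (the abstract does not publish its payoff table); the point is only that the expected-cost-minimizing action flips at different forecast probabilities for different cost/loss ratios:

```python
# Hypothetical payoff table: (action cost, residual loss if the flood occurs).
# Values are illustrative only, not the ones used in the actual game.
SHOPS = {
    "florist":  {"nothing": (0, 2_000),   "defences": (300, 500),    "relocate": (800, 0)},
    "bookshop": {"nothing": (0, 20_000),  "defences": (300, 5_000),  "relocate": (800, 0)},
    "jeweller": {"nothing": (0, 200_000), "defences": (300, 50_000), "relocate": (800, 0)},
}

def best_action(shop: str, p_flood: float) -> str:
    """Return the action with the lowest expected cost given P(flood) = p_flood."""
    def expected_cost(action):
        cost, loss_if_flood = SHOPS[shop][action]
        return cost + p_flood * loss_if_flood
    return min(SHOPS[shop], key=expected_cost)

for p in (0.05, 0.2, 0.6):
    print(f"P(flood)={p:0.2f}: " +
          ", ".join(f"{shop}->{best_action(shop, p)}" for shop in SHOPS))
# The low-loss florist rationally ignores weak warnings; the jeweller acts on them.
```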
Finding Bounded Rational Equilibria. Part 1; Iterative Focusing
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2004-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.
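For readers unfamiliar with PC, its central object can be stated compactly. Under the maximum-entropy presentation of PC (a standard formulation, not a quote from this paper), each player i's mixed strategy q_i at a bounded rationality equilibrium is a Boltzmann distribution over its moves x_i:

```latex
% Each player minimizes a free-energy-like Lagrangian, trading expected utility
% against the entropy of its own mixed strategy; beta encodes its rationality.
\[
  q_i(x_i) \;\propto\; \exp\!\Big(\beta \, \mathbb{E}_{q_{-i}}\big[u_i \mid x_i\big]\Big)
\]
% beta -> infinity recovers full-rationality best response;
% beta -> 0 gives the uniform (zero-rationality) mixture.
```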
Nishiura, Hiroshi
2011-02-16
Real-time forecasting of epidemics, especially forecasting based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. A discrete-time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete-time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance.
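A minimal sketch of the forecasting idea described above: fit a growth parameter to the observed counts, then propagate a branching process forward and read uncertainty bounds off simulated quantiles. The Poisson offspring choice, the crude ratio estimator and the toy data are assumptions for illustration; the paper's exact computation via chains of conditional offspring distributions differs:

```python
import numpy as np

rng = np.random.default_rng(0)
observed = np.array([3, 5, 9, 16, 28, 47])   # toy weekly case counts

# Crude estimate of the per-week reproduction ratio from successive counts.
R_hat = np.mean(observed[1:] / observed[:-1])

def forecast(last_count, weeks=4, n_sims=10_000):
    """Simulate a Poisson branching process forward; return 95% bounds and median."""
    paths = np.empty((n_sims, weeks), dtype=int)
    current = np.full(n_sims, last_count)
    for t in range(weeks):
        # Each current case generates Poisson(R_hat) cases in the next week.
        current = rng.poisson(R_hat * current)
        paths[:, t] = current
    return (np.percentile(paths, 2.5, axis=0),
            np.median(paths, axis=0),
            np.percentile(paths, 97.5, axis=0))

lo, med, hi = forecast(observed[-1])
for w, (l, m, h) in enumerate(zip(lo, med, hi), start=1):
    print(f"week +{w}: median {m:.0f}, 95% bounds [{l:.0f}, {h:.0f}]")
```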
Game Maturity Model for Health Care.
de Boer, Jan C; Adriani, Paul; van Houwelingen, Jan Willem; Geerts, A
2016-04-01
This article introduces the Game Maturity Model for the healthcare industry as an extension of the general Game Maturity Model and describes its use in two case studies of applied health games. The Game Maturity Model for health care provides a practical and value-adding method to assess existing games and to determine strategic considerations for the application of applied health games. Our forecast is that within 5 years the use and development of applied games will have a role in our daily lives and in the way we organize health care similar to the role social media has today.
Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2004-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.
Three-player conflicting interest games and nonlocality
NASA Astrophysics Data System (ADS)
Bolonek-Lasoń, Katarzyna
2017-08-01
We outline the general construction of three-player games with incomplete information which fulfil the following conditions: (i) symmetry with respect to the permutations of players; (ii) the existence of an upper bound for total payoff resulting from Bell inequalities; (iii) the existence of both fair and unfair Nash equilibria saturating this bound. Conditions (i)-(iii) imply that we are dealing with conflicting interest games. An explicit example of such a game is given. A quantum counterpart of this game is considered. It is obtained by keeping the same utilities but replacing classical advisor by a quantum one. It is shown that the quantum game possesses only fair equilibria with strictly higher payoffs than in the classical case. This implies that quantum nonlocality can be used to resolve the conflict between the players.
2011-01-01
Background: Real-time forecasting of epidemics, especially those based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. Methods: A discrete-time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results: The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Conclusions: Real-time forecasting using the discrete-time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance. PMID:21324153
CARA: Cognitive Architecture for Reasoning About Adversaries
2012-01-20
… synthesis approach taken here the KIDS principle (Keep It Descriptive, Stupid) applies, and agents and organizations are profiled in great detail. … [We] developed two algorithms to make forecasts about adversarial behavior. We developed game-theoretical approaches to reason about group behavior. We … to automatically make forecasts about group behavior together with methods to quantify the uncertainty inherent in such forecasts; … Developed …
A parimutuel gambling perspective to compare probabilistic seismicity forecasts
NASA Astrophysics Data System (ADS)
Zechar, J. Douglas; Zhuang, Jiancang
2014-10-01
Using analogies to gaming, we consider the problem of comparing multiple probabilistic seismicity forecasts. To measure relative model performance, we suggest a parimutuel gambling perspective which addresses shortcomings of other methods such as likelihood ratio, information gain and Molchan diagrams. We describe two variants of the parimutuel approach for a set of forecasts: head-to-head, in which forecasts are compared in pairs, and round table, in which all forecasts are compared simultaneously. For illustration, we compare the 5-yr forecasts of the Regional Earthquake Likelihood Models experiment for M4.95+ seismicity in California.
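A sketch of one way the head-to-head parimutuel comparison can work, under the convention (an assumption here, simplified from the paper) that each forecast stakes one unit per bin, split across event/no-event according to its stated probability, and the pot in each bin is returned to the correct-outcome stakes in proportion to their size:

```python
def parimutuel_gains(probs_a, probs_b, outcomes):
    """Net parimutuel gain of forecast A against forecast B over binary bins.

    probs_a, probs_b: P(event) per bin for each forecast.
    outcomes: 1 if the event occurred in that bin, else 0.
    """
    gain_a = 0.0
    for pa, pb, o in zip(probs_a, probs_b, outcomes):
        stake_a = pa if o else 1 - pa     # A's stake on the realized outcome
        stake_b = pb if o else 1 - pb
        pot = 2.0                         # each forecast wagered one unit in total
        gain_a += pot * stake_a / (stake_a + stake_b) - 1.0
    return gain_a                         # zero-sum: B's gain is the negative

# Toy example: A is sharper than an uninformative B, so A ends up ahead.
print(parimutuel_gains([0.8, 0.1, 0.6], [0.5, 0.5, 0.5], [1, 0, 1]))  # > 0
```

The round-table variant described in the abstract generalizes this by letting all forecasts stake into one shared pot per bin instead of comparing them pairwise.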
Very-short-range forecasting system for the 2018 Pyeongchang Winter Olympic and Paralympic Games
NASA Astrophysics Data System (ADS)
Nam, Ji-Eun; Park, Kyungjeen; Kim, Minyou; Kim, Changhwan; Joo, Sangwon
2016-04-01
The 23rd Olympic Winter Games and the 13th Paralympic Winter Games will be held in Pyeongchang, Republic of Korea, from 9 to 25 February 2018 and from 9 to 18 March 2018, respectively. The Korea Meteorological Administration (KMA) and the National Institute for Meteorological Science (NIMS) have the responsibility to provide weather information for the management of the Games and the safety of the public. NIMS will carry out a Forecast Demonstration Project (FDP) and a Research and Development Project (RDP), which will be called ICE-POP 2018. These projects will focus on intensive observation campaigns to understand severe winter weather over the Pyeongchang region, and the research results from the RDP will be used to improve the accuracy of the nowcasting and very-short-range forecast systems during the Games. To support these projects, NIMS developed the Very-short-range Data Assimilation and Prediction System (VDAPS), which runs in real time with a 1-hour cycling interval and forecasts out to 12 hours. The domain covers the Korean Peninsula and surrounding seas at 1.5-km horizontal resolution. AWS, wind profiler, buoy, sonde, aircraft and scatterometer winds, and radar radial winds are assimilated by 3DVAR on a 3-km-resolution inner domain. The rain rate is converted into latent heat and initialized via nudging. Visibility data are also assimilated, with the addition of an aerosol control variable. Experimental results show an improvement in rainfall forecasts over the sea south of the Korean Peninsula. To reduce the excessive rainfall during the first 2 hours caused by the reduced cycling interval, the data assimilation algorithm has been optimized.
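For reference, the standard (non-incremental) 3DVAR cost function that systems of this kind minimize is shown below; this is the textbook form, not an excerpt from the VDAPS documentation:

```latex
\[
  J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                + \tfrac{1}{2}\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf T}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big)
\]
% x_b : background (prior model state);  y : observations;  H : observation operator;
% B, R : background- and observation-error covariances. Each new control variable
% (e.g. aerosol, used above for assimilating visibility) extends x and B accordingly.
```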
Efficiency and formalism of quantum games
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C.F.; Johnson, Neil F.
We show that quantum games are more efficient than classical games and provide a saturated upper bound for this efficiency. We also demonstrate that the set of finite classical games is a strict subset of the set of finite quantum games. Our analysis is based on a rigorous formulation of quantum games, from which quantum versions of the minimax theorem and the Nash equilibrium theorem can be deduced.
Prediction of stock markets by the evolutionary mix-game model
NASA Astrophysics Data System (ADS)
Chen, Fang; Gou, Chengling; Guo, Xiaoqian; Gao, Jieping
2008-06-01
This paper presents efforts to use the evolutionary mix-game model, a modified form of the agent-based mix-game model, to predict financial time series. We implement three methods to improve the original mix-game model by adding strategy-evolution abilities to agents, and then apply the new model, referred to as the evolutionary mix-game model, to forecast the Shanghai Stock Exchange Composite Index. The results show that these modifications can greatly improve the accuracy of prediction when proper parameters are chosen.
NASA Astrophysics Data System (ADS)
Terti, G.; Ruin, I.; Kalas, M.; Lorini, V.; Sabbatini, T.; i Alonso, A. C.
2017-12-01
New technologies are currently adopted worldwide to improve weather forecasts and the communication of the corresponding warnings to end-users. The "EnhANcing emergency management and response to extreme WeatHER and climate Events" (ANYWHERE) project is an innovation action that aims at developing and implementing a European decision-support platform for weather-related risks integrating cutting-edge forecasting technology. The initiative is built in a collaborative manner where researchers, developers, potential users and other stakeholders meet frequently to define needs, capabilities and challenges. In this study, we propose a role-playing game to test the added value of the ANYWHERE platform on i) the decision-making process and the choice of warning levels under uncertainty, ii) the management of the official emergency response and iii) the crisis communication and triggering of protective actions at different levels of the warning system (from hazard detection to citizen response). The designed game serves as an interactive communication tool. Here, flood and flash flood focused simulations seek to enhance participants' understanding of the complexities and challenges embedded in the various levels of the decision-making process under the threat of weather disasters (e.g., forecasting/warnings, official emergency actions, self-protection). We also facilitate collaboration and coordination between participants who belong to different national or local agencies/authorities across Europe. The game is first applied and tested at ANYWHERE's workshop in Helsinki (September 2017), where about 30-50 people, including researchers, forecasters, civil protection staff and representatives of related companies, are anticipated to play the simulation. The main idea is to provide the players with a virtual case study that represents realistic uncertainties and dilemmas embedded in real-time forecasting-warning processes. At the final debriefing step the participants are encouraged to exchange knowledge, thoughts and insights on their capability or difficulty to decide and communicate their actions based on the available information and given constraints. Such feedback will be analyzed and presented, and future possibilities for applying the game will be discussed.
Weather Support for the 2002 Winter Olympic and Paralympic Games.
NASA Astrophysics Data System (ADS)
Horel, J.; Potter, T.; Dunn, L.; Steenburgh, W. J.; Eubank, M.; Splitt, M.; Onton, D. J.
2002-02-01
The 2002 Winter Olympic and Paralympic Games will be hosted by Salt Lake City, Utah, during February-March 2002. Adverse weather during this period may delay sporting events, while snow- and ice-covered streets and highways may impede access by the athletes and spectators to the venues. While winter snowstorms and other large-scale weather systems typically have widespread impacts throughout northern Utah, hazardous winter weather is often related to local terrain features (the Wasatch Mountains and the Great Salt Lake are the most prominent ones). Examples of such hazardous weather include lake-effect snowstorms, ice fog, gap winds, downslope windstorms, and low visibility over mountain passes. A weather support system has been developed to provide weather information to the athletes, games officials, spectators, and the interested public around the world. This system is managed by the Salt Lake Olympic Committee and relies upon meteorologists from the public, private, and academic sectors of the atmospheric science community. Weather forecasting duties will be led by National Weather Service forecasters and a team of private weather forecasters organized by KSL, the Salt Lake City NBC television affiliate. Other government agencies, commercial firms, and the University of Utah are providing specialized forecasts and support services for the Olympics. The weather support system developed for the 2002 Winter Olympics is expected to provide long-term benefits to the public through improved understanding, monitoring, and prediction of winter weather in the Intermountain West.
NASA Astrophysics Data System (ADS)
Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti
2014-01-01
An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting to test nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model forecasts to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity and wind speed. The small improvements to forecasts of variables such as visibility and ceiling, which are subject to discontinuous changes, are attributed to the persistence component of ABOM.
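A minimal sketch of the blending idea (an illustration of the stated principle, not the operational ABOM algorithm): extrapolate the recent observation trend, then weight the extrapolation and the NWP forecast by their inverse mean-squared error over a trailing window. The linear-trend extrapolation and the window choice are assumptions:

```python
import numpy as np

def abom_style_blend(obs_history, nwp_forecast, nwp_history, lead_steps):
    """Blend an observation-trend extrapolation with an NWP forecast.

    Weights come from each component's recent error against the observations.
    """
    t = np.arange(len(obs_history))
    slope, intercept = np.polyfit(t, obs_history, 1)          # linear trend fit
    extrap = intercept + slope * (len(obs_history) - 1 + lead_steps)

    # Recent skill: MSE of the fitted trend vs MSE of the model over the window.
    trend_fit = intercept + slope * t
    mse_extrap = np.mean((obs_history - trend_fit) ** 2) + 1e-9
    mse_model = np.mean((obs_history - nwp_history) ** 2) + 1e-9

    w_extrap = (1 / mse_extrap) / (1 / mse_extrap + 1 / mse_model)
    return w_extrap * extrap + (1 - w_extrap) * nwp_forecast

# Toy example: temperature (deg C) observed every 10 min vs model values.
obs = np.array([-3.0, -2.8, -2.5, -2.4, -2.1, -1.9])
nwp_past = np.array([-4.0, -3.8, -3.7, -3.5, -3.4, -3.2])
print(abom_style_blend(obs, nwp_forecast=-1.0, nwp_history=nwp_past, lead_steps=6))
```

At short leads the recent-trend term dominates; as the extrapolation's trailing error grows relative to the model's, weight shifts back to the NWP forecast, which is the behavior the abstract describes.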
NASA Astrophysics Data System (ADS)
Mailhot, J.; Milbrandt, J. A.; Giguère, A.; McTaggart-Cowan, R.; Erfani, A.; Denis, B.; Glazer, A.; Vallée, M.
2014-01-01
Environment Canada ran an experimental numerical weather prediction (NWP) system during the Vancouver 2010 Winter Olympic and Paralympic Games, consisting of nested high-resolution (down to 1-km horizontal grid-spacing) configurations of the GEM-LAM model, with improved geophysical fields, cloud microphysics and radiative transfer schemes, and several new diagnostic products such as density of falling snow, visibility, and peak wind gust strength. The performance of this experimental NWP system has been evaluated in these winter conditions over complex terrain using the enhanced mesoscale observing network in place during the Olympics. As compared to the forecasts from the operational regional 15-km GEM model, objective verification generally indicated significant added value of the higher-resolution models for near-surface meteorological variables (wind speed, air temperature, and dewpoint temperature) with the 1-km model providing the best forecast accuracy. Appreciable errors were noted in all models for the forecasts of wind direction and humidity near the surface. Subjective assessment of several cases also indicated that the experimental Olympic system was skillful at forecasting meteorological phenomena at high-resolution, both spatially and temporally, and provided enhanced guidance to the Olympic forecasters in terms of better timing of precipitation phase change, squall line passage, wind flow channeling, and visibility reduction due to fog and snow.
An Overview of NWS Weather Support for the XXVI Olympiad.
NASA Astrophysics Data System (ADS)
Rothfusz, Lans P.; McLaughlin, Melvin R.; Rinard, Stephen K.
1998-05-01
The 1996 Centennial Olympic Games in Atlanta, Georgia, received weather support from the National Weather Service (NWS). The mandate to provide this support gave the NWS an unprecedented opportunity to employ in an operational setting several tools and practices similar to those planned for the "modernized" era of the NWS. The project also provided a glimpse of technology and practices not planned for the NWS modernization, but that might be valuable in the future. The underlying purpose of the project was to protect the life and property of the two million spectators, athletes, volunteers, and officials visiting and/or participating in the games. While there is no way to accurately account for lives and property that were protected by the NWS support, the absence of weather-related deaths, significant injuries, and damaged property during the games despite an almost daily occurrence of thunderstorms, high temperatures, and/or rain indicates that the project was a success. In fact, popular perception held that weather had no effect on the games. The 2000+ weather bulletins issued during the 6-week support period suggest otherwise. The authors describe the many facets of this demanding and successful project, with special attention given to aspects related to operational forecasting. A postproject survey completed by the Olympics forecasters, feedback provided by weather support customers, and experiences of the management team provide the bases for project observations and recommendations for future operational forecasting activities.
NASA Technical Reports Server (NTRS)
Desoer, C. A.; Polak, E.; Zadeh, L. A.
1974-01-01
A series of research projects is briefly summarized which includes investigations in the following areas: (1) mathematical programming problems for large system and infinite-dimensional spaces, (2) bounded-input bounded-output stability, (3) non-parametric approximations, and (4) differential games. A list of reports and papers which were published over the ten year period of research is included.
Demographic Analysis and Planning for the Future. No. 13.
ERIC Educational Resources Information Center
Efird, Cathy M.
The basic sources and types of demographic data available for future planning for the developmentally disabled are reviewed and a framework for data organization is suggested. It is explained that future forecasts may be undertaken by the following principles: trend forecasting or extrapolation; scenario construction; models, games, and…
ERIC Educational Resources Information Center
Simmons, Joseph P.; Massey, Cade
2012-01-01
Is optimism real, or are optimistic forecasts just cheap talk? To help answer this question, we investigated whether optimistic predictions persist in the face of large incentives to be accurate. We asked National Football League football fans to predict the winner of a single game. Roughly half (the partisans) predicted a game involving their…
Numerical Studies of the Georgia Coast Sea Breeze
1996-12-09
[Excerpt is reference-list residue; the recoverable citations are:] … the 1991 Pan American Games. Bull. Amer. Meteor. Soc., 74, 5-16. ____, S. Rinard, C. Garza, and G. Hoogenboom, 1996: Wind forecasting for the sailing events of the Summer Olympic Games. Conf. on Coastal Oceanic and Atmos. Pred., Atlanta, GA, pp. 336-343. Rinard, S., M. Powell, C. Garza and G. Hoogenboom, …
ERIC Educational Resources Information Center
Pocorobba, Janet; And Others
1996-01-01
Presents activities found to be useful in English-as-a-Second-Language (ESL) classes, including a game in which the teacher guesses the meaning of words in students' native tongues; an exercise in which students write predictions, such as weather forecasting, in English; a game in which students explain the meaning of selected idioms in their own…
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; van Andel, Schalk Jan
2014-05-01
Part of recent research in ensemble and probabilistic hydro-meteorological forecasting analyses which probabilistic information is required by decision makers and how it can be most effectively visualised. This work, in addition, analyses whether decision making in flood early warning is also influenced by the way the decision question is posed. For this purpose, the decision-making game "Do probabilistic forecasts lead to better decisions?", which Ramos et al. (2012) conducted at the EGU General Assembly 2012 in Vienna, has been repeated with a small group and expanded. In that game, decision makers had to decide whether or not to open a flood release gate on the basis of flood forecasts, with and without uncertainty information. A conclusion of that game was that, in the absence of uncertainty information, decision makers are compelled towards a more risk-averse attitude. In order to explore to what extent the answers were driven by the way the questions were framed, in addition to the original experiment, a second variant was introduced in which participants were asked to choose between a sure value and a gamble (for either losing or winning with a given probability). This set-up is based on Kahneman and Tversky (1979). Results indicate that the way the questions are posed may play an important role in decision making, and that Prospect Theory provides promising concepts for further understanding how this works.
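For context, the Kahneman-Tversky value function referenced above is commonly written as follows; the parameter estimates are the standard ones from Tversky and Kahneman (1992), quoted for reference rather than taken from this abstract:

```latex
\[
  v(x) =
  \begin{cases}
    x^{\alpha} & x \ge 0 \\[2pt]
    -\lambda\,(-x)^{\beta} & x < 0
  \end{cases}
\]
% alpha ~ beta ~ 0.88 (diminishing sensitivity), lambda ~ 2.25 (loss aversion):
% losses loom larger than equivalent gains, one mechanism behind the risk-averse
% behaviour observed when uncertainty information is absent or framed as loss.
```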
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
Statistical model for forecasting monthly large wildfire events in western United States
Haiganoush K. Preisler; Anthony L. Westerling
2006-01-01
The ability to forecast the number and location of large wildfire events (with specified confidence bounds) is important to fire managers attempting to allocate and distribute suppression efforts during severe fire seasons. This paper describes the development of a statistical model for assessing the forecasting skills of fire-danger predictors and producing 1-month-...
NASA Astrophysics Data System (ADS)
Cranston, Michael; Speight, Linda; Maxey, Richard; Tavendale, Amy; Buchanan, Peter
2015-04-01
One of the main challenges for the flood forecasting community remains the provision of reliable early warnings of surface (or pluvial) flooding. The Scottish Flood Forecasting Service has been developing approaches for forecasting the risk of surface water flooding, capitalising on the latest developments in quantitative precipitation forecasting from the Met Office. A probabilistic Heavy Rainfall Alert decision support tool helps operational forecasters assess the likelihood of surface water flooding against regional rainfall depth-duration estimates from MOGREPS-UK linked to historical short-duration flooding in Scotland. The surface water flood risk is communicated to emergency responders through the daily Flood Guidance Statement. A more recent development is an innovative risk-based hydrometeorological approach that links 24-hour ensemble rainfall forecasts through a hydrological model (Grid-to-Grid) to a library of impact assessments (Speight et al., 2015). The early warning tool, FEWS Glasgow, presents the risk of flooding to people, property and transport across a 1-km grid over the city of Glasgow with a lead time of 24 hours. The risk was communicated in a bespoke surface water flood forecast product designed around emergency responder requirements and trialled during the 2014 Commonwealth Games in Glasgow. These new approaches to surface water flood forecasting are leading to improved methods of communicating the risk and better performance in early warning, with a reduction in false alarm rates for summer flood guidance from 81% in 2013 to 67% in 2014, although verification of instances of surface water flooding remains difficult. However, the introduction of more demanding hydrometeorological capabilities with associated greater levels of uncertainty does place increased demands on operational flood forecasting skills and resources. Speight, L., Cole, S.J., Moore, R.J., Pierce, C., Wright, B., Golding, B., Cranston, M., Tavendale, A., Ghimire, S., and Dhondia, J. (2015) Developing surface water flood forecasting capabilities in Scotland: an operational pilot for the 2014 Commonwealth Games in Glasgow. Journal of Flood Risk Management, in press.
Understanding the Current 30-Year Shipbuilding Plan Through Three Models
2014-12-01
… to what the Navy can ultimately build in ten years and beyond due to the fact that this plan is revised annually. In a political game this … the game and each one hoping for a different outcome. Though not currently possible with the FY2015 Long Range Plan it is possible to analyze the … to make a choice while the bounded rationality theory acknowledges the limits of human capabilities, knowledge, and capacity. Therefore, bounded …
Adaptive and bounded investment returns promote cooperation in spatial public goods games.
Chen, Xiaojie; Liu, Yongkui; Zhou, Yonghui; Wang, Long; Perc, Matjaž
2012-01-01
The public goods game is one of the most famous models for studying the evolution of cooperation in sizable groups. The multiplication factor in this game can characterize the investment return from the public good, which may be variable depending on the interactive environment in realistic situations. Instead of using the same universal value, here we consider that the multiplication factor in each group is updated based on the differences between the local and global interactive environments in the spatial public goods game, but meanwhile limited to within a certain range. We find that the adaptive and bounded investment returns can significantly promote cooperation. In particular, full cooperation can be achieved for high feedback strength when appropriate limitation is set for the investment return. Also, we show that the fraction of cooperators in the whole population can become larger if the lower and upper limits of the multiplication factor are increased. Furthermore, in comparison to the traditionally spatial public goods game where the multiplication factor in each group is identical and fixed, we find that cooperation can be better promoted if the multiplication factor is constrained to adjust between one and the group size in our model. Our results highlight the importance of the locally adaptive and bounded investment returns for the emergence and dominance of cooperative behavior in structured populations.
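A loose, compressed sketch of the model class described above: a lattice public goods game whose per-group multiplication factor adapts to the gap between local and global cooperation and is clipped to fixed bounds. The update rules, parameter values and the synchronous r-update are assumptions for illustration, not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(42)
L = 20                      # lattice side; one player per site, periodic boundaries
K = 0.5                     # noise in the Fermi imitation rule
alpha = 0.8                 # feedback strength for the adaptive return
R_MIN, R_MAX = 1.0, 5.0     # bounds on the multiplication factor (group size G = 5)

coop = rng.integers(0, 2, size=(L, L))          # 1 = cooperator, 0 = defector
r = np.full((L, L), 3.0)                        # per-group multiplication factor

def neighbours(i, j):
    return [((i - 1) % L, j), ((i + 1) % L, j), (i, (j - 1) % L), (i, (j + 1) % L)]

def payoff(i, j):
    """Payoff of player (i, j) summed over the 5 groups it belongs to."""
    total = 0.0
    for ci, cj in [(i, j)] + neighbours(i, j):  # groups centred on self + neighbours
        members = [(ci, cj)] + neighbours(ci, cj)
        n_c = sum(coop[m] for m in members)
        total += r[ci, cj] * n_c / len(members) - coop[i, j]  # share minus own cost
    return total

for step in range(2000):
    # Adaptive, bounded investment return: push r up where local cooperation
    # exceeds the global level, down where it falls short, then clip to bounds.
    local_c = sum(np.roll(np.roll(coop, di, 0), dj, 1)
                  for di, dj in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]) / 5.0
    r = np.clip(r + alpha * (local_c - coop.mean()), R_MIN, R_MAX)

    # Fermi-rule strategy imitation between a random player and a random neighbour.
    i, j = rng.integers(L, size=2)
    ni, nj = neighbours(i, j)[rng.integers(4)]
    if coop[i, j] != coop[ni, nj]:
        p_adopt = 1.0 / (1.0 + np.exp((payoff(i, j) - payoff(ni, nj)) / K))
        if rng.random() < p_adopt:
            coop[i, j] = coop[ni, nj]

print("final cooperator fraction:", coop.mean())
```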
ERIC Educational Resources Information Center
Little, Dennis; Feller, Richard
The Institute for the Future has been conducting research in technological and societal forecasting, social indicators, value change, and simulation gaming. This paper describes an effort to bring together parts of that research into a simulation game ("State Policy," or STAPOL) for analysis of the impact of government policy, social values, and…
Kuz'kin, B P; Ezhlova, E B; Kulichenko, A N; Maletskaia, O V; Demina, Iu V; Taran, T V; Pakskina, N D; Kharchenko, T V; Grizhebovskiĭ, G M; Savel'ev, V N; Orobeĭ, V G; Klindukhov, V P; Grechanaia, T V; Tesheva, S Ch; Brukhanova, G D
2015-01-01
To assess the epidemiological risk of the introduction of serious infectious diseases in the pre-Olympic period, a list of dangerous and exotic infections was defined and an assessment of the potential threat was carried out. The initial external information used to assess the potential risk of importation consisted of reports and forecasts posted on official websites. The risk of importation and of epidemiological complications was conditionally designated as high, moderate or minimal; the risk of importation of measles virus was considered high. Confirming the forecast, about 100 cases of measles were registered during the period of the Olympic Games in Sochi. A moderate risk of importation was determined for poliomyelitis due to wild poliovirus, Lassa fever, cholera and plague, and a minimal risk for Dengue fever, yellow fever, Middle East respiratory syndrome, and diseases caused by the Marburg and Ebola viruses. Based on an analysis of previous Olympic Games and subsequent events related to the activity of infectious diseases in the world, it was concluded that even a slight risk of importation of infectious diseases requires maximum alertness and readiness to conduct adequate epidemiological measures.
Bounded rationality leads to equilibrium of public goods games.
Xu, Zhaojin; Wang, Zhen; Zhang, Lianzhong
2009-12-01
In this work, we introduce a degree of rationality to the public goods games in which players can determine whether or not to participate, and with it a new mechanism has been established. Existence of the bounded rationality would lead to a new equilibrium which differs from the Nash equilibrium and qualitatively explains the fundamental role of loners' payoff for maintaining cooperation. Meanwhile, it is shown how the potential strategy influences the players' decision. Finally, we explicitly demonstrate a rock-scissors-paper dynamics which is a consequence of this model.
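The "degree of rationality" lends itself to a logit (Fermi-like) choice rule over the three strategies of the optional public goods game. The sketch below is one plausible reading of such a rule; the payoff numbers and the rationality parameter beta are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Sketch of a bounded-rationality (logit) choice rule for the optional
# public goods game with three strategies: cooperate, defect, or stay out
# as a loner. Payoff values and beta are illustrative only.

strategies = ["cooperate", "defect", "loner"]
payoffs = np.array([0.8, 1.2, 1.0])   # hypothetical expected payoffs; the
                                      # loner's fixed payoff sits in between
beta = 2.0                            # degree of rationality (0 = random,
                                      # large = close to best response)

def logit_choice_probs(pay, beta):
    w = np.exp(beta * (pay - pay.max()))   # subtract max for stability
    return w / w.sum()

p = logit_choice_probs(payoffs, beta)
for s, q in zip(strategies, p):
    print(f"P({s}) = {q:.3f}")
# As beta grows this recovers the best response (here: defect); finite beta
# keeps loners and cooperators in play, enabling the rock-scissors-paper
# cycling mentioned in the abstract.
```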
Minority Game of price promotions in fast moving consumer goods markets
NASA Astrophysics Data System (ADS)
Groot, Robert D.; Musters, Pieter A. D.
2005-05-01
A variation of the Minority Game has been applied to study the timing of promotional actions at retailers in the fast moving consumer goods market. The underlying hypotheses for this work are that price promotions are more effective when fewer than average competitors do a promotion, and that a promotion strategy can be based on past sales data. The first assumption has been checked by analysing 1467 promotional actions for three products on the Dutch market (ketchup, mayonnaise and curry sauce) over a 120-week period, both on an aggregated level and on retailer chain level. The second assumption was tested by analysing past sales data with the Minority Game. This revealed that high or low competitor promotional pressure for actual ketchup, mayonnaise, curry sauce and barbecue sauce markets is to some extent predictable up to a forecast of some 10 weeks. Whereas a random guess would be right 50% of the time, a single-agent game can predict the market with a success rate of 56% for a 6-9 week forecast. This number is the same for all four mentioned fast moving consumer markets. For a multi-agent game a larger variability in the success rate is obtained, but predictability can be as high as 65%. Contrary to expectation, the actual market does the opposite of what game theory would predict. This points at a systematic oscillation in the market. Even though this result is not fully understood, merely observing that this trend is present in the data could lead to exploitable trading benefits. As a check, random history strings were generated from which the statistical variation in the game prediction was studied. This shows that the odds are 1:1,000,000 that the observed pattern in the market is based on coincidence.
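The history-string prediction step can be made concrete with a small sketch. The binary encoding (1 = above-average competitor promotional pressure), the memory length m and the lookup-table strategy below are our illustrative choices, not the authors' exact game configuration.

```python
import numpy as np

# Sketch of a single-agent Minority Game style predictor: encode weekly
# promotional pressure as a binary string, score a lookup table on past
# data, and use the majority symbol seen after each history window to
# forecast the next symbol.

rng = np.random.default_rng(2)
history = (rng.random(120) < 0.5).astype(int)   # stand-in for 120 weeks of data
m = 4                                           # memory length

def predict_with_table(history, m):
    table = {}                                  # history window -> symbol counts
    hits = total = 0
    for t in range(m, len(history)):
        key = tuple(history[t - m:t])
        counts = table.setdefault(key, [0, 0])
        if sum(counts) > 0:                     # predict the majority symbol
            guess = int(counts[1] >= counts[0])
            hits += (guess == history[t])
            total += 1
        counts[history[t]] += 1
    return hits / total if total else float("nan")

print(f"success rate: {predict_with_table(history, m):.2%}")
# On random data this hovers near 50%; the paper reports roughly 56% on the
# real ketchup/mayonnaise/curry-sauce markets, i.e. a small but real signal.
```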
NASA Astrophysics Data System (ADS)
Isaac, G. A.; Joe, P. I.; Mailhot, J.; Bailey, M.; Bélair, S.; Boudala, F. S.; Brugman, M.; Campos, E.; Carpenter, R. L.; Crawford, R. W.; Cober, S. G.; Denis, B.; Doyle, C.; Reeves, H. D.; Gultepe, I.; Haiden, T.; Heckman, I.; Huang, L. X.; Milbrandt, J. A.; Mo, R.; Rasmussen, R. M.; Smith, T.; Stewart, R. E.; Wang, D.; Wilson, L. J.
2014-01-01
A World Weather Research Programme (WWRP) project entitled the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) was developed to be associated with the Vancouver 2010 Olympic and Paralympic Winter Games conducted between 12 February and 21 March 2010. The SNOW-V10 international team augmented the instrumentation associated with the Winter Games and several new numerical weather forecasting and nowcasting models were added. Both the additional observational and model data were available to the forecasters in real time. This was an excellent opportunity to demonstrate existing capability in nowcasting and to develop better techniques for short term (0-6 h) nowcasts of winter weather in complex terrain. Better techniques to forecast visibility, low cloud, wind gusts, precipitation rate and type were evaluated. The weather during the games was exceptionally variable with many periods of low visibility, low ceilings and precipitation in the form of both snow and rain. The data collected should improve our understanding of many physical phenomena such as the diabatic effects due to melting snow, wind flow around and over terrain, diurnal flow reversal in valleys associated with daytime heating, and precipitation reductions and increases due to local terrain. Many studies related to these phenomena are described in the Special Issue on SNOW-V10 for which this paper was written. Numerical weather prediction and nowcast models have been evaluated against the unique observational data set now available. It is anticipated that the data set and the knowledge learned as a result of SNOW-V10 will become a resource for other World Meteorological Organization member states who are interested in improving forecasts of winter weather.
Analysis of price behavior in lazy $-game
NASA Astrophysics Data System (ADS)
Kiniwa, Jun; Koide, Takeshi; Sandoh, Hiroaki
2009-09-01
A non-cooperative iterated multiagent game, called a minority game, and its variations have been extensively studied in this decade. To increase its market similarity, a $-game was presented by observing the current and the next agent's payoffs. However, since the $-game is defined as an offline game, it is difficult to simulate it in practice. So we propose a new online version of the $-game, called a lazy $-game, and analyze the price behavior of the game. First, we reveal the condition of a bubble phenomenon in the lazy $-game. Next, we investigate the price behavior in the lazy $-game and show that there are some upper/lower bounds of the price as long as both the buyers group and the sellers group are nonempty. Then, we consider the similarity between the lazy $-game and the $-game. Finally, we present some simulation results.
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
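The decision problem posed in such games is usually formalised with the classic cost-loss model: act whenever the forecast probability exceeds the ratio of protection cost to avoidable loss. The numbers below are illustrative, not taken from the experiment.

```python
# Sketch of the cost-loss decision rule behind risk-based flood games:
# protect whenever the forecast flood probability exceeds C/L.

def expected_cost(p_flood: float, protect: bool, C: float, L: float) -> float:
    """Expected cost of acting (pay C) vs. waiting (risk p_flood * L)."""
    return C if protect else p_flood * L

C, L = 10.0, 100.0            # protection cost and flood damage (made up)
threshold = C / L             # act when p > C/L = 0.1

for p in (0.05, 0.10, 0.30):
    act = p > threshold
    print(f"p={p:.2f}: protect={act}, "
          f"E[cost|act]={expected_cost(p, True, C, L):.1f}, "
          f"E[cost|wait]={expected_cost(p, False, C, L):.1f}")
# A deterministic forecast forces an all-or-nothing call; a probabilistic
# forecast lets the decision track the C/L ratio, which is the sense in
# which it can lead to better decisions.
```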
Development of the nowcasting system for the XVII Asiad at Korea Meteorological Administration
NASA Astrophysics Data System (ADS)
Park, Kyungjeen; Kim, Juwon; Jang, Taekyu; Hwang, Seung On; Park, Yunho; Kim, Yoonjae; Park, Seonjoo; Joo, Sangwon; Noh, Hae Mi
2014-05-01
The XVII Asiad, known as the 2014 Asian Games, is the largest sporting event in Asia. It will be held in Incheon, South Korea from September 19 to October 4, with 437 events in 36 sports. To support the Games, the Korea Meteorological Administration developed the Incheon Data Assimilation and Prediction System (IDAPS) for nowcasting and very short range forecasts. The domain is centered on Incheon city and covers the central region of the Korean peninsula and adjacent seas. It repeats analysis and forecast processes with a 1-hour cycling interval. IDAPS has approximately 1 km horizontal resolution with 324 x 360 grids and 70 vertical layers. Three-dimensional variational data assimilation is applied to assimilate AWS, wind profiler, buoy, sonde, aircraft, scatwind, rain rate and radar products. The details of IDAPS and the experimental results will be given during the conference.
NASA Astrophysics Data System (ADS)
Yu, Qian; Fang, Debin; Zhang, Xiaoling; Jin, Chen; Ren, Qiyu
2016-06-01
Stochasticity plays an important role in the evolutionary dynamics of cyclic dominance within a finite population. To investigate the stochastic evolution of the behaviour of boundedly rational individuals, we model the Rock-Scissors-Paper (RSP) game as a finite, state-dependent Quasi Birth and Death (QBD) process. We assume that boundedly rational players can adjust their strategies by imitating the successful strategy according to the payoffs of the last round of the game, and then analyse the limiting distribution of the QBD process for the game's stochastic evolutionary dynamics. The results of the numerical experiments are exhibited as pseudo-colour ternary heat maps. Comparison of these diagrams shows that the convergence of the long-run equilibrium of the RSP game in populations depends on the population size, the parameters of the payoff matrix and the noise factor. The long-run equilibrium is asymptotically stable, neutrally stable or unstable according to the normalised parameters in the payoff matrix. Moreover, the results show that the distribution probability becomes more concentrated with a larger population size. This indicates that increasing the population size also increases the convergence speed of the stochastic evolution process while simultaneously reducing the influence of the noise factor.
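The payoff-based imitation rule can be sketched by direct Monte Carlo simulation. This is a sketch of the imitation dynamic only, not the QBD analysis itself; the payoff matrix normalisation and the noise factor kappa are illustrative assumptions.

```python
import numpy as np

# Monte Carlo sketch of noisy, payoff-based imitation in a finite-population
# Rock-Scissors-Paper game: each step, a random player imitates another with
# probability increasing in their payoff difference (Fermi rule).

rng = np.random.default_rng(3)
N = 120                                   # population size
A = np.array([[0, -1,  1],                # rock-scissors-paper payoffs
              [1,  0, -1],
              [-1, 1,  0]], dtype=float)
kappa = 0.5                               # noise factor

state = rng.integers(0, 3, size=N)        # strategy of each player
for step in range(5000):
    counts = np.bincount(state, minlength=3).astype(float)
    avg_payoff = A @ (counts / N)         # expected payoff of each strategy
    i, j = rng.integers(0, N, size=2)
    dp = avg_payoff[state[j]] - avg_payoff[state[i]]
    if rng.random() < 1.0 / (1.0 + np.exp(-dp / kappa)):   # Fermi imitation
        state[i] = state[j]

print("final strategy counts (R, S, P):", np.bincount(state, minlength=3))
```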
Prisoner's Dilemma and Chicken: A Communications Perspective.
ERIC Educational Resources Information Center
Dauber, Cori
1990-01-01
Analyzes nuclear strategic doctrine by applying understandings of human communication processes to deterrence theory and game theory. Focuses on communication as risk. Demonstrates flaws in game theory because of its mechanistic view, stressing that communication is a process not an event, and actions are bound by cultural dispositions. Advocates…
Virtual Games in Social Science Education
ERIC Educational Resources Information Center
Lopez, Jose M. Cuenca; Caceres, Myriam J. Martin
2010-01-01
The new technologies make the appearance of highly motivating and dynamic games with different levels of interaction possible, in which large amounts of data, information, procedures and values are included which are intimately bound with the social sciences. We set out from the hypothesis that videogames may become interesting resources for their…
Explicit Lower and Upper Bounds on the Entangled Value of Multiplayer XOR Games
NASA Astrophysics Data System (ADS)
Briët, Jop; Vidick, Thomas
2013-07-01
The study of quantum-mechanical violations of Bell inequalities is motivated by the investigation, and the eventual demonstration, of the nonlocal properties of entanglement. In recent years, Bell inequalities have found a fruitful re-formulation using the language of multiplayer games originating from Computer Science. This paper studies the nonlocal properties of entanglement in the context of the simplest such games, called XOR games. When there are two players, it is well known that the maximum bias (the advantage over random play) of players using entanglement can be at most a constant times greater than that of classical players. Recently, Pérez-García et al. (Commun. Math. Phys. 279:455, 2008) showed that no such bound holds when there are three or more players: the use of entanglement can provide an unbounded advantage, and scale with the number of questions in the game. Their proof relies on non-trivial results from operator space theory, and gives a non-explicit existence proof, leading to a game with a very large number of questions and only loose control over the local dimension of the players' shared entanglement. We give a new, simple and explicit (though still probabilistic) construction of a family of three-player XOR games which achieve a large quantum-classical gap (QC-gap). This QC-gap is exponentially larger than the one given by Pérez-García et al. in terms of the size of the game, achieving a QC-gap of order √N with N² questions per player. In terms of the dimension of the entangled state required, we achieve the same (optimal) QC-gap of √N for a state of local dimension N per player. Moreover, the optimal entangled strategy is very simple, involving observables defined by tensor products of the Pauli matrices. Additionally, we give the first upper bound on the maximal QC-gap in terms of the number of questions per player, showing that our construction is only quadratically off in that respect. Our results rely on probabilistic estimates on the norm of random matrices and higher-order tensors which may be of independent interest.
Tsirelson's bound and supersymmetric entangled states
Borsten, L.; Brádler, K.; Duff, M. J.
2014-01-01
A superqubit, belonging to a (2|1)-dimensional super-Hilbert space, constitutes the minimal supersymmetric extension of the conventional qubit. In order to see whether superqubits are more non-local than ordinary qubits, we construct a class of two-superqubit entangled states as a non-local resource in the CHSH game. Since super-Hilbert-space amplitudes are Grassmann numbers, the result depends on how we extract real probabilities, and we examine three choices of map: (1) DeWitt, (2) Trigonometric and (3) Modified Rogers. In cases (1) and (2), the winning probability reaches the Tsirelson bound p_win = cos²(π/8) ≈ 0.8536 of standard quantum mechanics. Case (3) crosses Tsirelson's bound with p_win ≈ 0.9265. Although all states used in the game involve probabilities lying between 0 and 1, case (3) permits other changes of basis inducing negative transition probabilities. PMID:25294964
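A quick numerical check relates the quoted bound to its more familiar form (this is standard quantum mechanics, nothing specific to the superqubit construction):

```python
import math

# The optimal quantum CHSH winning probability equals cos^2(pi/8), which is
# the same number as 1/2 + sqrt(2)/4, i.e. the 2*sqrt(2) Bell-violation form.
p_win = math.cos(math.pi / 8) ** 2
print(p_win)                     # 0.8535533905932737
print(0.5 + math.sqrt(2) / 4)    # identical value
```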
Discrete post-processing of total cloud cover ensemble forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian
2017-04-01
This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference: Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review, 144, 2565-2577.
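The multinomial variant can be sketched in a few lines. The synthetic predictors (ensemble mean and spread) and the three cloud-cover classes below are illustrative assumptions; the paper's preferred proportional odds model additionally exploits the ordering of the categories and would need an ordinal regression implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch of the multinomial-logistic post-processing idea: map ensemble
# statistics to probabilities of discrete cloud-cover classes. Synthetic
# data; real inputs would be ECMWF ensemble forecasts at stations.

rng = np.random.default_rng(4)
n = 2000
ens_mean = rng.uniform(0, 8, n)                 # ensemble mean cloud cover (okta)
ens_std = rng.uniform(0, 3, n)                  # ensemble spread
X = np.column_stack([ens_mean, ens_std])
# synthetic 'observed' category in {0 (clear), 1 (partly), 2 (overcast)}
y = np.clip(np.round(ens_mean / 4 + rng.normal(0, 0.5, n)), 0, 2).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)  # multinomial for >2 classes
probs = model.predict_proba([[6.5, 1.0]])[0]
print({c: f"{p:.2f}" for c, p in zip(["clear", "partly", "overcast"], probs)})
```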
A Pumping Algorithm for Ergodic Stochastic Mean Payoff Games with Perfect Information
NASA Astrophysics Data System (ADS)
Boros, Endre; Elbassioni, Khaled; Gurvich, Vladimir; Makino, Kazuhisa
In this paper, we consider two-person zero-sum stochastic mean payoff games with perfect information, or BWR-games, given by a digraph G = (V = V_B ∪ V_W ∪ V_R, E), with local rewards r: E → ℝ, and three types of vertices: black V_B, white V_W, and random V_R. The game is played by two players, White and Black: When the play is at a white (black) vertex v, White (Black) selects an outgoing arc (v,u). When the play is at a random vertex v, a vertex u is picked with the given probability p(v,u). In all cases, Black pays White the value r(v,u). The play continues forever, and White aims to maximize (Black aims to minimize) the limiting mean (that is, average) payoff. It was recently shown in [7] that BWR-games are polynomially equivalent with the classical Gillette games, which include many well-known subclasses, such as cyclic games, simple stochastic games (SSG's), stochastic parity games, and Markov decision processes. In this paper, we give a new algorithm for solving BWR-games in the ergodic case, that is, when the optimal values do not depend on the initial position. Our algorithm solves a BWR-game by reducing it, using a potential transformation, to a canonical form in which the optimal strategies of both players and the value for every initial position are obvious, since a locally optimal move in it is optimal in the whole game. We show that this algorithm is pseudo-polynomial when the number of random nodes is constant. We also provide an almost matching lower bound on its running time, and show that this bound holds for a wider class of algorithms. Let us add that the general (non-ergodic) case is at least as hard as SSG's, for which no pseudo-polynomial algorithm is known.
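To make the mean payoff objective concrete, here is a plain value-iteration sketch on a tiny deterministic BW-game (no random vertices). This is emphatically not the authors' pumping algorithm; it only illustrates the quantity v_T(s)/T that their potential transformation reasons about, on an invented toy graph.

```python
# Value iteration on a tiny deterministic mean payoff game. White maximises,
# Black minimises; v_T(s)/T approximates the game value from each vertex.

edges = {                         # vertex -> list of (successor, reward)
    "w1": [("b1", 3), ("b2", 1)],
    "b1": [("w1", 0), ("b2", 2)],
    "b2": [("w1", 5)],
}
white = {"w1"}                    # maximiser's vertices; the rest are Black's

def mean_payoff_values(edges, white, T=10_000):
    v = {s: 0.0 for s in edges}
    for _ in range(T):
        nv = {}
        for s, outs in edges.items():
            vals = [r + v[u] for (u, r) in outs]
            nv[s] = max(vals) if s in white else min(vals)
        v = nv
    return {s: v[s] / T for s in edges}

print(mean_payoff_values(edges, white))
# In the ergodic case all vertices share one value; a potential
# transformation then makes a locally optimal move globally optimal,
# which is the canonical form the paper's algorithm targets.
```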
Model Error Estimation for the CPTEC Eta Model
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; daSilva, Arlindo
1999-01-01
Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here with a number of simplifying assumption, a framework is developed for isolating the model error given the forecast error at two lead-times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.
On the Inefficiency of Equilibria in Linear Bottleneck Congestion Games
NASA Astrophysics Data System (ADS)
de Keijzer, Bart; Schäfer, Guido; Telelis, Orestis A.
We study the inefficiency of equilibrium outcomes in bottleneck congestion games. These games model situations in which strategic players compete for a limited number of facilities. Each player allocates his weight to a (feasible) subset of the facilities with the goal to minimize the maximum (weight-dependent) latency that he experiences on any of these facilities. We derive upper and (asymptotically) matching lower bounds on the (strong) price of anarchy of linear bottleneck congestion games for a natural load balancing social cost objective (i.e., minimize the maximum latency of a facility). We restrict our studies to linear latency functions. Linear bottleneck congestion games still constitute a rich class of games and generalize, for example, load balancing games with identical or uniformly related machines with or without restricted assignments.
Differential Games of inf-sup Type and Isaacs Equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaise, Hidehiro; Sheu, S.-J.
2005-06-15
Motivated by the work of Fleming, we provide a general framework to associate inf-sup type values with the Isaacs equations. We show that upper and lower bounds for the generators of inf-sup type are upper and lower Hamiltonians, respectively. In particular, the lower (resp. upper) bound corresponds to the progressive (resp. strictly progressive) strategy. By the Dynamic Programming Principle and identification of the generator, we can prove that the inf-sup type game is characterized as the unique viscosity solution of the Isaacs equation. We also discuss the Isaacs equation with a Hamiltonian given by a convex combination of the lower and upper Hamiltonians.
Network marketing with bounded rationality and partial information
NASA Astrophysics Data System (ADS)
Kiet, Hoang Anh Tuan; Kim, Beom Jun
2008-08-01
Network marketing has been proposed and used as a way to spread product information to consumers through social connections. We extend the previous game model of network marketing on a small-world tree network and propose two games: in the first model, with bounded rationality, each consumer makes the purchase decision stochastically, while in the second model consumers get only partial information due to the finite length of social connections. Via extensive numerical simulations, we find that as rationality is enhanced, not only the consumer surplus but also the firm's profit increases. The implication of our results is also discussed.
NASA Astrophysics Data System (ADS)
Jiang, Zhong-Zhong; He, Na; Qin, Xuwei; Ip, W. H.; Wu, C. H.; Yung, K. L.
2018-07-01
The emergence of online group-buying provides a new consumption pattern for consumers in the e-commerce era. However, many consumers realize that their interests sometimes cannot be guaranteed in the group-buying market due to the lack of regulation. This paper aims to develop effective regulation strategies for the online group-buying market. To the best of our knowledge, most existing studies assume that the three parties in the online group-buying market, i.e. the retailer, the group-buying platform and the consumer, are perfectly rational. To better understand the decision process, in this paper we incorporate the concept of bounded rationality. Firstly, a three-party evolutionary game model is established to study each player's strategy under bounded rationality. Secondly, the game model is simulated as a whole by adopting system dynamics to analyze its stability. Finally, theoretical analysis and extensive computational experiments are conducted to obtain managerial insights and regulation strategies for the online group-buying market. Our results clearly demonstrate that a suitable bonus-penalty measure can promote the healthy development of the online group-buying market.
Affective forecasting bias in preschool children.
Gautam, Shalini; Bulley, Adam; von Hippel, William; Suddendorf, Thomas
2017-07-01
Adults are capable of predicting their emotional reactions to possible future events. Nevertheless, they systematically overestimate the intensity of their future emotional reactions relative to how they feel when these events actually occur. The developmental origin of this "intensity bias" has not yet been examined. Two studies were conducted to test the intensity bias in preschool children. In the first study, 5-year-olds (N=30) predicted how they would feel if they won or lost various games. Comparisons with subsequent self-reported feelings indicated that participants overestimated how sad they would feel to lose the games but did not overestimate their happiness from winning. The second study replicated this effect in another sample of 5-year-olds (n=34) and also found evidence of an intensity bias in 4-year-olds (n=30). These findings provide the first evidence of a negative intensity bias in affective forecasting among young children. Copyright © 2017 Elsevier Inc. All rights reserved.
CMB constraints on running non-Gaussianity
NASA Astrophysics Data System (ADS)
Oppizzi, F.; Liguori, M.; Renzi, A.; Arroja, F.; Bartolo, N.
2018-05-01
We develop a complete set of tools for CMB forecasting, simulation and estimation of primordial running bispectra, arising from a variety of curvaton and single-field (DBI) models of Inflation. We validate our pipeline using mock CMB running non-Gaussianity realizations and test it on real data by obtaining experimental constraints on the f_NL running spectral index, n_NG, using WMAP 9-year data. Our final bounds (68% C.L.) read -0.6 < n_NG < 1.4, -0.3 < n_NG < 1.2, -1.1
NASA Astrophysics Data System (ADS)
Ohdaira, Tetsushi
2014-07-01
Previous studies discussing cooperation employ the best decision, in which every player knows all information regarding the payoff matrix and selects the strategy with the highest payoff. Therefore, they do not discuss cooperation based on an altruistic decision made with limited information (the bounded rational altruistic decision). In addition, they do not cover the case where every player can submit his/her strategy several times in a match of the game. This paper is based on Ohdaira's reconsideration of the bounded rational altruistic decision, and also employs the framework of the prisoner's dilemma game (PDG) with sequential strategy. The distinction between this study and Ohdaira's reconsideration is that the former covers a model of multiple groups, whereas the latter deals with a model of only two groups. Ohdaira's reconsideration shows that the bounded rational altruistic decision facilitates much more cooperation in the PDG with sequential strategy than Ohdaira and Terano's bounded rational second-best decision does. However, the details of cooperation among multiple groups based on the bounded rational altruistic decision have not been resolved yet. This study therefore shows how randomness in the network composed of multiple groups affects the increase of the average frequency of mutual cooperation (cooperation between groups) based on the bounded rational altruistic decision of multiple groups. We also discuss the results of the model in comparison with related studies which employ the best decision.
NASA Astrophysics Data System (ADS)
Karagiannis, Dionysios; Lazanu, Andrei; Liguori, Michele; Raccanelli, Alvise; Bartolo, Nicola; Verde, Licia
2018-07-01
We forecast constraints on primordial non-Gaussianity (PNG) and bias parameters from measurements of galaxy power spectrum and bispectrum in future radio continuum and optical surveys. In the galaxy bispectrum, we consider a comprehensive list of effects, including the bias expansion for non-Gaussian initial conditions up to second order, redshift space distortions, redshift uncertainties and theoretical errors. These effects are all combined in a single PNG forecast for the first time. Moreover, we improve the bispectrum modelling over previous forecasts, by accounting for trispectrum contributions. All effects have an impact on final predicted bounds, which varies with the type of survey. We find that the bispectrum can lead to improvements up to a factor ˜5 over bounds based on the power spectrum alone, leading to significantly better constraints for local-type PNG, with respect to current limits from Planck. Future radio and photometric surveys could obtain a measurement error of σ (f_{NL}^{loc}) ≈ 0.2. In the case of equilateral PNG, galaxy bispectrum can improve upon present bounds only if significant improvements in the redshift determinations of future, large volume, photometric or radio surveys could be achieved. For orthogonal non-Gaussianity, expected constraints are generally comparable to current ones.
A conceptual framework of game-informed principles for health professions education.
Ellaway, Rachel H
2016-01-01
Games have been used for training purposes for many years, but their use remains somewhat underdeveloped and under-theorized in health professional education. This paper considers the basis for using serious games (games that have an explicit educational purpose) in health professional education in terms of their underlying concepts and design principles. These principles can be understood as a series of game facets: competition and conflict, chance and luck, experience and performance, simulation and make-believe, tactics and strategies, media, symbols and actions, and complexity and difficulty. Games are distinct and bound in ways that other health professional education activities are not. The differences between games and simulation can be understood in terms of the interconnected concepts of isomorphism (convergence with real-world practice) and anisomorphism (divergence from real-world practice). Gaming facets can extend the instructional design repertoire in health professional education.
NASA Astrophysics Data System (ADS)
Bernier, Natacha B.; Bélair, Stéphane; Bilodeau, Bernard; Tong, Linying
2014-01-01
A dynamical model was experimentally implemented to provide high resolution forecasts at points of interest in the 2010 Vancouver Olympics and Paralympics region. In a first experiment, GEM-Surf, the near-surface and land-surface modeling system, is driven by operational atmospheric forecasts and used to refine the surface forecasts according to local surface conditions such as elevation and vegetation type. In this simple form, temperature and snow depth forecasts are improved mainly as a result of the better representation of real elevation. In a second experiment, screen-level observations and operational atmospheric forecasts are blended to drive a continuous cycle of near-surface and land-surface hindcasts. Hindcasts of the previous day's conditions are then regarded as today's optimized initial conditions. Hence, in this experiment, provided observations are available, observation-driven hindcasts continuously ensure that daily forecasts are issued from improved initial conditions. GEM-Surf forecasts obtained from improved short-range hindcasts produced using these better conditions result in improved snow depth forecasts. In a third experiment, assimilation of snow depth data is applied to further optimize GEM-Surf's initial conditions, in addition to the use of blended observations and forecasts for forcing. Results show that snow depth and summer temperature forecasts are further improved by the addition of snow depth data assimilation.
ERIC Educational Resources Information Center
Enzer, Selwyn
1977-01-01
Futures research offers new tools for forecasting and for designing alternative intervention strategies. Interactive cross-impact modeling is presented as a useful method for identifying future events. (Author/MV)
Improved Taxation Rate for Bin Packing Games
NASA Astrophysics Data System (ADS)
Kern, Walter; Qiu, Xian
A cooperative bin packing game is an N-person game, where the player set N consists of k bins of capacity 1 each and n items of sizes a_1, …, a_n. The value of a coalition of players is defined to be the maximum total size of items in the coalition that can be packed into the bins of the coalition. We present an alternative proof for the non-emptiness of the 1/3-core for all bin packing games and show how to improve this bound ε = 1/3 (slightly). We conjecture that the true best possible value is ε = 1/7.
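The coalition value function is easy to sketch for toy instances. Note the feasibility check below uses first-fit decreasing, a heuristic, so for more than one bin the result is a lower bound on the true coalition value (exact packing is NP-hard in general); the item sizes are invented.

```python
from itertools import combinations

# Sketch of the value function of a bin packing game: the value of a
# coalition is the maximum total size of its items packable into its
# k unit-capacity bins. Brute force over item subsets, toy sizes only.

def packable(items, k):
    """Heuristic feasibility check (first-fit decreasing into k unit bins).
    Exact for k=1; for k>1 it may reject some feasible packings."""
    bins = [0.0] * k
    for a in sorted(items, reverse=True):
        for i in range(k):
            if bins[i] + a <= 1.0 + 1e-9:
                bins[i] += a
                break
        else:
            return False
    return True

def coalition_value(items, k):
    best = 0.0
    for r in range(len(items) + 1):
        for sub in combinations(items, r):
            if packable(sub, k) and sum(sub) > best:
                best = sum(sub)
    return best

print(coalition_value([0.6, 0.5, 0.4, 0.3], k=1))   # 1.0 -> pack 0.6 + 0.4
```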
NASA Astrophysics Data System (ADS)
Pimentel, F. P.; Marques Da Cruz, L.; Cabral, M. M.; Miranda, T. C.; Garção, H. F.; Oliveira, A. L. S. C.; Carvalho, G. V.; Soares, F.; São Tiago, P. M.; Barmak, R. B.; Rinaldi, F.; dos Santos, F. A.; Da Rocha Fragoso, M.; Pellegrini, J. C.
2016-02-01
Marine debris is a widespread pollution issue that affects almost all water bodies and is remarkably relevant in estuaries and bays. Rio de Janeiro city will host the 2016 Olympic Games and Guanabara Bay will be the venue for the sailing competitions. Historically serving as deposit for all types of waste, this water body suffers with major environmental problems, one of them being the massive presence of floating garbage. Therefore, it is of great importance to count on effective contingency actions to address this issue. In this sense, an operational ocean forecasting system was designed and it is presently being used by the Rio de Janeiro State Government to manage and control the cleaning actions on the bay. The forecasting system makes use of high resolution hydrodynamic and atmospheric models and a lagragian particle transport model, in order to provide probabilistic forecasts maps of the areas where the debris are most probably accumulating. All the results are displayed on an interactive GIS web platform along with the tracks of the boats that make the garbage collection, so the decision makers can easily command the actions, enhancing its efficiency. The integration of in situ data and advanced techniques such as Lyapunov exponent analysis are also being developed in the system, so to increase its forecast reliability. Additionally, the system also gathers and compiles on its database all the information on the debris collection, including quantity, type, locations, accumulation areas and their correlation with the environmental factors that drive the runoff and surface drift. Combining probabilistic, deterministic and statistical approaches, the forecasting system of Guanabara Bay has been proving to be a powerful tool for the environmental management and will be of great importance on helping securing the safety and fairness of the Olympic sailing competitions. The system design, its components and main results are presented in this paper.
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics:

1. Upstream catchments with high influence of the weather forecast. a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast time step of each ensemble member. d) For each forecast time step, the overall error distribution (i.e., over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice.

2. Downstream catchments with low influence of the weather forecast. In downstream catchments with strong human impact on discharge (e.g., by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing; here, the corresponding inflow hydrographs from all upstream catchments must additionally be used. b) As for an upstream catchment, the uncertainty range is determined by combining the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range.

In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles.
As in part I of this study, the methodology, as well as the usefulness (or lack thereof) of the resulting uncertainty ranges, will be presented and discussed using typical examples.
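The envelope construction for the upstream case can be sketched numerically: superimpose model-error samples on each ensemble member, pool per time step, and take percentiles. Member trajectories and the error spread below are made up for illustration.

```python
import numpy as np

# Numerical sketch of the uncertainty-envelope construction: add an
# empirical 'model error' distribution to each hydrological ensemble
# member, pool the samples per time step, and extract the 10%/90%
# percentiles as the forecast envelope.

rng = np.random.default_rng(5)
n_members, n_steps = 16, 48                       # e.g. COSMO-LEPS driven runs
members = 100 + np.cumsum(rng.normal(0, 2, (n_members, n_steps)), axis=1)

# empirical model-error samples (would come from hindcast tests in practice)
model_error = rng.normal(0, 3, size=500)

pooled = members[:, None, :] + model_error[None, :, None]   # member x error x t
pooled = pooled.reshape(-1, n_steps)
lower = np.percentile(pooled, 10, axis=0)
upper = np.percentile(pooled, 90, axis=0)
print("envelope width at final step:", upper[-1] - lower[-1])
```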
Rational and Boundedly Rational Behavior in a Binary Choice Sender-Receiver Game
ERIC Educational Resources Information Center
Landi, Massimiliano; Colucci, Domenico
2008-01-01
The authors investigate the strategic rationale behind the message sent by Osama bin Laden on the eve of the 2004 U.S. Presidential elections. They model this situation as a signaling game in which a population of receivers takes a binary choice, the outcome is decided by majority rule, sender and receivers have conflicting interests, and there is…
Gambling scores for earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang
2010-04-01
This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
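For binary alarms the fair-rule bookkeeping is simple to sketch. Below, p0 denotes the reference (house) probability of the target event under the Poisson-style reference model; the stakes and probabilities are illustrative.

```python
# Sketch of gambling-score bookkeeping for binary alarms: with reference
# probability p0, a correct alarm of stake b pays b*(1-p0)/p0 reputation
# points, and a miss loses the stake b, so the expected gain is zero if
# events actually follow the reference model.

def settle_bet(reputation: float, stake: float, p0: float, hit: bool) -> float:
    if hit:
        return reputation + stake * (1.0 - p0) / p0
    return reputation - stake

rep = 100.0                      # starting reputation points
forecasts = [(1.0, 0.05, True),  # (stake, reference probability, outcome)
             (1.0, 0.05, False),
             (2.0, 0.20, True)]
for stake, p0, hit in forecasts:
    rep = settle_bet(rep, stake, p0, hit)
print(f"reputation after settling: {rep:.1f}")   # 126.0
# A forecaster beats the reference only by repeatedly winning low-p0
# (risky) bets, which is how the score compensates for risk taken.
```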
Realistic Affective Forecasting: The Role of Personality
Hoerger, Michael; Chapman, Ben; Duberstein, Paul
2016-01-01
Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463
Constant Price of Anarchy in Network Creation Games via Public Service Advertising
NASA Astrophysics Data System (ADS)
Demaine, Erik D.; Zadimoghaddam, Morteza
Network creation games have been studied in many different settings recently. These games are motivated by social networks in which selfish agents want to construct a connection graph among themselves. Each node wants to minimize its average or maximum distance to the others, without paying much to construct the network. Many generalizations have been considered, including non-uniform interests between nodes, general graphs of allowable edges, bounded-budget agents, etc. In all of these settings, there is no known constant bound on the price of anarchy. In fact, in many cases the price of anarchy can be very large, namely a constant power of the number of agents. This means that we have no control over the behavior of the network when agents act selfishly. On the other hand, the price of stability in all these models is constant, which means that there is a chance that agents act selfishly and we end up with a reasonable social cost.
Characterizing the Nash equilibria of three-player Bayesian quantum games
NASA Astrophysics Data System (ADS)
Solmeyer, Neal; Balu, Radhakrishnan
2017-05-01
Quantum games with incomplete information can be studied within a Bayesian framework. We analyze games quantized within the EWL framework [Eisert, Wilkens, and Lewenstein, Phys. Rev. Lett. 83, 3077 (1999)]. We solve for the Nash equilibria of a variety of two-player quantum games and compare the results to the solutions of the corresponding classical games. We then analyze Bayesian games where there is uncertainty about the player types in two-player conflicting interest games. The solutions to the Bayesian games are found to have a phase diagram-like structure where different equilibria exist in different parameter regions, depending both on the amount of uncertainty and the degree of entanglement. We find that in games where a Pareto-optimal solution is not a Nash equilibrium, it is possible for the quantized game to have an advantage over the classical version. In addition, we analyze the behavior of the solutions as the strategy choices approach an unrestricted operation. We find that some games have a continuum of solutions, bounded by the solutions of a simpler restricted game. A deeper understanding of Bayesian quantum game theory could lead to novel quantum applications in a multi-agent setting.
[New Developments in Video Games for Psychotherapy].
Brezinka, Veronika
2016-01-01
A literature survey on new developments in the area of video games and psychotherapy of children and adolescents was conducted. Despite the omnipresence of computers and the internet, development of therapeutic games seems rather slow. The video game Treasure Hunt was introduced in 2008 to support treatment of children with internalizing and externalizing disorders. Camp Cope-A-Lot was developed for treatment of anxious children, whereas the self-help game SPARX is directed at depressed adolescents. Rage-Control is a biofeedback game for children with anger problems. The game Zoo U aims to assess and train social skills of primary school children. Ricky and the Spider for young children with obsessive compulsive disorder is meant to support the cognitive-behavioural treatment of these patients. Clash- Back is a French game for adolescents with externalizing problems. Possible reasons for the relatively slow development of therapeutic games are the high methodological demands concerning an evaluation as well as the high costs of game development. Nonetheless, computers and the internet are bound to influence psychotherapy with children and adolescents in the long run.
Regional crop yield forecasting: a probabilistic approach
NASA Astrophysics Data System (ADS)
de Wit, A.; van Diepen, K.; Boogaard, H.
2009-04-01
Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
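The multi-category Heidke skill score used in this verification is easy to compute from a forecast-observation contingency table. The table below is invented for illustration; the formula itself is the standard multi-category HSS.

```python
import numpy as np

# Multi-category Heidke skill score:
# HSS = (P_correct - P_chance) / (1 - P_chance), where P_chance is the
# hit rate expected from the marginal forecast/observation frequencies.

def heidke_skill_score(table: np.ndarray) -> float:
    p = table / table.sum()              # joint relative frequencies
    p_correct = np.trace(p)              # observed proportion correct
    p_chance = float(p.sum(axis=0) @ p.sum(axis=1))
    return (p_correct - p_chance) / (1.0 - p_chance)

# rows = forecast category, columns = observed category (3 classes, made up)
table = np.array([[50, 10,  5],
                  [12, 40,  8],
                  [ 3,  9, 30]], dtype=float)
print(f"HSS = {heidke_skill_score(table):.3f}")   # 1 = perfect, 0 = chance
```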
Performances of One-Round Walks in Linear Congestion Games
NASA Astrophysics Data System (ADS)
Bilò, Vittorio; Fanelli, Angelo; Flammini, Michele; Moscardelli, Luca
We investigate the approximation ratio of the solutions achieved after a one-round walk in linear congestion games. We consider the social functions Sum, defined as the sum of the players' costs, and Max, defined as the maximum cost per player, as a measure of the quality of a given solution. For the social function Sum and one-round walks starting from the empty strategy profile, we close the gap between the upper bound of 2+√5 ≈ 4.24 given in [8] and the lower bound of 4 derived in [4] by providing a matching lower bound whose construction and analysis require non-trivial arguments. For the social function Max, for which, to the best of our knowledge, no results were known prior to this work, we show an approximation ratio of Θ(n^{3/4}) (resp. Θ(n√n)), where n is the number of players, for one-round walks starting from the empty (resp. an arbitrary) strategy profile.
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.
2013-01-01
Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.
Low-dimensional Representation of Error Covariance
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan
2000-01-01
Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
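A minimal numerical sketch of the central idea, on an invented toy system rather than the paper's advection or baroclinic wave models: carry a Kalman filter's forecast/analysis cycle to steady state and inspect the eigenvalue spectrum of the analysis error covariance to see how much variance a few leading modes capture.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5                     # state and observation dimensions (toy sizes)
M = 0.95 * np.linalg.qr(rng.standard_normal((n, n)))[0]  # stable linear dynamics
H = rng.standard_normal((p, n))  # observation operator
Q = 0.01 * np.eye(n)             # model error covariance
R = 0.1 * np.eye(p)              # observation error covariance

Pa = np.eye(n)
for _ in range(2000):            # forecast/analysis cycle carried to steady state
    Pf = M @ Pa @ M.T + Q                                  # forecast step
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)         # Kalman gain
    Pa = (np.eye(n) - K @ H) @ Pf                          # analysis step
    Pa = 0.5 * (Pa + Pa.T)                                 # keep symmetric numerically

eigvals = np.sort(np.linalg.eigvalsh(Pa))[::-1]
print("fraction of steady-state analysis error variance in 5 leading modes:",
      round(eigvals[:5].sum() / eigvals.sum(), 3))
```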
Simmons, Joseph P; Massey, Cade
2012-11-01
Is optimism real, or are optimistic forecasts just cheap talk? To help answer this question, we investigated whether optimistic predictions persist in the face of large incentives to be accurate. We asked National Football League football fans to predict the winner of a single game. Roughly half (the partisans) predicted a game involving their favorite team, and the other half (the neutrals) predicted a game involving 2 teams they were neutral about. Participants were promised either a small incentive ($5) or a large incentive ($50) for correctly predicting the game's winner. Optimism emerged even when incentives were large, as partisans were much more likely than neutrals to predict partisans' favorite teams to win. Strong optimism also emerged among participants whose responses to follow-up questions strongly suggested that they believed the predictions they made. This research supports the claim that optimism is real. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
A new accuracy measure based on bounded relative error for time series forecasting.
Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M
2017-01-01
Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred.
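A minimal sketch of UMBRAE as we read its definition here: average the bounded relative absolute error BRAE_t = |e_t| / (|e_t| + |e*_t|), where e* is the benchmark forecast's error, then unscale the mean. The variable names and the handling of zero-error steps are ours.

```python
import numpy as np

def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (sketch).

    Values below 1 mean the forecast beats the benchmark; 1 means parity.
    """
    e = np.abs(np.asarray(actual) - np.asarray(forecast))
    e_star = np.abs(np.asarray(actual) - np.asarray(benchmark))
    denom = e + e_star
    mask = denom > 0                 # steps where both errors are zero are dropped
    mbrae = np.mean(e[mask] / denom[mask])   # bounded in [0, 1]; 0.5 = parity
    return mbrae / (1.0 - mbrae)

# Example with a naive (previous-observation) benchmark:
y = np.array([10.0, 12.0, 11.0, 13.0, 14.0])
f = np.array([10.5, 11.5, 11.2, 12.6, 14.3])
naive = np.array([9.0, 10.0, 12.0, 11.0, 13.0])
print(round(umbrae(y, f, naive), 3))
```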
Optional contributions have positive effects for volunteering public goods games
NASA Astrophysics Data System (ADS)
Song, Qi-Qing; Li, Zhen-Peng; Fu, Chang-He; Wang, Lai-Sheng
2011-11-01
Public goods (PG) games with the volunteering mechanism are referred to as volunteering public goods (VPG) games, in which loners are introduced into the PG games; a loner obtains a constant payoff without participating in the game. Considering that small contributions may have a positive effect in encouraging more players with bounded rationality to contribute, this paper introduces optional contributions (high value or low value) to these typical VPG games: a cooperator can contribute a high or a low payoff to the public pool. With the low contribution, the logit dynamics show that cooperation can be promoted in a well-mixed population compared to the typical VPG games; furthermore, when the multiplication factor is greater than a threshold, the average payoff of the population is also enhanced. In spatial VPG games, we introduce a new adjusting mechanism that is an approximation to best response. Some results in agreement with the predictions of the logit dynamics are found. These simulation results reveal that for VPG games the option of low contributions may be a better method to stimulate the growth of the cooperation frequency and the average payoff of the population.
An Annotated Bibliography on Second Language Acquisition
1994-06-01
not assessment purposes. Examples of three ’serious’ rote learning programs are SPRECH, GERAD, and LEX. DEDUCT is a question/answer game where the...interactively create game-trees for context-bound vocabulary drills from scratch without modifications to the program software through CREATE. C 1985...II (pp. 223-294). Stanford, CA: Heuristech Press. The goal of CAI research is to build instructional programs that incorporate well-prepared course
Short-term leprosy forecasting from an expert opinion survey.
Deiner, Michael S; Worden, Lee; Rittel, Alex; Ackley, Sarah F; Liu, Fengchen; Blum, Laura; Scott, James C; Lietman, Thomas M; Porco, Travis C
2017-01-01
We conducted an expert survey of leprosy (Hansen's Disease) and neglected tropical disease experts in February 2016. Experts were asked to forecast the next year of reported cases for the world, for the top three countries, and for selected states and territories of India. A total of 103 respondents answered at least one forecasting question. We elicited lower and upper confidence bounds. Comparing these results to regression and exponential smoothing, we found no evidence that any forecasting method outperformed the others. We found evidence that experts who believed it was more likely to achieve global interruption of transmission goals and disability reduction goals had higher error scores for India and Indonesia, but lower for Brazil. Even for a disease whose epidemiology changes on a slow time scale, forecasting exercises such as we conducted are simple and practical. We believe they can be used on a routine basis in public health.
Palm oil price forecasting model: An autoregressive distributed lag (ARDL) approach
NASA Astrophysics Data System (ADS)
Hamid, Mohd Fahmi Abdul; Shabri, Ani
2017-05-01
Palm oil prices have fluctuated without any clear trend or cyclical pattern over the last few decades. The instability of food commodity prices causes them to change rapidly over time. This paper attempts to develop an Autoregressive Distributed Lag (ARDL) model for modeling and forecasting the price of palm oil. In order to use ARDL as a forecasting model, this paper modifies the data structure so that only lagged explanatory variables are considered to explain the variation in the palm oil price. We then compare the performance of this ARDL model with a benchmark model, namely ARIMA, in terms of forecasting accuracy. This paper also utilizes the ARDL bounds testing approach to cointegration to examine the short-run and long-run relationships between the palm oil price and its determinants: production, stock, the price of soybean as a substitute for palm oil, and the price of crude oil. The comparison suggests that the ARDL model has better forecasting accuracy than ARIMA.
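For readers who want to experiment with this kind of lagged-regressors-only ARDL fit, a sketch follows, assuming the ARDL implementation available in recent statsmodels (0.13 or later); the series and column names are synthetic stand-ins for the paper's data, not the actual palm oil series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Synthetic monthly stand-ins for the paper's series (names illustrative).
rng = np.random.default_rng(0)
idx = pd.period_range("2000-01", periods=200, freq="M")
exog = pd.DataFrame(rng.standard_normal((200, 4)), index=idx,
                    columns=["production", "stock", "soybean_price", "crude_oil_price"])
palm = pd.Series(np.cumsum(rng.standard_normal(200)) + 50,
                 index=idx, name="palm_oil_price")

# Regressor lags [1, 2] (no lag 0) mirror the paper's choice of using only
# lagged explanatory variables for forecasting.
model = ARDL(palm[:-12], lags=2, exog=exog[:-12], order=[1, 2])
res = model.fit()
forecast = res.forecast(steps=12, exog=exog[-12:])  # 12-month holdout forecast
print(forecast.head())
```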
Complex Nonlinear Dynamic System of Oligopolies Price Game with Heterogeneous Players Under Noise
NASA Astrophysics Data System (ADS)
Liu, Feng; Li, Yaguang
A nonlinear four-oligopoly price game with heterogeneous players, who are boundedly rational and adaptive, is built using two different special demand costs. Based on the theory of complex discrete dynamical systems, the existing equilibrium points and their stability are investigated. The complex dynamic behavior is presented via bifurcation diagrams and Lyapunov exponents, showing equilibrium states, bifurcation, and chaos as parameters vary. As disturbance is ubiquitous in economic systems, this paper focuses on the analysis of the delay feedback control method under noise. Stable dynamics is confirmed to depend mainly on a low price-adjustment speed, and when all four players have limited opportunities to stabilize the market, the new adaptive player facing economies of scale is found to earn higher profits than the boundedly rational incumbents.
The method of planning the energy consumption for electricity market
NASA Astrophysics Data System (ADS)
Russkov, O. V.; Saradgishvili, S. E.
2017-10-01
The limitations of existing forecast models are identified. The proposed method is based on game theory, probability theory, and the forecasting of energy price relations. The new method forms the basis for planning the uneven energy consumption of an industrial enterprise. The ecological aspect of the proposed method is discussed, and a program module implementing the method's algorithm is described. Successful tests of the method at an industrial enterprise are reported. The proposed method allows optimizing the difference between the planned and actual consumption of energy for every hour of the day. Conclusions are drawn about the applicability of the method for addressing economic and ecological challenges.
Tightness of correlation inequalities with no quantum violation
NASA Astrophysics Data System (ADS)
Ramanathan, Ravishankar; Quintino, Marco Túlio; Sainz, Ana Belén; Murta, Gláucia; Augusiak, Remigiusz
2017-01-01
We study the faces of the set of quantum correlations, i.e., the Bell and noncontextuality inequalities without any quantum violation. First, we investigate the question of whether every proper (facet-defining) Bell inequality for two parties, other than the trivial ones from positivity, normalization, and no-signaling, can be violated by quantum correlations, i.e., whether the classical Bell polytope or the smaller correlation polytope share any facets with their respective quantum sets. To do this, we develop a recently derived bound on the quantum value of linear games based on the norms of game matrices to give a simple sufficient condition to identify linear games with no quantum advantage. Additionally we show how this bound can be extended to the general class of unique games. We then show that the paradigmatic examples of correlation Bell inequalities with no quantum violation, namely the nonlocal computation games, do not constitute facet-defining Bell inequalities, not even for the correlation polytope. We also extend this to an arbitrary prime number of outcomes for a specific class of these games. We then study the faces in the simplest Clauser-Horne-Shimony-Holt Bell scenario of binary dichotomic measurements, and identify edges in the set of quantum correlations in this scenario. Finally, we relate the noncontextual polytope of single-party correlation inequalities with the cut polytope CUT(∇G), where G denotes the compatibility graph of observables in the contextuality scenario and ∇G denotes the suspension graph of G. We observe that there exist facet-defining noncontextuality inequalities with no quantum violation, and furthermore that this set of inequalities is beyond those implied by the consistent exclusivity principle.
2012-01-20
u_i^σ(t) = ∑_{a∈A} (σ_1(t_1)(a_1) × … × σ_m(t_m)(a_m)) u_i(t, a), where t = (t_1, …, t_m) and a = (a_1, …, a_m). Player i's expected utility if σ is played, denoted U_i(σ), is then just E_Pr[u_i^σ] = ∑_{t∈T} … well-known Traveler's Dilemma (K. Basu, The traveler's dilemma: paradoxes of rationality in game theory, American Economic Review 84:2, 1994, pp. 391
Short-Term Forecasts Using NU-WRF for the Winter Olympics 2018
NASA Technical Reports Server (NTRS)
Srikishen, Jayanthi; Case, Jonathan L.; Petersen, Walter A.; Iguchi, Takamichi; Tao, Wei-Kuo; Zavodsky, Bradley T.; Molthan, Andrew
2017-01-01
The NASA-Unified Weather Research and Forecasting model (NU-WRF) will be included for testing and evaluation in the forecast demonstration project (FDP) of the International Collaborative Experiment - PyeongChang 2018 Olympic and Paralympic (ICE-POP) Winter Games. An international array of radar and supporting ground-based observations together with various forecast and nowcast models will be operational during ICE-POP. In conjunction with personnel from NASA's Goddard Space Flight Center, the NASA Short-term Prediction Research and Transition (SPoRT) Center is developing benchmark simulations for a real-time NU-WRF configuration to run during the FDP. ICE-POP observational datasets will be used to validate model simulations and investigate improved model physics and performance for prediction of snow events during the research phase (RDP) of the project. The NU-WRF model simulations will also support NASA Global Precipitation Measurement (GPM) Mission ground-validation physical and direct validation activities in relation to verifying, testing and improving satellite-based snowfall retrieval algorithms over complex terrain.
Decision-making under uncertainty: results from an experiment conducted at EGU 2012
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; van Andel, Schalk Jan; Pappenberger, Florian
2013-04-01
Do probabilistic forecasts lead to better decisions? At the EGU General Assembly 2012, we conducted a laboratory-style experiment to address this question. Several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Participants were prompted to make decisions when forecasts were provided with and without uncertainty information. They had to decide whether or not to open a gate at the inlet of a retention basin designed to protect a town. The rules were as follows: if they decided to open the gate, the retention basin was flooded and the farmers in this basin demanded compensation for the flooding of their land; if they decided not to open the gate and a flood occurred on the river, the town was flooded and they had to pay a fine to the town. Participants were encouraged to keep note of their individual decisions in a worksheet. About 100 worksheets were collected at the end of the game and the results of their evaluation are presented here. In general, they show that decisions are based on a combination of what is displayed by the expected (forecast) value and what is given by the uncertainty information. In the absence of uncertainty information, decision makers are compelled towards a more risk-averse attitude. Moreover, more money was lost by a large majority of participants when they had to make decisions without uncertainty information. Limitations of the experimental setting are discussed, as well as the importance of developing training tools to increase effectiveness in the use of probabilistic predictions to support decisions under uncertainty.
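The decision structure of the game is the classic cost-loss problem; the following minimal sketch (with invented payoff numbers, not the experiment's actual stakes) shows how a probabilistic forecast enters the decision.

```python
# Opening the gate incurs a known compensation cost C to farmers; keeping it
# closed risks a fine L if the river floods. With a probabilistic forecast
# giving P(flood) = p, expected cost is minimised by opening when p * L > C.
def decide(p_flood: float, compensation: float, fine: float) -> str:
    expected_loss_if_closed = p_flood * fine
    return "open gate" if expected_loss_if_closed > compensation else "keep closed"

print(decide(p_flood=0.3, compensation=20.0, fine=100.0))  # -> open gate
print(decide(p_flood=0.1, compensation=20.0, fine=100.0))  # -> keep closed
```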
Robust allocation of a defensive budget considering an attacker's private information.
Nikoofal, Mohammad E; Zhuang, Jun
2012-05-01
Attackers' private information is one of the main issues in defensive resource allocation games in homeland security. The outcome of a defense resource allocation decision critically depends on the accuracy of estimations about the attacker's attributes. However, terrorists' goals may be unknown to the defender, necessitating robust decisions by the defender. This article develops a robust-optimization game-theoretical model for identifying optimal defense resource allocation strategies for a rational defender facing a strategic attacker when the attacker's valuation of targets, the most critical attribute of the attacker, is unknown but belongs to bounded distribution-free intervals. To the best of our knowledge, no previous research has applied robust optimization to homeland security resource allocation when uncertainty is defined in bounded distribution-free intervals. The key features of our model include (1) modeling uncertainty in attackers' attributes, where uncertainty is characterized by bounded intervals; (2) finding the robust-optimization equilibrium for the defender using concepts dealing with the budget of uncertainty and the price of robustness; and (3) applying the proposed model to real data. © 2011 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Zhu, Zhiwei; Li, Tim
2017-01-01
Extended-range (10-30-day) rainfall forecasts over all of China were carried out using spatial-temporal projection models (STPMs). Using a rotated empirical orthogonal function analysis of intraseasonal (10-80-day) rainfall anomalies, China is divided into ten sub-regions. Different predictability sources were selected for each of the ten regions, and the forecast skills are ranked for each region. Based on the temporal correlation coefficient (TCC) and the Gerrity skill score, useful skill is found for most parts of China at a 20-25-day lead. Southern China and the mid-lower reaches of the Yangtze River Valley show the highest predictive skill, whereas southwestern China and the Huang-Huai region have the lowest. By combining forecast results from the ten regional STPMs, the TCC distribution of the 8-year (2003-2010) independent forecasts for all of China is investigated. The combined forecasts from the ten STPMs show significantly higher skill than a forecast with a single STPM for the whole country. Independent forecast examples of summer rainfall anomalies around the period of the Beijing Olympic Games in 2008 and the Shanghai World Expo in 2010 are presented. The results show that the current model is able to reproduce the gross pattern of summer intraseasonal rainfall over China at a 20-day lead. The present study provides, for the first time, a guide to the statistical extended-range forecasting of summer rainfall anomalies for all of China. It is anticipated that the ideas and methods proposed here will facilitate extended-range forecasting in China.
Market mechanism based on the endogenous changing of game types such as Minority-Majority games
NASA Astrophysics Data System (ADS)
Ahn, Sanghyun; Lim, Gyuchang; Kim, Sooyong; Kim, Kyungsik
2010-03-01
In many social and biological systems, agents simultaneously and adaptively compete for limited resources, thereby altering their environment. We propose an evolution function extending Minority-Majority Games that captures the competition between agents to make money. The dynamics endogenously changes the ratio of two types of boundedly rational traders, fundamentalists and chartists, through the payoff function. In previous game theories, the best strategies are not always those targeting the minority but those shifting opportunistically between the minority and the majority. Using a mixture of local bifurcation theory and numerical methods, we identify possible bifurcation routes to complicated asset price dynamics, including chaotic attractors. We thereby improve the decision logic of the agents for coupling the dynamics to the market. This work shows that removing unrealistic features of the game theories leads to models which reproduce behavior close to what is observed in real markets.
Inferring Structure and Forecasting Dynamics on Evolving Networks
2016-01-05
Topics include: (5) Team Formation; (6) Games of Graphs; (7) Sacred Values and Legitimacy in Network Interactions; (8) Network Processes in Geo-Social Context; and Authority, Cooperation and Competition in Religious Networks. Key papers: McBride 2015a [72] and McBride 2015b [73].
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; Wetterhall, Fredrik; Wood, Andy; Wang, Qj; Pappenberger, Florian; Verkade, Jan
2017-04-01
Since 2004, HEPEX (Hydrologic Ensemble Prediction Experiment) has been fostering a community of researchers and practitioners around the world. Through the years, it has helped to establish a more integrative view of hydrological forecasting, where data assimilation, hydro-meteorological modelling chains, post-processing techniques, expert knowledge, and decision support systems are connected to enhance operational systems and water management applications. Here we present the community activities in HEPEX that have contributed to strengthening this unfunded, volunteer effort for more than a decade. These include the organization of workshops, conference sessions, testbeds and inter-comparison experiments. More recently, HEPEX has also prompted the development of several publicly available role-play games and, since 2013, it has been running a blog portal (www.hepex.org), which is used as an intersection point for members. Through this website, members can continuously share their research, make announcements, report on workshops, projects and meetings, and hear about related research and operational challenges. It also creates a platform for early career scientists to become increasingly involved in hydrological forecasting science and applications.
Uncertainty relation in Schwarzschild spacetime
NASA Astrophysics Data System (ADS)
Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng
2015-04-01
We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit −log₂ c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
Software reliability: Additional investigations into modeling with replicated experiments
NASA Technical Reports Server (NTRS)
Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.
1984-01-01
The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.
Construction of Barrier in a Fishing Game With Point Capture.
Zha, Wenzhong; Chen, Jie; Peng, Zhihong; Gu, Dongbing
2017-06-01
This paper addresses a particular pursuit-evasion game, called the "fishing game", in which a faster evader attempts to pass through the gap between two pursuers. We are concerned with the conditions under which the evader or the pursuers can win the game. This is a game of kind, in which an essential construct, the barrier, separates the state space into disjoint parts associated with each player's winning region. We present an explicit-policy method for constructing the barrier. This method divides the fishing game into two subgames related to the included angle and the relative distances between the evader and the pursuers, respectively, and then analyzes the possibility of capture or escape for each subgame to ascertain the analytical form of the barrier. Furthermore, we fuse the games of kind and degree by solving the minimum-time optimal control strategies for each player when the initial state lies in their winning regions. Along with the optimal strategies, the trajectories of the players are delineated and upper bounds on their winning times are also derived.
Matching games with partial information
NASA Astrophysics Data System (ADS)
Laureti, Paolo; Zhang, Yi-Cheng
2003-06-01
We analyze different ways of pairing agents in a bipartite matching problem, with regard to its scaling properties and to the distribution of individual "satisfactions". Then we explore the role of partial information and bounded rationality in a generalized Marriage Problem, comparing the benefits obtained by self-searching and by a matchmaker. Finally we propose a modified matching game intended to mimic the way consumers' information leads firms to enhance the quality of their products in a competitive market.
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through these shared experiences. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved on the errors of the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
Forecasted economic change and the self-fulfilling prophecy in economic decision-making
2017-01-01
This study addresses the self-fulfilling prophecy effect in the domain of economic decision-making. We present experimental data in support of the hypothesis that speculative forecasts of economic change can impact individuals' economic decision behavior, prior to any realized changes. In a within-subjects experiment, participants (N = 40) played 180 trials in a Balloon Analogue Risk Task (BART) in which they could make actual profit. Simple messages about possible (positive and negative) changes in outcome probabilities of future trials had significant effects on measures of risk taking (number of inflations) and actual profits in the game. These effects were enduring, even though no systematic changes in actual outcome probabilities took place following any of the messages. Risk taking was also found to be reflected in reaction times, with riskier decisions showing longer reaction times. Positive and negative economic forecasts affected reaction time slopes differently, with negative forecasts resulting in increased reaction time slopes as a function of risk. These findings suggest that forecasted positive or negative economic change can bias people's mental model of the economy and reduce or stimulate risk taking. Possible implications for media-fulfilling prophecies in the domain of the economy are considered. PMID:28334031
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, which model the actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker's payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
Violation of Bell inequalities for arbitrary-dimensional bipartite systems
NASA Astrophysics Data System (ADS)
Yang, Yanmin; Zheng, Zhu-Jun
2018-01-01
In this paper, we consider the violation of Bell inequalities for quantum systems C^K ⊗ C^K (integer K ≥ 2) with a group-theoretical method. For M possible measurements in general, each with K outcomes, Bell inequalities based on the choice of two orbits are derived. When there are sufficiently many observables, the quantum bounds depend only on M and approach the classical bounds. Moreover, the corresponding nonlocal games with two different scenarios are analyzed.
Participatory Games: Experiential learning to bridge disciplines
NASA Astrophysics Data System (ADS)
Coughlan, E.; Suarez, P.; Mendler de Suarez, J.; Bachofen, C.
2014-12-01
While the benefits of multi-disciplinary education have been extolled, there is more to success than producing students who are able to articulate the theorems of all pertinent disciplines. Here, we will describe case studies in which participatory scenario exercises and games can make the difference between memorizing information from an "outside" discipline, and actually internalizing the priorities and complications of the issue from an alien perspective. Case studies include teaching Red Cross community-based volunteers the Probability Distribution Function of seasonal rainfall forecasts, as well as requiring students of Columbia University's Master's Program in Climate and Society to study both natural and social aspects of climate. Games create a model system of the world, in which players assume a role and make decisions with consequences, facing complex feedback loops. Taking such roles catalyzes "AHA" moments that effectively bring home the intricacies of disciplinary paradigms outside of one's own.
Accuracy of professional sports drafts in predicting career potential.
Koz, D; Fraser-Thomas, J; Baker, J
2012-08-01
The forecasting of talented players is a crucial aspect of building a successful sports franchise, and professional sports invest significant resources in making player choices in sport drafts. The current study examined the relationship between career performance (i.e. games played) and draft round for the National Football League, National Hockey League, National Basketball Association, and Major League Baseball for players drafted from 1980 to 1989 (n = 4874), against the assumption of a linear relationship between performance and draft round (i.e. that players with the most potential will be selected before players of lower potential). A two-step analysis revealed significant differences in games played across draft rounds (step 1) and a significant negative relationship between draft round and games played (step 2); however, the amount of variance accounted for was relatively low (less than 17%). Results highlight the challenges of accurately evaluating amateur talent. © 2011 John Wiley & Sons A/S.
Zhao, Kun; Smillie, Luke D
2015-08-01
Economic games are well-established experimental paradigms for modeling social decision making. A large body of literature has pointed to the heterogeneity of behavior within many of these games, which might be partly explained by broad interpersonal trait dispositions. Using the Big Five and HEXACO (Honesty-Humility, Emotionality, eXtraversion, Agreeableness, Conscientiousness, Openness to Experience) personality frameworks, we review the role of personality in two main classes of economic games: social dilemmas and bargaining games. This reveals an emerging role for Big Five agreeableness in promoting cooperative, egalitarian, and altruistic behaviors across several games, consistent with its core characteristic of maintaining harmonious interpersonal relations. The role for extraversion is less clear, which may reflect the divergent effects of its underlying agentic and affiliative motivational components. In addition, HEXACO honesty-humility and agreeableness may capture distinct aspects of prosocial behavior outside the bounds of the Five-Factor Model. Important considerations and directions for future studies are discussed within the emerging personality-economics interface. © 2014 by the Society for Personality and Social Psychology, Inc.
NASA Astrophysics Data System (ADS)
Ma, Junhai; Yang, Wenhui; Lou, Wandong
This paper establishes an oligopolistic game model under a carbon emission reduction constraint and investigates its complex characteristics, such as bifurcation and chaos. Two oligopolistic manufacturers form three mixed game models, with the aim of exploring how the operating system's status varies as the benchmark reward-penalty mechanism is upgraded. Firstly, we set up these basic models, distinguished by carbon emission quantity, and study them using different game methods. Then, we concentrate on one typical game model to further study the dynamic complexity of variations in the system status, through 2D bifurcation diagrams and 4D parameter adjustment features, based on the bounded rationality scheme for price and the adaptive scheme for carbon emission. The results show that the carbon emission constraint has a significant influence on the status variation of the two-oligopoly game operating system, whether it is stable or chaotic. Besides, the new carbon emission regulation meets the government's supervision target and achieves the goal of being environmentally friendly by motivating the system to operate with lower carbon emissions.
A game theoretic approach to a finite-time disturbance attenuation problem
NASA Technical Reports Server (NTRS)
Rhee, Ihnseok; Speyer, Jason L.
1991-01-01
A disturbance attenuation problem over a finite-time interval is considered by a game theoretic approach where the control, restricted to a function of the measurement history, plays against adversaries composed of the process and measurement disturbances, and the initial state. A zero-sum game, formulated as a quadratic cost criterion subject to linear time-varying dynamics and measurements, is solved by a calculus of variation technique. By first maximizing the quadratic cost criterion with respect to the process disturbance and initial state, a full information game between the control and the measurement residual subject to the estimator dynamics results. The resulting solution produces an n-dimensional compensator which expresses the controller as a linear combination of the measurement history. A disturbance attenuation problem is solved based on the results of the game problem. For time-invariant systems it is shown that under certain conditions the time-varying controller becomes time-invariant on the infinite-time interval. The resulting controller satisfies an H(infinity) norm bound.
Zhu, Zhengqiu; Chen, Bin; Qiu, Sihang; Wang, Rongxiao; Chen, Feiran; Wang, Yiping; Qiu, Xiaogang
2018-03-27
Chemical production activities in industrial districts pose great threats to the surrounding atmospheric environment and human health. Therefore, developing appropriate and intelligent pollution controlling strategies for the management team to monitor chemical production processes is significantly essential in a chemical industrial district. The literature shows that playing a chemical plant environmental protection (CPEP) game can force the chemical plants to be more compliant with environmental protection authorities and reduce the potential risks of hazardous gas dispersion accidents. However, results of the current literature strictly rely on several perfect assumptions which rarely hold in real-world domains, especially when dealing with human adversaries. To address bounded rationality and limited observability in human cognition, the CPEP game is extended to generate robust schedules of inspection resources for inspection agencies. The present paper makes the following innovative contributions: (i) the CPEP model is extended by taking the observation frequency and observation cost of adversaries into account, and thus better reflects industrial reality; (ii) uncertainties such as attackers with bounded rationality, attackers with limited observation, and incomplete information (i.e., the attacker's parameters) are integrated into the extended CPEP model; (iii) learning curve theory is employed to determine the attacker's observability in the game solver. Results in the case study imply that this work improves the decision-making process for environmental protection authorities in practical fields by bringing more rewards to the inspection agencies and by acquiring more compliance from chemical plants.
A Game Theoretic Fault Detection Filter
NASA Technical Reports Server (NTRS)
Chung, Walter H.; Speyer, Jason L.
1995-01-01
The fault detection process is modelled as a disturbance attenuation problem. The solution to this problem is found via differential game theory, leading to an H(sub infinity) filter which bounds the transmission of all exogenous signals save the fault to be detected. For a general class of linear systems which includes some time-varying systems, it is shown that this transmission bound can be taken to zero by simultaneously bringing the sensor noise weighting to zero. Thus, in the limit, a complete transmission block can be achieved, making the game filter into a fault detection filter. When we specialize this result to time-invariant systems, it is found that the detection filter attained in the limit is identical to the well-known Beard-Jones fault detection filter. That is, all fault inputs other than the one to be detected (the "nuisance faults") are restricted to an invariant subspace which is unobservable to a projection on the output. For time-invariant systems, it is also shown that in the limit, the order of the state-space and the game filter can be reduced by factoring out the invariant subspace. The result is a lower-dimensional filter which can observe only the fault to be detected. A reduced-order filter can also be generated for time-varying systems, though the computational overhead may be intensive. An example given at the end of the paper demonstrates the effectiveness of the filter as a tool for fault detection and identification.
2014-07-01
Unified Theory of Acceptance and Use of Technology, Structuration Model of Technology, Adaptive Structuration Theory, Model of Mutual Adaptation, Model of Technology Appropriation, Diffusion/Implementation Model, and Tri-core Model, among others [11]. ... simulation gaming, essay/scenario writing, genius forecasting, role play/acting, backcasting, SWOT, brainstorming, relevance tree/logic chart, scenario workshop
Linear game non-contextuality and Bell inequalities—a graph-theoretic approach
NASA Astrophysics Data System (ADS)
Rosicka, M.; Ramanathan, R.; Gnaciński, P.; Horodecki, K.; Horodecki, M.; Horodecki, P.; Severini, S.
2016-04-01
We study the classical and quantum values of a class of one- and two-party unique games that generalizes the well-known XOR games to the case of non-binary outcomes. In the bipartite case the generalized XOR (XOR-d) games we study are a subclass of the well-known linear games. We introduce a 'constraint graph' associated with such a game, with the constraints defining the game represented by an edge-coloring of the graph. We use the graph-theoretic characterization to relate the task of finding equivalent games to the notion of signed graphs and switching equivalence from graph theory. We relate the problem of computing the classical value of single-party anti-correlation XOR games to finding the edge bipartization number of a graph, which is known to be MaxSNP hard, and connect the computation of the classical value of XOR-d games to the identification of specific cycles in the graph. We construct an orthogonality graph of the game from the constraint graph and study its Lovász theta number as a general upper bound on the quantum value, even in the case of single-party contextual XOR-d games. XOR-d games possess appealing properties for use in device-independent applications such as randomness of the local correlated outcomes in the optimal quantum strategy. We study the possibility of obtaining quantum algebraic violation of these games, and show that no finite XOR-d game possesses the property of pseudo-telepathy, leaving the frequently used chained Bell inequalities as the natural candidates for such applications. We also show this lack of pseudo-telepathy for multi-party XOR-type inequalities involving two-body correlation functions.
Nagle, Aniket; Riener, Robert; Wolf, Peter
2015-01-01
Computer games are increasingly being used for training cognitive functions like working memory and attention among the growing population of older adults. While cognitive training games often include elements like difficulty adaptation, rewards, and visual themes to make the games more enjoyable and effective, the effect of different degrees of afforded user control in manipulating these elements has not been systematically studied. To address this issue, two distinct implementations of the three aforementioned game elements were tested among healthy older adults (N = 21, 69.9 ± 6.4 years old) playing a game-like version of the n-back task on a tablet at home for 3 weeks. Two modes were considered, differentiated by the afforded degree of user control of the three elements: user control of difficulty vs. automatic difficulty adaptation, difficulty-dependent rewards vs. automatic feedback messages, and user choice of visual theme vs. no choice. The two modes ("USER-CONTROL" and "AUTO") were compared for frequency of play, duration of play, and in-game performance. Participants were free to play the game whenever and for however long they wished. Participants in USER-CONTROL exhibited significantly higher frequency of playing, total play duration, and in-game performance than participants in AUTO. The results of the present study demonstrate the efficacy of providing user control in the three game elements, while validating a home-based study design in which participants were not bound by any training regimen, and could play the game whenever they wished. The results have implications for designing cognitive training games that elicit higher compliance and better in-game performance, with an emphasis on home-based training.
NASA Astrophysics Data System (ADS)
Antonenkov, D. V.; Solovev, D. B.
2017-10-01
The article covers aspects of forecasting and of accounting for the wholesale market environment when generating a power demand forecast. Major mining companies that operate in the present-day power market have to submit a reliable energy demand request for a certain time period ahead, thus ensuring a sufficient reduction of the financial losses associated with deviations of the actual power demand from the expected figures. Normally, under the power supply agreement, the consumer is bound to provide a per-month and per-hour request annually, which means that the consumer has to generate one-month-ahead short-term and medium-term hourly forecasts. The authors found that the empirical distributions of the power demand of "Yakutugol", Holding Joint Stock Company, belong to the stable rank-parameter H-distribution type, which can be used for generating forecasts based on extrapolation of the distribution parameters. For this reason they justify the need to apply mathematical rank analysis in short-term forecasting of the contracted power demand of the "Neryungri" coal strip mine, a component of the technocenosis-type system of the mining company "Yakutugol", Holding JSC.
Data-driven forecasting algorithms for building energy consumption
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram
2013-04-01
This paper introduces two forecasting methods for building energy consumption data that are recorded from smart meters in high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods involve forecasting individual loads on the basis of their measurement history and weather data, without using complicated models of the building systems. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process has been applied to forecast the load profiles and their uncertainty bounds a day ahead. These methods are computationally simple and adaptive, and thus suitable for analyzing a large set of data whose pattern changes over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict those energy consumption data with high accuracy.
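As an illustration of the second method, here is a minimal Gaussian process sketch on synthetic hourly loads using scikit-learn; the kernel choice and data are ours, not the paper's. It produces a day-ahead mean forecast with uncertainty bounds.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

rng = np.random.default_rng(0)
t = np.arange(24 * 7, dtype=float)[:, None]            # one week of hourly stamps
load = (50 + 10 * np.sin(2 * np.pi * t[:, 0] / 24)     # daily consumption cycle
        + rng.normal(0, 1.0, t.shape[0]))              # measurement noise

# Periodic kernel for the daily cycle plus a white-noise term.
kernel = ExpSineSquared(length_scale=1.0, periodicity=24.0) + WhiteKernel(1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, load)

t_ahead = np.arange(24 * 7, 24 * 8, dtype=float)[:, None]  # day-ahead horizon
mean, std = gp.predict(t_ahead, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std        # ~95% bounds
print(np.round(np.c_[mean[:3], lower[:3], upper[:3]], 2))
```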
Optimal strategies in the neighborhood of a collision course
NASA Technical Reports Server (NTRS)
Gutman, S.; Leitmann, G.
1976-01-01
We consider a simple differential game between pursuer P and evader E in the neighborhood of a nominal collision course. The payoff is the terminal lateral miss-distance. The control of each player is his acceleration normal to his velocity vector, and both players' controls are bounded. Saddlepoint strategies are deduced for three combinations of the acceleration bounds and are shown to be related to the sign of the derivative of the orientation of the line of sight (L.O.S.).
Bounded rationality in volunteering public goods games.
Xu, Zhaojin; Wang, Zhen; Zhang, Lianzhong
2010-05-07
How to maintain high levels of cooperation among selfish individuals is one of the fundamental problems in biology and the social sciences. Theorists have presented an effective mechanism for promoting cooperation by allowing for voluntary participation in public goods games. But Nash's theory predicts that no one can do better or worse than loners (players unwilling to join the public goods game) in the long run, and that the frequency of participants is independent of the loners' payoff. In this paper, we introduce a degree of rationality and investigate the model by means of an approximate best-response dynamics. Our research shows that, with this introduction, the payoffs of the loners have a significant effect in anonymous voluntary public goods games, and that the dynamics drives the system to a fixed point which differs from the Nash equilibrium. In addition, we qualitatively explain the existing experimental results. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
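A minimal sketch of logit (approximate best-response) dynamics for a three-strategy population of cooperators, defectors, and loners; the mean-field payoff function below is a simplified stand-in for the paper's volunteering public goods game, with parameter names and values of our choosing.

```python
import numpy as np

r, sigma, beta = 3.0, 1.0, 5.0   # multiplication factor, loner payoff, rationality

def payoffs(x):
    """Simplified mean-field payoffs for (cooperator, defector, loner)."""
    xc, xd, xl = x
    participating = xc + xd
    if participating < 1e-9:          # nobody plays: everyone earns the loner payoff
        return np.array([sigma, sigma, sigma])
    share = r * xc / participating    # public good returned per participant
    return np.array([share - 1.0,     # cooperators pay a unit cost
                     share,           # defectors free-ride
                     sigma])          # loners opt out

x = np.array([0.3, 0.4, 0.3])         # initial strategy frequencies
for _ in range(500):                  # damped logit update toward a fixed point
    p = np.exp(beta * payoffs(x))
    x = 0.9 * x + 0.1 * p / p.sum()
print("fixed point (C, D, L):", x.round(3))
```

The fixed point reached this way generally depends on the rationality parameter beta, illustrating the abstract's point that the approximate best-response dynamics can settle away from the Nash equilibrium.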
Single-axis gyroscopic motion with uncertain angular velocity about spin axis
NASA Technical Reports Server (NTRS)
Singh, S. N.
1977-01-01
A differential game approach is presented for studying the response of a gyro by treating the controlled angular velocity about the input axis as the evader and the bounded but uncertain angular velocity about the spin axis as the pursuer. When the uncertain angular velocity about the spin axis attempts to force the gyro to saturation, a differential game problem with two terminal surfaces results, whereas when the evader desires to attain the equilibrium state, the usual game with a single terminal manifold arises. A barrier delineating the capture zone (CZ), in which the gyro can attain saturation, and the escape zone (EZ), in which the evader avoids saturation, is obtained. The CZ is further divided into two subregions such that the states in each subregion can be forced onto a definite target manifold. The application of the game-theoretic approach to a Control Moment Gyro is briefly discussed.
Interval forecasting of cyber-attacks on industrial control systems
NASA Astrophysics Data System (ADS)
Ivanyo, Y. M.; Krakovsky, Y. M.; Luzgin, A. N.
2018-03-01
At present, cyber-security issues of industrial control systems occupy one of the key niches in state systems of planning and management. Functional disruption of these systems via cyber-attacks may lead to emergencies involving loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. There is therefore an urgent need to develop protection methods against cyber-attacks. This paper presents results of cyber-attack interval forecasting with a pre-set intensity level of cyber-attacks. Interval forecasting predicts which of two predetermined intervals will contain a future value of the indicator; probability estimates of these events are used for this purpose. For the interval forecasting, a probabilistic neural network with a dynamically updated smoothing parameter was used. The dividing bound between the intervals was determined by a calculation method based on statistical characteristics of the indicator. The indicator was the number of cyber-attacks per hour received through a honeypot from March to September 2013 for the group 'zeppo-norcal'.
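The core mechanic (pick one of two intervals by estimating its probability) can be sketched compactly. Below, the dividing bound is taken as the historical median and a Parzen-window estimator plays the role of the probabilistic neural network; the dynamic smoothing-parameter update is omitted, and the bandwidth, lag count and Poisson stand-in data are assumptions.

```python
import numpy as np

def interval_forecast(series, lags=3, bandwidth=5.0):
    """Forecast which of two intervals (below/above a dividing bound) the next
    value will fall into, via a Parzen-window class-probability estimate."""
    x = np.asarray(series, dtype=float)
    bound = np.median(x)                      # dividing bound from statistics
    X = np.array([x[i:i + lags] for i in range(len(x) - lags)])
    y = (x[lags:] > bound).astype(int)        # 1 = upper interval
    query = x[-lags:]
    # Gaussian kernel weights between the query pattern and history patterns.
    w = np.exp(-np.sum((X - query) ** 2, axis=1) / (2 * bandwidth ** 2))
    p_upper = np.sum(w * y) / np.sum(w)
    return bound, p_upper

rng = np.random.default_rng(2)
attacks = rng.poisson(20, 500)                # stand-in for hourly attack counts
bound, p = interval_forecast(attacks)
print(f"bound={bound}, P(next count > bound)={p:.2f}")
```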
Evaluating Moving Target Defense with PLADD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Stephen T.; Outkin, Alexander V.; Gearhart, Jared Lee
This project evaluates the effectiveness of moving target defense (MTD) techniques using a new game we have designed, called PLADD, inspired by the game FlipIt [28]. PLADD extends FlipIt by incorporating what we believe are key MTD concepts. We have analyzed PLADD and proven the existence of a defender strategy that pushes a rational attacker out of the game, demonstrated how limited the strategies available to an attacker are in PLADD, and derived analytic expressions for the expected utility of the game's players in multiple game variants. We have created an algorithm for finding a defender's optimal PLADD strategy. We show that in the special case of achieving deterrence in PLADD, MTD is not always cost effective and that its optimal deployment may shift abruptly from not using MTD at all to using it as aggressively as possible. We believe our effort provides basic, fundamental insights into the use of MTD, but conclude that a truly practical analysis requires model selection and calibration based on real scenarios and empirical data. We propose several avenues for further inquiry, including (1) agents with adaptive capabilities more reflective of real-world adversaries, (2) the presence of multiple, heterogeneous adversaries, (3) computational game theory-based approaches such as coevolution to allow scaling to the real world beyond the limitations of analytical analysis and classical game theory, (4) mapping the game to real-world scenarios, (5) taking player risk into account when designing a strategy (in addition to expected payoff), (6) improving our understanding of the dynamic nature of MTD-inspired games by using a martingale representation, defensive forecasting, and techniques from signal processing, and (7) using adversarial games to develop inherently resilient cyber systems.
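Since PLADD itself is not specified in this record, the sketch below simulates the simpler FlipIt-style baseline it extends: a defender who retakes the resource periodically against a Poisson attacker, with utilities equal to the fraction of time in control minus move costs. The period, attack rate and costs are invented parameters, not values from the report.

```python
import numpy as np

def flipit_utilities(defender_period, attack_rate, horizon=1e4,
                     cost_d=0.1, cost_a=0.2, seed=0):
    """Monte Carlo utilities in a simplified FlipIt-style takeover game:
    the defender moves periodically; the attacker moves at Poisson times."""
    rng = np.random.default_rng(seed)
    t, owner, d_time, n_d, n_a = 0.0, 'D', 0.0, 0, 0
    next_d = defender_period
    next_a = rng.exponential(1 / attack_rate)
    while t < horizon:
        t_next = min(next_d, next_a, horizon)
        if owner == 'D':
            d_time += t_next - t              # credit control time to the defender
        t = t_next
        if t == next_d:
            owner, n_d, next_d = 'D', n_d + 1, next_d + defender_period
        elif t == next_a:
            owner, n_a = 'A', n_a + 1
            next_a = t + rng.exponential(1 / attack_rate)
    u_d = d_time / horizon - cost_d * n_d / horizon
    u_a = (1 - d_time / horizon) - cost_a * n_a / horizon
    return u_d, u_a

print(flipit_utilities(defender_period=2.0, attack_rate=0.4))
```

Sweeping `defender_period` in such a simulation is one crude way to see the abrupt shift between "never move" and "move aggressively" regimes the report describes.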
Using systems gaming to explore decision-making under uncertainty in natural hazard crises
NASA Astrophysics Data System (ADS)
McCaughey, Jamie W.; Finnigan, David
2017-04-01
Faced with uncertain scientific forecasts of a potential hazard, it is perhaps natural to wait and see. As we wait, uncertainties do decrease, but so do our options to minimise impacts of the hazard. This tradeoff is fundamental to preparing for natural hazards, yet difficult to communicate. Interactive systems gaming is one promising way forward. We are developing in-person interactive games, drawing on role-playing and other table-top scenario exercises in natural hazards, as well as on game-based modeling of complex systems. Our games model an unfolding natural hazard crisis (such as volcanic unrest or an approaching typhoon) as a complex social-physical system. Participants take on the roles of diverse stakeholder groups (including government, scientists, media, farmers, city residents, and others) with differing expertise, responsibilities, and priorities. Interactions among these groups play out in a context of decreasing scientific uncertainty and decreasing options for actions to reduce societal risk. Key design challenges are (1) to engage players without trivialising the real-world context; (2) to provide the right level of guidance for players to navigate the system; and (3) to enable players to face realistic tradeoffs and see realistic consequences of their choices, without feeling frustrated that the game is set up for them to fail. We will first prototype the games with general public and secondary-school participants, then adjust this for specialist groups working in disaster management. We will illustrate participatory systems gaming techniques in our presentation 'A toolkit of systems gaming techniques' in the companion EGU session EOS6: 'Perform! A platform to discuss art & science projects with live presentation'.
A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty
NASA Astrophysics Data System (ADS)
Ohmi, Masataro; Mori, Hiroyuki
In this paper, an efficient method based on Gaussian Processes is proposed for short-term load forecasting. Short-term load forecasting plays a key role in smooth power system operation, such as economic load dispatch and unit commitment. Recently, the deregulated and competitive power market has increased the degree of uncertainty, making better prediction results more important for saving costs. In particular, power system operators need upper and lower bounds on the predicted load to deal with this uncertainty, in addition to more accurate point predictions. The proposed method is based on a Bayes model in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper employs Gaussian Processes, which combine the Bayes linear model with a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
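The upper and lower bounds the operators need fall directly out of the Gaussian-process machinery: with training inputs X, targets y, kernel k and noise variance sigma_n^2, the predictive distribution at a new point x_* is Gaussian with (standard textbook notation, not notation taken from this paper)

```latex
\mu_* = k_*^{\top} (K + \sigma_n^2 I)^{-1} \mathbf{y}, \qquad
\sigma_*^2 = k(x_*, x_*) - k_*^{\top} (K + \sigma_n^2 I)^{-1} k_* ,
```

so that, for example, mu_* plus or minus 2 sigma_* gives an approximate 95% band for the predicted load.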
How Interplanetary Scintillation Data Can Improve Modeling of Coronal Mass Ejection Propagation
NASA Astrophysics Data System (ADS)
Taktakishvili, A.; Mays, M. L.; Manoharan, P. K.; Rastaetter, L.; Kuznetsova, M. M.
2017-12-01
Coronal mass ejections (CMEs) can have a significant impact on the Earth's magnetosphere-ionosphere system, cause widespread anomalies for satellites from geosynchronous to low-Earth orbit, and produce effects such as geomagnetically induced currents. At the NASA/GSFC Community Coordinated Modeling Center we have been using ensemble modeling of CMEs since 2012. In this presentation we demonstrate that using interplanetary scintillation (IPS) observations from the Ooty Radio Telescope facility in India can help track CME propagation and improve ensemble forecasting of CMEs. Using IPS observations of solar wind density and velocity from hundreds of distant sources in ensemble modeling of CMEs could be a game-changing improvement of the current state of the art in CME forecasting.
Emergence of scale-free characteristics in socio-ecological systems with bounded rationality
Kasthurirathna, Dharshana; Piraveenan, Mahendra
2015-01-01
Socio-ecological systems are increasingly modelled by games played on complex networks. While the concept of Nash equilibrium assumes perfect rationality, in reality players display heterogeneous bounded rationality. Here we present a topological model of bounded rationality in socio-ecological systems, using the rationality parameter of the Quantal Response Equilibrium. We argue that system rationality can be measured by the average Kullback–Leibler divergence between Nash and Quantal Response Equilibria, and that convergence towards Nash equilibria on average corresponds to increased system rationality. Using this model, we show that when a randomly connected socio-ecological system is topologically optimised to converge towards Nash equilibria, scale-free and small-world features emerge. Therefore, optimising system rationality is an evolutionary reason for the emergence of scale-free and small-world features in socio-ecological systems. Further, we show that in games where multiple equilibria are possible, the correlation between the scale-freeness of the system and the fraction of links with multiple equilibria goes through a rapid transition when the average system rationality increases. Our results explain the influence of the topological structure of socio-ecological systems in shaping their collective cognitive behaviour, and provide an explanation for the prevalence of scale-free and small-world characteristics in such systems.
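The rationality measure is straightforward to compute for a small example. This sketch finds the symmetric logit Quantal Response Equilibrium of an assumed 2x2 Prisoner's Dilemma by damped fixed-point iteration and reports KL(Nash || QRE), which shrinks toward zero as the rationality parameter grows; the payoff matrix and game choice are illustrative assumptions.

```python
import numpy as np

# Row player's payoffs in a symmetric Prisoner's Dilemma (assumed example).
A = np.array([[3.0, 0.0],    # Cooperate vs (C, D)
              [5.0, 1.0]])   # Defect    vs (C, D)

def logit_qre(A, lam, iters=500):
    """Fixed-point iteration for the symmetric logit Quantal Response Equilibrium."""
    p = np.full(2, 0.5)                     # start from uniform play
    for _ in range(iters):
        u = A @ p                           # expected payoff of each pure action
        q = np.exp(lam * u)
        p = 0.5 * p + 0.5 * q / q.sum()     # damped update for stability
    return p

nash = np.array([0.0, 1.0])                 # all-defect Nash equilibrium
for lam in (0.1, 1.0, 10.0):
    qre = logit_qre(A, lam)
    # KL(Nash || QRE): only the Nash support contributes (QRE is interior).
    kl = sum(pn * np.log(pn / pq) for pn, pq in zip(nash, qre) if pn > 0)
    print(f"lambda={lam:5.1f}  QRE={qre.round(3)}  KL={kl:.3f}")
```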
The amazing evolutionary dynamics of non-linear optical systems with feedback
NASA Astrophysics Data System (ADS)
Yaroslavsky, Leonid
2013-09-01
Optical systems with feedback are, in general, non-linear dynamic systems and as such exhibit evolutionary behavior. In this paper we present results of an experimental investigation of the evolutionary dynamics of several models of such systems. The models are modifications of the famous mathematical "Game of Life", and the modifications are two-fold: the "Game of Life" rules are made stochastic, and the mutual influence of cells is made spatially non-uniform. A number of new phenomena in the evolutionary dynamics of the models are revealed. "Ordering of chaos": formation, from seed patterns, of stable maze-like patterns with chaotic "dislocations" that resemble patterns frequently found in nature, such as the skin patterns of some animals and fishes, sea shells, fingerprints and magnetic domain patterns; these patterns and their fragments exhibit a remarkable capability of unlimited growth. "Self-controlled growth": chaotic "live" formations grow into "communities" bounded, depending on the model, by a square, hexagon or octagon, until they reach a certain critical size, after which the growth stops. "Eternal life in a bounded space": "communities" persist after reaching a certain size and shape. "Coherent shrinkage": "mature" "communities", after reaching a certain size, collapse into one of several stable or oscillating patterns, preserving the isomorphism of their bounding shapes until the very end.
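The first of the two modifications (stochastic rules) is simple to reproduce. In this numpy sketch each cell applies Conway's rule with probability p_rule and otherwise keeps its state; the spatially non-uniform cell influence of the paper's models is not included, and the grid size, seed density and p_rule are assumptions.

```python
import numpy as np

def step(grid, p_rule=0.95, rng=None):
    """One update of a stochastic Game of Life: each cell follows Conway's
    rule with probability p_rule and keeps its old state otherwise."""
    rng = rng or np.random.default_rng()
    # Count the 8 Moore neighbours with periodic boundaries.
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    conway = np.where(grid == 1, (n == 2) | (n == 3), n == 3).astype(int)
    apply_rule = rng.random(grid.shape) < p_rule
    return np.where(apply_rule, conway, grid)

rng = np.random.default_rng(3)
grid = (rng.random((64, 64)) < 0.2).astype(int)   # random seed pattern
for _ in range(100):
    grid = step(grid, p_rule=0.95, rng=rng)
print("live cells after 100 steps:", grid.sum())
```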
Planning Responses to Demographic Change. AIR 1986 Annual Forum Paper.
ERIC Educational Resources Information Center
Taylor, Bryan J. R.; Taylor, Elizabeth A.
A method for forecasting the number of college graduates in the United Kingdom is described, and suggestions are offered about ways that society should react to influence declining enrollments and potential reductions in technologically skilled graduates. Consideration is given to the implications of recruiting noncollege-bound individuals to…
NASA Technical Reports Server (NTRS)
Rhee, Ihnseok; Speyer, Jason L.
1990-01-01
A game theoretic controller is developed for a linear time-invariant system with parameter uncertainties in the system and input matrices. The input-output decomposition modeling of the plant uncertainty is adopted. The uncertain dynamic system is represented as an internal feedback loop in which the system is assumed to be forced by a fictitious disturbance caused by the parameter uncertainty. By considering the input and the fictitious disturbance as two noncooperative players, a differential game problem is constructed. It is shown that the resulting time-invariant controller stabilizes the uncertain system for a prescribed uncertainty bound. This game theoretic controller is applied to the momentum management and attitude control of the Space Station in the presence of uncertainties in the moments of inertia. Inclusion of the external disturbance torque in the design procedure results in a dynamical feedback controller consisting of conventional PID control and a cyclic disturbance rejection filter. It is shown that the game theoretic design, compared to LQR or pole placement designs, improves stability robustness with respect to inertia variations.
Adaptive Multi-Agent Systems for Constrained Optimization
NASA Technical Reports Server (NTRS)
Macready, William; Bieniawski, Stefan; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory is a new framework for analyzing and controlling distributed systems. Here we demonstrate its use for distributed stochastic optimization. First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the probability distribution over the joint state of the agents. When the game in question is a team game with constraints, that equilibrium optimizes the expected value of the team game utility, subject to those constraints. The updating of the Lagrange parameters in the Lagrangian can be viewed as a form of automated annealing that focuses the MAS more and more on the optimal pure strategy. This provides a simple way to map the solution of any constrained optimization problem onto the equilibrium of a Multi-Agent System (MAS). We present computer experiments involving both the Queens problem and K-SAT, validating the predictions of PD theory and its use for off-the-shelf distributed adaptive optimization.
NASA Astrophysics Data System (ADS)
Sun, Jingliang; Liu, Chunsheng
2018-01-01
In this paper, the problem of intercepting a manoeuvring target within a fixed final time is posed in a non-linear constrained zero-sum differential game framework. The Nash equilibrium solution is found by solving the finite-horizon constrained differential game problem via an adaptive dynamic programming technique. In addition, a suitable non-quadratic functional is utilised to encode the control constraints into the differential game problem. A single critic network with constant weights and time-varying activation functions is constructed to approximate the solution of the associated time-varying Hamilton-Jacobi-Isaacs equation online. To properly satisfy the terminal constraint, an additional error term is incorporated in a novel weight-updating law such that the terminal constraint error is also minimised over time. By utilising Lyapunov's direct method, the closed-loop differential game system and the estimation weight error of the critic network are proved to be uniformly ultimately bounded. Finally, the effectiveness of the proposed method is demonstrated using a simple non-linear system and a non-linear missile-target interception system, assuming first-order dynamics for the interceptor and target.
The Price of Anarchy in Network Creation Games Is (Mostly) Constant
NASA Astrophysics Data System (ADS)
Mihalák, Matúš; Schlegel, Jan Christoph
We study the price of anarchy and the structure of equilibria in network creation games. A network creation game (first defined and studied by Fabrikant et al. [4]) is played by n players {1, 2, ..., n}, each identified with a vertex of a graph (network), where the strategy of player i, i = 1, ..., n, is to build some edges adjacent to i. The cost of building an edge is α > 0, a fixed parameter of the game. The goal of every player is to minimize its creation cost plus its usage cost. The creation cost of player i is α times the number of built edges. In the SumGame (the original variant of Fabrikant et al. [4]) the usage cost of player i is the sum of distances from i to every node of the resulting graph. In the MaxGame (variant defined and studied by Demaine et al. [3]) the usage cost is the eccentricity of i in the resulting graph of the game. In this paper we improve previously known bounds on the price of anarchy of the game (of both variants) for various ranges of α, and give new insights into the structure of equilibria for various values of α. The two main results of the paper show that for α > 273·n all equilibria in SumGame are trees and thus the price of anarchy is constant, and that for α > 129 all equilibria in MaxGame are trees and the price of anarchy is constant. For SumGame this (almost) answers one of the basic open problems in the field (is the price of anarchy of the network creation game constant for all values of α?) in an affirmative way, up to a tiny range of α.
On Nash-Equilibria of Approximation-Stable Games
NASA Astrophysics Data System (ADS)
Awasthi, Pranjal; Balcan, Maria-Florina; Blum, Avrim; Sheffet, Or; Vempala, Santosh
One reason for wanting to compute an (approximate) Nash equilibrium of a game is to predict how players will play. However, if the game has multiple equilibria that are far apart, or ɛ-equilibria that are far in variation distance from the true Nash equilibrium strategies, then this prediction may not be possible even in principle. Motivated by this consideration, in this paper we define the notion of games that are approximation stable, meaning that all ɛ-approximate equilibria are contained inside a small ball of radius Δ around a true equilibrium, and investigate a number of their properties. Many natural small games such as matching pennies and rock-paper-scissors are indeed approximation stable. We show furthermore there exist 2-player n-by-n approximation-stable games in which the Nash equilibrium and all approximate equilibria have support Ω(log n). On the other hand, we show all (ɛ, Δ) approximation-stable games must have an ɛ-equilibrium of support O((Δ^{2-o(1)}/ɛ²) log n), yielding an immediate n^{O((Δ^{2-o(1)}/ɛ²) log n)}-time algorithm, improving over the bound of [11] for games satisfying this condition. We in addition give a polynomial-time algorithm for the case that Δ and ɛ are sufficiently close together. We also consider an inverse property, namely that all non-approximate equilibria are far from some true equilibrium, and give an efficient algorithm for games satisfying that condition.
Climatic Forecasting of Net Infiltration at Yucca Mountain Using Analogue Meteorological Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
B. Faybishenko
At Yucca Mountain, Nevada, future changes in climatic conditions will most likely alter net infiltration, i.e., drainage below the bottom of the evapotranspiration zone within the soil profile, or flow across the interface between the soil and the densely welded part of the Tiva Canyon Tuff. The objectives of this paper are to: (a) develop a semi-empirical model and forecast average net infiltration rates, using limited meteorological data from analogue meteorological stations, for the interglacial (present-day), and future monsoon, glacial transition, and glacial climates over the Yucca Mountain region; and (b) corroborate the computed net-infiltration rates by comparing them with empirically and numerically determined groundwater recharge and percolation rates through the unsaturated zone from published data. The author presents an approach for calculating net infiltration, aridity, and precipitation-effectiveness indices using a modified Budyko water-balance model, with reference-surface potential evapotranspiration determined from the radiation-based Penman (1948) formula. Results of the calculations show that net infiltration rates are expected to increase generally from the present-day climate to the monsoon climate, to the glacial transition climate, and then to the glacial climate. The forecasting results indicate overlap between the ranges of net infiltration for different climates. For example, the mean glacial net-infiltration rate corresponds to the upper-bound glacial transition net infiltration, and the lower-bound glacial net infiltration corresponds to the glacial transition mean net infiltration. Forecasting of net infiltration for different climate states is subject to numerous uncertainties associated with selecting climate analogue sites, using relatively short analogue meteorological records, neglecting the effects of vegetation and surface runoff and run-on at the local scale, and possible anthropogenic climate changes.
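For readers who want the flavor of the water-balance calculation, here is a hedged sketch built on the classical Budyko curve, with potential evapotranspiration treated as an input (the paper uses a modified Budyko model with Penman PET and analogue-station data). The climate-state numbers below are placeholders, not site data.

```python
import numpy as np

def budyko_et(precip, pet):
    """Actual evapotranspiration from the classical Budyko (1974) curve."""
    phi = pet / precip                          # aridity index PET/P
    return precip * np.sqrt(phi * np.tanh(1 / phi) * (1 - np.exp(-phi)))

def net_infiltration(precip, pet):
    """Crude water balance: precipitation not evaporated is treated as
    potential net infiltration (runoff and vegetation effects neglected)."""
    return max(precip - budyko_et(precip, pet), 0.0)

# Illustrative climate-state inputs (mm/yr), invented for demonstration:
for name, p, pet in [("present-day", 170, 1200),
                     ("monsoon", 300, 1100),
                     ("glacial transition", 300, 900)]:
    print(f"{name:18s} aridity={pet / p:5.1f}  "
          f"net infiltration ~ {net_infiltration(p, pet):6.1f} mm/yr")
```

Note how the formula behaves at high aridity: for phi much greater than 1, ET approaches P and net infiltration approaches zero, consistent with the arid present-day case.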
Making detailed predictions makes (some) predictions worse
NASA Astrophysics Data System (ADS)
Kelly, Theresa F.
In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.
Similarity-based cooperation and spatial segregation
NASA Astrophysics Data System (ADS)
Traulsen, Arne; Claussen, Jens Christian
2004-10-01
We analyze a cooperative game, where the cooperative act is not based on the previous behavior of the coplayer, but on the similarity between the players. This system has been studied in a mean-field description recently [A. Traulsen and H. G. Schuster, Phys. Rev. E 68, 046129 (2003)]. Here, the spatial extension to a two-dimensional lattice is studied, where each player interacts with eight players in a Moore neighborhood. The system shows a strong segregation independent of parameters. The introduction of a local conversion mechanism towards tolerance allows for four-state cycles and the emergence of spiral waves in the spatial game. In the case of asymmetric costs of cooperation a rich variety of complex behavior is observed depending on both cooperation costs. Finally, we study the stabilization of a cooperative fixed point of a forecast rule in the symmetric game, which corresponds to cooperation across segregation borders. This fixed point becomes unstable for high cooperation costs, but can be stabilized by a linear feedback mechanism.
Business Simulations in Language Teaching.
ERIC Educational Resources Information Center
Westerfield, Kay J.; And Others
This paper describes a pilot project, conducted within the American English Institute at the University of Oregon, on the use of a published business-oriented management simulation in English language training for university-bound international students. The management game simulated competition among a group of manufacturing companies to acquire…
Puerto Rican Children's Informal Education at Home. Final Report.
ERIC Educational Resources Information Center
Jacob, Evelyn
Observations of children's daily activities and interviews with the children's caretakers provided information on preschool children's informal home education in Utuado, Puerto Rico. Three kinds of skills were examined: literacy, chores, and rule-bound games. The unit of analysis was the "Potential Learning Activity" (PLA), a…
76 FR 79567 - Tribal Background Investigations and Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-22
... development, self-sufficiency, and strong tribal governments; ensuring that the Indian tribe is the primary... federal agency for releasing personal information. Tribes are not necessarily bound by those restrictions... a tribe or a gaming operation. Failure to consent to the disclosures indicated in this notice will...
Game of life on phyllosilicates: Gliders, oscillators and still life
NASA Astrophysics Data System (ADS)
Adamatzky, Andrew
2013-10-01
A phyllosilicate is a sheet of silicate tetrahedra bound by basal oxygens. A phyllosilicate automaton is a regular network of finite state machines (silicon nodes and oxygen nodes) which mimics the structure of a phyllosilicate. A node takes state 0 or 1. Each node updates its state in discrete time depending on the sum of the states of its three (silicon) or six (oxygen) neighbours. Phyllosilicate automata exhibit localisations attributed to Conway's Game of Life: gliders, oscillators, still lifes, and a glider gun. Configurations and behaviour of typical localisations, and interactions between the localisations, are illustrated.
Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian
2014-01-01
Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.
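A stripped-down version of the ensemble idea can be put together with standard tools. The sketch below trains several small feed-forward networks that differ only in weight initialization and uses the spread of their predictions as an uncertainty band; the NARX structure, the exogenous NWP/CFD inputs and the particle swarm training of the paper are replaced by simpler stand-ins, and the synthetic data are an assumption.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
# Synthetic stand-in for wind-speed data: lagged inputs -> next value.
t = np.arange(1000.0)
wind = 8 + 2 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 0.5, t.size)
lags = 6
X = np.array([wind[i:i + lags] for i in range(len(wind) - lags)])
y = wind[lags:]

# Ensemble over random weight initializations (the paper additionally varies
# the network structure and trains with particle swarm optimization).
preds = []
for seed in range(10):
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=1000,
                       random_state=seed).fit(X[:-100], y[:-100])
    preds.append(net.predict(X[-100:]))
preds = np.array(preds)
mean, std = preds.mean(axis=0), preds.std(axis=0)   # spread = model uncertainty
print("forecast:", mean[:3].round(2),
      "lower:", (mean[:3] - 2 * std[:3]).round(2),
      "upper:", (mean[:3] + 2 * std[:3]).round(2))
```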
NASA Astrophysics Data System (ADS)
Sahu, S.; Beig, G.; Schultz, M.; Parkhi, N.; Stein, O.
2012-04-01
The megacity of Delhi is the second largest urban agglomeration in India, with 16.7 million inhabitants. Delhi has the highest per capita electricity consumption in India, and demand has risen by more than 50% during the last decade. Emissions from the commercial, power, domestic and industrial sectors have strongly increased, causing growing environmental problems due to air pollution and its adverse impacts on human health. Particulate matter (PM) of sizes less than 2.5 microns (PM2.5) and 10 microns (PM10) has emerged as the primary pollutant of concern due to its adverse impact on human health. As part of the System of Air quality Forecasting and Research (SAFAR) project developed for air quality forecasting during the Commonwealth Games (CWG) 2010, a high-resolution Emission Inventory (EI) of PM10 and PM2.5 has been developed for the metropolitan city of Delhi for the year 2010. The comprehensive inventory involves detailed activity data and covers a domain of 70 km × 65 km at 1.67 km × 1.67 km resolution, spanning Delhi and its surrounding region (the National Capital Region, NCR). In creating this inventory, Geographical Information System (GIS) based techniques were used for the first time in India. The major sectors considered are transport, thermal power plants, industry, and residential and commercial cooking, along with windblown road dust, which is found to play a major role in the megacity environment. Extensive surveys were conducted among Delhi slum dwellers (jhuggi) in order to obtain more robust estimates of the activity data related to domestic cooking and heating. Total emissions of PM10 and PM2.5, including windblown dust, over the study area are found to be 236 Gg/yr and 94 Gg/yr, respectively. About half of the PM10 emissions stem from windblown road dust. The new emission inventory has been used for regional air quality forecasts in the Delhi region during the Commonwealth Games (SAFAR project), and it will soon be tested in simulations of the global atmospheric composition in the framework of the European MACC project, which provided the chemical boundary conditions for the regional air quality forecasts in 2010.
New Approaches to Exciting Exergame-Experiences for People with Motor Function Impairments.
Eckert, Martina; Gómez-Martinho, Ignacio; Meneses, Juan; Martínez, José-Fernán
2017-02-12
The work presented here suggests new ways to tackle exergames for physical rehabilitation and to improve the players' immersion and involvement. The primary (but not exclusive) purpose is to increase the motivation of children and adolescents with severe physical impairments, for doing their required exercises while playing. The proposed gaming environment is based on the Kinect sensor and the Blender Game Engine. A middleware has been implemented that efficiently transmits the data from the sensor to the game. Inside the game, different newly proposed mechanisms have been developed to distinguish pure exercise-gestures from other movements used to control the game (e.g., opening a menu). The main contribution is the amplification of weak movements, which allows the physically impaired to have similar gaming experiences as the average population. To test the feasibility of the proposed methods, four mini-games were implemented and tested by a group of 11 volunteers with different disabilities, most of them bound to a wheelchair. Their performance has also been compared to that of a healthy control group. Results are generally positive and motivating, although there is much to do to improve the functionalities. There is a major demand for applications that help to include disabled people in society and to improve their life conditions. This work will contribute towards providing them with more fun during exercise.
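The movement-amplification contribution can be illustrated with a few lines of vector arithmetic. The gain, dead zone and clipping radius below are invented values, and the function is a sketch of the idea rather than the authors' implementation.

```python
import numpy as np

def amplify(joint_pos, rest_pos, gain=3.0, deadzone=0.02, max_reach=0.6):
    """Map a weak physical movement to a larger in-game movement:
    displacements from the rest pose below `deadzone` are ignored, the rest
    are scaled by `gain` and clipped to `max_reach` (all units in metres)."""
    joint_pos, rest_pos = np.asarray(joint_pos), np.asarray(rest_pos)
    d = joint_pos - rest_pos
    if np.linalg.norm(d) < deadzone:
        return rest_pos                       # treat tiny jitter as no movement
    scaled = d * gain
    if np.linalg.norm(scaled) > max_reach:
        scaled *= max_reach / np.linalg.norm(scaled)
    return rest_pos + scaled

# A 5 cm hand movement is rendered as a 15 cm in-game movement:
print(amplify([0.05, 0.0, 0.0], [0.0, 0.0, 0.0]))
```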
The socialization effect on decision making in the Prisoner's Dilemma game: An eye-tracking study
Myagkov, Mikhail G.; Harriff, Kyle
2017-01-01
We used a mobile eye-tracking system (in the form of glasses) to study the characteristics of visual perception during decision making in the Prisoner's Dilemma game. In each experiment, one of the 12 participants was equipped with eye-tracking glasses. The experiment was conducted in three stages: an anonymous Individual Game stage against a randomly chosen partner (one of the 12 other participants of the experiment); a Socialization stage, in which the participants were divided into two groups; and a Group Game stage, in which the participants played with partners from the groups. After each round, the respondent received information about his or her personal score in the last round and the current overall winner of the game. The study shows that eye-tracking systems can be used for studying the process of decision making and for forecasting. The total viewing time and the time of fixation on areas corresponding to noncooperative decisions are related to the participants' overall level of cooperation: increases in both reflect a preference for noncooperative decisions and a decrease in the overall level of cooperation. The number of fixations on the group attributes is associated with group identity, but does not necessarily lead to cooperative behavior.
Scenario Generation and Assessment Framework Solution in Support of the Comprehensive Approach
2010-04-01
NASA Astrophysics Data System (ADS)
Gunda, T.; Bazuin, J. T.; Nay, J.; Yeung, K. L.
2017-03-01
Access to seasonal climate forecasts can benefit farmers by allowing them to make more informed decisions about their farming practices. However, it is unclear whether farmers realize these benefits when crop choices available to farmers have different and variable costs and returns; multiple countries have programs that incentivize production of certain crops while other crops are subject to market fluctuations. We hypothesize that the benefits of forecasts on farmer livelihoods will be moderated by the combined impact of differing crop economics and changing climate. Drawing upon methods and insights from both physical and social sciences, we develop a model of farmer decision-making to evaluate this hypothesis. The model dynamics are explored using empirical data from Sri Lanka; primary sources include survey and interview information as well as game-based experiments conducted with farmers in the field. Our simulations show that a farmer using seasonal forecasts has more diversified crop selections, which drive increases in average agricultural income. Increases in income are particularly notable under a drier climate scenario, when a farmer using seasonal forecasts is more likely to plant onions, a crop with higher possible returns. Our results indicate that, when water resources are scarce (i.e. drier climate scenario), farmer incomes could become stratified, potentially compounding existing disparities in farmers’ financial and technical abilities to use forecasts to inform their crop selections. This analysis highlights that while programs that promote production of certain crops may ensure food security in the short-term, the long-term implications of these dynamics need careful evaluation.
NASA Astrophysics Data System (ADS)
Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.
2018-05-01
At present, cyber-security issues associated with industrial information systems occupy one of the key niches in the state management system. Functional disruption of these systems via cyberattacks may cause emergencies involving loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, there is a need to develop protection against them based on machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out using a probabilistic cluster model. The method forecasts which of two predetermined intervals a future value of the indicator will fall into, using probability estimates for this purpose. The dividing bound between the intervals is determined by a calculation method based on statistical characteristics of the indicator. The source data comprise hourly counts of cyberattacks recorded with a honeypot from March to September 2013.
A monogamy-of-entanglement game with applications to device-independent quantum cryptography
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Fehr, Serge; Kaniewski, Jędrzej; Wehner, Stephanie
2013-10-01
We consider a game in which two separate laboratories collaborate to prepare a quantum system and are then asked to guess the outcome of a measurement performed by a third party in a random basis on that system. Intuitively, by the uncertainty principle and the monogamy of entanglement, the probability that both players simultaneously succeed in guessing the outcome correctly is bounded. We are interested in the question of how the success probability scales when many such games are performed in parallel. We show that any strategy that maximizes the probability to win every game individually is also optimal for the parallel repetition of the game. Our result implies that the optimal guessing probability can be achieved without the use of entanglement. We explore several applications of this result. Firstly, we show that it implies security for standard BB84 quantum key distribution when the receiving party uses fully untrusted measurement devices, i.e. we show that BB84 is one-sided device independent. Secondly, we show how our result can be used to prove security of a one-round position-verification scheme. Finally, we generalize a well-known uncertainty relation for the guessing probability to quantum side information.
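To make the parallel-repetition claim quantitative, it helps to record the form such a result takes. For the BB84 version of this game, the single-round optimal guessing probability is reported in the quantum-cryptography literature as cos²(π/8), and the claim that winning every game individually is optimal then corresponds to simple exponentiation; treat the constant below as a reader's note to be checked against the paper rather than a quotation of it:

```latex
p_{\mathrm{win}}\left(G^{\times n}\right) \;=\; \left(\tfrac{1}{2} + \tfrac{1}{2\sqrt{2}}\right)^{n} \;=\; \cos^{2n}(\pi/8).
```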
Short-term load and wind power forecasting using neural network-based prediction intervals.
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2014-02-01
Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. The penetration of renewable energies, such as wind and solar power, significantly increases the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important, for the management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
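The PI-quality quantities that LUBE-style training optimizes are easy to state in code. The sketch below computes the usual coverage (PICP), normalized width (PINAW) and one common form of the combined coverage-width criterion; the paper's exact cost function, its constrained reformulation and the PSO-with-mutation solver are not reproduced here, and the penalty constants are assumptions.

```python
import numpy as np

def pi_metrics(y, lower, upper, mu=0.90, eta=50.0):
    """Prediction-interval quality metrics used in LUBE-style training:
    coverage (PICP), normalized width (PINAW) and a combined CWC score."""
    y, lower, upper = map(np.asarray, (y, lower, upper))
    covered = (y >= lower) & (y <= upper)
    picp = covered.mean()                              # coverage probability
    pinaw = (upper - lower).mean() / (y.max() - y.min())
    # Penalize intervals whose coverage falls below the nominal level mu.
    penalty = np.exp(-eta * (picp - mu)) if picp < mu else 0.0
    cwc = pinaw * (1 + penalty)
    return picp, pinaw, cwc

y = np.array([10.0, 12.0, 11.0, 14.0, 13.0])
print(pi_metrics(y, y - 1.5, y + 1.5))   # perfectly covered, moderately wide PIs
```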
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler
This paper outlines a data-driven, distributionally robust approach to solving chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; in particular, the mean and covariance matrix of the forecast errors are updated online and leveraged to enforce voltage regulation with a predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance-constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
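The Chebyshev-based reformulation mentioned above can be made concrete. If only the mean and variance of a voltage forecast error are known (here, estimated online from data), a one-sided Cantelli/Chebyshev bound turns the chance constraint into a deterministic convex surrogate; the notation below is generic, not copied from the paper:

```latex
\Pr\left(V \ge V^{\max}\right) \le \varepsilon
\quad \Longleftarrow \quad
\mu_V + \sigma_V \sqrt{\frac{1-\varepsilon}{\varepsilon}} \;\le\; V^{\max}.
```

Because the surrogate holds for every distribution with the given mean and variance, it is distributionally robust, at the price of some conservatism.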
Lifting primordial non-Gaussianity above the noise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welling, Yvette; Woude, Drian van der; Pajer, Enrico, E-mail: welling@strw.leidenuniv.nl, E-mail: D.C.vanderWoude@uu.nl, E-mail: enrico.pajer@gmail.com
2016-08-01
Primordial non-Gaussianity (PNG) in Large Scale Structures is obfuscated by the many additional sources of non-linearity. Within the Effective Field Theory approach to Standard Perturbation Theory, we show that matter non-linearities in the bispectrum can be modeled sufficiently well to strengthen current bounds with near-future surveys, such as Euclid. We find that the EFT corrections are crucial to this improvement in sensitivity. Yet, our understanding of non-linearities is still insufficient to reach important theoretical benchmarks for equilateral PNG, while, for local PNG, our forecast is more optimistic. We consistently account for the theoretical error intrinsic to the perturbative approach and discuss the details of its implementation in Fisher forecasts.
Modeling Humans as Reinforcement Learners: How to Predict Human Behavior in Multi-Stage Games
NASA Technical Reports Server (NTRS)
Lee, Ritchie; Wolpert, David H.; Backhaus, Scott; Bent, Russell; Bono, James; Tracey, Brendan
2011-01-01
This paper introduces a novel framework for modeling interacting humans in a multi-stage game environment by combining concepts from game theory and reinforcement learning. The proposed model has the following desirable characteristics: (1) bounded rational players, (2) strategic players (i.e., players account for one another's reward functions), and (3) computational feasibility even on moderately large real-world systems. To achieve this we extend level-K reasoning to policy space to, for the first time, handle multiple time steps. This allows us to decompose the problem into a series of smaller ones where we can apply standard reinforcement learning algorithms. We investigate these ideas in a cyber-battle scenario over a smart power grid and discuss the relationship between the behavior predicted by our model and what one might expect of real human defenders and attackers.
The Influence of Information Acquisition on the Complex Dynamics of Market Competition
NASA Astrophysics Data System (ADS)
Guo, Zhanbing; Ma, Junhai
In this paper, we build a dynamical game model with three boundedly rational players (firms) to study the influence of information on the complex dynamics of market competition, where the useful information concerns the rival's actual decision. In this model, an information-sharing team is composed of two firms which acquire and share information about their common competitor but make their decisions separately; the amount of information acquired by the team determines the accuracy of its estimate of the rival's actual decision. Based on this dynamical game model and 3D diagrams, we study the influence of the amount of information on the complex dynamics of market competition, including local dynamics, global dynamics and profits. These results have significant theoretical and practical value for understanding the influence of information.
Oil supply between OPEC and non-OPEC based on game theory
NASA Astrophysics Data System (ADS)
Chang, Yuwen; Yi, Jiexin; Yan, Wei; Yang, Xinshe; Zhang, Song; Gao, Yifan; Wang, Xi
2014-10-01
The competing strategies of OPEC (Organization of the Petroleum Exporting Countries) and non-OPEC producers make the oil supply market a complex system that is very difficult to model and predict. In this paper, we combine a macro-model based on game theory with a micro-model to propose a new approach for forecasting oil supply. We take into account microscopic behaviour in the clearing market and also use the game relationships to adjust oil supplies in our approach. For the supply model, we analyse the different behaviour of non-OPEC and OPEC producers. According to our analysis, limiting oil supply, and thus maintaining the oil price, is the best strategy for OPEC in the low-price scenario, while raising supply is the best strategy in the high-price scenario. No matter what the oil price is, the dominant strategy for non-OPEC producers is to increase their oil supply. In the high-price scenario, OPEC will try to erode non-OPEC's share of the oil supply market, which is to OPEC's advantage.
Gambling in Spain: update on experience, research and policy.
Jiménez-Murcia, Susana; Fernández-Aranda, Fernando; Granero, Roser; Menchón, Jose Manuel
2014-10-01
To describe the current situation of gambling in Spain, sketching its history and discussing the regulations and legislation currently in force within the framework of the European Union (EU), and to review the epidemiology of gambling in Spain, the self-help groups and professional treatments available, and their potential effectiveness. A systematic computerized search was performed in three databases (EMBASE, PubMed and PsycINFO, including articles and chapters) and the reference lists from previous reviews to obtain the most relevant studies published to date on pathological gambling in Spain. Similar to other EU countries, Spain has a high prevalence of pathological gambling, focused on specific culturally bounded types of gambling. Expenditure on online gaming has risen significantly in the last few years, prompting the Spanish government to draft new legislation to regulate gaming. The gaming industry is expected to be one of the fastest growing sectors in Spain in the coming years, owing to the rise of new technologies and the development of online gaming.
Reversible simulation of irreversible computation
NASA Astrophysics Data System (ADS)
Li, Ming; Tromp, John; Vitányi, Paul
1998-09-01
Computer computations are generally irreversible while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple 'reversible' pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation of Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable even when the simulated computation time is unknown.
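Bennett's strategy in the pebble game has a famously short recursive description: to reach distance 2^k, pebble the midpoint, pebble the target from the midpoint, then replay the first phase, which (because every move is a self-inverse toggle whose legality depends only on the predecessor being pebbled) erases the midpoint again. The sketch below is a toy rendering of that idea, not the paper's formal game.

```python
def simulate(n):
    """Bennett-style reversible pebble game: reach node 2**n from a pebble
    at node 0 in 3**n moves, using O(n) simultaneous pebbles."""
    state, moves, peak = {0}, [], 1

    def toggle(i):
        nonlocal peak
        assert (i - 1) in state, "illegal move"   # rule: predecessor pebbled
        state.symmetric_difference_update({i})    # place or remove the pebble
        moves.append(i)
        peak = max(peak, len(state))

    def walk(base, k):
        # Net effect: toggles the single node base + 2**k; replaying the
        # first phase as the third phase removes the midpoint pebble.
        if k == 0:
            toggle(base + 1)
            return
        mid = base + 2 ** (k - 1)
        walk(base, k - 1)   # pebble the midpoint
        walk(mid, k - 1)    # pebble the target from the midpoint
        walk(base, k - 1)   # replay: erases the midpoint pebble again

    walk(0, n)
    return moves, state, peak

moves, state, peak = simulate(4)
print(len(moves), sorted(state), peak)   # 81 [0, 16] 6: 3**4 moves, n+2 peak pebbles
```

The printout makes the space-time tradeoff of the abstract tangible: 2^n forward progress costs 3^n moves but only about n simultaneous pebbles (memory checkpoints).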
A study of driver's route choice behavior based on evolutionary game theory.
Jiang, Xiaowei; Ji, Yanjie; Du, Muqing; Deng, Wei
2014-01-01
This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how the drivers adjust their route choice behaviors under the influence of the traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the analytic method. We assume that the drivers in the transportation system are bounded rational, and the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolutionary process of the drivers' route choice decision-making behaviors. Here we conclude that the traffic information plays an important role in the route choice behavior. The driver's route decision-making process develops towards different evolutionary stable states in accordance with different transportation situations. The analysis results also demonstrate that employing cumulative prospect theory and evolutionary game theory to study the driver's route choice behavior is effective. This analytic method provides an academic support and suggestion for the traffic guidance system, and may optimize the travel efficiency to a certain extent.
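The evolutionary adjustment process can be sketched without the cumulative-prospect-theory layer: drivers on two routes shift toward the cheaper one at a rate proportional to the cost difference, a replicator-style dynamic. The BPR congestion function and all numbers below are assumptions for illustration.

```python
import numpy as np

def travel_time(flow, free_time, capacity):
    """BPR-style congestion function (a standard assumption, not the paper's)."""
    return free_time * (1 + 0.15 * (flow / capacity) ** 4)

# Two alternative routes; x = fraction of drivers choosing route 1.
demand, x = 1000.0, 0.9
free, cap = np.array([10.0, 12.0]), np.array([600.0, 800.0])
for day in range(300):
    flows = demand * np.array([x, 1 - x])
    costs = travel_time(flows, free, cap)
    # Replicator-style update: drivers shift toward the cheaper route at a
    # rate proportional to the payoff (negative cost) difference.
    x += 0.02 * x * (1 - x) * (costs[1] - costs[0])
    x = min(max(x, 0.0), 1.0)
print(f"equilibrium share on route 1: {x:.3f}, route costs: {costs.round(2)}")
```

The fixed point of this dynamic equalizes the two generalized costs, the evolutionary stable state the abstract refers to; traffic information would enter by distorting the perceived costs.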
Social cycling and conditional responses in the Rock-Paper-Scissors game
Wang, Zhijian; Xu, Bin; Zhou, Hai-Jun
2014-01-01
How humans make decisions in non-cooperative strategic interactions is a fundamental question. For the basic Rock-Paper-Scissors (RPS) model game system, classic Nash equilibrium (NE) theory predicts that players completely randomize their action choices to avoid being exploited, while evolutionary game theory with bounded rationality in general predicts persistent cyclic motions, especially in finite populations. However, as empirical studies have been relatively sparse, it remains controversial which theoretical framework is more appropriate to describe the decision-making of human subjects. Here we observe population-level persistent cyclic motions in a laboratory experiment of the discrete-time iterated RPS game under the traditional random pairwise-matching protocol. This collective behavior contradicts the NE theory but is quantitatively explained, without any adjustable parameter, by a microscopic model of win-lose-tie conditional response. Theoretical calculations suggest that if all players adopt the same optimized conditional response strategy, their accumulated payoffs will be much higher than the reference value of the NE mixed strategy. Our work demonstrates the feasibility of understanding human competitive behavior from the angle of non-equilibrium statistical physics.
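The win-lose-tie conditional response model is compact enough to simulate directly. In the sketch below each player keeps or rotates its last action with a probability that depends on the last outcome; the shift probabilities and the win/tie/loss payoffs (2/1/0) are illustrative assumptions, not the values estimated from the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(5)
BEATS = {0: 2, 1: 0, 2: 1}                 # 0=rock beats 2=scissors, etc.
# Probability of rotating to the next action after each outcome (assumed).
SHIFT = {'win': 0.2, 'tie': 0.5, 'lose': 0.8}

def respond(action, result):
    """Win-lose-tie conditional response: keep or rotate the last action."""
    return (action + 1) % 3 if rng.random() < SHIFT[result] else action

a, b, ties, payoff, rounds = 0, 1, 0, 0.0, 200000
for _ in range(rounds):
    if a == b:
        ra = rb = 'tie'; ties += 1; payoff += 1.0     # tie pays 1
    elif BEATS[a] == b:
        ra, rb = 'win', 'lose'; payoff += 2.0         # win pays 2, loss 0
    else:
        ra, rb = 'lose', 'win'
    a, b = respond(a, ra), respond(b, rb)
print(f"tie rate: {ties / rounds:.3f} (independent uniform play gives 1/3), "
      f"player A mean payoff: {payoff / rounds:.3f} (NE mixed value: 1.0)")
```

Tracking the sequence of actions in such runs also exposes the persistent rotation (R to P to S) that the population experiments report.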
A new scoring method for evaluating the performance of earthquake forecasts and predictions
NASA Astrophysics Data System (ADS)
Zhuang, J.
2009-12-01
This study presents a new method, the gambling score, for evaluating the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model in the general case or the Omori-Utsu formula for forecasting aftershocks, which gives a probability p0 that at least one event occurs in a given space-time-magnitude window. The forecaster, like a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes no prediction. If the forecaster bets 1 reputation point on "Yes" and loses, his reputation is reduced by 1 point; if his forecast succeeds, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return from the bet is 0. The rule also applies to probability forecasts: if the forecaster gives an occurrence probability p, we can regard him as splitting 1 reputation point by betting p on "Yes" and 1-p on "No", so that his expected pay-off under the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and the reference model is the Poisson model.
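The fairness claim can be verified in one line: under the reference model the "Yes" bet wins with probability p0 and pays (1-p0)/p0, and otherwise loses the staked point, so

```latex
\mathbb{E}[\text{gain}] \;=\; p_0 \cdot \frac{1-p_0}{p_0} \;-\; (1-p_0)\cdot 1 \;=\; 0,
```

and the same cancellation holds separately for each part of the split bet (p on "Yes", 1-p on "No") of a probability forecast.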
An Idea for Generating Diversity Conversations: Physics Jeopardy and the Future Faces of Physics Kit
NASA Astrophysics Data System (ADS)
Rand, Kendra; White, Gary
2008-10-01
Is there a way to engage typical physics undergraduates in a conversation about under-represented groups in physics that doesn't result in rolled eyes or fingers in the ears? The Society of Physics Students (SPS) has begun an experiment using a jeopardy-like game at physics meetings in an attempt to generate conversations about diversity. The physics jeopardy game is part of a "Future Faces of Physics" kit that includes a variety of materials of interest to those wanting to address under-represented audiences in physics, such as video clips exhibiting common physics words in sign language, tactile representations of the lunar surface for blind students, guidelines regarding lab procedures for wheelchair-bound students, and the book Einstein on Race and Racism, with a challenge letter directed at SPS chapters from the authors. While attempts to assess the impact of the game have been modest, we report anecdotally some of the qualitative features seen in the discussions when the game is played. We also strive to indulge in a few physics jeopardy game moments to give a sense of how the game works. If you are hosting a meeting, large or small, and would like to receive this kit for use at your meeting, notify Kendra Rand, SPS Program Coordinator, at krand@aip.org.
Study on system dynamics of evolutionary mix-game models
NASA Astrophysics Data System (ADS)
Gou, Chengling; Guo, Xiaoqian; Chen, Fang
2008-11-01
The mix-game model is derived from the agent-based minority game (MG) model and is used to simulate real financial markets. Unlike the MG, there are two groups of agents in the mix-game: Group 1 plays a majority game and Group 2 plays a minority game. These two groups of agents have different bounded abilities to process historical information and to track their own performance. In this paper, we modify the mix-game model by giving agents the ability to evolve: if an agent's winning rate is smaller than a threshold, it copies the best strategies held by another agent, and agents repeat such evolution at certain time intervals. Through simulations this paper finds: (1) the average winning rates of agents in Group 1 and the mean volatilities increase as the thresholds of Group 1 increase; (2) the average winning rates of both groups decrease but the mean volatilities of the system increase as the thresholds of Group 2 increase; (3) the thresholds of Group 2 have a greater impact on system dynamics than the thresholds of Group 1; (4) the characteristics of the system dynamics under different time intervals of strategy change are qualitatively similar to each other but differ quantitatively; (5) as the time interval of strategy change increases from 1 to 20, the system behaves more and more stably and the performance of agents in both groups also improves.
Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data
NASA Technical Reports Server (NTRS)
Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul
2003-01-01
Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10(exp 4) real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.
On subgame perfect equilibria in quantum Stackelberg duopoly
NASA Astrophysics Data System (ADS)
Frąckiewicz, Piotr; Pykacz, Jarosław
2018-02-01
Our purpose is to study the Stackelberg duopoly with the use of the Li-Du-Massar quantum duopoly scheme. The result of Lo and Kiang showed that the correlation of the players' quantities caused by quantum entanglement enlarges the first-mover advantage in the quantum Stackelberg duopoly. However, the interval of entanglement parameters for which this result is valid is bounded from above. It has been an open question what the equilibrium result is above the upper bound, in particular when the entanglement parameter goes to infinity. Our work provides a complete analysis of the subgame perfect equilibria of the game for all values of the entanglement parameter.
Towards an integrated forecasting system for fisheries on habitat-bound stocks
NASA Astrophysics Data System (ADS)
Christensen, A.; Butenschön, M.; Gürkan, Z.; Allen, I. J.
2013-03-01
First results of a coupled modelling and forecasting system for fisheries on habitat-bound stocks are presented. The system currently consists of three mathematically fundamentally different model subsystems coupled offline: POLCOMS, providing the physical environment, implemented in the domain of the north-west European shelf; the SPAM model, which describes sandeel stocks in the North Sea; and the third component, the SLAM model, which connects POLCOMS and SPAM by computing the physical-biological interaction. Our main lesson from coupling the model subsystems is that well-defined and generic model interfaces are very important for a successful and extendable coupled model framework. The integrated approach, simulating ecosystem dynamics from physics to fish, allows for analysis of the pathways in the ecosystem, to investigate the propagation of changes in the ocean climate, and to quantify the impacts on the higher trophic level, in this case the sandeel population, demonstrated here on the basis of hindcast data. The coupled forecasting system is tested on some typical scientific questions arising in spatial fish stock management and marine spatial planning, including determination of local and basin-scale maximum sustainable yield, stock connectivity, and source/sink structure. Our simulations indicate that sandeel stocks are currently exploited close to the maximum sustainable yield, even though periodic overfishing seems to have occurred, but large uncertainty is associated with determining the stock's maximum sustainable yield due to stock-inherent dynamics and climatic variability. Our statistical ensemble simulations indicate that the predictive horizon set by climate interannual variability is 2-6 yr, after which only an asymptotic probability distribution of stock properties, like biomass, is predictable.
Design of vibration isolation systems using multiobjective optimization techniques
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
The design of vibration isolation systems is considered using multicriteria optimization techniques. The integrated values of the square of the force transmitted to the main mass and the square of the relative displacement between the main mass and the base are taken as the performance indices. The design of a three-degrees-of-freedom isolation system with an exponentially decaying type of base disturbance is considered for illustration. Numerical results are obtained using the global criterion, utility function, bounded objective, lexicographic, goal programming, goal attainment, and game theory methods. The game theory approach is found to be superior, yielding a better optimum solution with a proper balance among the various objective functions.
Local randomness: Examples and application
NASA Astrophysics Data System (ADS)
Fu, Honghao; Miller, Carl A.
2018-03-01
When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).
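For context on what "superclassical" means here: the CHSH game has a classical value of 3/4 and a quantum (Tsirelson) value of cos²(π/8) ≈ 0.854, and any score in between certifies intrinsic randomness. The sketch below, with structure and names of my own choosing, verifies the classical value by brute force over deterministic strategies.

```python
import itertools, math

# Brute-force the classical value of the CHSH game: referee sends uniform
# bits x, y; players output bits a, b; they win iff a XOR b == x AND y.
best = 0.0
for fa in itertools.product([0, 1], repeat=2):      # Alice's rule: a = fa[x]
    for fb in itertools.product([0, 1], repeat=2):  # Bob's rule:   b = fb[y]
        wins = sum((fa[x] ^ fb[y]) == (x & y)
                   for x in (0, 1) for y in (0, 1))
        best = max(best, wins / 4)

print("classical value:", best)                                   # 0.75
print("quantum (Tsirelson) value:", math.cos(math.pi / 8) ** 2)   # ~0.8536
```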
NASA Technical Reports Server (NTRS)
Macready, William; Wolpert, David
2005-01-01
We demonstrate a new framework for analyzing and controlling distributed systems, by solving constrained optimization problems with an algorithm based on that framework. The framework is an information-theoretic extension of conventional full-rationality game theory to allow bounded rational agents. The associated optimization algorithm is a game in which agents control the variables of the optimization problem. They do this by jointly minimizing a Lagrangian of (the probability distribution of) their joint state. The updating of the Lagrange parameters in that Lagrangian is a form of automated annealing, one that focuses the multi-agent system on the optimal pure strategy. We present computer experiments for the k-SAT constraint satisfaction problem and for unconstrained minimization of NK functions.
Enhancing Nursing Staffing Forecasting With Safety Stock Over Lead Time Modeling.
McNair, Douglas S
2015-01-01
In balancing competing priorities, it is essential that nursing staffing provide enough nurses to safely and effectively care for the patients. Mathematical models to predict optimal "safety stocks" have been routine in supply chain management for many years but have up to now not been applied in nursing workforce management. There are various aspects that exhibit similarities between the 2 disciplines, such as an evolving demand forecast according to acuity and the fact that provisioning "stock" to meet demand in a future period has nonzero variable lead time. Under assumptions about the forecasts (eg, the demand process is well fit as an autoregressive process) and about the labor supply process (≥1 shifts' lead time), we show that safety stock over lead time for such systems is effectively equivalent to the corresponding well-studied problem for systems with stationary demand bounds and base stock policies. Hence, we can apply existing models from supply chain analytics to find the optimal safety levels of nurse staffing. We use a case study with real data to demonstrate that there are significant benefits from the inclusion of the forecast process when determining the optimal safety stocks.
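For readers outside supply chain analytics, the classical safety-stock-over-lead-time calculation the authors carry over is compact. Below is a minimal sketch under stated assumptions (normally distributed demand, a chosen service level); the parameter values are illustrative, not from the paper's case study.

```python
import math
from statistics import NormalDist

def safety_stock(z_service, demand_sd, lead_time, mean_demand=0.0, lead_time_sd=0.0):
    """Classical safety stock over lead time.

    Combines demand variability over the lead time with variability of
    the lead time itself:
        SS = z * sqrt(L * sigma_d^2 + d_bar^2 * sigma_L^2)
    """
    var = lead_time * demand_sd ** 2 + (mean_demand ** 2) * lead_time_sd ** 2
    return z_service * math.sqrt(var)

# Illustration: 95% service level, nurse-demand SD of 3 nurses per shift,
# staffing lead time of 2 shifts with SD of 0.5 shifts, mean demand of 20.
z = NormalDist().inv_cdf(0.95)
print(round(safety_stock(z, demand_sd=3.0, lead_time=2.0,
                         mean_demand=20.0, lead_time_sd=0.5), 1))
```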
NASA Astrophysics Data System (ADS)
Ali, Mumtaz; Deo, Ravinesh C.; Downs, Nathan J.; Maraseni, Tek
2018-07-01
Forecasting drought by means of the World Meteorological Organization-approved Standardized Precipitation Index (SPI) is considered a fundamental task in supporting socio-economic initiatives and effectively mitigating climate risk. This study aims to develop a robust drought modelling strategy to forecast multi-scalar SPI in drought-rich regions of Pakistan, where statistically significant lagged combinations of antecedent SPI are used to forecast future SPI. An ensemble Adaptive Neuro-Fuzzy Inference System ('ensemble-ANFIS'), executed via a 10-fold cross-validation procedure, is constructed from randomly partitioned input-target data. The resulting 10-member ensemble-ANFIS outputs are judged by mean square error and correlation coefficient in the training period; the optimal forecasts are attained by averaging the simulations, and the model is benchmarked against the M5 Model Tree and Minimax Probability Machine Regression (MPMR). The results show the proposed ensemble-ANFIS model's precision was notably better (in terms of the root mean square and mean absolute errors, including the Willmott's, Nash-Sutcliffe and Legates-McCabe's indices) for the 6- and 12-month forecasts than for the 3-month forecasts, as verified by the largest proportions of errors registering in the smallest error band. Applying the 10-member simulations, the ensemble-ANFIS model was validated for its ability to forecast the severity (S), duration (D) and intensity (I) of drought (including the error bound). This enabled uncertainty between multi-models to be rationalized more efficiently, leading to a reduction in forecast error caused by stochasticity in drought behaviours. Through cross-validations at diverse sites, a geographic signature in the modelled uncertainties was also calculated. Considering the superiority of the ensemble-ANFIS approach and its ability to generate uncertainty-based information, the study advocates the versatility of a multi-model approach for drought-risk forecasting and its prime importance for estimating drought properties over confidence intervals to generate better information for strategic decision-making.
Forecasting the quality of water-suppressed 1H MR spectra based on a single-shot water scan.
Kyathanahally, Sreenath P; Kreis, Roland
2017-08-01
To investigate whether an initial non-water-suppressed acquisition that provides information about the signal-to-noise ratio (SNR) and linewidth is enough to forecast the maximally achievable final spectral quality and thus inform the operator whether the foreseen number of averages and achieved field homogeneity is adequate. A large range of spectra with varying SNR and linewidth was simulated and fitted with popular fitting programs to determine the dependence of fitting errors on linewidth and SNR. A tool to forecast variance based on a single acquisition was developed and its performance evaluated on simulated and in vivo data obtained at 3 Tesla from various brain regions and acquisition settings. A strong correlation to real uncertainties in estimated metabolite contents was found for the forecast values and the Cramer-Rao lower bounds obtained from the water-suppressed spectra. It appears to be possible to forecast the best-case errors associated with specific metabolites to be found in model fits of water-suppressed spectra based on a single water scan. Thus, nonspecialist operators will be able to judge ahead of time whether the planned acquisition can possibly be of sufficient quality to answer the targeted clinical question or whether it needs more averages or improved shimming. Magn Reson Med 78:441-451, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Customization of Discriminant Function Analysis for Prediction of Solar Flares
2005-03-01
lives such as telecommunication, commercial airlines, electrical power, wireless services, and terrestrial weather tracking and forecasting...the 1800's can wreak havoc on today's power, fuel, and telecommunication lines and finds its origin in solar activity. Enormous amounts of solar...inducing potential differences across large areas of the surface. Earth-bound power, fuel, and telecommunication lines grounded to the Earth provide an
NASA Astrophysics Data System (ADS)
Argoneto, Pierluigi; Renna, Paolo
2016-02-01
This paper proposes a Framework for Capacity Sharing in Cloud Manufacturing (FCSCM) able to support capacity sharing among independent firms. The success of geographically distributed plants depends strongly on the use of suitable tools to integrate their resources and demand forecasts in pursuit of a specific production objective. The proposed framework is based on two different tools: a cooperative game algorithm, based on the Gale-Shapley model, and a fuzzy engine. The capacity allocation policy takes into account the utility functions of the involved firms, and it is shown to induce all firms to report their capacity requirements truthfully. A discrete event simulation environment has been developed to test the proposed FCSCM. The numerical results show the drastic reduction of unsatisfied capacity obtained by the cooperation model implemented in this work.
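Since the abstract names the Gale-Shapley model, a textbook deferred-acceptance sketch may help. The pairing of capacity requesters with capacity providers, and all firm names, are illustrative assumptions of mine, not the paper's actual matching formulation.

```python
def gale_shapley(proposer_prefs, acceptor_prefs):
    """Textbook Gale-Shapley deferred-acceptance matching.

    proposer_prefs: dict mapping each proposer (e.g., a firm requesting
        capacity) to its ranked list of acceptors (firms offering capacity).
    acceptor_prefs: dict mapping each acceptor to its ranked list of proposers.
    Returns a stable matching as {acceptor: proposer}.
    """
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}   # next acceptor to try
    matched = {}                                   # acceptor -> proposer
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in matched:
            matched[a] = p
        elif rank[a][p] < rank[a][matched[a]]:     # a prefers the newcomer
            free.append(matched[a])
            matched[a] = p
        else:
            free.append(p)                         # rejected; tries next choice
    return matched

# Illustrative firms: requesters propose to providers.
requesters = {"R1": ["P1", "P2"], "R2": ["P1", "P2"]}
providers = {"P1": ["R2", "R1"], "P2": ["R1", "R2"]}
print(gale_shapley(requesters, providers))   # {'P1': 'R2', 'P2': 'R1'}
```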
Managing distrust-induced risk with deposit in supply chain contract decisions.
Han, Guanghua; Dong, Ming; Sun, Qi
2014-01-01
This paper studies the trust issue in a two-echelon supply chain information sharing process. In a supply chain, the retailer reports the forecasted demand to the supplier. Traditionally, the supplier's trust in the retailer's reported information is based on the retailer's reputation. However, this paper considers that trust is random and is also affected by the reputation and the demand gap. The supplier and retailer have been shown to have different evaluations regarding the degree of trust. Furthermore, distrust is inherently linked to perceived risk. To mitigate perceived risk, a two-stage decision process with an unpayback deposit contract is proposed. At the first stage, the supplier and the retailer negotiate the deposit contract. At the second stage, a Stackelberg game is used to determine the retailer's reported demand and the supplier's production quantity. We show that the deposits from the retailer's and supplier's perspectives are different. When the retailer's reported demand is equal to the supplier's forecasted demand, the retailer's evaluation of the deposit is more than that of supplier's. When the retailer's reported demand is equal to the retailer's forecasted demand, the deposit from the retailer's perspective is at the lowest level.
Pricing a Protest: Forecasting the Dynamics of Civil Unrest Activity in Social Media.
Goode, Brian J; Krishnan, Siddharth; Roan, Michael; Ramakrishnan, Naren
2015-01-01
Online social media activity can often be a precursor to disruptive events such as protests, strikes, and "occupy" movements. We have observed that such civil unrest can galvanize supporters through social networks and help recruit activists to their cause. Understanding the dynamics of social network cascades and extrapolating their future growth will enable an analyst to detect or forecast major societal events. Existing work has primarily used structural and temporal properties of cascades to predict their future behavior. But factors like societal pressure, alignment of individual interests with broader causes, and perception of expected benefits also affect protest participation in social media. Here we develop an analysis framework using a differential game theoretic approach to characterize the cost of participating in a cascade, and demonstrate how we can combine such cost features with classical properties to forecast the future behavior of cascades. Using data from Twitter, we illustrate the effectiveness of our models on the "Brazilian Spring" and Venezuelan protests that occurred in June 2013 and November 2013, respectively. We demonstrate how our framework captures both qualitative and quantitative aspects of how these uprisings manifest through the lens of tweet volume on Twitter social media.
NASA Astrophysics Data System (ADS)
Hayes, P.; Trigg, J. L.; Stauffer, D.; Hunter, G.; McQueen, J.
2006-05-01
Consequence assessment (CA) operations are those processes that attempt to mitigate negative impacts of incidents involving hazardous materials such as chemical, biological, radiological, nuclear, and high explosive (CBRNE) agents, facilities, weapons, or transportation. Incident types range from accidental spillage of chemicals at/en route to/from a manufacturing plant, to the deliberate use of radiological or chemical material as a weapon in a crowded city. The impacts of these incidents are highly variable, from little or no impact to catastrophic loss of life and property. Local and regional scale atmospheric conditions strongly influence atmospheric transport and dispersion processes in the boundary layer, and the extent and scope of the spread of dangerous materials in the lower levels of the atmosphere. Therefore, CA personnel charged with managing the consequences of CBRNE incidents must have detailed knowledge of current and future weather conditions to accurately model potential effects. A meteorology team was established at the U.S. Defense Threat Reduction Agency (DTRA) to provide weather support to CA personnel operating DTRA's CA tools, such as the Hazard Prediction and Assessment Capability (HPAC) tool. The meteorology team performs three main functions: 1) regular provision of meteorological data for use by personnel using HPAC, 2) determination of the best performing medium-range model forecast for the 12 - 48 hour timeframe and 3) provision of real-time help-desk support to users regarding acquisition and use of weather in HPAC CA applications. The normal meteorology team operations were expanded during a recent modeling project which took place during the 2006 Winter Olympic Games. The meteorology team took advantage of special weather observation datasets available in the domain of the Winter Olympic venues and undertook a project to improve weather modeling at high resolution. The varied and complex terrain provided a special challenge to the modelers on the meteorology team. Some of the Olympic venues were located in the mountains to the west of Torino, while the rest were located on the relatively flat plain in and around the cities of Pinerolo and Torino to the east. DTRA partners at Pennsylvania State University (PSU) and the U.S. National Center for Atmospheric Research (NCAR) established data collection and assimilation, and forecast modeling processes that used special weather station observations provided by the Area Previsione e Monitoraggio Ambientale of Italy's ARPA Piemonte. At PSU a version of the MM5 was especially prepared to use observation data to forecast weather in a four-nest configuration. Two other DTRA partners provided independent weather forecast models against which the PSU model data were compared. The U.S. Air Force Weather Agency provided its MM5 forecast model data and the U.S. National Oceanic and Atmospheric Administration's National Centers for Environmental Prediction provided data from a special version of their WRF model. The project produced many opportunities to improve the modeling and forecasting capability at DTRA. DTRA and its partners plan to expand upon this experience during upcoming field tests, and to further improve and expand the capability to provide accurate high-resolution weather forecast information to hazard and consequence assessment operations.
Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.
2006-01-01
Since numerical weather prediction models are unable to accurately forecast the severity and the location of storm cells several hours into the future when compared with observation data, there has been a growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time. This step is readily recognized as the process update step of the Kalman Filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used for defining the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method. The main advantage of the Monte Carlo method is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convective precipitation exceeding a specified threshold. The main limitation of this method is that the results depend on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe the technique for computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box. This technique is the same as that described in Ref. 5. Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sound compared to characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations; Refs. 5 through 7 only use the forecast data and not the observations. The method for computing the probability of detection, the false alarm ratio, and several forecast quality metrics (Skill Scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of the severe weather cell in the forecast data. Probabilities of occurrence at the grid location and in its neighborhood with higher severity, and with lower severity, in the observation data compared to that in the forecast data are examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of existence of gaps in the observation data in the neighborhood of severe weather cells in forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination.
The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities. This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
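The Ref. 5 neighborhood method that the paper builds on is easy to sketch. Below is a minimal NumPy version under my own naming, with an invented forecast field; it computes, at each grid point, the fraction of surrounding points whose forecast exceeds a threshold, which is exactly the low-pass-filtering interpretation noted above.

```python
import numpy as np

def convection_probability(forecast, threshold, box=5):
    """Neighborhood probability of convection: at each grid point, the
    fraction of points within a box centered there whose forecast
    convective precipitation exceeds the threshold. Equivalent to
    low-pass filtering the thresholded field with a uniform kernel.
    """
    exceed = (forecast >= threshold).astype(float)
    pad = box // 2
    padded = np.pad(exceed, pad, mode="constant")   # zeros outside the domain
    prob = np.empty_like(exceed)
    rows, cols = exceed.shape
    for i in range(rows):
        for j in range(cols):
            prob[i, j] = padded[i:i + box, j:j + box].mean()
    return prob

# Illustrative 100x100 forecast field containing one storm cell.
field = np.zeros((100, 100))
field[40:50, 40:50] = 10.0           # forecast precipitation, arbitrary units
p = convection_probability(field, threshold=5.0, box=9)
print(p[44, 44], p[44, 55])          # high inside the cell, low outside
```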
Turing Trade: A Hybrid of a Turing Test and a Prediction Market
NASA Astrophysics Data System (ADS)
Farfel, Joseph; Conitzer, Vincent
We present Turing Trade, a web-based game that is a hybrid of a Turing test and a prediction market. In this game, there is a mystery conversation partner, the “target,” who is trying to appear human, but may in reality be either a human or a bot. There are multiple judges (or “bettors”), who interrogate the target in order to assess whether it is a human or a bot. Throughout the interrogation, each bettor bets on the nature of the target by buying or selling human (or bot) securities, which pay out if the target is a human (bot). The resulting market price represents the bettors’ aggregate belief that the target is a human. This game offers multiple advantages over standard variants of the Turing test. Most significantly, our game gathers much more fine-grained data, since we obtain not only the judges’ final assessment of the target’s humanity, but rather the entire progression of their aggregate belief over time. This gives us the precise moments in conversations where the target’s response caused a significant shift in the aggregate belief, indicating that the response was decidedly human or decidedly nonhuman. An additional benefit is that (we believe) the game is more enjoyable to participants than a standard Turing test. This is important because otherwise, we will fail to collect significant amounts of data. In this paper, we describe in detail how Turing Trade works, exhibit some example logs, and analyze how well Turing Trade functions as a prediction market by studying the calibration and sharpness of its forecasts (from real user data).
NASA Astrophysics Data System (ADS)
Demaria, E. M.; Valdes, J. B.; Wi, S.; Serrat-Capdevila, A.; Valdés-Pineda, R.; Durcik, M.
2016-12-01
In under-instrumented basins around the world, accurate and timely forecasts of river streamflows have the potential of assisting water and natural resource managers in their management decisions. The Upper Zambezi river basin is the largest basin in southern Africa and its water resources are critical to sustainable economic growth and poverty reduction in eight riparian countries. We present a real-time streamflow forecast for the basin using a multi-model, multi-satellite approach that accounts for model and input uncertainties. Three distributed hydrologic models with different levels of complexity, VIC, HYMOD_DS, and HBV_DS, are set up at a daily time step and a 0.25 degree spatial resolution for the basin. The hydrologic models are calibrated against daily observed streamflows at the Katima-Mulilo station using a Genetic Algorithm. Three real-time satellite products, the Climate Prediction Center's morphing technique (CMORPH), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and the Tropical Rainfall Measuring Mission (TRMM-3B42RT), are bias-corrected with daily CHIRPS estimates. Uncertainty bounds for predicted flows are estimated with the Inverse Variance Weighting method. Because concentration times in the basin range from a few days to more than a week, we include the use of precipitation forecasts from the Global Forecast System (GFS) to predict daily streamflows in the basin with a 10-day lead time. The skill of GFS-predicted streamflows is evaluated and the usefulness of the forecasts for short-term water allocations is presented.
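Inverse variance weighting, which the abstract uses to set the uncertainty bounds, is worth spelling out. The sketch below is a minimal version; the model names in the comment match the abstract, while the flows and error variances are invented for illustration.

```python
def inverse_variance_weighted(forecasts, variances):
    """Combine model forecasts, weighting each by 1/variance.

    forecasts: list of streamflow predictions for the same day
    variances: corresponding error variances (e.g., from each model's
               calibration-period residuals)
    Returns the combined estimate and its variance.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * f for w, f in zip(weights, forecasts)) / total
    return combined, 1.0 / total

# Illustrative: VIC, HYMOD_DS, HBV_DS forecasts for one day (m^3/s).
flows = [1200.0, 1350.0, 1100.0]
errors = [90.0 ** 2, 140.0 ** 2, 110.0 ** 2]
mean, var = inverse_variance_weighted(flows, errors)
print(round(mean, 1), round(var ** 0.5, 1))   # combined flow and its SD
```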
Zero-Determinant Strategies in Iterated Public Goods Game
Pan, Liming; Hao, Dong; Rong, Zhihai; Zhou, Tao
2015-01-01
Recently, Press and Dyson have proposed a new class of probabilistic and conditional strategies for the two-player iterated Prisoner’s Dilemma, so-called zero-determinant strategies. A player adopting zero-determinant strategies is able to pin the expected payoff of the opponents or to enforce a linear relationship between his own payoff and the opponents’ payoff, in a unilateral way. This paper considers zero-determinant strategies in the iterated public goods game, a representative multi-player game where in each round each player chooses whether or not to put his tokens into a public pot, and the tokens in this pot are multiplied by a factor larger than one and then evenly divided among all players. The analytical and numerical results exhibit a similar yet different scenario to the case of two-player games: (i) with a small number of players or a small multiplication factor, a player is able to unilaterally pin the expected total payoff of all other players; (ii) a player is able to set the ratio between his payoff and the total payoff of all other players, but this ratio is limited by an upper bound if the multiplication factor exceeds a threshold that depends on the number of players. PMID:26293589
Product Distribution Theory for Control of Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Lee, Chia Fan; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory is a new framework for controlling Multi-Agent Systems (MAS's). First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the (probability distribution of the) joint state of the agents. Accordingly we can consider a team game in which the shared utility is a performance measure of the behavior of the MAS. For such a scenario the game is at equilibrium - the Lagrangian is optimized - when the joint distribution of the agents optimizes the system's expected performance. One common way to find that equilibrium is to have each agent run a reinforcement learning algorithm. Here we investigate the alternative of exploiting PD theory to run gradient descent on the Lagrangian. We present computer experiments validating some of the predictions of PD theory for how best to do that gradient descent. We also demonstrate how PD theory can improve performance even when we are not allowed to rerun the MAS from different initial conditions, a requirement implicit in some previous work.
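To give a feel for the maxent Lagrangian at the heart of PD theory, here is a toy sketch, not the authors' algorithm: two agents with independent distributions over two moves repeatedly re-estimate themselves as Boltzmann distributions over Monte Carlo estimates of conditional expected cost while a temperature parameter is annealed. The payoff table, schedule, and sample count are all invented.

```python
import math, random

# Toy team game: two agents each pick a move; the shared world cost G
# (to be minimized) is given by a small payoff table.
G = {(0, 0): 0.0, (0, 1): 3.0, (1, 0): 3.0, (1, 1): 1.0}
MOVES = [0, 1]

def boltzmann_step(q_other, agent, T, samples=2000):
    """Re-estimate one agent's distribution against the maxent Lagrangian
    L(q) = E_q[G] - T*S(q): conditional expected costs are estimated by
    Monte Carlo against the other agent, then the distribution is set
    proportional to exp(-E[G | move] / T)."""
    exp_cost = []
    for m in MOVES:
        total = 0.0
        for _ in range(samples):
            o = random.choices(MOVES, weights=q_other)[0]
            total += G[(m, o)] if agent == 0 else G[(o, m)]
        exp_cost.append(total / samples)
    weights = [math.exp(-c / T) for c in exp_cost]
    z = sum(weights)
    return [w / z for w in weights]

q0, q1 = [0.5, 0.5], [0.5, 0.5]
for T in [2.0, 1.0, 0.5, 0.25, 0.1]:      # annealing schedule on T
    for _ in range(5):
        q0 = boltzmann_step(q1, 0, T)
        q1 = boltzmann_step(q0, 1, T)
print(q0, q1)   # both concentrate on move 0: the joint minimizer of G
```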
NASA Astrophysics Data System (ADS)
Rundle, J. B.
2017-12-01
Earthquakes and financial markets share surprising similarities [1]. For example, the well-known VIX index, which by definition is the implied volatility of the Standard and Poors 500 index, behaves in a very similar quantitative fashion to time series for earthquake rates. Both display sudden increases at the time of an earthquake or an announcement of the US Federal Reserve Open Market Committee [2], and both decay as an inverse power of time. Both can be regarded as examples of first order phase transitions [1], and display fractal and scaling behavior associated with critical transitions, such as power-law magnitude-frequency relations in the tails of the distributions. Early quantitative investors such as Edward Thorp and John Kelly invented novel methods to mitigate or manage risk in games of chance such as blackjack, and in markets using hedging techniques that are still in widespread use today. The basic idea is the concept of proportional betting, where the gambler/investor bets a fraction of the bankroll whose size is determined by the "edge" or inside knowledge of the real (and changing) odds. For earthquake systems, the "edge" over nature can only exist in the form of a forecast (probability of a future earthquake); a nowcast (knowledge of the current state of an earthquake fault system); or a timecast (statistical estimate of the waiting time until the next major earthquake). In our terminology, a forecast is a model, while the nowcast and timecast are analysis methods using observed data only (no model). We also focus on defined geographic areas rather than on faults, thereby eliminating the need to consider specific fault data or fault interactions. Data used are online earthquake catalogs, generally since 1980. Forecasts are based on the Weibull (1952) probability law, and only a handful of parameters are needed. These methods allow the development of real-time hazard and risk estimation using cloud-based technologies, and permit the application of quantitative backtesting techniques. In addition, the similarities to the financial markets point us toward similar hedging strategies to mitigate and manage earthquake risk. [1] https://millervalue.com/?s=earthquakes [2] A.M. Person et al., Phys. Rev. E, 81, 066121, (2010)
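The proportional betting idea mentioned here is Kelly's criterion. A minimal sketch for a binary bet follows; the 60%/even-odds numbers are illustrative only.

```python
def kelly_fraction(p_win, net_odds):
    """Kelly's proportional-betting fraction for a binary bet.

    p_win:    the bettor's (forecast) probability of winning
    net_odds: payout per unit staked (b-to-1); f* = p - (1 - p)/b
    Returns the fraction of bankroll to stake (0 if there is no edge).
    """
    f = p_win - (1.0 - p_win) / net_odds
    return max(f, 0.0)

# Illustration: a forecast gives a 60% chance of the event at even odds.
print(kelly_fraction(0.60, 1.0))   # 0.2 -> stake 20% of the bankroll
```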
Social Learning in the Ultimatum Game
Zhang, Boyu
2013-01-01
In the ultimatum game, two players divide a sum of money. The proposer suggests how to split and the responder can accept or reject. If the suggestion is rejected, both players get nothing. The rational solution is that the responder accepts even the smallest offer, but humans prefer a fair share. In this paper, we study the ultimatum game by a learning-mutation process based on quantal response equilibrium, where players are assumed boundedly rational and make mistakes when estimating the payoffs of strategies. Social learning never stabilizes at the fair outcome or the rational outcome, but leads to oscillations between offers of 40 percent and 50 percent. To be precise, there is a clear tendency to increase the mean offer when it is lower than 40 percent, but the mean offer decreases once it reaches the fair offer. If mutations occur rarely, fair behavior is favored in the limit of local mutation. If the mutation rate is sufficiently high, fairness can evolve under both local mutation and global mutation. PMID:24023950
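The "mistakes when estimating payoffs" above is the logit choice rule underlying quantal response equilibrium. A minimal sketch, with illustrative payoffs and rationality parameter:

```python
import math

def quantal_response(payoffs, lam):
    """Logit choice probabilities of quantal response equilibrium:
    strategies with higher estimated payoff are chosen more often, but
    mistakes occur with probability decreasing in the rationality
    parameter lam (lam -> infinity recovers exact best response)."""
    weights = [math.exp(lam * u) for u in payoffs]
    z = sum(weights)
    return [w / z for w in weights]

# Illustration: a player's expected payoffs for three candidate strategies.
payoffs = [1.0, 2.5, 2.0]
print(quantal_response(payoffs, lam=0.1))  # nearly uniform: very noisy play
print(quantal_response(payoffs, lam=5.0))  # concentrates on the best reply
```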
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Golubkova, A.; Eklund, C.
2015-12-01
Nevada has the second largest output of geothermal energy in the United States (after California), with 14 major power plants producing over 425 megawatts of electricity, meeting 7% of the state's total energy needs. A number of wells, particularly older ones, have shown significant temperature and pressure declines over their lifetimes, adversely affecting economic returns. Production declines are almost universal in the oil and gas (O&G) industry. BetaZi (BZ) is a proprietary algorithm which uses a physiostatistical model to forecast production from the past history of O&G wells and to generate "type curves" which are used to estimate the production of undrilled wells. Although BZ was designed and calibrated for O&G, it is a general purpose diffusion equation solver, capable of modeling complex fluid dynamics in multi-phase systems. In this pilot study, it is applied directly to the temperature data from five Nevada geothermal fields. With the data appropriately normalized, BZ is shown to accurately predict temperature declines. The figure shows several examples of BZ forecasts using historic data from the Steamboat Hills field near Reno. BZ forecasts were made using temperature on a normalized scale (blue), with two years of data held out for blind testing (yellow). The forecast is returned in terms of percentiles of probability (red), with the median forecast marked (solid green). Actual production is expected to fall within the red bounds 80% of the time. Blind tests such as these are used to verify that the probabilistic forecast can be trusted. BZ is also used to compute an accurate type temperature profile for wells that have yet to be drilled. These forecasts can be combined with estimated costs to evaluate the economics and risks of a project or potential capital investment. It is remarkable that an algorithm developed for oil and gas can accurately predict temperature in geothermal wells without significant recasting.
Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.
2008-01-01
This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchal game theory and coevolution. PMID:19112488
Adaptive, Distributed Control of Constrained Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Bieniawski, Stefan; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory was recently developed as a broad framework for analyzing and optimizing distributed systems. Here we demonstrate its use for adaptive distributed control of Multi-Agent Systems (MASs), i.e., for distributed stochastic optimization using MASs. First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the probability distribution on the joint state of the agents. When the game in question is a team game with constraints, that equilibrium optimizes the expected value of the team game utility, subject to those constraints. One common way to find that equilibrium is to have each agent run a Reinforcement Learning (RL) algorithm. PD theory reveals this to be a particular type of search algorithm for minimizing the Lagrangian. Typically that algorithm is quite inefficient. A more principled alternative is to use a variant of Newton's method to minimize the Lagrangian. Here we compare this alternative to RL-based search in three sets of computer experiments. These are the N Queens problem and bin-packing problem from the optimization literature, and the Bar problem from the distributed RL literature. Our results confirm that the PD-theory-based approach outperforms the RL-based scheme in all three domains.
Distributed Power Allocation for Wireless Sensor Network Localization: A Potential Game Approach.
Ke, Mingxing; Li, Ding; Tian, Shiwei; Zhang, Yuli; Tong, Kaixiang; Xu, Yuhua
2018-05-08
The problem of distributed power allocation in wireless sensor network (WSN) localization systems is investigated in this paper, using the game theoretic approach. Existing research focuses on the minimization of the localization errors of individual agent nodes over all anchor nodes subject to power budgets. When the service area and the distribution of target nodes are considered, finding the optimal trade-off between localization accuracy and power consumption is a new critical task. To cope with this issue, we propose a power allocation game where each anchor node minimizes the squared position error bound (SPEB) of the service area penalized by its individual power. Meanwhile, it is proven that the power allocation game is an exact potential game which has at least one pure Nash equilibrium (NE). In addition, we also prove the existence of an ϵ-equilibrium point, which is a refinement of the NE, and show that the better-response dynamic approach can reach this solution. Analytical and simulation results demonstrate that: (i) when prior distribution information is available, the proposed strategies have better localization accuracy than the uniform strategies; (ii) when prior distribution information is unknown, the performance of the proposed strategies outperforms power management strategies based on the second-order cone program (SOCP) for particular agent nodes after obtaining the estimated distribution of agent nodes. In addition, the proposed strategies also provide an instructive trade-off between power consumption and localization accuracy.
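The convergence logic of an exact potential game can be seen in a few lines: because every unilateral utility gain raises a single global potential, better-response dynamics cannot cycle and must stop at a pure NE. The sketch below uses a toy two-player identical-interest game (the simplest exact potential game) with an invented potential; it is not the paper's SPEB objective.

```python
# Exact potential game sketch: two players choose power levels; in this
# identical-interest toy, each player's utility IS the potential, so any
# unilateral improvement strictly raises the potential.
LEVELS = [0.0, 0.5, 1.0]

def potential(p1, p2):
    # Illustrative potential: shared accuracy benefit minus power costs.
    s = p1 + p2
    return 2.0 * s - s ** 2 - 0.3 * p1 - 0.3 * p2

def better_response(profile):
    """Repeatedly let either player switch to any strictly improving level;
    termination is guaranteed because the potential increases each time."""
    p = list(profile)
    improved = True
    while improved:
        improved = False
        for i in (0, 1):
            for level in LEVELS:
                trial = p[:]
                trial[i] = level
                if potential(*trial) > potential(*p) + 1e-12:
                    p, improved = trial, True
    return tuple(p)

print(better_response((0.0, 0.0)))   # converges to a pure Nash equilibrium
```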
Distribution-Agnostic Stochastic Optimal Power Flow for Distribution Grids: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler
2016-09-01
This paper outlines a data-driven, distributionally robust approach to solve chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; particularly, the mean and covariance matrix of the forecast errors are updated online, and leveraged to enforce voltage regulation with predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
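The Chebyshev-based bound works because P(|e| >= k*sigma) <= 1/k² holds for any distribution with finite variance, so constraint margins can be set from the online variance estimates alone. A minimal sketch with invented voltage numbers (not the paper's formulation, which tightens full OPF constraints):

```python
import math

def chebyshev_margin(err_var, epsilon):
    """Distribution-agnostic buffer from the two-sided Chebyshev bound:
    P(|e| >= k*sigma) <= 1/k^2, so choosing k = sqrt(1/epsilon) enforces
    the chance constraint with probability >= 1 - epsilon for ANY error
    distribution with this variance."""
    return math.sqrt(err_var / epsilon)

def tightened_voltage_limits(v_min, v_max, forecast_err_var, epsilon=0.1):
    """Shrink the deterministic voltage band so that, despite forecast
    errors of the given variance, limits hold with prob >= 1 - epsilon."""
    m = chebyshev_margin(forecast_err_var, epsilon)
    return v_min + m, v_max - m

# Illustration: +/- 5% band (p.u.), voltage-error SD of 0.005 p.u.
print(tightened_voltage_limits(0.95, 1.05, forecast_err_var=0.005 ** 2))
```

Note how conservative the distribution-agnostic margin is compared with a Gaussian assumption; that conservatism is the price of robustness to any error distribution.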
NASA Astrophysics Data System (ADS)
Emory, A. E.; Wick, G. A.; Dunion, J. P.; McLinden, M.; Schreier, M. M.; Black, P.; Hood, R. E.; Sippel, J.; Tallapragada, V.
2017-12-01
The impacts of Harvey, Irma, and Maria during the 2017 Atlantic hurricane season re-emphasized the critical need for accurate operational forecasts. The combined NASA East Pacific Origins and Characteristics of Hurricanes (EPOCH) and NOAA UAS field campaign during August 2017 was the fourth campaign in a series of dual agency partnerships between NASA and NOAA to improve forecasting accuracy in tropical cyclogenesis and rapid intensification. A brief history of Global Hawk (GH) hurricane field campaigns, including GRIP (2010), HS3 (2012-2014), NOAA-SHOUT (2015-2016) and EPOCH (2017), will show the incremental steps taken over the last eight years to bring the GH from a research platform to a candidate for operational hurricane reconnaissance. GH dropsondes were assimilated into the ECMWF and HWRF forecast models during the 2015-2016 NOAA SHOUT campaigns. EPOCH marked the first time that GH dropsondes were assimilated in real-time into NOAA's GFS forecast model. Early results show that assimilating dropsonde data significantly increases skill in predicting intensity change, which is game changing since the National Hurricane Center intensity error trend has remained virtually unchanged, particularly at 24 hours, over the last 25 years. The results from the past few years suggest that a paradigm shift of sampling the environment with a high-altitude, long-duration UAS like the GH that is capable of deploying up to 90 dropsondes ahead of and over the top of a developing or strengthening tropical cyclone could produce the best return on hurricane forecast predictions in subsequent years. Recommendations for the future, including lessons learned and the potential for R2O transition will be discussed.
NASA Astrophysics Data System (ADS)
Aulov, Oleg
This dissertation presents a novel approach that utilizes quantifiable social media data as a human aware, near real-time observing system, coupled with geophysical predictive models for improved response to disasters and extreme events. It shows that social media data has the potential to significantly improve disaster management beyond informing the public, and emphasizes the importance of different roles that social media can play in management, monitoring, modeling and mitigation of natural and human-caused extreme disasters. In the proposed approach Social Media users are viewed as "human sensors" that are "deployed" in the field, and their posts are considered to be "sensor observations", thus different social media outlets all together form a Human Sensor Network. We utilized the "human sensor" observations, as boundary value forcings, to show improved geophysical model forecasts of extreme disaster events when combined with other scientific data such as satellite observations and sensor measurements. Several recent extreme disasters are presented as use case scenarios. In the case of the Deepwater Horizon oil spill disaster of 2010 that devastated the Gulf of Mexico, the research demonstrates how social media data from Flickr can be used as a boundary forcing condition of GNOME oil spill plume forecast model, and results in an order of magnitude forecast improvement. In the case of Hurricane Sandy NY/NJ landfall impact of 2012, we demonstrate how the model forecasts, when combined with social media data in a single framework, can be used for near real-time forecast validation, damage assessment and disaster management. Owing to inherent uncertainties in the weather forecasts, the NOAA operational surge model only forecasts the worst-case scenario for flooding from any given hurricane. Geolocated and time-stamped Instagram photos and tweets allow near real-time assessment of the surge levels at different locations, which can validate model forecasts, give timely views of the actual levels of surge, as well as provide an upper bound beyond which the surge did not spread. Additionally, we developed AsonMaps---a crisis-mapping tool that combines dynamic model forecast outputs with social media observations and physical measurements to define the regions of event impacts.
Mogasale, Vittal; Ramani, Enusa; Park, Il Yeon; Lee, Jung Seok
2017-09-02
A Typhoid Conjugate Vaccine (TCV) is expected to acquire WHO prequalification soon, which will pave the way for its use in many low- and middle-income countries where typhoid fever is endemic. Thus it is critical to forecast future vaccine demand to ensure supply meets demand, and to facilitate vaccine policy and introduction planning. We forecasted introduction dates for countries based on specific criteria and estimated vaccine demand by year for defined vaccination strategies in 2 scenarios: rapid vaccine introduction and slow vaccine introduction. In the rapid introduction scenario, we forecasted 17 countries and India introducing TCV in the first 5 y of the vaccine's availability while in the slow introduction scenario we forecasted 4 countries and India introducing TCV in the same time period. If the vaccine is targeting infants in high-risk populations as a routine single dose, the vaccine demand peaks around 40 million doses per year under the rapid introduction scenario. Similarly, if the vaccine is targeting infants in the general population as a routine single dose, the vaccine demand increases to 160 million doses per year under the rapid introduction scenario. The demand forecast projected here is an upper bound estimate of vaccine demand, where actual demand depends on various factors such as country priorities, actual vaccine introduction, vaccination strategies, Gavi financing, costs, and overall product profile. Considering the potential role of TCV in typhoid control globally; manufacturers, policymakers, donors and financing bodies should work together to ensure vaccine access through sufficient production capacity, early WHO prequalification of the vaccine, continued Gavi financing and supportive policy.
Derivation of a formula to predict patient volume based on temperature at college football games.
Kman, Nicholas E; Russell, Gregory B; Bozeman, William P; Ehrman, Kevin; Winslow, James
2007-01-01
We sought to explore the relationship between temperature and spectator illness at Division I college football games by deriving a formula to predict the number of patrons seeking medical care based on the ambient temperature and attendance of the game. A retrospective review was conducted of medical records from 47 Division I college football games at two outdoor stadiums from 2001 through 2005. Any person presenting for medical care was counted as a patient seen. Weather data were collected from the National Weather Service. A binomial model was fit to the spectator illness records, using patients seen per attendance as the outcome measure and temperature as the predictor, and a formula was derived to estimate the number of patients needing medical attention from the temperature and the number of spectators in attendance: predicted number of patients = exp(-7.4383 - 0.24439·T + 0.0156032·T² - 0.000229196·T³) × number of spectators, where T is the ambient temperature in °C; all factors were highly significant (p < 0.0001). The model suggests that as the temperature rises, the number of patients seeking medical attention will also increase. For example, an increase in temperature from 20 to 21 °C raises predicted patient encounters from 3.64 to 4.05 visits per 10,000 in attendance (an 11% increase). These results show that temperature is an important variable to consider when determining the medical resources needed to care for spectators at outdoor football games. Our model may help providers predict the number of spectators presenting for medical care based on the forecasted temperature and predicted attendance.
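As a quick sanity check of the published coefficients, the formula can be evaluated directly; the sketch below (our illustration, not the authors' code) reproduces the 20 °C versus 21 °C example from the abstract.

```python
# Evaluating the derived binomial-model formula for expected patient
# volume at a given temperature (degrees C) and attendance.
# Coefficients are taken directly from the abstract above.
import math

def predicted_patients(temp_c: float, attendance: int) -> float:
    """Expected number of spectators seeking medical care."""
    rate = math.exp(-7.4383
                    - 0.24439 * temp_c
                    + 0.0156032 * temp_c ** 2
                    - 0.000229196 * temp_c ** 3)
    return rate * attendance

# Reproduces the abstract's example: ~3.64 vs ~4.05 visits per 10,000
# spectators as temperature rises from 20 to 21 degrees C.
print(predicted_patients(20.0, 10_000))
print(predicted_patients(21.0, 10_000))
```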
System-Level Experimentation: Executive Summary and Annotated Brief
2006-07-01
Two-Way Communication with a Single Quantum Particle.
Del Santo, Flavio; Dakić, Borivoje
2018-02-09
In this Letter we show that communication when restricted to a single information carrier (i.e., single particle) and finite speed of propagation is fundamentally limited for classical systems. On the other hand, quantum systems can surpass this limitation. We show that communication bounded to the exchange of a single quantum particle (in superposition of different spatial locations) can result in "two-way signaling," which is impossible in classical physics. We quantify the discrepancy between classical and quantum scenarios by the probability of winning a game played by distant players. We generalize our result to an arbitrary number of parties and we show that the probability of success is asymptotically decreasing to zero as the number of parties grows, for all classical strategies. In contrast, quantum strategy allows players to win the game with certainty.
Network formation: neighborhood structures, establishment costs, and distributed learning.
Chasparis, Georgios C; Shamma, Jeff S
2013-12-01
We consider the problem of network formation in a distributed fashion. Network formation is modeled as a strategic-form game, where agents represent nodes that form and sever unidirectional links with other nodes and derive utilities from these links. Furthermore, agents can form links only with a limited set of neighbors. Agents trade off the benefit from links, which is determined by a distance-dependent reward function, and the cost of maintaining links. When each agent acts independently, trying to maximize its own utility function, we can characterize “stable” networks through the notion of Nash equilibrium. In fact, the introduced reward and cost functions lead to Nash equilibria (networks), which exhibit several desirable properties such as connectivity, bounded-hop diameter, and efficiency (i.e., minimum number of links). Since Nash networks may not necessarily be efficient, we also explore the possibility of “shaping” the set of Nash networks through the introduction of state-based utility functions. Such utility functions may represent dynamic phenomena such as establishment costs (either positive or negative). Finally, we show how Nash networks can be the outcome of a distributed learning process. In particular, we extend previous learning processes to so-called “state-based” weakly acyclic games, and we show that the proposed network formation games belong to this class of games.
Cockpit automation - In need of a philosophy
NASA Technical Reports Server (NTRS)
Wiener, E. L.
1985-01-01
Concern has been expressed over the rapid development and deployment of automatic devices in transport aircraft, due mainly to the human interface and particularly the role of automation in inducing human error. The paper discusses the need for coherent philosophies of automation, and proposes several approaches: (1) flight management by exception, which states that as long as a crew stays within the bounds of regulations, air traffic control and flight safety, it may fly as it sees fit; (2) exceptions by forecasting, where the use of forecasting models would predict boundary penetration, rather than waiting for it to happen; (3) goal-sharing, where a computer is informed of overall goals, and subsequently has the capability of checking inputs and aircraft position for consistency with the overall goal or intentions; and (4) artificial intelligence and expert systems, where intelligent machines could mimic human reason.
Cropland Capture: A Game to Improve Global Cropland through Crowdsourcing
NASA Astrophysics Data System (ADS)
Fritz, Steffen; Sturn, Tobias; See, Linda; Perger, Christoph; Schill, Christian; McCallum, Ian; Schepaschenko, Dmitry; Karner, Mathias; Dueruer, Martina; Kraxner, Florian; Obersteiner, Michael
2014-05-01
Accurate and reliable global cropland extent maps are essential for estimating and forecasting crop yield, in particular losses due to drought and production anomalies. Major questions surrounding energy futures and environmental change (EU and US biofuel target setting, determination of greenhouse gas emissions, REDD initiatives, and implications of climate change on crop production and productivity patterns) also require reliable information on the spatial distribution of cropland as well as crop types. Although global land cover maps identify cropland (which exists as one or more land cover categories), this information is currently not accurate enough for many applications. There are several ways of improving current cropland extent maps, through hybrid approaches and by integrating information collected through Geo-Wiki (a global crowdsourcing platform) from very high resolution imagery such as that found on Google Earth. Another way of obtaining improved cropland extent maps would be to classify all very high resolution images found on Google Earth and to create a wall-to-wall map of cropland. This is a very ambitious task that would require a large number of individuals, like those found in massive multiplayer online games. For this reason we have developed a game called 'Cropland Capture'. The game can be played on a desktop, on a tablet (iPad or Android) or on a mobile phone (iPhone or Android), and the game mechanics are very simple: the player is provided with a satellite image or in-situ photo and must determine whether or not the image contains cropland. The game was launched in the middle of November 2013 and will run for 6 months, after which the weekly winners will be entered into a draw to win large prizes. To date we have collected responses for more than 2.5 million areas, and we will continue to expand the sample to more locations around the world. Eventually the data will be used to calibrate and validate a new version of our global cropland map, the latest version of which is available from http://beta-hybrid.geo-wiki.org. If we find, however, that a large number of people participate in the game, we will aim to make wall-to-wall cropland maps for those countries where no national maps exist. This paper will present an overview of the game and a summary of the crowdsourced data from the game, including information about quality and user performance. If successful, this gaming approach could be used to gather information about other land cover types in the future in order to improve global land cover information more generally.
Approximate N-Player Nonzero-Sum Game Solution for an Uncertain Continuous Nonlinear System.
Johnson, Marcus; Kamalapurkar, Rushikesh; Bhasin, Shubhendu; Dixon, Warren E
2015-08-01
An approximate online equilibrium solution is developed for an N-player nonzero-sum game subject to continuous-time nonlinear unknown dynamics and an infinite-horizon quadratic cost. A novel actor-critic-identifier structure is used, wherein a robust dynamic neural network (NN) asymptotically identifies the uncertain system with additive disturbances, and a set of critic and actor NNs approximate the value functions and equilibrium policies, respectively. The weight update laws for the actor NNs are generated using a gradient-descent method, and those for the critic NNs by least-squares regression, both based on a modified Bellman error that is independent of the system dynamics. A Lyapunov-based stability analysis shows that uniformly ultimately bounded tracking is achieved, and a convergence analysis demonstrates that the approximate control policies converge to a neighborhood of the optimal solutions. The actor, critic, and identifier structures are implemented in real time, continuously and simultaneously. Simulations on two- and three-player games illustrate the performance of the developed method.
Indigenous cultural contexts for STEM experiences: snow snakes' impact on students and the community
NASA Astrophysics Data System (ADS)
Miller, Brant G.; Roehrig, Gillian
2018-03-01
Opportunities for American Indian youth to meaningfully engage in school-based science, technology, engineering, and mathematics (STEM) experiences have historically been inadequate. As a consequence, American Indian students perform lower on standardized assessments of science education than their peers. In this article we describe the emergence of meaning for students—as well as their community—resulting from Indigenous culturally-based STEM curriculum that used an American Indian tradition as a focal context. Specifically, the game of snow snakes (Gooneginebig in Ojibwe) afforded an opportunity for STEM and culturally-based resources to work in unison. A case study research design was used, with the bounded case represented by the community associated with the snow snake project. The research question guiding this study was: what forms of culturally relevant meaning do students and the community form as a result of the snow snake game? Results indicate evidence of increased student and community engagement through culturally-based STEM experiences in the form of active participation and the rejuvenation of a traditional game. Implications are discussed for using culturally-based contexts for STEM learning.
Equilibria of an epidemic game with piecewise linear social distancing cost.
Reluga, Timothy C
2013-10-01
Around the world, infectious disease epidemics continue to threaten people's health. When epidemics strike, we often respond by changing our behaviors to reduce our risk of infection. This response is sometimes called "social distancing." Since behavior changes can be costly, we would like to know the optimal social distancing behavior. But the benefits of changes in behavior depend on the course of the epidemic, which itself depends on our behaviors. Differential population game theory provides a method for resolving this circular dependence. Here, I present the analysis of a special case of the differential SIR epidemic population game with social distancing, where the relative infection rate is linear but bounded below by zero. Equilibrium solutions are constructed in closed form for an open-ended epidemic. Constructions are also provided for epidemics that are stopped by the deployment of a vaccination that becomes available a fixed time after the start of the epidemic. This can be used to anticipate a window of opportunity during which mass vaccination can significantly reduce the cost of an epidemic.
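To make the setup concrete, here is a minimal numerical sketch under our own assumptions (a toy piecewise-linear distancing response and illustrative parameter values, not Reluga's closed-form equilibrium construction):

```python
# SIR epidemic in which a social-distancing investment c reduces the
# relative infection rate sigma(c) = max(1 - m*c, 0) -- piecewise linear
# and bounded below by zero, as in the abstract. The run trades the final
# attack rate against the cumulative cost of distancing.
def simulate(beta=0.5, gamma=0.2, m=2.0, c=0.1, T=200.0, dt=0.1):
    S, I, R = 0.999, 0.001, 0.0
    distancing_cost = 0.0
    sigma = max(1.0 - m * c, 0.0)      # relative infection rate under effort c
    for _ in range(int(T / dt)):
        recov = gamma * I * dt
        new_inf = sigma * beta * S * I * dt
        S -= new_inf
        I += new_inf - recov
        R += recov
        distancing_cost += c * S * dt  # susceptibles pay to distance
    return R, distancing_cost          # final size and total distancing cost

for c in (0.0, 0.1, 0.2):
    final_size, cost = simulate(c=c)
    print(f"effort={c:.1f}  attack rate={final_size:.3f}  distancing cost={cost:.3f}")
```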
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yishen; Zhou, Zhi; Liu, Cong
2016-08-01
As more wind power and other renewable resources are being integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a more reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factors and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty; however, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.
Marshall Installs Receiving Antenna for Next-Generation Weather Satellites
2016-12-16
Technicians assemble a hefty segment of a new antenna system in this 30-second time-lapse video captured Dec. 16 at NASA's Marshall Space Flight Center. The high-performance ground station is designed to receive meteorological and space weather data from instruments flown on the National Oceanic and Atmospheric Administration's new, game-changing Geostationary Operational Environmental Satellite series. The six-meter dish antenna near Building 4316 expands the capacity of Marshall’s Earth Science Office to use real-time GOES observations for studies of Earth and to deliver new forecasting, warning and disaster response tools to partners around the world. (NASA/MSFC)
NASA Astrophysics Data System (ADS)
Meißner, Dennis; Klein, Bastian; Ionita, Monica; Hemri, Stephan; Rademacher, Silke
2017-04-01
Inland waterway transport (IWT) is an important commercial sector significantly vulnerable to hydrological impacts. River ice and floods limit the availability of the waterway network and may cause considerable damage to waterway infrastructure. Low flows significantly affect IWT's operating efficiency, usually several months a year, due to the close correlation of (low) water levels / water depths and (high) transport costs. Therefore "navigation-related" hydrological forecasts focussing on the specific requirements of water-bound transport (relevant forecast locations, target parameters, skill characteristics etc.) play a major role in mitigating IWT's vulnerability to hydro-meteorological impacts. In light of continuing transport growth within the European Union, hydrological forecasts for the waterways are also essential to encourage more consistent use of the free capacity that IWT still offers. An overview of the current operational and pre-operational forecasting systems for the German waterways, predicting water levels, discharges and river ice thickness on various time-scales, will be presented. While short-term (deterministic) forecasts have a long tradition in navigation-related forecasting, (probabilistic) forecasting services offering extended lead times are not yet well established and are still subject to current research and development activities (e.g. within the EU projects EUPORIAS and IMPREX). The focus is on improving technical aspects as well as on exploring adequate ways of disseminating and communicating probabilistic forecast information. For the German stretch of the River Rhine, one of the most frequented inland waterways worldwide, the existing deterministic forecast scheme has been extended by ensemble forecasts combined with statistical post-processing modules applying EMOS (Ensemble Model Output Statistics) and ECC (Ensemble Copula Coupling) in order to generate water level predictions up to 10 days ahead and to estimate the associated predictive uncertainty properly. Additionally, for the key locations on the international waterways Rhine, Elbe and Danube, three competing forecast approaches are currently tested in a pre-operational set-up in order to generate monthly to seasonal (up to 3 months) forecasts: (1) the well-known Ensemble Streamflow Prediction approach (ensemble based on historical meteorology), (2) coupling hydrological models with post-processed outputs from ECMWF's general circulation model (System 4), and (3) a purely statistical approach based on the stable relationship (teleconnection) of global or regional oceanic, climate and hydrological data with river flows. The current, still pre-operational results reveal the existence of valuable predictability of water levels and streamflow at monthly up to seasonal time-scales along the larger rivers used as waterways in Germany. Last but not least, insight will be given into the technical set-up of the aforementioned forecasting systems operated at the Federal Institute of Hydrology, which are based on a Delft-FEWS application, focussing on the step-wise extension of the former system by integrating new components in order to meet the growing needs of the customers and to improve and extend the forecast portfolio for waterway users.
Optimality of semiquantum nonlocality in the presence of high inconclusive rates
Lim, Charles Ci Wen
2016-02-01
Quantum nonlocality is a counterintuitive phenomenon that lies beyond the purview of causal influences. Recently, Bell inequalities have been generalized to the case of quantum inputs, leading to a powerful family of semiquantum Bell inequalities that are capable of detecting any entangled state. We focus on a different problem and investigate how the local indistinguishability of quantum inputs and postselection may affect the requirements to detect semiquantum nonlocality. Moreover, we consider a semiquantum nonlocal game based on locally indistinguishable qubit inputs, and derive its postselected local and quantum bounds by using a connection to the local distinguishability of quantum states. Interestingly, we find that the postselected local bound is independent of the measurement efficiency, and the achievable postselected Bell violation increases with decreasing measurement efficiency.
Multiobjective optimization techniques for structural design
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
Multiobjective programming techniques are important in the design of complex structural systems whose quality depends generally on a number of different and often conflicting objective functions which cannot be combined into a single design objective. The applicability of multiobjective optimization techniques is studied with reference to simple design problems. Specifically, the parameter optimization of a cantilever beam with a tip mass and a three-degree-of-freedom vibration isolation system and the trajectory optimization of a cantilever beam are considered. The solutions of these multicriteria design problems are attempted by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It has been observed that the game theory approach required the maximum computational effort, but it yielded better optimum solutions with proper balance of the various objective functions in all the cases.
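For illustration, here is a minimal sketch of one of the listed techniques, the global criterion method; the two toy objectives and all parameter values are our own stand-ins, not the paper's beam formulation.

```python
# Global criterion method: first minimize each objective separately to
# obtain the utopia values, then minimize the summed squared relative
# deviation from that utopia point.
from scipy.optimize import minimize

def f1(x):  # stand-in for, e.g., structural mass
    return (x[0] - 1.0) ** 2 + x[1] ** 2 + 2.0

def f2(x):  # stand-in for, e.g., tip deflection
    return x[0] ** 2 + (x[1] - 2.0) ** 2 + 1.0

f1_star = minimize(f1, [0.0, 0.0]).fun   # individual optimum of f1
f2_star = minimize(f2, [0.0, 0.0]).fun   # individual optimum of f2

def global_criterion(x):
    # Sum of squared relative deviations from the utopia point.
    return ((f1(x) - f1_star) / f1_star) ** 2 + \
           ((f2(x) - f2_star) / f2_star) ** 2

result = minimize(global_criterion, [0.5, 0.5])
print("compromise design variables:", result.x)
```

The compromise solution lands between the two individual optima, which is the intended behavior of the method when the objectives conflict.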
A probabilistic neural network based approach for predicting the output power of wind turbines
NASA Astrophysics Data System (ADS)
Tabatabaei, Sajad
2017-03-01
Reliable tools for quantifying the uncertainty of wind speed forecasts are increasingly required as the penetration of wind power sources grows. Traditional models that generate only point forecasts are no longer sufficient. Thus, the present paper uses the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. Because the forecasting errors cannot be modelled properly by standard probability distributions, this paper uses a recently introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs. In the proposed LUBE method, a PI-combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison to other methods, this formulation better satisfies the PI coverage probability and PI normalised average width (PINAW) criteria. Since this non-linear problem is highly complex, a new heuristic-based optimisation algorithm incorporating a novel modification is introduced to solve it. Based on data sets taken from a wind farm in Australia, the feasibility and satisfactory performance of the suggested method are demonstrated.
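The two interval-quality criteria named above have standard definitions; the sketch below (our illustration, with synthetic arrays) computes both for a set of lower/upper bound forecasts.

```python
# PI coverage probability (PICP) and PI normalised average width (PINAW)
# for LUBE-style lower/upper bound forecasts. All data here is synthetic.
import numpy as np

def picp(y, lower, upper):
    """Fraction of observations falling inside their prediction interval."""
    return np.mean((y >= lower) & (y <= upper))

def pinaw(y, lower, upper):
    """Average interval width, normalised by the range of the targets."""
    return np.mean(upper - lower) / (y.max() - y.min())

rng = np.random.default_rng(0)
y = rng.normal(10.0, 2.0, 500)                   # synthetic wind power
forecast = y + rng.normal(0.0, 0.8, 500)         # imperfect point forecast
lower, upper = forecast - 1.0, forecast + 1.0    # candidate interval bounds
print(f"PICP = {picp(y, lower, upper):.3f}, PINAW = {pinaw(y, lower, upper):.3f}")
```

A good interval model pushes PICP up toward its nominal confidence level while keeping PINAW small; the two pull in opposite directions, which is why both are reported.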
The causal link among militarization, economic growth, CO2 emission, and energy consumption.
Bildirici, Melike E
2017-02-01
This paper examines the long-run and causal relationships among CO2 emissions, militarization, economic growth, and energy consumption for the USA over the period 1960-2013. Using the bounds test approach to cointegration, a short-run as well as a long-run relationship among the variables was found, with a positive and statistically significant relationship between CO2 emissions and militarization. To determine the causal links, MWALD and Rao's F tests were applied. According to Rao's F tests, there is evidence of unidirectional causality running from militarization to CO2 emissions, from energy consumption to CO2 emissions, and from militarization to energy consumption, all without feedback. Further, the results showed that 26% of the forecast-error variance of CO2 emissions was explained by the forecast-error variance of militarization and 60% by energy consumption.
Kumaraswamy autoregressive moving average models for double bounded environmental data
NASA Astrophysics Data System (ADS)
Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme
2017-12-01
In this paper we introduce the Kumaraswamy autoregressive moving average models (KARMA), which is a dynamic class of models for time series taking values in the double bounded interval (a,b) following the Kumaraswamy distribution. The Kumaraswamy family of distributions is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to environmental real data is presented and discussed.
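For reference, the standard Kumaraswamy density on (0,1) and its closed-form median, the quantity the KARMA dynamic structure models through a link function g, can be written as follows (a schematic of the model class in our notation, not the paper's exact formulation):

```latex
% Kumaraswamy density on (0,1) and its closed-form median
f(y;\alpha,\beta)=\alpha\beta\,y^{\alpha-1}\left(1-y^{\alpha}\right)^{\beta-1},
\quad 0<y<1,
\qquad
\operatorname{median}(y)=\left(1-2^{-1/\beta}\right)^{1/\alpha}
% Schematic KARMA-type dynamic structure for the conditional median \mu_t
% (our notation): link g, regressors x_t, AR weights \varphi_i, MA terms \theta_j
g(\mu_t)=\mathbf{x}_t^{\top}\boldsymbol{\beta}
+\sum_{i=1}^{p}\varphi_i\left[g(y_{t-i})-\mathbf{x}_{t-i}^{\top}\boldsymbol{\beta}\right]
+\sum_{j=1}^{q}\theta_j\,r_{t-j}
```

where the r_{t-j} are moving-average error terms; series on a general interval (a,b) are handled by linearly rescaling to (0,1).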
Computational Fluid Dynamics: Algorithms and Supercomputers
1988-03-01
[Only fragments of this report's text were recovered. They include a reference to Pulliam, T., and Steger, J., "Implicit Finite Difference Simulations of Three Dimensional Compressible Flow," AIAA Journal, Vol. 18, No. 2; a discussion of asymptotic performance as M approaches infinity with N bounded, versus actual performance when M is finite and N varies (with a note about the CYBER); and part of a benchmark table listing application, vectorized fraction, and two speedup figures: PARTICLE-IN-CELL (digits partly illegible), WEATHER FORECAST 98% 3.77 3.55, SEISMIC MIGRATION 98% 3.85 3.45, MONTE CARLO 99% 3.85 3.75, LATTICE GAUGE 100% 4.00 3.77.]
Investigation of geomagnetic field forecasting and fluid dynamics of the core
NASA Technical Reports Server (NTRS)
Benton, E. R. (Principal Investigator)
1981-01-01
The magnetic determination of the depth of the core-mantle boundary using MAGSAT data is discussed. Refinements to the approach of using the pole-strength of Earth to evaluate the radius of the Earth's core-mantle boundary are reported. The downward extrapolation through the electrically conducting mantle was reviewed. Estimates of an upper bound for the time required for Earth's liquid core to overturn completely are presented. High order analytic approximations to the unsigned magnetic flux crossing the Earth's surface are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Toomey, Bridget
Evolving power systems with increasing levels of stochasticity create a need to solve optimal power flow problems with large quantities of random variables. Weather forecasts, electricity prices, and shifting load patterns introduce higher levels of uncertainty and can yield optimization problems that are difficult to solve in an efficient manner. Solution methods for single chance constraints in optimal power flow problems have been considered in the literature, ensuring single constraints are satisfied with a prescribed probability; however, joint chance constraints, ensuring multiple constraints are simultaneously satisfied, have predominantly been solved via scenario-based approaches or by utilizing Boole's inequality as an upper bound. In this paper, joint chance constraints are used to solve an AC optimal power flow problem while preventing overvoltages in distribution grids under high penetrations of photovoltaic systems. A tighter version of Boole's inequality is derived and used to provide a new upper bound on the joint chance constraint, and simulation results are shown demonstrating the benefit of the proposed upper bound. The new framework allows for a less conservative and more computationally efficient solution to considering joint chance constraints, specifically regarding preventing overvoltages.
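To see why Boole's inequality gives a (conservative) upper bound: P(any of m constraints violated) ≤ Σᵢ P(constraint i violated), so splitting a joint risk budget ε uniformly across constraints guarantees joint satisfaction with probability at least 1 - ε. A minimal sketch, with Gaussian voltage uncertainty and hypothetical numbers of our own choosing:

```python
# Enforcing a joint chance constraint on m Gaussian-perturbed voltage
# limits via Boole's inequality: allocate eps/m risk per constraint and
# tighten each nominal limit by the corresponding Gaussian quantile.
from scipy.stats import norm

eps = 0.05      # allowed joint violation probability
m = 10          # number of voltage constraints (hypothetical)
sigma = 0.01    # std dev of each voltage under PV uncertainty (p.u.)
v_max = 1.05    # nominal upper voltage limit (p.u.)

# Per-constraint tightening so that P(v_i > v_max) <= eps/m for each i;
# by Boole's inequality the joint violation probability is then <= eps.
margin = norm.ppf(1.0 - eps / m) * sigma
print(f"each nominal voltage must satisfy v_i <= {v_max - margin:.4f} p.u.")
```

The paper's contribution, a tighter version of Boole's inequality, reduces exactly this margin, which is what makes the resulting dispatch less conservative.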
CMB bispectrum, trispectrum, non-Gaussianity, and the Cramer-Rao bound
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamionkowski, Marc; Smith, Tristan L.; Heavens, Alan
Minimum-variance estimators for the parameter f_nl that quantifies local-model non-Gaussianity can be constructed from the cosmic microwave background (CMB) bispectrum (three-point function) and also from the trispectrum (four-point function). Some have suggested that a comparison between the estimates for the values of f_nl from the bispectrum and trispectrum allows a consistency test for the model. But others argue that the saturation of the Cramer-Rao bound (which gives a lower limit to the variance of an estimator) by the bispectrum estimator implies that no further information on f_nl can be obtained from the trispectrum. Here, we elaborate the nature of the correlation between the bispectrum and trispectrum estimators for f_nl. We show that the two estimators become statistically independent in the limit of a large number of CMB pixels, and thus that the trispectrum estimator does indeed provide additional information on f_nl beyond that obtained from the bispectrum. We explain how this conclusion is consistent with the Cramer-Rao bound. Our discussion of the Cramer-Rao bound may be of interest to those doing Fisher-matrix parameter-estimation forecasts or data analysis in other areas of physics as well.
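For readers outside CMB analysis, the bound in question has the standard Fisher form; schematically (our notation, not the paper's exact expression):

```latex
\mathrm{Var}\!\left(\hat{f}_{\mathrm{nl}}\right)\;\ge\;\frac{1}{F},
\qquad
F \;=\; \sum_{\ell_1\le\ell_2\le\ell_3}
\frac{\left(\partial B_{\ell_1\ell_2\ell_3}/\partial f_{\mathrm{nl}}\right)^{2}}
{\mathrm{Var}\!\left(B_{\ell_1\ell_2\ell_3}\right)}
```

where B is the CMB bispectrum; an estimator that saturates this bound attains the minimum variance achievable from the bispectrum alone, which is the premise of the debate the abstract addresses.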
Posner, A; Hesse, M; St Cyr, O C
2014-04-01
Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near-term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations. Key points: the manuscript assesses current and near-future space weather assets; current assets are unreliable for forecasting of severe geomagnetic storms; near-future assets will not improve the situation.
NASA Astrophysics Data System (ADS)
Mishra, Abhilash; Hirata, Christopher M.
2018-05-01
In the first paper of this series, we showed that the CMB quadrupole at high redshifts results in a small circular polarization of the emitted 21 cm radiation. In this paper we forecast the sensitivity of future radio experiments to measure the CMB quadrupole during the era of first cosmic light (z ~ 20). The tomographic measurement of 21 cm circular polarization allows us to construct a 3D remote quadrupole field. Measuring the B-mode component of this remote quadrupole field can be used to put bounds on the tensor-to-scalar ratio r. We make Fisher forecasts for a future Fast Fourier Transform Telescope (FFTT), consisting of an array of dipole antennas in a compact grid configuration, as a function of array size and observation time. We find that a FFTT with a side length of 100 km can achieve σ(r) ~ 4 × 10^-3 after ten years of observation and with a sky coverage f_sky ~ 0.7. The forecasts are dependent on the evolution of the Lyman-α flux in the pre-reionization era, which remains observationally unconstrained. Finally, we calculate the typical orders of magnitude for circular polarization foregrounds and comment on their mitigation strategies. We conclude that detection of primordial gravitational waves with 21 cm observations is in principle possible, so long as the primordial magnetic field amplitude is small, but would require a very futuristic experiment with corresponding advances in calibration and foreground suppression techniques.
Multiobjective optimization in structural design with uncertain parameters and stochastic processes
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
The application of multiobjective optimization techniques to structural design problems involving uncertain parameters and random processes is studied. The design of a cantilever beam with a tip mass subjected to a stochastic base excitation is considered for illustration. Several of the problem parameters are assumed to be random variables and the structural mass, fatigue damage, and negative of natural frequency of vibration are considered for minimization. The solution of this three-criteria design problem is found by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It is observed that the game theory approach is superior in finding a better optimum solution, assuming the proper balance of the various objective functions. The procedures used in the present investigation are expected to be useful in the design of general dynamic systems involving uncertain parameters, stochastic processes, and multiple objectives.
Complexity analysis of dual-channel game model with different managers' business objectives
NASA Astrophysics Data System (ADS)
Li, Ting; Ma, Junhai
2015-01-01
This paper considers a dual-channel game model with bounded rationality, using the theory of bifurcations of dynamical systems. The business objectives of the retailers are assumed to be different, which is closer to reality than in previous studies. We study the local stability region of the Nash equilibrium point and find that business objectives can expand the stable region and play an important role in price strategy. One interesting finding is that fiercer competition tends to stabilize the Nash equilibrium. Simulations show the complex behavior of the two-dimensional dynamic system: we find period-doubling bifurcations and chaos. We measure the performance of the model in different periods using average profit as an index. The results show that unstable behavior in an economic system is often an unfavorable outcome. This paper therefore discusses the application of an adaptive adjustment mechanism when the model exhibits chaotic behavior, which allows the retailers to eliminate the negative effects.
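The period-doubling route described here can be reproduced in a few lines with a toy bounded-rationality adjustment rule (our own stand-in with hypothetical demand parameters, not the paper's model): each retailer nudges its price in proportion to its current marginal profit, and the adjustment speed alpha plays the role of the bifurcation parameter.

```python
# Toy dual-channel price game: linear demands with cross-channel
# substitution d; each boundedly rational retailer updates its price
# along its own marginal profit at speed alpha.
def step(p1, p2, alpha, a=10.0, b=2.0, d=0.8, c=1.0):
    q1 = a - b * p1 + d * p2          # demand faced by retailer 1
    q2 = a - b * p2 + d * p1
    dpi1 = q1 - b * (p1 - c)          # marginal profit d(pi1)/d(p1)
    dpi2 = q2 - b * (p2 - c)
    return p1 + alpha * p1 * dpi1, p2 + alpha * p2 * dpi2

for alpha in (0.10, 0.18, 0.23):      # adjustment speed = bifurcation knob
    p1 = p2 = 1.0
    orbit = []
    for t in range(1200):
        p1, p2 = step(p1, p2, alpha)
        if t >= 1196:
            orbit.append(round(p1, 3))
    # fixed point -> period-2 cycle -> chaotic band as alpha grows
    print(f"alpha={alpha:.2f}  late orbit of p1: {orbit}")
```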
Compensating for telecommunication delays during robotic telerehabilitation.
Consoni, Leonardo J; Siqueira, Adriano A G; Krebs, Hermano I
2017-07-01
Rehabilitation robotic systems may afford better care and telerehabilitation may extend the use and benefits of robotic therapy to the home. Data transmissions over distance are bound by intrinsic communication delays which can be significant enough to deem the activity unfeasible. Here we describe an approach that combines unilateral robotic telerehabilitation and serious games. This approach has a modular and distributed design that permits different types of robots to interact without substantial code changes. We demonstrate the approach through an online multiplayer game. Two users can remotely interact with each other with no force exchanges, while a smoothing and prediction algorithm compensates motions for the delay in the Internet connection. We demonstrate that this approach can successfully compensate for data transmission delays, even when testing between the United States and Brazil. This paper presents the initial experimental results, which highlight the performance degradation with increasing delays as well as improvements provided by the proposed algorithm, and discusses planned future developments.
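As an illustration of the kind of smoothing-and-prediction scheme described (our stand-in, not the authors' algorithm), the sketch below applies double exponential smoothing to the remote player's sampled position and extrapolates it across the measured one-way delay:

```python
# Double exponential (Holt) smoothing of a remote position stream,
# extrapolated forward by the current network delay. All parameter
# values and signal shapes here are illustrative.
import numpy as np

class DelayCompensator:
    def __init__(self, alpha=0.4, beta=0.2, dt=0.01):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.level = None   # smoothed position
        self.trend = 0.0    # smoothed per-sample velocity

    def update(self, x):
        if self.level is None:
            self.level = x
            return
        prev = self.level
        self.level = self.alpha * x + (1 - self.alpha) * (self.level + self.trend)
        self.trend = self.beta * (self.level - prev) + (1 - self.beta) * self.trend

    def predict(self, delay_s):
        # Extrapolate the smoothed trajectory across the delay.
        return self.level + (delay_s / self.dt) * self.trend

comp = DelayCompensator()
t = np.arange(0.0, 1.0, 0.01)
for x in np.sin(2 * np.pi * t):        # delayed position samples
    comp.update(x)
print("compensated position:", comp.predict(delay_s=0.15))
```

The smoothing suppresses sample jitter while the trend term supplies the short-horizon extrapolation; as the abstract's results suggest, such schemes degrade gracefully as the delay grows.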
An evolutionary model of cooperation, fairness and altruistic punishment in public good games.
Hetzer, Moritz; Sornette, Didier
2013-01-01
We identify and explain the mechanisms that account for the emergence of fairness preferences and altruistic punishment in voluntary contribution mechanisms by combining an evolutionary perspective together with an expected utility model. We aim at filling a gap between the literature on the theory of evolution applied to cooperation and punishment, and the empirical findings from experimental economics. The approach is motivated by previous findings on other-regarding behavior, the co-evolution of culture, genes and social norms, as well as bounded rationality. Our first result reveals the emergence of two distinct evolutionary regimes that force agents to converge either to a defection state or to a state of coordination, depending on the predominant set of self- or other-regarding preferences. Our second result indicates that subjects in laboratory experiments of public goods games with punishment coordinate and punish defectors as a result of an aversion against disadvantageous inequitable outcomes. Our third finding identifies disadvantageous inequity aversion as evolutionary dominant and stable in a heterogeneous population of agents endowed initially only with purely self-regarding preferences. We validate our model using previously obtained results from three independently conducted experiments of public goods games with punishment.
An Evolutionary Model of Cooperation, Fairness and Altruistic Punishment in Public Good Games
Hetzer, Moritz; Sornette, Didier
2013-01-01
We identify and explain the mechanisms that account for the emergence of fairness preferences and altruistic punishment in voluntary contribution mechanisms by combining an evolutionary perspective together with an expected utility model. We aim at filling a gap between the literature on the theory of evolution applied to cooperation and punishment, and the empirical findings from experimental economics. The approach is motivated by previous findings on other-regarding behavior, the co-evolution of culture, genes and social norms, as well as bounded rationality. Our first result reveals the emergence of two distinct evolutionary regimes that force agents to converge either to a defection state or to a state of coordination, depending on the predominant set of self- or other-regarding preferences. Our second result indicates that subjects in laboratory experiments of public goods games with punishment coordinate and punish defectors as a result of an aversion against disadvantageous inequitable outcomes. Our third finding identifies disadvantageous inequity aversion as evolutionary dominant and stable in a heterogeneous population of agents endowed initially only with purely self-regarding preferences. We validate our model using previously obtained results from three independently conducted experiments of public goods games with punishment. PMID:24260101
On a cost functional for H2/H(infinity) minimization
NASA Technical Reports Server (NTRS)
Macmartin, Douglas G.; Hall, Steven R.; Mustafa, Denis
1990-01-01
A cost functional is proposed and investigated which is motivated by minimizing the energy in a structure using only collocated feedback. Defined for an H(infinity)-norm bounded system, this cost functional also overbounds the H2 cost. Some properties of this cost functional are given, and preliminary results on the procedure for minimizing it are presented. The frequency domain cost functional is shown to have a time domain representation in terms of a Stackelberg non-zero sum differential game.
The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset
NASA Technical Reports Server (NTRS)
Zank, G. P.; Spann, J.
2014-01-01
We outline a plan to develop a physics-based predictive toolset, RISCS, to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 AU; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1) - 3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will a) predict solar energetic particle (SEP) intensities; b) predict spectra for protons and heavy ions; c) predict maximum energies and their duration; d) predict SEP composition; e) predict cosmic ray intensities; and f) predict plasma parameters, including shock arrival times, strength and obliquity, at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, that would be updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements.
Kwong, Jessica Y Y; Wong, Kin Fai Ellick; Tang, Suki K Y
2013-10-01
One of the conjectures in the affective forecasting literature is that people are advised to discount their anticipated emotions because their forecasts are often inaccurate. The present research distinguishes between emotional reactions to process versus those to outcome, and highlights an alternative view that affective misforecasts could indeed be adaptive to goal pursuit. Using an ultimatum game, Study 1 showed that people overpredicted how much they would regret and be disappointed by the amount of effort they exerted, should the outcomes turn out worse than expected; nonetheless, people could accurately predict their emotional responses to unfavorable outcomes per se. In a natural setting of a university examination, Study 2 demonstrated that actual regret and disappointment toward favorable outcomes were more intense than the level people expected, but this discrepancy was not observed in their emotional responses to the efforts they had invested. These two distinct patterns of results substantiate the argument that the deviation between predicted and actual emotions depends on the referents of the emotional reactions.
On some methods for assessing earthquake predictions
NASA Astrophysics Data System (ADS)
Molchan, G.; Romashkova, L.; Peresan, A.
2017-09-01
A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.
NASA Astrophysics Data System (ADS)
Wang, Yi; Wang, Jun; Xu, Xiaoguang; Henze, Daven K.; Wang, Yuxuan; Qu, Zhen
2016-09-01
SO2 emissions, the largest source of anthropogenic aerosols, can respond rapidly to economic and policy-driven changes. However, bottom-up SO2 inventories have inherent limitations owing to a latency of 24-48 months and a lack of month-to-month variation in emissions (especially in developing countries). This study develops a new approach that integrates Ozone Monitoring Instrument (OMI) SO2 satellite measurements and GEOS-Chem adjoint model simulations to constrain monthly anthropogenic SO2 emissions. The approach's effectiveness is demonstrated for 14 months in East Asia; the resultant posterior emissions not only capture a 20% SO2 emission reduction in Beijing during the 2008 Olympic Games but also improve agreement between modeled and in situ surface measurements. Further analysis reveals that posterior emission estimates, compared to the prior, lead to significant improvements in forecasting monthly surface and columnar SO2. With the pending availability of geostationary measurements of tropospheric composition, we show that it may soon be possible to rapidly constrain SO2 emissions and associated air quality predictions at fine spatiotemporal scales.
Parametrisation of initial conditions for seasonal stream flow forecasting in the Swiss Rhine basin
NASA Astrophysics Data System (ADS)
Schick, Simon; Rössler, Ole; Weingartner, Rolf
2016-04-01
Current climate forecast models show - to the best of our knowledge - low skill in forecasting climate variability in Central Europe at seasonal lead times. When it comes to seasonal stream flow forecasting, initial conditions thus play an important role. Here, initial conditions refer to the catchment's moisture state at the date of forecast, i.e. snow depth, stream flow and lake level, soil moisture content, and groundwater level. The parametrisation of these initial conditions can take place at various spatial and temporal scales; examples are the grid size of a distributed model or the time aggregation of predictors in statistical models. Therefore, the present study aims to investigate the extent to which the parametrisation of initial conditions at different spatial scales leads to differences in forecast errors. To do so, we conduct a forecast experiment for the Swiss Rhine at Basel, which covers parts of Germany, Austria, and Switzerland and is bounded to the south by the Alps. Seasonal mean stream flow is defined for time aggregations of 30, 60, and 90 days and forecasted at 24 dates within the calendar year, i.e. on the 1st and 16th day of each month. A regression model is employed because of the various anthropogenic effects on the basin's hydrology, which often are not quantifiable but might be captured by a simple black-box model. Furthermore, the pool of candidate predictors consists of antecedent temperature, precipitation, and stream flow only. This pragmatic approach reflects the fact that observations of variables relevant to hydrological storages are either scarce in space or time (soil moisture, groundwater level), restricted to certain seasons (snow depth), or restricted to certain regions (lake levels, snow depth). For a systematic evaluation, we therefore rely on the comprehensive archives of meteorological observations and reanalyses to estimate the initial conditions via climate variability prior to the date of forecast. The experiment itself is based on four different approaches, whose differences in model skill were estimated within a rigorous cross-validation framework for the period 1982-2013: (1) the predictands are regressed on antecedent temperature, precipitation, and stream flow, where temperature and precipitation are basin averages from the E-OBS gridded data set; (2) as in (1), but temperature and precipitation are used at the E-OBS grid scale (0.25 degrees in longitude and latitude) without spatial averaging; (3) as in (1), but the regression model is applied to 66 gauged subcatchments of the Rhine basin, and the forecasts for these subcatchments are then summed and upscaled to the area of the Rhine basin; (4) as in (3), but the forecasts at the subcatchment scale are additionally weighted by the hydrological representativeness of the corresponding subcatchment. A schematic of approach (1) is sketched below.
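The following minimal sketch (synthetic data and hypothetical coefficients of our own choosing) illustrates approach (1): an ordinary least-squares regression of seasonal mean flow on antecedent predictors, scored by leave-one-year-out cross-validation over 32 years.

```python
# Approach (1) in miniature: regress 30-day mean flow on standardised
# antecedent temperature, precipitation, and flow, with leave-one-year-out
# cross-validation. Data below is synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_years = 32                                  # 1982-2013
X = rng.normal(size=(n_years, 3))             # antecedent T, P, Q (standardised)
y = 1.2 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0.0, 0.5, n_years)

errors = []
for k in range(n_years):                      # leave one year out
    mask = np.arange(n_years) != k
    A = np.column_stack([np.ones(mask.sum()), X[mask]])
    coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
    pred = np.concatenate([[1.0], X[k]]) @ coef
    errors.append(y[k] - pred)
print("LOO-CV RMSE:", np.sqrt(np.mean(np.square(errors))))
```

Approaches (2)-(4) change only the spatial resolution at which X is assembled, which is exactly the parametrisation question the study evaluates.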
NASA Astrophysics Data System (ADS)
Beig, Gufran; Chate, Dilip M.; Ghude, Sachin. D.; Mahajan, A. S.; Srinivas, R.; Ali, K.; Sahu, S. K.; Parkhi, N.; Surendran, D.; Trimbake, H. R.
2013-12-01
In 2010, the XIX Commonwealth Games (CWG-2010) were held in India for the first time at Delhi and involved 71 commonwealth nations and dependencies with more than 6000 athletes participating in 272 events. This was the largest international multi-sport event to be staged in India and strict emission controls were imposed during the games in order to ensure improved air quality for the participating athletes as a significant portion of the population in Delhi is regularly exposed to elevated levels of pollution. The air quality control measures ranged from vehicular and traffic controls to relocation of factories and reduction of power plant emissions. In order to understand the effects of these policy induced control measures, a network of air quality and weather monitoring stations was set-up across different areas in Delhi under the Government of India's System of Air quality Forecasting And Research (SAFAR) project. Simultaneous measurements of aerosols, reactive trace gases (e.g. NOx, O3, CO) and meteorological parameters were made before, during and after CWG-2010. Contrary to expectations, the emission controls implemented were not sufficient to reduce the pollutants, instead in some cases, causing an increase. The measured pollutants regularly exceeded the National Ambient Air Quality limits over the games period. The reasons for this increase are attributed to an underestimation of the required control measures, which resulted in inadequate planning. The results indicate that any future air quality control measures need to be well planned and strictly imposed in order to improve the air quality in Delhi, which affects a large population and is deteriorating rapidly. Thus, the presence of systematic high resolution data and realistic emission inventories through networks such as SAFAR will be directly useful for the future.
NASA Astrophysics Data System (ADS)
Chen, Zhixiang; Fu, Bin
This paper is our third step towards developing a theory of testing monomials in multivariate polynomials and concentrates on two problems: (1) how to compute the coefficients of multilinear monomials; and (2) how to find a maximum multilinear monomial when the input is a ΠΣΠ polynomial. We first prove that the first problem is #P-hard and then devise an O*(3^n s(n)) upper bound for this problem for any polynomial represented by an arithmetic circuit of size s(n). Later, this upper bound is improved to O*(2^n) for ΠΣΠ polynomials. We then design fully polynomial-time randomized approximation schemes for this problem for ΠΣ polynomials. On the negative side, we prove that, even for ΠΣΠ polynomials with terms of degree ≤ 2, the first problem cannot be approximated at all for any approximation factor ≥ 1, nor "weakly approximated" in a much relaxed setting, unless P=NP. For the second problem, we first give a polynomial-time λ-approximation algorithm for ΠΣΠ polynomials with terms of degree no more than a constant λ ≥ 2. On the inapproximability side, we give an n^((1-ε)/2) lower bound, for any ε > 0, on the approximation factor for ΠΣΠ polynomials. When the degrees of the terms in these polynomials are constrained to ≤ 2, we prove a 1.0476 lower bound, assuming P ≠ NP, and a higher 1.0604 lower bound, assuming the Unique Games Conjecture.
Thompson, Robin N.; Gilligan, Christopher A.; Cunniffe, Nik J.
2016-01-01
We assess how presymptomatic infection affects predictability of infectious disease epidemics. We focus on whether or not a major outbreak (i.e. an epidemic that will go on to infect a large number of individuals) can be predicted reliably soon after initial cases of disease have appeared within a population. For emerging epidemics, significant time and effort is spent recording symptomatic cases. Scientific attention has often focused on improving statistical methodologies to estimate disease transmission parameters from these data. Here we show that, even if symptomatic cases are recorded perfectly, and disease spread parameters are estimated exactly, it is impossible to estimate the probability of a major outbreak without ambiguity. Our results therefore provide an upper bound on the accuracy of forecasts of major outbreaks that are constructed using data on symptomatic cases alone. Accurate prediction of whether or not an epidemic will occur requires records of symptomatic individuals to be supplemented with data concerning the true infection status of apparently uninfected individuals. To forecast likely future behavior in the earliest stages of an emerging outbreak, it is therefore vital to develop and deploy accurate diagnostic tests that can determine whether asymptomatic individuals are actually uninfected, or instead are infected but just do not yet show detectable symptoms. PMID:27046030
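A standard piece of background (textbook branching-process theory, not the paper's own derivation) makes the irreducible ambiguity concrete: in the simplest Markovian model, even with R0 known exactly, a single introduction sparks a major outbreak only with probability 1 - 1/R0, so early case counts alone cannot settle an outbreak's fate.

```python
# Major-outbreak probability in a simple Markovian branching-process
# model of early epidemic spread: each introduction independently
# escapes stochastic extinction with probability 1 - 1/R0 (for R0 > 1).
def major_outbreak_prob(r0: float, initial_cases: int = 1) -> float:
    if r0 <= 1.0:
        return 0.0                      # subcritical: extinction is certain
    return 1.0 - (1.0 / r0) ** initial_cases

for r0 in (0.8, 1.5, 2.0, 3.0):
    print(f"R0={r0:.1f}: P(major outbreak) = {major_outbreak_prob(r0):.3f}")
```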
NASA Astrophysics Data System (ADS)
Pande, Saket; Sharma, Ashish
2014-05-01
This study is motivated by the need to robustly specify, identify, and forecast runoff generation processes for hydroelectricity production. At a minimum, this requires identifying the significant predictors of runoff generation and the influence of each such predictor on the runoff response. To this end, we compare two non-parametric algorithms for predictor subset selection. One is based on information theory and assesses predictor significance (and hence selection) based on the Partial Information (PI) rationale of Sharma and Mehrotra (2014). The other algorithm is based on a frequentist approach that uses the bounds-on-probability-of-error concept of Pande (2005), assesses all possible predictor subsets on the go, and converges to a predictor subset in a computationally efficient manner. Both algorithms approximate the underlying system by locally constant functions and select predictor subsets corresponding to these functions. The performance of the two algorithms is compared on a set of synthetic case studies as well as a real-world case study of inflow forecasting. References: Sharma, A., and R. Mehrotra (2014), An information theoretic alternative to model a natural system using observational information alone, Water Resources Research, 49, doi:10.1002/2013WR013845. Pande, S. (2005), Generalized local learning in water resource management, PhD dissertation, Utah State University, UT-USA, 148p.
Air quality in Delhi during the Commonwealth Games
NASA Astrophysics Data System (ADS)
Marrapu, P.; Cheng, Y.; Beig, G.; Sahu, S.; Srinivas, R.; Carmichael, G. R.
2014-10-01
Air quality during the Commonwealth Games (CWG, held in Delhi in October 2010) is analyzed using a new air quality forecasting system established for the Games. The CWG stimulated enhanced efforts to monitor and model air quality in the region. The air quality of Delhi during the CWG had high levels of particles, with mean values of PM2.5 and PM10 at the venues of 111 and 238 μg m-3, respectively. Black carbon (BC) accounted for ~ 10% of the PM2.5 mass. It is shown that BC, PM2.5 and PM10 concentrations are well predicted, but with positive biases of ~ 25%. The diurnal variations are also well captured, with both the observations and the modeled values showing nighttime maxima and daytime minima. A new emissions inventory, developed as part of this air quality forecasting initiative, is evaluated by comparing the observed and predicted species-species correlations (i.e., BC : CO; BC : PM2.5; PM2.5 : PM10). Assuming that the observations at these sites are representative and that all the model errors are associated with the emissions, the modeled concentrations and slopes can be made consistent by scaling the emissions by 0.6 for NOx, 2 for CO, and 0.7 for BC, PM2.5, and PM10. The emission estimates for particles are remarkably good considering the uncertainty in the estimates due to the diverse spread of activities and technologies that take place in Delhi and the rapid rates of change. The contributions of various emission sectors, including transportation, power, domestic and industry, to surface concentrations are also estimated. Transport, domestic and industrial sectors all make significant contributions to PM levels in Delhi, and the sectoral contributions vary spatially within the city. Ozone levels in Delhi are elevated, with hourly values sometimes exceeding 100 ppb. The continued growth of the transport sector is expected to make ozone pollution a more pressing air pollution problem in Delhi. The sector analysis provides useful inputs into the design of strategies to reduce air pollution levels in Delhi. The contribution of sources outside of Delhi to Delhi air quality ranges from ~ 25% for BC and PM to ~ 60% for daytime ozone. The significant contributions from non-Delhi sources indicate that (as has been shown elsewhere) such strategies will also need a more regional perspective.
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Skøien, Jon Olav; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2016-04-01
Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework that can estimate post-processing parameters which differ in space and time, but still give a spatially and temporally consistent output. However, their method is computationally complex for our large number of stations and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to find a total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while ensuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative impact on the calibration error, but makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecast skill in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
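As a concrete reference point for the post-processing step, minimum-CRPS estimation of Gaussian EMOS parameters (Gneiting et al., 2005) can be sketched as follows. The quadratic shrinkage toward a neighbouring fit is only a crude stand-in for the spatial penalty described above, and the parameter names are ours.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_gaussian(y, mu, sigma):
    """Closed-form CRPS of a normal forecast N(mu, sigma^2) for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_emos(ens_mean, ens_var, obs, penalty=0.0, prior=None):
    """Minimum-CRPS EMOS: forecast N(a + b*mean, c + d*var). `penalty` shrinks
    the parameters toward `prior` (e.g. a neighbouring station's fit), a crude
    stand-in for the spatial penalty described in the abstract."""
    prior = np.zeros(4) if prior is None else np.asarray(prior)

    def loss(theta):
        a, b, c, d = theta
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))  # keep variance positive
        return (crps_gaussian(obs, a + b * ens_mean, sigma).mean()
                + penalty * np.sum((theta - prior) ** 2))

    return minimize(loss, x0=[0.0, 1.0, 1.0, 0.1], method="Nelder-Mead").x
```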
NASA Technical Reports Server (NTRS)
Posner, Arik; Hesse, Michael; St. Cyr, Chris
2014-01-01
Space weather forecasting critically depends upon the availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation of the assessment is the lack of information provided to us on radiation monitor performance, which prevented us from fully assessing radiation storm forecasting (i.e., its short-term component was not assessed). The assessment finds that at least two widely spaced coronagraphs, including one at L4, would provide reliable coverage of Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near-term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data on severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations.
The System Dynamics Research on the Private Cars' Amount in Beijing
NASA Astrophysics Data System (ADS)
Fan, Jie; Yan, Guang-Le
This paper analyzes the growth of the number of private cars in Beijing from the perspective of system dynamics. With a flow chart illustrating the relationships among the relevant elements, an SD model is built in VENSIM to simulate the future growth trend of private car ownership under the "Public Transportation First" policy, based on historical data for Beijing. The paper then discusses the forecast impact of the "single-and-double license plate number limit" on the number of city vehicles and private cars, under the assumption that this policy remains in force long after the 2008 Olympic Games. Finally, some recommendations are put forward for proper control of this problem.
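The VENSIM model itself is not reproduced in the abstract; the following stock-and-flow sketch, with invented parameters, only illustrates the simulation logic of a capacity-limited car stock and a plate-number restriction modelled as a cut in effective demand.

```python
import numpy as np

def simulate(years=20, cars0=3.0, capacity=8.0, growth=0.12, restriction=0.0):
    """Euler-integrated stock of private cars (millions): demand grows toward a
    carrying capacity set by road space; `restriction` cuts effective demand."""
    cars = [cars0]
    for _ in range(years):
        demand = growth * cars[-1] * (1 - cars[-1] / capacity)
        cars.append(cars[-1] + demand * (1 - restriction))
    return np.array(cars)

baseline = simulate()
limited = simulate(restriction=0.5)   # hypothetical plate-number limit
print(baseline[-1].round(2), limited[-1].round(2))
```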
Determinants of public cooperation in multiplex networks
NASA Astrophysics Data System (ADS)
Battiston, Federico; Perc, Matjaž; Latora, Vito
2017-07-01
Synergies between evolutionary game theory and statistical physics have significantly improved our understanding of public cooperation in structured populations. Multiplex networks, in particular, provide the theoretical framework within network science that allows us to mathematically describe the rich structure of interactions characterizing human societies. While research has shown that multiplex networks may enhance the resilience of cooperation, the interplay between the overlap in the structure of the layers and the control parameters of the corresponding games has not yet been investigated. With this aim, we consider here the public goods game on a multiplex network, and we unveil the role of the number of layers and the overlap of links, as well as the impact of different synergy factors in different layers, on the onset of cooperation. We show that enhanced public cooperation emerges only when a significant edge overlap is combined with at least one layer being able to sustain some cooperation by means of a sufficiently high synergy factor. In the absence of either of these conditions, the evolution of cooperation in multiplex networks is determined by the bounds of traditional network reciprocity with no enhanced resilience. These results caution against overly optimistic predictions that the presence of multiple social domains may in itself promote cooperation, and they help us better understand the complexity behind prosocial behavior in layered social systems.
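A minimal Monte Carlo sketch conveys the basic machinery of a public goods game on a two-layer lattice with layer-specific synergy factors and Fermi-rule imitation. It is simplified relative to the full model (one group per player rather than all overlapping groups, and all parameters invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
L, layers, K = 20, 2, 0.5                    # lattice size, layers, selection noise
r = [3.8, 5.0]                               # synergy factor per layer
s = rng.integers(0, 2, size=(layers, L, L))  # strategy per layer: 1 = cooperate

def payoff_in_layer(l, x, y):
    """Payoff of (x, y) in layer l from the single PGG group centred on it
    (von Neumann neighbourhood, periodic boundaries)."""
    nbrs = [(x, y), ((x+1) % L, y), ((x-1) % L, y), (x, (y+1) % L), (x, (y-1) % L)]
    pool = sum(s[l, i, j] for i, j in nbrs)
    return r[l] * pool / len(nbrs) - s[l, x, y]

def total_payoff(x, y):
    return sum(payoff_in_layer(l, x, y) for l in range(layers))

moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
for _ in range(20000):                       # asynchronous Fermi-rule imitation
    l = rng.integers(layers)
    x, y = rng.integers(L, size=2)
    dx, dy = moves[rng.integers(4)]
    nx, ny = (x + dx) % L, (y + dy) % L
    gap = total_payoff(x, y) - total_payoff(nx, ny)
    if rng.random() < 1.0 / (1.0 + np.exp(gap / K)):
        s[l, x, y] = s[l, nx, ny]            # imitate the neighbour in layer l

print("cooperator fraction per layer:", s.mean(axis=(1, 2)))
```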
NASA Astrophysics Data System (ADS)
Wu, Q.
2013-12-01
The MM5-SMOKE-CMAQ model system, developed around the United States Environmental Protection Agency (U.S. EPA) Models-3 framework, has been used for daily air quality forecasts at the Beijing Municipal Environmental Monitoring Center (Beijing MEMC), as part of the Ensemble Air Quality Forecast System for Beijing (EMS-Beijing), since the Olympic Games year 2008. In this study, we collect the daily CMAQ forecast results for the whole year 2010 for model evaluation. The results show that the model performs well on most days but clearly underestimates some air pollution episodes. A typical episode from 11-20 January 2010 was chosen, during which the air pollution index (API) of particulate matter (PM10) observed by Beijing MEMC reached 180 while the predicted PM10-API was about 100. Taking into account all stations in Beijing, both urban and suburban, three numerical methods are used for model improvement: first, the inner domain with 4 km grids is enlarged from Beijing alone to an area including its surrounding cities; second, the Beijing stationary area emission inventory is updated from the statistical county level to the village-town level, providing more detailed spatial information for area emissions; third, industrial point emissions in Beijing's surrounding cities are added (the latter two are both emission improvements). As a result, the peak of the PM10-API averaged over the nine national standard stations, simulated by CMAQ as a daily hindcast, reaches 160, much closer to the observations. The new results show better model performance: the correlation coefficient is 0.93 for the national standard station average and 0.84 over all stations, and the relative error is 15.7% for the national standard station average and 27% over all stations.
Modeling Extra-Long Tsunami Propagation: Assessing Data, Model Accuracy and Forecast Implications
NASA Astrophysics Data System (ADS)
Titov, V. V.; Moore, C. W.; Rabinovich, A.
2017-12-01
Detecting and modeling tsunamis propagating tens of thousands of kilometers from the source is a formidable scientific challenge and would seem to satisfy only scientific curiosity. However, the results of such analyses produce valuable insight into tsunami propagation dynamics and model accuracy, and provide important implications for tsunami forecasting. The Mw = 9.3 megathrust earthquake of December 26, 2004 off the coast of Sumatra generated a tsunami that devastated Indian Ocean coastlines and spread into the Pacific and Atlantic oceans. The tsunami was recorded by a great number of coastal tide gauges, including those located 15-25 thousand kilometers from the source area. To date, it is still the farthest instrumentally detected tsunami. The data from these instruments throughout the world oceans enabled the estimation of various statistical parameters and the energy decay of this event. High-resolution records of this tsunami from DARTs 32401 (offshore of northern Chile), 46405 and NeMO (both offshore of the US West Coast), combined with mainland tide gauge measurements, enabled us to examine far-field characteristics of the 2004 tsunami in the Pacific Ocean and to compare the results of global numerical simulations with the observations. Despite their small heights (less than 2 cm at deep-ocean locations), the records demonstrated consistent spatial and temporal structure. The numerical model described well the frequency content, amplitudes and general structure of the observed waves at deep-ocean and coastal gages. We present an analysis of the measurements and a comparison with model data to discuss the implications for tsunami forecast accuracy. A model study at such extreme distances from the tsunami source, and at extra-long times after the event, is an attempt to find accuracy bounds for tsunami models and the accuracy limitations of model use for forecasting. We discuss the results in application to tsunami model forecasting and tsunami modeling in general.
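For scale, simple long-wave arithmetic (not taken from the paper) shows why such basin-crossing propagation takes the better part of a day or more:

```python
import math

# Long (shallow-water) waves travel at c = sqrt(g*h), so a tsunami crossing
# abyssal ocean of ~4000 m depth moves at roughly jet-aircraft speed.
g, depth_m = 9.81, 4000.0
c = math.sqrt(g * depth_m)                               # ~198 m/s
print(f"speed: {c:.0f} m/s = {3.6 * c:.0f} km/h")        # ~713 km/h
print(f"time for 20,000 km: {2.0e7 / c / 3600:.1f} h")   # ~28 h
```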
A fully-online Neuro-Fuzzy model for flow forecasting in basins with limited data
NASA Astrophysics Data System (ADS)
Ashrafi, Mohammad; Chua, Lloyd Hock Chye; Quek, Chai; Qin, Xiaosheng
2017-02-01
Current state-of-the-art online neuro-fuzzy models (NFMs), such as DENFIS (Dynamic Evolving Neural-Fuzzy Inference System), have been used for runoff forecasting. Online NFMs adopt a local learning approach and are able to adapt to changes continuously. The DENFIS model, however, requires upper/lower bounds for normalization, and its number of rules increases monotonically. This makes the model unsuitable for use in basins with limited data, since a priori data are required. In order to address this and other drawbacks of current online models, the Generic Self-Evolving Takagi-Sugeno-Kang (GSETSK) model is adopted in this study for forecast applications in basins with limited data. GSETSK is a fully-online NFM which updates its structure and parameters based on the most recent data. The model does not require historical data and adopts clustering and rule-pruning techniques to generate a compact and up-to-date rule base. GSETSK was used in two forecast applications: rainfall-runoff (a catchment in Sweden) and river routing (Lower Mekong River). Each application was studied under two scenarios: (i) no prior data are available, and (ii) only limited data are available (1 year for the Swedish catchment and 1 season for the Mekong River). For the Swedish basin, GSETSK results were compared to available results from a calibrated HBV (Hydrologiska Byråns Vattenbalansavdelning) model. For the Mekong River, GSETSK results were compared against the URBS (Unified River Basin Simulator) model. Both comparisons showed that results from GSETSK are comparable with those of the physically based models, which were calibrated with historical data. Thus, even though GSETSK was trained with a very limited dataset in comparison with HBV or URBS, similar results were achieved. Further comparisons of GSETSK with DENFIS and RBF (Radial Basis Function) models highlighted an additional advantage of GSETSK: its rule base (in contrast to the opaque RBF) is more compact, up-to-date, and more easily interpretable.
Kohler, Stefan
2013-01-01
Zimbabwean villagers of distinct background have resettled in government-organized land reforms for more than three decades. Against this backdrop, I assess the level of social cohesion in some of the newly established communities by estimating the average preferences for fairness in a structural model of bounded rationality. The estimations are based on behavioral data from an ultimatum game field experiment played by 234 randomly selected households in 6 traditional and 14 resettled villages almost two decades after resettlement. Equal or higher degrees of fairness are estimated in all resettlement schemes. In one, or arguably two, out of three distinct resettlement schemes studied, the resettled villagers exhibit significantly higher degrees of fairness (p ≤ 0.11) and rationality (p ≤ 0.04) than those who live in traditional villages. Overall, villagers appear similarly rational, but the attitude toward fairness is significantly stronger in resettled communities (p ≤ 0.01). These findings are consistent with the idea of an increased need for cooperation required in recommencement. PMID:23724095
NASA Astrophysics Data System (ADS)
Mai, Yazong
2017-12-01
In the context of the upcoming implementation of the environmental tax policy, there is a need to focus on the relationship between government regulation and corporate emissions. To achieve the real effect of the environmental tax policy, the government needs to regulate the illegal emissions of enterprises. Based on the hypothesis of bounded rationality, this paper analyses the strategy sets of government regulators and polluting enterprises in the implementation of the environmental tax policy. Using an evolutionary game model, the utility functions and payoff matrix of both sides are constructed, and the evolution of strategies is analysed against the environmental governance targets and the actual profits of the stakeholders. Wrong behaviours can thus be corrected so that the equilibrium of the evolutionary system is approached gradually, yielding the evolutionarily stable strategies of the government and the polluting enterprises under the environmental tax policy.
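The paper's utility functions are not reproduced in the abstract; the sketch below only illustrates the standard machinery of two-population replicator dynamics for a regulator-polluter inspection game, with payoffs invented for the example.

```python
# Hypothetical payoffs: regulation costs the government C and recovers fines F
# from polluters; compliance costs a firm A; polluting risks the fine F under
# regulation. None of these values come from the paper.
C, F, A = 1.0, 3.0, 2.0

def step(x, y, dt=0.01):
    """One Euler step of two-population replicator dynamics:
    x = share of governments that regulate, y = share of firms that comply."""
    u_reg = -C + F * (1 - y)          # fines collected from the polluting share
    u_not = 0.0
    u_comply = -A
    u_pollute = -F * x                # expected fine under regulation
    x += dt * x * (1 - x) * (u_reg - u_not)
    y += dt * y * (1 - y) * (u_comply - u_pollute)
    return x, y

x, y = 0.3, 0.2
for _ in range(5000):
    x, y = step(x, y)
print(round(x, 3), round(y, 3))  # trajectories cycle around the mixed point (A/F, 1-C/F)
```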
Evolution of Cooperation in Public Goods Games
NASA Astrophysics Data System (ADS)
Xia, Cheng-Yi; Zhang, Juan-Juan; Wang, Yi-Ling; Wang, Jin-Song
2011-10-01
We investigate the evolution of cooperation in evolutionary public goods games with finite populations, where four pure strategies are considered: cooperators, defectors, punishers, and loners who are unwilling to participate. By adopting approximate best-response dynamics, we show that the magnitude of rationality not only quantitatively explains the experimental results in [Nature (London) 425 (2003) 390], but also heavily influences the evolution of cooperation. Compared with previous results for infinite populations, which yield two equilibria, we show that only a single equilibrium exists, and that a relevantly high value of bounded rationality will sustain cooperation. In addition, we find that the loner's payoff plays an active role in the maintenance of cooperation, which is warranted only for low and moderate values of the loner's payoff. The effects of rationality and the loner's payoff thus jointly shape cooperation. Finally, we highlight the important result that the introduction of voluntary participation and punishment greatly facilitates cooperation.
Serious-game for water resources management adaptation training to climatic changes
NASA Astrophysics Data System (ADS)
Leroy, Eve; Saulnier, Georges-Marie
2013-04-01
Water resources access is a main issue for territorial development and for ensuring environmental and human well-being. Indeed, sustainable development is vulnerable to water availability, and climate change may affect the quantity and timing of the water resources available for anthropogenic uses. How then to adapt, how to change water management rules and practices, and how to involve stakeholders in such a process? To prevent water scarcity situations, which may generate conflicts and impacts on ecosystems, it is important to think about a sustainable development in which anthropogenic water uses are in good balance with forecasted water resources availability. This implies raising awareness and involving stakeholders in sustainable water management. Stakeholders have to think about future territorial development taking into account climate change impacts on water resources. Collaboration between scientists and stakeholders is essential to ensure consistent climate change knowledge and proper identification of the anthropogenic uses, tensions and stakes of the territory. However, sharing information on complex questions such as climate change, hydro-meteorological modeling and practical constraints may be a difficult task. Therefore, to contribute to an easier debate and to the overall training of all interested actors, a serious game about water management was built. The serious game uses complex scientific models with real data, but through a simple and playful web-game interface. The advantage of this interface is that it may help stakeholders, citizens or other target groups improve their understanding of the impacts of climate change on water resources and raise their awareness of the need for sustainable water management, while using state-of-the-art knowledge. The principle of the game is simple. The gamer is the mayor of a city and has to manage water withdrawals from hydro-systems, water distribution and consumption, water treatment, etc. At the same time, a clock is running and climate change occurs on the territory, which impacts the water resources. The gamer has to deal with this evolution and try to help the municipality grow. If water is managed well, the city can develop. Conversely, wrong decisions may generate water, energy or food scarcities, which cause the city to decline. A first version of this game, still under development, has been built. It makes use of data from a famous French ski resort, the Megève municipality. A demo of this game will be presented. Through a playful approach, the serious game helps to discuss essential but strained topics between stakeholders, scientists and citizens. It may be considered a useful tool for decision support and for explaining a complex topic. It is also hoped that this approach offers new ways of collaborating with stakeholders to approach complex situations and find the best paths for future water management.
A Study of The Effect of Demand Uncertainty for Low-Carbon Products Using a Newsvendor Model
Qu, Shaojian; Zhou, Yongyi
2017-01-01
This paper studies the effect of uncertain demand on a low-carbon product by using a newsvendor model. With two different kinds of market scales, we examine a game whereby a manufacturer produces and delivers a single new low-carbon product to a single retailer. The retailer observes the demand information and places an order before the selling season. We find in the game that if the retailer shares truthful forecast information with the manufacturer (as opposed to sharing distorted information, or none at all), the manufacturer will set a low (respectively high) wholesale price through the sequence of events. In addition, as a policy-maker, the government offers a per-unit subsidy on sales of the low-carbon product. The manufacturer creates a new contract with a rebate for the retailer. We also introduce the consumer aversion coefficient and the truth coefficient as qualitative variables in our model to study the order quantity, pricing, and expected profit of the supply chain members. The research shows that uncertain demand has a major effect on the new low-carbon product; we therefore suggest that the retailer share more truthful information with the manufacturer. PMID:29068382
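The specific contract is not detailed in the abstract, but the textbook newsvendor logic underneath is the critical fractile: order up to the demand quantile where the unit underage cost balances the unit overage cost. A small sketch (all parameter values invented):

```python
from scipy.stats import norm

# Classic newsvendor critical fractile (textbook result, not the paper's
# specific contract): order q* = F^{-1}(cu / (cu + co)).
price, wholesale, salvage = 10.0, 6.0, 2.0
cu = price - wholesale        # profit lost per unit of unmet demand
co = wholesale - salvage      # loss per unsold unit
ratio = cu / (cu + co)        # = 2/3 here

mu, sigma = 100.0, 20.0       # forecast demand; truthful sharing narrows sigma
q_star = norm.ppf(ratio, loc=mu, scale=sigma)
print(round(ratio, 3), round(q_star, 1))   # 0.667, ~108.6
```

Truthful forecast sharing effectively reduces sigma, pulling the order toward mean demand and raising the expected profit of the supply chain.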
Woody, Carol Ann; Johnson, D.H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.
2007-01-01
Counting towers provide an accurate, low-cost, low-maintenance, low-technology, and easily mobilized escapement estimation program compared to other methods (e.g., weirs, hydroacoustics, mark-recapture, and aerial surveys) (Thompson 1962; Siebel 1967; Cousens et al. 1982; Symons and Waldichuk 1984; Anderson 2000; Alaska Department of Fish and Game 2003). Counting tower data have been found to be consistent with digital video counts (Edwards 2005). Counting towers do not interfere with natural fish migration patterns, nor are fish handled or stressed; however, their use is generally limited to clear rivers that meet specific site selection criteria. The data provided by counting tower sampling allow fishery managers to determine reproductive population size, estimate total return (escapement + catch) and its uncertainty, evaluate population productivity and trends, set harvest rates, determine spawning escapement goals, and forecast future returns (Alaska Department of Fish and Game 1974-2000 and 1975-2004). The number of spawning fish is determined by subtracting subsistence harvest, sport-caught fish, and prespawn mortality from the total estimated escapement. The methods outlined in this protocol for tower counts can be used to provide reasonable estimates (±6-10%) of reproductive salmon population size and run timing in clear rivers.
Forecasting approaches to the Mekong River
NASA Astrophysics Data System (ADS)
Plate, E. J.
2009-04-01
Hydrologists distinguish between flood forecasts, which are concerned with events of the immediate future, and flood predictions, which are concerned with events that are possible, but whose date of occurrence is not determined. Although in principle both involve the determination of runoff from rainfall, the analytical approaches differ because of different objectives. The differences between the two approaches will be discussed, starting with an analysis of the forecasting process. The Mekong River in south-east Asia is used as an example. Prediction is defined as a forecast for a hypothetical event, such as the 100-year flood, which is usually sufficiently specified by its magnitude and its probability of occurrence. It forms the basis for designing flood protection structures and risk management activities. The method for determining these quantities is hydrological modeling combined with extreme value statistics, today usually applied both to rainfall events and to observed river discharges. A rainfall-runoff model converts extreme rainfall events into extreme discharges, which are calibrated against observed discharges at certain gage points along the river. The quality of the model output is assessed against the mean value by means of the Nash-Sutcliffe quality criterion. The result of this procedure is a design hydrograph (or a family of design hydrographs), which serves as input to a hydraulic model that converts the hydrograph into design water levels according to the hydraulic situation of the location. The accuracy of a prediction in this sense is not particularly high: hydrologists know that the 100-year flood is a statistical quantity which can be estimated only within comparatively wide error bounds, and the hydraulics of a river site, in particular under conditions of heavy sediment loads, has many uncertainties. Safety margins, such as additional freeboard, are provided to compensate for the uncertainty of the prediction. Forecasts, on the other hand, have as their objective an accurate hydrograph of the near future. The method by means of which this is done is not as important as the accuracy of the forecast. A mathematical rainfall-runoff model is not necessarily a good forecast model; it has to be very carefully designed, and in many cases statistical models are found to give better results than mathematical models. Forecasters have the advantage of knowing the course of the hydrographs up to the point in time at which forecasts have to be made. Therefore, models can be calibrated online against the hydrograph of the immediate past. To assess the quality of a forecast, the quality criterion should not be based on the mean value, as the Nash-Sutcliffe criterion is, but on the best forecast given the information available up to the forecast time. Without any additional information, the best forecast when only the present-day value is known is a no-change scenario, i.e. assuming that the present value does not change in the immediate future. For the Mekong there exists a forecasting system based on a rainfall-runoff model operated by the Mekong River Commission. This model is found not to be adequate for forecasting for periods longer than one or two days ahead. Improvements are sought through two approaches: a strictly deterministic rainfall-runoff model, and a strictly statistical model based on regression with upstream stations.
The two approaches are compared, and suggestions are made on how best to combine the advantages of both. This requires that due consideration be given to critical hydraulic conditions of the river at and between the gauging stations. Critical situations occur in two ways: when the river overtops its banks, in which case the rainfall-runoff model is incomplete unless overflow losses are considered, and at confluences with tributaries. Of particular importance is the role of the large Tonle Sap Lake, which dampens the hydrograph downstream of Phnom Penh. The effect of these components of river hydraulics on forecasting accuracy will be assessed.
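The benchmark argument is easy to make concrete: the Nash-Sutcliffe efficiency scores a forecast against the observed mean, while a persistence index scores it against the no-change forecast available at forecast time. A minimal sketch of both criteria:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE: skill relative to the mean of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def persistence_index(obs, sim, lead=1):
    """Skill relative to the no-change (persistence) forecast obs[t - lead]."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    naive = obs[:-lead]                   # forecast: the value at forecast time
    return 1 - np.sum((obs[lead:] - sim[lead:]) ** 2) / np.sum((obs[lead:] - naive) ** 2)
```

On a smoothly varying hydrograph, persistence is hard to beat at short lead times, so the persistence index is the stricter test of forecast skill.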
Future trends in computer waste generation in India.
Dwivedy, Maheshwar; Mittal, R K
2010-11-01
The objective of this paper is to estimate the future projection of computer waste in India and subsequently to analyze its flow at the end of the useful phase. For this purpose, the study utilizes the logistic model-based approach proposed by Yang and Williams to forecast future trends in computer waste. The model projects the future computer penetration rate from the first-lifespan distribution and historical sales data. A bounding analysis of the future carrying capacity was simulated using the three-parameter logistic curve. The obsolete generation quantities obtained from the extrapolated penetration rates are then used to model the disposal phase. The results of the bounding analysis indicate that in the year 2020, around 41-152 million units of computers will become obsolete. The obsolete computer generation quantities are then used to estimate the End-of-Life outflows by means of a time-series multiple lifespan model. Even a conservative estimate of the future recycling capacity of PCs reaches upwards of 30 million units during 2025, and more than 150 million units could potentially be recycled in the upper-bound case. Considering significant future investment in the e-waste recycling sector from all stakeholders in India, we propose logistic growth in the recycling rate and estimate the required recycling capacity at between 60 and 400 million units for the lower- and upper-bound cases in 2025. Finally, we compare the future obsolete PC generation amounts of the US and India.
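The bounding analysis can be illustrated as follows (synthetic data and invented parameters, not the paper's): fit the logistic curve with the carrying capacity pinned to a low and a high value, then read off the projected penetration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Three-parameter logistic penetration curve."""
    return K / (1 + np.exp(-r * (t - t0)))

t = np.arange(1996, 2009)
p = logistic(t, 30.0, 0.3, 2012.0)   # synthetic "observed" penetration series

# Bounding analysis: refit with the carrying capacity K pinned low and high,
# then compare the projected penetration in 2020.
for K_fixed in (10.0, 40.0):
    (r_fit, t0_fit), _ = curve_fit(lambda tt, r, t0: logistic(tt, K_fixed, r, t0),
                                   t, p, p0=[0.3, 2012.0])
    print(K_fixed, round(float(logistic(2020.0, K_fixed, r_fit, t0_fit)), 1))
```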
NASA Technical Reports Server (NTRS)
Smith, Stephan B.; Pace, David; Goodman, Steven J.; Burgess, Donald W.; Smarsh, David; Roberts, Rita D.; Wolfson, Marilyn M.; Goodman, H. Michael (Technical Monitor)
2001-01-01
Thunderstorms are high impact weather phenomena. They also pose an extremely challenging forecast problem. The National Oceanic and Atmospheric Administration (NOAA), the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA), and the Air Force Weather Agency (AFWA) have decided to pool technology and scientific expertise into an unprecedented effort to better observe, diagnose, and forecast thunderstorms. This paper describes plans for an operational field test called the THunderstorm Operational Research (THOR) Project beginning in 2002, the primary goals of which are to: 1) reduce the number of thunderstorm-related air traffic delays within the National Airspace System (NAS), and 2) improve severe thunderstorm, tornado and airport thunderstorm warning accuracy and lead time. Aviation field operations will be focused on the prime air traffic bottleneck in the NAS, the airspace bounded roughly by Chicago, New York City and Washington D.C., sometimes called the Northeast Corridor. A variety of new automated thunderstorm forecasting applications will be tested here that, when implemented into FAA-NWS operations, will allow for better tactical decision making and NAS management during thunderstorm days. Severe thunderstorm operations will be centered on Northern Alabama. NWS meteorologists from the forecast office in Birmingham will test the utility of experimental lightning, radar, and profiler data from a mesoscale observing network being established by NASA's Marshall Space Flight Center. In addition, new tornado detection and thunderstorm nowcasting algorithms will be examined for their potential to improve warning accuracy. The Alabama THOR site will also serve as a test bed for new gridded, digital thunderstorm and flash flood warning products.
Agur, Z; Safriel, U N
1981-07-01
Since the opening of the Suez Canal, more than 120 Red Sea species have colonized the eastern Mediterranean, whereas fewer than 10 Mediterranean species have colonized the Red Sea. For most of the species involved in this colonization, the mode of dispersal from the source to the colonized area is through free-drifting propagules. In order to examine whether the current regime of the Suez Canal may be involved in this asymmetry in colonization, a mathematical hydraulic model that forecasts the direction and velocity of water currents through the year, along the length of the Canal, was utilized. The movements of free-floating propagules occurring at either entrance of the Canal were simulated on a computer, and it was found that the completion of a Mediterranean-bound passage of Red Sea propagules is far faster and much more likely than the completion of a Red Sea-bound passage of Mediterranean propagules.
NASA Astrophysics Data System (ADS)
Pallotta, M.; Herdies, D. L.; Gonçalves, L. G.
2013-05-01
There is nowadays a growing interest in the influence and impacts of weather and climate on human life. The analysis of weather conditions shows the utility of this type of tool when applied to sports: these conditions act as a differential in strategy and training, especially for outdoor sports. This study aimed to develop weather forecasts and thermal comfort evaluations targeted at sports, in the hope that the results can be used in the development of products and weather services for the 2016 Olympic Games in Rio de Janeiro. Weather forecasting applied to sport proved effective for the Rio de Janeiro City Marathon, especially owing to the high spatial resolution. The WRF simulations for the three marathons studied showed good results for temperature, atmospheric pressure, and relative humidity. On the other hand, the wind forecasts showed a pattern of overestimating the real situation in all cases. It was concluded that the WRF model provides, in general, more representative simulations from 36 hours in advance, and that with 18 hours of integration they were even better, efficiently describing the synoptic situation that would be found. A review of weather conditions and thermal comfort at specific points of the marathon route showed significant differences between the stages of the marathon, which makes it possible to plan competition strategy around thermal comfort. It was concluded that a relationship exists between more thermally comfortable (uncomfortable) conditions and the best (worst) times in the Rio de Janeiro City Marathon.
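The comfort measure used in the study is not specified in the abstract; one simple proxy is Thom's discomfort index, DI = T - 0.55(1 - 0.01 RH)(T - 14.5), with T in °C and RH in %:

```python
def thom_discomfort_index(t_c, rh_pct):
    """Thom's discomfort index: a simple thermal comfort proxy (the study's
    actual comfort measure is not given in the abstract)."""
    return t_c - 0.55 * (1 - 0.01 * rh_pct) * (t_c - 14.5)

# Compare two hypothetical stages of a marathon route:
print(round(thom_discomfort_index(26.0, 80.0), 1))  # humid seafront: ~24.7
print(round(thom_discomfort_index(24.0, 60.0), 1))  # drier inland stretch: ~21.9
```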
A platform for evolving intelligently interactive adversaries.
Fogel, David B; Hays, Timothy J; Johnson, Douglas R
2006-07-01
Entertainment software developers face significant challenges in designing games with broad appeal. One of the challenges concerns creating nonplayer (computer-controlled) characters that can adapt their behavior in light of the current and prospective situation, possibly emulating human behaviors. This adaptation should be inherently novel, unrepeatable, yet within the bounds of realism. Evolutionary algorithms provide a suitable method for generating such behaviors. This paper provides background on the entertainment software industry, and details a prior and current effort to create a platform for evolving nonplayer characters with genetic and behavioral traits within a World War I combat flight simulator.
NASA Technical Reports Server (NTRS)
Spann, James F.; Zank, G.
2014-01-01
We outline a plan to develop and transition a physics-based predictive toolset called The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 Astronomical Units; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1)-3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will predict a) solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) maximum energies and their durations; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength and obliquity, at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, updated hourly thereafter to improve the predicted event(s) and narrow the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements. The described transition plan is based on a well-established approach developed in the Earth Science discipline that ensures that the customer has a tool that meets their needs.
When does "economic man" dominate social behavior?
Camerer, Colin F; Fehr, Ernst
2006-01-06
The canonical model in economics considers people to be rational and self-regarding. However, much evidence challenges this view, raising the question of when "Economic Man" dominates the outcome of social interactions, and when bounded rationality or other-regarding preferences dominate. Here we show that strategic incentives are the key to answering this question. A minority of self-regarding individuals can trigger a "noncooperative" aggregate outcome if their behavior generates incentives for the majority of other-regarding individuals to mimic the minority's behavior. Likewise, a minority of other-regarding individuals can generate a "cooperative" aggregate outcome if their behavior generates incentives for a majority of self-regarding people to behave cooperatively. Similarly, in strategic games, aggregate outcomes can be either far from or close to Nash equilibrium if players with high degrees of strategic thinking mimic or erase the effects of others who do very little strategic thinking. Recently developed theories of other-regarding preferences and bounded rationality explain these findings and provide better predictions of actual aggregate behavior than does traditional economic theory.
Reputation-Based Investment Helps to Optimize Group Behaviors in Spatial Lattice Networks.
Ding, Hong; Cao, Lin; Ren, Yizhi; Choo, Kim-Kwang Raymond; Shi, Benyun
2016-01-01
Encouraging cooperation among selfish individuals is crucial in many real-world systems, where individuals' collective behaviors can be analyzed using the evolutionary public goods game. Along this line, extensive studies have shown that reputation is an effective mechanism for investigating the evolution of cooperation. In most existing studies, participating individuals in a public goods game are assumed to contribute unconditionally to the public pool, or they can choose partners based on a common reputation standard (e.g., preferences or characters). However, assigning one reputation standard to all individuals is impractical in many real-world deployments. In this paper, we introduce a reputation tolerance mechanism that allows an individual to select its potential partners and decide whether or not to contribute an investment to the public pool based on its tolerance of other individuals' reputations. Specifically, an individual takes part in a public goods game only if the number of participants with higher reputation exceeds the value of its tolerance. Moreover, an individual's reputation can increase or decrease within a bounded interval based on its historical behaviors. We explore how the reputation tolerance and conditional investment mechanisms affect the evolution of cooperation in spatial lattice networks. Our simulation results demonstrate that a larger tolerance value can create an environment that promotes the cooperation of participants.
Self-Coexistence among IEEE 802.22 Networks: Distributed Allocation of Power and Channel.
Sakin, Sayef Azad; Razzaque, Md Abdur; Hassan, Mohammad Mehedi; Alamri, Atif; Tran, Nguyen H; Fortino, Giancarlo
2017-12-07
Ensuring self-coexistence among IEEE 802.22 networks is a challenging problem owing to opportunistic access of incumbent-free radio resources by users in co-located networks. In this study, we propose a fully-distributed non-cooperative approach to ensure self-coexistence in downlink channels of IEEE 802.22 networks. We formulate the self-coexistence problem as a mixed-integer non-linear optimization problem for maximizing the network data rate, which is NP-hard. This work explores a sub-optimal solution by dividing the optimization problem into downlink channel allocation and power assignment sub-problems. Considering fairness, quality of service and minimum interference for customer premises equipment, we also develop a greedy algorithm for channel allocation and a non-cooperative game-theoretic framework for near-optimal power allocation. The base stations of the networks are treated as players in a game, where they try to increase spectrum utilization by controlling power and reaching a Nash equilibrium point. We further develop a utility function for the game to increase the data rate by minimizing the transmission power and, subsequently, the interference from neighboring networks. A theoretical proof of the existence and uniqueness of the Nash equilibrium is presented. Performance improvements in terms of data rate with a degree of fairness, compared to a cooperative branch-and-bound-based algorithm and a non-cooperative greedy approach, are shown through simulation studies. PMID:29215591
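The paper's exact utility function is not reproduced in the abstract; the sketch below shows the generic shape of such a scheme, best-response iteration for a power game in which each base station trades rate against a transmission-power cost (all parameters invented):

```python
import numpy as np

# Best-response iteration for a non-cooperative power game (generic sketch,
# not the paper's utility): each base station i maximizes
#   u_i = log2(1 + g_ii p_i / (noise + sum_{j != i} g_ij p_j)) - cost * p_i.
rng = np.random.default_rng(1)
n, noise, cost, p_max = 4, 1e-3, 2.0, 1.0
G = rng.uniform(0.01, 0.1, (n, n)) + np.diag(rng.uniform(0.5, 1.0, n))
p = np.full(n, 0.5)

for _ in range(100):                     # iterate toward a Nash equilibrium
    for i in range(n):
        interference = noise + G[i] @ p - G[i, i] * p[i]
        # Closed-form best response from d(u_i)/d(p_i) = 0, clipped to [0, p_max]:
        br = 1 / (cost * np.log(2)) - interference / G[i, i]
        p[i] = np.clip(br, 0.0, p_max)

print(p.round(3))
```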
Detection of communities with Naming Game-based methods
Ribeiro, Carlos Henrique Costa
2017-01-01
Complex networks are often organized in groups or communities of agents that share the same features and/or functions, and this structural organization is built naturally with the formation of the system. In social networks, we argue that the dynamics of linguistic interactions of agreement among people can be a crucial factor in generating this community structure, given that sharing opinions with another person bonds them together, and disagreeing constantly would probably weaken the relationship. We present here a computational model of opinion exchange that uncovers the community structure of a network. Our aim is not to present a new community detection method proper, but to show how a model of social communication dynamics can reveal the (simple and overlapping) community structure in an emergent way. Our model is based on a standard Naming Game, but takes into consideration three social features: trust, uncertainty and opinion preference, which are built over time as agents communicate among themselves. We show that the separate addition of each social feature to the Naming Game results in gradual improvements with respect to community detection. In addition, the resulting uncertainty and trust values classify nodes and edges according to role and position in the network. Our model has also shown a degree of accuracy, both for non-overlapping and overlapping communities, that is comparable with most algorithms specifically designed for topological community detection. PMID:28797097
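For reference, the standard Naming Game that the model extends can be stated in a few lines (this sketch omits the trust, uncertainty and preference features described above):

```python
import random

def naming_game(edges, n, steps=10000, seed=0):
    """Minimal standard Naming Game: a random edge supplies speaker and hearer;
    on failure the hearer learns the spoken name, on success both agents
    collapse their vocabularies to it."""
    random.seed(seed)
    vocab = [{i} for i in range(n)]      # each agent starts with a unique name
    for _ in range(steps):
        a, b = random.choice(edges)
        if random.random() < 0.5:        # assign speaker/hearer roles at random
            a, b = b, a
        word = random.choice(tuple(vocab[a]))
        if word in vocab[b]:
            vocab[a], vocab[b] = {word}, {word}   # success: local agreement
        else:
            vocab[b].add(word)                    # failure: hearer learns it
    return vocab
```

On networks with dense communities joined by few bridges, distinct names typically persist within communities over long transients, which is the signal such detection methods exploit.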
Improved Vote Aggregation Techniques for the Geo-Wiki Cropland Capture Crowdsourcing Game
NASA Astrophysics Data System (ADS)
Baklanov, Artem; Fritz, Steffen; Khachay, Michael; Nurmukhametov, Oleg; Salk, Carl; See, Linda; Shchepashchenko, Dmitry
2016-04-01
Crowdsourcing is a new approach for solving data processing problems for which conventional methods appear to be inaccurate, expensive, or time-consuming. Nowadays, the development of new crowdsourcing techniques is mostly motivated by so-called Big Data problems, including problems of assessment and clustering for large datasets obtained in aerospace imaging, remote sensing, and even in social network analysis. By involving volunteers from all over the world, the Geo-Wiki project tackles problems of environmental monitoring with applications to flood resilience, biomass data analysis and classification of land cover. For example, the Cropland Capture Game, a gamified version of Geo-Wiki, was developed to aid in the mapping of cultivated land, and was used to gather 4.5 million image classifications of the Earth's surface. More recently, the Picture Pile game, a more generalized version of Cropland Capture, aims to identify tree loss over time from pairs of very high resolution satellite images. Despite recent progress in image analysis, the solution to these problems is hard to automate, since human experts still outperform the majority of machine learning algorithms and artificial systems on certain image recognition tasks. The replacement of rare and expensive experts by a team of distributed volunteers seems promising, but this approach leads to challenging questions such as: how can individual opinions be aggregated optimally, how can confidence bounds be obtained, and how can the unreliability of volunteers be dealt with? In this paper, on the basis of several known machine learning techniques, we propose a technical approach to improve the overall performance of the majority-voting decision rule used in the Cropland Capture Game. The proposed approach increases the estimated consistency with expert opinion from 77% to 86%.
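The paper's aggregation technique is not detailed in the abstract; a simple EM-flavoured improvement on plain majority voting, reweighting each volunteer by estimated reliability, can be sketched as follows (the structure and smoothing choices are ours):

```python
def weighted_majority(votes, iters=10):
    """votes: dict (item, volunteer) -> 0/1 label. Iteratively re-estimate each
    volunteer's reliability from agreement with the current consensus, then
    re-weight the vote: a simple EM-flavoured refinement of plain majority."""
    items = {i for i, _ in votes}
    volunteers = {v for _, v in votes}
    rel = {v: 0.6 for v in volunteers}           # mild initial trust
    consensus = {}
    for _ in range(iters):
        for i in items:                          # weighted vote per item
            score = sum((rel[v] if lab == 1 else -rel[v])
                        for (it, v), lab in votes.items() if it == i)
            consensus[i] = int(score > 0)
        for v in volunteers:                     # reliability = agreement rate
            hits = [lab == consensus[i] for (i, vv), lab in votes.items() if vv == v]
            rel[v] = (sum(hits) + 1) / (len(hits) + 2)   # Laplace smoothing
    return consensus, rel

votes = {("img1", "a"): 1, ("img1", "b"): 0, ("img1", "c"): 1,
         ("img2", "a"): 0, ("img2", "b"): 1, ("img2", "c"): 0}
print(weighted_majority(votes))
```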
Analysis and Forecasting of Shoreline Position
NASA Astrophysics Data System (ADS)
Barton, C. C.; Tebbens, S. F.
2007-12-01
Analysis of historical shoreline positions on sandy coasts and in the geologic record, together with study of sea-level rise curves, reveals that the dynamics of the underlying processes produce temporal/spatial signals that exhibit power-law scaling and are therefore self-affine fractals. Self-affine time series signals can be quantified over many orders of magnitude in time and space in terms of persistence, a measure of the degree of correlation between adjacent values in the stochastic portion of a time series. Fractal statistics developed for self-affine time series are used to forecast a probability envelope bounding future shoreline positions. The envelope provides the standard deviation as a function of three variables: the persistence, a constant equal to the value of the power spectral density when 1/period equals 1, and the number of time increments. The persistence of a twenty-year time series of mean-high-water (MHW) shoreline positions was measured for four profiles surveyed at Duck, NC at the Field Research Facility (FRF) by the U.S. Army Corps of Engineers. The four MHW shoreline time series are self-affine with persistence ranging between 0.8 and 0.9, which indicates that the shoreline position time series is weakly persistent (where zero is uncorrelated) and has highly varying trends for all time intervals sampled. Forecasts of a probability envelope for future MHW positions are made for the 20 years of record and beyond, to 50 years from the start of the data records. The forecasts describe the twenty-year data sets well and indicate that, within a 96% confidence envelope, future decadal MHW shoreline excursions should be within 14.6 m of the position at the start of data collection. This is a stable-oscillatory shoreline. The forecasting method introduced here includes the stochastic portion of the time series. The traditional method of predicting shoreline change, by contrast, reduces the time series to a linear trend line fitted to historic shoreline positions and extrapolates it linearly to forecast future positions; its linearly increasing mean breaks the confidence envelope eight years into the future and continues to increase. The traditional method is therefore a poor representation of the observed shoreline position time series and a poor basis for extrapolating future shoreline positions.
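A common way to quantify the self-affine ingredients named above (not necessarily the paper's exact estimator) is the slope and unit-frequency intercept of the periodogram, S(f) ∝ f^(-β):

```python
import numpy as np

def spectral_exponent(x, dt=1.0):
    """Estimate the power-law exponent beta of a self-affine series from the
    slope of its periodogram, S(f) ~ f**(-beta); the intercept approximates
    the spectral density where 1/period equals 1."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    f = np.fft.rfftfreq(len(x), dt)[1:]          # drop the zero frequency
    S = np.abs(np.fft.rfft(x))[1:] ** 2
    slope, intercept = np.polyfit(np.log(f), np.log(S), 1)
    return -slope, np.exp(intercept)

# Sanity check on a random walk, for which beta should be close to 2:
rng = np.random.default_rng(0)
beta, _ = spectral_exponent(np.cumsum(rng.standard_normal(4096)))
print(round(beta, 2))
```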
Radiation protection for manned space activities
NASA Technical Reports Server (NTRS)
Jordan, T. M.
1983-01-01
The Earth's natural radiation environment poses a hazard to manned space activities directly through biological effects and indirectly through effects on materials and electronics. The following standard practices are indicated that address: (1) environment models for all radiation species including uncertainties and temporal variations; (2) upper bound and nominal quality factors for biological radiation effects that include dose, dose rate, critical organ, and linear energy transfer variations; (3) particle transport and shielding methodology including system and man modeling and uncertainty analysis; (4) mission planning that includes active dosimetry, minimizes exposure during extravehicular activities, subjects every mission to a radiation review, and specifies operational procedures for forecasting, recognizing, and dealing with large solar flares.
Forecast of drifter trajectories using a Rapid Environmental Assessment based on CTD observations
NASA Astrophysics Data System (ADS)
Sorgente, R.; Tedesco, C.; Pessini, F.; De Dominicis, M.; Gerin, R.; Olita, A.; Fazioli, L.; Di Maio, A.; Ribotti, A.
2016-11-01
A high-resolution, submesoscale-resolving ocean model was implemented in a limited area north of the Island of Elba, where a maritime exercise named Serious Game 1 (SG1) took place in May 2014 in the framework of the project MEDESS-4MS (Mediterranean Decision Support System for Marine Safety). During the exercise, CTD data were collected in response to the need for a Rapid Environmental Assessment, i.e. a rapid evaluation of marine conditions able to provide useful information for the initialisation of modelling tools in the scenario of possible maritime accidents. The aim of this paper is to evaluate the impact of such mesoscale-resolving CTD observations on short-term forecasts of the surface currents, within the framework of possible oil-spill-related emergencies. For this reason, modelling outputs were compared with Lagrangian observations at sea: the high-resolution modelled currents, together with those of the coarser sub-regional model WMED, are used to force the MEDSLIK-II oil-spill model to simulate drifter trajectories. Both ocean models have been assessed by comparing their prognostic scalar and vector fields against an independent CTD data set and against real drifter trajectories acquired during SG1. The diagnosed and prognosed circulation reveals that the area was characterised by water masses of Atlantic origin influenced by small mesoscale cyclonic and anti-cyclonic eddies, which govern the spatial and temporal evolution of the drifter trajectories and of the water mass distribution. The assimilation of CTD data into the initial conditions of the high-resolution model greatly improves the accuracy of the short-term forecast in terms of the location and structure of the thermocline, and positively influences the ability of the model to reproduce the observed paths of the surface drifters.
Managing in the Virtual World: How Second Life is Rewriting the Rules of "Real Life" Business
NASA Astrophysics Data System (ADS)
Wyld, David C.
In this paper, we will explore the growth of virtual worlds - one of the most exciting and fast-growing concepts in the Web 2.0 era. We will see that while there has been significant growth across all demographic groups, online gaming in MMOGs (Massively Multiplayer Online Games) is finding particular appeal among today's youth - the so-called "digital native" generation. We then overview today's virtual world marketplace, in both the youth and adult-oriented markets. Second Life is emerging as the most important virtual world today, due to the intense interest amongst both large organizations and individual entrepreneurs in conducting real business in the virtual environment. Due to its prominence today and its forecasted growth over the next decade, we take a look at the unscripted world of Second Life, examining the corporate presence in-world, as well as the economic, technical, legal, ethical and security issues involved for companies doing business in the virtual world. In conclusion, we present an analysis of where we stand in terms of virtual world development today and a projection of where we will be heading in the near future. Finally, we present advice to management practitioners and academicians on how to learn about virtual worlds and explore the world of opportunities in them.
Beyond Wiki to Judgewiki for Transparent Climate Change Decisions
NASA Astrophysics Data System (ADS)
Capron, M. E.
2008-12-01
Climate Change is like the prisoner's dilemma, a zero-sum game, or cheating in sports. Everyone and every country is tempted to selfishly maintain or advance their standard of living. The tremendous difference between standards of living amplifies the desire to opt out of Climate Change solutions adverse to economic competitiveness. Climate Change is also exceedingly complex. No one person, one organization, one country, or partial collection of countries has the capacity and the global support needed to make decisions on Climate Change solutions. There are thousands of potential actions and tens of thousands of known and unknown environmental and economic impacts. Some actions are belatedly found to be unsustainable beyond token volumes, corn ethanol and soy-biodiesel for example. Mankind can address human nature and complexity with a globally transparent information and decision process available to all 7 billion of us. We need a process that builds trust and simplifies complexity. Fortunately, we have the Internet for trust-building communication and computers to simplify complexity. Mankind can produce new software tailored to the challenge. We would combine group information collection software (a wiki) with a decision-matrix (a judge), market forecasting, and video games to produce the tool mankind needs for trust-building, transparent decisions on Climate Change actions. The resulting software would be a judgewiki.
Run-Reversal Equilibrium for Clinical Trial Randomization
Grant, William C.
2015-01-01
In this paper, we describe a new restricted randomization method called run-reversal equilibrium (RRE), which is a Nash equilibrium of a game where (1) the clinical trial statistician chooses a sequence of medical treatments, and (2) clinical investigators make treatment predictions. RRE randomization counteracts investigators' ability to observe treatment histories in order to forecast upcoming treatments. Computation of a run-reversal equilibrium reflects how the treatment history at a particular site is imperfectly correlated with the treatment imbalance for the overall trial. An attractive feature of RRE randomization is that treatment imbalance follows a random walk at each site, while treatment balance is tightly constrained and regularly restored for the overall trial. Run-reversal equilibrium can therefore facilitate less predictable, and hence more scientifically valid, experiments for multi-site clinical trials. PMID:26079608
DOE Office of Scientific and Technical Information (OSTI.GOV)
Host, Ole; Lahav, Ofer; Abdalla, Filipe B.
We present a showcase for deriving bounds on the neutrino masses from laboratory experiments and cosmological observations. We compare the frequentist and Bayesian bounds on the effective electron neutrino mass m_β which the KATRIN neutrino mass experiment is expected to obtain, using both an analytical likelihood function and Monte Carlo simulations of KATRIN. Assuming a uniform prior in m_β, we find that a null result yields an upper bound of about 0.17 eV at 90% confidence in the Bayesian analysis, to be compared with the frequentist KATRIN reference value of 0.20 eV. This is a significant difference when judged relative to the systematic and statistical uncertainties of the experiment. On the other hand, an input m_β = 0.35 eV, which is the KATRIN 5σ detection threshold, would be detected at virtually the same level. Finally, we combine the simulated KATRIN results with cosmological data in the form of present (post-WMAP) and future (simulated Planck) observations. If an input of m_β = 0.2 eV is assumed in our simulations, KATRIN alone excludes a zero neutrino mass at 2.2σ. Adding Planck data increases the probability of detection to a median 2.7σ. The analysis highlights the importance of combining cosmological and laboratory data on an equal footing.
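The Bayesian step can be illustrated in a few lines: KATRIN effectively measures m_β² with roughly Gaussian error, and a flat prior in m_β restricted to the physical region gives a posterior whose 90% quantile is the upper bound. This is a sketch of the idea only, not the authors' analysis; the measurement variance used below is an assumed illustrative value.

```python
import numpy as np

def bayes_upper_bound(m2_hat, sigma_m2, cl=0.90):
    """Credible upper bound on m_beta (eV) for a Gaussian m^2 measurement."""
    m = np.linspace(0.0, 2.0, 20001)                          # physical region m >= 0
    post = np.exp(-0.5 * ((m**2 - m2_hat) / sigma_m2) ** 2)   # flat prior in m
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    return m[np.searchsorted(cdf, cl)]

# null result: best-fit m^2 = 0; sigma_m2 is an assumed, illustrative value
print(bayes_upper_bound(m2_hat=0.0, sigma_m2=0.025))
```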
Crew Earth Observations (CEO) taken during Expedition 9
2004-06-12
ISS009-E-11537 (12 June 2004) --- Athens, Greece is featured in this image photographed by an Expedition 9 crewmember on the International Space Station (ISS). This photo includes areas of new construction or renovation for the Olympic Games, such as the Faliro Coastal Zone Olympic Complex and Helliniko Olympic Complex. The image also demonstrates the control of bounding mountain ranges (Mts. Aigeleos and Hymettos) on the western and southern expansion of the Athens urban area. Athens is located in the Central Plains region of Attica in eastern Greece. The large basin in which Athens is located was formed by faulting and has accumulated thick deposits of clays and alluvium.
Dynamic pricing of network goods with boundedly rational consumers.
Radner, Roy; Radunskaya, Ami; Sundararajan, Arun
2014-01-07
We present a model of dynamic monopoly pricing for a good that displays network effects. In contrast with the standard notion of a rational-expectations equilibrium, we model consumers as boundedly rational and unable either to pay immediate attention to each price change or to make accurate forecasts of the adoption of the network good. Our analysis shows that the seller's optimal price trajectory has the following structure: The price is low when the user base is below a target level, is high when the user base is above the target, and is set to keep the user base stationary once the target level has been attained. We show that this pricing policy is robust to a number of extensions, which include the product's user base evolving over time and consumers basing their choices on a mixture of a myopic and a "stubborn" expectation of adoption. Our results differ significantly from those that would be predicted by a model based on rational-expectations equilibrium and are more consistent with the pricing of network goods observed in practice.
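The structure of the optimal trajectory is easy to see in a toy simulation: price low below a target user base, high above it, and at a holding level once the target is reached. The adoption dynamics and all parameter values below are my illustrative placeholders, not the authors' model.

```python
# A toy simulation of the pricing structure the paper derives; dynamics
# and parameters are illustrative assumptions only.

def step_users(users, price, pull=0.05, sensitivity=0.8):
    # boundedly rational adoption: network pull grows the base,
    # higher prices shrink it
    return max(0.0, users + pull * users * (1.0 - users)
               - sensitivity * price * 0.1 * users)

def target_policy(users, target=0.6, low=0.1, high=0.9, hold=0.5):
    if users < target:
        return low        # subsidize growth toward the target
    if users > target:
        return high       # harvest excess adoption
    return hold           # keep the user base stationary

users = 0.05
for _ in range(200):
    users = step_users(users, target_policy(users))
print(round(users, 3))    # hovers near the target level
```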
NASA Astrophysics Data System (ADS)
Kotegawa, Tatsuya
Complexity in the Air Transportation System (ATS) arises from the intermingling of many independent physical resources, operational paradigms, and stakeholder interests, as well as the dynamic variation of these interactions over time. Currently, trade-offs and cost-benefit analyses of new ATS concepts are carried out with system-wide evaluation simulations driven by air traffic forecasts that assume fixed airline routes. However, this does not reflect reality well, as airlines regularly add and remove routes. An airline service route network evolution model that projects route addition and removal was created and combined with state-of-the-art air traffic forecast methods to better reflect the dynamic properties of the ATS in system-wide simulations. Guided by a system-of-systems framework, network theory metrics and machine learning algorithms were applied to develop the route network evolution models based on patterns extracted from historical data. Constructing the route addition section of the model posed the greatest challenge due to the large pool of new link candidates compared to the actual number of routes historically added to the network. Of the models explored, algorithms based on logistic regression, random forests, and support vector machines showed the best route addition and removal forecast accuracies, at approximately 20% and 40%, respectively, when validated with historical data. The combination of network evolution models and a system-wide evaluation tool quantified the impact of airline route network evolution on air traffic delay. The expected delay minutes when considering network evolution increased approximately 5% for a forecasted schedule on 3/19/2020. Performance trade-off studies between several airline route network topologies, from the perspectives of passenger travel efficiency, fuel burn, and robustness, were also conducted to provide bounds that could serve as targets for ATS transformation efforts. The series of analyses revealed that high robustness is achievable only in exchange for lower passenger travel and fuel-burn efficiency; however, an increase in network density can mitigate this trade-off.
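A minimal sketch of the route-addition idea in the spirit described: candidate city pairs are scored by a classifier trained on historically added routes. The features and synthetic data below are placeholders, not the dissertation's actual feature set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.normal(size=n),   # e.g. combined degree of the two airports (assumed)
    rng.normal(size=n),   # e.g. number of shared neighbours (assumed)
    rng.normal(size=n),   # e.g. historical demand between the cities (assumed)
])
# synthetic ground truth: high demand + connectivity makes addition likely
y = (X @ np.array([1.0, 0.5, 1.5]) + rng.normal(size=n) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X[:5])[:, 1])  # addition probability per candidate pair
```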
Emergence of cooperation as a non-equilibrium transition in noisy spatial games
NASA Astrophysics Data System (ADS)
Menon, Shakti N.; Sasidevan, V.; Sinha, Sitabhra
2018-04-01
The emergence of cooperation among selfish agents that have no incentive to cooperate is a non-trivial phenomenon that has long intrigued biologists, social scientists and physicists. The iterated Prisoner's Dilemma (IPD) game provides a natural framework for investigating this phenomenon. Here, agents repeatedly interact with their opponents, and their choice to either cooperate or defect is determined at each round by knowledge of the previous outcomes. The spatial version of IPD, where each agent interacts only with their nearest neighbors on a specified connection topology, has been used to study the evolution of cooperation under conditions of bounded rationality. In this paper we study how the collective behavior that arises from the simultaneous actions of the agents (implemented by synchronous update) is affected by (i) uncertainty, measured as noise intensity K, (ii) the payoff b, quantifying the temptation to defect, and (iii) the nature of the underlying connection topology. In particular, we study the phase transitions between states characterized by distinct collective dynamics as the connection topology is gradually altered from a two-dimensional lattice to a random network. This is achieved by rewiring links between agents with a probability p following the small-world network construction paradigm. On crossing a specified threshold value of b, the game switches from being Prisoner's Dilemma, characterized by a unique equilibrium, to Stag Hunt, a well-known coordination game having multiple equilibria. We observe that the system can exhibit three collective states corresponding to a pair of absorbing states (viz., all agents cooperating or defecting) and a fluctuating state characterized by agents switching intermittently between cooperation and defection. As noise and temptation can be interpreted as temperature and an external field respectively, a strong analogy can be drawn between the phase diagrams of such games and those of interacting spin systems. Considering the 3-dimensional p-K-b parameter space allows us to investigate the different phase transitions that occur between these collective states and characterize them using finite-size scaling. We find that the values of the critical exponents depend on the connection topology and are different from the Directed Percolation (DP) universality class.
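The abstract does not spell out the strategy-update rule, but in such noisy spatial games the noise intensity K conventionally enters through a Fermi imitation rule; a minimal sketch assuming that convention follows. The payoff values are placeholders.

```python
import numpy as np

def fermi_adopt_prob(payoff_self, payoff_neighbour, K):
    """Probability of copying the neighbour's strategy under noise intensity K."""
    return 1.0 / (1.0 + np.exp((payoff_self - payoff_neighbour) / K))

# at low K the better strategy is copied almost deterministically;
# at high K imitation degenerates into a coin flip
for K in (0.02, 0.5, 5.0):
    print(K, round(fermi_adopt_prob(1.0, 1.3, K), 3))
```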
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence
Kelly, David; Majda, Andrew J.; Tong, Xin T.
2015-01-01
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
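For reference, here is a minimal perturbed-observation ensemble Kalman filter for a scalar state. This fixes notation for the general method under discussion; it is not the paper's specific nonlinear forecast model, and the dynamics and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
M, R, steps = 20, 0.1, 50          # ensemble size, obs variance, cycles
ens = rng.normal(0.0, 1.0, M)      # initial ensemble
truth = 0.0

for _ in range(steps):
    truth = 0.9 * truth + rng.normal(0, 0.3)          # true (assumed) dynamics
    ens = 0.9 * ens + rng.normal(0, 0.3, M)           # forecast step
    y = truth + rng.normal(0, np.sqrt(R))             # noisy observation
    P = np.var(ens, ddof=1)                           # forecast variance
    K = P / (P + R)                                   # Kalman gain
    obs_pert = y + rng.normal(0, np.sqrt(R), M)       # perturbed observations
    ens = ens + K * (obs_pert - ens)                  # analysis update

print(round(np.mean(ens) - truth, 3))                 # small analysis error
```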
Multi-agent fare optimization model of two modes problem and its analysis based on edge of chaos
NASA Astrophysics Data System (ADS)
Li, Xue-yan; Li, Xue-mei; Li, Xue-wei; Qiu, He-ting
2017-03-01
This paper proposes a new fare optimization and game model framework for studying the competition between two travel modes (high-speed railway and civil aviation) in which passengers' group behavior is taken into consideration. A small-world network is introduced to construct the multi-agent model of passengers' travel mode choice. Cumulative prospect theory is adopted to depict passengers' bounded rationality, and the heterogeneity of passengers' reference points is depicted using the idea of group emotion computing. The concepts of the "Langton parameter" and "evolution entropy" from the theory of the "edge of chaos" are introduced to create passengers' "decision coefficient" and "evolution entropy of travel mode choice", which are used to quantify passengers' group behavior. The numerical simulation and the analysis of passengers' behavior show that (1) the new model inherits the features of the traditional model well, and the idea of self-organizing traffic flow evolution fully embodies passengers' bounded rationality, and (2) compared with the traditional (logit) model, when passengers are in the "edge of chaos" state, the total profit of the transportation system is higher.
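For readers unfamiliar with cumulative prospect theory, the standard Tversky-Kahneman value and probability-weighting functions are sketched below; these are the forms commonly used when CPT models bounded rationality, and the paper's exact parameterization and reference-point treatment may differ.

```python
import numpy as np

def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value relative to a reference point: losses loom larger than gains."""
    x = np.asarray(x, dtype=float)
    gains = np.abs(x) ** alpha
    losses = -lam * np.abs(x) ** beta
    return np.where(x >= 0, gains, losses)

def cpt_weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    p = np.asarray(p, dtype=float)
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

print(cpt_value([-1.0, 1.0]))                    # loss aversion: |-2.25| > 1
print(cpt_weight(np.array([0.01, 0.5, 0.99])))
```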
NASA Astrophysics Data System (ADS)
Newchurch, M.; Zavodsky, B.; Chance, K.; Haynes, J.; Lefer, B. L.; Naeger, A.
2016-12-01
The AQ research community has a long legacy of using space-based observations (e.g., the Solar Backscatter Ultraviolet Instrument [SBUV], Global Ozone Monitoring Experiment [GOME], Ozone Monitoring Instrument [OMI], and the Ozone Mapping & Profiler Suite [OMPS]) to study atmospheric chemistry. These measurements have been used to observe day-to-day and year-to-year changes in atmospheric constituents. However, they have not been able to capture the diurnal variability of pollution with enough temporal or spatial fidelity and a low enough latency for regular use by operational decision makers. As a result, the operational AQ community has traditionally relied on ground-based (e.g., collection stations, LIDAR) and airborne observing systems to study tropospheric chemistry. In order to maximize its utility for applications and decision support, there is a need to educate the community about the game-changing potential of the geostationary TEMPO mission well ahead of its expected launch date early in the third decade of this millennium. This NASA mission will engage user communities and enable science across the NASA Applied Science Focus Areas of Health and Air Quality, Disasters, Water Resources, and Ecological Forecasting. In addition, topics discussed will provide opportunities for collaborations extending TEMPO applications to future program areas in Agriculture, Weather and Climate (including Numerical Weather Prediction), Energy, and Oceans.
Behavior-Based Budget Management Using Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Troy Hiltbrand
Historically, the mechanisms to perform forecasting have primarily used two common factors as a basis for future predictions: time and money. While time and money are very important aspects of determining future budgetary spend patterns, organizations represent a complex system of unique individuals with a myriad of associated behaviors, and all of these behaviors have a bearing on how budget is utilized. When looking at forecasted budgets, it becomes a guessing game how budget managers will behave under a given set of conditions. This becomes relatively messy when human nature is introduced, as different managers will react very differently under similar circumstances. While one manager becomes ultra-conservative during periods of financial austerity, another might be unfazed and continue to spend as they have in the past. Both might revert into a state of budgetary protectionism, masking what is truly happening at the budget-holder level, in order to keep as much budget and influence as possible while at the same time sacrificing the greater good of the organization. To more accurately predict future outcomes, models should consider not only time and money but also the behavioral patterns that have been observed across the organization. The field of predictive analytics is poised to provide the tools and methodologies organizations need to do just this: capture and leverage behaviors of the past to predict the future.
Some applications of uncertainty relations in quantum information
NASA Astrophysics Data System (ADS)
Majumdar, A. S.; Pramanik, T.
2016-08-01
We discuss some applications of various versions of uncertainty relations for both discrete and continuous variables in the context of quantum information theory. The Heisenberg uncertainty relation enables demonstration of the Einstein, Podolsky and Rosen (EPR) paradox. Entropic uncertainty relations (EURs) are used to reveal quantum steering for non-Gaussian continuous variable states. EURs for discrete variables are studied in the context of quantum memory where fine-graining yields the optimum lower bound of uncertainty. The fine-grained uncertainty relation is used to obtain connections between uncertainty and the nonlocality of retrieval games for bipartite and tripartite systems. The Robertson-Schrödinger (RS) uncertainty relation is applied for distinguishing pure and mixed states of discrete variables.
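One representative relation of the kind the abstract applies is the entropic uncertainty relation in the presence of quantum memory (Berta et al., 2010), stated below for orientation; the abstract's fine-grained and RS relations refine or complement bounds of this type.

```latex
% Entropic uncertainty with quantum memory: X and Z are measurements on
% system A, B is the memory, and c is the maximal overlap of the two
% measurement bases.
\[
  S(X \mid B) + S(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
  \qquad c = \max_{x,z} \bigl| \langle \psi_x \mid \phi_z \rangle \bigr|^2 .
\]
```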
Adversarial risk analysis with incomplete information: a level-k approach.
Rothschild, Casey; McLay, Laura; Guikema, Seth
2012-07-01
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
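The level-k recursion is simple enough to state in code: a level-k player best-responds to a level-(k-1) opponent, bottoming out at a nonstrategic level-0. The 2x2 payoff matrices and the level-0 conventions below are illustrative placeholders, not the article's model.

```python
import numpy as np

# D[d, a] and A[d, a]: defender/attacker payoffs for defence d, attack a
D = np.array([[ 2.0, -3.0],
              [-1.0,  1.0]])
A = np.array([[-2.0,  3.0],
              [ 1.0, -1.0]])

def defender_level(k):
    """Best action of a level-k defender."""
    if k == 0:
        return 0                           # arbitrary nonstrategic choice (assumed)
    a = attacker_level(k - 1)              # believe the attacker is level k-1
    return int(np.argmax(D[:, a]))

def attacker_level(k):
    if k == 0:
        # level-0 attacker: no model of the defender, use average payoffs
        return int(np.argmax(A.mean(axis=0)))
    d = defender_level(k - 1)
    return int(np.argmax(A[d, :]))

print(defender_level(2), attacker_level(2))
```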
Distributed Constrained Optimization with Semicoordinate Transformations
NASA Technical Reports Server (NTRS)
Macready, William; Wolpert, David
2006-01-01
Recent work has shown how information theory extends conventional full-rationality game theory to allow for boundedly rational agents. The associated mathematical framework can be used to solve constrained optimization problems. This is done by translating the problem into an iterated game, where each agent controls a different variable of the problem, so that the joint probability distribution across the agents' moves gives an expected value of the objective function. The dynamics of the agents is designed to minimize a Lagrangian function of that joint distribution. Here we illustrate how the updating of the Lagrange parameters in the Lagrangian is a form of automated annealing, which focuses the joint distribution more and more tightly about the joint moves that optimize the objective function. We then investigate the use of "semicoordinate" variable transformations. These separate the joint state of the agents from the variables of the optimization problem, with the two connected by an onto mapping. We present experiments illustrating the ability of such transformations to facilitate optimization. We focus on the special kind of transformation in which the statistically independent states of the agents induce a mixture distribution over the optimization variables. Computer experiments illustrate this for k-sat constraint satisfaction problems and for unconstrained minimization of NK functions.
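A toy version of the annealed, distribution-valued optimization described: each agent keeps an independent distribution over its own binary variable, and each distribution is repeatedly pushed toward a Boltzmann form for the conditional expected cost while a temperature-like Lagrange parameter is annealed. The objective, schedule, and update details below are my illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8                                    # number of agents / binary variables
q = np.full((n, 2), 0.5)                 # independent per-agent distributions

def objective(x):
    """Toy cost: number of equal adjacent bits (alternating bits are optimal)."""
    return float(np.sum(x[:-1] == x[1:]))

for T in np.geomspace(2.0, 0.05, 60):    # annealed temperature schedule
    samples = np.array([(rng.random(n) < q[:, 1]).astype(int) for _ in range(200)])
    costs = np.array([objective(s) for s in samples])
    for i in range(n):
        # conditional expected cost of each choice of agent i's variable
        e = np.array([costs[samples[:, i] == v].mean()
                      if np.any(samples[:, i] == v) else costs.mean()
                      for v in (0, 1)])
        w = np.exp(-(e - e.min()) / T)   # stabilized Boltzmann reweighting
        q[i] = w / w.sum()

best = (q[:, 1] > 0.5).astype(int)
print(best, objective(best))             # typically close to an alternating pattern
```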
MoKey: A versatile exergame creator for everyday usage.
Eckert, Martina; López, Marcos; Lázaro, Carlos; Meneses, Juan
2017-11-27
Currently, virtual applications for physical exercise are highly appreciated as rehabilitation instruments. This article presents a middleware called "MoKey" (Motion Keyboard), which converts standard off-the-shelf software into exergames (exercise games). A configurable set of gestures, captured by a motion capture camera, is translated into the keystrokes required by the chosen software. The present study assesses the tool's usability and viability with a heterogeneous group of 11 participants, aged 5 to 51, with moderate to severe disabilities, most of them bound to a wheelchair. In comparison with FAAST (the Flexible Action and Articulated Skeleton Toolkit), MoKey achieved better results in terms of ease of use and computational load. Its viability as an exergame creator was proven with the help of four applications (PowerPoint®, an e-book reader, Skype®, and Tetris). Success rates of up to 91% were achieved, and subjective perception was rated at 4.5 points (on a 0-5 scale). The middleware provides increased motivation due to the use of favorite software and the advantage of exploiting it for exercise. Used together with communication software or online games, social inclusion can be stimulated. Therapists can employ the tool to monitor the correctness and progress of the exercises.
Extinction rates in tumour public goods games.
Gerlee, Philip; Altrock, Philipp M
2017-09-01
Cancer evolution and progression are shaped by cellular interactions and Darwinian selection. Evolutionary game theory incorporates both of these principles, and has been proposed as a framework to understand tumour cell population dynamics. A cornerstone of evolutionary dynamics is the replicator equation, which describes changes in the relative abundance of different cell types, and is able to predict evolutionary equilibria. Typically, the replicator equation focuses on differences in relative fitness. We here show that this framework might not be sufficient under all circumstances, as it neglects important aspects of population growth. Standard replicator dynamics might miss critical differences in the time it takes to reach an equilibrium, as this time also depends on cellular turnover in growing but bounded populations. As the system reaches a stable manifold, the time to reach equilibrium depends on cellular death and birth rates. These rates shape the time scales, in particular, in coevolutionary dynamics of growth factor producers and free-riders. Replicator dynamics might be an appropriate framework only when birth and death rates are of similar magnitude. Otherwise, population growth effects cannot be neglected when predicting the time to reach an equilibrium, and cell-type-specific rates have to be accounted for explicitly. © 2017 The Authors.
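For reference, the replicator equation the abstract builds on is the standard one below; the paper's point is that this relative-abundance description can miss the birth- and death-rate-dependent time scales of growing but bounded populations.

```latex
% Replicator dynamics: x_i is the relative abundance of cell type i,
% f_i its fitness, and \bar{f} the population mean fitness.
\[
  \dot{x}_i = x_i \bigl( f_i(\mathbf{x}) - \bar{f}(\mathbf{x}) \bigr),
  \qquad \bar{f}(\mathbf{x}) = \sum_j x_j f_j(\mathbf{x}) .
\]
```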
Gordon, C; Roopchand-Martin, S; Gregg, A
2012-09-01
To explore the possibility of using the Nintendo Wii™ as a rehabilitation tool for children with cerebral palsy (CP) in a developing country, and determine whether there is potential for an impact on their gross motor function. Pilot study with a pre-post-test design. Sir John Golding Rehabilitation Center, Jamaica, West Indies. Seven children, aged 6 to 12 years, with dyskinetic CP were recruited for the study. One child dropped out at week 4. Training with the Nintendo Wii was conducted twice weekly for 6 weeks. The games used were Wii Sports Boxing, Baseball and Tennis. Percentage attendance over the 6-week period, percentage of sessions for which the full duration of training was completed, and changes in gross motor function using the Gross Motor Function Measure (GMFM). All six participants who completed the study had 100% attendance, and all were able to complete the full 45 minutes of training at every session. Those who were wheelchair bound participated in two games, whilst those who were ambulant played three games. The mean GMFM score increased from 62.83 [standard deviation (SD) 24.86] to 70.17 (SD 23.67). The Nintendo Wii has the potential for use as a rehabilitation tool in the management of children with CP. Clinical trials should be conducted in this area to determine whether this could be an effective tool for improving gross motor function. Copyright © 2012 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
Interdependent Network Recovery Games.
Smith, Andrew M; González, Andrés D; Dueñas-Osorio, Leonardo; D'Souza, Raissa M
2017-10-30
Recovery of interdependent infrastructure networks in the presence of catastrophic failure is crucial to the economy and welfare of society. Recently, centralized methods have been developed to address optimal resource allocation in postdisaster recovery scenarios of interdependent infrastructure systems that minimize total cost. In real-world systems, however, multiple independent, possibly noncooperative, utility network controllers are responsible for making recovery decisions, resulting in suboptimal decentralized processes. With the goal of minimizing recovery cost, a best-case decentralized model allows controllers to develop a full recovery plan and negotiate until all parties are satisfied (an equilibrium is reached). Such a model is computationally intensive for planning and negotiating, and time is a crucial resource in postdisaster recovery scenarios. Furthermore, in this work, we prove this best-case decentralized negotiation process could continue indefinitely under certain conditions. Accounting for network controllers' urgency in repairing their system, we propose an ad hoc sequential game-theoretic model of interdependent infrastructure network recovery represented as a discrete time noncooperative game between network controllers that is guaranteed to converge to an equilibrium. We further reduce the computation time needed to find a solution by applying a best-response heuristic and prove bounds on ε-Nash equilibrium, where ε depends on problem inputs. We compare best-case and ad hoc models on an empirical interdependent infrastructure network in the presence of simulated earthquakes to demonstrate the extent of the tradeoff between optimality and computational efficiency. Our method provides a foundation for modeling sociotechnical systems in a way that mirrors restoration processes in practice. © 2017 Society for Risk Analysis.
Developing International Guidelines on Volcanic Hazard Assessments for Nuclear Facilities
NASA Astrophysics Data System (ADS)
Connor, Charles
2014-05-01
Worldwide, tremendous progress has been made in recent decades in forecasting volcanic events, such as episodes of volcanic unrest, eruptions, and the potential impacts of eruptions. Generally these forecasts are divided into two categories. Short-term forecasts are prepared in response to unrest at volcanoes, rely on geophysical monitoring and related observations, and have the goal of forecasting events on timescales of hours to weeks to provide time for evacuation of people, shutdown of facilities, and implementation of related safety measures. Long-term forecasts are prepared to better understand the potential impacts of volcanism in the future and to plan for potential volcanic activity. Long-term forecasts are particularly useful to better understand and communicate the potential consequences of volcanic events for populated areas around volcanoes and for siting critical infrastructure, such as nuclear facilities. Recent work by an international team, through the auspices of the International Atomic Energy Agency, has focused on developing guidelines for long-term volcanic hazard assessments. These guidelines have now been implemented for hazard assessment for nuclear facilities in nations including Indonesia, the Philippines, Armenia, Chile, and the United States. On any time scale, all volcanic hazard assessments rely on a geologically reasonable conceptual model of volcanism. Such conceptual models are usually built upon years or decades of geological studies of specific volcanic systems, analogous systems, and development of a process-level understanding of volcanic activity. Conceptual models are used to bound potential rates of volcanic activity, potential magnitudes of eruptions, and to understand temporal and spatial trends in volcanic activity. It is these conceptual models that provide essential justification for assumptions made in statistical model development and the application of numerical models to generate quantitative forecasts. It is a tremendous challenge in quantitative volcanic hazard assessments to encompass alternative conceptual models, and to create models that are robust to evolving understanding of specific volcanic systems by the scientific community. A central question in volcanic hazards forecasts is quantifying rates of volcanic activity. Especially for long-dormant volcanic systems, data from the geologic record may be sparse, individual events may be missing or unrecognized in the geologic record, and patterns of activity may be episodic or otherwise nonstationary. This leads to uncertainty in forecasting long-term rates of activity. Hazard assessments strive to quantify such uncertainty, for example by comparing observed rates of activity with alternative parametric and nonparametric models. Numerical models are presented that characterize the spatial distribution of potential volcanic events. These spatial density models serve as the basis for application of numerical models of specific phenomena such as development of lava flow, tephra fallout, and a host of other volcanic phenomena. Monte Carlo techniques (random sampling, stratified sampling, importance sampling) are methods used to sample vent location and other key eruption parameters, such as eruption volume, magma rheology, and eruption column height for probabilistic models. The development of coupled scenarios (e.g., the probability of tephra accumulation on a slope resulting in subsequent debris flows) is also assessed through these methods, usually with the aid of event trees.
The primary products of long-term forecasts are a statistical model of the conditional probability of the potential effects of volcanism, should an eruption occur, and the probability of such activity occurring. It is emphasized that hazard forecasting is an iterative process, and broad consideration must be given to alternative conceptual models of volcanism, weighting of volcanological data in the analyses, and alternative statistical and numerical models. This structure is amenable to expert elicitation in order to weight alternative models and to explore alternative scenarios.
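A schematic of the Monte Carlo sampling step described above: draw vent locations from a spatial density, draw eruption sizes from a recurrence model, and count the fraction of simulated events that affect a facility site. All distributions, the footprint rule, and the site location below are placeholders for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)
site = np.array([10.0, 5.0])             # facility location (km), placeholder
N = 100_000                              # number of Monte Carlo events

# sample vent locations from an assumed spatial density (bivariate normal)
vents = rng.multivariate_normal([0.0, 0.0], np.diag([25.0, 25.0]), N)
# sample eruption volumes (km^3) from an assumed log-normal model
volumes = rng.lognormal(mean=-2.0, sigma=1.0, size=N)
# toy footprint rule: a flow of volume V reaches roughly k * V**(1/3) km
reach = 8.0 * volumes ** (1.0 / 3.0)

dist = np.linalg.norm(vents - site, axis=1)
print(np.mean(dist < reach))             # P(site affected | an eruption occurs)
```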
Are transaction taxes a cause of financial instability?
NASA Astrophysics Data System (ADS)
Fontini, Fulvio; Sartori, Elena; Tolotti, Marco
2016-05-01
We analyze a stylized market where N boundedly rational agents may decide whether or not to trade a share of a risky asset at subsequent trading dates. Agents' payoffs depend on returns, which are endogenously determined taking into account observed and forecasted demand and an exogenous transaction tax. We study the time evolution of demand, returns and market activity. We show that the introduction of a transaction tax generally helps in reducing the variability of returns and market activity. On the other hand, there are market conditions under which a low tax rate may lead the market into a very unstable phase characterized by fluctuation of the fundamentals around two different regimes; indeed, under these circumstances, heteroscedasticity of the time series is detected and statistically analyzed.
NASA Astrophysics Data System (ADS)
Pianosi, Francesca; Lal Shrestha, Durga; Solomatine, Dimitri
2010-05-01
This research presents an extension of the UNEEC (Uncertainty Estimation based on Local Errors and Clustering; Shrestha and Solomatine, 2006, 2008; Solomatine and Shrestha, 2009) method in the direction of explicit inclusion of parameter uncertainty. The UNEEC method assumes that there is an optimal model and that the residuals of the model can be used to assess the uncertainty of the model prediction. It is assumed that all sources of uncertainty, including input, parameter and model structure uncertainty, are explicitly manifested in the model residuals. In this research, these assumptions are relaxed, and the UNEEC method is extended to consider parameter uncertainty as well (abbreviated as UNEEC-P). In UNEEC-P, we first use Monte Carlo (MC) sampling in parameter space to generate N model realizations (each of which is a time series), estimate the prediction quantiles from the empirical distribution functions of the model residuals over all residual realizations, and only then apply the standard UNEEC method, which encapsulates the uncertainty of a hydrologic model (expressed by quantiles of the error distribution) in a machine learning model (e.g., an ANN). UNEEC-P is applied first to a linear regression model of synthetic data, and then to a real case study of forecasting inflow to Lake Lugano in northern Italy. The inflow forecasting model is a stochastic heteroscedastic model (Pianosi and Soncini-Sessa, 2009). Preliminary results show that the UNEEC-P method produces wider uncertainty bounds, which is consistent with the fact that the method also considers parameter uncertainty of the optimal model. In the future, the UNEEC method will be further extended to consider input and structure uncertainty, which will provide more realistic estimation of model predictions.
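A condensed sketch of the UNEEC-P quantile step as described: run MC parameter samples of the model, pool the residual realizations, and take empirical quantiles as uncertainty bounds. The stand-in model and all values below are synthetic placeholders; in UNEEC proper, a machine learning model is then trained to map input conditions to these quantiles.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)
obs = 2.0 * t + rng.normal(0, 1.0, t.size)        # synthetic observations

residuals = []
for _ in range(500):                               # MC sampling in parameter space
    slope = rng.normal(2.0, 0.2)                   # sampled model parameter
    pred = slope * t                               # one model realization
    residuals.append(obs - pred)
residuals = np.concatenate(residuals)              # pool all residual realizations

lo, hi = np.quantile(residuals, [0.05, 0.95])      # empirical 90% error bounds
print(round(lo, 2), round(hi, 2))
```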
NASA Astrophysics Data System (ADS)
Mueller, M.; Mahoney, K. M.; Holman, K. D.
2015-12-01
The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest. Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) may be provided and the model outputs may be more effectively used in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall-runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.
Technosocial Predictive Analytics for Security Informatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Gilbert, Nigel; Greaves, Mark
2012-08-22
Challenges to the security, health, and sustainable growth of our society keep escalating asymmetrically due to the growing pace of globalization and global change. The increasing velocity of information sharing, social networking, economic forces, and environmental change has resulted in a rapid increase in the number and frequency of "game-changing moments" that a community can face. Social movements that once took a decade to build now take a year; shifts in public opinion that once took a year to take root now take a couple of months. More and more frequently, these critical moments occur too suddenly for the affected communities to succeed in countering the consequent adversities or seizing the emerging opportunities. Now more than ever, we need anticipatory reasoning technologies to forecast and manage change in order to secure and improve our way of life and the environment we inhabit.
The politics of space mining - An account of a simulation game
NASA Astrophysics Data System (ADS)
Paikowsky, Deganit; Tzezana, Roey
2018-01-01
Celestial bodies like the Moon and asteroids contain materials and precious metals, which are valuable for human activity on Earth and beyond. Space mining has been mainly relegated to the realm of science fiction, and has not been treated seriously by the international community. Private industry is starting to mobilize towards space mining, and success on this front would have a major impact on all nations. We present in this paper a review of current space mining ventures and of the international legislation which could stand in their way - or aid them in their mission. Following that, we present the results of a role-playing simulation in which the roles of several important nations were played by students of international relations. The results of the simulation are used as a basis for forecasting the potential initial responses of the nations of the world to a successful space mining operation in the future.
Decision theory with resource-bounded agents.
Halpern, Joseph Y; Pass, Rafael; Seeman, Lior
2014-04-01
There have been two major lines of research aimed at capturing resource-bounded players in game theory. The first, initiated by Rubinstein (), charges an agent for doing costly computation; the second, initiated by Neyman (), does not charge for computation, but limits the computation that agents can do, typically by modeling agents as finite automata. We review recent work on applying both approaches in the context of decision theory. For the first approach, we take the objects of choice in a decision problem to be Turing machines, and charge players for the "complexity" of the Turing machine chosen (e.g., its running time). This approach can be used to explain well-known phenomena like first-impression-matters biases (i.e., people tend to put more weight on evidence they hear early on) and belief polarization (two people with different prior beliefs, hearing the same evidence, can end up with diametrically opposed conclusions) as the outcomes of quite rational decisions. For the second approach, we model people as finite automata, and provide a simple algorithm that, on a problem that captures a number of settings of interest, provably performs optimally as the number of states in the automaton increases. Copyright © 2014 Cognitive Science Society, Inc.
Parochial cooperation in nested intergroup dilemmas is reduced when it harms out-groups.
Aaldering, Hillie; Ten Velden, Femke S; van Kleef, Gerben A; De Dreu, Carsten K W
2018-06-01
In intergroup settings, humans often contribute to their in-group at a personal cost. Such parochial cooperation benefits the in-group, and creates and fuels intergroup conflict when it simultaneously hurts out-groups. Here, we introduce a new game paradigm in which individuals can display universal cooperation (which benefits both in- and out-group) as well as parochial cooperation that does, versus does not, hurt the out-group. Using this set-up, we test hypotheses derived from group selection theory, social identity theory, and bounded generalized reciprocity theory. Across three experiments we find, first, that individuals choose parochial over universal cooperation. Second, there was no evidence for a motive to maximize differences between in- and out-group, which is central to both group selection and social identity theory. However, fitting bounded generalized reciprocity theory, we find that individuals with a prosocial value orientation display parochial cooperation provided that this does not harm the out-group; individualists, in contrast, display parochialism whether or not it hurts the out-group. Our findings were insensitive to cognitive taxation (Experiments 2-3), and emerged even when universal cooperation served social welfare more than parochialism (Experiment 3). (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Speculative and Hedging Interaction Model in Oil and U.S. Dollar Markets—Phase Transition
NASA Astrophysics Data System (ADS)
Campbell, Michael; Carfì, David
2018-01-01
We show that there is a phase transition in the bounded rational Carfì-Musolino model, and the possibility of a market crash. This model has two types of operators: a real economic subject (Air) and one or more investment banks (Bank). It also has two markets: the oil spot market and US dollar futures. Bank agents react to Air and equilibrate much more quickly than Air. Thus Air is an acting external agent due to its longer-term investing, whereas the action of the banks equilibrates before Air makes its next transaction. This model constitutes a potential game, and agents crowd their preferences into one of the markets at a critical temperature when Air makes no purchases of oil futures.
A Learning-Based Approach to Reactive Security
NASA Astrophysics Data System (ADS)
Barth, Adam; Rubinstein, Benjamin I. P.; Sundararajan, Mukund; Mitchell, John C.; Song, Dawn; Bartlett, Peter L.
Despite the conventional wisdom that proactive security is superior to reactive security, we show that reactive security can be competitive with proactive security as long as the reactive defender learns from past attacks instead of myopically overreacting to the last attack. Our game-theoretic model follows common practice in the security literature by making worst-case assumptions about the attacker: we grant the attacker complete knowledge of the defender's strategy and do not require the attacker to act rationally. In this model, we bound the competitive ratio between a reactive defense algorithm (which is inspired by online learning theory) and the best fixed proactive defense. Additionally, we show that, unlike proactive defenses, this reactive strategy is robust to a lack of information about the attacker's incentives and knowledge.
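A small sketch in the flavour of the online-learning defender the paper is inspired by: after each round, the weight of every attack surface grows with the loss the attacker extracted there, so the defense budget shifts toward recently profitable targets. The loss model, learning rate, and adversary below are illustrative placeholders, not the paper's algorithm or bounds.

```python
import numpy as np

rng = np.random.default_rng(5)
n_surfaces, eta, budget = 4, 0.5, 1.0
w = np.ones(n_surfaces)                  # one weight per attack surface

for _ in range(100):
    defense = budget * w / w.sum()       # proportional budget allocation
    attack = rng.integers(n_surfaces)    # adversary need not act rationally
    loss = np.zeros(n_surfaces)
    loss[attack] = 1.0 / (1.0 + 5.0 * defense[attack])   # damage falls with defense
    w *= np.exp(eta * loss)              # multiplicative-weights reweighting

print(np.round(budget * w / w.sum(), 3)) # learned allocation across surfaces
```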
Imperial boyhood: piracy and the play ethic.
Deane, Bradley
2011-01-01
Representations of perpetual boyhood came to fascinate the late Victorians, partly because such images could naturalize a new spirit of imperial aggression and new policies of preserving power. This article traces the emergence of this fantasy through a series of stories about the relationship of the boy and the pirate, figures whose opposition in mid-Victorian literature was used to articulate the moral legitimacy of colonialism, but who became doubles rather than antitheses in later novels, such as R.L. Stevenson's "Treasure Island" and Joseph Conrad's "Lord Jim." Masculine worth needed no longer to be measured by reference to transcendent, universal laws, but by a morally flexible ethic of competitive play, one that bound together boyishness and piracy in a satisfying game of international adventure.
Barber, M Craig; Rashleigh, Brenda; Cyterski, Michael J
2016-01-01
Regional fishery conditions of Mid-Atlantic wadeable streams in the eastern United States are estimated using the Bioaccumulation and Aquatic System Simulator (BASS) bioaccumulation and fish community model and data collected by the US Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP). Average annual biomasses and population densities and annual productions are estimated for 352 randomly selected streams. Realized bioaccumulation factors (BAF) and biomagnification factors (BMF), which are dependent on these forecasted biomasses, population densities, and productions, are also estimated by assuming constant water exposures to methylmercury and tetra-, penta-, hexa-, and hepta-chlorinated biphenyls. Using observed biomasses, observed densities, and estimated annual productions of total fish from 3 regions assumed to support healthy fisheries as benchmarks (eastern Tennessee and Catskill Mountain trout streams and Ozark Mountains smallmouth bass streams), 58% of the region's wadeable streams are estimated to be in marginal or poor condition (i.e., not healthy). Using simulated BAFs and EMAP Hg fish concentrations, we also estimate that approximately 24% of the game fish and subsistence fishing species that are found in streams having detectable Hg concentrations would exceed an acceptable human consumption criterion of 0.185 μg/g wet wt. Importantly, such streams have been estimated to represent 78.2% to 84.4% of the Mid-Atlantic's wadeable stream lengths. Our results demonstrate how a dynamic simulation model can support regional assessment and trends analysis for fisheries. © 2015 SETAC.
Stochastic control approaches for sensor management in search and exploitation
NASA Astrophysics Data System (ADS)
Hitchings, Darin Chester
Recent improvements in the capabilities of autonomous vehicles have motivated their increased use in such applications as defense, homeland security, environmental monitoring, and surveillance. To enhance performance in these applications, new algorithms are required to control teams of robots autonomously and through limited interactions with human operators. In this dissertation we develop new algorithms for control of robots performing information-seeking missions in unknown environments. These missions require robots to control their sensors in order to discover the presence of objects, keep track of the objects, and learn what these objects are, given a fixed sensing budget. Initially, we investigate control of multiple sensors, with a finite set of sensing options and finite-valued measurements, to locate and classify objects given a limited resource budget. The control problem is formulated as a Partially Observed Markov Decision Problem (POMDP), but its exact solution requires excessive computation. Under the assumption that sensor error statistics are independent and time-invariant, we develop a class of algorithms using Lagrangian Relaxation techniques to obtain optimal mixed strategies using performance bounds developed in previous research. We investigate alternative Receding Horizon (RH) controllers to convert the mixed strategies to feasible adaptive-sensing strategies and evaluate the relative performance of these controllers in simulation. The resulting controllers provide superior performance to alternative algorithms proposed in the literature and obtain solutions to large-scale POMDP problems several orders of magnitude faster than optimal Dynamic Programming (DP) approaches, with comparable performance quality. We extend our results for finite-action, finite-measurement sensor control to scenarios with moving objects. We use Hidden Markov Models (HMMs) for the evolution of objects, according to the dynamics of a birth-death process. We develop a new lower bound on the performance of adaptive controllers in these scenarios, develop algorithms for computing solutions to this lower bound, and use these algorithms as part of an RH controller for sensor allocation in the presence of moving objects. We also consider an adaptive Search problem where sensing actions are continuous and the underlying measurement space is also continuous. We extend our previous hierarchical decomposition approach based on performance bounds to this problem and develop novel implementations of Stochastic Dynamic Programming (SDP) techniques to solve it. Our algorithms are nearly two orders of magnitude faster than previously proposed approaches and yield solutions of comparable quality. For supervisory control, we discuss how human operators can work with and augment robotic teams performing these tasks. Our focus is on how tasks are partitioned among teams of robots and how a human operator can make intelligent decisions for task partitioning. We explore these questions through the design of a game that involves robot automata controlled by our algorithms and a human supervisor who partitions tasks based on different levels of support information. This game can be used with human subject experiments to explore the effect of information on the quality of supervisory control.
Learning, Realizability and Games in Classical Arithmetic
NASA Astrophysics Data System (ADS)
Aschieri, Federico
2010-12-01
In this dissertation we provide mathematical evidence that the concept of learning can be used to give a new and intuitive computational semantics of classical proofs in various fragments of Predicative Arithmetic. First, we extend Kreisel's modified realizability to a classical fragment of first-order Arithmetic, Heyting Arithmetic plus EM1 (the excluded middle axiom restricted to Sigma^0_1 formulas). We introduce a new realizability semantics we call "Interactive Learning-Based Realizability". Our realizers are self-correcting programs, which learn from their errors and evolve through time. Secondly, we extend the class of learning-based realizers to a classical version PCFclass of PCF and then compare the resulting notion of realizability with Coquand's game semantics and prove a full soundness and completeness result. In particular, we show there is a one-to-one correspondence between realizers and recursive winning strategies in the 1-Backtracking version of Tarski games. Third, we provide a complete and fully detailed constructive analysis of learning as it arises in learning-based realizability for HA+EM1, Avigad's update procedures, and the epsilon substitution method for Peano Arithmetic PA. We present new constructive techniques to bound the length of learning processes, and we apply them to reprove, by means of our theory, the classic result of Gödel that the provably total functions of PA can be represented in Gödel's system T. Last, we give an axiomatization of the kind of learning that is needed to computationally interpret Predicative classical second-order Arithmetic. Our work is an extension of Avigad's and generalizes the concept of update procedure to the transfinite case. Transfinite update procedures have to learn values of transfinite sequences of non-computable functions in order to extract witnesses from classical proofs.
Cooperation in memory-based prisoner's dilemma game on interdependent networks
NASA Astrophysics Data System (ADS)
Luo, Chao; Zhang, Xiaolin; Liu, Hong; Shao, Rui
2016-05-01
Memory, or so-called experience, normally plays an important role in guiding human behavior in the real world and is essential for the rational decisions made by individuals. Hence, when the evolutionary behaviors of players with bounded rationality are investigated, it is reasonable to assume that the players in the system have limited memory. Moreover, in order to unravel the intricate variability of complex systems in the real world and reach a highly integrative understanding of their dynamics, interdependent networks have in recent years received growing attention in this community as a comprehensive network structure. In this article, the evolution of cooperation in the memory-based prisoner's dilemma game (PDG) on interdependent networks composed of two coupled square lattices is studied. Herein, all or some of the players are endowed with finite memory, and we focus on the mutual influence of the memory effect and interdependent network reciprocity on cooperation in the spatial PDG. We show that the density of cooperation can be significantly promoted within an optimal region of memory length and interdependent strength. Furthermore, distinguished by whether they have memory ability and/or external links, different kinds of players on the networks exhibit distinct evolutionary behaviors. Our work could be helpful for understanding the emergence and maintenance of cooperation under the evolution of memory-based players on interdependent networks.
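As a rough illustration of the kind of simulation described, the sketch below runs a memory-based spatial prisoner's dilemma on a single square lattice with a Fermi imitation rule. The interdependent coupling between two lattices and the paper's exact payoff and update details are omitted, and all parameter values (temptation b, memory length M, noise K) are illustrative.

```python
import numpy as np

L, M, b, K, steps = 20, 5, 1.3, 0.1, 100
rng = np.random.default_rng(0)
strat = rng.integers(0, 2, (L, L))     # 1 = cooperate, 0 = defect
memory = np.zeros((L, L, M))           # rolling payoff history per player

def payoff(s, sn):                     # weak PD: R=1, T=b, S=P=0
    return (1.0 if sn else 0.0) if s else (b if sn else 0.0)

for t in range(steps):
    pay = np.zeros((L, L))
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(np.roll(strat, dx, 0), dy, 1)
        pay += np.vectorize(payoff)(strat, nb)   # play with 4 neighbours
    memory[:, :, t % M] = pay
    score = memory.mean(axis=2)                  # memory-averaged payoff
    # Fermi-rule imitation of one neighbour's strategy
    dx, dy = (1, 0) if t % 2 else (0, 1)
    nb_s = np.roll(np.roll(strat, dx, 0), dy, 1)
    nb_score = np.roll(np.roll(score, dx, 0), dy, 1)
    p = 1.0 / (1.0 + np.exp((score - nb_score) / K))
    strat = np.where(rng.random((L, L)) < p, nb_s, strat)

print("final cooperation density:", strat.mean())
```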
Entrepreneurial "mining" of the dying: viatical transactions, tax strategies and mind games.
Trinkaus, John; Giacalone, Joseph A
2002-03-01
Conceptually, entrepreneurship is seen as the engine that drives a robust economy, promotes a favorable quality of life, and assures the availability of the attributes needed for meaningful living. However, like many popular concepts in this world, its limitations are normally not well acknowledged. A group of entrepreneurial ventures that has recently come into existence deals with the personal fiscal issues associated with the end-of-life phase of the human life cycle. While generally praised as humanitarian services for society that are assuredly within legal bounds, these activities have potentially significant negative implications. When examined from an ethical perspective, some questionable practices become apparent. Three such undertakings in particular have to do with the terminally ill and are examined in this paper. The first, viatical transactions, is defined as the transfer of a life insurance policy's ownership to someone who does not have an insurable interest in the life of the insured individual. The second, creative tax shelters for wealthy people involved in estate planning, involves hiring unrelated parties, particularly those who are young and in failing health, to do the dying for the principal. The third, mind games, involves curtailing the use of medical resources for the dying. The justification for this is the rationing of scarce medical resources.
2016-10-01
Reports an error in "When Does Making Detailed Predictions Make Predictions Worse" by Theresa F. Kelly and Joseph P. Simmons (Journal of Experimental Psychology: General, Advanced Online Publication, Aug 8, 2016, np). In the article, the symbols in Figure 2 were inadvertently altered in production. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2016-37952-001.) In this article, we investigate whether making detailed predictions about an event worsens other predictions of the event. Across 19 experiments, 10,896 participants, and 407,045 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes useless or redundant information more accessible and thus more likely to be incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of events will and will not be susceptible to the negative effect of making detailed predictions. PsycINFO Database Record (c) 2016 APA, all rights reserved
Redox Non-Innocence of Nitrosobenzene at Nickel
Kundu, Subrata; Stieber, S. Chantal; Ferrier, Maryline Ghislaine; ...
2016-08-22
Nitrosobenzene (PhNO) serves as a stable analogue of nitroxyl (HNO), a biologically relevant, redox-active nitric oxide derivative. Capture of nitrosobenzene at the electron-deficient β-diketiminato nickel(I) complex [iPr2NNF6]Ni results in reduction of the PhNO ligand to a (PhNO)•− species coordinated to a square-planar Ni(II) center in [iPr2NNF6]Ni(η2-ONPh). Ligand-centered reduction leads to the (PhNO)2− moiety bound to Ni(II), as supported by XAS studies. Ultimately, systematic investigation of structure–reactivity patterns of the (PhNO)•− and (PhNO)2− ligands reveals parallels with superoxo (O2)•− and peroxo (O2)2− ligands, respectively, and forecasts reactivity patterns of the more transient HNO ligand.
Masursky, Danielle; Dexter, Franklin; O'Leary, Colleen E; Applegeet, Carol; Nussmeier, Nancy A
2008-04-01
Anesthesia department planning depends on forecasting future demand for perioperative services. Little is known about long-range forecasting of anesthesia workload. We studied operating room (OR) times at Hospital A over 16 yr (1991-2006), anesthesia times at Hospital B over 26 yr (1981-2006), and cases at Hospital C over 13 yr (1994-2006). Each hospital is >100 yr old and is located in a US city with other hospitals that are >50 yr old. Hospitals A and B are the sole University hospitals in their metropolitan statistical areas (and many counties beyond). Hospital C is the sole tertiary hospital for >375 km. Each hospital's choice of a measure of anesthesia work to be analyzed was likely unimportant, as the annual hours of anesthesia correlated highly both with annual numbers of cases (r = 0.98) and with American Society of Anesthesiologist's Relative Value Guide units of work (r = 0.99). Despite a 2% decline in the local population, the hours of OR time at Hospital A increased overall (Pearson r = -0.87, P < 0.001) and for children (r = -0.84). At Hospital B, there was a strong positive correlation between population and hours of anesthesia (r = 0.97, P < 0.001), but not between annual increases in population and workload (r = -0.18). At Hospital C, despite a linear increase in population, the annual numbers of cases increased, declined with opening of two outpatient surgery facilities, and then stabilized. The predictive value of local personal income was low. In contrast, the annual increases in the hours of OR time and anesthesia could be modeled using simple time series methods. Although growth of the elderly population is a simple justification for building more ORs, managers should be cautious in arguing for strategic changes in capacity at individual hospitals based on future changes in the national age-adjusted population. Local population can provide little value in forecasting future anesthesia workloads at individual hospitals. In addition, anesthesia groups and hospital administrators should not focus on quarterly changes in workload, because workload can vary widely, despite consistent patterns over decades. To facilitate long-range planning, anesthesia groups and hospitals should save their billing and OR time data, display it graphically over years, and supplement with corresponding forecasting methods (e.g., staff an additional OR when an upper prediction bound of workload per OR exceeds a threshold).
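The staffing rule suggested at the end of this abstract can be illustrated with a minimal sketch: fit a linear trend to annual workload, compute an upper prediction bound for the next year, and flag when the bound per OR exceeds a threshold. All data values, the capacity figure, and the threshold below are invented for illustration.

```python
import numpy as np
from scipy import stats

# Synthetic annual OR hours with a linear trend plus noise.
years = np.arange(1994, 2007)
hours = 20000 + 450 * (years - years[0]) \
        + np.random.default_rng(1).normal(0, 800, years.size)

n = years.size
slope, intercept, *_ = stats.linregress(years, hours)
resid = hours - (intercept + slope * years)
s = np.sqrt(resid @ resid / (n - 2))      # residual standard error
x0 = years[-1] + 1                        # forecast year
se_pred = s * np.sqrt(1 + 1 / n
                      + (x0 - years.mean()) ** 2
                      / ((years - years.mean()) ** 2).sum())
upper = intercept + slope * x0 + stats.t.ppf(0.95, n - 2) * se_pred

n_ors, threshold = 12, 2000               # hypothetical capacity and trigger
print("95% upper prediction bound:", round(upper))
if upper / n_ors > threshold:
    print("consider staffing an additional OR")
```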
Burdea, Grigore; Polistico, Kevin; Krishnamoorthy, Amalan; House, Gregory; Rethage, Dario; Hundal, Jasdeep; Damiani, Frank; Pollack, Simcha
2015-01-01
To describe the development of the BrightBrainer™ integrative cognitive rehabilitation system and determine clinical feasibility with nursing home-bound dementia patients. BrightBrainer cognitive rehabilitation simulations were first played uni-manually, then bimanually. Participants sat in front of a laptop and interacted through a game controller that measured hand movements in 3D, as well as flexion of both index fingers. Interactive serious games were designed to improve basic and complex attention (concentration, short-term memory, dual tasking), memory recall, executive functioning and emotional well-being. Individual simulations adapted automatically to each participant's level of motor functioning. The system underwent feasibility trials spanning 16 sessions over 8 weeks. Participants were evaluated pre- and post-intervention, using standardized neuropsychological measures. Computerized measures of movement repetitions and task performance were stored on a remote server. Group analysis for 10 participants showed statistically significant improvement in decision making (p < 0.01), with trend improvements in depression (p < 0.056). Improvements were also seen in processing speed (p < 0.13) and auditory attention (p < 0.17); however, these were not statistically significant (partly attributable to the modest sample size). Eight of nine neuropsychological tests showed changes in the direction of improvement, indicating effective rehabilitation (p < 0.01). BrightBrainer technology was well tolerated, with mean satisfaction ratings of 4.9/5.0 across participants. Preliminary findings demonstrate utility within an advanced dementia population, suggesting that it will be beneficial to evaluate BrightBrainer through controlled clinical trials and to investigate its application in other clinical populations. Implications for Rehabilitation: It is possible to improve cognitive function in older low-functioning patients. Integrative rehabilitation through games combining cognitive (memory, focusing, executive function) and physical (bimanual whole arm movement, grasping, task sequencing) elements is enjoyable for this population. The severity of depression in these elderly patients can be reduced through virtual reality bimanual games. The number of upper extremity active repetitions performed in the process of solving cognitive problems with the BrightBrainer™ system is 600, about 18.75 times (1875%) the number observed by other researchers in conventional physical or occupational rehabilitation sessions.
NASA Astrophysics Data System (ADS)
Jiang, Yulian; Liu, Jianchang; Tan, Shubin; Ming, Pingsong
2014-09-01
In this paper, a robust consensus algorithm is developed and sufficient conditions for convergence to consensus are proposed for a multi-agent system (MAS) with exogenous disturbances subject to partial information. By utilizing H∞ robust control, differential game theory and a design-based approach, the consensus problem of the MAS with exogenous bounded interference is resolved and the disturbances are restrained simultaneously. Attention is focused on designing an H∞ robust controller (the robust consensus algorithm) based on minimisation of our proposed rational and individual cost functions according to the goals of the MAS. Furthermore, sufficient conditions for convergence of the robust consensus algorithm are given. An example is employed to demonstrate that our results are effective and more capable of restraining exogenous disturbances than existing methods in the literature.
Constructing probabilistic scenarios for wide-area solar power generation
Woodruff, David L.; Deride, Julio; Staid, Andrea; ...
2017-12-22
Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.
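A highly simplified sketch of the scenario-construction pipeline is given below: segment the historical forecasts, characterize the error distribution within each segment, and map a new forecast to quantile scenarios with associated probabilities. Plain empirical quantiles stand in for the paper's epi-spline fits, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
hist_fc = rng.uniform(0, 100, 5000)          # historical forecasts (MW)
hist_err = rng.normal(0, 5 + 0.1 * hist_fc)  # matched forecast errors (MW)

levels = np.quantile(hist_fc, [0, .33, .66, 1.0])  # segmentation bins
qs = [0.1, 0.5, 0.9]                               # scenario quantiles

def scenarios(forecast):
    """Return next-day scenarios (each with probability 1/len(qs))."""
    seg = np.digitize(forecast, levels[1:-1])      # which forecast segment?
    errs = hist_err[np.digitize(hist_fc, levels[1:-1]) == seg]
    # Empirical quantiles of the segment's errors, floored at zero output.
    return [max(forecast + np.quantile(errs, q), 0.0) for q in qs]

print(scenarios(70.0))   # three scenarios for a 70 MW day-ahead forecast
```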
Evaluation of the Analysis Influence on Transport in Reanalysis Regional Water Cycles
NASA Technical Reports Server (NTRS)
Bosilovich, M. G.; Chen, J.; Robertson, F. R.
2011-01-01
Regional water cycles of reanalyses do not follow theoretical assumptions applicable to pure simulated budgets. The data analysis changes the wind, temperature and moisture, perturbing the theoretical balance. Of course, the analysis is correcting the model forecast error, so that the state fields should be more aligned with observations. Recently, it has been reported that the moisture convergence over continental regions, even those with significant quantities of radiosonde profiles present, can produce long-term values not consistent with theoretical bounds. Specifically, long averages over continents produce some regions of moisture divergence. This implies that the observational analysis leads to a source of water in the region. One such region is the United States Great Plains, where many radiosonde and lidar wind observations are assimilated. We will utilize a new ancillary data set from the MERRA reanalysis called the Gridded Innovations and Observations (GIO), which provides the assimilated observations on MERRA's native grid, allowing more thorough consideration of their impact on regional and global climatology. Included with the GIO data are the observation minus forecast (OmF) and observation minus analysis (OmA). Using OmF and OmA, we can identify the bias of the analysis against each observing system and gain a better understanding of the observations that are controlling the regional analysis. In this study we will focus on the wind and moisture assimilation.
The future of primordial features with 21 cm tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xingang; Meerburg, P. Daniel; Münchmeyer, Moritz, E-mail: xingang.chen@cfa.harvard.edu, E-mail: meerburg@cita.utoronto.ca, E-mail: munchmey@iap.fr
2016-09-01
Detecting a deviation from a featureless primordial power spectrum of fluctuations would give profound insight into the physics of the primordial Universe. Depending on their nature, primordial features can either provide direct evidence for the inflation scenario or pin down details of the inflation model. Thus far, using the cosmic microwave background (CMB) we have only been able to put stringent constraints on the amplitude of features, but no significant evidence has been found for such signals. Here we explore the limit of the experimental reach in constraining such features using 21 cm tomography at high redshift. A measurement of the 21 cm power spectrum from the Dark Ages is generally considered the ideal experiment for early Universe physics, with potential access to a large number of modes. We consider three different categories of theoretically motivated models: the sharp feature models, resonance models, and standard clock models. We study the improvements on bounds on features as a function of the total number of observed modes and identify parameter degeneracies. The detectability depends critically on the amplitude, frequency and scale-location of the features, as well as the angular and redshift resolution of the experiment. We quantify these effects by considering different fiducial models. Our forecast shows that a cosmic variance limited 21 cm experiment measuring fluctuations in the redshift range 30 ≤ z ≤ 100 with a 0.01-MHz bandwidth and sub-arcminute angular resolution could potentially improve bounds by several orders of magnitude for most features compared to current Planck bounds. At the same time, 21 cm tomography also opens up a unique window into features that are located on very small scales.
Constraints on the dark matter and dark energy interactions from weak lensing bispectrum tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
An, Rui; Feng, Chang; Wang, Bin, E-mail: an_rui@sjtu.edu.cn, E-mail: chang.feng@uci.edu, E-mail: wang_b@sjtu.edu.cn
We estimate uncertainties of cosmological parameters for phenomenological interacting dark energy models using the weak lensing convergence power spectrum and bispectrum. We focus on bispectrum tomography and examine how well the weak lensing bispectrum with tomography can constrain the interactions between dark sectors, as well as other cosmological parameters. Employing the Fisher matrix analysis, we forecast parameter uncertainties derived from weak lensing bispectra with a two-bin tomography and place upper bounds on the strength of the interactions between the dark sectors. The cosmic shear will be measured from upcoming weak lensing surveys with high sensitivity, which enables us to use the higher-order correlation functions of weak lensing to constrain the interaction between dark sectors and will potentially provide more stringent results when combined with other observations.
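The Fisher-matrix machinery used in forecasts of this kind reduces, for a Gaussian likelihood with parameter-independent noise, to a simple computation: F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j)/sigma_k^2, with the marginalised 1-sigma error on theta_i given by sqrt((F^-1)_ii). The derivatives and noise levels below are toy stand-ins, not actual lensing bispectra.

```python
import numpy as np

def fisher(dmu_dtheta, sigma):
    """dmu_dtheta: (n_data, n_params) model derivatives;
    sigma: (n_data,) Gaussian noise per data point."""
    w = dmu_dtheta / sigma[:, None]
    return w.T @ w

dmu = np.array([[1.0, 0.3],
                [0.2, 1.1],
                [0.5, 0.7]])          # toy derivatives w.r.t. 2 parameters
sig = np.array([0.1, 0.2, 0.15])
F = fisher(dmu, sig)
cov = np.linalg.inv(F)                # forecast parameter covariance
print("marginalised 1-sigma errors:", np.sqrt(np.diag(cov)))
```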
NASA Astrophysics Data System (ADS)
Nordio, Sergio; Flapp, Federica
2017-04-01
The aim of this work is to present some experiences of intergenerational education about meteorology and climatology issues carried out with school pupils from 6 to 19 years old, through a peer-tutoring methodology. These experiences started in 2003 and each year the project involves about 500 students in the Friuli Venezia Giulia region (about 8000 km2) in northeastern Italy. A group of volunteers (older students from upper secondary school, 17-19 years old) play the role of "tutor": they receive supplementary training on meteorology and climatology, and then, during students' meetings and/or public events, they teach younger pupils how to use meteorological instruments (thermometer, hygrometer, barometer, anemometer, rain gauges, etc.) and they carry out interactive experiences such as "game-experiments" to better understand some meteorological concepts, like the density of fluids, and some climatological notions, like the effects of climate change, with an exhibit that simulates the greenhouse effect. They also do some meteorological forecasting exercises, using meteorological maps, as if they were actual forecasters. All these activities are addressed to pupils from primary (age 6-11) and lower secondary schools (age 11-14), and both tutors and their younger "apprentices" are not only cognitively but also emotionally involved in such learning experiences. As a second step of this educational process, after consolidating the above-mentioned peer-tutoring activities, high school students are being actively involved in developing visual tools - e.g. video clips, interviews and cartoons - in order to communicate climate change issues in the most effective way to younger pupils. Keywords: meteorology, climatology, climate change, schools, education, communication.
Benchmarking NWP Kernels on Multi- and Many-core Processors
NASA Astrophysics Data System (ADS)
Michalakes, J.; Vachharajani, M.
2008-12-01
Increased computing power for weather, climate, and atmospheric science has provided direct benefits for defense, agriculture, the economy, the environment, and public welfare and convenience. Today, very large clusters with many thousands of processors are allowing scientists to move forward with simulations of unprecedented size. But time-critical applications such as real-time forecasting or climate prediction need strong scaling: faster nodes and processors, not more of them. Moreover, the need for good cost-performance has never been greater, both in terms of performance per watt and per dollar. For these reasons, the new generations of multi- and many-core processors being mass produced for commercial IT and "graphical computing" (video games) are being scrutinized for their ability to exploit the abundant fine-grain parallelism in atmospheric models. We present results of our work to date identifying key computational kernels within the dynamics and physics of a large community NWP model, the Weather Research and Forecast (WRF) model. We benchmark and optimize these kernels on several different multi- and many-core processors. The goals are to (1) characterize and model performance of the kernels in terms of computational intensity, data parallelism, memory bandwidth pressure, memory footprint, etc., (2) enumerate and classify effective strategies for coding and optimizing for these new processors, (3) assess difficulties and opportunities for tool or higher-level language support, and (4) establish a continuing set of kernel benchmarks that can be used to measure and compare effectiveness of current and future designs of multi- and many-core processors for weather and climate applications.
NASA Astrophysics Data System (ADS)
Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.
2016-12-01
A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and the arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, one focusses on creating water that looks as realistic as possible. The focus on realistic perception (versus the focus on physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the design of emerging types of water visualizations. The examples that we analyze range from dynamically animated forecasts, interactive paintings, infographics, and modern cartography to web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from inter-disciplinary collaborations.
Bae, Jonghoon; Cha, Young-Jae; Lee, Hyungsuk; Lee, Boyun; Baek, Sojung; Choi, Semin; Jang, Dayk
2017-01-01
This study examines whether the way that a person makes inferences about unknown events is associated with his or her social relations, more precisely, those characterized by ego network density that reflects the structure of a person's immediate social relation. From the analysis of individual predictions over the Go match between AlphaGo and Sedol Lee in March 2016 in Seoul, Korea, this study shows that the low-density group scored higher than the high-density group in the accuracy of the prediction over a future state of a social event, i.e., the outcome of the first game. We corroborated this finding with three replication tests that asked the participants to predict the following: film awards, President Park's impeachment in Korea, and the counterfactual assessment of the US presidential election. Taken together, this study suggests that network density is negatively associated with vision advantage, i.e., the ability to discover and forecast an unknown aspect of a social event.
NASA Astrophysics Data System (ADS)
Hartkorn, O. A.; Ritter, B.; Meskers, A. J. H.; Miles, O.; Russwurm, M.; Scully, S.; Roldan, A.; Juestel, P.; Reville, V.; Lupu, S.; Ruffenach, A.
2014-12-01
The Earth's magnetosphere is formed as a consequence of the interaction between the planet's magnetic field and the solar wind, a continuous plasma stream from the Sun. A number of different solar wind phenomena have been studied over the past forty years with the intention of understanding and forecasting solar behavior and space weather. In particular, Earth-bound interplanetary coronal mass ejections (CMEs) can significantly disturb the Earth's magnetosphere for a short time and cause geomagnetic storms. We present a mission concept consisting of six spacecraft that are equally spaced in a heliocentric orbit at 0.72 AU. These spacecraft will monitor the plasma properties, the magnetic field's orientation and magnitude, and the 3D propagation trajectory of CMEs heading for Earth. The primary objective of this mission is to increase space weather forecasting time by means of a near real-time information service based upon in-situ and remote measurements of the CME properties. The mission's secondary objective is the improvement of scientific space weather models. In-situ measurements are performed using a Solar Wind Analyzer instrumentation package and fluxgate magnetometers. For remote measurements, coronagraphs are employed. The proposed instruments originate from other space missions, with the intention of reducing mission costs and streamlining the mission design process. Communication with the six identical spacecraft is realized via a deep space network consisting of six ground stations. This network provides an information service that is in uninterrupted contact with the spacecraft, allowing for continuous space weather monitoring. A dedicated data processing center will handle all the data and forward the processed data to the SSA Space Weather Coordination Center. This organization will inform the general public through a space weather forecast. The data processing center will additionally archive the data for the scientific community. This mission concept allows for major advances in space weather forecasting and the scientific modeling of space weather.
VISIR-I: small vessels - least-time nautical routes using wave forecasts
NASA Astrophysics Data System (ADS)
Mannarini, Gianandrea; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo; Iafrati, Alessandro
2016-05-01
A new numerical model for the on-demand computation of optimal ship routes based on sea-state forecasts has been developed. The model, named VISIR (discoVerIng Safe and effIcient Routes), is designed to support decision-makers when planning a marine voyage. The first version of the system, VISIR-I, considers medium and small motor vessels with lengths of up to a few tens of metres and a displacement hull. The model comprises three components: a route optimization algorithm, a mechanical model of the ship, and a processor of the environmental fields. The optimization algorithm is based on a graph-search method with time-dependent edge weights. The algorithm is also able to compute a voluntary ship speed reduction. The ship model accounts for calm water and added wave resistance by making use of just the principal particulars of the vessel as input parameters. It also checks the optimal route for parametric roll, pure loss of stability, and surfriding/broaching-to hazard conditions. The processor of the environmental fields employs significant wave height, wave spectrum peak period, and wave direction forecast fields as input. The topological issues of coastal navigation (islands, peninsulas, narrow passages) are addressed. Examples of VISIR-I routes in the Mediterranean Sea are provided. The optimal route may be longer in terms of miles sailed and yet be faster and safer than the geodetic route between the same departure and arrival locations. Time savings of up to 2.7 % and route lengthening of up to 3.2 % are found for the case studies analysed. However, there is no upper bound on the magnitude of the changes in such route metrics, which, especially in the case of extreme sea states, can be much greater. Route diversions result from the safety constraints and the fact that the algorithm takes into account the full temporal evolution and spatial variability of the environmental fields.
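The core of the optimization component, a graph search with time-dependent edge weights, can be sketched as a label-setting (Dijkstra-style) algorithm under a FIFO assumption: departing an edge later never yields an earlier arrival. The graph encoding and weight function below are illustrative, not VISIR's actual data structures.

```python
import heapq

def earliest_arrival(graph, weight, source, target, t0=0.0):
    """graph: {node: [neighbour, ...]}; weight(u, v, t) -> edge crossing
    time when departing u at time t (could encode waves and safety)."""
    best = {source: t0}
    pq = [(t0, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if u == target:
            return t
        if t > best.get(u, float("inf")):
            continue                      # stale heap entry
        for v in graph[u]:
            tv = t + weight(u, v, t)      # time-dependent edge weight
            if tv < best.get(v, float("inf")):
                best[v] = tv
                heapq.heappush(pq, (tv, v))
    return float("inf")

# Example: a 3-node route where edges slow down over time (rising seas).
g = {"A": ["B", "C"], "B": ["C"], "C": []}
w = lambda u, v, t: 1.0 + 0.1 * t
print(earliest_arrival(g, w, "A", "C"))
```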
Faithful Squashed Entanglement
NASA Astrophysics Data System (ADS)
Brandão, Fernando G. S. L.; Christandl, Matthias; Yard, Jon
2011-09-01
Squashed entanglement is a measure for the entanglement of bipartite quantum states. In this paper we present a lower bound for squashed entanglement in terms of a distance to the set of separable states. This implies that squashed entanglement is faithful, that is, it is strictly positive if and only if the state is entangled. We derive the lower bound on squashed entanglement from a lower bound on the quantum conditional mutual information which is used to define squashed entanglement. The quantum conditional mutual information corresponds to the amount by which strong subadditivity of von Neumann entropy fails to be saturated. Our result therefore sheds light on the structure of states that almost satisfy strong subadditivity with equality. The proof is based on two recent results from quantum information theory: the operational interpretation of the quantum mutual information as the optimal rate for state redistribution and the interpretation of the regularised relative entropy of entanglement as an error exponent in hypothesis testing. The distance to the set of separable states is measured in terms of the LOCC norm, an operationally motivated norm giving the optimal probability of distinguishing two bipartite quantum states, each shared by two parties, using any protocol formed by local quantum operations and classical communication (LOCC) between the parties. A similar result for the Frobenius or Euclidean norm follows as an immediate consequence. The result has two applications in complexity theory. The first application is a quasipolynomial-time algorithm solving the weak membership problem for the set of separable states in LOCC or Euclidean norm. The second application concerns quantum Merlin-Arthur games. Here we show that multiple provers are not more powerful than a single prover when the verifier is restricted to LOCC operations thereby providing a new characterisation of the complexity class QMA.
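For reference, the standard definitions underlying this result can be stated compactly in LaTeX. The definition of squashed entanglement and of the quantum conditional mutual information are standard; the schematic form of the faithfulness bound is indicated only in a comment, since the paper's precise constant is not reproduced here.

```latex
% Squashed entanglement of a bipartite state rho_{AB}: the infimum of half
% the quantum conditional mutual information over all extensions rho_{ABE}.
\[
E_{\mathrm{sq}}(\rho_{AB}) \;=\; \inf_{\rho_{ABE}} \tfrac{1}{2}\, I(A;B|E)_{\rho},
\qquad
I(A;B|E) \;=\; S(AE) + S(BE) - S(ABE) - S(E).
\]
% The faithfulness result then takes the schematic form
%   E_sq(rho_{AB}) >= c * min_{sigma in SEP} || rho_{AB} - sigma ||_{LOCC}^2
% for a universal constant c > 0 (constant not reproduced here).
```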
Crowd Computing as a Cooperation Problem: An Evolutionary Approach
NASA Astrophysics Data System (ADS)
Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel
2013-05-01
Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on Principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain (not very restrictive) conditions, the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general occur much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
NASA Astrophysics Data System (ADS)
Buckley, Bruce W.; Leslie, Lance M.
2000-03-01
The accurate prediction of sudden large changes in the maximum temperature from one day to the next remains one of the major challenges for operational forecasters. It is probably the meteorological parameter most commonly verified and used as a measure of the skill of a meteorological service and one that is immediately evident to the general public. Marked temperature changes over a short period of time have widespread social, economic, health and safety effects on the community. The first part of this paper describes a 40-year climatology for Sydney, Australia, of sudden temperature rises and falls, defined as maximum temperature changes of 5°C or more from one day to the next, for the months of September and October. The nature of the forecasting challenge during the period of the Olympic and Paralympic Games to be held in Sydney in the year 2000 will be described as a special application. The international importance of the accurate prediction of all types of significant weather phenomena during this period has been recognized by the World Meteorological Organisation's Commission for Atmospheric Science. The first World Weather Research Program forecast demonstration project is to be established in the Sydney Office of the Bureau of Meteorology over this period in order to test the ability of existing systems to predict such phenomena. The second part of this study investigates two case studies from the Olympic months in which there were both abrupt temperature rises and falls over a 4-day interval. Currently available high resolution numerical weather prediction systems are found to have significant skill several days ahead in predicting a large amount of the detail of these events, provided they are run at an appropriate resolution. The limitations of these systems are also discussed, with areas requiring further development being identified if the desired levels of accuracy of predictions are to be reliably delivered. Differences between the predictability of sudden temperature rises and sudden temperature falls are also explored.
NASA Astrophysics Data System (ADS)
Li, Juan; Wang, Bin; Yang, Young-Min
2017-04-01
Prediction of Indian summer (June-September) rainfall on regional scales remains an open issue. The operational predictions of West Central Indian summer rainfall (WCI-R) and Peninsular Indian summer rainfall (PI-R) made by the Indian Meteorological Department (IMD) had no skill during 2004-2012. This motivates the present study, which aims at better understanding the predictability sources and physical processes governing summer rainfall variability over these two regions. Analysis of 133 years of data reveals that although the lower boundary forcing associated with enhanced WCI-R and PI-R featured a similar developing La Niña and "east high west low" sea-level pressure (SLP) dipole pattern across the Indo-Pacific, the anomalous high sea surface temperature (SST) over the northern Indian Ocean and weak low pressure over northern Asia tended to enhance PI-R but reduce WCI-R. Based on our understanding of the physical linkages with the predictands, we selected four and two causative predictors for predictions of WCI-R and PI-R, respectively. Intensified summer WCI-R is preceded by (a) an Indian Ocean zonal dipole-like SST tendency (west-warming and east-cooling), (b) a tropical Pacific zonal dipole SST tendency (west-warming and east-cooling), (c) a central Pacific meridional dipole SST tendency (north-cooling and south-warming), and (d) a decreasing SLP tendency over northern Asia in the previous season. Enhanced PI-R was led by central-eastern Pacific cooling and a 2-m temperature cooling tendency east of Lake Balkhash in the previous seasons. These causative processes linking the predictors to WCI-R and PI-R are supported by ensemble numerical experiments using a coupled climate model. For the period 1871-2012, the physics-based empirical (P-E) prediction models built on these predictors achieve cross-validated forecast temporal correlation coefficient skills of 0.55 and 0.47 for WCI-R and PI-R, respectively. The independent forecast skill is significantly higher than the skill of the operational seasonal forecasts made by the IMD for the period 2004-2012. These prediction models offer a tool for seasonal prediction, and their retrospective forecast skills provide an estimate of the lower bound of the predictability of WCI-R and PI-R.
Complexity study on the Cournot-Bertrand mixed duopoly game model with market share preference
NASA Astrophysics Data System (ADS)
Ma, Junhai; Sun, Lijian; Hou, Shunqi; Zhan, Xueli
2018-02-01
In this paper, a Cournot-Bertrand duopoly model with market share preference is established. It is assumed that there is a degree of product differentiation between the two firms, where one firm takes price as its decision variable and the other takes quantity. Both firms are boundedly rational, with linear cost and demand functions. The stability of the equilibrium points is analyzed, and the effects of some parameters (α, β, d and v1) on the model's stability are studied. Basins of attraction are investigated, and the evolution process is shown as the output adjustment speed increases. The simulation results show that instability will increase the average utility of the firm that determines quantity and reduce the average utility of the firm that determines price.
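A toy version of such a bounded-rationality adjustment map can be iterated directly. The demand specification, profit gradients, and parameter values below are illustrative stand-ins for the paper's model (which additionally includes the market share preference term), chosen only to show the gradient-adjustment dynamics of a quantity-setter facing a price-setter.

```python
# Toy Cournot-Bertrand gradient dynamics: firm 1 adjusts quantity q1,
# firm 2 adjusts price p2, each proportionally to its marginal profit.
a, c, d = 10.0, 2.0, 0.5        # market size, unit cost, substitutability
alpha, beta = 0.15, 0.05        # adjustment speeds (bounded rationality)

def step(q1, p2):
    q2 = a - p2 + d * q1        # direct demand faced by the price-setter
    p1 = a - q1 - d * q2        # inverse demand faced by the quantity-setter
    dpi1_dq1 = p1 - c - (1 + d * d) * q1    # marginal profit of firm 1
    dpi2_dp2 = a + d * q1 + c - 2 * p2      # marginal profit of firm 2
    return q1 + alpha * q1 * dpi1_dq1, p2 + beta * p2 * dpi2_dp2

q1, p2 = 1.0, 3.0
for _ in range(500):
    q1, p2 = step(q1, p2)
print("long-run (q1, p2):", round(q1, 3), round(p2, 3))
```

With these adjustment speeds the map settles to its interior fixed point; raising alpha or beta far enough destabilizes it, which is the kind of bifurcation behavior the paper studies.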
Arts and technology - Mosaic new techniques and procedures
NASA Astrophysics Data System (ADS)
Papiu, G. A.; Suciu, N.
2017-05-01
The relationship between art and technology has always been inseparable and systematic, with artists appealing to various technologies, tools and practices that help stimulate their imagination. Today there is a new category of artists, coming from technical or scientific fields, who are being 'trapped' in this 'game of art'. Mosaic, even though it is an old technique, has responded to social requirements and evolved over time, remaining constantly related to aesthetic and artistic thinking and to the discoveries of science, permanently assimilating new techniques and technologies, and diversifying its artistic forms of expression and methods of transposition. No longer bound to religious institutions, its birthplace, mosaic has today migrated to all kinds of public spaces. Works of art in public space have become an active factor in reshaping the urban aesthetic landscape.
Searching for the QCD Axion with Gravitational Microlensing
NASA Astrophysics Data System (ADS)
Fairbairn, Malcolm; Marsh, David J. E.; Quevillon, Jérémie
2017-07-01
The phase transition responsible for axion dark matter (DM) production can create large amplitude isocurvature perturbations, which collapse into dense objects known as axion miniclusters. We use microlensing data from the EROS survey and from recent observations with the Subaru Hyper Suprime Cam to place constraints on the minicluster scenario. We compute the microlensing event rate for miniclusters, treating them as spatially extended objects. Using the published bounds on the number of microlensing events, we bound the fraction of DM collapsed into miniclusters, fMC. For an axion with temperature-dependent mass consistent with the QCD axion, we find fMC < 0.083 (ma/100 μeV)^0.12, which represents the first observational constraint on the minicluster fraction. We forecast that a high-efficiency observation of around ten nights with Subaru would be sufficient to constrain fMC ≲ 0.004 over the entire QCD axion mass range. We make various approximations to derive these constraints, and dedicated analyses by the observing teams of EROS and Subaru are necessary to confirm our results. If accurate theoretical predictions for fMC can be made in the future, then microlensing can be used to exclude or discover the QCD axion. Further details of our computations are presented in a companion paper [M. Fairbairn, D. J. E. Marsh, J. Quevillon, and S. Rozier (to be published)].
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
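The meta-Gaussian construction rests on the normal quantile transform (NQT): forecast and observation are mapped to standard normal scores through their marginal CDFs, where the dependence becomes Gaussian-linear and the posterior is available in closed form. The sketch below uses empirical CDFs and synthetic data; it illustrates the mechanism, not the BPF implementation used in the experiments described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
w_hist = rng.normal(10, 4, 2000)             # historical observations
x_hist = w_hist + rng.normal(0, 2, 2000)     # matched forecasts

def nqt(sample, values):
    """Normal quantile transform via the empirical CDF of `sample`."""
    ranks = np.searchsorted(np.sort(sample), values, side="right")
    p = np.clip(ranks / (len(sample) + 1), 1e-6, 1 - 1e-6)
    return stats.norm.ppf(p)

z_w, z_x = nqt(w_hist, w_hist), nqt(x_hist, x_hist)
rho = np.corrcoef(z_w, z_x)[0, 1]            # dependence in normal space

def posterior_quantile(x_new, q):
    """Probabilistic forecast quantile given a deterministic forecast."""
    zx = nqt(x_hist, np.array([x_new]))[0]
    mean, sd = rho * zx, np.sqrt(1 - rho ** 2)   # normal-linear posterior
    zq = mean + sd * stats.norm.ppf(q)
    return np.quantile(w_hist, stats.norm.cdf(zq))  # back-transform

print([round(posterior_quantile(14.0, q), 2) for q in (0.1, 0.5, 0.9)])
```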
A virtual pebble game to ensemble average graph rigidity.
González, Luis C; Wang, Hui; Livesay, Dennis R; Jacobs, Donald J
2015-01-01
The body-bar Pebble Game (PG) algorithm is commonly used to calculate network rigidity properties in proteins and polymeric materials. To account for fluctuating interactions such as hydrogen bonds, an ensemble of constraint topologies is sampled, and average network properties are obtained by averaging PG characterizations. At a simpler level of sophistication, Maxwell constraint counting (MCC) provides a rigorous lower bound for the number of internal degrees of freedom (DOF) within a body-bar network, and it is commonly employed to test whether a molecular structure is globally under-constrained or over-constrained. MCC is a mean field approximation (MFA) that ignores spatial fluctuations of distance constraints by replacing the actual molecular structure with an effective medium that has distance constraints globally distributed with perfect uniform density. The Virtual Pebble Game (VPG) algorithm is an MFA that retains spatial inhomogeneity in the density of constraints on all length scales. Network fluctuations due to distance constraints that may be present or absent based on binary random dynamic variables are suppressed by replacing all possible constraint topology realizations with the probabilities that distance constraints are present. The VPG algorithm is isomorphic to the PG algorithm, where the integers for counting "pebbles" placed on vertices or edges in the PG map to real numbers representing the probability to find a pebble. In the VPG, edges are assigned pebble capacities, and pebble movements become a continuous flow of probability within the network. Comparisons between the VPG and average PG results over a test set of proteins and disordered lattices demonstrate that the VPG estimates the ensemble-average PG results well. The VPG runs about 20% faster than a single PG, and it provides a pragmatic alternative to averaging PG rigidity characteristics over an ensemble of constraint topologies. The utility of the VPG falls between the most accurate but slowest method of ensemble averaging over hundreds to thousands of independent PG runs and the fastest but least accurate MCC.
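Maxwell constraint counting itself is a one-line computation, shown below for a body-bar network with six degrees of freedom per body. Since redundant bars remove no degrees of freedom, the count is a lower bound on the internal DOF; the example numbers are illustrative.

```python
def maxwell_dof_lower_bound(n_bodies, n_bars):
    """Maxwell count for a body-bar network: 6 DOF per body, minus the
    6 global rigid-body motions, minus one DOF per bar (constraint).
    Redundant bars make this a lower bound on internal DOF."""
    return max(6 * n_bodies - 6 - n_bars, 0)

# A network of 10 bodies and 40 bars: at least 14 internal DOF remain,
# so this structure is globally under-constrained (flexible) by MCC.
print(maxwell_dof_lower_bound(10, 40))
```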
Figures of merit for present and future dark energy probes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mortonson, Michael J.; Huterer, Dragan; Hu, Wayne
2010-09-15
We compare current and forecasted constraints on dynamical dark energy models from Type Ia supernovae and the cosmic microwave background using figures of merit based on the volume of the allowed dark energy parameter space. For a two-parameter dark energy equation of state that varies linearly with the scale factor, and assuming a flat universe, the area of the error ellipse can be reduced by a factor of ~10 relative to current constraints by future space-based supernova data and CMB measurements from the Planck satellite. If the dark energy equation of state is described by a more general basis of principal components, the expected improvement in volume-based figures of merit is much greater. While the forecasted precision for any single parameter is only a factor of 2-5 smaller than current uncertainties, the constraints on dark energy models bounded by -1 ≤ w ≤ 1 improve for approximately 6 independent dark energy parameters, resulting in a reduction of the total allowed volume of principal component parameter space by a factor of ~100. Typical quintessence models can be adequately described by just 2-3 of these parameters even given the precision of future data, leading to a more modest but still significant improvement. In addition to advances in supernova and CMB data, percent-level measurement of absolute distance and/or the expansion rate is required to ensure that dark energy constraints remain robust to variations in spatial curvature.
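A volume-based figure of merit of this kind can be computed directly from a parameter covariance matrix, since the error-ellipsoid volume scales as the square root of its determinant. The covariance values below are toy numbers used only to show that shrinking a two-parameter error area tenfold raises this figure of merit tenfold.

```python
import numpy as np

def figure_of_merit(cov):
    """Inverse volume of the error ellipsoid: FoM = 1 / sqrt(det C)."""
    return 1.0 / np.sqrt(np.linalg.det(cov))

cov_now = np.array([[0.04, -0.01],
                    [-0.01, 0.25]])     # toy current 2-parameter errors
cov_future = cov_now / 10.0             # toy forecast: area reduced 10x
print(figure_of_merit(cov_future) / figure_of_merit(cov_now))  # -> 10.0
```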
Goldschmidt, Pablo
2011-06-01
The milestones marking substantial changes in the lives or in the survival of humans deserve to be remembered. It has been only 11 years since we experienced an event that not even the most optimistic amongst us would have predicted before 1997. Let us place the facts in time. At the beginning of the 80s, we faced the distressing reality that more than three quarters of all children, men and women found to have antibodies directed against a new infectious agent named human immunodeficiency virus (HIV) were bound to die. The mere reactivity of the serum of a human being against a virus characterized in 1983 (antibodies) handed down an almost inevitable sentence of death. At that time the evolution of this viral infection was assessed by the quantification of a subtype of white blood cells, the auxiliary lymphocytes or CD4 cells. This count was the principal evidence for most of the predictions on how long a person might survive without degradation, and the value of such cells was the abacus used to forecast the time when an individual would develop irreversible blindness, to anticipate respiratory failure, and to predict the time before weakness would appear after devastating diarrhea, etc. We should recall that the CD4 cell count was even used as a predictor of the initiation of cognitive shrinkage, forecasting dementia as well as the signs that would take hold of personality as a consequence of infections or neoplastic transformations in the encephalitic mass.
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. (Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.)
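The additive form of MMFE is easy to simulate: the forecast of a given future flow evolves through zero-mean, independent updates, so earlier forecasts are unbiased predictors of later ones and uncertainty shrinks as lead time decreases. The stage variances below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 6                                            # number of update stages
sigma = np.array([3.0, 2.5, 2.0, 1.5, 1.0, 0.5]) # update std per stage

def simulate_forecast_path(f0=50.0):
    """Additive MMFE: f_{t+1} = f_t + zero-mean update; the final value
    plays the role of the realised streamflow."""
    f = [f0]
    for t in range(T):
        f.append(f[-1] + rng.normal(0.0, sigma[t]))
    return np.array(f)

paths = np.array([simulate_forecast_path() for _ in range(10000)])
# Martingale check: E[f_{t+1} | f_t] = f_t, i.e. mean updates are ~0.
print("mean updates:", np.round((paths[:, 1:] - paths[:, :-1]).mean(0), 3))
```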
NASA Technical Reports Server (NTRS)
Allen, B. Danette; Alexandrov, Natalia
2016-01-01
Incremental approaches to air transportation system development inherit current architectural constraints, which, in turn, place hard bounds on system capacity, efficiency of performance, and complexity. To enable airspace operations of the future, a clean-slate (ab initio) airspace design(s) must be considered. This ab initio National Airspace System (NAS) must be capable of accommodating increased traffic density, a broader diversity of aircraft, and on-demand mobility. System and subsystem designs should scale to accommodate the inevitable demand for airspace services that include large numbers of autonomous Unmanned Aerial Vehicles and a paradigm shift in general aviation (e.g., personal air vehicles) in addition to more traditional aerial vehicles such as commercial jetliners and weather balloons. The complex and adaptive nature of ab initio designs for the future NAS requires new approaches to validation, adding a significant physical experimentation component to analytical and simulation tools. In addition to software modeling and simulation, the ability to exercise system solutions in a flight environment will be an essential aspect of validation. The NASA Langley Research Center (LaRC) Autonomy Incubator seeks to develop a flight simulation infrastructure for ab initio modeling and simulation that assumes no specific NAS architecture and models vehicle-to-vehicle behavior to examine interactions and emergent behaviors among hundreds of intelligent aerial agents exhibiting collaborative, cooperative, coordinative, selfish, and malicious behaviors. The air transportation system of the future will be a complex adaptive system (CAS) characterized by complex and sometimes unpredictable (or unpredicted) behaviors that result from temporal and spatial interactions among large numbers of participants. A CAS not only evolves with a changing environment and adapts to it, it is closely coupled to all systems that constitute the environment. Thus, the ecosystem that contains the system and other systems evolves with the CAS as well. The effects of the emerging adaptation and co-evolution are difficult to capture with only combined mathematical and computational experimentation. Therefore, an ab initio flight simulation environment must accommodate individual vehicles, groups of self-organizing vehicles, and large-scale infrastructure behavior. Inspired by Massively Multiplayer Online Role Playing Games (MMORPG) and Serious Gaming, the proposed ab initio simulation environment is similar to online gaming environments in which player participants interact with each other, affect their environment, and expect the simulation to persist and change regardless of any individual player's active participation.
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 makes its way toward Launch Pad 39B (in the background). The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane is forecast to head north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of the crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
Constraining chameleon field theories using the GammeV afterglow experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhye, A.; Steffen, J. H.; Weltman, A.
2010-01-01
The GammeV experiment has constrained the couplings of chameleon scalar fields to matter and photons. Here, we present a detailed calculation of the chameleon afterglow rate underlying these constraints. The dependence of GammeV constraints on various assumptions in the calculation is studied. We discuss the GammeV-CHameleon Afterglow SEarch (GammeV-CHASE), a second-generation GammeV experiment, which will improve upon GammeV in several major ways. Using our calculation of the chameleon afterglow rate, we forecast model-independent constraints achievable by GammeV-CHASE. We then apply these constraints to a variety of chameleon models, including quartic chameleons and chameleon dark energy models. The new experiment will be able to probe a large region of parameter space that is beyond the reach of current tests, such as fifth force searches, constraints on the dimming of distant astrophysical objects, and bounds on the variation of the fine structure constant.
Chance-Constrained AC Optimal Power Flow for Distribution Systems With Renewables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall'Anese, Emiliano; Baker, Kyri; Summers, Tyler
This paper focuses on distribution systems featuring renewable energy sources (RESs) and energy storage systems, and presents an AC optimal power flow (OPF) approach to optimize system-level performance objectives while coping with uncertainty in both RES generation and loads. The proposed method hinges on a chance-constrained AC OPF formulation where probabilistic constraints are utilized to enforce voltage regulation with prescribed probability. A computationally more affordable convex reformulation is developed by resorting to suitable linear approximations of the AC power-flow equations as well as convex approximations of the chance constraints. The approximate chance constraints provide conservative bounds that hold for arbitrary distributions of the forecasting errors. An adaptive strategy is then obtained by embedding the proposed AC OPF task into a model predictive control framework. Finally, a distributed solver is developed to strategically distribute the solution of the optimization problems across utility and customers.
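A distribution-free convex surrogate of the kind the abstract describes can be obtained with a Cantelli (one-sided Chebyshev) bound: if a nodal voltage has mean v̄ and standard deviation σ under the forecast-error distribution, then v̄ + sqrt((1−ε)/ε)·σ ≤ v_max implies Pr(v > v_max) ≤ ε for any distribution with those moments. The sketch below applies this to a toy linearized feeder; the sensitivity matrix, covariance, bounds, and objective are all illustrative stand-ins, not the paper's formulation.

```python
import cvxpy as cp
import numpy as np

# Toy 3-node feeder with a linearized voltage model: v ~= v0 + R @ p + noise.
np.random.seed(0)
n = 3
R = np.abs(np.random.rand(n, n)) * 0.01   # illustrative sensitivity matrix
v0 = np.full(n, 1.0)                      # nominal voltages (p.u.)
Sigma = 0.02 ** 2 * np.eye(n)             # forecast-error covariance
v_max, eps = 1.05, 0.05                   # limit and violation probability

# Cantelli bound: Pr(v_i > v_max) <= eps is implied by
#   mean(v_i) + sqrt((1 - eps)/eps) * std(v_i) <= v_max,
# for *any* error distribution with this mean and covariance.
k = np.sqrt((1 - eps) / eps)
sigma_v = np.sqrt(np.diag(R @ Sigma @ R.T))   # voltage std at each node

p = cp.Variable(n)                            # controllable RES injections
objective = cp.Maximize(cp.sum(p))            # placeholder: maximize RES use
constraints = [v0 + R @ p + k * sigma_v <= v_max, p >= 0, p <= 0.5]
cp.Problem(objective, constraints).solve()
print("injections:", p.value)
```

Because sigma_v does not depend on the decision variables here, the surrogate constraint stays linear and the problem remains convex, which is the practical payoff of this style of reformulation.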
Turbulent vertical diffusivity in the sub-tropical stratosphere
NASA Astrophysics Data System (ADS)
Pisso, I.; Legras, B.
2008-02-01
Vertical (cross-isentropic) mixing is produced by small-scale turbulent processes which are still poorly understood and poorly parameterized in numerical models. In this work we provide estimates of local equivalent diffusion in the lower stratosphere by comparing balloon-borne high-resolution measurements of chemical tracers with mixing ratios reconstructed from large ensembles of random Lagrangian backward trajectories using European Centre for Medium-Range Weather Forecasts analysed winds and a chemistry-transport model (REPROBUS). We focus on a case study at subtropical latitudes using data from the HIBISCUS campaign. An upper bound on the vertical diffusivity in this case study is found to be of the order of 0.5 m² s⁻¹ in the subtropical region, which is larger than estimates at higher latitudes. The relation between diffusion and dispersion is studied by estimating Lyapunov exponents and studying their variation according to the presence of active dynamical structures.
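The equivalent-diffusion estimate amounts to adding a cross-isentropic random walk to the backward trajectories and selecting the diffusivity whose reconstructed mixing ratio best matches the measurement. A minimal sketch of that idea, with advection omitted and a toy tracer profile assumed:

```python
import numpy as np

def reconstruct(z0, K, tracer_of_z, n_traj=1000, n_steps=500, dt=900.0):
    """Reconstruct a mixing ratio at altitude z0 (m) from an ensemble of
    backward trajectories perturbed by a vertical random walk of diffusivity
    K (m^2/s). Only the diffusive (cross-isentropic) step is shown."""
    rng = np.random.default_rng(1)
    z = np.full(n_traj, z0, dtype=float)
    for _ in range(n_steps):
        # Random-walk increment equivalent to diffusion: dz = sqrt(2 K dt) N(0,1)
        z += np.sqrt(2.0 * K * dt) * rng.standard_normal(n_traj)
    # Sample the tracer where each trajectory ends; average over the ensemble.
    return tracer_of_z(z).mean()

# Toy exponential tracer; scan K and keep the best match to a "measurement".
tracer = lambda z: np.exp(-z / 7000.0)
measured = 0.061
best_K = min([0.01, 0.1, 0.5, 1.0],
             key=lambda K: abs(reconstruct(17000.0, K, tracer) - measured))
print("best-fit equivalent diffusivity:", best_K, "m2/s")
```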
Weather forecasting expert system study
NASA Technical Reports Server (NTRS)
1985-01-01
Weather forecasting is critical to both the Space Transportation System (STS) ground operations and the launch/landing activities at NASA Kennedy Space Center (KSC). The current launch frequency places significant demands on the USAF weather forecasters at the Cape Canaveral Forecasting Facility (CCFF), who currently provide the weather forecasting for all STS operations. As launch frequency increases, KSC's weather forecasting problems will be greatly magnified. The single most important problem is the shortage of highly skilled forecasting personnel. The development of forecasting expertise is difficult and requires several years of experience. Frequent personnel changes within the forecasting staff jeopardize the accumulation and retention of experience-based weather forecasting expertise. The primary purpose of this project was to assess the feasibility of using Artificial Intelligence (AI) techniques to ameliorate this shortage of experts by capturing and incorporating the forecasting knowledge of current expert forecasters into a Weather Forecasting Expert System (WFES) which would then be made available to less experienced duty forecasters.
ERIC Educational Resources Information Center
Walford, Rex
Six games designed for classroom use are described in this book: 1) Shopping Game; 2) Bus Service Game; 3) North Sea Gas Game; 4) Railway Pioneers Game; 5) Development Game; and 6) Export Drive Game. The description of each game comprises a separate chapter, and includes information about the general aims of the game, how the various game elements…
Integration of Behind-the-Meter PV Fleet Forecasts into Utility Grid System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoff, Thomas; Kankiewicz, Adam
Four major research objectives were completed over the course of this study. Three of the objectives were to evaluate three new state-of-the-art solar irradiance forecasting models. The fourth objective was to improve the California Independent System Operator's (ISO) load forecasts by integrating behind-the-meter (BTM) PV forecasts. The three new state-of-the-art solar irradiance forecasting models included: the infrared (IR) satellite-based cloud motion vector (CMV) model; the WRF-SolarCA model and variants; and the Optimized Deep Machine Learning (ODML)-training model. The first two forecasting models targeted known weaknesses in current operational solar forecasts. They were benchmarked against existing operational numerical weather prediction (NWP) forecasts, visible satellite CMV forecasts, and measured PV plant power production. IR CMV, WRF-SolarCA, and ODML-training forecasting models all improved the forecast to a significant degree. Improvements varied depending on time of day, cloudiness index, and geographic location. The fourth objective was to demonstrate that the California ISO's load forecasts could be improved by integrating BTM PV forecasts. This objective represented the project's most exciting and applicable gains. Operational BTM forecasts consisting of 200,000+ individual rooftop PV forecasts were delivered into the California ISO's real-time automated load forecasting (ALFS) environment. They were then evaluated side-by-side with operational load forecasts with no BTM treatment. Overall, ALFS-BTM day-ahead (DA) forecasts performed better than baseline ALFS forecasts when compared to actual load data. Specifically, ALFS-BTM DA forecasts were observed to have the largest reduction of error during the afternoon on cloudy days. Shorter term 30-minute-ahead ALFS-BTM forecasts were shown to have less error under all sky conditions, especially during the morning time periods when traditional load forecasts often experience their largest uncertainties. This work culminated in a GO decision being made by the California ISO to include zonal BTM forecasts into its operational load forecasting system. The California ISO's Manager of Short Term Forecasting, Jim Blatchford, summarized the research performed in this project with the following quote: "The behind-the-meter (BTM) California ISO region forecasting research performed by Clean Power Research and sponsored by the Department of Energy's SUNRISE program was an opportunity to verify value and demonstrate improved load forecast capability. In 2016, the California ISO will be incorporating the BTM forecast into the Hour Ahead and Day Ahead load models to look for improvements in the overall load forecast accuracy as BTM PV capacity continues to grow."
Optimising seasonal streamflow forecast lead time for operational decision making in Australia
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Q. J.; Zhou, Senlin; Feikema, Paul
2016-10-01
Statistical seasonal forecasts of 3-month streamflow totals are released in Australia by the Bureau of Meteorology and updated on a monthly basis. The forecasts are often released in the second week of the forecast period, due to the onerous forecast production process. The current service relies on models built using data for complete calendar months, meaning the forecast production process cannot begin until the first day of the forecast period. The bureau therefore needs to transition to a service that provides forecasts before the beginning of the forecast period; timelier forecast release will become critical as sub-seasonal (monthly) forecasts are developed. Increasing the forecast lead time to one month ahead is not considered a viable option for Australian catchments, which typically lack any predictability associated with snowmelt. The bureau's forecasts are built around Bayesian joint probability models that have antecedent streamflow, rainfall and climate indices as predictors. In this study, we adapt the modelling approach so that forecasts can have any number of days of lead time. Daily streamflow and sea surface temperatures are used to develop predictors based on 28-day sliding windows. Forecasts are produced for 23 forecast locations with 0-14 and 21 days of lead time. The forecasts are assessed in terms of continuous ranked probability score (CRPS) skill score and reliability metrics. CRPS skill scores, on average, decrease monotonically with increasing lead time, although both positive and negative differences are observed. Considering only skilful forecast locations, CRPS skill scores at 7-day lead time are reduced on average by 4 percentage points, with differences largely contained within +5 to -15 percentage points. A flexible forecasting system that allows for any number of days of lead time could benefit Australian seasonal streamflow forecast users by allowing more time for forecasts to be disseminated, comprehended and made use of prior to the commencement of a forecast season. The system would also allow forecasts to be updated if necessary.
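The CRPS and its skill score can be computed directly from ensemble members. A minimal sketch using the standard empirical estimator, with synthetic forecasts and a placeholder climatological reference:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one ensemble forecast against a scalar observation:
    E|X - y| - 0.5 * E|X - X'|, expectations taken over ensemble members."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

def crps_skill_score(fcst_sets, obs, clim_members):
    """CRPSS = 1 - CRPS_forecast / CRPS_reference, averaged over cases."""
    crps_f = np.mean([crps_ensemble(m, o) for m, o in zip(fcst_sets, obs)])
    crps_c = np.mean([crps_ensemble(clim_members, o) for o in obs])
    return 1.0 - crps_f / crps_c

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 50.0, size=30)                     # toy seasonal totals
fcsts = [o + rng.normal(0, 20, size=100) for o in obs]  # toy ensembles
print("CRPSS:", crps_skill_score(fcsts, obs, rng.gamma(2.0, 50.0, 500)))
```

A positive CRPSS means the forecast beats climatology; a drop of a few percentage points with added lead time, as reported above, corresponds to crps_f creeping toward crps_c.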
NASA Astrophysics Data System (ADS)
Christensen, Hannah; Moroz, Irene; Palmer, Tim
2015-04-01
Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.
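To our reading, the moment-based score has the per-case form (s² − e² − e·s·g)², with s the ensemble standard deviation, g the ensemble skewness, and e the ensemble-mean error; treat that formula as an assumption to be checked against the published definition. A sketch that contrasts a statistically consistent ensemble with an overconfident one:

```python
import numpy as np

def error_spread_score(ens, obs):
    """Moment-based Error-spread Score, averaged over forecast cases.
    ens: (cases, members); obs: (cases,). Smaller is better.
    Per-case score assumed here: (s^2 - e^2 - e*s*g)^2."""
    m = ens.mean(axis=1)                                  # ensemble mean
    s = ens.std(axis=1)                                   # ensemble spread
    g = ((ens - m[:, None]) ** 3).mean(axis=1) / s ** 3   # ensemble skewness
    e = m - obs                                           # ensemble-mean error
    return np.mean((s ** 2 - e ** 2 - e * s * g) ** 2)

rng = np.random.default_rng(2)
center = rng.normal(0, 1, 200)
truth = center + rng.normal(0, 1, 200)                    # obs ~ one more draw
reliable = center[:, None] + rng.normal(0, 1, (200, 50))  # spread matches error
overconfident = center[:, None] + rng.normal(0, 0.2, (200, 50))
print(error_spread_score(reliable, truth),
      error_spread_score(overconfident, truth))           # reliable scores lower
```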
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anghileri, Daniela; Voisin, Nathalie; Castelletti, Andrea F.
In this study, we develop a forecast-based adaptive control framework for Oroville reservoir, California, to assess the value of seasonal and inter-annual forecasts for reservoir operation. We use an Ensemble Streamflow Prediction (ESP) approach to generate retrospective, one-year-long streamflow forecasts based on the Variable Infiltration Capacity hydrology model. The optimal sequence of daily release decisions from the reservoir is then determined by Model Predictive Control, a flexible and adaptive optimization scheme. We assess the forecast value by comparing system performance based on the ESP forecasts with that based on climatology and a perfect forecast. In addition, we evaluate system performance based on a synthetic forecast, which is designed to isolate the contribution of seasonal and inter-annual forecast skill to the overall value of the ESP forecasts. Using the same ESP forecasts, we generalize our results by evaluating forecast value as a function of forecast skill, reservoir features, and demand. Our results show that perfect forecasts are valuable when the water demand is high and the reservoir is sufficiently large to allow for annual carry-over. Conversely, ESP forecast value is highest when the reservoir can shift water on a seasonal basis. On average, for the system evaluated here, the overall ESP value is 35% less than the perfect forecast value. The inter-annual component of the ESP forecast contributes 20-60% of the total forecast value. Improvements in the seasonal component of the ESP forecast would increase the overall ESP forecast value between 15 and 20%.
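The receding-horizon logic can be sketched compactly: at each decision time, optimize releases over the forecast horizon, apply only the first release, then re-optimize when the next forecast arrives. All numbers below (storage capacity, demand, inflow statistics) are invented stand-ins for the Oroville setup.

```python
import cvxpy as cp
import numpy as np

def mpc_release(storage, inflow_fcst, s_max=3.5e9, r_max=2.0e7, demand=8.0e6):
    """One model-predictive-control step: choose releases over the forecast
    horizon to track demand subject to storage bounds; return only the first
    day's release. Units are m^3 and m^3/day, purely illustrative."""
    H = len(inflow_fcst)
    r = cp.Variable(H, nonneg=True)
    s = cp.Variable(H + 1)
    cons = [s[0] == storage, r <= r_max, s >= 0, s <= s_max]
    for t in range(H):
        cons.append(s[t + 1] == s[t] + inflow_fcst[t] - r[t])   # mass balance
    cp.Problem(cp.Minimize(cp.sum_squares(r - demand)), cons).solve()
    return float(r.value[0])

rng = np.random.default_rng(3)
storage, releases = 2.0e9, []
for day in range(10):                        # simulation loop
    fcst = rng.gamma(2.0, 4.0e6, size=30)    # forecast inflows, next 30 days
    rel = mpc_release(storage, fcst)
    inflow_true = rng.gamma(2.0, 4.0e6)      # realized inflow differs from forecast
    storage = min(max(storage + inflow_true - rel, 0.0), 3.5e9)
    releases.append(rel)
print(releases)
```

Forecast value in this framing is simply the performance gap between running the loop with ESP forecasts versus climatology versus the realized inflows themselves.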
NASA Astrophysics Data System (ADS)
La, I.; Yum, S. S.; Yeom, J. M.; Gultepe, I.
2017-12-01
Since the microphysical and dynamical processes of fog are not well known and involve non-linear relationships among the processes related to fog formation, improving the accuracy of fog forecasting/nowcasting systems is challenging. For these reasons, understanding fog mechanisms is needed to develop a fog forecasting system. We therefore focus on fog-turbulence interactions and fog-gravity wave interactions. Many studies have noted that turbulence plays important roles in fog; however, arguments about the effect of turbulent mixing on fog formation conflict. Several studies suggested that turbulent mixing suppresses fog formation, while others reported that turbulent mixing contributes to it. In addition, quasi-periodic oscillations of temperature, visibility, and vertical velocity, with periods of 10-20 minutes, have been observed in fog and attributed to gravity waves, which play significant dynamic roles in the atmosphere. Furthermore, a numerical study suggested that gravity waves simulated near the top of the fog layer may affect fog microphysics. Thus, we investigate the effects of turbulent mixing on fog formation and the influence of gravity waves on fog microphysics to understand fog structure in Pyeongchang. We analyze data obtained from a Doppler lidar and a 3.5 m meteorological observation tower equipped with a 3D ultrasonic anemometer, an IR sensor, and a fog monitor during the ICE-POP (International Collaborative Experiments for Pyeongchang 2018 Olympic and Paralympic winter games) campaign. Among these instruments, the Doppler lidar is well suited to observing gravity waves near the fog top, whereas the in situ measurements have small spatial coverage. The instruments are installed in the mountainous terrain of Pyeongchang, Korea. More details will be presented at the conference.
Sub-kilometer Numerical Weather Prediction in complex urban areas
NASA Astrophysics Data System (ADS)
Leroyer, S.; Bélair, S.; Husain, S.; Vionnet, V.
2013-12-01
A sub-kilometer atmospheric modeling system with grid spacings of 2.5 km, 1 km and 250 m and including urban processes is currently being developed at the Meteorological Service of Canada (MSC) in order to provide more accurate weather forecasts at the city scale. Atmospheric lateral boundary conditions are provided by the 15-km Canadian Regional Deterministic Prediction System (RDPS). Surface physical processes are represented with the Town Energy Balance (TEB) model for built-up covers and with the Interactions between the Surface, Biosphere, and Atmosphere (ISBA) land surface model for natural covers. In this study, several research experiments over large metropolitan areas and using observational networks at the urban scale are presented, with a special emphasis on the representation of local atmospheric circulations and their impact on extreme weather forecasting. First, numerical simulations are performed over the Vancouver metropolitan area during a summertime Intense Observing Period (IOP of 14-15 August 2008) of the Environmental Prediction in Canadian Cities (EPiCC) observational network. The influence of horizontal resolution on the fine-scale representation of the sea-breeze development over the city is highlighted (Leroyer et al., 2013). Then, severe storm cases occurring in summertime within the Greater Toronto Area (GTA) are simulated. To support the 2015 Pan American and Parapan American Games to be held in the GTA, a dense observational network has recently been deployed over this region to support model evaluations at the urban and meso scales. In particular, simulations are conducted for the case of 8 July 2013, when exceptional rainfalls were recorded. Leroyer, S., S. Bélair, J. Mailhot, S.Z. Husain, 2013: Sub-kilometer Numerical Weather Prediction in an Urban Coastal Area: A case study over the Vancouver Metropolitan Area, submitted to Journal of Applied Meteorology and Climatology.
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.
2000-01-01
This report describes the outcome of Phase 1 of the AMU's Improved Anvil Forecasting task. Forecasters in the 45th Weather Squadron and the Spaceflight Meteorology Group have found that anvil forecasting is a difficult task when predicting Launch Commit Criteria (LCC) and Flight Rule (FR) violations. The purpose of this task is to determine the technical feasibility of creating an anvil-forecasting tool. Work on this study was separated into three steps: a literature search, forecaster discussions, and a determination of technical feasibility. The literature search revealed no existing anvil-forecasting techniques. However, there appears to be growing interest in anvils in recent years. If this interest continues to grow, more information will be available to aid in developing a reliable anvil-forecasting tool. The forecaster discussions revealed an array of methods by which better forecasting techniques could be developed. The forecasters have ideas based on sound meteorological principles and personal experience in forecasting and analyzing anvils. Based on the information gathered in the discussions with the forecasters, the conclusion of this report is that it is technically feasible at this time to develop an anvil forecasting technique that will significantly contribute to confidence in anvil forecasts.
Flare forecasting at the Met Office Space Weather Operations Centre
NASA Astrophysics Data System (ADS)
Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.
2017-04-01
The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Cathy
2014-04-30
This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events. The overall bulk error statistics calculated over the first six hours of the forecasts at both the individual wind plant and at the system-wide aggregate level over the one year study period showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed that there were significant differences in seasonal forecast errors between the various model-based power forecasts. The results from the analysis of the various wind power forecast errors by model runtime and forecast hour showed that the forecast errors were largest during the times of day that have increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.
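Bulk error statistics stratified by season and model runtime/forecast hour, as described, reduce to a groupby over the forecast archive. A minimal pandas sketch with an invented schema:

```python
import numpy as np
import pandas as pd

# Toy forecast archive; column names are illustrative, not the study's schema.
rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "season": rng.choice(["DJF", "MAM", "JJA", "SON"], n),
    "fcst_hour": rng.integers(1, 7, n),          # 0-6 h forecast horizon
    "power_fcst": rng.uniform(0, 100, n),        # forecast plant output (MW)
})
df["power_obs"] = df["power_fcst"] + rng.normal(0, 8, n)
df["err"] = df["power_fcst"] - df["power_obs"]

# Bias, MAE, and RMSE per season and forecast hour.
stats = (df.groupby(["season", "fcst_hour"])["err"]
           .agg(bias="mean",
                mae=lambda e: e.abs().mean(),
                rmse=lambda e: np.sqrt((e ** 2).mean())))
print(stats.head())
```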
Lyons, Elizabeth J.; Tate, Deborah F.; Komoski, Stephanie E.; Carr, Philip M.; Ward, Dianne S.
2012-01-01
Background Some active video games have been found to promote physical activity adherence because of enjoyment. However, many active games are exercise themed, which may interfere with the distracting properties that make game-based exercise more enjoyable than traditional exercise. This study compared exercise-themed and game-themed active games to investigate differences in energy expenditure and enjoyment. Method Young adults (N = 100, 50 female, 55 overweight, aged 18–35 years) played two of four Wii Fit games (one aerobic game and one balance game per person) for 10 min each. Of the two aerobic games, one was exercise themed (jogging) and the other was game themed (hula hooping). Both balance games were game themed. Energy expenditure and enjoyment were measured. Results After adjustment for gender and weight, aerobic games produced 2.70 kcal·kg⁻¹·h⁻¹ (95% confidence interval 2.41, 3.00) greater energy expenditure than balance games (p < .001), but balance games were more enjoyable (p < .001). In aerobic games, jogging produced greater energy expenditure than hula hooping in normal-weight and male participants (p < .001); in overweight and female participants, no differences were found (p > .17). Hula hooping was enjoyed more than jogging (p = .008). Enjoyment predicted energy expenditure in aerobic games (B = 0.767, p = .010). Conclusions Aerobic games produced greater energy expenditure but lower enjoyment than balance games, and a game-themed aerobic game was found more enjoyable than an exercise-themed aerobic game. Integrating more strenuous activity into entertaining games instead of games that simply simulate exercise may be a fruitful avenue for active game development. PMID:22920810
The Uses of Teaching Games in Game Theory Classes and Some Experimental Games.
ERIC Educational Resources Information Center
Shubik, Martin
2002-01-01
Discusses the use of lightly controlled games, primarily in classes in game theory. Considers the value of such games from the viewpoint of both teaching and experimentation and discusses context; control; pros and cons of games in teaching; experimental games; and games in class, including cooperative game theory. (Author/LRW)
NASA Astrophysics Data System (ADS)
Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.
2018-07-01
Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single-model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from combinations of these NWP models. A regression calibration was employed to bias-correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single-model EC forecasts. The regression process greatly improved ET0 forecast performance, particularly for regions with stations near the coast or with complex orography. The performance of the EC forecasts was only slightly influenced by the number of ensemble members, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on ET0 forecast performance.
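Regression calibration of the kind described fits, per station and lead time, a map from raw forecasts to observations over a training period and applies it afterward. A minimal sketch using simple least squares on the ensemble mean (one plausible form; the study's exact regression may differ):

```python
import numpy as np

def calibrate(train_fcst, train_obs):
    """Fit obs ~= a + b * forecast by least squares; return the correction."""
    b, a = np.polyfit(train_fcst, train_obs, deg=1)   # slope, intercept
    return lambda f: a + b * np.asarray(f)

rng = np.random.default_rng(5)
truth = rng.gamma(5.0, 1.0, 300)                      # "observed" ET0, mm/day
raw = 0.8 * truth + 1.2 + rng.normal(0, 0.4, 300)     # biased raw ensemble mean
correct = calibrate(raw[:200], truth[:200])           # train on first 200 days
cal = correct(raw[200:])                              # apply to the rest
print("raw RMSE:", np.sqrt(np.mean((raw[200:] - truth[200:]) ** 2)))
print("cal RMSE:", np.sqrt(np.mean((cal - truth[200:]) ** 2)))
```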
NASA Astrophysics Data System (ADS)
Huang, Qian; Wang, Tijian; Chen, Pulong; Huang, Xiaoxian; Zhu, Jialei; Zhuang, Bingliang
2017-11-01
As the host city of the 2nd Youth Olympic Games (YOG), Nanjing is highly industrialized and urbanized, and faces several air pollution issues. In order to ensure better air quality during the event, the local government took great efforts to control emissions from pollutant sources. However, air quality can still be affected by synoptic weather, not only emissions. In this paper, the influences of meteorological factors and emission reductions were investigated using observational data and numerical simulations with WRF-CMAQ (Weather Research and Forecasting - Community Multiscale Air Quality). During the month in which the YOG were held (August 2014), the observed hourly mean concentrations of SO2, NO2, PM10, PM2.5, CO and O3 were 11.6 µg m⁻³, 34.0 µg m⁻³, 57.8 µg m⁻³, 39.4 µg m⁻³, 0.9 mg m⁻³ and 38.8 µg m⁻³, respectively, which were below the China National Ambient Air Quality Standard (level 2). However, model simulation showed that the weather conditions, such as weaker winds during the YOG, were adverse to air quality and could increase SO2, NO2, PM10, PM2.5, CO and O3 by 17.5, 16.9, 18.5, 18.8, 7.8 and 0.8 %. Taking into account local emission abatement only, the simulated SO2, NO2, PM10, PM2.5 and CO decreased by 24.6, 12.1, 15.1, 8.1 and 7.2 %. Consequently, stringent emission control measures can reduce the concentrations of air pollutants in the short term, and emission reduction is very important for air quality improvement during the YOG. A good example has been set for air quality protection for important social events.
Uses and Applications of Climate Forecasts for Power Utilities.
NASA Astrophysics Data System (ADS)
Changnon, Stanley A.; Changnon, Joyce M.; Changnon, David
1995-05-01
The uses and potential applications of climate forecasts for electric and gas utilities were assessed 1) to discern needs for improving climate forecasts and guiding future research, and 2) to assist utilities in making wise use of forecasts. In-depth structured interviews were conducted with 56 decision makers in six utilities to assess existing and potential uses of climate forecasts. Only 3 of the 56 use forecasts. Eighty percent of those sampled envisioned applications of climate forecasts, given certain changes and additional information. Primary applications exist in power trading, load forecasting, fuel acquisition, and systems planning, with slight differences in interests between utilities. Utility staff understand probability-based forecasts but desire climatological information related to forecasted outcomes, including analogs similar to the forecasts, and explanations of the forecasts. Desired lead times vary from a week to three months, along with forecasts of up to four seasons ahead. The new NOAA forecasts initiated in 1995 provide the lead times and longer-term forecasts desired. Major hindrances to use of forecasts are hard-to-understand formats, lack of corporate acceptance, and lack of access to expertise. Recent changes in government regulations altered the utility industry, leading to a more competitive world wherein information about future weather conditions assumes much more value. Outreach efforts by government forecast agencies appear valuable to help achieve the appropriate and enhanced use of climate forecasts by the utility industry. An opportunity for service exists also for the private weather sector.
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.
2017-08-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
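Forecast post-processing by bias correction, one common technique of the kind discussed, removes a lead-time-dependent mean bias estimated from hindcasts. A minimal sketch with synthetic numbers, not the SIO models' data:

```python
import numpy as np

def bias_correct(fcst, hind_fcst, hind_obs):
    """Subtract the lead-dependent mean hindcast bias from a new forecast.
    hind_fcst/hind_obs shaped (n_years, n_leads); fcst shaped (n_leads,)."""
    bias = (hind_fcst - hind_obs).mean(axis=0)   # one value per lead time
    return fcst - bias

rng = np.random.default_rng(6)
leads = 6                                                  # monthly leads
hind_obs = rng.normal(6.0, 0.5, (20, leads))               # "observed" extent
hind_fcst = hind_obs + 0.8 + rng.normal(0, 0.3, (20, leads))   # biased model
new_fcst = rng.normal(6.8, 0.5, leads)
print(bias_correct(new_fcst, hind_fcst, hind_obs))
```

Removing the systematic offset per model is exactly what shrinks the model-uncertainty term described above, leaving irreducible error growth as the dominant contributor.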
The prevalence of problematic video gamers in the Netherlands.
Haagsma, Maria C; Pieterse, Marcel E; Peters, Oscar
2012-03-01
This study surveyed Dutch adolescents and adults about their video gaming behavior to assess the prevalence of problematic gaming. A representative national panel of 902 respondents aged 14 to 81 took part in the study. The results show that gaming in general is a widespread and popular activity among the Dutch population. Browser games (small games played via the internet) and offline casual games (e.g., offline card games) were reported as the most popular types of game. Online games (e.g., massively multiplayer online role-playing games) are played by a relatively small share of the respondents, yet considerably more time is spent on these online games than on browser games, offline casual games, and offline games (e.g., offline racing games). The prevalence of problematic gaming in the total sample is 1.3 percent. Among adolescents and young adults, problematic gaming occurs in 3.3 percent of cases. Male adolescents in particular seem to be more vulnerable to developing problematic gaming habits.
Continuous punishment and the potential of gentle rule enforcement.
Erev, Ido; Ingram, Paul; Raz, Ornit; Shany, Dror
2010-05-01
The paper explores the conditions that determine the effect of rule enforcement policies that attempt to punish all visible violations of the rule. We start with a simple game-theoretic analysis that highlights the value of gentle COntinuous Punishment (gentle COP) policies. If the subjects of the rule are rational, gentle COP can eliminate violations even when the rule enforcer has limited resources. The second part of the paper uses simulations to examine the robustness of gentle COP policies to likely deviations from rationality. The results suggest that when the probability of detecting violations is sufficiently high, gentle COP policies can be effective even when the subjects of the rule are boundedly rational adaptive learners. The paper concludes with experimental studies that clarify the value of gentle COP policies in the lab and in an attempt to eliminate cheating in exams. Copyright (c) 2009 Elsevier B.V. All rights reserved.
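The flavor of such simulations can be conveyed with a toy model of boundedly rational adaptive learners: agents repeat whichever action has paid off best so far, and a violation yields a small gain but incurs a mild fine whenever detected. All payoffs and probabilities below are invented for illustration and are not the paper's parameterization.

```python
import random

def simulate(n_agents=100, n_rounds=500, p_detect=0.9, fine=1.5, gain=1.0):
    """Adaptive learners: each round an agent mostly repeats the action with
    the higher running-average payoff (with a little exploration). Violating
    pays `gain` but costs `fine` whenever detected; complying pays 0."""
    avg = [{"comply": 0.0, "violate": 0.1} for _ in range(n_agents)]
    cnt = [{"comply": 1, "violate": 1} for _ in range(n_agents)]
    violations = 0
    for _ in range(n_rounds):
        for a in range(n_agents):
            if random.random() < 0.05:                        # exploration
                act = random.choice(["comply", "violate"])
            else:
                act = max(avg[a], key=avg[a].get)
            pay = 0.0
            if act == "violate":
                violations += 1
                pay = gain - (fine if random.random() < p_detect else 0.0)
            cnt[a][act] += 1
            avg[a][act] += (pay - avg[a][act]) / cnt[a][act]  # running mean
    return violations / (n_agents * n_rounds)

print("violation rate, gentle COP :", simulate(p_detect=0.9, fine=1.5))
print("violation rate, rare harsh :", simulate(p_detect=0.1, fine=15.0))
```

With a high detection probability and a mild fine, learners quickly experience the fine and converge to compliance; with rare but harsh punishment they seldom experience it and violate more often, consistent with the argument above.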
An overview of health forecasting.
Soyiri, Ireneous N; Reidpath, Daniel D
2013-01-01
Health forecasting is a novel area of forecasting and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies by pre-informing health service providers so that they can take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and various methods have been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choice of health forecasting methods and approaches that are often applied. Nor have the key principles of health forecasting been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including definitions, principles of health forecasting, and the properties of health data that influence the choice of health forecasting methods. Other matters related to the value of health forecasting and the general challenges associated with developing and using health forecasting services are discussed. This overview is a stimulus for further discussion on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.
Dutch children and parents' views on active and non-active video gaming.
De Vet, Emely; Simons, Monique; Wesselman, Maarten
2014-06-01
Active video games that require whole body movement to play the game may be an innovative health promotion tool to substitute sedentary pastime with more active time and may therefore contribute to children's health. To inform strategies aimed at reducing sedentary behavior by replacing non-active by active gaming, opinions about active and non-active video games are explored among 8- to 12-year-old children and their parents. Six qualitative, semi-structured focus groups were held with 8- to 12-year-old children (n = 46) and four with their parents (n = 19) at three different primary schools in The Netherlands. The focus groups with children discussed game preferences, gaming context and perceived game-related parenting. The focus groups with parents addressed considerations in purchasing video games, perceived positive and negative consequences of gaming, and game-related parenting. Both children and their parents were very positive about active video games and preferred active games over non-active games. Active video games were considered more social than non-active video games, and active games were played more often together with friends and family than non-active video games. Parenting practices did not differ for active and non-active video games, although some parents were less strict regarding active games. Two conditions for practical implementation were met: children enjoyed active video games, and parents were willing to buy active video games. Active video games were preferred to non-active video games, illustrating that using active video games is a promising health promotion tool to reduce sedentary pastime in youth.
Männikkö, Niko; Ruotsalainen, Heidi; Demetrovics, Zsolt; Lopez-Fernandez, Olatz; Myllymäki, Laura; Miettunen, Jouko; Kääriäinen, Maria
2017-09-14
Multiplatform digital media use and gaming have increased in recent years. The aim of this study was to examine associations of sociodemographics and digital gaming behavior characteristics (i.e., gaming time, medium, and genres) with problematic gaming behavior in adolescents. A convenience sample of Finnish junior high school students (n = 560; mean age 14 years, ranging from 12 to 16 years) participated in the cross-sectional survey, of whom 83% (n = 465) reported having played digital games regularly. Sociodemographic data, different forms of digital media use, gaming behavior characteristics and problematic gaming behavior were assessed. Study participants spent on average one hour per day playing digital games; casual games (23.9%), shooting games (19.8%), and sport games (12.9%) were the most popular games among participants. Regression analysis showed that a blended family structure and gaming time were positively related to problematic gaming behavior. Preferences for game genres such as solo, Massively Multiplayer Online Role-Playing and strategy-management games were also positively associated with problematic use of digital games. These findings provide knowledge that can be utilized in the prevention of the possible negative consequences of digital gaming.
NASA Astrophysics Data System (ADS)
Rawicz, Paul Lawrence
In this thesis, the similarities between the structures of the H-infinity, H2, and Kalman filters are examined. The filters used in this examination have been derived through duality to the full information controller. In addition, a direct variation-of-parameters derivation of the H-infinity filter is presented for both continuous and discrete time (scalar case). Direct and controller-dual derivations using differential games exist in the literature and also employ variational techniques. Using a variational, rather than a differential games, viewpoint has resulted in a simple relationship between the Riccati equations that arise from the derivation and the results of the Bounded Real Lemma. This same relation has previously been found in the literature and used to relate the Riccati inequality for linear systems to the Hamilton-Jacobi inequality for nonlinear systems when implementing the H-infinity controller. The H-infinity, H2, and Kalman filters are applied to the two-state target tracking problem. In continuous time, closed-form analytic expressions for the trackers and their performance are determined. To evaluate the trackers using a neutral, realistic criterion, the probability of target escape is developed, that is, the probability that the target position error will be such that the target is outside the radar beam width, resulting in a loss of measurement. In discrete time, a numerical example using the probability of target escape is presented to illustrate the differences in tracker performance.
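For the two-state tracking problem, the Kalman branch of such a comparison and the escape criterion can be sketched directly: a constant-velocity filter tracks position and velocity from position-only measurements, and under a Gaussian error assumption the probability that the position error exceeds the half beamwidth follows from the filter's error variance. Noise levels and the beamwidth below are illustrative.

```python
import math
import numpy as np

dt, q, r = 0.1, 1.0, 0.5 ** 2           # step, process noise, meas. variance
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
H = np.array([[1.0, 0.0]])              # position-only measurement
Q = q * np.array([[dt ** 3 / 3, dt ** 2 / 2], [dt ** 2 / 2, dt]])

x, P = np.zeros(2), np.eye(2)           # state estimate and covariance
truth = np.array([0.0, 2.0])
rng = np.random.default_rng(7)
for _ in range(200):
    truth = F @ truth + rng.multivariate_normal([0, 0], Q)   # target motion
    z = H @ truth + rng.normal(0, 0.5)                       # radar measurement
    x, P = F @ x, F @ P @ F.T + Q                            # predict
    S = H @ P @ H.T + r
    K = P @ H.T / S                                          # Kalman gain
    x = x + (K * (z - H @ x)).ravel()                        # update state
    P = (np.eye(2) - K @ H) @ P                              # update covariance

# P(escape) = P(|position error| > half beamwidth), Gaussian assumption.
beam_half_width = 2.0
sigma = math.sqrt(P[0, 0])
phi = 0.5 * (1.0 + math.erf(beam_half_width / (sigma * math.sqrt(2.0))))
print("position sigma:", sigma, " P(escape):", 2.0 * (1.0 - phi))
```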
Marchetti, Antonella; Baglio, Francesca; Castelli, Ilaria; Griffanti, Ludovica; Nemni, Raffaello; Rossetto, Federica; Valle, Annalisa; Zanette, Michela; Massaro, Davide
2018-01-01
During adolescence and early adulthood, individuals deal with important developmental changes, especially in the context of complex social interactions. Previous studies have demonstrated that these changes have a significant impact on the social decision-making process, in terms of a progressive increase in the comprehension of others' intentionality, in sensitivity to fairness, and in impermeability to decisional biases. However, neither adolescents nor adults reach the ideal level of maximization and rationality of the homo economicus proposed by classical economic theory, thus remaining closer to the model of "bounded rationality" proposed by cognitive psychology. In the present study, we analyzed two aspects of decision making in 110 participants from early adolescence to young adulthood: sensitivity to fairness and permeability to decisional biases (Outcome Bias and Hindsight Bias). To address these questions, we adopted a modified version of the Ultimatum Game task, where participants faced fair, unfair, and hyperfair offers from proposers described as generous, selfish, or neutral. We also administered two behavioral tasks testing the influence of the Outcome Bias and the Hindsight Bias on the evaluation of the decision. Our behavioral results highlighted that the participants are still partially consequentialist, as the decisional process is influenced by a complex balance between the outcome and the psychological description of the proposer. As regards cognitive biases, the Outcome Bias and the Hindsight Bias are present in the whole sample, with no relevant age differences.
Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States.
Yamana, Teresa K; Kandula, Sasikiran; Shaman, Jeffrey
2017-11-01
Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time.
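The weighted combination can be sketched as follows. Full Bayesian model averaging weights each method by its posterior probability; the stand-in below scores each method by a Gaussian likelihood of its retrospective errors and normalizes, which captures the mechanics (all numbers invented):

```python
import numpy as np

def bma_weights(past_preds, past_obs, sigma=1.0):
    """past_preds: (n_methods, n_cases). Weight each method by the Gaussian
    likelihood of its past errors, normalized to sum to one."""
    err = past_preds - past_obs                 # broadcast over cases
    loglik = -0.5 * (err ** 2).mean(axis=1) / sigma ** 2
    w = np.exp(loglik - loglik.max())           # stabilize before normalizing
    return w / w.sum()

def superensemble(new_preds, weights):
    """Weighted-average forecast from the methods' new predictions."""
    return weights @ new_preds

rng = np.random.default_rng(8)
obs = rng.normal(20, 3, 50)                             # e.g., peak-week index
preds = obs + rng.normal(0, [[1], [2], [4]], (3, 50))   # 3 methods, varied skill
w = bma_weights(preds, obs, sigma=2.0)
print("weights:", w)
print("combined forecast:", superensemble(np.array([21.0, 19.0, 24.0]), w))
```

Stratifying the weights by forecast characteristics or location, as the abstract reports, amounts to estimating a separate weight vector per stratum from the matching subset of retrospective cases.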
NASA Astrophysics Data System (ADS)
Regonda, Satish Kumar; Seo, Dong-Jun; Lawrence, Bill; Brown, James D.; Demargne, Julie
2013-08-01
We present a statistical procedure for generating short-term ensemble streamflow forecasts from single-valued, or deterministic, streamflow forecasts produced operationally by the U.S. National Weather Service (NWS) River Forecast Centers (RFCs). The resulting ensemble streamflow forecast provides an estimate of the predictive uncertainty associated with the single-valued forecast to support risk-based decision making by the forecasters and by the users of the forecast products, such as emergency managers. Forced by single-valued quantitative precipitation and temperature forecasts (QPF, QTF), the single-valued streamflow forecasts are produced at a 6-h time step nominally out to 5 days into the future. The single-valued streamflow forecasts reflect various run-time modifications, or "manual data assimilation", applied by the human forecasters in an attempt to reduce error from various sources in the end-to-end forecast process. The proposed procedure generates ensemble traces of streamflow from a parsimonious approximation of the conditional multivariate probability distribution of future streamflow given the single-valued streamflow forecast, QPF, and the most recent streamflow observation. For parameter estimation and evaluation, we used a multiyear archive of the single-valued river stage forecast produced operationally by the NWS Arkansas-Red River Basin River Forecast Center (ABRFC) in Tulsa, Oklahoma. As a by-product of parameter estimation, the procedure provides a categorical assessment of the effective lead time of the operational hydrologic forecasts for different QPF and forecast flow conditions. To evaluate the procedure, we carried out hindcasting experiments in dependent and cross-validation modes. The results indicate that the short-term streamflow ensemble hindcasts generated from the procedure are generally reliable within the effective lead time of the single-valued forecasts and well capture the skill of the single-valued forecasts. For smaller basins, however, the effective lead time is significantly reduced by short basin memory and reduced skill in the single-valued QPF.
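The core of such a procedure is sampling future flows from their conditional distribution given the single-valued forecast and the recent observation. In a Gaussian (e.g., normal-quantile-transformed) setting that conditional is closed-form; the sketch below conditions an invented joint normal on two predictors:

```python
import numpy as np

def conditional_sample(mu, Sigma, n_future, x_pred, n_traces, rng):
    """Partition a joint normal as [future flows; predictors], condition on
    the observed predictors x_pred, and sample ensemble traces:
      mu_c = mu1 + S12 @ inv(S22) @ (x_pred - mu2)
      S_c  = S11 - S12 @ inv(S22) @ S21
    """
    mu1, mu2 = mu[:n_future], mu[n_future:]
    S11 = Sigma[:n_future, :n_future]
    S12 = Sigma[:n_future, n_future:]
    S22 = Sigma[n_future:, n_future:]
    gain = S12 @ np.linalg.inv(S22)
    mu_c = mu1 + gain @ (x_pred - mu2)
    S_c = S11 - gain @ S12.T
    return rng.multivariate_normal(mu_c, S_c, size=n_traces)

rng = np.random.default_rng(9)
# 4 future 6-h flows + 2 predictors (single-valued forecast, last observation).
A = rng.normal(0, 1, (6, 6))
Sigma = A @ A.T + 6 * np.eye(6)           # illustrative joint covariance
mu = np.full(6, 100.0)
traces = conditional_sample(mu, Sigma, 4, np.array([120.0, 95.0]), 50, rng)
print(traces.shape)                        # (50, 4): 50 traces, 4 time steps
```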
Donati, Maria Anna; Chiesi, Francesca; Ammannato, Giulio; Primi, Caterina
2015-02-01
This study tested the predictive power of gaming versatility (i.e., the number of video game genres engaged in) on game addiction in male adolescents, controlling for time spent on gaming. Participants were 701 male adolescents attending high school (mean age = 15.6 years). Analyses showed that pathological gaming was predicted not only by more time spent on gaming, but also by participation in a greater number of video game genres. Specifically, the wider the array of video game genres played, the greater the negative consequences caused by gaming. Findings show that versatility can be considered one of the behavioral risk factors for gaming addiction, which may be characterized by a composite and diversified experience with video games. This study suggests that educational efforts designed to prevent gaming addiction among youth may also focus on adolescents' engagement with different video games.
A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions
NASA Astrophysics Data System (ADS)
Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.
2017-12-01
The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in its decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide large quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. The tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
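Since the tool is built on Bokeh, the heart of such a page is only a few calls. A minimal sketch of an ensemble-trace plot with the ensemble mean highlighted, using synthetic data and a hypothetical output file name:

```python
import numpy as np
from bokeh.plotting import figure, output_file, save

rng = np.random.default_rng(10)
days = np.arange(14)
ens = 50 + np.cumsum(rng.normal(0, 3, (30, 14)), axis=1)  # 30 synthetic traces

output_file("ensemble_forecast.html")       # hypothetical output page
p = figure(title="Ensemble streamflow forecast",
           x_axis_label="lead time (days)", y_axis_label="flow")
for trace in ens:                            # individual members, drawn faint
    p.line(days, trace, line_alpha=0.2)
p.line(days, ens.mean(axis=0), line_width=3, legend_label="ensemble mean")
save(p)                                      # writes a standalone HTML page
```

Standalone HTML pages with built-in pan/zoom/hover interaction are exactly what the abstract describes delivering to operators, with forecast-skill and forcing statistics added alongside.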
NASA Astrophysics Data System (ADS)
Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki
2015-04-01
Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity, including secondary or higher-order aftershocks, and can be employed for such forecasting. However, because accurate parameter estimation cannot always be expected from incomplete early aftershock data in which many events are missing, forecasting from a single estimated parameter set (plug-in forecasting) can frequently perform poorly. We therefore propose Bayesian forecasting, which combines the forecasts of the ETAS model over the range of parameter sets that are probable given the data. Forecasting tests of one-month aftershock activity based on data from the first day after the main shock, as an example of early intermediate-term forecasting, show that Bayesian forecasting performs better than plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. The NP forecast performs better than the G-R formula in some cases but worse in others, so robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. The proposed method is useful for a stable, unbiased intermediate-term assessment of aftershock probabilities.
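The ETAS conditional intensity underlying such forecasts is standard: lambda(t) = mu + sum over past events of K * exp(alpha * (M_i - M_c)) * (t - t_i + c)^(-p). The sketch below evaluates it and integrates it over a forecast window; the parameter values and catalog are invented, and a real application would estimate the parameters or, as proposed above, average forecasts over their posterior.

```python
import numpy as np

def etas_intensity(t, times, mags, mu, K, alpha, c, p, m_c):
    """ETAS conditional intensity at time t (days), given past events."""
    past = times < t
    dt = t - times[past]
    return mu + np.sum(K * np.exp(alpha * (mags[past] - m_c)) * (dt + c) ** -p)

def expected_count(t0, t1, times, mags, params, n_grid=2000):
    """Expected number of events in [t0, t1], by averaging the intensity on a
    grid (ignores triggering by events inside the window itself)."""
    grid = np.linspace(t0, t1, n_grid)
    lam = [etas_intensity(t, times, mags, **params) for t in grid]
    return (t1 - t0) * np.mean(lam)

# First-day catalog after a main shock (synthetic) and assumed parameters.
params = dict(mu=0.1, K=0.05, alpha=1.8, c=0.01, p=1.1, m_c=3.0)
times = np.sort(np.random.default_rng(11).uniform(0, 1, 80))    # event days
mags = 3.0 + np.random.default_rng(12).exponential(0.4, 80)     # magnitudes
print("expected events, days 1-30:", expected_count(1.0, 30.0, times, mags, params))
```

A Bayesian forecast in this spirit would repeat the count calculation for many parameter sets sampled from the posterior given the first-day data and average the resulting probabilities.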
Healthy Gaming - Video Game Design to promote Health.
Brox, E; Fernandez-Luque, L; Tøllefsen, T
2011-01-01
There is an increasing interest in health games, including simulation tools, games for specific conditions, persuasive games to promote a healthy lifestyle, and exergames where physical exercise is used to control the game. The objective of the article is to review current literature about available health games and the impact related to game design principles, as well as some aspects of educational theory. Literature from the major databases and known games-for-health sites has been searched to find articles about games for health purposes. The focus has been on educational games, persuasive games, and exergames, as well as articles describing game design principles. The medical objectives can either be a part of the game theme (intrinsic) or be entirely separate from it (extrinsic), and persuasive games in particular seem to use extrinsic game design. Peer support is important, but there is only limited research on multiplayer health games. Evaluation of health games can be both medical and technical, and the focus will depend on the game purpose. There is still not enough evidence to conclude which design principles work for which purposes, since most of the literature on serious health games does not specify design methodologies, but it seems that extrinsic methods work in persuasion. However, when designing health care games it is important to define both the target group and the main objective, and then design a game accordingly using sound game design principles, while also utilizing design elements to enhance learning and persuasion. Collaboration with health professionals from an early design stage is necessary, both to ensure that the content is valid and to have the game validated from a clinical viewpoint. Patients need to be involved, especially to improve usability. More research should be done on social aspects in health games, both related to learning and persuasion.
Paik, Soo-Hyun; Cho, Hyun; Chun, Ji-Won; Jeong, Jo-Eun; Kim, Dai-Jin
2017-12-05
Gaming behaviors have been significantly influenced by smartphones. This study was designed to explore gaming behaviors and clinical characteristics across different gaming device usage patterns and the role of those patterns in Internet gaming disorder (IGD). Respondents to an online survey regarding smartphone and online game usage were classified by gaming device usage pattern: (1) individuals who played only computer games; (2) individuals who played computer games more than smartphone games; (3) individuals who played computer and smartphone games evenly; (4) individuals who played smartphone games more than computer games; (5) individuals who played only smartphone games. Data on demographics, gaming-related behaviors, and scales for Internet and smartphone addiction, depression, anxiety disorder, and substance use were collected. Combined users, especially those who played computer and smartphone games evenly, had a higher prevalence of IGD, depression, anxiety disorder, and substance use disorder. These subjects were more prone to develop IGD than the reference group of computer-only gamers (B = 0.457, odds ratio = 1.579). Smartphone-only gamers had the lowest prevalence of IGD, spent the least time and money on gaming, and showed the lowest scores on the Internet and smartphone addiction scales. Our findings suggest that gaming device usage patterns may be associated with the occurrence, course, and prognosis of IGD.
Quantifying model uncertainty in seasonal Arctic sea-ice forecasts
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin
2017-04-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
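The split between model uncertainty and internal (irreducible) error that the experiment targets amounts to a variance decomposition: the spread of per-model ensemble means measures model uncertainty, while the mean within-model spread measures internal variability. A sketch with synthetic numbers (not SIO output):

```python
# Decompose multi-model ensemble spread into between-model ("model
# uncertainty") and within-model ("internal variability") components.
import numpy as np

rng = np.random.default_rng(8)
n_models, n_members = 8, 10
model_bias = rng.normal(0, 0.6, n_models)              # differing model physics
forecasts = model_bias[:, None] + rng.normal(0, 0.3, (n_models, n_members))

model_unc = forecasts.mean(axis=1).var(ddof=1)         # variance of model means
internal = forecasts.var(axis=1, ddof=1).mean()        # mean within-model variance
print(f"model uncertainty: {model_unc:.2f}; internal variability: {internal:.2f}")
```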
Model-free aftershock forecasts constructed from similar sequences in the past
NASA Astrophysics Data System (ADS)
van der Elst, N.; Page, M. T.
2017-12-01
The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences' outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
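One plausible reading of the similarity weighting can be sketched as follows: weight each past sequence by the Poisson probability of its early count under an intensity set to the target's early count, then form a weighted empirical distribution of the past one-month outcomes. The metric, counts, and catalogs below are illustrative assumptions rather than the authors' exact implementation.

```python
# Similarity-weighted, non-parametric aftershock forecast (sketch).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
past_early = rng.poisson(rng.gamma(2.0, 10.0, 500))   # day-1 counts, 500 past sequences
past_month = rng.poisson(10 * past_early + 1)         # crude month-1 outcomes

n_target = 18                                         # day-1 count of target sequence
w = poisson.pmf(past_early, mu=n_target)              # similarity weights
w /= w.sum()

order = np.argsort(past_month)                        # weighted empirical distribution
cdf = np.cumsum(w[order])
median = past_month[order][np.searchsorted(cdf, 0.5)]
p975 = past_month[order][np.searchsorted(cdf, 0.975)]
print(f"median forecast: {median} events; 97.5th percentile: {p975}")
```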
Energy cost and game flow of 5 exer-games in trained players.
Bronner, Shaw; Pinsker, Russell; Noah, J Adam
2013-05-01
To determine energy expenditure and player experience in exer-games designed for novel platforms, the energy cost for 7 trained players was measured in 5 music-based exer-games. Participants answered a questionnaire about "game flow," experience of enjoyment, and immersion in game play. Energy expenditure during game play ranged from moderate to vigorous intensity (4-9 MET). Participants achieved the highest MET levels and game flow while playing StepMania, and the lowest MET levels and game flow when playing Wii Just Dance 3(®) and Kinect Dance Central™. Game flow scores correlated positively with MET levels. Physiological measurement and game flow testing during game development may help to optimize exer-game player activity and experience.
A study for systematic errors of the GLA forecast model in tropical regions
NASA Technical Reports Server (NTRS)
Chen, Tsing-Chang; Baker, Wayman E.; Pfaendtner, James; Corrigan, Martin
1988-01-01
Sensitivity studies performed with the Goddard Laboratory for Atmospheres (GLA) analysis/forecast system revealed that forecast errors in the tropics can, in some cases, affect the ability to forecast midlatitude weather. Apparently, forecast errors occurring in the tropics can propagate to midlatitudes. Systematic error analysis of the GLA forecast system is therefore a necessary step in improving the model's forecast performance. The major effort of this study is to examine the possible impact of the hydrological-cycle forecast error on dynamical fields in the GLA forecast system.
Simons, Monique; de Vet, Emely; Chinapaw, Mai Jm; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes
2014-04-04
Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games-active games-seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex, and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; P<.001), a less positive attitude toward non-active games (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008), having brothers/sisters (OR 6.7, CI 2.6-17.1; P<.001) and friends (OR 3.4, CI 1.4-8.4; P=.009) who spend more time on active gaming, and a slightly lower score on game engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P<.001), having friends who spend more time on non-active gaming (OR 3.3, CI 1.46-7.53; P=.004), and a more positive image of a non-active gamer (OR 2, CI 1.07-3.75; P=.03). Various factors were significantly associated with active gaming ≥1 h/wk and non-active gaming >7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming, and attitude toward non-active gaming (positive associations). Habit strength was a correlate of both active and non-active gaming, indicating that both types of gaming are habitual behaviors. Although these results should be interpreted with caution because of the limitations of the study, they do provide preliminary insights into potential correlates of active and non-active gaming that can be used for further research, as well as preliminary direction for the development of effective intervention strategies for replacing non-active gaming by active gaming among adolescents.
Game injuries in relation to game schedules in the National Basketball Association.
Teramoto, Masaru; Cross, Chad L; Cushman, Daniel M; Maak, Travis G; Petron, David J; Willick, Stuart E
2017-03-01
Injury management is critical in the National Basketball Association (NBA), as players experience a wide variety of injuries. Recently, it has been suggested that game schedules, such as back-to-back games and four games in five days, increase the risk of injuries in the NBA. The aim of this study was to examine the association between game schedules and player injuries in the NBA. Descriptive epidemiology study. The present study analyzed game injuries and game schedules in the 2012-13 through 2014-15 regular seasons. Game injuries by game schedules and players' profiles were examined using an exact binomial test, Fisher's exact test, and the Mann-Whitney-Wilcoxon test. A Poisson regression analysis was performed to predict the number of game injuries sustained by each player from game schedules and injured players' profiles. There were a total of 681 cases of game injuries sustained by 280 different players during the three years (total N=1443 players). Playing back-to-back games or playing four games in five days alone was not associated with an increased rate of game injuries, whereas a significant positive association was found between game injuries and playing away from home (p<0.05). Playing back-to-back games and away games were significant predictors of frequent game injuries (p<0.05). Game schedules could be one factor that impacts the risk of game injuries in the NBA. The findings could be useful for designing optimal game schedules in the NBA as well as helping NBA teams make adjustments to minimize game injuries.
Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.
1981-03-01
Keywords: management audit; econometric revenue forecast; gap and impact analysis; deterministic expenditure forecast; municipal forecasting; municipal budget. The report presents a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast together with forecasts based on expert judgment and trend analysis.
Convective Weather Forecast Accuracy Analysis at Center and Sector Levels
NASA Technical Reports Server (NTRS)
Wang, Yao; Sridhar, Banavar
2010-01-01
This paper presents a detailed convective forecast accuracy analysis at center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in the weather translation capacity models. In general, the vast majority of forecast verification efforts over past decades have involved calculating traditional standard verification scores on forecast and observation data analyzed onto grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We first calculate the standard forecast verification scores for en-route air traffic centers and sectors, and then conduct the forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage from multiple sector forecasts is then developed. The severe weather intensity assessment was carried out using the correlations between forecast and actual observed weather airspace coverage. The forecast accuracy in horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and the prediction. The analysis used observed and forecast Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities. The forecast accuracy analysis for lead times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, the 1-hour forecast intensity and horizontal location errors for the ZDC center were about 0.12 and 0.13. However, the correlation between the sector 1-hour forecast and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation in observed weather intensity was unexplained by the forecast, and the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage by using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between actual observations and the multiple-sector-forecast predictions improved by several percent, at the 95% confidence level, in comparison with single sector forecasts.
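The standard verification measures referred to above come from the 2x2 contingency table of binary forecast/observation pairs. A self-contained sketch on synthetic grids (the event frequency and flip rate are invented):

```python
# Probability of detection (POD), false alarm ratio (FAR), and critical
# success index (CSI) from a 2x2 contingency table on synthetic masks.
import numpy as np

rng = np.random.default_rng(7)
obs = rng.random((100, 100)) < 0.15            # observed convection mask
fcst = obs ^ (rng.random((100, 100)) < 0.10)   # forecast = observation with 10% flips

hits = np.sum(fcst & obs)
misses = np.sum(~fcst & obs)
false_alarms = np.sum(fcst & ~obs)

pod = hits / (hits + misses)
far = false_alarms / (hits + false_alarms)
csi = hits / (hits + misses + false_alarms)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
```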
Kazakova, Snezhanka; Cauberghe, Veroline; Pandelaere, Mario; De Pelsmacker, Patrick
2014-01-01
The current study explores how competition and gaming expertise affect the satisfaction of competence needs and gaming gratifications. We demonstrate that competition moderates the effect of gaming expertise on the satisfaction of competence needs, which in turn affects game enjoyment and replay intention. Gaming expertise predicted players' need satisfaction, game enjoyment, and replay intention significantly better in a competitive compared to a noncompetitive context. The effect of gaming expertise on game enjoyment and replay intention was, furthermore, mediated by the satisfaction of competence needs. Finally, gaming expertise positively affected the importance of competition for players' self-esteem only in the competitive gaming context. The present findings demonstrate the importance of competition and gaming expertise for the satisfaction of competence needs, gaming gratifications, and the pursuit of self-esteem during gameplay, attesting to the applicability of self-determination theory to gaming contexts.
Institutional games played by confined juveniles.
Bartollas, C; Sieverdes, C M
1983-01-01
This study examined the games played by 561 juvenile offenders confined in six coeducational correctional facilities in one state. The types of games these residents used against staff and peers within the confines of the institution varied considerably. The study documented nineteen games used by males and females, twelve to deal with staff and seven to deal with peers. The games were defined as therapeutic games, material games, psychological games, and physical games. Peer-oriented games included attention-seeking activities and a variety of dominance games. Additionally, these games were described and tabulated according to the sex and race of the residents. The conclusion was that game-playing behavior was no less frequent in coeducational institutions than it was in single-sex institutions.
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications, however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
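The Schaake Shuffle step of RPP-S can be sketched compactly: calibrated daily ensemble members, generated independently at each lead time, are reordered so that across lead times they follow the rank structure of an equally sized set of historical trajectories. The data below are synthetic; only the reordering logic is the point.

```python
# Minimal Schaake Shuffle: link daily ensemble members across lead times
# using the rank structure of historical trajectories.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_days = 20, 10
calibrated = rng.gamma(2.0, 3.0, (n_members, n_days))  # independently calibrated days
historical = rng.gamma(2.0, 3.0, (n_members, n_days))  # historical trajectories

shuffled = np.empty_like(calibrated)
for d in range(n_days):
    ranks = historical[:, d].argsort().argsort()       # rank of each trajectory on day d
    shuffled[:, d] = np.sort(calibrated[:, d])[ranks]  # member i gets the matching rank

# Each row of `shuffled` now follows a historical day-to-day rank pattern,
# restoring realistic temporal correlation across lead times.
```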
Bayesian analyses of seasonal runoff forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.; Reese, S.
1991-12-01
Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
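A Bayesian Processor of Forecasts has a particularly transparent form under Gaussian assumptions: a climatological prior on seasonal runoff is updated with a forecast whose conditional distribution given the actual runoff is linear-normal. The numbers below are illustrative, not from the paper.

```python
# Conjugate normal-normal sketch of a Bayesian Processor of Forecasts:
# prior w ~ N(mu0, sigma0^2); forecast f | w ~ N(a*w + b, sigma_f^2).
import numpy as np

mu0, sigma0 = 500.0, 120.0       # climatological prior on seasonal runoff
a, b, sigma_f = 0.9, 40.0, 80.0  # assumed calibration of forecast against runoff
f = 560.0                        # issued forecast

prec = 1 / sigma0**2 + a**2 / sigma_f**2                             # posterior precision
post_var = 1 / prec
post_mean = post_var * (mu0 / sigma0**2 + a * (f - b) / sigma_f**2)  # Gaussian update
print(f"posterior runoff: N({post_mean:.0f}, {np.sqrt(post_var):.0f}^2)")
```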
An Unofficial Guide to Web-Based Instructional Gaming and Simulation Resources.
ERIC Educational Resources Information Center
Kirk, James J.
Games and digital-based games and simulations are slowly becoming an accepted learning strategy. Public school teachers, college professors, corporate trainers, and military trainers are embracing games as an effective means of motivating learners and teaching complex concepts. Popular games include action games, adventure games, arcade games,…
A short-term ensemble wind speed forecasting system for wind power applications
NASA Astrophysics Data System (ADS)
Baidya Roy, S.; Traiteur, J. J.; Callicutt, D.; Smith, M.
2011-12-01
This study develops an adaptive, blended forecasting system to provide accurate wind speed forecasts 1 hour ahead of time for wind power applications. The system consists of an ensemble of 21 forecasts with different configurations of the Weather Research and Forecasting Single Column Model (WRFSCM) and a persistence model. The ensemble is calibrated against observations for a 2-month period (June-July 2008) at a potential wind farm site in Illinois using the Bayesian Model Averaging (BMA) technique. The forecasting system is evaluated against observations for August 2008 at the same site. The calibrated ensemble forecasts significantly outperform the forecasts from the uncalibrated ensemble while significantly reducing forecast uncertainty under all environmental stability conditions. The system also generates significantly better forecasts than persistence, autoregressive (AR), and autoregressive moving average (ARMA) models during the morning transition and the diurnal convective regimes. This forecasting system is computationally more efficient than traditional numerical weather prediction models and can generate a calibrated forecast, including model runs and calibration, in approximately 1 minute. Currently, hour-ahead wind speed forecasts are almost exclusively produced using statistical models. However, numerical models have several distinct advantages over statistical models, including the potential to provide turbulence forecasts. Hence, there is an urgent need to explore the role of numerical models in short-term wind speed forecasting. This work is a step in that direction and is likely to trigger a debate within the wind speed forecasting community.
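The BMA calibration step can be sketched with a deliberately stripped-down EM: one weight per ensemble member, with each member's predictive PDF taken as a fixed-spread Gaussian centred on its forecast. The training data, noise levels, and fixed spread are assumptions for illustration; full BMA also estimates the spread and bias-corrects the members.

```python
# Stripped-down BMA weight estimation by EM on synthetic forecast/obs pairs.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_members, n_times = 21, 500
truth = 8 + 2 * rng.standard_normal(n_times)                    # observed wind speed
noise = np.linspace(0.5, 3.0, n_members)[:, None]               # member quality varies
fcst = truth + noise * rng.standard_normal((n_members, n_times))

w = np.full(n_members, 1 / n_members)
sigma = 1.0                                                     # fixed kernel spread
for _ in range(200):                                            # EM iterations
    dens = w[:, None] * norm.pdf(truth, loc=fcst, scale=sigma)  # (members, times)
    z = dens / dens.sum(axis=0)                                 # responsibilities
    w = z.mean(axis=1)                                          # M-step for weights

print("BMA weights (members ordered best to worst):", np.round(w, 3))
```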
NASA Astrophysics Data System (ADS)
Stevens, N. T.; Keranen, K. M.; Lambert, C.
2016-12-01
Induced seismicity in northern Oklahoma presents a risk to infrastructure, but also an opportunity to gain new insights into earthquake processes [Petersen et al., 2016]. Here we present a double-difference tomographic study using TomoDD [Zhang and Thurber, 2003] in northern Oklahoma utilizing records from a dense broadband network over a 1-year period, constituting a catalog of over 10,000 local seismic events. We image a shallow (depth < 4 km) high-velocity structure consistent with the Nemaha uplift [Gay, 2003a], bounded by shallow, lower-velocity regions on either side, likely sedimentary strata at this depth bounding uplifted basement. Velocities within the uplift are lower than expected in subjacent crystalline basement rock (depth > 4 km). We suggest that this low-velocity anomaly stems from enhanced fracturing and/or weathering of the basement in the Nemaha uplift in northern Oklahoma. This velocity anomaly is not observed in basement off the shoulders of the structure, particularly to the southeast of the Nemaha bounding fault. Enhanced fracturing, and related increases in permeability, would ease pressure migration from injection wells linked to increased seismicity in the region, and may explain the relative absence of seismicity coincident with this structure compared to its periphery. References Gay, S. Parker, J. (2003), The Nemaha Trend-A System of Compressional Thrust-Fold, Strike-Slip Structural Features in Kansas and Oklahoma, Part 1, Shale Shak., 9-49. Petersen, M. D., C. S. Mueller, M. P. Moschetti, S. M. Hoover, A. L. Llenos, W. L. Ellsworth, A. J. Michael, J. L. Rubinstein, A. F. McGarr, and K. S. Rukstales (2016), 2016 One-Year Seismic Hazard Forecast for the Central and Eastern United States from Induced and Natural Earthquakes, Open-File Rep., doi:10.3133/OFR20161035. Zhang, H., and C. H. Thurber (2003), Double-difference tomography: The method and its application to the Hayward Fault, California, Bull. Seismol. Soc. Am., 93(5), 1875-1889, doi:10.1785/0120020190.
Violence in Teen-Rated Video Games
Haninger, Kevin; Ryan, M. Seamus; Thompson, Kimberly M
2004-01-01
Context: Children's exposure to violence in the media remains a source of public health concern; however, violence in video games rated T (for “Teen”) by the Entertainment Software Rating Board (ESRB) has not been quantified. Objective: To quantify and characterize the depiction of violence and blood in T-rated video games. According to the ESRB, T-rated video games may be suitable for persons aged 13 years and older and may contain violence, mild or strong language, and/or suggestive themes. Design: We created a database of all 396 T-rated video game titles released on the major video game consoles in the United States by April 1, 2001 to identify the distribution of games by genre and to characterize the distribution of content descriptors for violence and blood assigned to these games. We randomly sampled 80 game titles (which included 81 games because 1 title included 2 separate games), played each game for at least 1 hour, and quantitatively assessed the content. Given the release of 2 new video game consoles, Microsoft Xbox and Nintendo GameCube, and a significant number of T-rated video games released after we drew our random sample, we played and assessed 9 additional games for these consoles. Finally, we assessed the content of 2 R-rated films, The Matrix and The Matrix: Reloaded, associated with the T-rated video game Enter the Matrix. Main Outcome Measures: Game genre; percentage of game play depicting violence; depiction of injury; depiction of blood; number of human and nonhuman fatalities; types of weapons used; whether injuring characters, killing characters, or destroying objects is rewarded or is required to advance in the game; and content that may raise concerns about marketing T-rated video games to children. Results: Based on analysis of the 396 T-rated video game titles, 93 game titles (23%) received content descriptors for both violence and blood, 280 game titles (71%) received only a content descriptor for violence, 9 game titles (2%) received only a content descriptor for blood, and 14 game titles (4%) received no content descriptors for violence or blood. In the random sample of 81 T-rated video games we played, 79 games (98%) involved intentional violence for an average of 36% of game play time, and 34 games (42%) contained blood. More than half of the games (51%) depicted 5 or more types of weapons, with players able to select weapons in 48 games (59%). We observed 37 games (46%) that rewarded or required the player to destroy objects, 73 games (90%) that rewarded or required the player to injure characters, and 56 games (69%) that rewarded or required the player to kill. We observed a total of 11,499 character deaths in the 81 games, occurring at an average rate of 122 deaths per hour of game play (range 0 to 1310). This included 5689 human deaths, occurring at an average rate of 61 human deaths per hour of game play (range 0 to 1291). Overall, we identified 44 games (54%) that depicted deaths to nonhuman characters and 51 games (63%) that depicted deaths to human characters, including the player. Conclusions: Content analysis suggests a significant amount of violence, injury, and death in T-rated video games. Given the large amount of violence involving guns and knives, the relative lack of blood suggests that many T-rated video games do not realistically portray the consequences of violence. 
Physicians and parents should appreciate that T-rated video games may be a source of exposure to violence and some unexpected content for children and adolescents, and that the majority of T-rated video games provide incentives to the players to commit simulated acts of violence. PMID:15208514
Bae, Jonghoon; Cha, Young-Jae; Lee, Hyungsuk; Lee, Boyun; Baek, Sojung; Choi, Semin
2017-01-01
This study examines whether the way a person makes inferences about unknown events is associated with his or her social relations; more precisely, with ego network density, which reflects the structure of a person's immediate social relations. From an analysis of individual predictions of the Go match between AlphaGo and Sedol Lee in March 2016 in Seoul, Korea, this study shows that the low-density group scored higher than the high-density group in the accuracy of predicting a future state of a social event, i.e., the outcome of the first game. We corroborated this finding with three replication tests that asked the participants to predict the following: film awards, President Park's impeachment in Korea, and the counterfactual assessment of the US presidential election. Taken together, this study suggests that network density is negatively associated with vision advantage, i.e., the ability to discover and forecast an unknown aspect of a social event. PMID:28222114
Communicating uncertainty in hydrological forecasts: mission impossible?
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian
2010-05-01
Cascading uncertainty through meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step in improving the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information, and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts, and the test of its usefulness in assisting operational flood forecasting, are illustrated with the help of two case-studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case-study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the model output forecasted scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission; the suspense they create should instead act as a catalyst for overcoming the remaining challenges.
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been seen as one of the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, then their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with a hydrologic forecast is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of various sources of newly available information and the improvement of predictive performance assessment methods.
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
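For ordered-category probabilistic forecasts such as storm levels, the ranked probability score (RPS) compares cumulative forecast and observation distributions, and the skill score (RPSS) benchmarks it against climatology. A sketch on synthetic three-category forecasts (all probabilities are invented):

```python
# Ranked probability score (RPS) and skill vs climatology (RPSS) for
# synthetic ordered-category forecasts (e.g., storm level none/minor/major).
import numpy as np

def rps(prob, outcome_idx):
    """Mean RPS; prob is (n, categories), outcome_idx holds category indices."""
    cdf_f = np.cumsum(prob, axis=1)
    obs = np.zeros_like(prob)
    obs[np.arange(len(outcome_idx)), outcome_idx] = 1.0
    cdf_o = np.cumsum(obs, axis=1)
    return np.mean(np.sum((cdf_f - cdf_o) ** 2, axis=1))

rng = np.random.default_rng(11)
n = 300
outcomes = rng.choice(3, n, p=[0.7, 0.2, 0.1])
alpha = np.array([[6, 2, 1], [2, 6, 2], [1, 2, 6]])   # forecasts lean toward truth
forecasts = np.vstack([rng.dirichlet(alpha[o]) for o in outcomes])
climatology = np.tile([0.7, 0.2, 0.1], (n, 1))

rpss = 1 - rps(forecasts, outcomes) / rps(climatology, outcomes)
print(f"RPSS vs climatology: {rpss:.2f}")             # > 0 means skill over climatology
```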
A framework for improving a seasonal hydrological forecasting system using sensitivity analysis
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah
2017-04-01
Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation, and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, at present, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, ESP (Ensemble Streamflow Prediction) remains an attractive method for seasonal streamflow forecasting, as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts first have to be produced, while still producing skilful forecasts. There is thus a need to focus resources and time on improvements in dynamical seasonal hydrological forecasting systems, which will eventually lead to significant improvements in the skill of the streamflow forecasts generated. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of error in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover, initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are, however, most useful if they inform and change current operational practices. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system. This method is based on sensitivity analyses, informing the forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance, after each forecast update.
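The attribution logic behind such sensitivity analyses can be illustrated with an end-point toy experiment in the spirit of the ESP / reverse-ESP approach known from the literature: run a model with perfect IHC but climatological MF, and vice versa, and see which retains more skill. The one-line linear "hydrological model" and all coefficients below are invented purely for illustration.

```python
# Toy ESP / reverse-ESP attribution of seasonal forecast skill to initial
# hydrological conditions (IHC) versus meteorological forcing (MF).
import numpy as np

rng = np.random.default_rng(4)
n = 1000
ihc = rng.standard_normal(n)                 # initial storage anomaly
mf = rng.standard_normal(n)                  # seasonal forcing anomaly

def model(ihc, mf, a=0.7, b=0.3):
    return a * ihc + b * mf                  # seasonal flow anomaly

truth = model(ihc, mf)
esp = model(ihc, rng.permutation(mf))        # perfect IHC, climatological MF
rev_esp = model(rng.permutation(ihc), mf)    # climatological IHC, perfect MF

for name, sim in [("ESP (perfect IHC)", esp), ("revESP (perfect MF)", rev_esp)]:
    r = np.corrcoef(truth, sim)[0, 1]
    print(f"{name}: r = {r:.2f}")            # larger r => that component dominates
```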
What Older People Like to Play: Genre Preferences and Acceptance of Casual Games.
Chesham, Alvin; Wyss, Patric; Müri, René Martin; Mosimann, Urs Peter; Nef, Tobias
2017-04-18
In recent computerized cognitive training studies, video games have emerged as a promising tool that can benefit cognitive function and well-being. Whereas most video game training studies have used first-person shooter (FPS) action video games, subsequent studies found that older adults dislike this type of game and generally prefer casual video games (CVGs), a subtype of video games that are easy to learn and use simple rules and interfaces. Like other video games, CVGs are organized into genres (eg, puzzle games) based on the rule-directed interaction with the game. Importantly, game genre not only influences the ease of interaction and the cognitive abilities CVGs demand, but also affects whether older adults are willing to play any particular genre. To date, studies looking at how different CVG genres resonate with older adults are lacking. The aim of this study was to investigate how much older adults enjoy different CVG genres and how favorably their CVG characteristics are rated. A total of 16 healthy adults aged 65 years and above playtested 7 CVGs from 4 genres: casual action, puzzle, simulation, and strategy video games. Thereafter, they rated casual game preference and acceptance of casual game characteristics using 4 scales from the Core Elements of the Gaming Experience Questionnaire (CEGEQ). For this, participants rated how much they liked the game (enjoyment), understood the rules of the game (game-play), learned to manipulate the game (control), and made the game their own (ownership). Overall, enjoyment and acceptance of casual game characteristics were high and significantly above the midpoint of the rating scale for all CVG genres. Mixed model analyses revealed that ratings of enjoyment and casual game characteristics were significantly influenced by CVG genre. Participants' mean enjoyment of casual puzzle games (mean 0.95 out of 1.00) was significantly higher than that for casual simulation games (mean 0.75 and 0.73). For casual game characteristics, casual puzzle and simulation games were given significantly higher game-play ratings than casual action games. Similarly, participants' control ratings for casual puzzle games were significantly higher than those for casual action and simulation games. Finally, ownership was rated significantly higher for casual puzzle and strategy games than for casual action games. The findings of this study show that CVGs have characteristics that are suitable and enjoyable for older adults. In addition, genre was found to influence enjoyment and ratings of CVG characteristics, indicating that puzzle games are particularly easy to understand, learn, and play, and are enjoyable. Future studies should continue exploring the potential of CVG interventions for older adults in improving cognitive function, everyday functioning, and well-being. We see particular potential for CVGs in people suffering from cognitive impairment due to dementia or brain injury.
Inclusive Competitive Game Play Through Balanced Sensory Feedback.
Westin, Thomas; Söderström, David; Karlsson, Olov; Peiris, Ranil
2017-01-01
While game accessibility has improved significantly over the last few years, there are still barriers to equal participation, and multiplayer issues have been less researched. Game balance here means making a player-versus-player competitive game fair. One difficult design task is to balance the game to be fair regardless of visual or hearing capabilities, which impose clearly different requirements. This paper explores a tentative design method for enabling inclusive competitive game-play without individual adaptations of game rules that could spoil the game. The method involved applying a unified design method to design an unbalanced game, then modifying visual feedback as a hypothetical balanced design, and testing the game with a total of 52 people with and without visual or hearing disabilities in three workshops. Game balance was evaluated based on score differences and less structured qualitative data, and a redesign of the game was made. The conclusions comprise a tentative method for balancing a multiplayer competitive game without changing the game rules, and guidance on how the method can be applied.
ERIC Educational Resources Information Center
Klopfenstein, Bruce C.
1989-01-01
Describes research that examined the strengths and weaknesses of technological forecasting methods by analyzing forecasting studies made for home video players. The discussion covers assessments and explications of correct and incorrect forecasting assumptions, and their implications for forecasting the adoption of home information technologies…
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainty always accompanies a forecast; it may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and predicting it. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, how these uncertainties are transported and aggregated during these processes will be described.
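As a rough illustration of the two ingredients combined above, the following sketch (ours, not the study's code) uses a periodogram to identify a dominant streamflow period and the Shannon entropy of binned forecast errors as an uncertainty measure; the synthetic monthly series and all variable names are assumptions.

```python
import numpy as np

def dominant_period(flow, dt=1.0):
    """Identify the dominant period in a streamflow series via the periodogram."""
    flow = flow - flow.mean()
    power = np.abs(np.fft.rfft(flow)) ** 2
    freqs = np.fft.rfftfreq(flow.size, d=dt)
    k = np.argmax(power[1:]) + 1          # skip the zero-frequency bin
    return 1.0 / freqs[k]

def shannon_entropy(errors, bins=20):
    """Shannon entropy (nats) of a sample of forecast errors."""
    counts, _ = np.histogram(errors, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Synthetic monthly streamflow with an annual cycle plus noise.
rng = np.random.default_rng(0)
months = np.arange(240)
flow = 100 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, months.size)
print(dominant_period(flow))               # ~12 months
print(shannon_entropy(rng.normal(0, 5, 1000)))
```

A wider error distribution yields a larger entropy, which is the sense in which entropy serves here as an uncertainty measure.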
A national-scale seasonal hydrological forecast system: development and evaluation over Britain
NASA Astrophysics Data System (ADS)
Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.
2017-09-01
Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated and the potential for seasonal hydrological forecasting in the UK is now being explored. One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (hindcasts) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches, such as an ensemble of historical rainfall in a hydrological model or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for spring and summer seasonal hydrological forecasts; however, autumn and winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64 %) in the 1-month ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month lead time, GloSea5 forecasts account for ~70 % of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30 % of the skill arises from hydrological memory (typically in groundwater-dominated areas). Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.
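Skill comparisons of this kind are commonly summarized with a mean-squared-error skill score against a reference forecast such as persistence or climatology. The snippet below is a generic sketch of that bookkeeping under assumed toy data, not the study's verification code.

```python
import numpy as np

def mse_skill_score(forecast, reference, observed):
    """MSE-based skill score: 1 = perfect, 0 = no better than the reference."""
    mse_f = np.mean((np.asarray(forecast) - np.asarray(observed)) ** 2)
    mse_r = np.mean((np.asarray(reference) - np.asarray(observed)) ** 2)
    return 1.0 - mse_f / mse_r

# Example: seasonal mean flows with persistence as the reference forecast.
obs = np.array([55.0, 42.0, 61.0, 48.0])
persistence = np.array([50.0, 55.0, 42.0, 61.0])   # last observed season
model = np.array([53.0, 45.0, 58.0, 50.0])
print(mse_skill_score(model, persistence, obs))
```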
Improving medium-range and seasonal hydroclimate forecasts in the southeast USA
NASA Astrophysics Data System (ADS)
Tian, Di
Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS), based on different approaches, were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P), and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and the GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind. The CFSv2 model could better predict ETo in cold seasons during El Niño Southern Oscillation (ENSO) events only when the forecast initial condition was within an ENSO event. Downscaled P and T2M forecasts were produced either by directly downscaling the NMME P and T2M output or indirectly by using the NMME forecasts of Niño 3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform that of the best single model.
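A natural-analog forecast of the kind evaluated here can be sketched as a nearest-neighbour search over a reforecast archive: find the past forecasts most similar to today's forecast and average the observations that accompanied them. The code below is a generic illustration under assumed array shapes, not the GFS/GEFS analog implementation (constructed analogs, by contrast, build an optimal linear combination of past forecasts).

```python
import numpy as np

def natural_analog_forecast(todays_forecast, archive_forecasts, archive_obs, k=10):
    """Average the observations that accompanied the k most similar past forecasts."""
    dist = np.linalg.norm(archive_forecasts - todays_forecast, axis=1)
    nearest = np.argsort(dist)[:k]
    return archive_obs[nearest].mean(axis=0)

# Archive of past forecast patterns (rows) and matching observed ETo values.
rng = np.random.default_rng(1)
past_fc = rng.normal(size=(1000, 5))     # e.g., Tmean, RH, wind, Rs, pressure
past_obs = past_fc[:, 0] * 0.4 + rng.normal(0, 0.1, 1000)
print(natural_analog_forecast(past_fc[0], past_fc, past_obs, k=25))
```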
Mobile Game for Learning Bacteriology
ERIC Educational Resources Information Center
Sugimura, Ryo; Kawazu, Sotaro; Tamari, Hiroki; Watanabe, Kodai; Nishimura, Yohei; Oguma, Toshiki; Watanabe, Katsushiro; Kaneko, Kosuke; Okada, Yoshihiro; Yoshida, Motofumi; Takano, Shigeru; Inoue, Hitoshi
2014-01-01
This paper treats serious games. Recently, the genre known as serious games, which have purposes beyond enjoyment such as education and training, has become popular. Learning games in particular seem very attractive to the video game generation, so the authors developed a mobile game for learning…
Math Games for the Young Child.
ERIC Educational Resources Information Center
Azzolino, Agnes
This is a textbook of games for children of ages two through seven. In each section, games are listed from the basic to the more sophisticated and advanced. The book contains sections addressing: (1) counting and counting games; (2) travel games; (3) card games; (4) board games; and (5) games and activities with other things. (PK)
Kuss, Daria J; Louws, Jorik; Wiers, Reinout W
2012-09-01
Recently, there have been growing concerns about excessive online gaming. Playing Massively Multiplayer Online Role-Playing Games (MMORPGs) appears to be particularly problematic, because these games require a high degree of commitment and time investment from players, to the detriment of occupational, social, and other recreational activities and relations. A number of gaming motives have been linked to excessive online gaming in adolescents and young adults. We assessed 175 current MMORPG players and 90 nonplayers using a Web-based questionnaire regarding their gaming behavior, problems arising as consequences of gaming, and gaming motivations, and tested their statistical associations. Results indicated that (a) MMORPG players are significantly more likely to experience gaming-related problems relative to nonplayers, and that (b) the gaming motivations escapism and mechanics significantly predicted excessive gaming and appeared to be stronger predictors than time investment in the game. The findings support the necessity of using measures that distinguish between different types of online games. In addition, this study is useful for the current discussion on establishing (online) gaming addiction as a diagnosis in future categorizations of psychopathology.
A new Bayesian recursive technique for parameter estimation
NASA Astrophysics Data System (ADS)
Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis
2006-08-01
The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper to two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture, and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimal training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
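The iterative narrowing of parameter bounds described above can be caricatured in a few lines: sample the current box, keep the fittest samples, and shrink the box to their envelope. The following is a schematic of that general idea under assumed names and a toy loss, not the published LOBARE algorithm.

```python
import numpy as np

def narrow_bounds(loss, lower, upper, n_samples=200, keep=0.2, iters=10, seed=0):
    """Iteratively shrink parameter bounds around the lowest-loss samples."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    for _ in range(iters):
        samples = rng.uniform(lower, upper, size=(n_samples, lower.size))
        losses = np.array([loss(s) for s in samples])
        best = samples[np.argsort(losses)[: int(keep * n_samples)]]
        lower, upper = best.min(axis=0), best.max(axis=0)   # new "parent" bounds
    return lower, upper

# Toy calibration: recover parameters (2.0, -1.0) of a quadratic loss.
target = np.array([2.0, -1.0])
lo, hi = narrow_bounds(lambda p: np.sum((p - target) ** 2), [-10, -10], [10, 10])
print(lo, hi)   # both bounds converge close to the target
```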
Advancing precision cosmology with 21 cm intensity mapping
NASA Astrophysics Data System (ADS)
Masui, Kiyoshi Wesley
In this thesis we make progress toward establishing 21 cm intensity mapping as a sensitive and efficient observational method for mapping the large-scale structure of the Universe. In Part I we undertake theoretical studies to better understand the potential of intensity mapping. This includes forecasting the ability of intensity mapping experiments to constrain alternative explanations to dark energy for the Universe's accelerated expansion. We also consider how 21 cm observations of the neutral gas in the early Universe (after recombination but before reionization) could be used to detect primordial gravitational waves, thus providing a window into cosmological inflation. Finally, we show that scientifically interesting measurements could in principle be performed using intensity mapping in the near term, using existing telescopes in pilot surveys or prototypes for larger dedicated surveys. Part II describes observational efforts to perform some of the first measurements using 21 cm intensity mapping. We develop a general data analysis pipeline for analyzing intensity mapping data from single-dish radio telescopes. We then apply the pipeline to observations using the Green Bank Telescope. By cross-correlating the intensity mapping survey with a traditional galaxy redshift survey we put a lower bound on the amplitude of the 21 cm signal. The auto-correlation provides an upper bound on the signal amplitude, and we thus constrain the signal from both above and below. This pilot survey represents a pioneering effort in establishing 21 cm intensity mapping as a probe of the Universe.
The effect of online violent video games on levels of aggression.
Hollingdale, Jack; Greitemeyer, Tobias
2014-01-01
In recent years the video game industry has surpassed both the music and video industries in sales. Violent video games are currently among the most popular video games played by consumers, most notably first-person shooters (FPS). Technological advancements in the game play experience, including the ability to play online, have accounted for this increase in popularity. Previous research, utilising the General Aggression Model (GAM), has identified that violent video games increase levels of aggression. Little is known, however, about the effect of playing a violent video game online. Participants (N = 101) were randomly assigned to one of four experimental conditions: neutral video game--offline, neutral video game--online, violent video game--offline, and violent video game--online. Following this, they completed questionnaires to assess their attitudes towards the game and engaged in a chilli sauce paradigm to measure behavioural aggression. The results identified that participants who played a violent video game exhibited more aggression than those who played a neutral video game. Furthermore, this main effect was not significantly stronger when the game was played online. These findings suggest that playing violent video games, whether online or offline, increases aggression compared to playing neutral video games.
Clinical predictors of gaming abstinence in help-seeking adult problematic gamers.
King, Daniel L; Adair, Cam; Saunders, John B; Delfabbro, Paul H
2018-03-01
Research into the effectiveness of interventions for problematic gaming has been limited by a lack of data concerning the clinical characteristics of voluntary treatment-seekers, the nature and history of their gaming problems, and their reasons for seeking help. The study aimed to identify variables predictive of short-term commitment to gaming abstinence following initial voluntary contact with an online help service. A total of 186 adult gamers with gaming-related problems were recruited online. Participants completed the DSM-5 Internet gaming disorder (IGD) checklist, the Depression Anxiety Stress Scales-21, the Internet Gaming Cognition Scale, the Gaming Craving Scale, and the Gaming Quality of Life Scale. A one-week follow-up survey assessed adherence to intended gaming abstinence. Abstainers were less likely to have withdrawal symptoms and less likely to play action shooting games. Participants with mood symptoms (40% of the total) reported significantly more IGD symptoms, stronger maladaptive gaming cognitions (e.g., overvaluing game rewards), more previous occurrences of gaming problems, and poorer quality of life. However, mood symptoms did not predict abstinence from or continuation of gaming. Adults with gaming disorder seeking help to reduce their gaming may benefit initially from strategies that manage withdrawal and from psychoeducation about riskier gaming activities. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.
2015-12-01
Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question, because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the previous weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system, enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated against county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer range forecasts so that public health workers can better prepare for dengue epidemics.
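The authors' model is process-based, but the weekly cycle of re-parameterizing on the latest case counts and then forecasting with new weather input can be caricatured with a simple refit-and-predict loop. Everything below (the linear stand-in model, the synthetic series, the variable names) is an illustrative assumption, not the study's epidemiological model.

```python
import numpy as np

def weekly_forecast_loop(cases, temp, window=12):
    """Each week, refit a simple lagged model on recent data, then forecast next week."""
    forecasts = []
    for t in range(window, len(cases) - 1):
        X = np.column_stack([cases[t - window:t], temp[t - window:t]])
        y = cases[t - window + 1:t + 1]          # next-week case counts
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        forecasts.append(np.array([cases[t], temp[t]]) @ coef)
    return np.array(forecasts)

# Two years of synthetic weekly temperatures and case counts.
rng = np.random.default_rng(3)
weeks = np.arange(104)
temp = 27 + 2 * np.sin(2 * np.pi * weeks / 52)
cases = 20 + 5 * np.sin(2 * np.pi * (weeks - 8) / 52) + rng.normal(0, 1, weeks.size)
print(weekly_forecast_loop(cases, temp)[:5])
```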
NASA Astrophysics Data System (ADS)
Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.
2013-10-01
This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel radar-based ensemble forecasting chains for flash-flood early warning are investigated in three catchments in the southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first radar-based ensemble forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system that predicts orographic rainfall for the following eight hours. The second ensemble forecasting system evaluated is REAL-C2, in which the numerical weather prediction model COSMO-2 is initialised with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, owing to the presence of orographic forcing. A clear preference was found for the ensemble approach. Discharge forecasts perform better when forced by NORA and REAL-C2 rather than by deterministic weather radar data. Moreover, it was observed that using an ensemble of initial conditions at the forecast initialisation, as in REAL-C2, significantly improved the forecast skill. These forecasts also perform better than forecasts forced by ensemble rainfall forecasts (NORA) initialised from a single initial condition of the hydrological model. Thus the best results were obtained with the REAL-C2 forecasting chain. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.
Evaluation of Flood Forecast and Warning in Elbe river basin - Impact of Forecaster's Strategy
NASA Astrophysics Data System (ADS)
Danhelka, Jan; Vlasak, Tomas
2010-05-01
The Czech Hydrometeorological Institute (CHMI) is responsible for flood forecasting and warning in the Czech Republic. To that end, CHMI operates hydrological forecasting systems and publishes flow forecasts for selected profiles. Flood forecasting and warning is the output of a system that links observation (flow and atmosphere), data processing, weather forecasting (especially NWP QPF), hydrological modeling, and the evaluation and interpretation of model outputs by a forecaster. Forecast users are interested in the final output, without separating the uncertainties of the individual steps of this process. Therefore, an evaluation of the final operational forecasts produced by the AquaLog forecasting system during the period 2002 to 2008 was carried out for profiles within the Elbe river basin. The effects of uncertainties in observation, data processing, and especially meteorological forecasts were not accounted for separately. Forecasting the exceedance of flood levels (peak over threshold) during the forecast period was the main criterion, as the forecast of a flow increase is of the highest importance. Other evaluation criteria included peak flow and volume differences. In addition, the Nash-Sutcliffe efficiency was computed separately for each time step (1 to 48 h) of the forecast period to identify its change with lead time. Textual flood warnings are issued for administrative regions to initiate flood protection actions when a flood threatens. The flood warning hit rate was evaluated at the regional and national levels. The evaluation found significant differences in model forecast skill between forecasting profiles; in particular, lower skill was found at small headwater basins, where QPF uncertainty dominates. The average hit rate was 0.34 (miss rate = 0.33, false alarm rate = 0.32). However, the observed spatial differences are likely also influenced by the different fit of parameter sets (due to different basin characteristics) and, importantly, by the differing impact of the human factor. The results suggest that the practice of interactive model operation, experience, and forecasting strategy differ between the responsible forecasting offices. Warnings are based on the interpretation of model outputs by a hydrologist-forecaster. The warning hit rate reached 0.60 for a threshold set at the lowest flood stage, of which 0.11 was underestimation of the flood degree (miss 0.22, false alarm 0.28). The critical success index of the model forecast was 0.34, while the same criterion for warnings reached 0.55. We assume that the increase is due not only to the change of scale from a single forecasting point to a region for warnings, but also partly to the forecaster's added value. There is no officially preferred warning strategy in the Czech Republic (e.g., tolerance of a higher false alarm rate); therefore, the forecaster's decisions and personal strategy are of great importance. The results show quite successful warning for 1st flood level exceedance, over-warning for the 2nd flood level, but under-warning for the 3rd (highest) flood level. This suggests a general forecaster preference for medium-level warnings (the 2nd flood level is legally determined to be the start of a flood and of flood protection activities). In conclusion, the human forecaster's experience and analytical skill notably increase flood warning performance. However, society's preferences should be specifically addressed in the warning strategy definition to support the forecaster's decision making.
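The scores quoted above all derive from a contingency table of forecast versus observed threshold exceedances. Definitions vary between studies (the rates here appear to be normalized over all hits, misses, and false alarms); the sketch below uses one common convention and labels it explicitly, with hypothetical counts rather than the CHMI data.

```python
def contingency_scores(hits, misses, false_alarms):
    """Categorical verification scores from a 2x2 contingency table."""
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, far, csi

# Example counts (hypothetical):
print(contingency_scores(hits=34, misses=33, false_alarms=32))
# csi = 34/99 ≈ 0.34, in line with the model-forecast CSI reported above
```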
An Optimization of Inventory Demand Forecasting in University Healthcare Centre
NASA Astrophysics Data System (ADS)
Bon, A. T.; Ng, T. K.
2017-01-01
The healthcare industry has become an important field nowadays as it concerns people's health. Accordingly, forecasting demand for health services is an important step in managerial decision making for all healthcare organizations. A case study was conducted in a University Health Centre to collect historical demand data for Panadol 650 mg over 68 months, from January 2009 until August 2014. The aim of the research is to optimize overall inventory demand management through forecasting techniques. Quantitative (time series) forecasting models were used in the case study to forecast future data as a function of past data. The data pattern needs to be identified before applying the forecasting techniques; here the data exhibit a trend. Ten forecasting techniques were then applied using the Risk Simulator software: single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winters' additive, seasonal additive, Holt-Winters' multiplicative, seasonal multiplicative, and the Autoregressive Integrated Moving Average (ARIMA). The best forecasting technique is the one with the smallest forecasting error. According to the forecasting accuracy measurement, the best technique is regression analysis.
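To give a flavor of the comparison, the sketch below implements two of the ten techniques (single moving average and single exponential smoothing) and ranks them by mean absolute percentage error. The demand series and parameter choices are illustrative assumptions, not the study's Panadol data.

```python
import numpy as np

def moving_average_forecast(x, window=3):
    """One-step-ahead single moving average forecasts."""
    return np.array([np.mean(x[t - window:t]) for t in range(window, len(x))])

def exp_smoothing_forecast(x, alpha=0.3):
    """One-step-ahead single exponential smoothing forecasts."""
    level, out = x[0], []
    for obs in x[1:]:
        out.append(level)                      # forecast before seeing obs
        level = alpha * obs + (1 - alpha) * level
    return np.array(out)

def mape(actual, forecast):
    return 100 * np.mean(np.abs((actual - forecast) / actual))

demand = np.array([120, 135, 128, 150, 148, 160, 155, 170, 165, 180], float)
ma = moving_average_forecast(demand, window=3)
es = exp_smoothing_forecast(demand, alpha=0.3)
print("MA MAPE:", mape(demand[3:], ma))
print("ES MAPE:", mape(demand[1:], es))
```

The same error metric can be computed for every candidate technique, and the one with the smallest error chosen, which mirrors the selection procedure the abstract describes.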
Air Pollution Forecasts: An Overview
Bai, Lu; Wang, Jianzhou; Lu, Haiyan
2018-01-01
Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies. PMID:29673227
Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Hemri, S.; Klein, B.
2017-11-01
Inland waterway transport benefits from probabilistic forecasts of water levels, as they allow operators to optimize the ship load and, hence, to minimize transport costs. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts with which they are driven. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows these systematic errors to be reduced by fitting a statistical model to training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine, with training periods selected based on SD, HMA, and DTW, compare favorably with reference EMOS forecasts, which are based either on seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
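Of the three hydrograph similarity measures named, dynamic time warping is the easiest to sketch. The classic O(nm) recursion below is a textbook version for picking analog hydrographs, not the operational implementation; the two example hydrographs are invented.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two hydrographs (1-D series)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Two forecast hydrographs with similar shape but shifted peak timing:
h1 = [2.0, 2.5, 4.0, 6.5, 5.0, 3.5, 3.0]
h2 = [2.0, 2.2, 2.6, 4.1, 6.4, 5.1, 3.4]
print(dtw_distance(h1, h2))   # small value -> good analog candidate
```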
Michael A. Fosberg
1987-01-01
Future improvements in the meteorological forecasts used in fire management will come from improvements in three areas: observational systems, forecast techniques, and postprocessing of forecasts, together with better integration of this information into the fire management process.
Accuracy of short‐term sea ice drift forecasts using a coupled ice‐ocean model
Zhang, Jinlun
2015-01-01
Abstract Arctic sea ice drift forecasts of 6 h–9 days for the summer of 2014 are generated using the Marginal Ice Zone Modeling and Assimilation System (MIZMAS); the model is driven by 6 h atmospheric forecasts from the Climate Forecast System (CFSv2). Forecast ice drift speed is compared to drifting buoys and other observational platforms. Forecast positions are compared with actual positions 24 h–8 days since forecast. Forecast results are further compared to those from the forecasts generated using an ice velocity climatology driven by multiyear integrations of the same model. The results are presented in the context of scheduling the acquisition of high‐resolution images that need to follow buoys or scientific research platforms. RMS errors for ice speed are on the order of 5 km/d for 24–48 h since forecast using the sea ice model compared with 9 km/d using climatology. Predicted buoy position RMS errors are 6.3 km for 24 h and 14 km for 72 h since forecast. Model biases in ice speed and direction can be reduced by adjusting the air drag coefficient and water turning angle, but the adjustments do not affect verification statistics. This suggests that improved atmospheric forecast forcing may further reduce the forecast errors. The model remains skillful for 8 days. Using the forecast model increases the probability of tracking a target drifting in sea ice with a 10 km × 10 km image from 60 to 95% for a 24 h forecast and from 27 to 73% for a 48 h forecast. PMID:27818852
Two-step forecast of geomagnetic storm using coronal mass ejection and solar wind condition
Kim, R-S; Moon, Y-J; Gopalswamy, N; Park, Y-D; Kim, Y-H
2014-01-01
To forecast geomagnetic storms, we previously examined the initially observed parameters of coronal mass ejections (CMEs) and introduced an empirical storm forecast model. Here we suggest a two-step forecast that considers not only CME parameters observed in the solar vicinity but also solar wind conditions near Earth, to improve the forecast capability. We consider the empirical solar wind criteria derived in this study (Bz ≤ −5 nT or Ey ≥ 3 mV/m for t ≥ 2 h, for moderate storms with minimum Dst less than −50 nT) and a Dst model developed by Temerin and Li (2002, 2006) (TL model). Using 55 CME-Dst pairs from 1997 to 2003, our solar wind criteria produce slightly better forecasts for the 31 storm events (90%) than forecasts based on the TL model (87%). However, the latter produces better forecasts for the 24 nonstorm events (88%), while the former correctly forecasts only 71% of them. We then performed the two-step forecast. The results are as follows: (i) for the 15 events that are incorrectly forecasted using CME parameters, 12 cases (80%) can be properly predicted based on solar wind conditions; (ii) if we forecast a storm when both CME and solar wind conditions are satisfied (∩), the critical success index becomes higher than that from the forecast using CME parameters alone; however, only 25 storm events (81%) are correctly forecasted; and (iii) if we forecast a storm when either set of conditions is satisfied (∪), all geomagnetic storms are correctly forecasted. PMID:26213515
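The solar wind criterion is simple enough to state in code: flag a storm when Bz ≤ −5 nT or Ey ≥ 3 mV/m persists for at least 2 hours. The sketch below assumes hourly input arrays and invented values; the AND/OR combination with the CME-based prediction at the end mirrors cases (ii) and (iii) above.

```python
import numpy as np

def solar_wind_storm_flag(bz, ey, min_hours=2):
    """True if Bz <= -5 nT or Ey >= 3 mV/m holds for min_hours consecutive samples."""
    condition = (np.asarray(bz) <= -5.0) | (np.asarray(ey) >= 3.0)
    run = 0
    for ok in condition:
        run = run + 1 if ok else 0
        if run >= min_hours:
            return True
    return False

# Hourly values around a CME arrival (illustrative numbers):
bz = [1.0, -2.0, -6.5, -7.0, -5.5, 0.5]
ey = [0.2, 0.8, 2.5, 3.4, 3.1, 0.4]
cme_predicts_storm = True                       # step 1: CME-based model
sw = solar_wind_storm_flag(bz, ey)              # step 2: near-Earth solar wind
print("AND rule:", cme_predicts_storm and sw)   # fewer false alarms
print("OR rule:", cme_predicts_storm or sw)     # caught every storm in the study
```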
Computer Games as Instructional Tools.
ERIC Educational Resources Information Center
Bright, George W.; Harvey, John G.
1984-01-01
Defines games, instructional games, and computer instructional games; discusses several unique capabilities that facilitate game playing and may make computer games more attractive to students than noncomputer alternatives; and examines the potential disadvantages of using instructional computer games on a regular basis. (MBR)
Remote gaming on resource-constrained devices
NASA Astrophysics Data System (ADS)
Reza, Waazim; Kalva, Hari; Kaufman, Richard
2010-08-01
Games have become important applications on mobile devices. A mobile gaming approach known as remote gaming is being developed to support games on low-cost mobile devices. In the remote gaming approach, the responsibility for rendering a game and advancing the game play is put on remote servers instead of the resource-constrained mobile devices. The games rendered on the servers are encoded as video and streamed to mobile devices. Mobile devices gather user input and stream the commands back to the servers to advance game play. With this solution, mobile devices with video playback and network connectivity can become game consoles. In this paper we present the design and development of such a system and evaluate the performance and design considerations needed to maximize the end-user gaming experience.
NASA Astrophysics Data System (ADS)
Brown, G.
2017-12-01
Sediment diversions have been proposed as a crucial component of the restoration of Coastal Louisiana. They are generally characterized as a means of creating land by mimicking natural crevasse-splay sub-delta processes. However, the criteria that are often promoted to optimize the performance of these diversions (i.e., large, sand-rich diversions into existing, degraded wetlands) are at odds with the natural processes that govern the development of crevasse-splay sub-deltas (typically sand-lean or sand-neutral diversions into open water). This is due in large part to the fact that these optimization criteria have been developed without consideration of the natural constraints associated with fundamental hydraulics, specifically the conservation of mechanical energy. Although the implementation of the aforementioned optimization criteria has the potential to greatly increase the land-building capacity of a given diversion, the concomitant widespread inundation of the existing wetlands (an unavoidable consequence of diverting into a shallow, vegetated embayment), and the resultant stresses on existing wetland vegetation, have the potential to dramatically accelerate the loss of these existing wetlands. Hence, there are inherent uncertainties in the forecasted performance of sediment diversions that are designed according to the criteria mentioned above. This talk details the reasons for these uncertainties, using analytic and numerical model results, together with evidence from field observations and experiments. The likelihood that, in the foreseeable future, these uncertainties can be reduced, or even rationally bounded, is discussed.
Medium-range fire weather forecasts
J.O. Roads; K. Ueyoshi; S.C. Chen; J. Alpert; F. Fujioka
1991-01-01
The forecast skill of the National Meteorological Center's medium range forecast (MRF) numerical forecasts of fire weather variables is assessed for the period June 1, 1988 to May 31, 1990. Near-surface virtual temperature, relative humidity, wind speed and a derived fire weather index (FWI) are forecast well by the MRF model. However, forecast relative humidity has...
Probability fire weather forecasts show promise in 3-year trial
Paul G. Scowcroft
1970-01-01
Probability fire weather forecasts were compared with categorical and climatological forecasts in a trial in southern California during the 1965-1967 fire seasons. Equations were developed to express the reliability of forecasts and degree of skill shown by the forecaster. Evaluation of 336 daily reports suggests that probability forecasts were more reliable. For...
NASA Products to Enhance Energy Utility Load Forecasting
NASA Technical Reports Server (NTRS)
Lough, G.; Zell, E.; Engel-Cox, J.; Fungard, Y.; Jedlovec, G.; Stackhouse, P.; Homer, R.; Biley, S.
2012-01-01
Existing energy load forecasting tools rely upon historical load and forecasted weather to predict load within energy company service areas. The shortcomings of load forecasts are often the result of weather forecasts that are not at a fine enough spatial or temporal resolution to capture local-scale weather events. This project aims to improve the performance of load forecasting tools through the integration of high-resolution, weather-related NASA Earth Science Data, such as temperature, relative humidity, and wind speed. Three companies are participating in operational testing: one natural gas company and two electric providers. Operational results comparing load forecasts with and without NASA weather forecasts have been generated since March 2010. We have worked with end users at the three companies to refine the selection of weather forecast information and optimize load forecast model performance. The project will conclude in 2012 by transitioning the documented improvements from the inclusion of NASA forecasts into sustained use by energy utilities nationwide in a variety of load forecasting tools. In addition, Battelle has consulted with energy companies nationwide to document their information needs for long-term planning, in light of climate change and regulatory impacts.
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models is compared for seven events since 2006, based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after the fact provide improved methods for real-time forecasting of future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that assimilating sea level data into the models increases accuracy by approximately 15% for the events examined.
Seasonal Water Balance Forecasts for Drought Early Warning in Ethiopia
NASA Astrophysics Data System (ADS)
Spirig, Christoph; Bhend, Jonas; Liniger, Mark
2016-04-01
Droughts severely impact Ethiopian agricultural production. Successful early warning of drought conditions in the upcoming harvest season therefore contributes to better managing food shortages arising from adverse climatic conditions. So far, however, meteorological seasonal forecasts have not been used in Ethiopia's national food security early warning system (the LEAP platform). Here we analyse the forecast quality of seasonal forecasts of total rainfall and of the meteorological water balance as a proxy for plant-available water. We analyse the forecast skill of June to September rainfall and water balance from dynamical seasonal forecast systems, the ECMWF System4 and EC-EARTH global forecasting systems. Rainfall forecasts outperform forecasts assuming a stationary climate mainly in north-eastern Ethiopia, an area that is particularly vulnerable to droughts. Forecasts of the water balance index appear to be even more skilful, and thus more useful, than pure rainfall forecasts, although the results vary across lead times and skill measures. We further explore the potential added value of dynamically downscaling the forecasts through several dynamical regional climate models made available through the EU FP7 project EUPORIAS. Preliminary results suggest that dynamically downscaled seasonal forecasts are not significantly better than seasonal forecasts from the global models. We conclude that seasonal forecasts of a simple climate index such as the water balance have the potential to benefit drought early warning in Ethiopia, owing both to their positive predictive skill and to their greater usefulness compared with seasonal mean quantities.
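At its simplest, the meteorological water balance used here is seasonal precipitation minus evapotranspiration. The snippet below shows that bookkeeping plus an anomaly-correlation skill check; the arrays are invented hindcast numbers, and the exact index definition in the study may differ.

```python
import numpy as np

def water_balance_index(precip, pet):
    """Seasonal meteorological water balance: rainfall minus evapotranspiration."""
    return np.asarray(precip, float) - np.asarray(pet, float)

def anomaly_correlation(forecast, observed):
    """Correlation of forecast and observed anomalies, a common skill measure."""
    f = np.asarray(forecast, float) - np.mean(forecast)
    o = np.asarray(observed, float) - np.mean(observed)
    return np.sum(f * o) / np.sqrt(np.sum(f ** 2) * np.sum(o ** 2))

# June-September totals (mm) over a run of hindcast years (illustrative):
p_fc, pet_fc = np.array([480, 520, 390, 610, 450.0]), np.array([400, 390, 420, 380, 410.0])
p_obs, pet_obs = np.array([500, 540, 360, 650, 430.0]), np.array([410, 385, 430, 375, 415.0])
print(anomaly_correlation(water_balance_index(p_fc, pet_fc),
                          water_balance_index(p_obs, pet_obs)))
```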
A Decision Support System for effective use of probability forecasts
NASA Astrophysics Data System (ADS)
De Kleermaeker, Simone; Verkade, Jan
2013-04-01
Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision making. A stricter separation of responsibilities between forecasters and decision makers can also be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making under uncertainty, and forecast verification. The revised separation of responsibilities also requires a shift in institutional arrangements. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of the costs of management measures, or of damage reduction are available. This paper presents the results of the study, including suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.
Evaluation of statistical models for forecast errors from the HBV model
NASA Astrophysics Data System (ADS)
Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur
2010-04-01
Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) the median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order autoregressive model was constructed for the forecast errors; the parameters were conditioned on weather classes. In the second model, the normal quantile transformation (NQT) was applied to observed and forecasted inflows before a similar first-order autoregressive model was constructed for the forecast errors. In the third model, positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow, and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency (Reff) increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that their distributions are less reliable than Model 3's. For Model 3 the median values did not fit well, since the autocorrelation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the two other models. At the same time, Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
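Model 1's structure (Box-Cox transform the flows, then fit a first-order autoregressive model to the forecast errors) can be sketched as follows. The lambda value, the synthetic data, and the plain least-squares AR(1) fit are assumptions for illustration, not the paper's estimation procedure.

```python
import numpy as np

def boxcox(x, lam=0.3):
    """Box-Cox transform; lam = 0 reduces to the log transform."""
    x = np.asarray(x, float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def fit_ar1(errors):
    """Least-squares estimate of phi in e_t = phi * e_{t-1} + noise."""
    e = np.asarray(errors, float)
    return np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)

# Synthetic observed and forecasted inflows (positive-valued series):
rng = np.random.default_rng(2)
obs = rng.gamma(5.0, 10.0, 300)
fc = obs * rng.normal(1.0, 0.1, 300)
err = boxcox(obs) - boxcox(fc)
phi = fit_ar1(err)
next_err = phi * err[-1]          # one-step-ahead error correction
print(phi, next_err)
```

The predicted error `next_err` is then subtracted from the transformed forecast before back-transforming, which is how an error model of this type improves the median forecast.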
An Overview of Structural Characteristics in Problematic Video Game Playing.
Griffiths, Mark D; Nuyens, Filip
2017-01-01
There are many different factors involved in how and why people develop problems with video game playing. One such set of factors concerns the structural characteristics of video games (i.e., the structure, elements, and components of the video games themselves). Much of the research examining the structural characteristics of video games was initially based on research and theorizing from the gambling studies field. The present review briefly overviews the key papers in the field to date. The paper examines a number of areas including (i) similarities in structural characteristics of gambling and video gaming, (ii) structural characteristics in video games, (iii) narrative and flow in video games, (iv) structural characteristic taxonomies for video games, and (v) video game structural characteristics and game design ethics. Many of the studies carried out to date are small-scale, and comprise self-selected convenience samples (typically using self-report surveys or non-ecologically valid laboratory experiments). Based on the small amount of empirical data, it appears that structural features that take a long time to achieve in-game are the ones most associated with problematic video game play (e.g., earning experience points, managing in-game resources, mastering the video game, getting 100% in-game). The study of video games from a structural characteristic perspective is of benefit to many different stakeholders including academic researchers, video game players, and video game designers, as well as those interested in prevention and policymaking by making the games more socially responsible. It is important that researchers understand and recognize the psycho-social effects and impacts that the structural characteristics of video games can have on players, both positive and negative.
Kim, Jun Won; Han, Doug Hyun; Park, Doo Byung; Min, Kyung Joon; Na, Churl; Won, Su Kyung; Park, Ga Na
2010-03-01
Psychobiological traits may be associated with excessive Internet use. This study assessed the relationships between biogenetic traits, the amount of time spent playing online games, and the genre of the online game being played. Five hundred sixty-five students who enjoyed one of the four types of games included in this study were recruited. The types of games examined included role-playing games (RPG), real-time strategy games (RTS), first-person shooting games (FPS), and sports games. Behavioral patterns of game play, academic performance, and players' biogenetic characteristics were assessed. The amount of time that the participants spent playing online games was significantly greater on weekends than on weekdays. On weekends, the types of games with the largest numbers of participants who played for more than three hours were ranked as follows: RPG and FPS, RTS, and sports games. The Young's Internet Addiction Scale (YIAS) score for the RPG group was the highest among the four groups of game players. The time that participants spent playing games on weekdays was negatively associated with academic performance, especially for the RPG and FPS groups. Compared with the other groups, the RPG and RTS groups had higher novelty seeking (NS) scores and self-directedness (SD) scores, respectively. Additionally, the sports game group had higher reward dependence scores than the other groups. These results suggest that RPGs may have specific factors that are attractive to latent game addicts with higher NS scores. Additionally, excessive playing of online games is related to impaired academic performance.
Impact of Gamification of Vision Tests on the User Experience.
Bodduluri, Lakshmi; Boon, Mei Ying; Ryan, Malcolm; Dain, Stephen J
2017-08-01
Gamification has been incorporated into vision tests and vision therapies in the expectation that it may increase the user experience and engagement with the task. The current study aimed to understand how gamification affects the user experience, specifically during the undertaking of psychophysical tasks designed to estimate vision thresholds (chromatic and achromatic contrast sensitivity). Three tablet computer-based games were developed with three levels of gaming elements. Game 1 was designed to be a simple clinical test (no gaming elements), game 2 was similar to game 1 but with added gaming elements (i.e., feedback, scores, and sounds), and game 3 was a complete game. Participants (N = 144, age: 9.9-42 years) played the three games in random order. The user experience for each game was assessed using a Short Feedback Questionnaire. The median (interquartile range) fun level for the three games was 2.5 (1.6), 3.9 (1.7), and 2.5 (2.8), respectively. Overall, participants reported a greater fun level and higher preparedness to play the game again for game 2 than for games 1 and 3 (P < 0.05). There were significant positive correlations between fun level and preparedness to play the game again for all the games (P < 0.05). Engagement (assessed as completion rates) did not differ between the games. The gamified version (game 2) was preferred to the other two versions. Over the short term, the careful application of gaming elements to vision tests was found to increase the fun level of users without affecting engagement with the vision test.
Video game use in boys with autism spectrum disorder, ADHD, or typical development.
Mazurek, Micah O; Engelhardt, Christopher R
2013-08-01
The study objectives were to examine video game use in boys with autism spectrum disorder (ASD) compared with those with ADHD or typical development (TD) and to examine how specific symptoms and game features relate to problematic video game use across groups. Participants included parents of boys (aged 8-18) with ASD (n = 56), ADHD (n = 44), or TD (n = 41). Questionnaires assessed daily hours of video game use, in-room video game access, video game genres, problematic video game use, ASD symptoms, and ADHD symptoms. Boys with ASD spent more time than did boys with TD playing video games (2.1 vs 1.2 h/d). Both the ASD and ADHD groups had greater in-room video game access and greater problematic video game use than the TD group. Multivariate models showed that inattentive symptoms predicted problematic game use for both the ASD and ADHD groups; and preferences for role-playing games predicted problematic game use in the ASD group only. Boys with ASD spend much more time playing video games than do boys with TD, and boys with ASD and ADHD are at greater risk for problematic video game use than are boys with TD. Inattentive symptoms, in particular, were strongly associated with problematic video game use for both groups, and role-playing game preferences may be an additional risk factor for problematic video game use among children with ASD. These findings suggest a need for longitudinal research to better understand predictors and outcomes of video game use in children with ASD and ADHD.
The Association Between Video Game Play and Cognitive Function: Does Gaming Platform Matter?
Huang, Vivian; Young, Michaelia; Fiocco, Alexandra J
2017-11-01
Despite consumer growth, few studies have evaluated the cognitive effects of gaming using mobile devices. This study examined the association between video game play platform and cognitive performance. Furthermore, the differential effect of video game genre (action versus nonaction) was explored. Sixty undergraduate students completed a video game experience questionnaire, and we divided them into three groups: mobile video game players (MVGPs), console/computer video game players (CVGPs), and nonvideo game players (NVGPs). Participants completed a cognitive battery to assess executive function, learning, and memory. Controlling for sex and ethnicity, analyses showed that frequent video game play is associated with enhanced executive function, but not with learning and memory. MVGPs were significantly more accurate on working memory tasks than NVGPs. Frequent play on either platform was associated with enhanced cognitive function, suggesting that platform does not significantly determine the benefits of frequent video game play. Video game platform was, however, differentially associated with preference for the action video game genre and with motivation for gaming. Exploratory analyses showed that sex significantly affects frequent video game play, platform and genre preference, and cognitive function. This study represents a novel exploration of the relationship between mobile video game play and cognition and adds support to the cognitive benefits of frequent video game play.
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles
2017-06-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones, and many also conclude that decisions based on ensemble rather than deterministic forecasts lead to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the decision maker's level of risk aversion. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty, and consequently many different means of building an ensemble forecasting system for future streamflow. One possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is to use ensemble meteorological forecasts of precipitation and temperature, which are then provided as inputs to one or many hydrological models. In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in south-central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecast quality (relative to the corresponding record of observations) and in terms of economic value, using the newly proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast's reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
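To make the CARA framework above concrete, the sketch below scores a two-action (protect/wait) flood decision under an ensemble forecast. The exponential utility form is the standard CARA definition; the cost-loss setup, parameter values, and synthetic ensemble are illustrative assumptions, not details from the study.

```python
import numpy as np

def cara_utility(wealth, alpha):
    """CARA utility u(w) = (1 - exp(-alpha * w)) / alpha.
    alpha > 0 encodes risk aversion; alpha -> 0 recovers risk neutrality."""
    if alpha == 0:
        return wealth
    return (1.0 - np.exp(-alpha * wealth)) / alpha

def decide(ensemble_flows, threshold, cost_protect, loss_flood, alpha):
    """Compare expected CARA utility of protecting (pay a fixed cost, avoid the
    loss) vs. waiting, given an ensemble streamflow forecast. Hypothetical
    two-action cost-loss setup, not the study's actual decision model."""
    p_flood = np.mean(ensemble_flows > threshold)   # forecast flood probability
    u_protect = cara_utility(-cost_protect, alpha)  # outcome is certain
    u_wait = p_flood * cara_utility(-loss_flood, alpha)  # u(0) = 0 if no flood
    return "protect" if u_protect > u_wait else "wait"

# Synthetic 50-member ensemble (units arbitrary).
ens = np.random.default_rng(42).gamma(shape=4.0, scale=50.0, size=50)
print(decide(ens, threshold=350.0, cost_protect=1.0, loss_flood=10.0, alpha=0.5))
```

Because CARA weights large losses more heavily as alpha grows, a more risk-averse operator protects at a lower forecast flood probability, which is consistent with the paper's emphasis on reliability in the upper tail.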
Language Learners & Computer Games: From "Space Invaders" to "Second Life"
ERIC Educational Resources Information Center
Stanley, Graham; Mawer, Kyle
2008-01-01
The term serious game is often used to refer to "games used for training, advertising, simulation, or education." In this article, the authors use the term computer game in its broadest sense, believing it to encompass the broad spectrum of what is usually referred to now as all digital gaming (video games, console games, online games, etc.). They…
Games To Play with Babies. (Revised and Expanded).
ERIC Educational Resources Information Center
Silberg, Jackie
Aimed at parents with infants, this book is a collection of 250 simple, fun-filled games that can be played with children up to the age of 1. The games have been categorized into the following eight types: (1) growing and learning games; (2) special bonding games; (3) kitchen games; (4) laughing and having fun games; (5) art and singing games; (6)…
Cognitive behavioral game design: a unified model for designing serious games
Starks, Katryna
2014-01-01
Video games have a unique ability to engage, challenge, and motivate, which has led teachers, psychology specialists, political activists and health educators to find ways of using them to help people learn, grow and change. Serious games, as they are called, are defined as games that have a primary purpose other than entertainment. However, it is challenging to create games that both educate and entertain. While game designers have embraced some psychological concepts such as flow and mastery, understanding how these concepts work together within established psychological theory would assist them in creating effective serious games. Similarly, game design professionals have understood the propensity of video games to teach while lamenting that educators do not understand how to incorporate educational principles into game play in a way that preserves the entertainment. Bandura's (2006) social cognitive theory (SCT) has been used successfully to create video games that produce positive behavior outcomes, and teachers have successfully used Gardner's (1983) theory of multiple intelligences (MIs) to create engaging, immersive learning experiences. Cognitive behavioral game design is a new framework that incorporates SCT and MI with game design principles to create a game design blueprint for serious games. PMID:24550858
Hilgard, Joseph; Engelhardt, Christopher R.; Bartholow, Bruce D.
2013-01-01
A new measure of individual habits and preferences in video game use is developed in order to better study the risk factors of pathological game use (i.e., excessively frequent or prolonged use, sometimes called “game addiction”). This measure was distributed to internet message boards for game enthusiasts and to college undergraduates. An exploratory factor analysis identified 9 factors: Story, Violent Catharsis, Violent Reward, Social Interaction, Escapism, Loss-Sensitivity, Customization, Grinding, and Autonomy. These factors demonstrated excellent fit in a subsequent confirmatory factor analysis, and, importantly, were found to reliably discriminate between inter-individual game preferences (e.g., Super Mario Brothers as compared to Call of Duty). Moreover, three factors were significantly related to pathological game use: the use of games to escape daily life, the use of games as a social outlet, and positive attitudes toward the steady accumulation of in-game rewards. The current research identifies individual preferences and motives relevant to understanding video game players' evaluations of different games and risk factors for pathological video game use. PMID:24058355
A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data
NASA Astrophysics Data System (ADS)
Awajan, Ahmad Mohd; Ismail, Mohd Tahir
2017-08-01
Recently, forecasting time series has attracted considerable attention in the field of financial time series analysis, specifically for stock market indices. Stock market forecasting remains a challenging area of financial time series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to demonstrate the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to that of the traditional Holt-Winters forecasting method.
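A minimal sketch of the EMD-HW idea as described: decompose the series into intrinsic mode functions (IMFs), forecast each component with exponential smoothing, and sum the component forecasts. The PyEMD and statsmodels packages, the additive-trend configuration (seasonality omitted), and the synthetic series are assumptions; the paper's exact settings are not given in the abstract.

```python
import numpy as np
from PyEMD import EMD                                    # assumed EMD library
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def emd_hw_forecast(series, horizon):
    """Decompose the series with EMD, forecast each IMF (and the residue,
    which PyEMD returns as the last row) with additive-trend exponential
    smoothing, then sum the component forecasts."""
    imfs = EMD().emd(np.asarray(series, dtype=float))
    total = np.zeros(horizon)
    for component in imfs:
        fit = ExponentialSmoothing(component, trend="add").fit()
        total += fit.forecast(horizon)
    return total

# Synthetic daily "index" series: trend + cycle + noise.
t = np.arange(500)
index = 100 + 0.05 * t + 5 * np.sin(2 * np.pi * t / 60) \
        + np.random.default_rng(0).normal(0, 1, t.size)
print(emd_hw_forecast(index, horizon=10))
```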
NASA Astrophysics Data System (ADS)
Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter
2016-10-01
Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and may ultimately have a significant effect on the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, the quality of the input data, the time period and other factors. The input data are usually not deterministic but are often of a random nature, affected by uncertainties of the market environment and many other factors. By taking input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated in a case study model. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variables impact analysis.
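The abstract does not name the software tool used, so the following is only a hedged Monte Carlo sketch of the general approach: sample the uncertain inputs, propagate them to the target forecast value, and inspect the resulting distribution and input-output correlations. All variable names and distributions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Uncertain inputs (hypothetical names and distributions).
growth = rng.normal(loc=0.03, scale=0.01, size=N)                # annual demand growth
price = rng.triangular(left=9.0, mode=10.0, right=12.0, size=N)  # unit price

base_volume = 1_000.0
volume = base_volume * (1.0 + growth) ** 3    # volume three years ahead
revenue = volume * price                      # target forecast value

print(f"mean revenue forecast: {revenue.mean():,.0f}")
print("5th-95th percentiles:", np.percentile(revenue, [5, 95]).round(0))

# Crude sensitivity / variable-impact measure: input-output correlation.
for name, x in [("growth", growth), ("price", price)]:
    print(name, round(float(np.corrcoef(x, revenue)[0, 1]), 2))
```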
Addictive Online Games: Examining the Relationship Between Game Genres and Internet Gaming Disorder.
Lemmens, Jeroen S; Hendriks, Stefan J F
2016-04-01
Internet gaming disorder (IGD) is the most recent term used to describe problematic or pathological involvement with computer or video games. This study examined whether this disorder is more likely to involve pathological involvement with online (i.e., Internet) games as opposed to offline games. We also explored the addictive potential of nine video game genres by examining the relationship between IGD and 2,720 games played by a sample of 13- to 40-year-olds (N = 2,442). Although time spent playing both online and offline games was related to IGD, online games showed much stronger correlations. This tendency is also reflected within various genres. Disordered gamers spent more than four times as much time playing online role-playing games as nondisordered gamers and more than three times as much time playing online shooters, whereas no significant differences were found for offline games from these genres. Results are discussed within the frame of the social interaction and competition provided by online games.
Wang, Chong-Wen; Chan, Cecilia L W; Mak, Kwok-Kei; Ho, Sai-Yin; Wong, Paul W C; Ho, Rainbow T H
2014-01-01
This pilot study investigated the patterns of video and internet gaming habits and the prevalence and correlates of gaming addiction in Hong Kong adolescents. A total of 503 students were recruited from two secondary schools. Addictive behaviors of video and internet gaming were assessed using the Game Addiction Scale. Risk factors for gaming addiction were examined using logistic regression. An overwhelming majority of the subjects (94%) reported using video or internet games, with one in six (15.6%) identified as having a gaming addiction. The risk for gaming addiction was significantly higher among boys, those with poor academic performance, and those who preferred multiplayer online games. Gaming addiction was significantly associated with the average time spent gaming per week, frequency of spending money on gaming, period of spending money on gaming, perceived family disharmony, and having more close friends. These results suggest that effective educational and preventive programs or strategies are needed.
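As a hedged sketch only (the study's model specification is not given in the abstract), the snippet below fits a logistic regression of addiction status on synthetic stand-ins for three of the named risk factors and reports odds ratios.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 503

# Synthetic stand-ins for three named risk factors (all values hypothetical).
male = rng.integers(0, 2, n)
poor_grades = rng.integers(0, 2, n)
multiplayer = rng.integers(0, 2, n)
linpred = -2.5 + 0.8 * male + 0.6 * poor_grades + 0.9 * multiplayer
addicted = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

X = sm.add_constant(np.column_stack([male, poor_grades, multiplayer]))
fit = sm.Logit(addicted, X).fit(disp=0)
print(np.exp(fit.params[1:]).round(2))   # odds ratios for the three factors
```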
Videogame Mechanics in Games for Health.
Baranowski, Tom (moderator); Lieberman, Debra; Buday, Richard; Peng, Wei; Zimmerli, Lukas; Wiederhold, Brenda; Kato, Pamela M (participants)
2013-08-01
Game mechanics have been identified as "methods invoked by agents for interacting with the game world."(1) They are elements of game play that provide a primary source of interactivity and structure how videogames proceed. Many kinds of game mechanics have been generated. Some provide fun or enjoyment, others may provide excitement or even suspense (i.e., emotional aspects of game play), whereas some are guided by principles of behavior change theory. Game mechanics that succeed in off-the-shelf entertainment videogames, however, may not work in serious games, such as games for health (G4H). Although game mechanics are key to a serious videogame's success, balancing the serious intent of the game with keeping it fun, there has been little attention to game mechanics in the academic G4H literature. Several eminent games-for-health developers (academics and nonacademics) were asked to share their experiences with game mechanics in the serious videogames they have developed.
NASA Astrophysics Data System (ADS)
Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.
2002-12-01
We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute for Climate Prediction (IRI). Three-month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at sites in Florida and New York. Forecast skill was assessed for mean-area modeled water table depth (WTD), i.e., near-surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e., drought) and the amplification and transmission of St. Louis encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecasting of drought and resultant SLEV transmission in Florida.
NASA Astrophysics Data System (ADS)
Shafiee-Jood, M.; Cai, X.
2017-12-01
Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this huge potential, previous studies have found that water resources managers are often not willing to incorporate streamflow forecast information into decision making, particularly in risky situations. While low forecast accuracy is often cited as the main reason, some studies have found that the implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again. However, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component which incorporates both risk perception and perceived forecast reliability. The framework is then used in a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts of different reliabilities. The framework will allow us to explore the interactions between risk perception and perceived forecast reliability, and between the behavioral components and information accuracy. The findings will provide insights to improve the usability of flood forecast information through better communication and education.
Parametric decadal climate forecast recalibration (DeFoReSt 1.0)
NASA Astrophysics Data System (ADS)
Pasternack, Alexander; Bhend, Jonas; Liniger, Mark A.; Rust, Henning W.; Müller, Wolfgang A.; Ulbrich, Uwe
2018-01-01
Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts, including their long time horizon, lead-time-dependent systematic errors (drift), and errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes to describe forecast uncertainty and a relatively short period for which pairs of reforecasts and observations are typically available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrating decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast-observation pairs, we demonstrate the positive effect on forecast quality in situations with pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.
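DeFoReSt itself is parametric and not reproduced here, but the score it optimizes is standard. The sketch below computes the usual ensemble estimator of the CRPS, CRPS = E|X - y| - 0.5 E|X - X'|; the example numbers are made up.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Ensemble CRPS against a scalar observation:
    CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j| (lower is better)."""
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

ens = np.array([14.2, 15.1, 13.8, 16.0, 15.5])  # e.g., forecast temperatures (degC)
print(round(crps_ensemble(ens, obs=15.3), 3))
```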
The effects of pathological gaming on aggressive behavior.
Lemmens, Jeroen S; Valkenburg, Patti M; Peter, Jochen
2011-01-01
Studies have shown that pathological involvement with computer or video games is related to excessive gaming binges and aggressive behavior. Our aims for this study were to longitudinally examine if pathological gaming leads to increasingly excessive gaming habits, and how pathological gaming may cause an increase in physical aggression. For this purpose, we conducted a two-wave panel study among 851 Dutch adolescents (49% female) of which 540 played games (30% female). Our analyses indicated that higher levels of pathological gaming predicted an increase in time spent playing games 6 months later. Time spent playing violent games specifically, and not just games per se, increased physical aggression. Furthermore, higher levels of pathological gaming, regardless of violent content, predicted an increase in physical aggression among boys. That this effect only applies to boys does not diminish its importance, because adolescent boys are generally the heaviest players of violent games and most susceptible to pathological involvement.
Iyer, Swami; Reyes, Joshua; Killingback, Timothy
2014-01-01
The Traveler's Dilemma game and the Minimum Effort Coordination game are two social dilemmas that have attracted considerable attention due to the fact that the predictions of classical game theory are at odds with the results found when the games are studied experimentally. Moreover, a direct application of deterministic evolutionary game theory, as embodied in the replicator dynamics, to these games does not explain the observed behavior. In this work, we formulate natural variants of these two games as smoothed continuous-strategy games. We study the evolutionary dynamics of these continuous-strategy games, both analytically and through agent-based simulations, and show that the behavior predicted theoretically is in accord with that observed experimentally. Thus, these variants of the Traveler's Dilemma and the Minimum Effort Coordination games provide a simple resolution of the paradoxical behavior associated with the original games. PMID:24709851
40 CFR Appendix A to Part 57 - Primary Nonferrous Smelter Order (NSO) Application
Code of Federal Regulations, 2011 CFR
2011-07-01
... Profit and Loss Summary; A.4 Historical Capital Investment Summary; B.1 Pre-Control Revenue Forecast; B.2 Pre-Control Cost Forecast; B.3 Pre-Control Forecast Profit and Loss Summary; B.4 Constant Controls Revenue Forecast; B.5 Constant Controls Cost Forecast; B.6 Constant Controls Forecast Profit and Loss...
ERIC Educational Resources Information Center
Sonnenberg, William; And Others
The Second Annual Federal Forecasters Conference, "Forecasting and Public Policy", provided a forum where forecasters from various Federal agencies could meet and discuss aspects of forecasting in the U.S. Government. A total of 140 forecasters from 42 Federal agencies and other organizations attended the conference. Opening remarks by…
ERIC Educational Resources Information Center
Gerald, Debra E., Ed.
The Ninth Federal Forecasters Conference provided a forum in which forecasters from different federal agencies and other organizations could meet to discuss various aspects of forecasting in the United States. The theme was "Forecasting in an Era of Diminishing Resources." The conference was attended by 150 forecasters. A keynote address by…
ERIC Educational Resources Information Center
Silvern, Steven B.
1986-01-01
Video games may allow educators to reach children who have never before been incorporated into an educational experience within the confines of school. A distinction is made between types of arcade style games and educational games in Piagetian terms: practice games, symbolic games, games with rules, and games of construction. (LMO)
GloFAS-Seasonal: Operational Seasonal Ensemble River Flow Forecasts at the Global Scale
NASA Astrophysics Data System (ADS)
Emerton, Rebecca; Zsoter, Ervin; Smith, Paul; Salamon, Peter
2017-04-01
Seasonal hydrological forecasting has potential benefits for many sectors, including agriculture, water resources management and humanitarian aid. At present, no global scale seasonal hydrological forecasting system exists operationally; although smaller scale systems have begun to emerge around the globe over the past decade, a system providing consistent global scale seasonal forecasts would be of great benefit in regions where no other forecasting system exists, and to organisations operating at the global scale, such as disaster relief. We present here a new operational global ensemble seasonal hydrological forecast, currently under development at ECMWF as part of the Global Flood Awareness System (GloFAS). The proposed system, which builds upon the current version of GloFAS, takes the long-range forecasts from the ECMWF System4 ensemble seasonal forecast system (which incorporates the HTESSEL land surface scheme) and uses this runoff as input to the Lisflood routing model, producing a seasonal river flow forecast out to 4 months lead time, for the global river network. The seasonal forecasts will be evaluated using the global river discharge reanalysis, and observations where available, to determine the potential value of the forecasts across the globe. The seasonal forecasts will be presented as a new layer in the GloFAS interface, which will provide a global map of river catchments, indicating whether the catchment-averaged discharge forecast is showing abnormally high or low flows during the 4-month lead time. Each catchment will display the corresponding forecast as an ensemble hydrograph of the weekly-averaged discharge forecast out to 4 months, with percentile thresholds shown for comparison with the discharge climatology. The forecast visualisation is based on a combination of the current medium-range GloFAS forecasts and the operational EFAS (European Flood Awareness System) seasonal outlook, and aims to effectively communicate the nature of a seasonal outlook while providing useful information to users and partners. We demonstrate the first version of an operational GloFAS seasonal outlook, outlining the model set-up and presenting a first look at the seasonal forecasts that will be displayed in the GloFAS interface, and discuss the initial results of the forecast evaluation.
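A hedged sketch of the kind of climatology-percentile flagging the interface described above might use; the 10th/90th percentile thresholds, category labels, and synthetic climatology are illustrative assumptions, not operational GloFAS settings.

```python
import numpy as np

def flag_flow(ensemble_weekly_mean, climatology):
    """Label a catchment forecast relative to its discharge climatology,
    using illustrative 10th/90th percentile thresholds."""
    lo, hi = np.percentile(climatology, [10, 90])
    med = np.median(ensemble_weekly_mean)
    if med < lo:
        return "abnormally low"
    if med > hi:
        return "abnormally high"
    return "near normal"

# Synthetic weekly-mean climatology (20 years) and one ensemble forecast.
clim = np.random.default_rng(3).gamma(shape=2.0, scale=100.0, size=20 * 52)
print(flag_flow([95.0, 120.0, 110.0], clim))
```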
NASA Astrophysics Data System (ADS)
Murray, S.; Guerra, J. A.
2017-12-01
One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather forecasting to improve current techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions and obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, but has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability, according to the performance metric of their choice. In this way different forecasts can be tailored to different end-user needs.
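A minimal sketch of the linear-combination step described: each method's full-disk probability is weighted by its past performance, with weights normalized to sum to one. The inverse-Brier-score weighting shown is one plausible choice, not necessarily the weighting used in the paper.

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    return float(np.mean((np.asarray(probs) - np.asarray(outcomes)) ** 2))

def ensemble_probability(todays_probs, history):
    """Combine member probabilities P_i into sum_i w_i * P_i, with each method
    weighted by its inverse historical Brier score (weights sum to one)."""
    skill = np.array([1.0 / brier_score(p, o) for p, o in history])
    w = skill / skill.sum()
    return float(np.dot(w, todays_probs))

# Three hypothetical methods: (past probabilities, observed flare outcomes).
history = [([0.2, 0.7, 0.1], [0, 1, 0]),
           ([0.5, 0.5, 0.5], [0, 1, 0]),
           ([0.1, 0.9, 0.2], [0, 1, 0])]
print(round(ensemble_probability([0.30, 0.60, 0.25], history), 3))
```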
Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility
NASA Astrophysics Data System (ADS)
Tuba, Zoltán; Bottyán, Zsolt
2018-04-01
Forecasting visibility is one of the greatest challenges in aviation meteorology. At the same time, high-accuracy visibility forecasts can significantly reduce, or even help avoid, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by applying the Analytic Hierarchy Process. A linear combination of the two outputs was then used to create an ultra-short-term hybrid visibility prediction that gradually shifts the focus from statistical to numerical products, taking advantage of each during the forecast period. This brings the numerical visibility forecast closer to the observations even when it is initially wrong. Complete verification of the categorical forecasts was carried out; results for persistence and terminal aerodrome forecasts (TAF) are included for comparison. The average Heidke Skill Score (HSS) across the examined airports is very similar for analogue and hybrid forecasts, even at the end of the forecast period where the weight of the analogue prediction in the final hybrid output is only 0.1-0.2. However, for poor visibility (1000-2500 m), hybrid (0.65) and analogue forecasts (0.64) have similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes the physics and dynamics of the atmosphere into account through the increasing share of the numerical weather prediction. Despite this, its performance is similar to that of the most effective visibility forecasting methods and does not inherit the poor verification results of purely numerical outputs.
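For reference, the Heidke Skill Score used in this verification can be computed from a standard 2x2 contingency table as below; the counts shown are made up for illustration.

```python
def heidke_skill_score(a, b, c, d):
    """HSS from a 2x2 contingency table:
    a = hits, b = false alarms, c = misses, d = correct rejections."""
    return 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))

# Illustrative counts for a poor-visibility (1000-2500 m) category.
print(round(heidke_skill_score(a=42, b=18, c=15, d=325), 2))
```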
76 FR 39185 - 2011-2012 Refuge-Specific Hunting and Sport Fishing Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... that pertain to migratory game bird hunting, upland game hunting, big game hunting, and sport fishing... wildlife refuges to migratory game bird hunting, upland game hunting, big game hunting, or sport fishing.... Table 1--Changes for 2011-2012 Hunting/Fishing Season Migratory bird Upland game National wildlife...
Meta-Games in Information Work
ERIC Educational Resources Information Center
Huvila, Isto
2013-01-01
Introduction: Meta-games and meta-gaming refer to various second-order conceptions of games and gaming. The present article discusses the applicability of the notions of meta-game and meta-gaming in understanding the patterns of how people use, misuse, work and work-around information and information infrastructures. Method: Twenty-two qualitative…
WPC Maximum Heat Index Forecasts
Active and non-active video gaming among Dutch adolescents: who plays and how much?
Simons, Monique; de Vet, Emely; Brug, Johannes; Seidell, Jaap; Chinapaw, Mai J M
2014-11-01
The aim of this study was to determine the prevalence and identify demographic correlates of active and non-active gaming among adolescents. Cross-sectional. A survey assessing game behavior and correlates was conducted among adolescents (12-16 years, n = 373) recruited via schools. Multivariable logistic regression analyses were conducted to examine demographic correlates of active gaming (≥ 1 h per week) and non-active gaming (> 7 h per week). Of all participants, 3% reported playing exclusively active games, 40% both active and non-active games, 40% exclusively non-active games, and 17% not playing video games at all. Active gaming adolescents played active games on average on 1.5 (sd = 1.2) days per school week for 36 (sd = 32.9) min and on 1 (sd = 0.54) day per weekend for 42 (sd = 36.5) min. Non-active gaming adolescents played on average on 3.3 (sd = 1.6) days per school week for 65 (sd = 46.0) min and on 1.4 (sd = 0.65) days per weekend for 80 (sd = 50.8) min. Adolescents attending lower levels of education were more likely to play active games ≥ 1 h per week than adolescents attending higher educational levels. Boys and older adolescents were more likely than girls or younger adolescents to play non-active games > 7 h per week. Many adolescents play active games, especially those following a lower educational level, but time spent on this activity is relatively low compared with non-active gaming. To be feasible as a public health strategy, active gaming interventions should aim to increase the time spent on active gaming at the expense of non-active gaming.
Serious Games and Gamification for Mental Health: Current Status and Promising Directions.
Fleming, Theresa M; Bavin, Lynda; Stasiak, Karolina; Hermansson-Webb, Eve; Merry, Sally N; Cheek, Colleen; Lucassen, Mathijs; Lau, Ho Ming; Pollmuller, Britta; Hetrick, Sarah
2016-01-01
Computer games are ubiquitous and can be utilized for serious purposes such as health and education. "Applied games" including serious games (in brief, computerized games for serious purposes) and gamification (gaming elements used outside of games) have the potential to increase the impact of mental health internet interventions via three processes. First, by extending the reach of online programs to those who might not otherwise use them. Second, by improving engagement through both game-based and "serious" motivational dynamics. Third, by utilizing varied mechanisms for change, including therapeutic processes and gaming features. In this scoping review, we aim to advance the field by exploring the potential and opportunities available in this area. We review engagement factors which may be exploited and demonstrate that there is promising evidence of effectiveness for serious games for depression from contemporary systematic reviews. We illustrate six major categories of tested applied games for mental health (exergames, virtual reality, cognitive behavior therapy-based games, entertainment games, biofeedback, and cognitive training games) and demonstrate that it is feasible to translate traditional evidence-based interventions into computer gaming formats and to exploit features of computer games for therapeutic change. Applied games have considerable potential for increasing the impact of online interventions for mental health. However, there are few independent trials, and direct comparisons of game-based and non-game-based interventions are lacking. Further research, faster iterations, rapid testing, non-traditional collaborations, and user-centered approaches are needed to respond to diverse user needs and preferences in rapidly changing environments.
Desai, Rani A; Krishnan-Sarin, Suchitra; Cavallo, Dana; Potenza, Marc N
2010-12-01
Video game playing may negatively impact youth. However, the existing literature on gaming is inconsistent and has often focused on aggression rather than on the health correlates of gaming and the prevalence and correlates of problematic gaming. We anonymously surveyed 4028 adolescents about gaming, reported problems with gaming, and other health behaviors. A total of 51.2% of the sample reported gaming (76.3% of boys and 29.2% of girls). In boys, there were no negative health correlates of gaming, and gaming was associated with lower odds of regular smoking; girls who reported gaming, however, were less likely to report depression and more likely to report getting into serious fights and carrying a weapon to school. Among gamers, 4.9% reported problematic gaming, defined as reporting trying to cut back, experiencing an irresistible urge to play, and experiencing a growing tension that could only be relieved by playing. Boys were more likely to report these problems (5.8%) than girls (3.0%). Correlates of problematic gaming included regular cigarette smoking, drug use, depression, and serious fights. Results suggest that gaming is largely normative in boys and not associated with many health factors. In girls, however, gaming seems to be associated with more externalizing behaviors and fewer internalizing symptoms. The prevalence of problematic gaming is low but not insignificant, and problematic gaming may be contained within a larger spectrum of externalizing behaviors. More research is needed to define safe levels of gaming, refine the definition of problematic gaming, and evaluate effective prevention and intervention strategies.
Yoo, Seung-Chul; Peña, Jorge
2011-01-01
The present study examined whether a violent video game impairs the effectiveness of in-game advertisements compared to a nonviolent video game. Participants recalled and evaluated in-game ads after navigating identical violent or nonviolent game scenarios. Participants' brand recall, recognition, and attitudes were comparatively lower after navigating the violent video game. Also, females in the violent game condition reported lower brand attitudes than males in the violent game condition, suggesting that the effect of the gaming environment interacts with participants' gender. The findings supported the predictions of the limited capacity model of attention and of cognitive priming effects. The results also extend previous studies on how violent media impair advertising effectiveness and provide practical implications for researchers and practitioners.
Structure coefficients and strategy selection in multiplayer games.
McAvoy, Alex; Hauert, Christoph
2016-01-01
Evolutionary processes based on two-player games such as the Prisoner's Dilemma or Snowdrift Game are abundant in evolutionary game theory. These processes, including those based on games with more than two strategies, have been studied extensively under the assumption that selection is weak. However, games involving more than two players have not received the same level of attention. To address this issue, and to relate two-player games to multiplayer games, we introduce a notion of reducibility for multiplayer games that captures what it means to break down a multiplayer game into a sequence of interactions with fewer players. We discuss the role of reducibility in structured populations, and we give examples of games that are irreducible in any population structure. Since the known conditions for strategy selection, otherwise known as sigma-rules, have been established only for two-player games with multiple strategies and for multiplayer games with two strategies, we extend these rules to multiplayer games with many strategies to account for irreducible games that cannot be reduced to those simpler types of games. In particular, we show that the number of structure coefficients required for a symmetric game with d-player interactions and n strategies grows in d like d^(n-1). Our results also cover a type of ecologically asymmetric game based on payoff values that are derived not only from the strategies of the players, but also from their spatial positions within the population.
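For orientation, the familiar special case that this paper generalizes is the two-player, two-strategy sigma-rule of Tarnita et al. (2009), which involves a single structure coefficient (stated here as background, not as a result of the paper):

```latex
% Two-player game between strategies A and B with payoff matrix
%        A    B
%   A  ( a    b )
%   B  ( c    d )
% Under weak selection, population structure enters through a single
% structure coefficient sigma, and strategy A is favored over B iff
\[ \sigma a + b > c + \sigma d. \]
```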
Allsop, Susan; Dodd-Reynolds, Caroline J; Green, Benjamin P; Debuse, Dorothée; Rumbold, Penny L S
2015-12-28
The present study examined the acute effects of active gaming on energy intake (EI) and appetite responses in 8-11-year-old boys in a school-based setting. Using a randomised cross-over design, twenty-one boys completed four individual 90-min gaming bouts, each separated by 1 week. The gaming bouts were (1) seated gaming, no food or drink; (2) active gaming, no food or drink; (3) seated gaming with food and drink offered ad libitum; and (4) active gaming with food and drink offered ad libitum. In the two gaming bouts during which foods and drinks were offered, EI was measured. Appetite sensations - hunger, prospective food consumption and fullness - were recorded using visual analogue scales during all gaming bouts at 30-min intervals and at two 15-min intervals post gaming. In the two bouts with food and drink, no significant differences were found in acute EI (MJ) (P = 0.238). Significant differences were detected in appetite sensations for hunger, prospective food consumption and fullness between the four gaming bouts at various time points. The relative EI calculated for the two gaming bouts with food and drink (active gaming 1.42 (sem 0.28) MJ; seated gaming 2.12 (sem 0.25) MJ) was not statistically different. Acute EI in response to active gaming was no different from seated gaming, and appetite sensations were influenced by whether food was made available during the 90-min gaming bouts.
A systematic review of serious video games used for vaccination.
Ohannessian, Robin; Yaghobian, Sarina; Verger, Pierre; Vanhems, Philippe
2016-08-31
Vaccination is an effective and proven method of preventing infectious diseases. However, uptake of available vaccines has not been optimal, partly due to vaccination hesitancy. Various public health approaches have addressed vaccination hesitancy. Serious video games involving vaccination may represent an innovative public health approach. The aim of this study was to identify, describe, and review existing serious video games on vaccination. A systematic review was performed. Various databases were used to find data on vaccination-related serious video games published from January 1st 2000 to May 15th 2015. Data including featured medical and vaccination content, publication characteristics and game classification were collected for each identified serious game. Sixteen serious video games involving vaccination were identified. All games were developed in high-income countries between 2003 and 2014. The majority of games were available online and were sponsored by educational/health institutions. All games were free of charge to users. Edugame was the most prevalent serious game subcategory. Twelve games were infectious disease-specific, and the majority concerned influenza. The main objective of the games was disease control with a collective perspective. Utilization data were available for two games. Two games were formally evaluated. The use of serious video games for vaccination is an innovative tool for public health. Evaluation of vaccination-related serious video games should be encouraged to demonstrate their efficacy and utility.
Characteristics of Internet use in relation to game genre in Korean adolescents.
Lee, Moon-Soo; Ko, Young-Hoon; Song, Hyoung-Seok; Kwon, Ku-Hyung; Lee, Hyeon-Soo; Nam, Min; Jung, In-Kwa
2007-04-01
As the number of internet users increases, a new game genre using the internet as a networking tool is emerging. Some game genres are regarded as having greater addiction potential than others, and games and the internet are closely related. We investigated games frequently used by adolescents and classified each of them with the help of game professionals. We also examined internet use patterns to identify relationships between game genre and internet use patterns. A total of 627 middle school and high school students (488 male, 139 female) completed questionnaires concerning computer and game use patterns and Korean internet addiction scales. Game genres were divided into eight categories (simulation, role playing game, web board, community, action, adventure, shooting, and sports). Using the Korean internet addiction scales, the 627 participants were divided into a normal group (474), a potential risk group (128), and a high-risk group (25). The groups showed significant differences in total internet addiction scores. We classified players by the game type they most preferred. Role playing game users showed significantly higher internet addiction scores than web board and sports game users. Game and internet addictions are also connected with interpersonal relationship patterns. We suggest that users of some game genres have unique psychological addiction potentials that differ from others, and that this influences both game selection and internet use.
ERIC Educational Resources Information Center
Smith, Curtis A.
"EnrollForecast for Excel" will generate a 5-year forecast of K-12 student enrollment. It will also work for any combination of grades between kindergarten and twelth. The forecasts can be printed as either a table or a graph. The user must provide birth history (only if forecasting kindergarten) and enrollment history information. The user also…
On the reliability of seasonal climate forecasts.
Weisheimer, A; Palmer, T N
2014-07-06
Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1-5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of 'goodness' rankings is found, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly). Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years' time.
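As a hedged illustration of the reliability notion discussed here (not the paper's verification code): bin forecasts by stated probability and compare each bin's mean forecast probability with the observed event frequency. The bin count and synthetic data are assumptions.

```python
import numpy as np

def reliability_table(probs, outcomes, n_bins=5):
    """Bin forecasts by stated probability; within each bin compare the mean
    forecast probability with the observed event frequency."""
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for k in range(n_bins):
        hi = edges[k + 1] if k < n_bins - 1 else 1.0 + 1e-9  # include p = 1
        sel = (probs >= edges[k]) & (probs < hi)
        if sel.any():
            print(f"forecast {probs[sel].mean():.2f} -> observed {outcomes[sel].mean():.2f}")

# Synthetic, perfectly reliable forecasts: the event occurs with stated probability.
rng = np.random.default_rng(7)
p = rng.random(10_000)
y = rng.random(10_000) < p
reliability_table(p, y)
```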
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps; the multiple forecasting steps involved either forecasting software size or producing an additional effort forecast. Virtually no subjects used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
Utilizing Climate Forecasts for Improving Water and Power Systems Coordination
NASA Astrophysics Data System (ADS)
Arumugam, S.; Queiroz, A.; Patskoski, J.; Mahinthakumar, K.; DeCarolis, J.
2016-12-01
Climate forecasts, typically monthly-to-seasonal precipitation forecasts, are commonly used to develop streamflow forecasts for improving reservoir management. Temperature forecasts, despite their high skill, are not often used to develop power demand forecasts alongside streamflow forecasts for improving water and power systems coordination. In this study, we use a prototype system to analyze the utility of climate forecasts, both precipitation and temperature, for improving water and power systems coordination. The prototype system, a unit-commitment model that schedules power generation from various sources, is compared with an energy system model having an equivalent reservoir representation. Streamflow and power demand forecasts of different skill are forced on both the water and power systems representations to understand the level of model complexity required to utilize monthly-to-seasonal climate forecasts for improving coordination between the two systems. The analyses also identify various decision-making strategies - forward purchasing of fuel stocks, scheduled maintenance of various power systems, and tradeoffs in water appropriation between hydropower and other uses - in the context of various water and power systems configurations. The potential application of such analyses for integrating large power systems with multiple river basins is also discussed.