Sample records for multi-dimensional bounded forecasting

  1. GENERAL: Scattering Phase Correction for Semiclassical Quantization Rules in Multi-Dimensional Quantum Systems

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Min; Mou, Chung-Yu; Chang, Cheng-Hung

    2010-02-01

    While the scattering phase for several one-dimensional potentials can be exactly derived, less is known in multi-dimensional quantum systems. This work provides a method to extend the one-dimensional phase knowledge to multi-dimensional quantization rules. The extension is illustrated in the example of Bogomolny's transfer operator method applied to two quantum wells bounded by step potentials of different heights. This generalized semiclassical method accurately determines the energy spectrum of the systems, which indicates the substantial role of the proposed phase correction. Theoretically, the result can be extended to other semiclassical methods, such as the Gutzwiller trace formula, dynamical zeta functions, and the semiclassical Landauer-Büttiker formula. In practice, this recipe enhances the applicability of semiclassical methods to multi-dimensional quantum systems bounded by general soft potentials.

  2. Low-dimensional Representation of Error Covariance

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan

    2000-01-01

    Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
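
    As a rough illustration of the record above, the sketch below (not from the paper) carries a linear Kalman filter forecast/analysis covariance cycle to steady state and extracts the leading eigenvectors of the analysis error covariance, the quantities the bound matrix is meant to approximate. The dynamics M, observation operator H and noise covariances Q and R are toy assumptions.

    ```python
    # Hedged sketch (not from the paper): iterate a linear Kalman filter
    # covariance recursion to steady state and inspect the leading eigenvectors
    # of the analysis error covariance. M, H, Q and R are toy assumptions.
    import numpy as np

    n, p = 20, 5
    rng = np.random.default_rng(1)
    M = 0.95 * np.linalg.qr(rng.normal(size=(n, n)))[0]   # stable toy dynamics
    H = np.eye(p, n)                                      # observe first p components
    Q = 0.01 * np.eye(n)                                  # model error covariance
    R = 0.1 * np.eye(p)                                   # observation error covariance

    Pa = np.eye(n)
    for _ in range(500):                                  # forecast/analysis cycle
        Pf = M @ Pa @ M.T + Q
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
        Pa = (np.eye(n) - K @ H) @ Pf

    evals, evecs = np.linalg.eigh(Pa)                     # leading modes of Pa
    print("largest steady-state analysis error variances:", evals[::-1][:3])
    ```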

  3. Urban air quality forecasting based on multi-dimensional collaborative Support Vector Regression (SVR): A case study of Beijing-Tianjin-Shijiazhuang

    PubMed Central

    Liu, Bing-Chun; Binaykia, Arihant; Chang, Pei-Chann; Tiwari, Manoj Kumar; Tsao, Cheng-Chin

    2017-01-01

    Today, China faces a very serious air pollution problem, with dreadful impacts on human health as well as the environment. The urban cities in China are the most affected due to their rapid industrial and economic growth. Therefore, it is of extreme importance to come up with new, better and more reliable forecasting models to accurately predict air quality. This paper selected Beijing, Tianjin and Shijiazhuang, three cities from the Jingjinji Region, to develop a new collaborative forecasting model using Support Vector Regression (SVR) for Urban Air Quality Index (AQI) prediction in China. The study aims to improve forecasting results by minimizing the prediction error of present machine learning algorithms, taking multi-city, multi-dimensional air quality information and weather conditions as input. The results show a decrease in MAPE for multi-city, multi-dimensional regression when there is a strong interaction and correlation of the air quality characteristic attributes with AQI. Geographical location is also found to play a significant role in Beijing, Tianjin and Shijiazhuang AQI prediction. PMID:28708836
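
    A minimal sketch of the kind of multi-city, multi-dimensional SVR forecast described above, using scikit-learn; the synthetic features standing in for pollutant and weather attributes of three cities, and the kernel and hyperparameters, are illustrative assumptions rather than the paper's setup.

    ```python
    # Hedged sketch: multi-city, multi-dimensional SVR forecast of next-day AQI.
    # Feature layout and synthetic data are illustrative assumptions.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import mean_absolute_percentage_error

    rng = np.random.default_rng(0)
    n = 365
    # Stand-in predictors: pollutant and weather attributes from three cities.
    X = rng.normal(size=(n, 9))            # e.g. PM2.5, SO2, wind speed per city
    y = 80 + X @ rng.normal(size=9) + rng.normal(scale=5, size=n)  # target AQI

    split = 300
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    print("MAPE:", mean_absolute_percentage_error(y[split:], pred))
    ```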

  4. Model Error Estimation for the CPTEC Eta Model

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; daSilva, Arlindo

    1999-01-01

    Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here, with a number of simplifying assumptions, a framework is developed for isolating the model error given the forecast error at two lead times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.

  5. Fuzzy Regression Prediction and Application Based on Multi-Dimensional Factors of Freight Volume

    NASA Astrophysics Data System (ADS)

    Xiao, Mengting; Li, Cheng

    2018-01-01

    Based on the actual development of air cargo, a multi-dimensional fuzzy regression method is used to identify the influencing factors; the three most important are GDP, total fixed-asset investment and regular flight route mileage. Combining a systems viewpoint with analogy methods, fuzzy numbers and multiple regression are then used to predict civil aviation cargo volume. In comparison with the 13th Five-Year Plan for China's Civil Aviation Development (2016-2020), it is shown that this method can effectively improve forecasting accuracy and reduce forecasting risk, demonstrating that the model is feasible for predicting civil aviation freight volume and has high practical significance and operability.

  6. An ensemble-ANFIS based uncertainty assessment model for forecasting multi-scalar standardized precipitation index

    NASA Astrophysics Data System (ADS)

    Ali, Mumtaz; Deo, Ravinesh C.; Downs, Nathan J.; Maraseni, Tek

    2018-07-01

    Forecasting drought by means of the World Meteorological Organization-approved Standardized Precipitation Index (SPI) is considered a fundamental task to support socio-economic initiatives and to effectively mitigate climate risk. This study aims to develop a robust drought modelling strategy to forecast multi-scalar SPI in drought-rich regions of Pakistan, where statistically significant lagged combinations of antecedent SPI are used to forecast future SPI. With an ensemble-Adaptive Neuro Fuzzy Inference System ('ensemble-ANFIS') executed via a 10-fold cross-validation procedure, a model is constructed from randomly partitioned input-target data. This yields 10-member ensemble-ANFIS outputs, judged by mean square error and correlation coefficient in the training period; the optimal forecasts are attained by averaging the simulations, and the model is benchmarked against the M5 Model Tree and Minimax Probability Machine Regression (MPMR). The results show that the proposed ensemble-ANFIS model's accuracy was notably better (in terms of the root mean square and mean absolute error, including Willmott's, Nash-Sutcliffe and Legates-McCabe's indices) for the 6- and 12-month compared to the 3-month forecasts, as verified by the largest proportion of errors registering in the smallest error band. Applying the 10-member simulations, the ensemble-ANFIS model was validated for its ability to forecast severity (S), duration (D) and intensity (I) of drought (including the error bound). This enabled uncertainty between multi-models to be rationalized more efficiently, leading to a reduction in forecast error caused by stochasticity in drought behaviours. Through cross-validations at diverse sites, a geographic signature in modelled uncertainties was also calculated. Considering the superiority of the ensemble-ANFIS approach and its ability to generate uncertainty-based information, the study advocates the versatility of a multi-model approach for drought-risk forecasting and its prime importance for estimating drought properties over confidence intervals to generate better information for strategic decision-making.
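
    The ensemble construction can be sketched as below; since ANFIS is not implemented here, a gradient-boosting regressor from scikit-learn stands in for the member model, and the lagged-SPI inputs are synthetic. Only the 10-fold partitioning and member averaging follow the description above.

    ```python
    # Hedged sketch of the ensemble idea: train ten members on random partitions
    # of the input-target data and average their SPI forecasts. A gradient-
    # boosting regressor stands in for ANFIS, which is not implemented here.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import KFold
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(9)
    n = 600
    X = np.column_stack([rng.normal(size=n) for _ in range(3)])   # lagged SPI inputs (toy)
    y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.normal(size=n)  # future SPI (toy)

    X_train, y_train, X_test, y_test = X[:500], y[:500], X[500:], y[500:]
    members = []
    for train_idx, _ in KFold(n_splits=10, shuffle=True, random_state=0).split(X_train):
        m = GradientBoostingRegressor().fit(X_train[train_idx], y_train[train_idx])
        members.append(m.predict(X_test))

    ensemble_mean = np.mean(members, axis=0)       # averaged 10-member forecast
    print("MAE of averaged forecast:", mean_absolute_error(y_test, ensemble_mean))
    ```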

  7. Shock/vortex interaction and vortex-breakdown modes

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.; Kandil, H. A.; Liu, C. H.

    1992-01-01

    Computational simulation and study of shock/vortex interaction and vortex-breakdown modes are considered for bound (internal) and unbound (external) flow domains. The problem is formulated using the unsteady, compressible, full Navier-Stokes (NS) equations which are solved using an implicit, flux-difference splitting, finite-volume scheme. For the bound flow domain, a supersonic swirling flow is considered in a configured circular duct and the problem is solved for quasi-axisymmetric and three-dimensional flows. For the unbound domain, a supersonic swirling flow issued from a nozzle into a uniform supersonic flow of lower Mach number is considered for quasi-axisymmetric and three-dimensional flows. The results show several modes of breakdown; e.g., no-breakdown, transient single-bubble breakdown, transient multi-bubble breakdown, periodic multi-bubble multi-frequency breakdown and helical breakdown.

  8. Evaluating the performance of real-time streamflow forecasting using multi-satellite precipitation products in the Upper Zambezi, Africa

    NASA Astrophysics Data System (ADS)

    Demaria, E. M.; Valdes, J. B.; Wi, S.; Serrat-Capdevila, A.; Valdés-Pineda, R.; Durcik, M.

    2016-12-01

    In under-instrumented basins around the world, accurate and timely forecasts of river streamflows have the potential of assisting water and natural resource managers in their management decisions. The Upper Zambezi river basin is the largest basin in southern Africa and its water resources are critical to sustainable economic growth and poverty reduction in eight riparian countries. We present a real-time streamflow forecast for the basin using a multi-model-multi-satellite approach that allows accounting for model and input uncertainties. Three distributed hydrologic models with different levels of complexity, VIC, HYMOD_DS, and HBV_DS, are set up at a daily time step and a 0.25 degree spatial resolution for the basin. The hydrologic models are calibrated against daily observed streamflows at the Katima-Mulilo station using a Genetic Algorithm. Three real-time satellite products, Climate Prediction Center's morphing technique (CMORPH), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and Tropical Rainfall Measuring Mission (TRMM-3B42RT), are bias-corrected with daily CHIRPS estimates. Uncertainty bounds for predicted flows are estimated with the Inverse Variance Weighting method. Because concentration times in the basin range from a few days to more than a week, we include the use of precipitation forecasts from the Global Forecasting System (GFS) to predict daily streamflows in the basin with a 10-day lead time. The skill of GFS-predicted streamflows is evaluated and the usefulness of the forecasts for short term water allocations is presented.
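
    A minimal sketch of the Inverse Variance Weighting step mentioned above, combining three model forecasts into a single estimate with an approximate uncertainty band; the forecast values and hindcast error variances are made-up numbers.

    ```python
    # Hedged sketch: combine three model/satellite-driven streamflow forecasts
    # with inverse-variance weighting and report a simple uncertainty band.
    # Error variances would in practice come from a calibration/hindcast period.
    import numpy as np

    forecasts = np.array([120.0, 135.0, 110.0])   # m^3/s, e.g. VIC, HYMOD_DS, HBV_DS (assumed)
    err_var = np.array([25.0, 40.0, 60.0])        # hindcast error variances (assumed)

    w = (1.0 / err_var) / np.sum(1.0 / err_var)   # inverse-variance weights
    combined = np.sum(w * forecasts)
    combined_var = 1.0 / np.sum(1.0 / err_var)    # variance of the weighted mean
    lo = combined - 1.96 * np.sqrt(combined_var)
    hi = combined + 1.96 * np.sqrt(combined_var)
    print(f"combined forecast {combined:.1f} m^3/s, ~95% band [{lo:.1f}, {hi:.1f}]")
    ```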

  9. Development of a WRF-RTFDDA-based high-resolution hybrid data-assimilation and forecasting system toward operation in the Middle East

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Wu, W.; Zhang, Y.; Kucera, P. A.; Liu, Y.; Pan, L.

    2012-12-01

    Weather forecasting in the Middle East is challenging because of the region's complicated geography, including a massive coastal area and heterogeneous land, and its sparse regional observational network. Strong air-land-sea interactions form multi-scale weather regimes in the area, which require a numerical weather prediction model capable of properly representing multi-scale atmospheric flow with appropriate initial conditions. The WRF-based Real-Time Four Dimensional Data Assimilation (RTFDDA) system is one of the advanced multi-scale weather analysis and forecasting facilities developed at the Research Applications Laboratory (RAL) of NCAR. The forecasting system is applied to the Middle East with careful configuration. To overcome the limitation of the very sparsely available conventional observations in the region, we develop a hybrid data assimilation algorithm combining RTFDDA and WRF-3DVAR, which ingests remote sensing data from satellites and radar. This hybrid data assimilation blends Newtonian nudging FDDA and 3DVAR technology to effectively assimilate both conventional observations and remote sensing measurements and provide improved initial conditions for the forecasting system. For brevity, the forecasting system is called RTF3H (RTFDDA-3DVAR Hybrid). In this presentation, we will discuss the hybrid data assimilation algorithm, its implementation, and applications to high-impact weather events in the area. Sensitivity studies are conducted to understand the strengths and limitations of this hybrid data assimilation algorithm.

  10. Assimilating the Future for Better Forecasts and Earlier Warnings

    NASA Astrophysics Data System (ADS)

    Du, H.; Wheatcroft, E.; Smith, L. A.

    2016-12-01

    Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. With statistical post-processing, such information is only carried by the simulations under a single model ensemble: no advantage is taken to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme that aims to integrate, operationally, the dynamical information about the future from each individual model. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches were originally designed to improve state estimation from the past to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.

  11. Reconstructing latent dynamical noise for better forecasting observables

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito

    2018-03-01

    I propose a method for reconstructing multi-dimensional dynamical noise inspired by the embedding theorem of Muldoon et al. [Dyn. Stab. Syst. 13, 175 (1998)] by regarding multiple predictions as different observables. Then, applying the embedding theorem of Stark et al. [J. Nonlinear Sci. 13, 519 (2003)] for a forced system, I produce time series forecasts by supplying the reconstructed past dynamical noise as auxiliary information. I demonstrate the proposed method on toy models driven by auto-regressive models or independent Gaussian noise.

  12. Multi-Dimensional Asymptotically Stable 4th Order Accurate Schemes for the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Abarbanel, Saul; Ditkowski, Adi

    1996-01-01

    An algorithm is presented which solves the multi-dimensional diffusion equation on complex shapes to 4th-order accuracy and is asymptotically stable in time. This bounded-error result is achieved by constructing, on a rectangular grid, a differentiation matrix whose symmetric part is negative definite. The differentiation matrix accounts for the Dirichlet boundary condition by imposing penalty-like terms. Numerical examples in 2-D show that the method is effective even where standard schemes, stable by traditional definitions, fail.

  13. The NWRA Classification Infrastructure: description and extension to the Discriminant Analysis Flare Forecasting System (DAFFS)

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, Graham; Wagner, Eric

    2018-04-01

    A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.

  14. Multi-Dimensional Assessment of Professional Competence during Initial Pilot Training

    ERIC Educational Resources Information Center

    Larson, Douglas Andrew

    2017-01-01

    A twenty-year forecast predicting significant increases in global air transportation portends a need to increase the capacity and effectiveness of initial pilot training. In addition to quantitative concerns related to the supply of new pilots, industry leaders have expressed dissatisfaction with the qualitative output of current aviation training…

  15. Climatological attribution of wind power ramp events in East Japan and their probabilistic forecast based on multi-model ensembles downscaled by analog ensemble using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji

    2016-04-01

    Severe storms or other extreme weather events can interrupt the spinning of wind turbines on a large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique, which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified during the 1977-2013 period is obtained. To compare with the atmospheric data, the long-term wind power generation is reconstructed by using a high-resolution surface observation network, AMeDAS (Automated Meteorological Data Acquisition System), in Japan. Our analysis extracts seven typical WPs, which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted by using the obtained SOM. The probabilities are derived from the multiple SOM lattices based on the matching of output from the TIGGE multi-model global forecast to the WPs on the lattices. Since this method effectively takes care of the empirical uncertainties from the historical data, wind power generation and ramps are probabilistically forecasted from the forecasts of global models. The forecasts of wind power generation and ramp events show relatively good skill scores under the downscaling technique. It is expected that the results of this study provide better guidance to the user community and contribute to future development of a system operation model for the transmission grid operator.
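
    The SOM-based attribution can be sketched roughly as follows, using the third-party minisom package: daily SLP fields are clustered onto a small lattice of weather patterns and ramp-event frequency is tabulated per node. The lattice size, training length and synthetic data are illustrative assumptions.

    ```python
    # Hedged sketch: cluster daily sea-level-pressure fields into a SOM lattice
    # of weather patterns and tabulate ramp-event frequency per node.
    # Uses the third-party `minisom` package; all data here are synthetic.
    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(2)
    days, gridpoints = 2000, 15 * 20          # flattened SLP anomaly fields (toy)
    slp = rng.normal(size=(days, gridpoints))
    ramp = rng.random(days) < 0.05            # stand-in ramp-event flags

    som = MiniSom(5, 4, gridpoints, sigma=1.0, learning_rate=0.5, random_seed=0)
    som.train_random(slp, 5000)

    counts = np.zeros((5, 4))
    ramps = np.zeros((5, 4))
    for x, flag in zip(slp, ramp):
        i, j = som.winner(x)                  # best-matching weather pattern
        counts[i, j] += 1
        ramps[i, j] += flag
    print("ramp frequency per weather pattern:\n", ramps / np.maximum(counts, 1))
    ```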

  16. Universality of the Volume Bound in Slow-Roll Eternal Inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubovsky, Sergei; Senatore, Leonardo; Villadoro, Giovanni

    2012-03-28

    It has recently been shown that in single field slow-roll inflation the total volume cannot grow by a factor larger than e^(S_dS/2) without becoming infinite. The bound is saturated exactly at the phase transition to eternal inflation, where the probability to produce infinite volume becomes nonzero. We show that the bound holds sharply also in any space-time dimensions, when arbitrary higher-dimensional operators are included and in the multi-field inflationary case. The relation with the entropy of de Sitter and the universality of the bound strengthen the case for a deeper holographic interpretation. As a spin-off, we provide the formalism to compute the probability distribution of the volume after inflation for generic multi-field models, which might help to address questions about the population of vacua of the landscape during slow-roll inflation.

  17. Forecasting and analyzing high O3 time series in educational area through an improved chaotic approach

    NASA Astrophysics Data System (ADS)

    Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md

    2017-08-01

    Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This pilot study forecasts and analyzes the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. In this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. Time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors listed from the diurnal variation investigation and the sensitivity analysis in past studies. In conclusion, the chaotic approach successfully forecasts and analyzes the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
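
    A rough sketch of the chaotic-approach forecasting step described above: a delay embedding reconstructs the phase space and a local linear map fitted to nearest neighbours predicts the next value. The embedding dimension (seven, echoing the seven factors mentioned), delay and neighbour count are illustrative choices, and the input series is synthetic.

    ```python
    # Hedged sketch: delay-embedding reconstruction plus a local linear
    # approximation forecast. Embedding parameters and data are illustrative.
    import numpy as np

    def local_linear_forecast(series, dim=7, tau=1, k=20):
        # Delay vectors x_t = (s_t, s_{t-tau}, ..., s_{t-(dim-1)tau}).
        idx = np.arange((dim - 1) * tau, len(series) - 1)
        X = np.column_stack([series[idx - d * tau] for d in range(dim)])
        y = series[idx + 1]                              # next value for each vector
        query = np.array([series[-1 - d * tau] for d in range(dim)])
        nn = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
        A = np.column_stack([X[nn], np.ones(k)])         # affine local model
        coef, *_ = np.linalg.lstsq(A, y[nn], rcond=None)
        return np.append(query, 1.0) @ coef

    rng = np.random.default_rng(3)
    series = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.normal(size=3000)
    print("one-step forecast:", local_linear_forecast(series))
    ```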

  18. Statistical Downscaling in Multi-dimensional Wave Climate Forecast

    NASA Astrophysics Data System (ADS)

    Camus, P.; Méndez, F. J.; Medina, R.; Losada, I. J.; Cofiño, A. S.; Gutiérrez, J. M.

    2009-04-01

    Wave climate at a particular site is defined by the statistical distribution of sea state parameters, such as significant wave height, mean wave period, mean wave direction, wind velocity, wind direction and storm surge. Nowadays, long-term time series of these parameters are available from reanalysis databases obtained by numerical models. The Self-Organizing Map (SOM) technique is applied to characterize the multi-dimensional wave climate, obtaining the relevant "wave types" spanning the historical variability. This technique summarizes the multiple dimensions of wave climate in terms of a set of clusters projected onto a low-dimensional lattice with a spatial organization, providing Probability Density Functions (PDFs) on the lattice. On the other hand, wind and storm surge depend on the instantaneous local large-scale sea level pressure (SLP) fields, while waves depend on the recent history of these fields (say, 1 to 5 days). Thus, these variables are associated with large-scale atmospheric circulation patterns. In this work, a nearest-neighbors analog method is used to predict monthly multi-dimensional wave climate. This method establishes relationships between the large-scale atmospheric circulation patterns from numerical models (SLP fields as predictors) and local wave databases of observations (monthly wave climate SOM PDFs as predictand) to set up statistical models. A wave reanalysis database, developed by Puertos del Estado (Ministerio de Fomento), is considered as the historical time series of local variables. The simultaneous SLP fields calculated by the NCEP atmospheric reanalysis are used as predictors. Several applications with different sizes of the sea level pressure grid and different temporal domain resolutions are compared to obtain the optimal statistical model that best represents the monthly wave climate at a particular site. In this work we examine the potential skill of this downscaling approach considering perfect-model conditions, but we also analyze the suitability of this methodology for seasonal forecasting and for long-term climate change scenario projections of wave climate.

  19. Computational Fluid Dynamics: Algorithms and Supercomputers

    DTIC Science & Technology

    1988-03-01

    [OCR fragments from the report: a citation to Pulliam, T., and Steger, J., "Implicit Finite Difference Simulations of Three Dimensional Compressible Flow," AIAA Journal, Vol. 18, No. 2; a remark on asymptotic performance as M approaches infinity with N bounded, versus actual performance when M is finite and N varies; and a benchmark table listing vectorization percentages and speedups for PARTICLE-IN-CELL, WEATHER FORECAST (98%, 3.77, 3.55), SEISMIC MIGRATION (98%, 3.85, 3.45), MONTE CARLO (99%, 3.85, 3.75) and LATTICE GAUGE (100%, 4.00, 3.77) codes.]

  20. Sensitivity of the model error parameter specification in weak-constraint four-dimensional variational data assimilation

    NASA Astrophysics Data System (ADS)

    Shaw, Jeremy A.; Daescu, Dacian N.

    2017-08-01

    This article presents the mathematical framework to evaluate the sensitivity of a forecast error aspect to the input parameters of a weak-constraint four-dimensional variational data assimilation system (w4D-Var DAS), extending the established theory from strong-constraint 4D-Var. Emphasis is placed on the derivation of the equations for evaluating the forecast sensitivity to parameters in the DAS representation of the model error statistics, including bias, standard deviation, and correlation structure. A novel adjoint-based procedure for adaptive tuning of the specified model error covariance matrix is introduced. Results from numerical convergence tests establish the validity of the model error sensitivity equations. Preliminary experiments providing a proof-of-concept are performed using the Lorenz multi-scale model to illustrate the theoretical concepts and potential benefits for practical applications.

  1. On operational flood forecasting system involving 1D/2D coupled hydraulic model and data assimilation

    NASA Astrophysics Data System (ADS)

    Barthélémy, S.; Ricci, S.; Morel, T.; Goutal, N.; Le Pape, E.; Zaoui, F.

    2018-07-01

    In the context of hydrodynamic modeling, the use of 2D models is appropriate in areas where the flow is not one-dimensional (confluence zones, flood plains). Nonetheless, the lack of field data and computational cost constraints limit the extensive use of 2D models for operational flood forecasting. Multi-dimensional coupling offers a solution, with 1D models where the flow is one-dimensional and local 2D models where needed. This solution allows for the representation of complex processes in 2D models, while the simulated hydraulic state is significantly better than that of the full 1D model. In this study, coupling is implemented between three 1D sub-models and a local 2D model for a confluence on the Adour river (France). A Schwarz algorithm is implemented to guarantee the continuity of the variables at the 1D/2D interfaces, while in situ observations are assimilated in the 1D sub-models to improve results and forecasts in operational mode as carried out by the French flood forecasting services. An implementation of the coupling and data assimilation (DA) solution with domain decomposition and task/data parallelism is proposed so that it is compatible with operational constraints. The coupling with the 2D model improves the simulated hydraulic state compared to a global 1D model, and DA improves results in the 1D and 2D areas.

  2. The dynamics of aloof baby Skyrmions

    DOE PAGES

    Salmi, Petja; Sutcliffe, Paul

    2016-01-25

    The aloof baby Skyrme model is a (2+1)-dimensional theory with solitons that are lightly bound. It is a low-dimensional analogue of a similar Skyrme model in (3+1)-dimensions, where the lightly bound solitons have binding energies comparable to nuclei. A previous study of static solitons in the aloof baby Skyrme model revealed that multi-soliton bound states have a cluster structure, with constituents that preserve their individual identities due to the short-range repulsion and long-range attraction between solitons. Furthermore, there are many different local energy minima that are all well-described by a simple binary species particle model. In this paper we present the first results on soliton dynamics in the aloof baby Skyrme model. Numerical field theory simulations reveal that the lightly bound cluster structure results in a variety of exotic soliton scattering events that are novel in comparison to standard Skyrmion scattering. A dynamical version of the binary species point particle model is shown to provide a good qualitative description of the dynamics.

  3. The dynamics of aloof baby Skyrmions

    NASA Astrophysics Data System (ADS)

    Salmi, Petja; Sutcliffe, Paul

    2016-01-01

    The aloof baby Skyrme model is a (2+1)-dimensional theory with solitons that are lightly bound. It is a low-dimensional analogue of a similar Skyrme model in (3+1)-dimensions, where the lightly bound solitons have binding energies comparable to nuclei. A previous study of static solitons in the aloof baby Skyrme model revealed that multi-soliton bound states have a cluster structure, with constituents that preserve their individual identities due to the short-range repulsion and long-range attraction between solitons. Furthermore, there are many different local energy minima that are all well-described by a simple binary species particle model. In this paper we present the first results on soliton dynamics in the aloof baby Skyrme model. Numerical field theory simulations reveal that the lightly bound cluster structure results in a variety of exotic soliton scattering events that are novel in comparison to standard Skyrmion scattering. A dynamical version of the binary species point particle model is shown to provide a good qualitative description of the dynamics.

  5. A Quick Response Forecasting Model of Pathogen Transport and Inactivation in Near-shore Regions

    NASA Astrophysics Data System (ADS)

    Liu, L.; Fu, X.

    2011-12-01

    Modeling methods supporting water quality assessments play a critical role by helping people understand and promptly predict the potential threat that waterborne bacterial pathogens pose to human health. A mathematical model that describes and predicts bacterial levels can provide a foundation for water managers in deciding whether a water system is safe to open to the public. The inactivation (decay or die-off) rate of bacteria is critical in a bacterial model, since it controls the bacterial concentration in the water and depends on numerous hydrodynamic, meteorological, geological, chemical and biological factors. Transport and fate of waterborne pathogens in fresh water systems is essentially a three-dimensional problem, which requires coupling hydrodynamic equations with transport equations that describe the pathogen and suspended sediment dynamics. However, such an approach can be very demanding and time consuming in practice due to the computational effort required. Long computation times may lead to people unintentionally drinking from or swimming in contaminated water before the predictive water quality results become available. Therefore, it is necessary to find a quick-response model that forecasts bacterial concentration promptly to protect human health without delay. Nearshore regions are the most commonly and directly used areas of a large water system. Prior multi-dimensional investigations of E. coli and Enterococci inactivation in the literature indicate that the alongshore current predominates in the nearshore region. Consequently, the complex dynamic conditions may potentially be simplified to a one-dimensional scenario. In this research, a one-dimensional model system coupling both hydrodynamic and bacterial transport modules is constructed, considering different complex processes, to simulate the transport and fate of pathogens in nearshore regions. The quick-response model mainly focuses on prompt forecasting and will be verified and calibrated with the available data collected from southern Lake Michigan. The modeling results will be compared with those from prior multi-dimensional models. This model is specifically effective for outfall-controlled waters, where pathogen levels are primarily governed by loadings from nearby tributaries and tend to show wide variations in concentration.
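
    The simplified one-dimensional transport idea can be sketched as an explicit upwind advection step combined with first-order inactivation, as below; the alongshore velocity, decay rate, loading and grid are illustrative assumptions, not the calibrated Lake Michigan values.

    ```python
    # Hedged sketch: 1D advection of a bacterial concentration with a first-order
    # inactivation (die-off) term, solved with an explicit upwind scheme.
    # Velocity, decay rate, loading and grid are illustrative assumptions.
    import numpy as np

    nx, dx, dt = 200, 50.0, 10.0        # 10 km shoreline, 10 s time step
    u, k = 0.2, 1e-5                    # alongshore velocity (m/s), decay rate (1/s)
    c = np.zeros(nx)
    c[0:5] = 1000.0                      # initial outfall loading (assumed units)

    for _ in range(3600):                # simulate 10 hours
        c[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])   # upwind advection
        c *= np.exp(-k * dt)                             # inactivation
    print("peak concentration and its location (km):",
          c.max(), np.argmax(c) * dx / 1000.0)
    ```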

  6. A Decision Support System for effective use of probability forecasts

    NASA Astrophysics Data System (ADS)

    De Kleermaeker, Simone; Verkade, Jan

    2013-04-01

    Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision makers can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making under uncertainty, and forecast verification. Also, the revised separation of responsibilities requires a shift in institutional arrangements and responsibilities. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of the costs of management measures, or of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.

  7. Optimality of Thermal Expansion Bounds in Three Dimensions

    DOE PAGES

    Watts, Seth E.; Tortorelli, Daniel A.

    2015-02-20

    In this short note, we use topology optimization to design multi-phase isotropic three-dimensional composite materials with extremal combinations of isotropic thermal expansion and bulk modulus. In so doing, we provide evidence that the theoretical bounds for this combination of material properties are optimal. This has been shown in two dimensions, but not heretofore in three dimensions. Finally, we also show that restricting the design space by enforcing material symmetry by construction does not prevent one from obtaining extremal designs.

  8. Application Of Multi-grid Method On China Seas' Temperature Forecast

    NASA Astrophysics Data System (ADS)

    Li, W.; Xie, Y.; He, Z.; Liu, K.; Han, G.; Ma, J.; Li, D.

    2006-12-01

    Correlation scales have been used in the traditional scheme of 3-dimensional variational (3D-Var) data assimilation to estimate the background error covariance for numerical forecasts and reanalyses of the atmosphere and ocean for decades. However, there are still some drawbacks to this scheme. First, the correlation scales are difficult to determine accurately. Second, the positive definiteness of the first-guess error covariance matrix cannot be guaranteed unless the correlation scales are sufficiently small. Xie et al. (2005) indicated that a traditional 3D-Var only corrects errors at certain wavelengths and that its accuracy depends on the accuracy of the first-guess covariance. In general, short-wavelength errors cannot be well corrected until the long-wavelength errors are corrected, and an inaccurate first-guess covariance may mistakenly treat long-wave errors as short-wave ones and result in an erroneous analysis. For the purpose of quickly minimizing the errors of long and short waves successively, a new 3D-Var data assimilation scheme, called the multi-grid data assimilation scheme, is proposed in this paper. By assimilating shipboard SST and temperature profile data into a numerical model of the China Seas, we applied this scheme in a two-month data assimilation and forecast experiment, which ended in a favorable result. Compared with the traditional 3D-Var scheme, the new scheme has higher forecast accuracy and a lower forecast root-mean-square (RMS) error. Furthermore, this scheme was applied to assimilate shipboard SST, AVHRR Pathfinder Version 5.0 SST and temperature profiles at the same time, and a ten-month forecast experiment on the sea temperature of the China Seas was carried out, in which a successful forecast result was obtained. In particular, the new scheme demonstrated great numerical efficiency in these analyses.

  9. Impact of multi-resolution analysis of artificial intelligence models inputs on multi-step ahead river flow forecasting

    NASA Astrophysics Data System (ADS)

    Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.

    2013-12-01

    Discrete wavelet transform was applied to decompose ANN and ANFIS inputs. A novel approach of WNF with subtractive clustering was applied for flow forecasting. Forecasting was performed 1-5 steps ahead, using multi-variate inputs. Forecasting accuracy of peak values and at longer lead times was significantly improved.
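
    A minimal sketch of the multi-resolution input preparation described above, using the third-party PyWavelets (pywt) package to split a flow series into approximation and detail sub-series that an ANN or ANFIS could then be trained on; the wavelet family, decomposition level and synthetic series are illustrative choices.

    ```python
    # Hedged sketch: decompose a daily flow series into approximation and detail
    # sub-series with a discrete wavelet transform before feeding them to an
    # ANN/ANFIS regressor. Uses the third-party `pywt` package; the wavelet,
    # level and data are illustrative.
    import numpy as np
    import pywt

    rng = np.random.default_rng(4)
    flow = np.cumsum(rng.normal(size=1024)) + 50         # stand-in daily river flow

    coeffs = pywt.wavedec(flow, "db4", level=3)           # [cA3, cD3, cD2, cD1]
    subseries = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subseries.append(pywt.waverec(kept, "db4")[: len(flow)])

    # Each row of `inputs` is one time step described by its wavelet sub-series,
    # ready to be lagged and passed to an ANN or ANFIS model.
    inputs = np.column_stack(subseries)
    print(inputs.shape)
    ```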

  10. Series expansion solutions for the multi-term time and space fractional partial differential equations in two- and three-dimensions

    NASA Astrophysics Data System (ADS)

    Ye, H.; Liu, F.; Turner, I.; Anh, V.; Burrage, K.

    2013-09-01

    Fractional partial differential equations with more than one fractional derivative in time describe some important physical phenomena, such as the telegraph equation, the power law wave equation, or the Szabo wave equation. In this paper, we consider two- and three-dimensional multi-term time and space fractional partial differential equations. The multi-term time-fractional derivative is defined in the Caputo sense, whose order belongs to the interval (1,2],(2,3],(3,4] or (0, m], and the space-fractional derivative is referred to as the fractional Laplacian form. We derive series expansion solutions based on a spectral representation of the Laplacian operator on a bounded region. Some applications are given for the two- and three-dimensional telegraph equation, power law wave equation and Szabo wave equation.

  11. Teacher-Student Relationship at University: An Important yet Under-Researched Field

    ERIC Educational Resources Information Center

    Hagenauer, Gerda; Volet, Simone E.

    2014-01-01

    This article reviews the extant research on the relationship between students and teachers in higher education across three main areas: the quality of this relationship, its consequences and its antecedents. The weaknesses and gaps in prior research are highlighted and the importance of addressing the multi-dimensional and context-bound nature of…

  12. Qualification of a multi-diagnostic detonator-output characterization procedure utilizing PMMA witness blocks

    NASA Astrophysics Data System (ADS)

    Biss, Matthew; Murphy, Michael; Lieber, Mark

    2017-06-01

    Experiments were conducted to qualify a multi-diagnostic characterization procedure for the performance output of a detonator fired into a poly(methyl methacrylate) (PMMA) witness block. A suite of optical diagnostics was utilized in combination to both bound the shock wave interaction state at the detonator/PMMA interface and characterize the nature of the shock wave decay in PMMA. The diagnostics included the Shock Wave Image Framing Technique (SWIFT), a photocathode tube streak camera, and photonic Doppler velocimetry (PDV). High-precision, optically clear witness blocks permitted dynamic flow visualization of the shock wave in PMMA via focused shadowgraphy. SWIFT- and streak-imaging diagnostics captured the spatiotemporally evolving shock wave, providing a two-dimensional temporally discrete image set and a one-dimensional temporally continuous image, respectively. PDV provided the temporal velocity history of the detonator output along the detonator axis. By combining these results, a bound could be placed on the interface condition, and a more physical profile representing the shock wave decay in PMMA was obtained for an exploding-bridgewire detonator.

  13. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence

    PubMed Central

    Kelly, David; Majda, Andrew J.; Tong, Xin T.

    2015-01-01

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
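
    For orientation, a minimal stochastic ensemble Kalman filter of the kind analysed above is sketched below on a toy nonlinear model; it illustrates the forecast/analysis cycle only and does not reproduce the paper's divergence example. The model, observation operator and noise levels are assumptions.

    ```python
    # Hedged sketch: a stochastic ensemble Kalman filter update on a small
    # nonlinear toy model. Model, observation operator and noise levels are
    # assumptions, not the paper's construction.
    import numpy as np

    rng = np.random.default_rng(5)
    Ne, n = 20, 3                                   # ensemble size, state dimension

    def model(x):                                   # toy nonlinear forecast model
        return x + 0.1 * np.sin(x)

    H = np.array([[1.0, 0.0, 0.0]])                 # observe the first component
    R = np.array([[0.05]])

    ens = rng.normal(size=(Ne, n))
    truth = np.zeros(n)
    for _ in range(50):
        truth = model(truth) + 0.01 * rng.normal(size=n)
        ens = model(ens) + 0.01 * rng.normal(size=(Ne, n))       # forecast step
        y = H @ truth + rng.multivariate_normal(np.zeros(1), R)  # synthetic obs
        X = ens - ens.mean(axis=0)
        Pf = X.T @ X / (Ne - 1)                                  # sample covariance
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)           # Kalman gain
        perturbed = y + rng.multivariate_normal(np.zeros(1), R, size=Ne)
        ens = ens + (perturbed - ens @ H.T) @ K.T                # analysis step
    print("analysis mean:", ens.mean(axis=0), "truth:", truth)
    ```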

  14. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    PubMed

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.

  15. Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.

    DTIC Science & Technology

    1981-03-01

    [OCR fragments from the report: keywords include Management Audit, Econometric Revenue Forecast, Gap and Impact Analysis, Deterministic Expenditure Forecast, Municipal Forecasting, and Municipal Budget Formulation, together with a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric component; table-of-contents fragments list "Forecast Based on the Econometric Model" and "Forecast Based on Expert Judgment and Trend Analysis".]

  16. Water and Power Systems Co-optimization under a High Performance Computing Framework

    NASA Astrophysics Data System (ADS)

    Xuan, Y.; Arumugam, S.; DeCarolis, J.; Mahinthakumar, K.

    2016-12-01

    Water and energy systems have traditionally been optimized as two separate processes, despite their intrinsic interconnections (e.g., water is used for hydropower generation, and thermoelectric cooling requires a large amount of water withdrawal). Given the challenges of urbanization, technology uncertainty and resource constraints, and the imminent threat of climate change, a cyberinfrastructure is needed to facilitate and expedite research into the complex management of these two systems. To address these issues, we developed a High Performance Computing (HPC) framework for stochastic co-optimization of water and energy resources to inform water allocation and electricity demand. The project aims to improve conjunctive management of water and power systems under climate change by incorporating improved ensemble forecast models of streamflow and power demand. First, by downscaling and spatio-temporally disaggregating multimodel climate forecasts from General Circulation Models (GCMs), temperature and precipitation forecasts are obtained and input into multi-reservoir and power systems models. Extended from Optimus (Optimization Methods for Universal Simulators), the framework drives the multi-reservoir model and the power system model Temoa (Tools for Energy Model Optimization and Analysis), and uses the Particle Swarm Optimization (PSO) algorithm to solve high-dimensional stochastic problems. The utility of climate forecasts for the cost of water and power systems operations is assessed and quantified based on different forecast scenarios (i.e., no forecast, multimodel forecast and perfect forecast). Analysis of risk management actions and renewable energy deployments will be investigated for the Catawba River basin, an area with adequate hydroclimate prediction skill and a critical basin with 11 reservoirs that supplies water and generates power for both North and South Carolina. Further research using this scalable decision support framework will provide understanding of the intricate and interdependent relationship between water and energy systems and enhance the security of these two critical public infrastructures.

  17. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009).

    PubMed

    Nishiura, Hiroshi

    2011-02-16

    Real-time forecasting of epidemics, especially those based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for the real-time epidemic forecasting. A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simplistic model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance.
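
    The forward projection with uncertainty bounds can be sketched as a Poisson branching process simulated from the last observed count, as below; the reproduction number, the toy incidence series and the use of simulation quantiles are illustrative assumptions, and the paper's likelihood-based parameter estimation is not reproduced.

    ```python
    # Hedged sketch: project weekly incidence forward with a Poisson branching
    # process and take simulation quantiles as uncertainty bounds. R and the
    # observed counts are toy assumptions; no likelihood fitting is done here.
    import numpy as np

    rng = np.random.default_rng(6)
    observed = [12, 25, 48, 90, 160]          # weekly case counts to date (toy)
    R, weeks_ahead, sims = 1.4, 4, 5000

    paths = np.empty((sims, weeks_ahead))
    for s in range(sims):
        current = observed[-1]
        for w in range(weeks_ahead):
            current = rng.poisson(R * current)   # conditional offspring distribution
            paths[s, w] = current

    for w in range(weeks_ahead):
        lo, med, hi = np.percentile(paths[:, w], [2.5, 50, 97.5])
        print(f"week +{w + 1}: median {med:.0f}, 95% bounds [{lo:.0f}, {hi:.0f}]")
    ```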

  18. Several reverse-time integrable nonlocal nonlinear equations: Rogue-wave solutions

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Chen, Yong

    2018-05-01

    A study of rogue-wave solutions in the reverse-time nonlocal nonlinear Schrödinger (NLS) and nonlocal Davey-Stewartson (DS) equations is presented. By using Darboux transformation (DT) method, several types of rogue-wave solutions are constructed. Dynamics of these rogue-wave solutions are further explored. It is shown that the (1 + 1)-dimensional fundamental rogue-wave solutions in the reverse-time NLS equation can be globally bounded or have finite-time blowing-ups. It is also shown that the (2 + 1)-dimensional line rogue waves in the reverse-time nonlocal DS equations can be bounded for all space and time or develop singularities in critical time. In addition, the multi- and higher-order rogue waves exhibit richer structures, most of which have no counterparts in the corresponding local nonlinear equations.

  19. Statistical post-processing of seasonal multi-model forecasts: Why is it so hard to beat the multi-model mean?

    NASA Astrophysics Data System (ADS)

    Siegert, Stefan

    2017-04-01

    Initialised climate forecasts on seasonal time scales, run several months or even years ahead, are now an integral part of the battery of products offered by climate services world-wide. The availability of seasonal climate forecasts from various modeling centres gives rise to multi-model ensemble forecasts. Post-processing such seasonal-to-decadal multi-model forecasts is challenging 1) because the cross-correlation structure between multiple models and observations can be complicated, 2) because the amount of training data to fit the post-processing parameters is very limited, and 3) because the forecast skill of numerical models tends to be low on seasonal time scales. In this talk I will review new statistical post-processing frameworks for multi-model ensembles. I will focus particularly on Bayesian hierarchical modelling approaches, which are flexible enough to capture commonly made assumptions about collective and model-specific biases of multi-model ensembles. Despite the advances in statistical methodology, it turns out to be very difficult to out-perform the simplest post-processing method, which just recalibrates the multi-model ensemble mean by linear regression. I will discuss reasons for this, which are closely linked to the specific characteristics of seasonal multi-model forecasts. I explore possible directions for improvements, for example using informative priors on the post-processing parameters, and jointly modelling forecasts and observations.
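
    The hard-to-beat baseline referred to above, recalibrating the multi-model ensemble mean by linear regression against observations over a short training period, can be sketched as follows with synthetic stand-in data.

    ```python
    # Hedged sketch: recalibrate the multi-model ensemble mean with ordinary
    # linear regression over a short training period. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    years, models = 30, 5
    truth = rng.normal(size=years)
    ensemble = truth[:, None] + 0.8 * rng.normal(size=(years, models)) + 0.5  # biased, noisy

    mm_mean = ensemble.mean(axis=1)
    a, b = np.polyfit(mm_mean[:20], truth[:20], 1)     # fit on the training years
    recalibrated = a * mm_mean[20:] + b                # forecasts for the rest
    rmse_raw = np.sqrt(np.mean((mm_mean[20:] - truth[20:]) ** 2))
    rmse_cal = np.sqrt(np.mean((recalibrated - truth[20:]) ** 2))
    print(f"raw multi-model mean RMSE {rmse_raw:.2f}, recalibrated RMSE {rmse_cal:.2f}")
    ```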

  20. Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.

  1. Three-Dimensional Surface Parameters and Multi-Fractal Spectrum of Corroded Steel

    PubMed Central

    Shanhua, Xu; Songbo, Ren; Youde, Wang

    2015-01-01

    To study multi-fractal behavior of corroded steel surface, a range of fractal surfaces of corroded surfaces of Q235 steel were constructed by using the Weierstrass-Mandelbrot method under a high total accuracy. The multi-fractal spectrum of fractal surface of corroded steel was calculated to study the multi-fractal characteristics of the W-M corroded surface. Based on the shape feature of the multi-fractal spectrum of corroded steel surface, the least squares method was applied to the quadratic fitting of the multi-fractal spectrum of corroded surface. The fitting function was quantitatively analyzed to simplify the calculation of multi-fractal characteristics of corroded surface. The results showed that the multi-fractal spectrum of corroded surface was fitted well with the method using quadratic curve fitting, and the evolution rules and trends were forecasted accurately. The findings can be applied to research on the mechanisms of corroded surface formation of steel and provide a new approach for the establishment of corrosion damage constitutive models of steel. PMID:26121468

  3. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009)

    PubMed Central

    2011-01-01

    Background: Real-time forecasting of epidemics, especially forecasting based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. Methods: A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results: The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size improved greatly at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Conclusions: Real-time forecasting using the discrete time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance. PMID:21324153
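
    The following is a hedged sketch of the branching-process idea described above: project weekly incidence forward with a Poisson offspring distribution and read off uncertainty bounds from the simulated paths. The reproduction-number estimate and the offspring law are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of branching-process forecasting with simulated uncertainty bounds.
import numpy as np

def forecast_incidence(current_cases, R_est, weeks_ahead, n_sims=10000, seed=1):
    """Project weekly case counts with Poisson offspring; return median and 95% bounds."""
    rng = np.random.default_rng(seed)
    paths = np.empty((n_sims, weeks_ahead), dtype=int)
    for s in range(n_sims):
        cases = current_cases
        for t in range(weeks_ahead):
            cases = rng.poisson(R_est * cases)     # demographic stochasticity
            paths[s, t] = cases
    return (np.median(paths, axis=0),
            np.percentile(paths, 2.5, axis=0),
            np.percentile(paths, 97.5, axis=0))

median, lo, hi = forecast_incidence(current_cases=120, R_est=1.3, weeks_ahead=6)
print(median, lo, hi)
```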

  4. Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2014-05-01

    This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for evaluating multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. the covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity, lead-time-varying dependency between lead times, and lead-time-varying dependency between catchments, are captured. Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was the CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas the CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
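
    The energy score mentioned above can be estimated directly from a multivariate ensemble and the verifying observation; the following is a minimal, hedged sketch with illustrative array shapes (members by catchments-times-lead-times).

```python
# Minimal sketch of the energy score, a multivariate generalization of the CRPS.
import numpy as np

def energy_score(ensemble, observation):
    """ensemble: (m, d) array of m multivariate members; observation: (d,). Lower is better."""
    m = ensemble.shape[0]
    term1 = np.mean(np.linalg.norm(ensemble - observation, axis=1))
    diffs = ensemble[:, None, :] - ensemble[None, :, :]
    term2 = np.sum(np.linalg.norm(diffs, axis=2)) / (2 * m ** 2)
    return term1 - term2

rng = np.random.default_rng(0)
ens = rng.normal(size=(50, 10))     # e.g. 5 catchments x 2 lead times, flattened to d = 10
obs = rng.normal(size=10)
print(energy_score(ens, obs))
```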

  5. Stimuli Reduce the Dimensionality of Cortical Activity

    PubMed Central

    Mazzucato, Luca; Fontanini, Alfredo; La Camera, Giancarlo

    2016-01-01

    The activity of ensembles of simultaneously recorded neurons can be represented as a set of points in the space of firing rates. Even though the dimension of this space is equal to the ensemble size, neural activity can be effectively localized on smaller subspaces. The dimensionality of the neural space is an important determinant of the computational tasks supported by the neural activity. Here, we investigate the dimensionality of neural ensembles from the sensory cortex of alert rats during periods of ongoing (inter-trial) and stimulus-evoked activity. We find that dimensionality grows linearly with ensemble size, and grows significantly faster during ongoing activity compared to evoked activity. We explain these results using a spiking network model based on a clustered architecture. The model captures the difference in growth rate between ongoing and evoked activity and predicts a characteristic scaling with ensemble size that could be tested in high-density multi-electrode recordings. Moreover, we present a simple theory that predicts the existence of an upper bound on dimensionality. This upper bound is inversely proportional to the amount of pair-wise correlations and, compared to a homogeneous network without clusters, it is larger by a factor equal to the number of clusters. The empirical estimation of such bounds depends on the number and duration of trials and is well predicted by the theory. Together, these results provide a framework to analyze neural dimensionality in alert animals, its behavior under stimulus presentation, and its theoretical dependence on ensemble size, number of clusters, and correlations in spiking network models. PMID:26924968
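
    One common way to quantify the effective dimensionality of an ensemble of firing rates is the participation ratio of the covariance eigenvalues; the hedged sketch below illustrates that idea, although the paper's own estimator may differ in detail.

```python
# Minimal sketch of an effective-dimensionality estimate (participation ratio).
import numpy as np

def participation_ratio(firing_rates):
    """firing_rates: (n_samples, n_neurons). Returns an effective dimensionality."""
    cov = np.cov(firing_rates, rowvar=False)
    eig = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3))                        # 3 underlying shared factors
rates = latent @ rng.normal(size=(3, 30)) + 0.5 * rng.normal(size=(200, 30))
print(participation_ratio(rates))                         # well below the 30-neuron ceiling
```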

  7. Weighting of NMME temperature and precipitation forecasts across Europe

    NASA Astrophysics Data System (ADS)

    Slater, Louise J.; Villarini, Gabriele; Bradley, A. Allen

    2017-09-01

    Multi-model ensemble forecasts are obtained by weighting multiple General Circulation Model (GCM) outputs to heighten forecast skill and reduce uncertainties. The North American Multi-Model Ensemble (NMME) project facilitates the development of such multi-model forecasting schemes by providing publicly-available hindcasts and forecasts online. Here, temperature and precipitation forecasts are enhanced by leveraging the strengths of eight NMME GCMs (CCSM3, CCSM4, CanCM3, CanCM4, CFSv2, GEOS5, GFDL2.1, and FLORb01) across all forecast months and lead times, for four broad climatic European regions: Temperate, Mediterranean, Humid-Continental and Subarctic-Polar. We compare five different approaches to multi-model weighting based on the equally weighted eight single-model ensembles (EW-8), Bayesian updating (BU) of the eight single-model ensembles (BU-8), BU of the 94 model members (BU-94), BU of the principal components of the eight single-model ensembles (BU-PCA-8) and BU of the principal components of the 94 model members (BU-PCA-94). We assess the forecasting skill of these five multi-models and evaluate their ability to predict some of the costliest historical droughts and floods in recent decades. Results indicate that the simplest approach based on EW-8 preserves model skill, but has considerable biases. The BU and BU-PCA approaches reduce the unconditional biases and negative skill in the forecasts considerably, but they can also sometimes diminish the positive skill in the original forecasts. The BU-PCA models tend to produce lower conditional biases than the BU models and have more homogeneous skill than the other multi-models, but with some loss of skill. The use of 94 NMME model members does not present significant benefits over the use of the 8 single model ensembles. These findings may provide valuable insights for the development of skillful, operational multi-model forecasting systems.
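
    As a minimal illustration of the simplest combination considered above (equal weighting of the single-model ensembles), the sketch below pools bias-corrected ensemble-mean forecasts with equal weights; names, shapes and numbers are illustrative only.

```python
# Minimal sketch of an equally weighted multi-model combination with bias removal.
import numpy as np

def equal_weight_combination(model_hindcasts, obs, model_forecasts):
    """model_hindcasts: dict name -> (n_years,) ensemble-mean hindcasts;
    obs: (n_years,) observations; model_forecasts: dict name -> scalar forecast.
    Returns the bias-corrected, equally weighted multi-model forecast."""
    corrected = []
    for name, hindcast in model_hindcasts.items():
        bias = np.mean(hindcast - obs)            # unconditional bias on the hindcast period
        corrected.append(model_forecasts[name] - bias)
    return float(np.mean(corrected))              # equal weights across models

hind = {"CanCM4": np.array([1.0, 1.2, 0.8]), "CFSv2": np.array([0.5, 0.7, 0.6])}
obs = np.array([0.9, 1.1, 0.7])
fcst = {"CanCM4": 1.1, "CFSv2": 0.6}
print(equal_weight_combination(hind, obs, fcst))
```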

  8. Parcel-scale urban coastal flood mapping: Leveraging the multi-scale CoSMoS model for coastal flood forecasting

    NASA Astrophysics Data System (ADS)

    Gallien, T.; Barnard, P. L.; Sanders, B. F.

    2011-12-01

    California coastal sea levels are projected to rise 1-1.4 meters in the next century, and evidence suggests that mean tidal range, and consequently mean high water (MHW), is increasing along portions of the Southern California Bight. Furthermore, emerging research indicates that wind stress patterns associated with the Pacific Decadal Oscillation (PDO) have suppressed sea level rise rates along the West Coast since 1980, and a reversal in this pattern would result in the resumption of regional sea level rise rates equivalent to or exceeding global mean sea level rise rates, thereby enhancing coastal flooding. Newport Beach is a highly developed, densely populated lowland along the Southern California coast currently subject to episodic flooding from coincident high tides and waves, and the frequency and intensity of flooding are expected to increase with projected future sea levels. Adaptation to elevated sea levels will require flood mapping and forecasting tools that are sensitive to the dominant factors affecting flooding, including extreme high tides, waves and flood control infrastructure. Considerable effort has been focused on the development of nowcast and forecast systems including Scripps Institution of Oceanography's Coastal Data Information Program (CDIP) and the USGS Multi-hazard model, the Southern California Coastal Storm Modeling System (CoSMoS). However, fine-scale local embayment dynamics and overtopping flows are needed to map unsteady flooding effects in coastal lowlands protected by dunes, levees and seawalls. Here, a recently developed two-dimensional Godunov non-linear shallow water solver is coupled to water level and wave forecasts from the CoSMoS model to investigate the roles of tides, waves, sea level changes and flood control infrastructure in accurate flood mapping and forecasting. The results of this study highlight the important roles of topographic data, embayment hydrodynamics, water level uncertainties and critical flood processes required for meaningful prediction of sea level rise impacts and coastal flood forecasting.

  9. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias

    With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to the single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves the forecasting accuracy by up to 30%.
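
    The two-layer idea described above can be sketched with generic stacking: first-layer learners produce individual forecasts and a second-layer model blends them. The example below uses scikit-learn's stacking regressor with synthetic data and is not the authors' implementation.

```python
# Minimal sketch of a two-layer (stacked) multi-model forecaster.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))                  # stand-in for selected meteorological features
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=2000)   # wind-speed proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# First layer: two different learners; second layer: a linear blender
blender = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
                ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0))],
    final_estimator=Ridge(alpha=1.0),
)
blender.fit(X_tr, y_tr)
print("test RMSE:", np.sqrt(np.mean((blender.predict(X_te) - y_te) ** 2)))
```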

  10. Sufficient Forecasting Using Factor Models

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei

    2017-01-01

    We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality is first reduced via a high-dimensional (approximate) factor model implemented by principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis is employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running a multiple regression of the target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
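
    The first stage of the approach, extracting factors from a large predictor panel and using them for forecasting, can be sketched with a standard "diffusion index" regression; the sufficient forecasting method itself goes further by applying sufficient dimension reduction to the factors. The example below is a hedged sketch with synthetic data.

```python
# Minimal sketch: PCA factor extraction followed by a one-step-ahead forecast regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, p, k = 200, 150, 5                        # more predictors than a factor model needs
factors = rng.normal(size=(T, k))
loadings = rng.normal(size=(k, p))
X = factors @ loadings + rng.normal(scale=0.5, size=(T, p))   # high-dimensional panel
y = factors[:, 0] - 0.5 * factors[:, 1] ** 2 + rng.normal(scale=0.2, size=T)

F_hat = PCA(n_components=k).fit_transform(X)         # estimated factors
model = LinearRegression().fit(F_hat[:-1], y[1:])    # regress next-period target on factors
print("forecast for the next period:", model.predict(F_hat[[-1]]))
```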

  11. Production data from five major geothermal fields in Nevada analysed using a physiostatistical algorithm developed for oil and gas: temperature decline forecasts and type curves

    NASA Astrophysics Data System (ADS)

    Kuzma, H. A.; Golubkova, A.; Eklund, C.

    2015-12-01

    Nevada has the second largest output of geothermal energy in the United States (after California), with 14 major power plants producing over 425 megawatts of electricity, meeting 7% of the state's total energy needs. A number of wells, particularly older ones, have shown significant temperature and pressure declines over their lifetimes, adversely affecting economic returns. Production declines are almost universal in the oil and gas (O&G) industry. BetaZi (BZ) is a proprietary algorithm which uses a physiostatistical model to forecast production from the past history of O&G wells and to generate "type curves" which are used to estimate the production of undrilled wells. Although BZ was designed and calibrated for O&G, it is a general purpose diffusion equation solver, capable of modeling complex fluid dynamics in multi-phase systems. In this pilot study, it is applied directly to the temperature data from five Nevada geothermal fields. With the data appropriately normalized, BZ is shown to accurately predict temperature declines. The figure shows several examples of BZ forecasts using historic data from Steamboat Hills field near Reno. BZ forecasts were made using temperature on a normalized scale (blue) with two years of data held out for blind testing (yellow). The forecast is returned in terms of percentiles of probability (red) with the median forecast marked (solid green). Actual production is expected to fall within the majority of the red bounds 80% of the time. Blind tests such as these are used to verify that the probabilistic forecast can be trusted. BZ is also used to compute an accurate type temperature profile for wells that have yet to be drilled. These forecasts can be combined with estimated costs to evaluate the economics and risks of a project or potential capital investment. It is remarkable that an algorithm developed for oil and gas can accurately predict temperature in geothermal wells without significant recasting.

  12. Teacher–student relationship at university: an important yet under-researched field

    PubMed Central

    Hagenauer, Gerda; Volet, Simone E.

    2014-01-01

    This article reviews the extant research on the relationship between students and teachers in higher education across three main areas: the quality of this relationship, its consequences and its antecedents. The weaknesses and gaps in prior research are highlighted and the importance of addressing the multi-dimensional and context-bound nature of teacher–student relationships is proposed. A possible agenda for future research is outlined. PMID:27226693

  13. Comparisons of Three-Dimensional Variational Data Assimilation and Model Output Statistics in Improving Atmospheric Chemistry Forecasts

    NASA Astrophysics Data System (ADS)

    Ma, Chaoqun; Wang, Tijian; Zang, Zengliang; Li, Zhijin

    2018-07-01

    Atmospheric chemistry models usually perform badly in forecasting wintertime air pollution because of their uncertainties. Generally, such uncertainties can be decreased effectively by techniques such as data assimilation (DA) and model output statistics (MOS). However, the relative importance and combined effects of the two techniques have not been clarified. Here, a one-month air quality forecast with the Weather Research and Forecasting-Chemistry (WRF-Chem) model was carried out in a virtually operational setup focusing on Hebei Province, China. Meanwhile, three-dimensional variational (3DVar) DA and MOS based on one-dimensional Kalman filtering were implemented separately and simultaneously to investigate their performance in improving the model forecast. Comparison with observations shows that the chemistry forecast with MOS outperforms that with 3DVar DA, which could be seen in all the species tested over the whole 72 forecast hours. Combined use of both techniques does not guarantee a better forecast than MOS only, with the improvements and degradations being small and appearing rather randomly. Results indicate that the implementation of MOS is more suitable than 3DVar DA in improving the operational forecasting ability of WRF-Chem.
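
    A common way to implement MOS with a one-dimensional Kalman filter is to track a slowly varying forecast bias as a random-walk state; the hedged sketch below illustrates that idea with made-up numbers and tuning parameters, and may differ from the exact formulation used in the study.

```python
# Minimal sketch of MOS via a one-dimensional Kalman filter on the forecast bias.
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.05, r=1.0):
    """Recursively estimate a slowly varying forecast bias; return corrected forecasts."""
    bias, p = 0.0, 1.0                        # initial bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)            # correct with the bias known so far
        p = p + q                             # predict: random-walk bias model
        k = p / (p + r)                       # Kalman gain
        bias = bias + k * ((f - o) - bias)    # update with today's forecast error
        p = (1 - k) * p
    return np.array(corrected)

fcst = np.array([35., 40., 42., 38., 50., 55.])   # e.g. pollutant concentration forecasts
obs  = np.array([30., 33., 36., 31., 44., 47.])
print(kalman_bias_correction(fcst, obs))
```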

  14. Multi-Level Reduced Order Modeling Equipped with Probabilistic Error Bounds

    NASA Astrophysics Data System (ADS)

    Abdo, Mohammad Gamal Mohammad Mostafa

    This thesis develops robust reduced order modeling (ROM) techniques to achieve the needed efficiency to render feasible the use of high fidelity tools for routine engineering analyses. Markedly different from the state-of-the-art ROM techniques, our work focuses only on techniques which can quantify the credibility of the reduction which can be measured with the reduction errors upper-bounded for the envisaged range of ROM model application. Our objective is two-fold. First, further developments of ROM techniques are proposed when conventional ROM techniques are too taxing to be computationally practical. This is achieved via a multi-level ROM methodology designed to take advantage of the multi-scale modeling strategy typically employed for computationally taxing models such as those associated with the modeling of nuclear reactor behavior. Second, the discrepancies between the original model and ROM model predictions over the full range of model application conditions are upper-bounded in a probabilistic sense with high probability. ROM techniques may be classified into two broad categories: surrogate construction techniques and dimensionality reduction techniques, with the latter being the primary focus of this work. We focus on dimensionality reduction, because it offers a rigorous approach by which reduction errors can be quantified via upper-bounds that are met in a probabilistic sense. Surrogate techniques typically rely on fitting a parametric model form to the original model at a number of training points, with the residual of the fit taken as a measure of the prediction accuracy of the surrogate. This approach, however, does not generally guarantee that the surrogate model predictions at points not included in the training process will be bound by the error estimated from the fitting residual. Dimensionality reduction techniques however employ a different philosophy to render the reduction, wherein randomized snapshots of the model variables, such as the model parameters, responses, or state variables, are projected onto lower dimensional subspaces, referred to as the "active subspaces", which are selected to capture a user-defined portion of the snapshots variations. Once determined, the ROM model application involves constraining the variables to the active subspaces. In doing so, the contribution from the variables discarded components can be estimated using a fundamental theorem from random matrix theory which has its roots in Dixon's theory, developed in 1983. This theory was initially presented for linear matrix operators. The thesis extends this theorem's results to allow reduction of general smooth nonlinear operators. The result is an approach by which the adequacy of a given active subspace determined using a given set of snapshots, generated either using the full high fidelity model, or other models with lower fidelity, can be assessed, which provides insight to the analyst on the type of snapshots required to reach a reduction that can satisfy user-defined preset tolerance limits on the reduction errors. Reactor physics calculations are employed as a test bed for the proposed developments. The focus will be on reducing the effective dimensionality of the various data streams such as the cross-section data and the neutron flux. 
    The developed methods will be applied to representative assembly level calculations, where the sizes of the cross-section and flux spaces are typically large, as required by downstream core calculations, in order to capture the broad range of conditions expected during reactor operation. (Abstract shortened by ProQuest.)
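
    The basic snapshot-based construction behind such dimensionality reduction can be sketched as follows: project randomized snapshots onto the leading left singular vectors that capture a user-defined fraction of the variation, and inspect the discarded spectrum as a crude indicator of the reduction error. This is a hedged illustration only; the thesis develops rigorous probabilistic upper bounds on top of this construction.

```python
# Minimal sketch: build an "active subspace" from randomized snapshots via SVD.
import numpy as np

def build_active_subspace(snapshots, tol=0.99):
    """snapshots: (n_dof, n_snapshots) matrix of model realizations.
    Returns the retained basis and the discarded singular values."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(energy, tol)) + 1      # smallest rank capturing the tolerance
    return U[:, :r], s[r:]

rng = np.random.default_rng(0)
snaps = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 60))   # rank-5 structure
snaps += 1e-3 * rng.normal(size=snaps.shape)                    # small noise
basis, discarded = build_active_subspace(snaps)
print(basis.shape[1], "retained dimensions; largest discarded singular value:",
      discarded.max() if discarded.size else 0.0)
```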

  15. An Examination of a Multi-Scale Three-Dimensional Variational Data Assimilation Scheme in the Kuroshio Extension Using the Naval Coastal Ocean Model

    DTIC Science & Technology

    2014-01-01

  16. Determining effective forecast horizons for multi-purpose reservoirs with short- and long-term operating objectives

    NASA Astrophysics Data System (ADS)

    Luchner, Jakob; Anghileri, Daniela; Castelletti, Andrea

    2017-04-01

    Real-time control of multi-purpose reservoirs can benefit significantly from hydro-meteorological forecast products. Because of their reliability, the most used forecasts range on time scales from hours to a few days and are suitable for short-term operation targets such as flood control. In recent years, hydro-meteorological forecasts have become more accurate and reliable on longer time scales, which are more relevant to long-term reservoir operation targets such as water supply. While the forecast quality of such products has been studied extensively, the forecast value, i.e. the operational effectiveness of using forecasts to support water management, has received relatively little attention. It is comparatively easy to identify the most effective forecasting information needed to design reservoir operation rules for flood control, but it is not straightforward to identify which forecast variable and lead time are needed to define effective hedging rules for operational targets with slow dynamics such as water supply. The task is even more complex when multiple targets, with diverse slow and fast dynamics, are considered at the same time. In these cases, the relative importance of different pieces of information, e.g. magnitude and timing of peak flow rate and accumulated inflow over different time lags, may vary depending on the season or the hydrological conditions. In this work, we analyze the relationship between operational forecast value and streamflow forecast horizon for different multi-purpose reservoir trade-offs. We use the Information Selection and Assessment (ISA) framework to identify the most effective forecast variables and horizons for informing multi-objective reservoir operation over short- and long-term temporal scales. The ISA framework is an automatic iterative procedure to discriminate the information with the highest potential to improve multi-objective reservoir operating performance. Forecast variables and horizons are selected using a feature selection technique, which determines the most informative combination of variables in a multivariate regression model of the optimal reservoir releases, based on perfect information, at a fixed objective trade-off. The improved reservoir operation is evaluated against optimal reservoir operation conditioned upon perfect information on future disturbances and against basic reservoir operation using only the day of the year and the reservoir level. Different objective trade-offs are selected to analyze the resulting differences in improved reservoir operation and in the selected forecast variables and horizons. For comparison, the effective streamflow forecast horizon determined by the ISA framework is benchmarked against the performance obtained with a deterministic model predictive control (MPC) optimization scheme. Both the ISA framework and the MPC optimization scheme are applied to the real-world case study of Lake Como, Italy, using perfect streamflow forecast information. The principal operation targets for Lake Como are flood control and downstream water supply, which makes its operation a suitable case study. Results provide critical feedback to reservoir operators on the use of long-term streamflow forecasts and to the hydro-meteorological forecasting community with respect to the forecast horizon needed from reliable streamflow forecasts.

  17. Scientific Visualization and Simulation for Multi-dimensional Marine Environment Data

    NASA Astrophysics Data System (ADS)

    Su, T.; Liu, H.; Wang, W.; Song, Z.; Jia, Z.

    2017-12-01

    With growing attention on the ocean and the rapid development of marine detection, there is increasing demand for realistic simulation and interactive visualization of the marine environment in real time. Based on advanced technology such as GPU rendering, CUDA parallel computing and a rapid grid-oriented strategy, a series of efficient and high-quality visualization methods, which can deal with large-scale and multi-dimensional marine data in different environmental circumstances, is proposed in this paper. Firstly, a high-quality seawater simulation is realized with an FFT algorithm, bump mapping and texture animation technology. Secondly, large-scale multi-dimensional marine hydrological environmental data are visualized with 3D interactive technologies and volume rendering techniques. Thirdly, seabed terrain data are simulated with an improved Delaunay algorithm, a surface reconstruction algorithm, a dynamic LOD algorithm and GPU programming techniques. Fourthly, seamless real-time modelling of both ocean and land on a digital globe is achieved with the WebGL technique to meet the requirements of web-based applications. The experiments suggest that these methods not only produce a satisfying marine environment simulation effect but also meet the rendering requirements of global multi-dimensional marine data. Additionally, a simulation system for underwater oil spills is established with the OSG 3D-rendering engine. It is integrated with the marine visualization methods mentioned above and shows movement processes, physical parameters, and current velocity and direction for different types of deep-water oil spill particles (oil spill particles, hydrate particles, gas particles, etc.) dynamically and simultaneously in multiple dimensions. With such an application, valuable reference and decision-making information can be provided for understanding the progress of an oil spill in deep water, which is helpful for ocean disaster forecasting, warning and emergency response.

  18. Statistical model for forecasting monthly large wildfire events in western United States

    Treesearch

    Haiganoush K. Preisler; Anthony L. Westerling

    2006-01-01

    The ability to forecast the number and location of large wildfire events (with specified confidence bounds) is important to fire managers attempting to allocate and distribute suppression efforts during severe fire seasons. This paper describes the development of a statistical model for assessing the forecasting skills of fire-danger predictors and producing 1-month-...

  19. Medium-range reference evapotranspiration forecasts for the contiguous United States based on multi-model numerical weather predictions

    NASA Astrophysics Data System (ADS)

    Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.

    2018-07-01

    Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from the combinations of these NWP models. A regression calibration was employed to bias correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the lowest error and the highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single model EC forecasts. The regression process greatly improved ET0 forecast performances, particularly for regions with stations near the coast or with complex orography. The performance of EC forecasts was only slightly influenced by the number of ensemble members, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on the ET0 forecast performances.

  20. Use of forecasting signatures to help distinguish periodicity, randomness, and chaos in ripples and other spatial patterns

    USGS Publications Warehouse

    Rubin, D.M.

    1992-01-01

    Forecasting of one-dimensional time series has previously been used to help distinguish periodicity, chaos, and noise. This paper presents two-dimensional generalizations for making such distinctions for spatial patterns. The techniques are evaluated using synthetic spatial patterns and then are applied to a natural example: ripples formed in sand by blowing wind. Tests with the synthetic patterns demonstrate that the forecasting techniques can be applied to two-dimensional spatial patterns, with the same utility and limitations as when applied to one-dimensional time series. One limitation is that some combinations of periodicity and randomness exhibit forecasting signatures that mimic those of chaos. For example, sine waves distorted with correlated phase noise have forecasting errors that increase with forecasting distance, errors that are minimized using nonlinear models at moderate embedding dimensions, and forecasting properties that differ significantly between the original and surrogates. Ripples formed in sand by flowing air or water typically vary in geometry from one to another, even when formed in a flow that is uniform on a large scale; each ripple modifies the local flow or sand-transport field, thereby influencing the geometry of the next ripple downcurrent. Spatial forecasting was used to evaluate the hypothesis that such a deterministic process - rather than randomness or quasiperiodicity - is responsible for the variation between successive ripples. This hypothesis is supported by a forecasting error that increases with forecasting distance, a greater accuracy of nonlinear relative to linear models, and significant differences between forecasts made with the original ripples and those made with surrogate patterns. Forecasting signatures cannot be used to distinguish ripple geometry from sine waves with correlated phase noise, but this kind of structure can be ruled out by two geometric properties of the ripples: Successive ripples are highly correlated in wavelength, and ripple crests display dislocations such as branchings and mergers. © 1992 American Institute of Physics.

  1. Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.

    PubMed

    Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros

    2018-05-01

    We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
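
    A minimal, hedged sketch of LSTM-based sequence forecasting in a reduced-order space is given below (PyTorch, synthetic data); the paper's architecture, training protocol and MSM-LSTM hybrid are considerably more involved.

```python
# Minimal sketch of one-step-ahead LSTM forecasting of a reduced-order trajectory.
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Predict the next reduced-order state from a short window of past states."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x):                       # x: (batch, time, dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])            # one-step-ahead prediction

torch.manual_seed(0)
dim, T, window = 8, 2000, 10
traj = torch.cumsum(0.05 * torch.randn(T, dim), dim=0)      # synthetic reduced-order trajectory
X = torch.stack([traj[t:t + window] for t in range(T - window - 1)])
Y = torch.stack([traj[t + window] for t in range(T - window - 1)])

model = LSTMForecaster(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(5):                          # a few full-batch epochs, just to show the loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), Y)
    loss.backward()
    opt.step()
    print("epoch", epoch, "mse", float(loss))
```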

  2. Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.

    2016-02-01

    Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high skill predictions; however, even higher skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km² region, and day, given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models. The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are especially better; and forecasts produced using our approach have the highest rank probability skill score most often.

  3. A Multi-scale, Multi-Model, Machine-Learning Solar Forecasting Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamann, Hendrik F.

    The goal of the project was the development and demonstration of a significantly improved solar forecasting technology (short: Watt-sun), which leverages new big data processing technologies and machine-learnt blending between different models and forecast systems. The technology aimed at demonstrating major advances in accuracy, as measured by existing and new metrics which were themselves developed as part of this project. Finally, the team worked with Independent System Operators (ISOs) and utilities to integrate the forecasts into their operations.

  4. The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

    2017-08-01

    This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement over the single-temporal approach. The study is framed in the context of probabilistic Bayesian decision-making, which is appropriate for taking rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents fundamental knowledge of future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and of when it will most likely occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operational deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected from a six-year time series. MCP-MT improves over the original models' forecasts: the peak overestimation and the delayed forecast of the rising limb, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with the mean error on peak stage reduced from 45 to 5 cm and the coefficient of persistence increased from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.

  5. Weather and seasonal climate prediction for South America using a multi-model superensemble

    NASA Astrophysics Data System (ADS)

    Chaves, Rosane R.; Ross, Robert S.; Krishnamurti, T. N.

    2005-11-01

    This work examines the feasibility of weather and seasonal climate predictions for South America using the multi-model synthetic superensemble approach for climate, and the multi-model conventional superensemble approach for numerical weather prediction, both developed at Florida State University (FSU). The effect on seasonal climate forecasts of the number of models used in the synthetic superensemble is investigated. It is shown that the synthetic superensemble approach for climate and the conventional superensemble approach for numerical weather prediction can reduce the errors over South America in seasonal climate prediction and numerical weather prediction. For climate prediction, a suite of 13 models is used. The forecast lead-time is 1 month for the climate forecasts, which consist of precipitation and surface temperature forecasts. The multi-model ensemble is comprised of four versions of the FSU-Coupled Ocean-Atmosphere Model, seven models from the Development of a European Multi-model Ensemble System for Seasonal to Interannual Prediction (DEMETER), a version of the Community Climate Model (CCM3), and a version of the predictive Ocean Atmosphere Model for Australia (POAMA). The results show that conditions over South America are appropriately simulated by the Florida State University Synthetic Superensemble (FSUSSE) in comparison to observations and that the skill of this approach increases with the use of additional models in the ensemble. When compared to observations, the forecasts are generally better than those from both a single climate model and the multi-model ensemble mean, for the variables tested in this study. For numerical weather prediction, the conventional Florida State University Superensemble (FSUSE) is used to predict the mass and motion fields over South America. Predictions of mean sea level pressure, 500 hPa geopotential height, and 850 hPa wind are made with a multi-model superensemble comprised of six global models for the period January, February, and December of 2000. The six global models are from the following forecast centers: FSU, Bureau of Meteorology Research Center (BMRC), Japan Meteorological Agency (JMA), National Centers for Environmental Prediction (NCEP), Naval Research Laboratory (NRL), and Recherche en Prevision Numerique (RPN). Predictions of precipitation are made for the period January, February, and December of 2001 with a multi-analysis-multi-model superensemble where, in addition to the six forecast models just mentioned, five additional versions of the FSU model are used in the ensemble, each with a different initialization (analysis) based on different physical initialization procedures. On the basis of observations, the results show that the FSUSE provides the best forecasts of the mass and motion field variables to forecast day 5, when compared to both the models comprising the ensemble and the multi-model ensemble mean during the wet season of December-February over South America. Individual case studies show that the FSUSE provides excellent predictions of rainfall for particular synoptic events to forecast day 3.

  6. Polynomial Chaos Based Acoustic Uncertainty Predictions from Ocean Forecast Ensembles

    NASA Astrophysics Data System (ADS)

    Dennis, S.

    2016-02-01

    Most significant ocean acoustic propagation occurs over tens of kilometers, at scales small compared to the basin scale and to most fine-scale ocean modeling. To address the increased emphasis on uncertainty quantification, for example transmission loss (TL) probability density functions (PDF) within some radius, a polynomial chaos (PC) based method is utilized. In order to capture uncertainty in ocean modeling, the Navy Coastal Ocean Model (NCOM) now includes ensembles distributed to reflect the ocean analysis statistics. Since the ensembles are included in the data assimilation for the new forecast ensembles, the acoustic modeling uses the ensemble predictions in a similar fashion for creating a sound speed distribution over an acoustically relevant domain. Within an acoustic domain, singular value decomposition over the combined time-space structure of the sound speeds can be used to create Karhunen-Loève expansions of sound speed, subject to multivariate normality testing. These sound speed expansions serve as a basis for Hermite polynomial chaos expansions of derived quantities, in particular TL. The PC expansion coefficients result from so-called non-intrusive methods, involving evaluation of TL at multi-dimensional Gauss-Hermite quadrature collocation points. Traditional TL calculation from standard acoustic propagation modeling could be prohibitively time consuming at all multi-dimensional collocation points. This method employs Smolyak order and gridding methods to allow adaptive sub-sampling of the collocation points to determine only the most significant PC expansion coefficients to within a preset tolerance. Practically, the Smolyak order and grid sizes grow only polynomially in the number of Karhunen-Loève terms, alleviating the curse of dimensionality. The resulting TL PC coefficients allow the determination of TL PDF normality and its mean and standard deviation. In the non-normal case, PC Monte Carlo methods are used to rapidly establish the PDF. This work was sponsored by the Office of Naval Research.
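
    The non-intrusive coefficient computation can be illustrated in one dimension with probabilists' Hermite polynomials and Gauss-Hermite quadrature; the hedged sketch below uses a toy scalar response in place of a transmission-loss run and omits the Karhunen-Loève and Smolyak sparse-grid machinery.

```python
# Minimal 1-D sketch of non-intrusive Hermite polynomial chaos coefficients.
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def pc_coefficients(f, order, n_quad=20):
    """c_k = E[f(xi) He_k(xi)] / k! for xi ~ N(0, 1), via Gauss-Hermite quadrature."""
    x, w = He.hermegauss(n_quad)          # nodes and weights for weight exp(-x^2 / 2)
    w = w / np.sqrt(2.0 * np.pi)          # normalize to the standard normal density
    fx = f(x)
    return np.array([np.sum(w * fx * He.hermeval(x, [0.0] * k + [1.0])) / factorial(k)
                     for k in range(order + 1)])

# Toy scalar response standing in for a transmission-loss evaluation (illustrative only)
f = lambda xi: np.exp(0.3 * xi)
c = pc_coefficients(f, order=4)
# Mean and variance follow from the coefficients: E[f] = c_0, Var[f] = sum_{k>=1} c_k^2 * k!
variance = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))
print("PC mean:", c[0], "PC variance:", variance, "exact mean:", np.exp(0.045))
```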

  7. Bound-preserving Legendre-WENO finite volume schemes using nonlinear mapping

    NASA Astrophysics Data System (ADS)

    Smith, Timothy; Pantano, Carlos

    2017-11-01

    We present a new method to enforce field bounds in high-order Legendre-WENO finite volume schemes. The strategy consists of reconstructing each field through an intermediate mapping, which by design satisfies realizability constraints. Determination of the coefficients of the polynomial reconstruction involves nonlinear equations that are solved using Newton's method. The selection between the original or mapped reconstruction is implemented dynamically to minimize computational cost. The method has also been generalized to fields that exhibit interdependencies, requiring multi-dimensional mappings. Further, the method does not depend on the existence of a numerical flux function. We will discuss details of the proposed scheme and show results for systems in conservation and non-conservation form. This work was funded by the NSF under Grant DMS 1318161.

  8. A Real-Time California Coastal Ocean Nowcast/Forecast System: Skill Assessment, User Products, and Transition from Research to Operations

    NASA Astrophysics Data System (ADS)

    Farrara, J. D.; Chao, Y.; Chai, F.; Zhang, H.

    2016-02-01

    The real-time California coastal ocean nowcast/forecast system is described. The model is based on the Regional Ocean Modeling System (ROMS) and covers the entire California coastal ocean with a horizontal resolution of 3 km and 40 vertical layers. The atmospheric forcing is derived from the operational regional atmospheric model forecasts. The lateral boundary conditions are provided by the operational ocean model forecasts. A multi-scale 3-dimensional variational (3DVAR) data assimilation scheme is used to assimilate both in situ (e.g., vertical profiles of temperature and salinity) and remotely sensed data from both satellite (e.g., sea surface temperature and sea surface height) and land-based platforms (e.g., surface current). The performance of our nowcast/forecast system is evaluated in real time by a number of metrics that are published as soon as they become available. User tools and products have been developed for both general users and super-users (e.g., NOAA Office of Response and Restoration and USCG). Recent results comparing the 3DVAR with the ensemble Kalman Filter (EnKF) using the Data Assimilation Research Testbed (DART) will be presented. Preliminary results coupling the ROMS circulation model with a biogeochemistry/ecosystem model (i.e., CoSiNE) will also be discussed. Cloud computing services (e.g., Microsoft, Google) are now being tested to increase the reliability and timeliness in order to be accepted as a truly operational system in the near future.

  9. The Research of Regression Method for Forecasting Monthly Electricity Sales Considering Coupled Multi-factor

    NASA Astrophysics Data System (ADS)

    Wang, Jiangbo; Liu, Junhui; Li, Tiantian; Yin, Shuo; He, Xinhui

    2018-01-01

    Monthly electricity sales forecasting is basic work for ensuring the safe operation of the power system. This paper presents a monthly electricity sales forecasting method which comprehensively considers the coupled factors of temperature, economic growth, electric power replacement and business expansion. The mathematical model is constructed using a regression method. The simulation results show that the proposed method is accurate and effective.

  10. Daily Reservoir Inflow Forecasting using Deep Learning with Downscaled Multi-General Circulation Models (GCMs) Platform

    NASA Astrophysics Data System (ADS)

    Li, D.; Fang, N. Z.

    2017-12-01

    The Dallas-Fort Worth Metroplex (DFW) has a population of over 7 million that depends on many water supply reservoirs. Reservoir inflow plays a vital role in the water supply decision-making process and in long-term strategic planning for the region. This paper demonstrates a method of utilizing deep learning algorithms and a multi-general circulation model (GCM) platform to forecast reservoir inflow for three reservoirs within the DFW: Eagle Mountain Lake, Lake Benbrook and Lake Arlington. Ensemble empirical mode decomposition was first employed to extract the features, which were then represented by deep belief networks (DBNs). The first 75 years of the historical data (1940-2015) were used to train the model, while the last 2 years of the data (2016-2017) were used for model validation. The weights of each DBN gained from the training process were then applied to establish a neural network (NN) able to forecast reservoir inflow. Feature predictors used for the forecasting model were generated from weather forecast results of the downscaled multi-GCM platform for the North Texas region. By comparing the root mean square error (RMSE) and mean bias error (MBE) against the observed data, the authors found that deep learning with the downscaled multi-GCM platform is an effective approach to reservoir inflow forecasting.

  11. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology for reducing uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition will improve forecasting performance. The time delay of runoff routing is another important factor affecting forecasting performance. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming readily available. The reliability of short-term flood forecasting could therefore be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates upper-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. The smoothing technique is adopted to account for the runoff routing lag. Using such an assimilation framework for the soil moisture and discharge observations is expected to improve flood forecasting. To isolate the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
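
    The first assimilation step can be illustrated with a generic stochastic EnKF analysis update; the sketch below is hedged, with illustrative state dimensions, observation operator and error variances rather than the authors' configuration.

```python
# Minimal sketch of a stochastic EnKF analysis step for soil-moisture states.
import numpy as np

def enkf_update(ensemble, obs, obs_operator_H, obs_err_var, rng):
    """ensemble: (n_members, n_states); obs: (n_obs,); obs_operator_H: (n_obs, n_states)."""
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)                  # state anomalies
    Y = X @ obs_operator_H.T                              # predicted-observation anomalies
    Pyy = Y.T @ Y / (n - 1) + obs_err_var * np.eye(len(obs))
    Pxy = X.T @ Y / (n - 1)
    K = Pxy @ np.linalg.inv(Pyy)                          # Kalman gain
    perturbed_obs = obs + rng.normal(scale=np.sqrt(obs_err_var), size=(n, len(obs)))
    innovations = perturbed_obs - ensemble @ obs_operator_H.T
    return ensemble + innovations @ K.T

rng = np.random.default_rng(0)
ens = rng.normal(loc=0.30, scale=0.05, size=(40, 3))      # three-layer soil moisture states
H = np.array([[1.0, 0.0, 0.0]])                           # observe the surface layer only
updated = enkf_update(ens, obs=np.array([0.25]), obs_operator_H=H,
                      obs_err_var=0.02 ** 2, rng=rng)
print(ens.mean(axis=0), "->", updated.mean(axis=0))
```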

  12. NREL and IBM Improve Solar Forecasting with Big Data | Energy Systems

    Science.gov Websites

    A solar forecasting model using deep-machine-learning technology. The multi-scale, multi-model tool, named Watt-sun, is being validated at multiple sites, together with a first standard suite of metrics developed for this purpose.

  13. Evaluating NMME Seasonal Forecast Skill for use in NASA SERVIR Hub Regions

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Roberts, Franklin R.

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The coupled forecasts have numerous potential applications, both national and international in scope. The NASA/USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in driving application models in hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. A prerequisite for seasonal forecast use in application modeling (e.g. hydrology, agriculture) is bias correction and skill assessment. Efforts to address systematic biases and multi-model combination in support of NASA SERVIR impact modeling requirements will be highlighted. Specifically, quantile-quantile mapping for bias correction has been implemented for all archived NMME hindcasts. Both deterministic and probabilistic skill estimates for raw, bias-corrected, and multi-model ensemble forecasts as a function of forecast lead will be presented for temperature and precipitation. Complementing this statistical assessment will be case studies of significant events, for example, the ability of the NMME forecast suite to anticipate the 2010/2011 drought in the Horn of Africa and its relationship to evolving SST patterns.
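
    The bias correction named here, quantile-quantile mapping, replaces each forecast value with the observed climatological value at the same quantile. A minimal empirical version is sketched below; the operational implementation (e.g. per lead time, per calendar month, per grid cell) is not described in the abstract and is therefore not reproduced.

```python
import numpy as np

def quantile_map(forecast, hindcast_clim, obs_clim):
    """Empirical quantile-quantile mapping: find each forecast value's quantile in the
    model (hindcast) climatology and replace it with the observed value at that quantile."""
    hindcast_sorted = np.sort(hindcast_clim)
    obs_sorted = np.sort(obs_clim)
    # Quantile of each forecast value within the model climatology (0..1).
    q = np.searchsorted(hindcast_sorted, forecast, side="right") / len(hindcast_sorted)
    q = np.clip(q, 0.0, 1.0)
    # Map that quantile onto the observed climatological distribution.
    return np.quantile(obs_sorted, q)

# Example: correct a wet bias in a toy precipitation forecast.
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 30.0, size=300)        # "observed" climatology
model = obs * 1.4 + 10.0                    # biased "hindcast" climatology
print(quantile_map(np.array([80.0, 150.0]), model, obs))
```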

  14. Experiments with a three-dimensional statistical objective analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, Wayman E.; Bloom, Stephen C.; Woollen, John S.; Nestler, Mark S.; Brin, Eugenia

    1987-01-01

    A three-dimensional (3D), multivariate, statistical objective analysis scheme (referred to as optimum interpolation or OI) has been developed for use in numerical weather prediction studies with the FGGE data. Some novel aspects of the present scheme include: (1) a multivariate surface analysis over the oceans, which employs an Ekman balance instead of the usual geostrophic relationship, to model the pressure-wind error cross correlations, and (2) the capability to use an error correlation function which is geographically dependent. A series of 4-day data assimilation experiments are conducted to examine the importance of some of the key features of the OI in terms of their effects on forecast skill, as well as to compare the forecast skill using the OI with that utilizing a successive correction method (SCM) of analysis developed earlier. For the three cases examined, the forecast skill is found to be rather insensitive to varying the error correlation function geographically. However, significant differences are noted between forecasts from a two-dimensional (2D) version of the OI and those from the 3D OI, with the 3D OI forecasts exhibiting better forecast skill. The 3D OI forecasts are also more accurate than those from the SCM initial conditions. The 3D OI with the multivariate oceanic surface analysis was found to produce forecasts which were slightly more accurate, on the average, than a univariate version.

  15. Numerical simulation of advection fog formation on multi-disperse aerosols due to combustion-related pollutants

    NASA Technical Reports Server (NTRS)

    Hung, R. J.; Liaw, G. S.

    1980-01-01

    The effects of a multi-disperse distribution of the aerosol population are presented, and the effects of single-component and multi-component aerosol species on the condensation/nucleation processes that reduce visibility are described. When the mass concentration of the aerosols was kept constant, an aerosol population with a high particle concentration provided more favorable conditions for the formation of a dense fog than an aerosol population with a greater particle size distribution. The results were used for numerical prediction of fog formation. Two-dimensional observations in horizontal and vertical coordinates, together with time-dependent measurements, were needed as initial values for the following physical parameters: (1) wind profiles; (2) temperature profiles; (3) humidity profiles; (4) mass concentration of aerosol particles; (5) particle size distribution of aerosols; and (6) chemical composition of aerosols. The formation and dissipation of advection fog can thus be forecast numerically by introducing initial values obtained from these observations.

  16. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    The functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for the functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge-regression shrinkage-type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection using a mixed-model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different degrees of smoothness for different functional coefficients, by assigning each coefficient its own penalty λ. We demonstrate the proposed approach by both simulation examples and a real data application.
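
    To make the ridge-type shrinkage idea concrete, the sketch below fits a simple functional coefficient model y = a0(u) + a1(u)*x with penalized splines and selects a single smoothing parameter λ by GCV. The truncated-power basis, the common λ, and the GCV grid are simplifications of the paper's setup, which allows a separate λ per coefficient and considers several selection criteria.

```python
import numpy as np

def tpb_basis(u, knots, degree=1):
    """Truncated-power spline basis in the 'threshold' variable u."""
    cols = [u ** d for d in range(degree + 1)]
    cols += [np.clip(u - k, 0.0, None) ** degree for k in knots]
    return np.column_stack(cols)

def fit_fcr_pspline(y, x, u, n_knots=15, degree=1, lambdas=np.logspace(-4, 4, 25)):
    """Penalized-spline fit of y = a0(u) + a1(u)*x + e with one common smoothing
    parameter chosen by generalized cross-validation (GCV)."""
    knots = np.quantile(u, np.linspace(0, 1, n_knots + 2)[1:-1])
    B = tpb_basis(u, knots, degree)                     # basis for a0(u) and a1(u)
    X = np.hstack([B, B * x[:, None]])                  # full design matrix
    pen = np.array([0.0] * (degree + 1) + [1.0] * len(knots))
    D = np.diag(np.concatenate([pen, pen]))             # penalize knot coefficients only
    best = (np.inf, None, None)
    for lam in lambdas:
        A = np.linalg.solve(X.T @ X + lam * D, X.T)
        H = X @ A                                       # smoother ("hat") matrix
        resid = y - H @ y
        gcv = len(y) * np.sum(resid ** 2) / (len(y) - np.trace(H)) ** 2
        if gcv < best[0]:
            best = (gcv, lam, A @ y)
    return knots, best[1], best[2]                      # knots, chosen lambda, coefficients

# Example: coefficients that vary smoothly with the threshold variable u.
rng = np.random.default_rng(1)
n = 400
u = rng.uniform(0, 1, n)
x = rng.normal(0, 1, n)
y = np.sin(2 * np.pi * u) + (1 + u ** 2) * x + rng.normal(0, 0.3, n)
knots, lam, coef = fit_fcr_pspline(y, x, u)
# Estimated a1(u) on a grid = basis(u_grid) times the second block of coefficients.
u_grid = np.linspace(0, 1, 50)
a1_hat = tpb_basis(u_grid, knots) @ coef[len(coef) // 2:]
```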

  17. An information model for managing multi-dimensional gridded data in a GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.; Abdul-Kadar, F.; Gao, P.

    2016-04-01

    Earth observation agencies like NASA and NOAA produce huge volumes of historical, near real-time, and forecasting data representing terrestrial, atmospheric, and oceanic phenomena. The data drives climatological and meteorological studies, and underpins operations ranging from weather pattern prediction and forest fire monitoring to global vegetation analysis. These gridded data sets are distributed mostly as files in HDF, GRIB, or netCDF format and quantify variables like precipitation, soil moisture, or sea surface temperature, along one or more dimensions like time and depth. Although the data cube is a well-studied model for storing and analyzing multi-dimensional data, the GIS community remains in need of a solution that simplifies interactions with the data, and elegantly fits with existing database schemas and dissemination protocols. This paper presents an information model that enables Geographic Information Systems (GIS) to efficiently catalog very large heterogeneous collections of geospatially-referenced multi-dimensional rasters—towards providing unified access to the resulting multivariate hypercubes. We show how the implementation of the model encapsulates format-specific variations and provides unified access to data along any dimension. We discuss how this framework lends itself to familiar GIS concepts like image mosaics, vector field visualization, layer animation, distributed data access via web services, and scientific computing. Global data sources like MODIS from USGS and HYCOM from NOAA illustrate how one would employ this framework for cataloging, querying, and intuitively visualizing such hypercubes. ArcGIS—an established platform for processing, analyzing, and visualizing geospatial data—serves to demonstrate how this integration brings the full power of GIS to the scientific community.

  18. Contextual Multi-armed Bandits under Feature Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, Seyoung; Nam, Jun Hyun; Mo, Sangwoo

    We study contextual multi-armed bandit problems under linear realizability of rewards and uncertainty (or noise) on features. For the case of identical noise on features across actions, we propose an algorithm, coined NLinRel, having an O(T⁷/₈(log(dT)+K√d)) regret bound for T rounds, K actions, and d-dimensional feature vectors. Next, for the case of non-identical noise, we observe that popular linear hypotheses, including NLinRel, cannot achieve such a sub-linear regret. Instead, under the assumption of Gaussian feature vectors, we prove that a greedy algorithm has an O(T²/₃ √(log d)) regret bound with respect to the optimal linear hypothesis. Utilizing our theoretical understanding of the Gaussian case, we also design a practical variant of NLinRel, coined Universal-NLinRel, for arbitrary feature distributions. It first runs NLinRel to find the 'true' coefficient vector using feature uncertainties and then adjusts it to minimize its regret using the statistical feature information. We justify the performance of Universal-NLinRel on both synthetic and real-world datasets.

  19. New ab initio adiabatic potential energy surfaces and bound state calculations for the singlet ground X˜ 1A1 and excited C˜ 1B2(21A') states of SO2

    NASA Astrophysics Data System (ADS)

    Kłos, Jacek; Alexander, Millard H.; Kumar, Praveen; Poirier, Bill; Jiang, Bin; Guo, Hua

    2016-05-01

    We report new and more accurate adiabatic potential energy surfaces (PESs) for the ground X˜ 1A1 and electronically excited C˜ 1B2(21A') states of the SO2 molecule. Ab initio points are calculated using the explicitly correlated internally contracted multi-reference configuration interaction (icMRCI-F12) method. A second less accurate PES for the ground X ˜ state is also calculated using an explicitly correlated single-reference coupled-cluster method with single, double, and non-iterative triple excitations [CCSD(T)-F12]. With these new three-dimensional PESs, we determine energies of the vibrational bound states and compare these values to existing literature data and experiment.

  20. Diagnosis of North American Multi-Model Ensemble (NMME) skill for predicting floods and droughts over the continental USA

    NASA Astrophysics Data System (ADS)

    Slater, L. J.; Villarini, G.; Bradley, A.

    2015-12-01

    Model predictions of precipitation and temperature are crucial to mitigate the impacts of major flood and drought events through informed planning and response. However, the potential value and applicability of these predictions are inescapably linked to their forecast quality. The North American Multi-Model Ensemble (NMME) is a multi-agency-supported forecasting system for intraseasonal to interannual (ISI) climate predictions. Retrospective forecasts and real-time information are provided by each agency free of charge to facilitate collaborative research efforts for predicting future climate conditions as well as extreme weather events such as floods and droughts. Using the PRISM climate mapping system as the reference data, we examine the skill of five General Circulation Models (GCMs) from the NMME project to forecast monthly and seasonal precipitation and temperature over seven sub-regions of the continental United States. For each model, we quantify the seasonal accuracy of the forecast relative to observed precipitation using the mean square error skill score. This score is decomposed to assess the accuracy of the forecast in the absence of biases (potential skill), and in the presence of conditional (slope reliability) and unconditional (standardized mean error) biases. The quantification of these biases allows us to diagnose each model's skill over a full range of temporal and spatial scales. Finally, we test each model's forecasting skill by evaluating its ability to predict extended periods of extreme temperature and precipitation that were conducive to 'billion-dollar' historical flood and drought events in different regions of the continental USA. The forecasting skill of the individual climate models is summarized and presented along with a discussion of different multi-model averaging techniques for predicting such events.
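
    The decomposition described here follows the usual Murphy-type partition of the mean-square-error skill score (with climatology as the reference) into potential skill, conditional bias (slope reliability) and unconditional bias (standardized mean error). A minimal sketch, assuming that partition:

```python
import numpy as np

def msess_decomposition(forecast, obs):
    """Murphy-style decomposition of the MSE skill score against climatology:
    MSESS = r^2 - (r - s_f/s_o)^2 - ((f_bar - o_bar)/s_o)^2, i.e. potential skill
    minus conditional-bias and unconditional-bias penalties."""
    f, o = np.asarray(forecast, float), np.asarray(obs, float)
    r = np.corrcoef(f, o)[0, 1]
    s_f, s_o = f.std(), o.std()
    cond_bias = (r - s_f / s_o) ** 2                      # slope reliability term
    uncond_bias = ((f.mean() - o.mean()) / s_o) ** 2      # standardized mean error term
    return {"MSESS": r ** 2 - cond_bias - uncond_bias,
            "potential_skill": r ** 2,
            "conditional_bias": cond_bias,
            "unconditional_bias": uncond_bias}

# Example with a synthetic, skilful but biased forecast.
rng = np.random.default_rng(2)
obs = rng.normal(0, 1, 500)
fcst = 0.7 * obs + 0.5 + rng.normal(0, 0.5, 500)
print(msess_decomposition(fcst, obs))
```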

  1. Hydrologic and hydraulic flood forecasting constrained by remote sensing data

    NASA Astrophysics Data System (ADS)

    Li, Y.; Grimaldi, S.; Pauwels, V. R. N.; Walker, J. P.; Wright, A. J.

    2017-12-01

    Flooding is one of the most destructive natural disasters, resulting in many deaths and billions of dollars of damage each year. An indispensable tool to mitigate the effect of floods is to provide accurate and timely forecasts. An operational flood forecasting system typically consists of a hydrologic model, converting rainfall data into flood volumes entering the river system, and a hydraulic model, converting these flood volumes into water levels and flood extents. Such a system is prone to various sources of uncertainties from the initial conditions, meteorological forcing, topographic data, model parameters and model structure. To reduce those uncertainties, current forecasting systems are typically calibrated and/or updated using ground-based streamflow measurements, and such applications are limited to well-gauged areas. The recent increasing availability of spatially distributed remote sensing (RS) data offers new opportunities to improve flood forecasting skill. Based on an Australian case study, this presentation will discuss the use of 1) RS soil moisture to constrain a hydrologic model, and 2) RS flood extent and level to constrain a hydraulic model. The GRKAL hydrological model is calibrated through a joint calibration scheme using both ground-based streamflow and RS soil moisture observations. A lag-aware data assimilation approach is tested through a set of synthetic experiments to integrate RS soil moisture to constrain the streamflow forecasting in real time. The hydraulic model is LISFLOOD-FP, which solves the 2-dimensional inertial approximation of the Shallow Water Equations. Gauged water level time series and RS-derived flood extent and levels are used to apply a multi-objective calibration protocol. The effectiveness with which each data source or combination of data sources constrained the parameter space will be discussed.

  2. Multi-year predictability of climate, drought, and wildfire in southwestern North America.

    PubMed

    Chikamoto, Yoshimitsu; Timmermann, Axel; Widlansky, Matthew J; Balmaseda, Magdalena A; Stott, Lowell

    2017-07-26

    Past severe droughts over North America have led to massive water shortages and increases in wildfire frequency. Triggering sources for multi-year droughts in this region include randomly occurring atmospheric blocking patterns, ocean impacts on atmospheric circulation, and climate's response to anthropogenic radiative forcings. A combination of these sources translates into a difficulty to predict the onset and length of such droughts on multi-year timescales. Here we present results from a new multi-year dynamical prediction system that exhibits a high degree of skill in forecasting wildfire probabilities and drought for 10-23 and 10-45 months lead time, which extends far beyond the current seasonal prediction activities for southwestern North America. Using a state-of-the-art earth system model along with 3-dimensional ocean data assimilation and by prescribing the external radiative forcings, this system simulates the observed low-frequency variability of precipitation, soil water, and wildfire probabilities in close agreement with observational records and reanalysis data. The underlying source of multi-year predictability can be traced back to variations of the Atlantic/Pacific sea surface temperature gradient, external radiative forcings, and the low-pass filtering characteristics of soils.

  3. Adaptive multi-resolution 3D Hartree-Fock-Bogoliubov solver for nuclear structure

    NASA Astrophysics Data System (ADS)

    Pei, J. C.; Fann, G. I.; Harrison, R. J.; Nazarewicz, W.; Shi, Yue; Thornton, S.

    2014-08-01

    Background: Complex many-body systems, such as triaxial and reflection-asymmetric nuclei, weakly bound halo states, cluster configurations, nuclear fragments produced in heavy-ion fusion reactions, cold Fermi gases, and pasta phases in neutron star crust, are all characterized by large sizes and complex topologies in which many geometrical symmetries characteristic of ground-state configurations are broken. A tool of choice to study such complex forms of matter is an adaptive multi-resolution wavelet analysis. This method has generated much excitement since it provides a common framework linking many diversified methodologies across different fields, including signal processing, data compression, harmonic analysis and operator theory, fractals, and quantum field theory. Purpose: To describe complex superfluid many-fermion systems, we introduce an adaptive pseudospectral method for solving self-consistent equations of nuclear density functional theory in three dimensions, without symmetry restrictions. Methods: The numerical method is based on the multi-resolution and computational harmonic analysis techniques with a multi-wavelet basis. The application of state-of-the-art parallel programming techniques includes sophisticated object-oriented templates which parse the high-level code into distributed parallel tasks with a multi-thread task queue scheduler for each multi-core node. The internode communications are asynchronous. The algorithm is variational and is capable of solving coupled complex-geometric systems of equations adaptively, with functional and boundary constraints, in a finite spatial domain of very large size, limited by existing parallel computer memory. For smooth functions, user-defined finite precision is guaranteed. Results: The new adaptive multi-resolution Hartree-Fock-Bogoliubov (HFB) solver madness-hfb is benchmarked against a two-dimensional coordinate-space solver hfb-ax that is based on the B-spline technique and a three-dimensional solver hfodd that is based on the harmonic-oscillator basis expansion. Several examples are considered, including the self-consistent HFB problem for spin-polarized trapped cold fermions and the Skyrme-Hartree-Fock (+BCS) problem for triaxial deformed nuclei. Conclusions: The new madness-hfb framework has many attractive features when applied to nuclear and atomic problems involving many-particle superfluid systems. Of particular interest are weakly bound nuclear configurations close to particle drip lines, strongly elongated and dinuclear configurations such as those present in fission and heavy-ion fusion, and exotic pasta phases that appear in neutron star crust.

  4. Research on electromechanical resonance of two-axis tracking system

    NASA Astrophysics Data System (ADS)

    Zhao, Zhi-ming; Xue, Ying-jie; Zeng, Shu-qin; Li, Zhi-guo

    2017-02-01

    The multi-axis synchronous system of the spatial two-axis turntable is key equipment for semi-physical simulation and testing in aerospace. In this paper, the overall structural design of the turntable is created in SolidWorks, and the three-dimensional solid model is then imported into ANSYS to build the finite element model. ANSYS is used to carry out static and dynamic analyses of the two-axis turntable. Based on the modal analysis, the natural frequencies and vibration modes under launch conditions can be forecast, which is very important for the design and safety of the structure.

  5. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near-Sun Conditions With a Simple One-Dimensional "Upwind" Scheme

    NASA Astrophysics Data System (ADS)

    Owens, Mathew J.; Riley, Pete

    2017-11-01

    Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).
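
    The essence of the computationally efficient scheme is to march the inner-boundary speed series outward with a first-order upwind discretisation of the one-dimensional, purely radial momentum equation. The sketch below is a generic version of that idea with illustrative boundary values and grid settings, not the authors' implementation:

```python
import numpy as np

def upwind_propagate(v_boundary, r_inner=30.0, r_outer=215.0, n_r=200, dt=600.0):
    """Propagate a time series of inner-boundary solar wind speeds (km/s) out to r_outer
    (in solar radii) via dv/dt + v dv/dr = 0, discretised with a first-order upwind
    scheme. Returns the speed time series at the outer boundary."""
    r_sun_km = 6.96e5
    dr = (r_outer - r_inner) / (n_r - 1) * r_sun_km        # radial step in km
    v = np.full(n_r, v_boundary[0], dtype=float)           # initial radial profile
    v_outer = []
    for vb in v_boundary:
        v[0] = vb                                          # update inner boundary
        # Upwind update: information travels outward, so difference against the inner cell.
        v[1:] = v[1:] - dt * v[1:] * (v[1:] - v[:-1]) / dr
        v_outer.append(v[-1])
    return np.array(v_outer)

# Example: an alternating slow/fast stream pattern at the inner boundary.
t = np.arange(0, 5 * 86400, 600.0)                          # 5 days at 10-min cadence
vb = 400 + 300 * (np.sin(2 * np.pi * t / (2 * 86400)) > 0)  # 400/700 km/s streams
print(upwind_propagate(vb)[-5:])
```

    Running an ensemble of such boundary series (e.g. sampled over latitude) and collecting the outer-boundary values gives the spread used here as the forecast uncertainty estimate.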

  6. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near-Sun Conditions With a Simple One-Dimensional "Upwind" Scheme.

    PubMed

    Owens, Mathew J; Riley, Pete

    2017-11-01

    Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).

  7. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near‐Sun Conditions With a Simple One‐Dimensional “Upwind” Scheme

    PubMed Central

    Riley, Pete

    2017-01-01

    Abstract Long lead‐time space‐weather forecasting requires accurate prediction of the near‐Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near‐Sun solar wind and magnetic field conditions provide the inner boundary condition to three‐dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics‐based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near‐Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near‐Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near‐Sun solar wind speed at a range of latitudes about the sub‐Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun‐Earth line. Propagating these conditions to Earth by a three‐dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one‐dimensional “upwind” scheme is used. The variance in the resulting near‐Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996–2016, the upwind ensemble is found to provide a more “actionable” forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large). PMID:29398982

  8. Filtering techniques for efficient inversion of two-dimensional Nuclear Magnetic Resonance data

    NASA Astrophysics Data System (ADS)

    Bortolotti, V.; Brizi, L.; Fantazzini, P.; Landi, G.; Zama, F.

    2017-10-01

    The inversion of two-dimensional Nuclear Magnetic Resonance (NMR) data requires the solution of a Fredholm integral equation of the first kind with a two-dimensional tensor product kernel and lower bound constraints. For the solution of this ill-posed inverse problem, the recently presented 2DUPEN algorithm [V. Bortolotti et al., Inverse Problems, 33(1), 2016] uses multi-parameter Tikhonov regularization with automatic choice of the regularization parameters. In this work, I2DUPEN, an improved version of 2DUPEN that implements Mean Windowing and Singular Value Decomposition filters, is tested in depth. The reconstruction problem with filtered data is formulated as a compressed weighted least squares problem with multi-parameter Tikhonov regularization. Results on synthetic and real 2D NMR data are presented with the main purpose of analyzing in greater depth the separate and combined effects of these filtering techniques on the reconstructed 2D distribution.
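
    The two filtering-and-inversion ideas can be illustrated with a generic one-dimensional analogue: compress the data onto the leading singular vectors of the kernel, then solve a Tikhonov-regularised least-squares problem. The sketch omits the two-dimensional tensor-product kernel, the lower-bound constraint and UPEN's locally adapted regularization parameters, so it is only a simplified stand-in:

```python
import numpy as np

def svd_compress(K, y, rank):
    """Project the data onto the leading singular vectors of the kernel K, discarding
    components dominated by noise (a simple SVD filter)."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    return np.diag(s[:rank]) @ Vt[:rank], U[:, :rank].T @ y   # compressed kernel and data

def tikhonov_solve(K, y, lam):
    """Solve min ||K x - y||^2 + lam ||x||^2 via the regularised normal equations."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y)

# Example on a small ill-conditioned exponential-decay kernel (1D analogue of the NMR problem).
t = np.linspace(0.01, 1.0, 120)[:, None]
T2 = np.linspace(0.01, 1.0, 80)[None, :]
K = np.exp(-t / T2)                                           # decay kernel
x_true = np.exp(-0.5 * ((T2.ravel() - 0.3) / 0.05) ** 2)      # narrow T2 distribution
y = K @ x_true + np.random.default_rng(3).normal(0, 1e-3, t.size)
K_f, y_f = svd_compress(K, y, rank=20)
x_hat = tikhonov_solve(K_f, y_f, lam=1e-4)
```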

  9. Evaluating extreme precipitation events using a mesoscale atmosphere model

    NASA Astrophysics Data System (ADS)

    Yucel, I.; Onen, A.

    2012-04-01

    Evidence shows that global warming, or climate change, has a direct influence on changes in precipitation and the hydrological cycle. Extreme weather events such as heavy rainfall and flooding are projected to become much more frequent as the climate warms. Mesoscale atmospheric models coupled with land surface models provide efficient forecasts of meteorological events at long lead times and should therefore be used for flood forecasting and warning, as they provide more continuous monitoring of precipitation over large areas. This study examines the performance of the Weather Research and Forecasting (WRF) model in reproducing the temporal and spatial characteristics of a number of extreme precipitation events observed in the West Black Sea Region of Turkey. These extreme precipitation events usually resulted in flood conditions as the associated hydrologic response of the basin. The performance of the WRF system is further investigated using the three-dimensional variational (3D-VAR) data assimilation scheme within WRF. WRF performance with and without data assimilation at high spatial resolution (4 km) is evaluated by comparison with gauge precipitation and satellite-estimated rainfall data from Multi Precipitation Estimates (MPE). WRF-derived precipitation captures the timing of the precipitation extremes and, to some extent, the spatial distribution and magnitude of the heavy rainfall events. These precipitation characteristics are enhanced when the 3D-VAR scheme is used in the WRF system. Data assimilation improved area-averaged precipitation forecasts by 9 percent, and at some points there is a quantitative match in precipitation events, which is critical for hydrologic forecast applications.

  10. Discrete post-processing of total cloud cover ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian

    2017-04-01

    This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review 144, 2565-2577.
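
    As a toy illustration of discrete post-processing, the sketch below fits a multinomial logistic regression that maps ensemble summary statistics to calibrated probabilities for the nine okta categories. The station-wise training and the proportional odds variant used in the paper are not reproduced, and the synthetic data are purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training data: observed okta category (0-8) and a toy 20-member raw ensemble.
rng = np.random.default_rng(4)
n = 2000
truth = rng.integers(0, 9, size=n)
ens = truth[:, None] + rng.normal(0, 1.5, size=(n, 20))
X = np.column_stack([ens.mean(axis=1), ens.std(axis=1)])    # summary predictors

# With multiple classes and the default lbfgs solver this fits a multinomial model.
model = LogisticRegression(max_iter=1000)
model.fit(X, truth)

# Calibrated category probabilities for a new raw ensemble forecast.
new_ens = np.array([3.0 + rng.normal(0, 1.5, 20)])
x_new = np.column_stack([new_ens.mean(axis=1), new_ens.std(axis=1)])
print(model.predict_proba(x_new).round(3))
```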

  11. Operational planning using Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe

    2016-05-01

    The US Navy faces several limitations when planning operations with regard to forecasting environmental conditions. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or long-term statistical climate products. However, newly available data in the form of weather forecast ensembles provide dynamical and statistical extended-range predictions that can be more accurate if ensemble members are combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons into the future. We evaluated thirty years of forecasts, using machine learning to select predictions for an all-encompassing, superior forecast that can inform the Navy's decision-planning process.

  12. Three-Dimensional Planetary Surface Tracking Based on a Simple Ultra-Wideband Impulse-Radio Infrastructure

    NASA Technical Reports Server (NTRS)

    Barton, Richard J.; Ni, David; Ngo, Phong

    2010-01-01

    Several prototype ultra-wideband (UWB) impulse-radio (IR) tracking systems are currently under development at NASA Johnson Space Center (JSC). These systems are being studied for use in tracking of Lunar/Mars rovers and astronauts during early exploration missions when satellite navigation systems (such as GPS) are not available. To date, the systems that have been designed and tested are intended only for two-dimensional location and tracking, but these designs can all be extended to three-dimensional tracking with only minor modifications and increases in complexity. In this presentation, we will briefly review the design and performance of two of the current 2-D systems: one designed specifically for short-range, extremely high-precision tracking (approximately 1-2 cm resolution) and the other designed specifically for much longer range tracking with less stringent precision requirements (1-2 m resolution). We will then discuss a new multi-purpose system design based on a simple UWB-IR architecture that can be deployed easily on a planetary surface to support arbitrary three-dimensional localization and tracking applications. We will discuss utilization of this system as an infrastructure to provide both short-range and long-range tracking and analyze the localization performance of the system in several different configurations. We will give theoretical performance bounds for some canonical system configurations and compare these performance bounds with both numerical simulations of the system as well as actual experimental system performance evaluations.

  13. Development, Implementation, and Skill Assessment of the NOAA/NOS Great Lakes Operational Forecast System

    DTIC Science & Technology

    2011-01-01

    The NOAA Great Lakes Operational Forecast System (GLOFS) uses near-real-time atmospheric observations and numerical weather prediction forecast guidance to produce three-dimensional forecasts of water conditions in the Great Lakes. GLOFS has been making operational nowcasts and forecasts at the Center for Operational Oceanographic Products and Services (CO-OPS) in Silver Spring, MD.

  14. The development and evaluation of a hydrological seasonal forecast system prototype for predicting spring flood volumes in Swedish rivers

    NASA Astrophysics Data System (ADS)

    Foster, Kean; Bertacchi Uvo, Cintia; Olsson, Jonas

    2018-05-01

    Hydropower makes up nearly half of Sweden's electrical energy production. However, the distribution of the water resources is not aligned with demand, as most of the inflows to the reservoirs occur during the spring flood period. This means that carefully planned reservoir management is required to help redistribute water resources to ensure optimal production, and accurate forecasts of the spring flood volume (SFV) are essential for this. The current operational SFV forecasts use a historical ensemble approach where the HBV model is forced with historical observations of precipitation and temperature. In this work we develop and test a multi-model prototype, building on previous work, and evaluate its ability to forecast the SFV in 84 sub-basins in northern Sweden. The hypothesis explored in this work is that a multi-model seasonal forecast system incorporating different modelling approaches is generally more skilful at forecasting the SFV in snow-dominated regions than a forecast system that utilises only one approach. The testing is done using cross-validated hindcasts for the period 1981-2015 and the results are evaluated against both climatology and the current system to determine skill. Both the multi-model methods considered showed skill over the reference forecasts. The version that combined the historical modelling chain, dynamical modelling chain, and statistical modelling chain performed better than the other and was chosen for the prototype. The prototype was able to outperform the current operational system 57 % of the time on average and reduce the error in the SFV by ˜ 6 % across all sub-basins and forecast dates.

  15. The THOR Project-Reducing the Impact of Thunderstorms on Aviation and the General Public Through a Multi-Agency Effort

    NASA Technical Reports Server (NTRS)

    Smith, Stephan B.; Pace, David; Goodman, Steven J.; Burgess, Donald W.; Smarsh, David; Roberts, Rita D.; Wolfson, Marilyn M.; Goodman, H. Michael (Technical Monitor)

    2001-01-01

    Thunderstorms are high-impact weather phenomena. They also pose an extremely challenging forecast problem. The National Oceanic and Atmospheric Administration (NOAA), the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA), and the Air Force Weather Agency (AFWA) have decided to pool technology and scientific expertise into an unprecedented effort to better observe, diagnose, and forecast thunderstorms. This paper describes plans for an operational field test called the THunderstorm Operational Research (THOR) Project beginning in 2002, the primary goals of which are to: 1) Reduce the number of thunderstorm-related air traffic delays within the National Airspace System (NAS), and 2) Improve severe thunderstorm, tornado and airport thunderstorm warning accuracy and lead time. Aviation field operations will be focused on the prime air traffic bottleneck in the NAS, the airspace bounded roughly by Chicago, New York City and Washington D.C., sometimes called the Northeast Corridor. A variety of new automated thunderstorm forecasting applications will be tested here that, when implemented into FAA-NWS operations, will allow for better tactical decision making and NAS management during thunderstorm days. Severe thunderstorm operations will be centered on Northern Alabama. NWS meteorologists from the forecast office in Birmingham will test the utility of experimental lightning, radar, and profiler data from a mesoscale observing network being established by NASA's Marshall Space Flight Center. In addition, new tornado detection and thunderstorm nowcasting algorithms will be examined for their potential for improving warning accuracy. The Alabama THOR site will also serve as a test bed for new gridded, digital thunderstorm and flash flood warning products.

  16. Evaluation of NU-WRF Rainfall Forecasts for IFloodS

    NASA Technical Reports Server (NTRS)

    Wu, Di; Peters-Lidard, Christa; Tao, Wei-Kuo; Petersen, Walter

    2016-01-01

    The Iowa Flood Studies (IFloodS) campaign was conducted in eastern Iowa as a pre-GPM-launch campaign from 1 May to 15 June 2013. During the campaign period, real-time forecasts were produced using the NASA-Unified Weather Research and Forecasting (NU-WRF) model to support the daily weather briefing. In this study, two sets of NU-WRF rainfall forecasts are evaluated against Stage IV and Multi-Radar Multi-Sensor (MRMS) Quantitative Precipitation Estimation (QPE), with the objective of understanding the impact of land surface initialization on the predicted precipitation. NU-WRF is also compared with the North American Mesoscale Forecast System (NAM) 12-kilometer forecast. In general, NU-WRF captured individual precipitation events well and reproduced a better rainfall spatial distribution than NAM. Further sensitivity tests show that the high resolution has a positive impact on the rainfall forecast. The two sets of NU-WRF simulations produce very similar rainfall characteristics; the land surface initialization does not have a significant impact on the short-term rainfall forecast, largely because of the soil conditions during the field campaign period.

  17. Improved track forecasting of a typhoon reaching landfall from four-dimensional variational data assimilation of AMSU-A retrieved data

    NASA Astrophysics Data System (ADS)

    Zhao, Ying; Wang, Bin; Ji, Zhongzhen; Liang, Xudong; Deng, Guo; Zhang, Xin

    2005-07-01

    In this study, an attempt to improve typhoon forecasts is made by incorporating three-dimensional Advanced Microwave Sounding Unit-A (AMSU-A) retrieved wind and temperature and the central sea level pressure of cyclones from typhoon reports or bogus surface low data into initial conditions, on the basis of the Fifth-Generation National Center for Atmospheric Research/Pennsylvania State University Mesoscale Model (MM5) four-dimensional variational data assimilation (4DVar) system with a full-physics adjoint model. All the above-mentioned data are found to be useful for improvement of typhoon forecasts in this mesoscale data assimilation experiment. The comparison tests showed the following results: (1) The assimilation of the satellite-retrieved data was found to have a positive impact on the typhoon track forecast, but the landing position error is ˜150 km. (2) The assimilation of both the satellite-retrieved data and moving information of the typhoon center dramatically improved the track forecast and captured the recurvature and landfall. The mean track error during the 72-hour forecast is 69 km. The predicted typhoon intensity, however, is much weaker than that from observations. (3) The assimilation of both the satellite-retrieved data and the bogus surface low data improved the intensity and track forecasts more significantly than the assimilation of only bogus surface low data (bogus data assimilation) did. The mean errors during the 72-hour forecast are 2.6 hPa for the minimum sea level pressure and 87 km for track position. However, the forecasted landing time is ˜6 hours earlier than the observed one.

  18. Multi-Scale 4DVAR Assimilation of Glider Teams on the North Carolina Shelf

    NASA Astrophysics Data System (ADS)

    Osborne, J. J. V.; Carrier, M.; Book, J. W.; Barron, C. N.; Rice, A. E.; Rowley, C. D.; Smedstad, L.; Souopgui, I.; Teague, W. J.

    2017-12-01

    We demonstrate a method to assimilate glider profile data from multiple gliders in close proximity (10 km or less). Gliders were deployed in a field experiment from 17 May until 4 June 2017, north of Cape Hatteras and inshore of the Gulf Stream. Gliders were divided into two teams, generally two or three gliders per team. One team was tasked with station keeping and the other with moving and sampling regions of high variability in temperature and salinity. Glider data are assimilated into the Relocatable Navy Coastal Ocean Model (RELO NCOM) with four-dimensional variational assimilation (NCOM-4DVAR). RELO NCOM is used by the US Navy for ocean prediction. RELO NCOM is a baroclinic, Boussinesq, free-surface, and hydrostatic ocean model with a flexible sigma-z vertical coordinate. Two domains are used, one focused north and one focused south of Cape Hatteras. The domains overlap near the gliders, thus providing two forecasts near the gliders. Both domains have 1 km horizontal resolution. Data are assimilated in a newly developed multi-scale data-processing and assimilation approach using NCOM-4DVAR. This enables NCOM-4DVAR to use many more observations than standard NCOM-4DVAR, improving the analysis and forecast. Assimilation experiments use station-keeping glider data, moving glider data, or all glider data. Sea surface temperature (SST) data and satellite altimeter (SSH) data are also assimilated. An additional experiment omits glider data but still assimilates SST and SSH data. Conductivity, temperature, and depth (CTD) profiles from the R/V Savannah are used for validation, including data from an underway CTD (UCTD). Data from glider teams have the potential to significantly improve model forecasts. Missions using teams of gliders can be planned to maximize data assimilation for optimal impact on model predictions.

  19. How do I know if I’ve improved my continental scale flood early warning system?

    NASA Astrophysics Data System (ADS)

    Cloke, Hannah L.; Pappenberger, Florian; Smith, Paul J.; Wetterhall, Fredrik

    2017-04-01

    Flood early warning systems mitigate damage and loss of life and are an economically efficient way of enhancing disaster resilience. The use of continental-scale flood early warning systems is rapidly growing. The European Flood Awareness System (EFAS) is a pan-European flood early warning system forced by a multi-model ensemble of numerical weather predictions. Responses to scientific and technical changes can be complex in these computationally expensive continental-scale systems, and improvements need to be tested by evaluating runs of the whole system. It is demonstrated here that forecast skill is not correlated with the value of warnings. In order to tell whether the system has been improved, an evaluation strategy is required that considers both forecast skill and warning value. The combination of a multi-forcing ensemble of EFAS flood forecasts is evaluated with a new skill-value strategy. The full multi-forcing ensemble is recommended for operational forecasting, but there are spatial variations in the optimal forecast combination. Results indicate that optimizing forecasts based on value rather than skill alters the optimal forcing combination and the forecast performance. The results also indicate that model diversity and ensemble size are both important in achieving the best overall performance. The use of several evaluation measures that consider both skill and value is strongly recommended when considering improvements to early warning systems.

  20. Forecast horizon of multi-item dynamic lot size model with perishable inventory.

    PubMed

    Jing, Fuying; Lan, Zirui

    2017-01-01

    This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties in an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operations manager decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instances, we obtain useful managerial insights on the impact of the deterioration rate and lifetime of products on the length of the forecast horizon.

  1. Forecast horizon of multi-item dynamic lot size model with perishable inventory

    PubMed Central

    Jing, Fuying

    2017-01-01

    This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties in an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operations manager decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instances, we obtain useful managerial insights on the impact of the deterioration rate and lifetime of products on the length of the forecast horizon. PMID:29125856

  2. CMB constraints on running non-Gaussianity

    NASA Astrophysics Data System (ADS)

    Oppizzi, F.; Liguori, M.; Renzi, A.; Arroja, F.; Bartolo, N.

    2018-05-01

    We develop a complete set of tools for CMB forecasting, simulation and estimation of primordial running bispectra, arising from a variety of curvaton and single-field (DBI) models of Inflation. We validate our pipeline using mock CMB running non-Gaussianity realizations and test it on real data by obtaining experimental constraints on the f_NL running spectral index, n_NG, using WMAP 9-year data. Our final bounds (68% C.L.) read -0.6 < n_NG < 1.4, -0.3 < n_NG < 1.2, -1.1

  3. A Kalman filter for a two-dimensional shallow-water model

    NASA Technical Reports Server (NTRS)

    Parrish, D. F.; Cohn, S. E.

    1985-01-01

    A two-dimensional Kalman filter for data assimilation in weather forecasting is described. The filter is regarded as superior to the optimal interpolation method because the filter determines the forecast error covariance matrix exactly instead of using an approximation. A generalized time step is defined that includes expressions for one time step of the forecast model, the error covariance matrix, the gain matrix, and the evolution of the covariance matrix. Subsequent time steps are achieved by quantifying the forecast variables or employing a linear extrapolation from a current variable set, assuming the forecast dynamics are linear. Calculations for the evolution of the error covariance matrix are banded, i.e., they are performed only with the elements significantly different from zero. Experimental results are provided from an application of the filter to a shallow-water simulation covering a 6000 x 6000 km grid.
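
    For reference, one generalized time step of a linear Kalman filter (forecast of the state and its error covariance, followed by the analysis update) can be written compactly as below. This generic sketch ignores the banded covariance computation used in the shallow-water implementation, and the toy matrices are placeholders:

```python
import numpy as np

def kalman_step(x, P, M, Q, y, H, R):
    """One generalized time step: forecast the state x and its error covariance P with
    the linear model M (model error Q), then update both with the observations y
    (observation operator H, observation error R)."""
    # Forecast step
    x_f = M @ x
    P_f = M @ P @ M.T + Q
    # Analysis step
    S = H @ P_f @ H.T + R                       # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)            # gain matrix
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f        # evolved analysis error covariance
    return x_a, P_a

# Toy usage with a 2-variable state and a single observation of the first component.
x, P = np.zeros(2), np.eye(2)
M, Q = np.array([[1.0, 0.1], [0.0, 1.0]]), 0.01 * np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.25]])
x, P = kalman_step(x, P, M, Q, np.array([1.2]), H, R)
```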

  4. The Canadian seasonal forecast and the APCC exchange.

    NASA Astrophysics Data System (ADS)

    Archambault, B.; Fontecilla, J.; Kharin, V.; Bourgouin, P.; Ashok, K.; Lee, D.

    2009-05-01

    In this talk, we will first describe the Canadian seasonal forecast system. This system uses a four-model ensemble approach, with each model generating a 10-member ensemble. Multi-model issues related to this system will be described. Secondly, we will describe an international multi-system initiative. The Asia-Pacific Economic Cooperation (APEC) is a forum for 21 Pacific Rim countries or regions, including Canada. The APEC Climate Center (APCC) provides seasonal forecasts to their regional climate centers with a Multi-Model Ensemble (MME) approach. The APCC MME is based on 13 ensemble prediction systems from different institutions, including MSC (Canada), NCEP (USA), COLA (USA), KMA (Korea), JMA (Japan), BOM (Australia) and others. In this presentation, we will describe the basics of this international cooperation.

  5. Predictive Skill of Meteorological Drought Based on Multi-Model Ensemble Forecasts: A Real-Time Assessment

    NASA Astrophysics Data System (ADS)

    Chen, L. C.; Mo, K. C.; Zhang, Q.; Huang, J.

    2014-12-01

    Drought prediction from monthly to seasonal time scales is of critical importance to disaster mitigation, agricultural planning, and multi-purpose reservoir management. Starting in December 2012, the NOAA Climate Prediction Center (CPC) has been providing operational Standardized Precipitation Index (SPI) Outlooks using the North American Multi-Model Ensemble (NMME) forecasts, to support CPC's monthly drought outlooks and briefing activities. The current NMME system consists of six model forecasts from U.S. and Canadian modeling centers, including the CFSv2, CM2.1, GEOS-5, CCSM3.0, CanCM3, and CanCM4 models. In this study, we conduct an assessment of the predictive skill of meteorological drought using real-time NMME forecasts for the period from May 2012 to May 2014. The ensemble SPI forecasts are the equally weighted mean of the six model forecasts. Two performance measures, the anomaly correlation coefficient and root-mean-square errors against the observations, are used to evaluate forecast skill. Similar to the assessment based on NMME retrospective forecasts, predictive skill of monthly-mean precipitation (P) forecasts is generally low after the second month and errors vary among models. Although P forecast skill is not large, SPI predictive skill is high and the differences among models are small. The skill mainly comes from the P observations appended to the model forecasts. This factor also contributes to the similarity of SPI prediction among the six models. Still, NMME SPI ensemble forecasts have higher skill than those based on individual models or persistence, and the 6-month SPI forecasts are skillful out to four months. The three major drought events that occurred during the 2012-2014 period, namely the 2012 Central Great Plains drought, the 2013 Upper Midwest flash drought, and the 2013-2014 California drought, are used as examples to illustrate the system's strengths and limitations. For precipitation-driven drought events, such as the 2012 Central Great Plains drought, NMME SPI forecasts perform well in predicting drought severity and spatial patterns. For fast-developing drought events, such as the 2013 Upper Midwest flash drought, the system failed to capture the onset of the drought.
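
    The SPI itself is obtained by fitting a distribution (commonly a gamma) to precipitation accumulations and mapping each value's cumulative probability to a standard normal deviate. A minimal sketch, ignoring the treatment of zero-precipitation months and the appending of observed precipitation to the forecasts described above:

```python
import numpy as np
from scipy import stats

def spi(precip_accum):
    """Standardized Precipitation Index for a sample of precipitation accumulations
    (e.g. 6-month sums for a given calendar month): fit a gamma distribution and map
    each value's cumulative probability onto a standard normal deviate."""
    x = np.asarray(precip_accum, dtype=float)
    shape, loc, scale = stats.gamma.fit(x, floc=0)      # fit with location fixed at zero
    cdf = stats.gamma.cdf(x, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)                          # SPI values (z-scores)

# Example: an unusually dry year stands out as a strongly negative SPI.
rng = np.random.default_rng(5)
hist = rng.gamma(4.0, 50.0, size=60)                    # 60 years of 6-month totals
hist[-1] = 40.0                                         # a very dry final year
print(spi(hist)[-1])
```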

  6. THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATION

    EPA Science Inventory

    In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...

  7. Scattering and cloaking of binary hyper-particles in metamaterials.

    PubMed

    Alexopoulos, A; Yau, K S B

    2010-09-13

    We derive the d-dimensional scattering cross section for homogeneous and composite hyper-particles inside a metamaterial. The polarizability of the hyper-particles is expressed in multi-dimensional form and is used in order to examine various scattering characteristics. We introduce scattering bounds that display interesting results when d --> ∞ and in particular consider the special limit of hyper-particle cloaking in some detail. We demonstrate cloaking via resonance for homogeneous particles and show that composite hyper-particles can be used in order to obtain electromagnetic cloaking with either negative or all positive constitutive parameters respectively. Our approach not only considers cloaking of particles of integer dimension but also particles with non-integer dimension such as fractals. Theoretical results are compared to full-wave numerical simulations for two interacting hyper-particles in a medium.

  8. Can we use Earth Observations to improve monthly water level forecasts?

    NASA Astrophysics Data System (ADS)

    Slater, L. J.; Villarini, G.

    2017-12-01

    Dynamical-statistical hydrologic forecasting approaches benefit from different strengths in comparison with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and `learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g. multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.

  9. Multi-Dimensional Calibration of Impact Dynamic Models

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Annett, Martin S.; Jackson, Karen E.

    2011-01-01

    NASA Langley, under the Subsonic Rotary Wing Program, recently completed two helicopter tests in support of an in-house effort to study crashworthiness. As part of this effort, work is ongoing to investigate model calibration approaches and calibration metrics for impact dynamics models. Model calibration of impact dynamics problems has traditionally assessed model adequacy by comparing time histories from analytical predictions to test data at only a few critical locations. Although this approach provides for a direct measure of the model predictive capability, overall system behavior is only qualitatively assessed using full vehicle animations. In order to understand the spatial and temporal relationships of impact loads as they migrate throughout the structure, a more quantitative approach is needed. In this work, impact shapes derived from simulated time history data are used to recommend sensor placement and to assess model adequacy using time-based metrics and multi-dimensional orthogonality metrics. An approach for model calibration is presented that includes metric definitions, uncertainty bounds, parameter sensitivity, and numerical optimization to estimate parameters to reconcile test with analysis. The process is illustrated using simulated experiment data.

  10. Constraining primordial non-Gaussianity with bispectrum and power spectrum from upcoming optical and radio surveys

    NASA Astrophysics Data System (ADS)

    Karagiannis, Dionysios; Lazanu, Andrei; Liguori, Michele; Raccanelli, Alvise; Bartolo, Nicola; Verde, Licia

    2018-07-01

    We forecast constraints on primordial non-Gaussianity (PNG) and bias parameters from measurements of galaxy power spectrum and bispectrum in future radio continuum and optical surveys. In the galaxy bispectrum, we consider a comprehensive list of effects, including the bias expansion for non-Gaussian initial conditions up to second order, redshift space distortions, redshift uncertainties and theoretical errors. These effects are all combined in a single PNG forecast for the first time. Moreover, we improve the bispectrum modelling over previous forecasts, by accounting for trispectrum contributions. All effects have an impact on final predicted bounds, which varies with the type of survey. We find that the bispectrum can lead to improvements up to a factor ˜5 over bounds based on the power spectrum alone, leading to significantly better constraints for local-type PNG, with respect to current limits from Planck. Future radio and photometric surveys could obtain a measurement error of σ (f_{NL}^{loc}) ≈ 0.2. In the case of equilateral PNG, galaxy bispectrum can improve upon present bounds only if significant improvements in the redshift determinations of future, large volume, photometric or radio surveys could be achieved. For orthogonal non-Gaussianity, expected constraints are generally comparable to current ones.

  11. Constraining Primordial non-Gaussianity with Bispectrum and Power Spectrum from Upcoming Optical and Radio Surveys

    NASA Astrophysics Data System (ADS)

    Karagiannis, Dionysios; Lazanu, Andrei; Liguori, Michele; Raccanelli, Alvise; Bartolo, Nicola; Verde, Licia

    2018-04-01

    We forecast constraints on primordial non-Gaussianity (PNG) and bias parameters from measurements of galaxy power spectrum and bispectrum in future radio continuum and optical surveys. In the galaxy bispectrum, we consider a comprehensive list of effects, including the bias expansion for non-Gaussian initial conditions up to second order, redshift space distortions, redshift uncertainties and theoretical errors. These effects are all combined in a single PNG forecast for the first time. Moreover, we improve the bispectrum modelling over previous forecasts, by accounting for trispectrum contributions. All effects have an impact on final predicted bounds, which varies with the type of survey. We find that the bispectrum can lead to improvements up to a factor ˜5 over bounds based on the power spectrum alone, leading to significantly better constraints for local-type PNG, with respect to current limits from Planck. Future radio and photometric surveys could obtain a measurement error of σ (f_{NL}^{loc}) ≈ 0.2. In the case of equilateral PNG, galaxy bispectrum can improve upon present bounds only if significant improvements in the redshift determinations of future, large volume, photometric or radio surveys could be achieved. For orthogonal non-Gaussianity, expected constraints are generally comparable to current ones.

  12. Multi-parametric variational data assimilation for hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at representing meteorological uncertainty but neglects the uncertainty of the hydrological model as well as of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to derive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier Skill Score with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
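
    The abstract reports CRPS improvements; as a hedged illustration only, the sketch below estimates the CRPS of a single ensemble forecast with the standard energy-form estimator. The ensemble values and observation are made up and this is not the evaluation code used in the study.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Estimate the CRPS of an ensemble forecast for one observation.

    Uses the energy form CRPS = E|X - y| - 0.5 * E|X - X'|,
    where X, X' are independent draws from the ensemble.
    """
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

# Illustrative use: lower CRPS indicates a sharper, better-centred ensemble.
ensemble = [12.1, 13.4, 11.8, 12.9, 14.0]   # streamflow members (m^3/s), made up
observed = 12.5
print(crps_ensemble(ensemble, observed))
```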

  13. Beyond the continuum: a multi-dimensional phase space for neutral-niche community assembly.

    PubMed

    Latombe, Guillaume; Hui, Cang; McGeoch, Melodie A

    2015-12-22

    Neutral and niche processes are generally considered to interact in natural communities along a continuum, exhibiting community patterns bounded by pure neutral and pure niche processes. The continuum concept uses niche separation, an attribute of the community, to test the hypothesis that communities are bounded by pure niche or pure neutral conditions. It does not accommodate interactions via feedback between processes and the environment. By contrast, we introduce the Community Assembly Phase Space (CAPS), a multi-dimensional space that uses community processes (such as dispersal and niche selection) to define the limiting neutral and niche conditions and to test the continuum hypothesis. We compare the outputs of modelled communities in a heterogeneous landscape, assembled by pure neutral, pure niche and composite processes. Differences in patterns under different combinations of processes in CAPS reveal hidden complexity in neutral-niche community dynamics. The neutral-niche continuum only holds for strong dispersal limitation and niche separation. For weaker dispersal limitation and niche separation, neutral and niche processes amplify each other via feedback with the environment. This generates patterns that lie well beyond those predicted by a continuum. Inferences drawn from patterns about community assembly processes can therefore be misguided when based on the continuum perspective. CAPS also demonstrates the complementary information value of different patterns for inferring community processes and captures the complexity of community assembly. It provides a general tool for studying the processes structuring communities and can be applied to address a range of questions in community and metacommunity ecology. © 2015 The Author(s).

  14. Beyond the continuum: a multi-dimensional phase space for neutral–niche community assembly

    PubMed Central

    Latombe, Guillaume; McGeoch, Melodie A.

    2015-01-01

    Neutral and niche processes are generally considered to interact in natural communities along a continuum, exhibiting community patterns bounded by pure neutral and pure niche processes. The continuum concept uses niche separation, an attribute of the community, to test the hypothesis that communities are bounded by pure niche or pure neutral conditions. It does not accommodate interactions via feedback between processes and the environment. By contrast, we introduce the Community Assembly Phase Space (CAPS), a multi-dimensional space that uses community processes (such as dispersal and niche selection) to define the limiting neutral and niche conditions and to test the continuum hypothesis. We compare the outputs of modelled communities in a heterogeneous landscape, assembled by pure neutral, pure niche and composite processes. Differences in patterns under different combinations of processes in CAPS reveal hidden complexity in neutral–niche community dynamics. The neutral–niche continuum only holds for strong dispersal limitation and niche separation. For weaker dispersal limitation and niche separation, neutral and niche processes amplify each other via feedback with the environment. This generates patterns that lie well beyond those predicted by a continuum. Inferences drawn from patterns about community assembly processes can therefore be misguided when based on the continuum perspective. CAPS also demonstrates the complementary information value of different patterns for inferring community processes and captures the complexity of community assembly. It provides a general tool for studying the processes structuring communities and can be applied to address a range of questions in community and metacommunity ecology. PMID:26702047

  15. Universal bounds on charged states in 2d CFT and 3d gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin, Nathan; Dyer, Ethan; Fitzpatrick, A. Liam

    2016-08-04

    We derive an explicit bound on the dimension of the lightest charged state in two dimensional conformal field theories with a global abelian symmetry. We find that the bound scales with c and provide examples that parametrically saturate this bound. We also prove that any such theory must contain a state with charge-to-mass ratio above a minimal lower bound. As a result, we comment on the implications for charged states in three dimensional theories of gravity.

  16. A fast numerical method for the valuation of American lookback put options

    NASA Astrophysics Data System (ADS)

    Song, Haiming; Zhang, Qi; Zhang, Ran

    2015-10-01

    A fast and efficient numerical method is proposed and analyzed for the valuation of American lookback options. The American lookback option pricing problem is essentially a two-dimensional unbounded nonlinear parabolic problem. We reformulate it into a two-dimensional parabolic linear complementarity problem (LCP) on an unbounded domain. The numeraire transformation and a domain truncation technique are employed to convert the two-dimensional unbounded LCP into a one-dimensional bounded one. Furthermore, the variational inequality (VI) form corresponding to the one-dimensional bounded LCP is then derived. The resulting bounded VI is discretized by a finite element method. Meanwhile, the stability of the semi-discrete solution and the symmetric positive definiteness of the full-discrete matrix are established for the bounded VI. The discretized VI related to options is solved by a projection and contraction method. Numerical experiments are conducted to test the performance of the proposed method.
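
    The paper solves the discretized variational inequality with a projection and contraction method. As a simpler, plainly substituted stand-in, the sketch below applies projected successive over-relaxation (PSOR) to a generic linear complementarity problem w = Mz + q, w >= 0, z >= 0, z^T w = 0; the matrix and right-hand side are illustrative, not the option-pricing discretization.

```python
import numpy as np

def psor_lcp(M, q, omega=1.2, tol=1e-10, max_iter=10_000):
    """Projected SOR for the LCP: find z >= 0 with Mz + q >= 0 and z'(Mz + q) = 0."""
    n = len(q)
    z = np.zeros(n)
    for _ in range(max_iter):
        z_old = z.copy()
        for i in range(n):
            residual = q[i] + M[i, :] @ z        # uses already-updated entries j < i
            z[i] = max(0.0, z[i] - omega * residual / M[i, i])
        if np.linalg.norm(z - z_old, np.inf) < tol:
            break
    return z

# Illustrative symmetric positive definite tridiagonal matrix (e.g. from a 1D FEM grid).
n = 5
M = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
q = np.array([-1.0, -0.5, 0.2, 0.4, 0.6])
print(psor_lcp(M, q))
```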

  17. An operational mesoscale ensemble data assimilation and prediction system: E-RTFDDA

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hopson, T.; Roux, G.; Hacker, J.; Xu, M.; Warner, T.; Swerdlin, S.

    2009-04-01

    Mesoscale (2-2000 km) meteorological processes differ from synoptic circulations in that mesoscale weather changes rapidly in space and time, and physics processes that are parameterized in NWP models play a great role. Complex interactions of synoptic circulations, regional and local terrain, land-surface heterogeneity, and associated physical properties, and the physical processes of radiative transfer, cloud and precipitation and boundary layer mixing, are crucial in shaping regional weather and climate. Mesoscale ensemble analysis and prediction should sample the uncertainties of mesoscale modeling systems in representing these factors. An innovative mesoscale Ensemble Real-Time Four Dimensional Data Assimilation (E-RTFDDA) and forecasting system has been developed at NCAR. E-RTFDDA contains diverse ensemble perturbation approaches that consider uncertainties in all major system components to produce multi-scale continuously-cycling probabilistic data assimilation and forecasting. A 30-member E-RTFDDA system with three nested domains with grid sizes of 30, 10 and 3.33 km has been running on a Department of Defense high-performance computing platform since September 2007. It has been applied at two very different US geographical locations; one in the western inter-mountain area and the other in the northeastern states, producing 6 hour analyses and 48 hour forecasts, with 4 forecast cycles a day. The operational model outputs are analyzed to a) assess overall ensemble performance and properties, b) study terrain effect on mesoscale predictability, c) quantify the contribution of different ensemble perturbation approaches to the overall forecast skill, and d) assess the additional contributed skill from an ensemble calibration process based on a quantile-regression algorithm. The system and the results will be reported at the meeting.

  18. Calibration and combination of dynamical seasonal forecasts to enhance the value of predicted probabilities for managing risk

    NASA Astrophysics Data System (ADS)

    Dutton, John A.; James, Richard P.; Ross, Jeremy D.

    2013-06-01

    Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service, the European Centre for Medium Range Weather Forecasts, and to a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than the original models taken alone. The potential value of the multimodel in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts. Some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion of the article is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
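
    As a hedged illustration of converting ensemble forecasts into impact variables, the sketch below turns a synthetic ensemble of daily temperatures into a heating degree-day distribution and an exceedance probability for a notional strike. Base temperature, strike and data are assumptions, not values from the article.

```python
import numpy as np

def heating_degree_days(daily_temps, base=18.0):
    """Heating degree days for one member: sum of max(base - T, 0) over the period."""
    t = np.asarray(daily_temps, dtype=float)
    return np.sum(np.maximum(base - t, 0.0))

# Synthetic 30-member ensemble of daily mean temperatures for a 90-day season (deg C).
rng = np.random.default_rng(0)
ensemble = 10.0 + 3.0 * rng.standard_normal((30, 90))

hdd = np.array([heating_degree_days(member) for member in ensemble])

strike = 700.0  # illustrative strike of a degree-day contract
prob_exceed = np.mean(hdd > strike)
print(f"median HDD = {np.median(hdd):.0f}, P(HDD > {strike:.0f}) = {prob_exceed:.2f}")
```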

  19. Beyond Bounded Solutions

    ERIC Educational Resources Information Center

    Enzer, Selwyn

    1977-01-01

    Futures research offers new tools for forecasting and for designing alternative intervention strategies. Interactive cross-impact modeling is presented as a useful method for identifying future events. (Author/MV)

  20. An operational global ocean forecast system and its applications

    NASA Astrophysics Data System (ADS)

    Mehra, A.; Tolman, H. L.; Rivin, I.; Rajan, B.; Spindler, T.; Garraffo, Z. D.; Kim, H.

    2012-12-01

    A global Real-Time Ocean Forecast System (RTOFS) was implemented in operations at NCEP/NWS/NOAA on 10/25/2011. This system is based on an eddy-resolving 1/12 degree global HYCOM (HYbrid Coordinates Ocean Model) and is part of a larger national backbone capability of ocean modeling at NWS in strong partnership with the US Navy. The forecast system is run once a day and produces a 6 day long forecast using the daily initialization fields produced at NAVOCEANO using NCODA (Navy Coupled Ocean Data Assimilation), a 3D multi-variate data assimilation methodology. As configured within RTOFS, HYCOM has a horizontal equatorial resolution of 0.08 degrees or ~9 km. The HYCOM grid is on a Mercator projection from 78.64 S to 47 N and north of this it employs an Arctic dipole patch where the poles are shifted over land to avoid a singularity at the North Pole. This gives a mid-latitude (polar) horizontal resolution of approximately 7 km (3.5 km). The coastline is fixed at the 10 m isobath with open Bering Straits. This version employs 32 hybrid vertical coordinate surfaces with potential density referenced to 2000 m. Vertical coordinates can be isopycnals, often best for resolving deep water masses; levels of equal pressure (fixed depths), best for the well-mixed unstratified upper ocean; and sigma-levels (terrain-following), often the best choice in shallow water. The dynamic ocean model is coupled to a thermodynamic energy loan ice model and uses a non-slab mixed layer formulation. The forecast system is forced with 3-hourly momentum, radiation and precipitation fluxes from the operational Global Forecast System (GFS). Results include global sea surface height and three-dimensional fields of temperature, salinity, density and velocity used for validation and evaluation against available observations. Several downstream applications of this forecast system are also discussed, including search and rescue operations by the US Coast Guard, navigation safety information provided by OPC using real-time Global RTOFS surface-current guidance, operational guidance on radionuclide dispersion near Fukushima using 3D tracers, and boundary conditions for various operational coastal ocean forecast systems (COFS) run by NOS.

  1. SWIFT2: Software for continuous ensemble short-term streamflow forecasting for use in research and operations

    NASA Astrophysics Data System (ADS)

    Perraud, Jean-Michel; Bennett, James C.; Bridgart, Robert; Robertson, David E.

    2016-04-01

    Research undertaken through the Water Information Research and Development Alliance (WIRADA) has laid the foundations for continuous deterministic and ensemble short-term forecasting services. One output of this research is the software Short-term Water Information Forecasting Tools version 2 (SWIFT2). SWIFT2 is developed for use in research on short term streamflow forecasting techniques as well as operational forecasting services at the Australian Bureau of Meteorology. The variety of uses in research and operations requires a modular software system whose components can be arranged in applications that are fit for each particular purpose, without unnecessary software duplication. SWIFT2 modelling structures consist of sub-areas of hydrologic models, nodes and links with in-stream routing and reservoirs. While this modelling structure is customary, SWIFT2 is built from the ground up for computational and data intensive applications such as ensemble forecasts necessary for the estimation of the uncertainty in forecasts. Support for parallel computation on multiple processors or on a compute cluster is a primary use case. A convention is defined to store large multi-dimensional forecasting data and its metadata using the netCDF library. SWIFT2 is written in modern C++ with state of the art software engineering techniques and practices. A salient technical feature is a well-defined application programming interface (API) to facilitate access from different applications and technologies. SWIFT2 is already seamlessly accessible on Windows and Linux via packages in R, Python, Matlab and .NET languages such as C# and F#. Command line or graphical front-end applications are also feasible. This poster gives an overview of the technology stack, and illustrates the resulting features of SWIFT2 for users. Research and operational uses share the same common core C++ modelling shell for consistency, but augmented by different software modules suitable for each context. The accessibility via interactive modelling languages is particularly amenable to using SWIFT2 in exploratory research, with a dynamic and versatile experimental modelling workflow. This does not come at the expense of the stability and reliability required for use in operations, where only mature and stable components are used.
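
    The abstract does not spell out the netCDF convention, so the sketch below shows one plausible layout (dimensions lead_time, ens_member and catchment with a streamflow variable and units metadata) written with the Python netCDF4 package, purely to illustrate how multi-dimensional ensemble forecast data and metadata can be stored; SWIFT2's actual convention may differ.

```python
import numpy as np
from netCDF4 import Dataset

# Hypothetical layout; the convention actually used by SWIFT2 may differ.
n_lead, n_ens, n_catch = 168, 100, 3   # hourly leads, ensemble members, catchments

with Dataset("ensemble_forecast.nc", "w") as nc:
    nc.createDimension("lead_time", n_lead)
    nc.createDimension("ens_member", n_ens)
    nc.createDimension("catchment", n_catch)

    lead = nc.createVariable("lead_time", "i4", ("lead_time",))
    lead.units = "hours since forecast issue time"
    lead[:] = np.arange(n_lead)

    q = nc.createVariable("streamflow", "f4",
                          ("lead_time", "ens_member", "catchment"),
                          zlib=True)
    q.units = "m3 s-1"
    q.long_name = "ensemble streamflow forecast"
    q[:] = np.random.default_rng(1).gamma(2.0, 5.0, (n_lead, n_ens, n_catch))

    nc.forecast_issue_time = "2016-04-01T00:00:00Z"   # global metadata attribute
```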

  2. Minimizers with Bounded Action for the High-Dimensional Frenkel-Kontorova Model

    NASA Astrophysics Data System (ADS)

    Miao, Xue-Qing; Wang, Ya-Nan; Qin, Wen-Xin

    In Aubry-Mather theory for monotone twist maps or for one-dimensional Frenkel-Kontorova (FK) model with nearest neighbor interactions, each global minimizer (minimal energy configuration) is naturally Birkhoff. However, this is not true for the one-dimensional FK model with non-nearest neighbor interactions or for the high-dimensional FK model. In this paper, we study the Birkhoff property of minimizers with bounded action for the high-dimensional FK model.

  3. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty is strongly dependent on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics:
    1. Upstream catchments with high influence of the weather forecast.
    a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing.
    b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely.
    c) The 'model error' distribution, with parameters dependent on hydrological case and lead time, is added to each forecast timestep of each ensemble member.
    d) For each forecast timestep, the overall error distribution (i.e. over the 'model error' distributions of all ensemble members) is calculated.
    e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as the forecast envelope.
    f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice.
    2. Downstream catchments with low influence of the weather forecast.
    In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently:
    a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally, the corresponding inflow hydrographs from all upstream catchments must be used.
    b) As for an upstream catchment, the uncertainty range is determined by the combination of the 'model error' and the ensemble member forecasts.
    c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments.
    d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range.
    In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles.
As in part I of this study, the methodology as well as the useful- or uselessness of the resulting uncertainty ranges will be presented and discussed by typical examples.
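
    A minimal sketch of steps c) to e) for an upstream catchment: a model-error distribution is superimposed on every ensemble member and the 10% and 90% percentiles are extracted per forecast time step. The Gaussian error model and all numbers are illustrative assumptions, not the empirical error distributions used operationally.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hydrological ensemble: 16 members, 48 hourly forecast time steps (m^3/s).
n_members, n_steps = 16, 48
ensemble = 50.0 + rng.standard_normal((n_members, n_steps)).cumsum(axis=1)

# Illustrative lead-time dependent 'model error' standard deviation (grows with lead time).
model_error_std = 2.0 + 0.1 * np.arange(n_steps)

# Sample the model-error distribution around every member and pool the samples.
n_draws = 200
errors = rng.standard_normal((n_members, n_draws, n_steps)) * model_error_std
pooled = (ensemble[:, None, :] + errors).reshape(-1, n_steps)

# Overall uncertainty envelope per forecast time step.
lower = np.percentile(pooled, 10, axis=0)
upper = np.percentile(pooled, 90, axis=0)
print(lower[:5], upper[:5])
```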

  4. Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction

    NASA Technical Reports Server (NTRS)

    Li, Zhijin; Chao, Yi; Li, P. Peggy

    2012-01-01

    A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated and the associated software system has been developed for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill, and has been used in support of operational coastal ocean forecasting systems and field experiments. The system has been developed to improve the capability of data assimilation for assimilating, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows large scales and model bias to be effectively constrained by assimilating sparse vertical profiles, and small scales by assimilating high-resolution surface measurements. The MS-3DVAR thus enhances the capability of the traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and in particular maximizes the extraction of information from a limited number of vertical profile observations.

  5. Gambling scores for earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
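
    A hedged sketch of the binary-alarm case under one natural fair-odds rule: if the reference model assigns probability p0 to the event and the forecaster bets r reputation points on its occurrence, the forecaster gains r(1 - p0)/p0 if the event occurs (zero expected gain under the reference model) and loses r otherwise. The rule and numbers are assumptions for illustration; the paper's exact formulation may differ.

```python
def gambling_score(bets, outcomes, ref_probs, start_points=100.0):
    """Accumulate reputation points over a sequence of binary alarms.

    bets      : points wagered on 'event occurs' for each forecast
    outcomes  : 1 if the event occurred, 0 otherwise
    ref_probs : reference-model probability of the event for each forecast
    """
    points = start_points
    for r, y, p0 in zip(bets, outcomes, ref_probs):
        if y == 1:
            points += r * (1.0 - p0) / p0   # fair odds set by the reference model
        else:
            points -= r
    return points

# Illustrative example: three alarms issued against a low-probability reference model.
print(gambling_score(bets=[5, 5, 5], outcomes=[1, 0, 1], ref_probs=[0.1, 0.1, 0.2]))
```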

  6. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Based on the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank Copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank Copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with a short return period, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
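
    A hedged sketch of a bivariate Frank copula and the 'OR' joint return period T = mu / (1 - C(u, v)), where mu is the mean interarrival time of events and u, v are marginal non-exceedance probabilities of two hazard factors. The copula parameter and inputs are illustrative, not values fitted to the dust storm data.

```python
import numpy as np

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v) for theta != 0."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    den = np.exp(-theta) - 1.0
    return -np.log(1.0 + num / den) / theta

def joint_return_period_or(u, v, theta, mu_years):
    """Return period of the event {U > u or V > v}: T = mu / (1 - C(u, v))."""
    return mu_years / (1.0 - frank_copula(u, v, theta))

# Illustrative values: events occur about 4 times per year (mu = 0.25 yr)
# and both hazard factors sit at their 95th percentile.
theta = 5.0          # hypothetical dependence parameter
print(joint_return_period_or(0.95, 0.95, theta, mu_years=0.25))
```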

  7. Dynamical-statistical seasonal prediction for western North Pacific typhoons based on APCC multi-models

    NASA Astrophysics Data System (ADS)

    Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi

    2017-01-01

    This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and the large-scale key predictors forecasted by the APCC MME for the same season. The cross-validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observations for 1982-2008. The cross validation from the hybrid model with the individual models participating in the MME indicates that there is no single model which consistently outperforms the others in predicting typhoon number. Although the forecast skill of the MME is not always the highest compared to that of each individual model, the MME presents a rather higher averaged correlation and a smaller variance of correlations. Given the large set of ensemble members from the multi-model, a relative operating characteristic score reveals an 82 % (above-normal) and 78 % (below-normal) improvement for the probabilistic prediction of the number of TY. It implies that there is an 82 % (78 %) probability that the forecasts can successfully discriminate above-normal (below-normal) years from other years. The hybrid model forecast for the past 7 years (2002-2008) is more skillful than the forecast from the Tropical Storm Risk consortium. Using the large set of ensemble members from the multi-model, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to end-users, in particular the residents of tropical cyclone-prone areas in the Asia-Pacific region.
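
    A minimal sketch of the dynamical-statistical idea: regress observed seasonal typhoon counts on MME-forecast large-scale predictors and evaluate with leave-one-out cross-validation. The data are synthetic and the predictors and regression form in the study may differ.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic record: 27 seasons, 2 MME-forecast predictors (e.g. SST and shear indices).
n_years = 27
X = rng.standard_normal((n_years, 2))
y = np.round(25 + 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_normal(n_years)).astype(float)

def loo_forecasts(X, y):
    """Leave-one-out hindcasts from an ordinary least-squares hybrid model."""
    preds = np.empty_like(y)
    for k in range(len(y)):
        idx = np.arange(len(y)) != k
        A = np.column_stack([np.ones(idx.sum()), X[idx]])
        coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
        preds[k] = np.array([1.0, *X[k]]) @ coef
    return preds

hindcasts = loo_forecasts(X, y)
print("cross-validated correlation:", np.corrcoef(hindcasts, y)[0, 1])
```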

  8. Strong polygamy of quantum correlations in multi-party quantum systems

    NASA Astrophysics Data System (ADS)

    Kim, Jeong San

    2014-10-01

    We propose a new type of polygamy inequality for multi-party quantum entanglement. We first consider the possible amount of bipartite entanglement distributed between a fixed party and any subset of the remaining parties in a multi-party quantum system. By using the summation of these distributed entanglements, we provide an upper bound on the entanglement distributed between one party and the rest in multi-party quantum systems. We then show that this upper bound also serves as a lower bound of the usual polygamy inequality, establishing the strong polygamy of multi-party quantum entanglement. For the case of multi-party pure states, we further show that the strong polygamy of entanglement implies the strong polygamy of quantum discord.

  9. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.

    2008-12-01

    The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: can regional climate models provide additional useful information from global seasonal forecasts? MRED will use a suite of regional climate models to downscale seasonal forecasts produced by the new National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) seasonal forecast system and the NASA GEOS5 system. The initial focus will be on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the potential usefulness of higher resolution, especially for near-surface fields influenced by high-resolution orography. Each regional model will cover the conterminous US (CONUS) at approximately 32 km resolution, and will perform an ensemble of 15 runs for each year 1982-2003 for the forecast period 1 December - 30 April. MRED will compare individual regional and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs), along with wind, humidity, radiation and turbulent heat fluxes, which are important for more advanced coupled macro-scale hydrologic models. Metrics of ensemble spread will also be evaluated. Extensive analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high-resolution regional models, which we believe will eventually define a strategy for more skillful and useful regional seasonal climate forecasts.

  10. Gas demand forecasting by a new artificial intelligent algorithm

    NASA Astrophysics Data System (ADS)

    Khatibi. B, Vahid; Khatibi, Elham

    2012-01-01

    Energy demand forecasting is a key issue for consumers and generators in all energy markets in the world. This paper presents a new forecasting algorithm for daily gas demand prediction. The algorithm combines a wavelet transform with forecasting models such as a multi-layer perceptron (MLP), linear regression or GARCH. The proposed method is applied to real data from the UK gas markets to evaluate its performance. The results show that the forecasting accuracy is improved significantly by using the proposed method.
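
    A hedged sketch of the general wavelet-hybrid idea: decompose the demand series into same-length components with a discrete wavelet transform (PyWavelets), forecast each component with a simple AR(1) model, and sum the component forecasts. The wavelet, decomposition level and AR(1) choice are assumptions for illustration, not the configuration used in the paper.

```python
import numpy as np
import pywt

def wavelet_components(x, wavelet="db4", level=2):
    """Split x into same-length components (one per wavelet level) that sum to x."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[: len(x)])
    return comps

def ar1_forecast(series):
    """One-step AR(1) forecast fitted by least squares."""
    slope, intercept = np.polyfit(series[:-1], series[1:], 1)
    return intercept + slope * series[-1]

# Synthetic daily gas demand with weekly periodicity and noise.
rng = np.random.default_rng(3)
t = np.arange(365)
demand = 100 + 10 * np.sin(2 * np.pi * t / 7) + rng.standard_normal(365)

forecast = sum(ar1_forecast(c) for c in wavelet_components(demand))
print("next-day demand forecast:", round(forecast, 1))
```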

  11. The Bound to Bound State Contribution to the Electric Polarizability of a Relativistic Particle

    NASA Astrophysics Data System (ADS)

    Vidnovic, Theodore, III; Anis Maize, Mohamed

    1998-04-01

    We calculate, in our study, the contribution of the transitions between bound energy states to the electric polarizability of a relativistic particle. The particle is moving under the influence of a one-dimensional delta potential. Our work is done in the case of the scalar potential. The solution of Dirac's equation and the calculation of the particle's total electric polarizability have been done in references (1-3). The transitions contributing to the electric polarizability are: continuum to continuum, bound to bound, negative energy bound states to continuum, and positive energy bound states to continuum. Our task is to study the bound to bound state contribution to the electric polarizability. We will also investigate the effect of the strength of the potential on this contribution. 1. T.H. Solomon and S. Fallieros, "Relativistic One Dimensional Binding and Two Dimensional Motion," J. Franklin Inst. 320, 323-344 (1985). 2. M.A. Maize and C.A. Burkholder, "Electric Polarizability and the Solution of an Inhomogeneous Differential Equation," Am. J. Phys. 63, 244-247 (1995). 3. M.A. Maize, S. Paulson, and A. D'Avanti, "Electric Polarizability of a Relativistic Particle," Am. J. Phys. 65, 888-892 (1997).

  12. Multi-step-ahead crude oil price forecasting using a hybrid grey wave model

    NASA Astrophysics Data System (ADS)

    Chen, Yanhui; Zhang, Chuan; He, Kaijian; Zheng, Aibing

    2018-07-01

    Crude oil is crucial to the operation and economic well-being of modern society. Large changes in the crude oil price cause panic in the global economy, and many factors influence the crude oil price. Crude oil price prediction is still a difficult research problem widely discussed among researchers. Based on research on the Heterogeneous Market Hypothesis and the relationship between the crude oil price and macroeconomic factors, the exchange market and the stock market, this paper proposes a hybrid grey wave forecasting model, combined with Random Walk (RW)/ARMA, to forecast the crude oil price multiple steps ahead. More specifically, we use the grey wave forecasting model to capture the periodical characteristics of the crude oil price and ARMA/RW to simulate the daily random movements. The innovation also comes from using the information in the time series graph to forecast the crude oil price, since grey wave forecasting is a graphical prediction method. The empirical results demonstrate that, based on daily crude oil price data, the hybrid grey wave forecasting model performs well in 15- to 20-step-ahead prediction and consistently dominates ARMA and Random Walk in correct direction prediction.

  13. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.

  14. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead time of 14 months. The nature of this skill is discussed, and opportunities for application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
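
    A hedged sketch of kernel dressing and climatology blending: each ensemble member is replaced by a Gaussian kernel and the resulting mixture is blended with a climatological Gaussian using a weight alpha. Bandwidth, weight and data are illustrative choices, not those fitted in the ENSEMBLES study.

```python
import numpy as np
from scipy.stats import norm

def dressed_blended_cdf(x, members, sigma, clim_mean, clim_std, alpha):
    """CDF of a kernel-dressed ensemble blended with climatology.

    alpha is the weight on the dressed ensemble, (1 - alpha) on climatology.
    """
    members = np.asarray(members, dtype=float)
    dressed = np.mean(norm.cdf(x, loc=members, scale=sigma))
    clim = norm.cdf(x, loc=clim_mean, scale=clim_std)
    return alpha * dressed + (1.0 - alpha) * clim

# Illustrative Nino 3.4 temperature anomaly forecast (deg C).
members = [0.8, 1.1, 0.9, 1.3, 0.7]
p_above_half = 1.0 - dressed_blended_cdf(0.5, members, sigma=0.3,
                                         clim_mean=0.0, clim_std=0.8, alpha=0.7)
print(f"P(anomaly > 0.5) = {p_above_half:.2f}")
```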

  15. An operational ensemble prediction system for catchment rainfall over eastern Africa spanning multiple temporal and spatial scales

    NASA Astrophysics Data System (ADS)

    Riddle, E. E.; Hopson, T. M.; Gebremichael, M.; Boehnert, J.; Broman, D.; Sampson, K. M.; Rostkier-Edelstein, D.; Collins, D. C.; Harshadeep, N. R.; Burke, E.; Havens, K.

    2017-12-01

    While it is not yet certain how precipitation patterns will change over Africa in the future, it is clear that effectively managing the available water resources is going to be crucial in order to mitigate the effects of water shortages and floods that are likely to occur in a changing climate. One component of effective water management is the availability of state-of-the-art and easy to use rainfall forecasts across multiple spatial and temporal scales. We present a web-based system for displaying and disseminating ensemble forecast and observed precipitation data over central and eastern Africa. The system provides multi-model rainfall forecasts integrated to relevant hydrological catchments for timescales ranging from one day to three months. A zoom-in feature is available to access high-resolution forecasts for small-scale catchments. Time series plots and data downloads with forecasts, recent rainfall observations and climatological data are available by clicking on individual catchments. The forecasts are calibrated using a quantile regression technique and an optimal multi-model forecast is provided at each timescale. The forecast skill at the various spatial and temporal scales will be discussed, as will current applications of this tool for managing water resources in Sudan and optimizing hydropower operations in Ethiopia and Tanzania.
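
    A hedged sketch of calibrating a raw ensemble-mean rainfall forecast with quantile regression (statsmodels), mapping the raw forecast to selected quantiles of the observations. Predictors, quantiles and data are synthetic; the operational implementation may differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)

# Synthetic training pairs: raw ensemble-mean forecast vs. observed catchment rainfall (mm).
raw = rng.gamma(2.0, 10.0, 300)
obs = 0.8 * raw + rng.gamma(2.0, 4.0, 300)

X = sm.add_constant(raw)
calibrated = {}
for q in (0.1, 0.5, 0.9):
    fit = sm.QuantReg(obs, X).fit(q=q)   # one regression per forecast quantile
    calibrated[q] = fit.params

# Apply the calibration to a new raw forecast of 25 mm.
new_raw = np.array([1.0, 25.0])
for q, params in calibrated.items():
    print(f"q={q:.1f}: {new_raw @ params:.1f} mm")
```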

  16. Development of a multi-ensemble Prediction Model for China

    NASA Astrophysics Data System (ADS)

    Brasseur, G. P.; Bouarar, I.; Petersen, A. K.

    2016-12-01

    As part of the EU-sponsored Panda and MarcoPolo Projects, a multi-model prediction system including 7 models has been developed. Most regional models use global air quality predictions provided by the Copernicus Atmospheric Monitoring Service and downscale the forecast at relatively high spatial resolution in eastern China. The paper will describe the forecast system and show examples of forecasts produced for several Chinese urban areas and displayed on a web site developed by the Dutch Meteorological service. A discussion on the accuracy of the predictions based on a detailed validation process using surface measurements from the Chinese monitoring network will be presented.

  17. The North American Multi-Model Ensemble (NMME): Phase-1 Seasonal to Interannual Prediction, Phase-2 Toward Developing Intra-Seasonal Prediction

    NASA Technical Reports Server (NTRS)

    Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily; hide

    2013-01-01

    The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has proven to produce better prediction quality (on average) than any single model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011) a collaborative and coordinated implementation strategy for a NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and discusses the complementary skill associated with individual models.

  18. An Overview of the National Weather Service National Water Model

    NASA Astrophysics Data System (ADS)

    Cosgrove, B.; Gochis, D.; Clark, E. P.; Cui, Z.; Dugger, A. L.; Feng, X.; Karsten, L. R.; Khan, S.; Kitzmiller, D.; Lee, H. S.; Liu, Y.; McCreight, J. L.; Newman, A. J.; Oubeidillah, A.; Pan, L.; Pham, C.; Salas, F.; Sampson, K. M.; Sood, G.; Wood, A.; Yates, D. N.; Yu, W.

    2016-12-01

    The National Weather Service (NWS) Office of Water Prediction (OWP), in conjunction with the National Center for Atmospheric Research (NCAR) and the NWS National Centers for Environmental Prediction (NCEP), recently implemented version 1.0 of the National Water Model (NWM) into operations. This model is an hourly cycling uncoupled analysis and forecast system that provides streamflow for 2.7 million river reaches and other hydrologic information on 1 km and 250 m grids. It will provide complementary hydrologic guidance at current NWS river forecast locations and significantly expand guidance coverage and type in underserved locations. The core of this system is the NCAR-supported community Weather Research and Forecasting (WRF)-Hydro hydrologic model. It ingests forcing from a variety of sources including Multi-Radar Multi-Sensor (MRMS) radar-gauge observed precipitation data and High Resolution Rapid Refresh (HRRR), Rapid Refresh (RAP), Global Forecast System (GFS) and Climate Forecast System (CFS) forecast data. WRF-Hydro is configured to use the Noah-Multi Parameterization (Noah-MP) Land Surface Model (LSM) to simulate land surface processes. Separate water routing modules perform diffusive wave surface routing and saturated subsurface flow routing on a 250 m grid, and Muskingum-Cunge channel routing down National Hydrography Dataset Plus V2 (NHDPlusV2) stream reaches. River analyses and forecasts are provided across a domain encompassing the Continental United States (CONUS) and hydrologically contributing areas, while land surface output is available on a larger domain that extends beyond the CONUS into Canada and Mexico (roughly from latitude 19N to 58N). The system includes an analysis and assimilation configuration along with three forecast configurations. These include a short-range 15 hour deterministic forecast, a medium-range 10 day deterministic forecast and a long-range 30 day 16-member ensemble forecast. United States Geological Survey (USGS) streamflow observations are assimilated into the analysis and assimilation configuration, and all four configurations benefit from the inclusion of 1,260 reservoirs. An overview of the National Water Model will be given, along with information on ongoing evaluation activities and plans for future NWM enhancements.

  19. Building Reliable Forecasts of Solar Activity

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina; Wray, Alan; Mansour, Nagi

    2017-01-01

    Solar ionizing radiation critically depends on the level of the Sun’s magnetic activity. For robust physics-based forecasts, we employ the procedure of data assimilation, which combines theoretical modeling and observational data such that uncertainties in both the model and the observations are taken into account. Currently we are working in two major directions: 1) development of a new long-term forecast procedure on time-scales of the 11-year solar cycle, using a 2-dimensional mean-field dynamo model and synoptic magnetograms; 2) development of 3-dimensional radiative MHD (Magnetohydrodynamic) simulations to investigate the origin and precursors of local manifestations of magnetic activity, such as the formation of magnetic structures and eruptive dynamics.

  20. Superensemble forecasts of dengue outbreaks

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2016-01-01

    In recent years, a number of systems capable of predicting future infectious disease incidence have been developed. As more of these systems are operationalized, it is important that the forecasts generated by these different approaches be formally reconciled so that individual forecast error and bias are reduced. Here we present a first example of such a multi-system, or superensemble, forecast. We develop three distinct systems for predicting dengue, which are applied retrospectively to forecast outbreak characteristics in San Juan, Puerto Rico. We then use Bayesian averaging methods to combine the predictions from these systems and create superensemble forecasts. We demonstrate that, on average, the superensemble approach produces more accurate forecasts than those made from any of the individual forecasting systems. PMID:27733698
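
    A hedged sketch of one simple Bayesian-averaging scheme: each system is weighted by the Gaussian likelihood of its retrospective forecasts against observations, and new point forecasts are combined with those weights. The actual method applied to dengue outbreak characteristics in the paper is more elaborate.

```python
import numpy as np
from scipy.stats import norm

def bma_weights(retro_forecasts, retro_obs, sigma):
    """Posterior-style weights from per-system Gaussian log-likelihoods (uniform prior)."""
    loglik = np.array([
        norm.logpdf(retro_obs, loc=f, scale=sigma).sum() for f in retro_forecasts
    ])
    w = np.exp(loglik - loglik.max())       # stabilise before normalising
    return w / w.sum()

# Retrospective weekly case-count forecasts from three systems vs. observations (made up).
retro = np.array([[120, 140, 160, 150],     # system A
                  [100, 150, 170, 155],     # system B
                  [ 90, 120, 130, 120]])    # system C
obs = np.array([115, 142, 158, 148])

w = bma_weights(retro, obs, sigma=10.0)
new_forecasts = np.array([180.0, 175.0, 150.0])
print("weights:", np.round(w, 3), "superensemble forecast:", np.round(w @ new_forecasts, 1))
```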

  1. Satellite Imagery Analysis for Automated Global Food Security Forecasting

    NASA Astrophysics Data System (ADS)

    Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.

    2017-12-01

    The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high-rate and dimensionality of imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping, to yearly and monthly vegetative health change trends at the structural field level.

  2. Soil Moisture Initialization Error and Subgrid Variability of Precipitation in Seasonal Streamflow Forecasting

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.

    2013-01-01

    Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.

  3. Seasonal Drought Prediction in East Africa: Can National Multi-Model Ensemble Forecasts Help?

    NASA Technical Reports Server (NTRS)

    Shukla, Shraddhanand; Roberts, J. B.; Funk, Christopher; Robertson, F. R.; Hoell, Andrew

    2015-01-01

    The increasing food and water demands of East Africa's growing population are stressing the region's inconsistent water resources and rain-fed agriculture. As recently as 2011, part of this region underwent one of the worst famine events in its history. Timely and skillful drought forecasts at the seasonal scale for this region can inform better water and agro-pastoral management decisions, support optimal allocation of the region's water resources, and mitigate socio-economic losses incurred by droughts. However, seasonal drought prediction in this region faces several challenges. Lack of skillful seasonal rainfall forecasts, the focus of this presentation, is one of those major challenges. In the past few decades, major strides have been taken towards the improvement of seasonal-scale dynamical climate forecasts. The National Centers for Environmental Prediction's (NCEP) National Multi-model Ensemble (NMME) is one such state-of-the-art dynamical climate forecast system. The NMME incorporates climate forecasts from 6+ fully coupled dynamical models, resulting in 100+ ensemble member forecasts. Recent studies have indicated that in general NMME offers improvement over forecasts from any single model. However, thus far the skill of NMME in forecasting rainfall in a vulnerable region like East Africa has been unexplored. In this presentation we report findings of a comprehensive analysis that examines the strengths and weaknesses of NMME in forecasting rainfall at the seasonal scale in East Africa for all three of the prominent seasons for the region (i.e. March-April-May, July-August-September and October-November-December). Simultaneously, we also describe hybrid approaches, which combine statistical methods with NMME forecasts, to improve rainfall forecast skill in the region when raw NMME forecasts lack skill.

  4. Seasonal Drought Prediction in East Africa: Can National Multi-Model Ensemble Forecasts Help?

    NASA Technical Reports Server (NTRS)

    Shukla, Shraddhanand; Roberts, J. B.; Funk, Christopher; Robertson, F. R.; Hoell, Andrew

    2014-01-01

    The increasing food and water demands of East Africa's growing population are stressing the region's inconsistent water resources and rain-fed agriculture. As recently as 2011, part of this region underwent one of the worst famine events in its history. Timely and skillful drought forecasts at the seasonal scale for this region can inform better water and agro-pastoral management decisions, support optimal allocation of the region's water resources, and mitigate socio-economic losses incurred by droughts. However, seasonal drought prediction in this region faces several challenges. Lack of skillful seasonal rainfall forecasts, the focus of this presentation, is one of those major challenges. In the past few decades, major strides have been taken towards the improvement of seasonal-scale dynamical climate forecasts. The National Centers for Environmental Prediction's (NCEP) National Multi-model Ensemble (NMME) is one such state-of-the-art dynamical climate forecast system. The NMME incorporates climate forecasts from 6+ fully coupled dynamical models, resulting in 100+ ensemble member forecasts. Recent studies have indicated that in general NMME offers improvement over forecasts from any single model. However, thus far the skill of NMME in forecasting rainfall in a vulnerable region like East Africa has been unexplored. In this presentation we report findings of a comprehensive analysis that examines the strengths and weaknesses of NMME in forecasting rainfall at the seasonal scale in East Africa for all three of the prominent seasons for the region (i.e. March-April-May, July-August-September and October-November-December). Simultaneously, we also describe hybrid approaches, which combine statistical methods with NMME forecasts, to improve rainfall forecast skill in the region when raw NMME forecasts lack skill.

  5. A high-order multi-zone cut-stencil method for numerical simulations of high-speed flows over complex geometries

    NASA Astrophysics Data System (ADS)

    Greene, Patrick T.; Eldredge, Jeff D.; Zhong, Xiaolin; Kim, John

    2016-07-01

    In this paper, we present a method for performing uniformly high-order direct numerical simulations of high-speed flows over arbitrary geometries. The method was developed with the goal of simulating and studying the effects of complex isolated roughness elements on the stability of hypersonic boundary layers. The simulations are carried out on Cartesian grids with the geometries imposed by a third-order cut-stencil method. A fifth-order hybrid weighted essentially non-oscillatory scheme was implemented to capture any steep gradients in the flow created by the geometries and a third-order Runge-Kutta method is used for time advancement. A multi-zone refinement method was also utilized to provide extra resolution at locations with expected complex physics. The combination results in a globally fourth-order scheme in space and third order in time. Results confirming the method's high order of convergence are shown. Two-dimensional and three-dimensional test cases are presented and show good agreement with previous results. A simulation of Mach 3 flow over the logo of the Ubuntu Linux distribution is shown to demonstrate the method's capabilities for handling complex geometries. Results for Mach 6 wall-bounded flow over a three-dimensional cylindrical roughness element are also presented. The results demonstrate that the method is a promising tool for the study of hypersonic roughness-induced transition.

  6. A new accuracy measure based on bounded relative error for time series forecasting

    PubMed Central

    Twycross, Jamie; Garibaldi, Jonathan M.

    2017-01-01

    Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred. PMID:28339480
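
    As a rough illustration of the bounded-relative-error idea described in this abstract, the sketch below computes a bounded relative absolute error against a user-selected benchmark and then unscales its mean. The exact formula and the synthetic numbers are illustrative assumptions, not a reproduction of the paper's definition.

      import numpy as np

      def umbrae(actual, forecast, benchmark):
          """Unscaled mean bounded relative absolute error (illustrative definition)."""
          e = np.abs(np.asarray(actual) - np.asarray(forecast))        # candidate forecast errors
          e_star = np.abs(np.asarray(actual) - np.asarray(benchmark))  # benchmark forecast errors
          brae = e / (e + e_star)            # bounded in [0, 1] at every time step
          mbrae = brae.mean()
          return mbrae / (1.0 - mbrae)       # < 1 means better than the benchmark

      y = np.array([10.0, 12.0, 13.0, 11.5])     # made-up observations
      f = np.array([9.5, 12.5, 12.0, 11.0])      # forecast under evaluation
      b = np.array([10.0, 10.0, 12.0, 13.0])     # e.g. a naive random-walk benchmark
      print(umbrae(y, f, b))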

  7. A new accuracy measure based on bounded relative error for time series forecasting.

    PubMed

    Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M

    2017-01-01

    Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred.

  8. Optimization of autoregressive, exogenous inputs-based typhoon inundation forecasting models using a multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ouyang, Huei-Tau

    2017-07-01

    Three types of model for forecasting inundation levels during typhoons were optimized: the linear autoregressive model with exogenous inputs (LARX), the nonlinear autoregressive model with exogenous inputs with wavelet function (NLARX-W) and the nonlinear autoregressive model with exogenous inputs with sigmoid function (NLARX-S). Forecast performance was evaluated by three indices: the coefficient of efficiency, the error in peak water level and the relative time shift. Historical typhoon data were used to establish the water-level forecasting models, and a multi-objective genetic algorithm was employed to search for the Pareto-optimal model set that satisfies all three objectives and to select the ideal models for the three indices. Findings showed that the optimized nonlinear models (NLARX-W and NLARX-S) outperformed the linear model (LARX). Among the nonlinear models, the optimized NLARX-W model achieved a more balanced performance on the three indices than the NLARX-S model and is recommended for inundation forecasting during typhoons.
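
    A minimal sketch of the Pareto-selection step mentioned above is given below: candidate models scored on three indices are filtered to the non-dominated set. The three columns, the assumption that every objective is expressed so that smaller is better, and the toy scores are illustrative only.

      import numpy as np

      def pareto_front(scores):
          """Indices of non-dominated rows; every objective is to be minimised."""
          keep = []
          for i in range(len(scores)):
              dominated = any(np.all(scores[j] <= scores[i]) and np.any(scores[j] < scores[i])
                              for j in range(len(scores)) if j != i)
              if not dominated:
                  keep.append(i)
          return keep

      # toy scores; columns: (1 - coefficient of efficiency), |peak error|, |time shift|
      candidates = np.array([[0.10, 0.30, 1.0],
                             [0.15, 0.10, 2.0],
                             [0.20, 0.40, 3.0]])
      print(pareto_front(candidates))   # -> [0, 1]; the last candidate is dominated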

  9. Skill of real-time operational forecasts with the APCC multi-model ensemble prediction system during the period 2008-2015

    NASA Astrophysics Data System (ADS)

    Min, Young-Mi; Kryjov, Vladimir N.; Oh, Sang Myeong; Lee, Hyun-Ju

    2017-12-01

    This paper assesses the real-time 1-month lead forecasts of 3-month (seasonal) mean temperature and precipitation issued on a monthly basis by the Asia-Pacific Economic Cooperation Climate Center (APCC) for 2008-2015 (8 years, 96 forecasts). It shows the current level of performance of the APCC operational multi-model prediction system. The skill of the APCC forecasts depends strongly on season and region: it is higher for the tropics and boreal winter than for the extratropics and boreal summer, owing to direct effects of and remote teleconnections from boundary forcings. There is a negative relationship between the forecast skill and its interseasonal variability for both variables, and the forecast skill for precipitation is more seasonally and regionally dependent than that for temperature. The APCC operational probabilistic forecasts during this period show a cold bias (underforecasting of above-normal temperature and overforecasting of below-normal temperature), underestimating a long-term warming trend. A wet bias is evident for precipitation, particularly in the extratropical regions. The skill of both temperature and precipitation forecasts depends strongly on the ENSO strength. In particular, the highest forecast skill, noted in the 2015/2016 boreal winter, is associated with the strong forcing of an extreme El Nino event, whereas relatively low skill is associated with the transition and/or continuous ENSO-neutral phases of 2012-2014. As a result, the skill of the real-time forecasts for the boreal winter season is higher than that of the hindcast. On average, however, the level of forecast skill during the period 2008-2015 is similar to that of the hindcast.

  10. A systematic construction of microstate geometries with low angular momentum

    NASA Astrophysics Data System (ADS)

    Bena, Iosif; Heidmann, Pierre; Ramírez, Pedro F.

    2017-10-01

    We outline a systematic procedure to obtain horizonless microstate geometries that have the same charges as three-charge five-dimensional black holes with a macroscopically-large horizon area and an arbitrarily-small angular momentum. There are two routes through which such solutions can be constructed: using multi-center Gibbons-Hawking (GH) spaces or using superstratum technology. So far the only solutions corresponding to microstate geometries for black holes with no angular momentum have been obtained via superstrata [1], and multi-center Gibbons-Hawking spaces have been believed to give rise only to microstate geometries of BMPV black holes with a large angular momentum [2]. We perform a thorough search throughout the parameter space of smooth horizonless solutions with four GH centers and find that these have an angular momentum that is generally larger than 80% of the cosmic censorship bound. However, we find that solutions with three GH centers and one supertube (which are smooth in six-dimensional supergravity) can have an arbitrarily-low angular momentum. Our construction thus gives a recipe to build large classes of microstate geometries for zero-angular-momentum black holes without resorting to superstratum technology.

  11. Short-term leprosy forecasting from an expert opinion survey.

    PubMed

    Deiner, Michael S; Worden, Lee; Rittel, Alex; Ackley, Sarah F; Liu, Fengchen; Blum, Laura; Scott, James C; Lietman, Thomas M; Porco, Travis C

    2017-01-01

    We conducted an expert survey of leprosy (Hansen's Disease) and neglected tropical disease experts in February 2016. Experts were asked to forecast the next year of reported cases for the world, for the top three countries, and for selected states and territories of India. A total of 103 respondents answered at least one forecasting question. We elicited lower and upper confidence bounds. Comparing these results to regression and exponential smoothing, we found no evidence that any forecasting method outperformed the others. We found evidence that experts who believed it was more likely to achieve global interruption of transmission goals and disability reduction goals had higher error scores for India and Indonesia, but lower for Brazil. Even for a disease whose epidemiology changes on a slow time scale, forecasting exercises such as we conducted are simple and practical. We believe they can be used on a routine basis in public health.

  12. Short-term leprosy forecasting from an expert opinion survey

    PubMed Central

    Deiner, Michael S.; Worden, Lee; Rittel, Alex; Ackley, Sarah F.; Liu, Fengchen; Blum, Laura; Scott, James C.; Lietman, Thomas M.

    2017-01-01

    We conducted an expert survey of leprosy (Hansen’s Disease) and neglected tropical disease experts in February 2016. Experts were asked to forecast the next year of reported cases for the world, for the top three countries, and for selected states and territories of India. A total of 103 respondents answered at least one forecasting question. We elicited lower and upper confidence bounds. Comparing these results to regression and exponential smoothing, we found no evidence that any forecasting method outperformed the others. We found evidence that experts who believed it was more likely to achieve global interruption of transmission goals and disability reduction goals had higher error scores for India and Indonesia, but lower for Brazil. Even for a disease whose epidemiology changes on a slow time scale, forecasting exercises such as we conducted are simple and practical. We believe they can be used on a routine basis in public health. PMID:28813531

  13. Palm oil price forecasting model: An autoregressive distributed lag (ARDL) approach

    NASA Astrophysics Data System (ADS)

    Hamid, Mohd Fahmi Abdul; Shabri, Ani

    2017-05-01

    Palm oil prices have fluctuated without any clear trend or cyclical pattern in the last few decades. The instability of food commodity prices causes them to change rapidly over time. This paper attempts to develop an Autoregressive Distributed Lag (ARDL) model for modeling and forecasting the price of palm oil. In order to use the ARDL as a forecasting model, this paper modifies the data structure so that only lagged explanatory variables are considered to explain the variation in the palm oil price. We then compare the performance of this ARDL model with a benchmark model, namely ARIMA, in terms of their comparative forecasting accuracy. This paper also utilizes the ARDL bounds testing approach to co-integration in examining the short-run and long-run relationships between the palm oil price and its determinants: production, stock, the price of soybean (as a substitute for palm oil) and the price of crude oil. The comparative forecasting accuracy suggests that the ARDL model has better forecasting accuracy than ARIMA.
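
    The lagged-regressor structure described above can be sketched with ordinary least squares, as below. The lag orders, variable names and synthetic data are assumptions for illustration and not the specification estimated in the paper.

      import numpy as np

      def build_lagged(y, x, p=2, q=2):
          """Design matrix with p lags of y and q lags of each column of x."""
          start = max(p, q)
          rows, target = [], []
          for t in range(start, len(y)):
              row = [1.0]
              row += [y[t - k] for k in range(1, p + 1)]
              for j in range(x.shape[1]):
                  row += [x[t - k, j] for k in range(1, q + 1)]
              rows.append(row)
              target.append(y[t])
          return np.array(rows), np.array(target)

      rng = np.random.default_rng(0)
      x = rng.normal(size=(120, 2))                      # e.g. soybean and crude oil price series (synthetic)
      y = 50 + 0.5 * x[:, 0] + rng.normal(size=120)      # synthetic "palm oil price"
      X, target = build_lagged(y, x)
      beta, *_ = np.linalg.lstsq(X, target, rcond=None)  # OLS fit of the ARDL-style model
      print("one-step fit RMSE:", np.sqrt(np.mean((X @ beta - target) ** 2)))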

  14. Exact semiclassical expansions for one-dimensional quantum oscillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delabaere, E.; Dillinger, H.; Pham, F.

    1997-12-01

    A set of rules is given for dealing with WKB expansions in the one-dimensional analytic case, whereby such expansions are not considered as approximations but as exact encodings of wave functions, thus allowing for analytic continuation with respect to whichever parameters the potential function depends on, with an exact control of small exponential effects. These rules, which also include the case when there are double turning points, are illustrated on various examples, and applied to the study of bound state or resonance spectra. In the case of simple oscillators, it is thus shown that the Rayleigh-Schrödinger series is Borel resummable, yielding the exact energy levels. In the case of the symmetrical anharmonic oscillator, one gets a simple and rigorous justification of the Zinn-Justin quantization condition, and of its solution in terms of "multi-instanton expansions."

  15. Short-term wind speed prediction based on the wavelet transformation and Adaboost neural network

    NASA Astrophysics Data System (ADS)

    Hai, Zhou; Xiang, Zhu; Haijian, Shao; Ji, Wu

    2018-03-01

    The operation of the power grid will inevitably be affected by the increasing scale of wind farms, due to the inherent randomness and uncertainty of the wind, so accurate wind speed forecasting is critical for stable grid operation. Typically, traditional forecasting methods do not take into account the frequency characteristics of wind speed and therefore cannot reflect the nature of changes in the wind speed signal, which results in model structures with low generalization ability. An AdaBoost neural network, in combination with a multi-resolution and multi-scale decomposition of the wind speed, is proposed to design the model structure in order to improve forecasting accuracy and generalization ability. An experimental evaluation using data from a real wind farm in Jiangsu province is given to demonstrate that the proposed strategy improves the robustness and accuracy of the forecasts.
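
    A hedged sketch of the proposed strategy follows: the series is split into multi-resolution components with a discrete wavelet transform, one AdaBoost regressor is trained per component on lagged inputs, and the component forecasts are summed. The wavelet family, decomposition level, lag length and hyper-parameters are assumptions, and sklearn's AdaBoostRegressor stands in for the AdaBoost neural network used by the authors.

      import numpy as np
      import pywt
      from sklearn.ensemble import AdaBoostRegressor

      def lagged(series, n_lags=6):
          X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
          return X, series[n_lags:]

      rng = np.random.default_rng(1)
      wind = 8 + np.sin(np.arange(512) / 20.0) + 0.5 * rng.normal(size=512)   # synthetic wind-speed series

      coeffs = pywt.wavedec(wind, "db4", level=3)                # multi-scale decomposition
      components = [pywt.waverec([c if i == k else np.zeros_like(c)
                                  for i, c in enumerate(coeffs)], "db4")[:len(wind)]
                    for k in range(len(coeffs))]

      forecast = 0.0
      for comp in components:                                    # one regressor per scale
          X, y = lagged(comp)
          model = AdaBoostRegressor(n_estimators=50, random_state=0).fit(X[:-1], y[:-1])
          forecast += model.predict(X[-1:])[0]                   # one-step-ahead component forecast
      print("next-step wind speed forecast:", forecast)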

  16. Using constructed analogs to improve the skill of National Multi-Model Ensemble March–April–May precipitation forecasts in equatorial East Africa

    USGS Publications Warehouse

    Shukla, Shraddhanand; Funk, Christopher C.; Hoell, Andrew

    2014-01-01

    In this study we implement and evaluate a simple 'hybrid' forecast approach that uses constructed analogs (CA) to improve the National Multi-Model Ensemble's (NMME) March–April–May (MAM) precipitation forecasts over equatorial eastern Africa (hereafter referred to as EA, 2°S to 8°N and 36°E to 46°E). Due to recent declines in MAM rainfall, increases in population, land degradation, and limited technological advances, this region has become a recent epicenter of food insecurity. Timely and skillful precipitation forecasts for EA could help decision makers better manage their limited resources, mitigate socio-economic losses, and potentially save human lives. The 'hybrid approach' described in this study uses the CA method to translate dynamical precipitation and sea surface temperature (SST) forecasts over the Indian and Pacific Oceans (specifically 30°S to 30°N and 30°E to 270°E) into terrestrial MAM precipitation forecasts over the EA region. In doing so, this approach benefits from the post-1999 teleconnection that exists between precipitation and SSTs over the Indian and tropical Pacific Oceans (Indo-Pacific) and EA MAM rainfall. The coupled atmosphere-ocean dynamical forecasts used in this study were drawn from the NMME. We demonstrate that, while the skill of the NMME models' MAM precipitation forecasts (initialized in February) over the EA region itself is negligible, the ranked probability skill score of hybrid CA forecasts based on Indo-Pacific NMME precipitation and SST forecasts reaches up to 0.45.
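
    The constructed-analog translation step can be sketched as a least-squares combination of historical fields, as below. Real implementations typically work with truncated EOFs or regularized regression, and all data here are synthetic placeholders.

      import numpy as np

      rng = np.random.default_rng(2)
      n_years, n_grid = 30, 500
      hist_fields = rng.normal(size=(n_years, n_grid))     # synthetic historical Indo-Pacific anomaly fields
      hist_ea_rain = rng.normal(size=n_years)              # synthetic historical EA MAM rainfall anomalies
      current_field = rng.normal(size=n_grid)              # stand-in for this year's NMME forecast field

      # weights that best reconstruct the current field from the historical library
      weights, *_ = np.linalg.lstsq(hist_fields.T, current_field, rcond=None)
      ca_forecast = weights @ hist_ea_rain                  # constructed-analog rainfall forecast
      print("CA MAM rainfall anomaly forecast:", ca_forecast)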

  17. Balancing Flood Risk and Water Supply in California: Policy Search Combining Short-Term Forecast Ensembles and Groundwater Recharge

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Steinschneider, S.; Nayak, M. A.

    2017-12-01

    Short-term weather forecasts are not codified into the operating policies of federal, multi-purpose reservoirs, despite their potential to improve service provision. This is particularly true for facilities that provide flood protection and water supply, since the potential flood damages are often too severe to accept the risk of inaccurate forecasts. Instead, operators must maintain empty storage capacity to mitigate flood risk, even if the system is currently in drought, as occurred in California from 2012-2016. This study investigates the potential for forecast-informed operating rules to improve water supply efficiency while maintaining flood protection, combining state-of-the-art weather hindcasts with a novel tree-based policy optimization framework. We hypothesize that forecasts need only accurately predict the occurrence of a storm, rather than its intensity, to be effective in regions like California where wintertime, synoptic-scale storms dominate the flood regime. We also investigate the potential for downstream groundwater injection to improve the utility of forecasts. These hypotheses are tested in a case study of Folsom Reservoir on the American River. Because available weather hindcasts are relatively short (10-20 years), we propose a new statistical framework to develop synthetic forecasts to assess the risk associated with inaccurate forecasts. The efficiency of operating policies is tested across a range of scenarios that include varying forecast skill and additional groundwater pumping capacity. Results suggest that the combined use of groundwater storage and short-term weather forecasts can substantially improve the tradeoff between water supply and flood control objectives in large, multi-purpose reservoirs in California.

  18. Data center thermal management

    DOEpatents

    Hamann, Hendrik F.; Li, Hongfei

    2016-02-09

    Historical high-spatial-resolution temperature data and dynamic temperature sensor measurement data may be used to predict temperature. A first formulation may be derived based on the historical high-spatial-resolution temperature data for determining a temperature at any point in 3-dimensional space. The dynamic temperature sensor measurement data may be calibrated based on the historical high-spatial-resolution temperature data at a corresponding historical time. Sensor temperature data at a plurality of sensor locations may be predicted for a future time based on the calibrated dynamic temperature sensor measurement data. A three-dimensional temperature spatial distribution associated with the future time may be generated based on the forecasted sensor temperature data and the first formulation. The three-dimensional temperature spatial distribution associated with the future time may be projected to a two-dimensional temperature distribution, and temperature in the future time for a selected space location may be forecasted dynamically based on said two-dimensional temperature distribution.

  19. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecast, we propose a set of verification measures as marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
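
    For the dichotomous case, the proposed measures can be computed directly from a 2x2 contingency table, as in the sketch below (hits a, false alarms b, misses c, correct rejections d). The counts are made up, and the symmetric extremal dependence index is omitted for brevity.

      def dichotomous_scores(a, b, c, d):
          n = a + b + c + d
          return {
              "frequency bias": (a + b) / (a + c),
              "proportion correct": (a + d) / n,
              "critical success index": a / (a + b + c),
              "probability of detection": a / (a + c),
              "false alarm ratio": b / (a + b),
              # Peirce skill score = POD - probability of false detection
              "Peirce skill score": a / (a + c) - b / (b + d),
          }

      # made-up counts for a flare/no-flare forecast record
      for name, value in dichotomous_scores(a=30, b=20, c=10, d=940).items():
          print(f"{name}: {value:.3f}")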

  20. Probabilistic calibration of the distributed hydrological model RIBS applied to real-time flood forecasting: the Harod river basin case study (Israel)

    NASA Astrophysics Data System (ADS)

    Nesti, Alice; Mediero, Luis; Garrote, Luis; Caporali, Enrica

    2010-05-01

    An automatic probabilistic calibration method for distributed rainfall-runoff models is presented. The high number of parameters in distributed hydrologic models makes special demands on the optimization procedure used to estimate model parameters. With the proposed technique it is possible to reduce the complexity of calibration while maintaining adequate model predictions. The first step of the calibration procedure for the main model parameters is done manually, with the aim of identifying their variation ranges. Afterwards a Monte-Carlo technique is applied, which consists of repeated model simulations with randomly generated parameters. The Monte Carlo Analysis Toolbox (MCAT) includes a number of analysis methods to evaluate the results of these Monte Carlo parameter sampling experiments. The study investigates the use of a global sensitivity analysis as a screening tool to reduce the parametric dimensionality of multi-objective hydrological model calibration problems, while maximizing the information extracted from hydrological response data. The method is applied to the calibration of the RIBS flood forecasting model in the Harod river basin, located in Israel. The Harod basin covers an area of 180 km2. The catchment has a Mediterranean climate and is mainly characterized by a desert landscape, with a soil that is able to absorb large quantities of rainfall and at the same time is capable of generating high peaks of discharge. Radar rainfall data with 6-minute temporal resolution are available as input to the model. The aim of the study is the validation of the model for real-time flood forecasting, in order to evaluate the benefits of improved precipitation forecasting within the FLASH European project.

  1. Machine Learning Based Multi-Physical-Model Blending for Enhancing Renewable Energy Forecast -- Improvement via Situation Dependent Error Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Siyuan; Hwang, Youngdeok; Khabibrakhmanov, Ildar

    With increasing penetration of solar and wind energy in the total energy supply mix, the pressing need for accurate energy forecasting has become well recognized. Here we report the development of a machine-learning based model blending approach for statistically combining multiple meteorological models to improve the accuracy of solar/wind power forecasts. Importantly, we demonstrate that, in addition to the parameters to be predicted (such as solar irradiance and power), including additional atmospheric state parameters which collectively define weather situations as machine learning input provides further enhanced accuracy for the blended result. Functional analysis of variance shows that the error of an individual model has substantial dependence on the weather situation. The machine-learning approach effectively reduces such situation-dependent error and thus produces more accurate results than conventional multi-model ensemble approaches based on simplistic equally or unequally weighted model averaging. Validation results over an extended period of time show over 30% improvement in solar irradiance/power forecast accuracy compared to forecasts based on the best individual model.
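
    A minimal sketch of the blending idea is given below: individual model forecasts are combined by a machine-learning regressor whose inputs also include atmospheric state variables describing the weather situation. The choice of gradient boosting, the feature names and the synthetic data are assumptions, not the authors' configuration.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n = 2000
      model_a = rng.uniform(0, 1000, n)                 # irradiance forecast, model A (W/m^2)
      model_b = model_a + rng.normal(0, 80, n)          # irradiance forecast, model B
      cloud_cover = rng.uniform(0, 1, n)                # situation-describing state variables
      humidity = rng.uniform(0.2, 1.0, n)
      # synthetic "observed" irradiance whose model errors depend on the weather situation
      observed = model_a * (1 - 0.3 * cloud_cover) + rng.normal(0, 30, n)

      X = np.column_stack([model_a, model_b, cloud_cover, humidity])
      X_tr, X_te, y_tr, y_te = train_test_split(X, observed, random_state=0)
      blender = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

      naive_blend = X_te[:, :2].mean(axis=1)            # simple equally-weighted average
      print("equal-weight RMSE:", np.sqrt(np.mean((naive_blend - y_te) ** 2)))
      print("ML blend RMSE:   ", np.sqrt(np.mean((blender.predict(X_te) - y_te) ** 2)))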

  2. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    NASA Astrophysics Data System (ADS)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

    A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely-used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.

  3. COSMO-PAFOG: Three-dimensional fog forecasting with the high-resolution COSMO-model

    NASA Astrophysics Data System (ADS)

    Hacker, Maike; Bott, Andreas

    2017-04-01

    The presence of fog can have critical impact on shipping, aviation and road traffic increasing the risk of serious accidents. Besides these negative impacts of fog, in arid regions fog is explored as a supplementary source of water for human settlements. Thus the improvement of fog forecasts holds immense operational value. The aim of this study is the development of an efficient three-dimensional numerical fog forecast model based on a mesoscale weather prediction model for the application in the Namib region. The microphysical parametrization of the one-dimensional fog forecast model PAFOG (PArameterized FOG) is implemented in the three-dimensional nonhydrostatic mesoscale weather prediction model COSMO (COnsortium for Small-scale MOdeling) developed and maintained by the German Meteorological Service. Cloud water droplets are introduced in COSMO as prognostic variables, thus allowing a detailed description of droplet sedimentation. Furthermore, a visibility parametrization depending on the liquid water content and the droplet number concentration is implemented. The resulting fog forecast model COSMO-PAFOG is run with kilometer-scale horizontal resolution. In vertical direction, we use logarithmically equidistant layers with 45 of 80 layers in total located below 2000 m. Model results are compared to satellite observations and synoptic observations of the German Meteorological Service for a domain in the west of Germany, before the model is adapted to the geographical and climatological conditions in the Namib desert. COSMO-PAFOG is able to represent the horizontal structure of fog patches reasonably well. Especially small fog patches typical of radiation fog can be simulated in agreement with observations. Ground observations of temperature are also reproduced. Simulations without the PAFOG microphysics yield unrealistically high liquid water contents. This in turn reduces the radiative cooling of the ground, thus inhibiting nocturnal temperature decrease. The simulated visibility agrees with observations. However, fog tends to be dissolved earlier than in the observation. As a result of the investigated fog events, it is concluded that the three-dimensional fog forecast model COSMO-PAFOG is able to simulate these fog events in accordance with observations. After the successful application of COSMO-PAFOG for fog events in the west of Germany, model simulations will be performed for coastal desert fog in the Namib region.

  4. Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Liang, X.

    2017-12-01

    Probabilistic forecasts in the geosciences offer invaluable information by allowing to estimate the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their applicability impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods—like some variants of (ensemble) Kalman filters and of particle filters—and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation, and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
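
    The two strategies can be caricatured as below: cells are grouped into "super-cells" by clustering their states and forcings, and the ensemble covariance is kept only as a low-rank factor. K-means stands in for the dynamic fuzzy clustering of the paper, and all sizes and data are illustrative.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(4)
      n_cells, n_ens = 5000, 40
      soil_moisture = rng.uniform(0.1, 0.4, n_cells)    # synthetic cell states
      precip = rng.exponential(2.0, n_cells)            # synthetic cell forcings

      # strategy 1: super-cells from cell-wise states and forcings
      features = np.column_stack([soil_moisture, precip])
      labels = KMeans(n_clusters=50, n_init=10, random_state=0).fit_predict(features)

      # strategy 2: low-rank covariance from an ensemble of super-cell states
      ensemble = rng.normal(size=(n_ens, 50))           # one row per ensemble member
      anomalies = ensemble - ensemble.mean(axis=0)
      U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
      r = 10                                            # retained rank
      low_rank_factor = Vt[:r].T * (s[:r] / np.sqrt(n_ens - 1))
      approx_cov = low_rank_factor @ low_rank_factor.T  # in practice the full matrix is never formed
      print("super-cells:", len(np.unique(labels)), "| cov approx shape:", approx_cov.shape)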

  5. An aerobic scope-based habitat suitability index for predicting the effects of multi-dimensional climate change stressors on marine teleosts

    NASA Astrophysics Data System (ADS)

    Del Raye, Gen; Weng, Kevin C.

    2015-03-01

    Climate change will expose many marine ecosystems to temperature, oxygen and CO2 conditions that have not been experienced for millennia. Predicting the impact of these changes on marine fishes is difficult due to the complexity of these disparate stressors and the inherent non-linearity of physiological systems. Aerobic scope (the difference between maximum and minimum aerobic metabolic rates) is a coherent, unifying physiological framework that can be used to examine all of the major environmental changes expected to occur in the oceans during this century. Using this framework, we develop a physiology-based habitat suitability model to forecast the response of marine fishes to simultaneous ocean acidification, warming and deoxygenation, including interactions between all three stressors. We present an example of the model parameterized for Thunnus albacares (yellowfin tuna), an important fisheries species that is likely to be affected by climate change. We anticipate that if embedded into multispecies ecosystem models, our model could help to more precisely forecast climate change impacts on the distribution and abundance of other high value species. Finally, we show how our model may indicate the potential for, and limits of, adaptation to chronic stressors.

  6. Sub-seasonal-to-seasonal Reservoir Inflow Forecast using Bayesian Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Arumugam, S.

    2017-12-01

    Sub-seasonal-to-seasonal (S2S) (15-90 days) streamflow forecasting is an emerging area of research that provides seamless information for reservoir operation from weather time scales to seasonal time scales. From an operational perspective, sub-seasonal inflow forecasts are highly valuable as these enable water managers to decide short-term releases (15-30 days), while holding water for seasonal needs (e.g., irrigation and municipal supply) and to meet end-of-the-season target storage at a desired level. We propose a Bayesian Hierarchical Hidden Markov Model (BHHMM) to develop S2S inflow forecasts for the Tennessee Valley Area (TVA) reservoir system. Here, the hidden states are predicted by relevant indices that influence the inflows at S2S time scale. The hidden Markov model also captures the both spatial and temporal hierarchy in predictors that operate at S2S time scale with model parameters being estimated as a posterior distribution using a Bayesian framework. We present our work in two steps, namely single site model and multi-site model. For proof of concept, we consider inflows to Douglas Dam, Tennessee, in the single site model. For multisite model we consider reservoirs in the upper Tennessee valley. Streamflow forecasts are issued and updated continuously every day at S2S time scale. We considered precipitation forecasts obtained from NOAA Climate Forecast System (CFSv2) GCM as predictors for developing S2S streamflow forecasts along with relevant indices for predicting hidden states. Spatial dependence of the inflow series of reservoirs are also preserved in the multi-site model. To circumvent the non-normality of the data, we consider the HMM in a Generalized Linear Model setting. Skill of the proposed approach is tested using split sample validation against a traditional multi-site canonical correlation model developed using the same set of predictors. From the posterior distribution of the inflow forecasts, we also highlight different system behavior under varied global and local scale climatic influences from the developed BHMM.

  7. Applications of Satellite Observations to Aerosol Analyses and Forecasting using the NAAPS Model and the DataFed Distributed Data System

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Scheffe, R.; Keating, T.; Frank, N.; Poirot, R.; DuBois, D. W.; Bleiweiss, M. P.; Eberhard, W. L.; Menon, R.; Sethi, V.; Deshpande, A.

    2012-12-01

    Near-real-time (NRT) aerosol characterization, forecasting and decision support is now possible through the availability of (1) surface-based monitoring of regional PM concentrations, (2) global-scale columnar aerosol observations through satellites; (3) an aerosol model (NAAPS) that is capable of assimilating NRT satellite observations; and (4) an emerging cyber infrastructure for processing and distribution of data and model results (DataFed) for a wide range of users. This report describes the evolving NRT aerosol analysis and forecasting system and its applications at Federal and State and other AQ Agencies and groups. Through use cases and persistent real-world applications in the US and abroad, the report will show how satellite observations along with surface data and models are combined to aid decision support for AQ management, science and informing the public. NAAPS is the U.S. Navy's global aerosol and visibility forecast model that generates operational six-day global-scale forecasts for sulfate, dust, sea salt, and smoke aerosol. Through NAVDAS-AOD, NAAPS operationally assimilates filtered and corrected MODIS MOD04 aerosol optical depths and uses satellite-derived FLAMBÉ smoke emissions. Washington University's federated data system, DataFed, consist of a (1) data server which mediates the access to AQ datasets from distributed providers (NASA, NOAA, EPA, etc.,); (2) an AQ Data Catalog for finding and accessing data; and (3) a set of application programs/tools for browsing, exploring, comparing, aggregating, fusing data, evaluating models and delivering outputs through interactive visualization. NAAPS and DataFed are components of the Global Earth Observation System of Systems (GEOSS). Satellite data support the detection of long-range transported wind-blown dust and biomass smoke aerosols on hemispheric scales. The AQ management and analyst communities use the satellite/model data through DataFed and other channels as evidence for Exceptional Events (EE) as defined by EPA; i.e., Sahara dust impact on Texas and Florida, local dusts events in the Southwestern U.S. and Canadian smoke events over the Northeastern U.S. Recent applications include the impact analysis of a major Saudi Arabian dust event on Mumbai, India air quality. The NAAPS model and the DataFed tools can visualize the dynamic AQ events as they are manifested through the different sensors. Satellite-derived aerosol observations assimilated into NAAPS provide estimates of daily emission rates for dust and biomass fire sources. Tuning and reconciliation of the observations, emissions and models constitutes a key and novel contribution yielding a convergence toward the true five-dimensional (X, Y, Z, T, Composition) characterization of the atmospheric aerosol data space. This observation-emission-model reconciliation effort is aided by model evaluation tools and supports the international HTAP program. The report will also discuss some of the challenges facing multi-disciplinary, multi-agency, multi-national applications of integrated observation-modeling system of systems that impede the incorporation of satellite observations into AQ management decision support systems.

  8. Bounds on strong field magneto-transport in three-dimensional composites

    NASA Astrophysics Data System (ADS)

    Briane, Marc; Milton, Graeme W.

    2011-10-01

    This paper deals with bounds satisfied by the effective non-symmetric conductivity of three-dimensional composites in the presence of a strong magnetic field. On the one hand, it is shown that for general composites the antisymmetric part of the effective conductivity cannot be bounded solely in terms of the antisymmetric part of the local conductivity, contrary to the columnar case studied by Briane and Milton [SIAM J. Appl. Math. 70(8), 3272-3286 (2010), 10.1137/100798090]. Thus a suitable rank-two laminate, the conductivity of which has a bounded antisymmetric part together with a high-contrast symmetric part, may generate an arbitrarily large antisymmetric part of the effective conductivity. On the other hand, bounds are provided which show that the antisymmetric part of the effective conductivity must go to zero if the upper bound on the antisymmetric part of the local conductivity goes to zero, and the symmetric part of the local conductivity remains bounded below and above. Elementary bounds on the effective moduli are derived assuming the local conductivity and the effective conductivity have transverse isotropy in the plane orthogonal to the magnetic field. New Hashin-Shtrikman type bounds for two-phase three-dimensional composites with a non-symmetric conductivity are provided under geometric isotropy of the microstructure. The derivation of the bounds is based on a particular variational principle symmetrizing the problem, and the use of Y-tensors involving the averages of the fields in each phase.

  9. Nonlinear forecasting analysis of inflation-deflation patterns of an active caldera (Campi Flegrei, Italy)

    USGS Publications Warehouse

    Cortini, M.; Barton, C.C.

    1993-01-01

    The ground level in Pozzuoli, Italy, at the center of the Campi Flegrei caldera, has been monitored by tide gauges. Previous work suggests that the dynamics of the Campi Flegrei system, as reconstructed from the tide gauge record, is chaotic and low dimensional. According to this suggestion, in spite of the complexity of the system, at a time scale of days the ground motion is driven by a deterministic mechanism with few degrees of freedom; however, the interactions of the system may never be describable in full detail. New analysis of the tide gauge record using Nonlinear Forecasting confirms low-dimensional chaos in the ground elevation record at Campi Flegrei and suggests that Nonlinear Forecasting could be a useful tool in volcanic surveillance. -from Authors
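
    The nonlinear-forecasting idea can be sketched as a nearest-neighbour prediction in a delay-embedded state space, as below. The embedding dimension, delay, neighbour count and synthetic series are assumptions rather than the settings used in the study.

      import numpy as np

      def delay_embed(x, dim=3, tau=1):
          rows = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + rows] for i in range(dim)])

      def local_forecast(x, dim=3, tau=1, k=4):
          emb = delay_embed(x, dim, tau)
          library, query = emb[:-1], emb[-1]            # last point is the forecast origin
          successors = x[(dim - 1) * tau + 1:]          # value following each library point
          dists = np.linalg.norm(library - query, axis=1)
          nearest = np.argsort(dists)[:k]
          return successors[nearest].mean()

      t = np.arange(600)
      uplift = np.sin(0.07 * t) + 0.1 * np.random.default_rng(5).normal(size=600)   # synthetic uplift record
      print("next-step forecast:", local_forecast(uplift))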

  10. Encounters in an online brand community: development and validation of a metric for value co-creation by customers.

    PubMed

    Hsieh, Pei-Ling

    2015-05-01

    Recent developments in service marketing have demonstrated the potential value co-creation by customers who participate in online brand communities (OBCs). Therefore, this study forecasts the co-created value by understanding the participation/behavior of customers in a multi-stakeholder OBC. This six-phase qualitative and quantitative investigation conceptualizes, constructs, refines, and tests a 12-item three-dimensional scale for measuring key factors that are related to the experience, interpersonal interactions, and social relationships that affect the value co-creation by customers in an OBC. The scale captures stable psychometric properties, measured using various reliability and validity tests, and can be applied across various industries. Finally, the utility implications and limitations of the proposed scale are discussed, and potential future research directions considered.

  11. ENSURF: multi-model sea level forecast - implementation and validation results for the IBIROOS and Western Mediterranean regions

    NASA Astrophysics Data System (ADS)

    Pérez, B.; Brouwer, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hackett, B.; Verlaan, M.; Fanjul, E. A.

    2012-03-01

    ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of several storm surge or circulation models and near-real-time tide gauge data in the region, with the following main goals: 1. providing easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool; 2. generation of better sea level forecasts, including confidence intervals, by means of the Bayesian Model Average (BMA) technique. The Bayesian Model Average technique generates an overall forecast probability density function (PDF) by making a weighted average of the individual forecasts' PDFs; the weights represent the Bayesian likelihood that a model will give the correct forecast and are continuously updated based on the performance of the models during a recent training period. This implies that the technique needs the availability of sea level data from tide gauges in near-real time. The system was implemented for the European Atlantic facade (IBIROOS region) and Western Mediterranean coast based on the MATROOS visualization tool developed by Deltares. Results of the validation of the different models and of the BMA implementation for the main harbours are presented for these regions, where this kind of activity is performed for the first time. The system is currently operational at Puertos del Estado and has proved to be useful in the detection of calibration problems in some of the circulation models, in the identification of the systematic differences between baroclinic and barotropic models for sea level forecasts, and in demonstrating the feasibility of providing an overall probabilistic forecast based on the BMA method.
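
    A toy version of the BMA combination is sketched below: each model's forecast defines a Gaussian predictive PDF, weights follow from each model's likelihood over a recent training window of tide-gauge observations, and the combined forecast is the weighted mixture mean. The Gaussian spreads, fixed window and simple likelihood weighting are simplifying assumptions (the operational system updates weights continuously, typically via an EM algorithm).

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(6)
      obs = rng.normal(0.30, 0.05, 20)                        # synthetic training-window tide-gauge data (m)
      model_fcst = {"surge_A": obs + rng.normal(0.02, 0.03, 20),
                    "surge_B": obs + rng.normal(-0.05, 0.06, 20)}
      sigma = {"surge_A": 0.04, "surge_B": 0.07}              # assumed predictive spreads (m)

      loglik = {m: norm.logpdf(obs, loc=f, scale=sigma[m]).sum() for m, f in model_fcst.items()}
      raw = np.array([np.exp(loglik[m] - max(loglik.values())) for m in model_fcst])
      weights = dict(zip(model_fcst, raw / raw.sum()))        # Bayesian likelihood weights

      new_fcst = {"surge_A": 0.41, "surge_B": 0.35}           # today's individual forecasts (m)
      bma_mean = sum(weights[m] * new_fcst[m] for m in model_fcst)
      print("BMA weights:", weights, "| BMA sea-level forecast:", round(bma_mean, 3), "m")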

  12. An application of ensemble/multi model approach for wind power production forecasting

    NASA Astrophysics Data System (ADS)

    Alessandrini, S.; Pinson, P.; Hagedorn, R.; Decimi, G.; Sperati, S.

    2011-02-01

    Wind power forecasts for the 3-day-ahead period are becoming ever more useful and important in reducing the problems of grid integration and energy price trading caused by increasing wind power penetration. It is therefore clear that the accuracy of such forecasts is one of the most important requirements for a successful application. The wind power forecast applied in this study is based on meteorological models that provide the 3-day-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. For this purpose a Neural Network (NN) has been trained to link directly the forecasted meteorological data and the power data. One wind farm has been examined, located in a mountainous area in the south of Italy (Sicily). First we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by a combination of models (RAMS, ECMWF deterministic, LAMI). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error (normalized by nominal power) by at least 1% compared to the single-model approach. Finally we have focused on the possibility of using the ensemble prediction system (EPS by ECMWF) to estimate the accuracy of the hourly, three-day-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble wind forecast members have been produced. From this first analysis it seems that the ensemble spread could be used as an indicator of the forecast's accuracy, at least for the first three days ahead.

  13. Integration of Local Observations into the One Dimensional Fog Model PAFOG

    NASA Astrophysics Data System (ADS)

    Thoma, Christina; Schneider, Werner; Masbou, Matthieu; Bott, Andreas

    2012-05-01

    The numerical prediction of fog requires a very high vertical resolution of the atmosphere. Owing to a prohibitive computational effort of high resolution three dimensional models, operational fog forecast is usually done by means of one dimensional fog models. An important condition for a successful fog forecast with one dimensional models consists of the proper integration of observational data into the numerical simulations. The goal of the present study is to introduce new methods for the consideration of these data in the one dimensional radiation fog model PAFOG. First, it will be shown how PAFOG may be initialized with observed visibilities. Second, a nudging scheme will be presented for the inclusion of measured temperature and humidity profiles in the PAFOG simulations. The new features of PAFOG have been tested by comparing the model results with observations of the German Meteorological Service. A case study will be presented that reveals the importance of including local observations in the model calculations. Numerical results obtained with the modified PAFOG model show a distinct improvement of fog forecasts regarding the times of fog formation, dissipation as well as the vertical extent of the investigated fog events. However, model results also reveal that a further improvement of PAFOG might be possible if several empirical model parameters are optimized. This tuning can only be realized by comprehensive comparisons of model simulations with corresponding fog observations.
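
    The nudging scheme mentioned above can be illustrated by relaxing a model profile toward an observed profile with a prescribed time scale, as in the sketch below. The relaxation time, time step and profiles are invented for illustration and are not PAFOG's actual settings.

      import numpy as np

      def nudge(model_profile, obs_profile, dt=60.0, tau=1800.0):
          """One nudging step: pull the model toward observations with time scale tau (s)."""
          return model_profile + (dt / tau) * (obs_profile - model_profile)

      z = np.linspace(0, 500, 26)                      # height levels (m), illustrative grid
      model_T = 285.0 - 0.0065 * z                     # model temperature profile (K)
      obs_T = model_T + np.where(z < 100, -1.5, 0.0)   # observed near-surface cold layer
      for _ in range(30):                              # half an hour of 60 s steps
          model_T = nudge(model_T, obs_T)
      print("near-surface adjustment (K):", round(model_T[0] - 285.0, 2))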

  14. THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATIONS

    EPA Science Inventory

    In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...

  15. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.

  16. Forecasting the ocean optical environment in support of Navy mine warfare operations

    NASA Astrophysics Data System (ADS)

    Ladner, S. D.; Arnone, R.; Jolliff, J.; Casey, B.; Matulewski, K.

    2012-06-01

    A 3D ocean optical forecast system called TODS (Tactical Ocean Data System) has been developed to determine the performance of underwater LIDAR detection/identification systems. TODS fuses optical measurements from gliders, surface satellite optical properties, and 3D ocean forecast circulation models to extend the 2-dimensional surface satellite optics into a 3-dimensional optical volume including subsurface optical layers of beam attenuation coefficient (c) and diver visibility. Optical 3D nowcast and forecasts are combined with electro-optical identification (EOID) models to determine the underwater LIDAR imaging performance field used to identify subsurface mine threats in rapidly changing coastal regions. TODS was validated during a recent mine warfare exercise with Helicopter Mine Countermeasures Squadron (HM-14). Results include the uncertainties in the optical forecast and lidar performance and sensor tow height predictions that are based on visual detection and identification metrics using actual mine target images from the EOID system. TODS is a new capability of coupling the 3D optical environment and EOID system performance and is proving important for the MIW community as both a tactical decision aid and for use in operational planning, improving timeliness and efficiency in clearance operations.

  17. Efficient ensemble forecasting of marine ecology with clustered 1D models and statistical lateral exchange: application to the Red Sea

    NASA Astrophysics Data System (ADS)

    Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim

    2017-07-01

    Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data is assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To laterally exchange information between subregions and improve the forecasting skills, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems in other large marine ecosystems.
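
    The region-definition step can be sketched with a Gaussian mixture model fitted to per-pixel Chl-a seasonal cycles, as below, so that each cluster would be served by its own 1D water-column model. The number of clusters and the synthetic pixels are assumptions.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(7)
      n_pixels, n_months = 3000, 12
      north = rng.normal(0.2, 0.05, (n_pixels // 2, n_months))   # synthetic low-Chl-a pixels
      south = rng.normal(0.8, 0.10, (n_pixels // 2, n_months))   # synthetic productive pixels
      chl_cycles = np.vstack([north, south])                     # one seasonal cycle per pixel

      gmm = GaussianMixture(n_components=2, random_state=0).fit(chl_cycles)
      regions = gmm.predict(chl_cycles)                          # 1D-model domain per pixel
      print("pixels per region:", np.bincount(regions))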

  18. Intrinsic two-dimensional states on the pristine surface of tellurium

    NASA Astrophysics Data System (ADS)

    Li, Pengke; Appelbaum, Ian

    2018-05-01

    Atomic chains configured in a helical geometry have fascinating properties, including phases hosting localized bound states in their electronic structure. We show how the zero-dimensional state—bound to the edge of a single one-dimensional helical chain of tellurium atoms—evolves into two-dimensional bands on the c -axis surface of the three-dimensional trigonal bulk. We give an effective Hamiltonian description of its dispersion in k space by exploiting confinement to a virtual bilayer, and elaborate on the diminished role of spin-orbit coupling. These intrinsic gap-penetrating surface bands were neglected in the interpretation of seminal experiments, where two-dimensional transport was otherwise attributed to extrinsic accumulation layers.

  19. iFLOOD: A Real Time Flood Forecast System for Total Water Modeling in the National Capital Region

    NASA Astrophysics Data System (ADS)

    Sumi, S. J.; Ferreira, C.

    2017-12-01

    Extreme flood events are the costliest natural hazards impacting the US and frequently cause extensive damages to infrastructure, disruption to economy and loss of lives. In 2016, Hurricane Matthew brought severe damage to South Carolina and demonstrated the importance of accurate flood hazard predictions that requires the integration of riverine and coastal model forecasts for total water prediction in coastal and tidal areas. The National Weather Service (NWS) and the National Ocean Service (NOS) provide flood forecasts for almost the entire US, still there are service-gap areas in tidal regions where no official flood forecast is available. The National capital region is vulnerable to multi-flood hazards including high flows from annual inland precipitation events and surge driven coastal inundation along the tidal Potomac River. Predicting flood levels on such tidal areas in river-estuarine zone is extremely challenging. The main objective of this study is to develop the next generation of flood forecast systems capable of providing accurate and timely information to support emergency management and response in areas impacted by multi-flood hazards. This forecast system is capable of simulating flood levels in the Potomac and Anacostia River incorporating the effects of riverine flooding from the upstream basins, urban storm water and tidal oscillations from the Chesapeake Bay. Flood forecast models developed so far have been using riverine data to simulate water levels for Potomac River. Therefore, the idea is to use forecasted storm surge data from a coastal model as boundary condition of this system. Final output of this validated model will capture the water behavior in river-estuary transition zone far better than the one with riverine data only. The challenge for this iFLOOD forecast system is to understand the complex dynamics of multi-flood hazards caused by storm surges, riverine flow, tidal oscillation and urban storm water. Automated system simulations will help to develop a seamless integration with the boundary systems in the service-gap area with new insights into our scientific understanding of such complex systems. A visualization system is being developed to allow stake holders and the community to have access to the flood forecasting for their region with sufficient lead time.

  20. Developing Multi-model Ensemble for Precipitation and Temperature Seasonal Forecasts: Implications for Karkheh River Basin in Iran

    NASA Astrophysics Data System (ADS)

    Najafi, Husain; Massah Bavani, Ali Reza; Wanders, Niko; Wood, Eric; Irannejad, Parviz; Robertson, Andrew

    2017-04-01

    Water resource managers can utilize reliable seasonal forecasts for allocating water between different users within a water year. In the west of Iran, where a decline of renewable water resources has been observed, basin-wide water management has been the subject of many inter-provincial conflicts in recent years. The problem is exacerbated when environmental water requirements are not met, leaving the downstream Hoor-al-Azim marshland dry. It has been argued that information on total seasonal rainfall can support the Iranian Ministry of Energy within the water year. This study explores the skill of the North American Multi-Model Ensemble (NMME) for the Karkheh River Basin in western Iran. NMME seasonal precipitation and temperature forecasts from eight models are evaluated against PERSIANN-CDR and Climate Research Unit (CRU) datasets. Analysis suggests that the anomaly correlation for both precipitation and temperature is greater than 0.4 for all individual models. Lead-time-dependent seasonal forecasts are improved when a multi-model ensemble is developed for the river basin using a stepwise linear regression model. MME R-squared exceeds 0.6 for temperature for almost all initializations, suggesting high skill of the NMME in the Karkheh River Basin. The skill of the MME for rainfall forecasts is high at a 1-month lead time for October, February, March and October initializations. However, for months when the amount of rainfall accounts for a significant proportion of total annual rainfall, the skill of the NMME is limited even a month in advance. It is proposed that operational regional water companies incorporate NMME seasonal forecasts into water resource planning and management, especially during growing seasons that are essential for agricultural risk management.
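
    A minimal sketch of how such a multi-model ensemble could be combined by forward stepwise regression is given below; the synthetic forecasts, the cross-validation setup and the stopping rule are assumptions for illustration and do not reproduce the study's actual procedure.

    ```python
    # Hedged sketch: combine NMME-like members into an MME by forward stepwise regression.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_years, n_models = 30, 8
    obs = rng.normal(size=n_years)                               # observed seasonal totals
    forecasts = obs[:, None] * rng.uniform(0.3, 0.9, n_models) \
        + rng.normal(scale=0.8, size=(n_years, n_models))        # individual model forecasts

    selected, remaining, best_score = [], list(range(n_models)), -np.inf
    while remaining:
        scores = []
        for m in remaining:
            X = forecasts[:, selected + [m]]
            scores.append(cross_val_score(LinearRegression(), X, obs, cv=5, scoring="r2").mean())
        i_best = int(np.argmax(scores))
        if scores[i_best] <= best_score:        # stop when no candidate improves the skill
            break
        best_score = scores[i_best]
        selected.append(remaining.pop(i_best))

    print("selected models:", selected, " cross-validated R^2:", round(best_score, 2))
    ```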

  1. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    PubMed

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.

  2. Real-Time Analysis of a Sensor's Data for Automated Decision Making in an IoT-Based Smart Home.

    PubMed

    Khan, Nida Saddaf; Ghani, Sayeed; Haider, Sajjad

    2018-05-25

    IoT devices frequently generate large volumes of streaming data, and in order to take advantage of these data, their temporal patterns must be learned and identified. Streaming data analysis has become popular after being successfully used in many applications, including forecasting electricity load, stock market prices, weather conditions, etc. Artificial Neural Networks (ANNs) have been successfully utilized to learn the interesting patterns/behaviors embedded in the data and to forecast future values based on them. One such pattern is modelled and learned in the present study to identify the occurrence of a specific pattern in a Water Management System (WMS). This prediction supports an automated decision system that switches OFF a hydraulic suction pump at the appropriate time. Three types of ANN, namely Multi-Input Multi-Output (MIMO), Multi-Input Single-Output (MISO), and Recurrent Neural Network (RNN), have been compared for multi-step-ahead forecasting on a sensor's streaming data. Experiments have shown that the RNN has the best performance among the three models, and based on its predictions a system can be implemented that makes the best decision with 86% accuracy.
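
    The sketch below contrasts two of the multi-step strategies compared in the paper, a direct Multi-Input Multi-Output model versus recursive use of a single-output model; the tiny feed-forward networks, window length and synthetic sensor signal are stand-ins, not the authors' architecture or data.

    ```python
    # Hedged sketch: direct (MIMO) vs. recursive (MISO-style) multi-step forecasting.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    t = np.arange(2000)
    series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)  # placeholder stream

    lag, horizon = 24, 6
    X = np.array([series[i:i + lag] for i in range(len(series) - lag - horizon)])
    Y = np.array([series[i + lag:i + lag + horizon] for i in range(len(series) - lag - horizon)])

    mimo = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, Y)
    miso = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, Y[:, 0])

    window0 = series[-lag - horizon:-horizon]            # last fully observed input window
    direct = mimo.predict(window0[None, :])[0]           # all six steps at once

    window, recursive = window0.copy(), []
    for _ in range(horizon):                             # feed each one-step prediction back in
        nxt = miso.predict(window[None, :])[0]
        recursive.append(nxt)
        window = np.append(window[1:], nxt)

    print("direct   :", np.round(direct, 2))
    print("recursive:", np.round(recursive, 2))
    print("truth    :", np.round(series[-horizon:], 2))
    ```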

  3. Accessing Multi-Dimensional Images and Data Cubes in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Tody, Douglas; Plante, R. L.; Berriman, G. B.; Cresitello-Dittmar, M.; Good, J.; Graham, M.; Greene, G.; Hanisch, R. J.; Jenness, T.; Lazio, J.; Norris, P.; Pevunova, O.; Rots, A. H.

    2014-01-01

    Telescopes across the spectrum are routinely producing multi-dimensional images and datasets, such as Doppler velocity cubes, polarization datasets, and time-resolved “movies.” Examples of current telescopes producing such multi-dimensional images include the JVLA, ALMA, and the IFU instruments on large optical and near-infrared wavelength telescopes. In the near future, both the LSST and JWST will also produce such multi-dimensional images routinely. High-energy instruments such as Chandra produce event datasets that are also a form of multi-dimensional data, in effect being a very sparse multi-dimensional image. Ensuring that the data sets produced by these telescopes can be both discovered and accessed by the community is essential and is part of the mission of the Virtual Observatory (VO). The Virtual Astronomical Observatory (VAO, http://www.usvao.org/), in conjunction with its international partners in the International Virtual Observatory Alliance (IVOA), has developed a protocol and an initial demonstration service designed for the publication, discovery, and access of arbitrarily large multi-dimensional images. The protocol describing multi-dimensional images is the Simple Image Access Protocol, version 2, which provides the minimal set of metadata required to characterize a multi-dimensional image for its discovery and access. A companion Image Data Model formally defines the semantics and structure of multi-dimensional images independently of how they are serialized, while providing capabilities such as support for sparse data that are essential to deal effectively with large cubes. A prototype data access service has been deployed and tested, using a suite of multi-dimensional images from a variety of telescopes. The prototype has demonstrated the capability to discover and remotely access multi-dimensional data via standard VO protocols. The prototype informs the specification of a protocol that will be submitted to the IVOA for approval, with an operational data cube service to be delivered in mid-2014. An associated user-installable VO data service framework will provide the capabilities required to publish VO-compatible multi-dimensional images or data cubes.

  4. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    NASA Astrophysics Data System (ADS)

    Koulouri, Alexandra; Brookes, Mike; Rimpiläinen, Ville

    2017-01-01

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.

  5. A three-dimensional in vitro model to demonstrate the haptotactic effect of monocyte chemoattractant protein-1 on atherosclerosis-associated monocyte migration

    PubMed Central

    Ghousifam, Neda; Mortazavian, Seyyed Hamid; Bhowmick, Rudra; Vasquez, Yolanda; Blum, Frank D.; Gappa-Fahlenkamp, Heather

    2017-01-01

    Monocyte transendothelial migration is a multi-step process critical for the initiation and development of atherosclerosis. The chemokine monocyte chemoattractant protein-1 (MCP-1) is overexpressed during atheroma formation, and its concentration gradients in the extracellular matrix (ECM) are critical for the transendothelial recruitment of monocytes. Based on prior observations, we hypothesize that both free and bound gradients of MCP-1 within the ECM are involved in directing monocyte migration. The interaction between a three-dimensional (3D), cell-free collagen matrix and MCP-1, and its effect on monocyte migration, was measured in this study. Our results showed that such an interaction existed between MCP-1 and collagen, as 26% of the total MCP-1 added to the collagen matrix was bound to the matrix after extensive washes. We also characterized the collagen-MCP-1 interaction using biophysical techniques. The treatment of the collagen matrix with MCP-1 led to increased monocyte migration, and this phenotype was abrogated by treating the matrix with an anti-MCP-1 antibody. Thus, our results indicate a binding interaction between MCP-1 and the collagen matrix, which could elicit a haptotactic effect on monocyte migration. A better understanding of such mechanisms controlling monocyte migration will help identify target cytokines and lead to the development of better anti-inflammatory therapeutic strategies. PMID:28041913

  6. Some applications of the multi-dimensional fractional order for the Riemann-Liouville derivative

    NASA Astrophysics Data System (ADS)

    Ahmood, Wasan Ajeel; Kiliçman, Adem

    2017-01-01

    In this paper, the aim is to study theorems for the one-dimensional space-time fractional derivative, to generalize, by means of a table of the fractional Laplace transforms of some elementary functions, results for the one-dimensional case so that they remain valid for the multi-dimensional fractional Laplace transform, and to give the definition of the multi-dimensional fractional Laplace transform. The study treats the one-dimensional fractional Laplace transform for functions of a single independent variable and extends it to the multi-dimensional fractional Laplace transform based on the modified Riemann-Liouville derivative.

  7. Droplet states in quantum XXZ spin systems on general graphs

    NASA Astrophysics Data System (ADS)

    Fischbacher, C.; Stolz, G.

    2018-05-01

    We study XXZ spin systems on general graphs. In particular, we describe the formation of droplet states near the bottom of the spectrum in the Ising phase of the model, where the Z-term dominates the XX-term. As key tools, we use particle number conservation of XXZ systems and symmetric products of graphs with their associated adjacency matrices and Laplacians. Of particular interest to us are strips and multi-dimensional Euclidean lattices, for which we discuss the existence of spectral gaps above the droplet regime. We also prove a Combes-Thomas bound which shows that the eigenstates in the droplet regime are exponentially small perturbations of strict (classical) droplets.

  8. Variable field-of-view visible and near-infrared polarization compound-eye endoscope.

    PubMed

    Kagawa, K; Shogenji, R; Tanaka, E; Yamada, K; Kawahito, S; Tanida, J

    2012-01-01

    A multi-functional compound-eye endoscope enabling variable field-of-view and polarization imaging, as well as extremely deep focus, is presented. It is based on a compact compound-eye camera called TOMBO (thin observation module by bound optics). Fixed and movable mirrors are introduced to control the field of view. A metal-wire-grid polarizer thin film applicable to both visible and near-infrared light is attached to the lenses in TOMBO and to the light sources. Control of the field of view, polarization and wavelength of the illumination realizes several observation modes, such as three-dimensional shape measurement, wide field-of-view imaging, and close-up observation of the superficial tissues and structures beneath the skin.

  9. ENSURF: multi-model sea level forecast - implementation and validation results for the IBIROOS and Western Mediterranean regions

    NASA Astrophysics Data System (ADS)

    Pérez, B.; Brower, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hacket, B.; Verlaan, M.; Alvarez Fanjul, E.

    2011-04-01

    ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of existing storm surge and circulation models currently operational in Europe, as well as near-real-time tide gauge data in the region, with the following main goals: - providing easy access to existing forecasts, as well as to their performance and validation, by means of an adequate visualization tool - generating better forecasts of sea level, including confidence intervals, by means of the Bayesian Model Averaging (BMA) technique. The system was developed and implemented within the ECOOP (C.No. 036355) European Project for the NOOS and IBIROOS regions, based on the MATROOS visualization tool developed by Deltares. Both systems are today operational at Deltares and Puertos del Estado, respectively. The Bayesian Model Averaging technique generates an overall forecast probability density function (PDF) as a weighted average of the individual forecast PDFs; the weights represent the probability that a model will give the correct forecast PDF and are determined and updated operationally based on the performance of the models during a recent training period. This implies that the technique requires sea level data from tide gauges in near-real time. Results of the validation of the different models and of the BMA implementation for the main harbours will be presented for the IBIROOS and Western Mediterranean regions, where this kind of activity is performed for the first time. The work has proved useful to detect problems in some of the circulation models not previously well calibrated with sea level data, to identify the differences between baroclinic and barotropic models for sea level applications, and to confirm the general improvement of the BMA forecasts.
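
    A minimal sketch of the BMA weighting idea is given below, assuming Gaussian member PDFs centred on each model's sea level forecast and a shared spread fitted by EM over a training window; the synthetic data and the common-variance simplification are assumptions and not the ENSURF configuration.

    ```python
    # Hedged sketch of Bayesian Model Averaging with Gaussian member PDFs and EM-fitted weights.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    obs = rng.normal(size=200)                                              # training observations
    fcst = obs[:, None] + rng.normal(scale=[0.2, 0.4, 0.8], size=(200, 3))  # three model forecasts

    w = np.full(3, 1 / 3)      # model weights
    sigma = 0.5                # shared member spread
    for _ in range(50):        # EM iterations over the training period
        like = w * norm.pdf(obs[:, None], loc=fcst, scale=sigma)
        z = like / like.sum(axis=1, keepdims=True)                          # responsibilities
        w = z.mean(axis=0)
        sigma = np.sqrt(np.sum(z * (obs[:, None] - fcst) ** 2) / len(obs))

    print("BMA weights:", np.round(w, 2), " spread:", round(float(sigma), 2))
    ```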

  10. Forecasting European Droughts using the North American Multi-Model Ensemble (NMME)

    NASA Astrophysics Data System (ADS)

    Thober, Stephan; Kumar, Rohini; Samaniego, Luis; Sheffield, Justin; Schäfer, David; Mai, Juliane

    2015-04-01

    Soil moisture droughts have the potential to diminish crop yields, causing economic damage or even threatening the livelihood of societies. State-of-the-art drought forecasting systems incorporate seasonal meteorological forecasts to estimate future drought conditions. Meteorological forecasting skill (in particular that of precipitation), however, is limited to a few weeks because of the chaotic behaviour of the atmosphere. One of the most important challenges in drought forecasting is to understand how the uncertainty in the atmospheric forcings (e.g., precipitation and temperature) is further propagated into hydrologic variables such as soil moisture. The North American Multi-Model Ensemble (NMME) provides the latest collection of a multi-institutional seasonal forecasting ensemble for precipitation and temperature. In this study, we analyse the skill of NMME forecasts for predicting European drought events. The monthly NMME forecasts are downscaled to daily values to force the mesoscale hydrological model (mHM). The mHM soil moisture forecasts obtained with the forcings of the dynamical models are then compared against those obtained with the Ensemble Streamflow Prediction (ESP) approach. ESP recombines historical meteorological forcings to create a new ensemble forecast. Both forecasts are compared against reference soil moisture conditions obtained using observation-based meteorological forcings. The study is conducted for the period from 1982 to 2009 and covers a large part of the Pan-European domain (10°W to 40°E and 35°N to 55°N). Results indicate that NMME forecasts are better at predicting the reference soil moisture variability than ESP. For example, NMME explains 50% of the variability in contrast to only 31% by ESP at a six-month lead time. The Equitable Threat Score (ETS), which combines the hit and false alarm rates, is analysed for drought events using a 0.2 threshold of a soil moisture percentile index. On average, the NMME-based ensemble forecasts have consistently higher skill than the ESP-based ones (ETS of 13% as compared to 5% at a six-month lead time). Additionally, the ETS ensemble spread of NMME forecasts is considerably narrower than that of ESP; the lower boundary of the NMME ensemble spread coincides most of the time with the ensemble median of ESP. Among the NMME models, NCEP-CFSv2 outperforms the other models in terms of ETS most of the time. Removing the three worst-performing models does not degrade the ensemble performance (neither in skill nor in spread), but would substantially reduce the computational resources required in an operational forecasting system. For major European drought events (e.g., 1990, 1992, 2003, and 2007), NMME forecasts tend to underestimate the area under drought and the drought magnitude during times of drought development. During drought recovery, this underestimation is weaker for the area under drought and even reversed into an overestimation for drought magnitude. This indicates that the NMME models are too wet during drought development and too dry during drought recovery. In summary, soil moisture drought forecasts by NMME are more skillful than those of an ESP-based approach. However, they still show systematic biases in reproducing the observed drought dynamics during drought development and recovery.
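
    For reference, the sketch below computes the Equitable Threat Score from binary drought hits, misses and false alarms using the standard random-hits correction; the synthetic occurrence fields are placeholders, while the 0.2 percentile threshold follows the text.

    ```python
    # Hedged sketch of the Equitable Threat Score (ETS) for binary drought occurrence.
    import numpy as np

    def equitable_threat_score(fcst_drought, obs_drought):
        """fcst_drought, obs_drought: boolean arrays (grid cells x months)."""
        hits = np.sum(fcst_drought & obs_drought)
        misses = np.sum(~fcst_drought & obs_drought)
        false_alarms = np.sum(fcst_drought & ~obs_drought)
        hits_random = (hits + misses) * (hits + false_alarms) / fcst_drought.size
        return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

    rng = np.random.default_rng(4)
    obs = rng.random((100, 336)) < 0.2                 # placeholder drought occurrence (20% of cases)
    fcst = obs ^ (rng.random(obs.shape) < 0.1)         # noisy forecast of the same field
    print("ETS:", round(float(equitable_threat_score(fcst, obs)), 3))
    ```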

  11. Low-wave number analysis of observations and ensemble forecasts to develop metrics for the selection of most realistic members to study multi-scale interactions between the environment and the convective organization of hurricanes: Focus on Rapid Intensification

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Chen, H.; Gopalakrishnan, S.; Haddad, Z. S.

    2017-12-01

    Tropical cyclones (TCs) are the product of complex multi-scale processes and interactions. The role of the environment has long been recognized. However, recent research has shown that convective-scale processes in the hurricane core might also play a crucial role in determining TC intensity and size. Several studies have linked Rapid Intensification (RI) to the characteristics of the convective clouds (shallow versus deep), their organization (isolated versus widespread) and their location with respect to dynamical controls (the vertical shear, the radius of maximum wind). A third set of controls reflects the interaction between storm-scale and large-scale processes. Our goal is to use observations and models to advance the still-lacking understanding of these processes. Recently, hurricane models have improved significantly. However, deterministic forecasts have limitations due to the uncertainty in the representation of the physical processes and initial conditions. A crucial step forward is the use of high-resolution ensembles. We adopt the following approach: i) generate a high-resolution ensemble forecast using HWRF; ii) produce synthetic data (e.g. brightness temperature) from the model fields for direct comparison to satellite observations; iii) develop metrics that allow us to sub-select the realistic members of the ensemble, based on objective measures of the similarity between observed and forecasted structures; iv) for these most-realistic members, determine the skill in forecasting TCs to provide "guidance on guidance"; v) use the members with the best predictive skill to untangle the complex multi-scale interactions. We will report on the first three goals of our research, using forecasts and observations of hurricane Edouard (2014), focusing on RI. We will focus on describing the metrics for the selection of the most appropriate ensemble members, based on applying low-wavenumber analysis (WNA; Hristova-Veleva et al., 2016) to the observed and forecasted 2D fields to develop objective criteria for consistency. We investigate the WNA cartoons of environmental moisture, precipitation structure and surface convergence. We will present the preliminary selection of the most skillful members and will outline our future goals of analyzing the multi-scale interactions using these members.
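
    As a small illustration of the low-wavenumber idea, the sketch below decomposes a storm-centred field sampled on one azimuthal ring into its wavenumber-0 and wavenumber-1 components with an FFT; the ring values are synthetic and the actual WNA metrics involve more fields and radii than shown here.

    ```python
    # Hedged sketch: azimuthal (low-wavenumber) decomposition on a single radius ring.
    import numpy as np

    rng = np.random.default_rng(10)
    n_az = 72
    theta = np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)
    ring = 5.0 + 2.0 * np.cos(theta - 0.7) + rng.normal(scale=0.5, size=n_az)  # synthetic field

    coeffs = np.fft.rfft(ring) / n_az
    amp1 = 2.0 * np.abs(coeffs[1])            # wavenumber-1 amplitude
    phase1 = -np.angle(coeffs[1])             # azimuth of the wavenumber-1 maximum
    print(f"wavenumber-0 mean {coeffs[0].real:.2f}, "
          f"wavenumber-1 amplitude {amp1:.2f} at azimuth {phase1:.2f} rad")
    ```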

  12. A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS

    EPA Science Inventory

    A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...

  13. The Extraction of One-Dimensional Flow Properties from Multi-Dimensional Data Sets

    NASA Technical Reports Server (NTRS)

    Baurle, Robert A.; Gaffney, Richard L., Jr.

    2007-01-01

    The engineering design and analysis of air-breathing propulsion systems relies heavily on zero- or one-dimensional properties (e.g. thrust, total pressure recovery, mixing and combustion efficiency, etc.) for figures of merit. The extraction of these parameters from experimental data sets and/or multi-dimensional computational data sets is therefore an important aspect of the design process. A variety of methods exist for extracting performance measures from multi-dimensional data sets. Some of the information contained in the multi-dimensional flow is inevitably lost when any one-dimensionalization technique is applied. Hence, the unique assumptions associated with a given approach may result in one-dimensional properties that are significantly different than those extracted using alternative approaches. The purpose of this effort is to examine some of the more popular methods used for the extraction of performance measures from multi-dimensional data sets, reveal the strengths and weaknesses of each approach, and highlight various numerical issues that result when mapping data from a multi-dimensional space to a space of one dimension.
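
    As a toy example of why the choice of one-dimensionalization matters, the sketch below compares an area average with a mass-flux-weighted average of the same total-temperature profile across a duct; the profiles and numbers are synthetic assumptions, and the report itself examines several more elaborate methods.

    ```python
    # Hedged sketch: two one-dimensionalization choices applied to the same profile.
    import numpy as np

    ny = 200
    y = np.linspace(0.0, 1.0, ny)                   # cross-stream coordinate (unit span)
    rho_u = 1.0 + 0.8 * np.sin(np.pi * y)           # mass flux per unit area, kg/(m^2 s)
    T0 = 1500.0 - 400.0 * (y - 0.5) ** 2            # total temperature profile, K

    area_avg = T0.mean()                            # area-weighted (uniform grid)
    flux_avg = np.sum(rho_u * T0) / np.sum(rho_u)   # mass-flux-weighted

    print(f"area-averaged T0: {area_avg:.1f} K, mass-flux-averaged T0: {flux_avg:.1f} K")
    ```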

  14. The Art of Extracting One-Dimensional Flow Properties from Multi-Dimensional Data Sets

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Gaffney, R. L.

    2007-01-01

    The engineering design and analysis of air-breathing propulsion systems relies heavily on zero- or one-dimensional properties (e.g. thrust, total pressure recovery, mixing and combustion efficiency, etc.) for figures of merit. The extraction of these parameters from experimental data sets and/or multi-dimensional computational data sets is therefore an important aspect of the design process. A variety of methods exist for extracting performance measures from multi-dimensional data sets. Some of the information contained in the multi-dimensional flow is inevitably lost when any one-dimensionalization technique is applied. Hence, the unique assumptions associated with a given approach may result in one-dimensional properties that are significantly different than those extracted using alternative approaches. The purpose of this effort is to examine some of the more popular methods used for the extraction of performance measures from multi-dimensional data sets, reveal the strengths and weaknesses of each approach, and highlight various numerical issues that result when mapping data from a multi-dimensional space to a space of one dimension.

  15. A non-parametric postprocessor for bias-correcting multi-model ensemble forecasts of hydrometeorological and hydrologic variables

    NASA Astrophysics Data System (ADS)

    Brown, James; Seo, Dong-Jun

    2010-05-01

    Operational forecasts of hydrometeorological and hydrologic variables often contain large uncertainties, for which ensemble techniques are increasingly used. However, the utility of ensemble forecasts depends on the unbiasedness of the forecast probabilities. We describe a technique for quantifying and removing biases from ensemble forecasts of hydrometeorological and hydrologic variables, intended for use in operational forecasting. The technique makes no a priori assumptions about the distributional form of the variables, which is often unknown or difficult to model parametrically. The aim is to estimate the conditional cumulative distribution function (ccdf) of the observed variable given a (possibly biased) real-time ensemble forecast from one or several forecasting systems (multi-model ensembles). The technique is based on Bayesian optimal linear estimation of indicator variables, and is analogous to indicator cokriging (ICK) in geostatistics. By developing linear estimators for the conditional expectation of the observed variable at many thresholds, ICK provides a discrete approximation of the full ccdf. Since ICK minimizes the conditional error variance of the indicator expectation at each threshold, it effectively minimizes the Continuous Ranked Probability Score (CRPS) when infinitely many thresholds are employed. However, the ensemble members used as predictors in ICK, and other bias-correction techniques, are often highly cross-correlated, both within and between models. Thus, we propose an orthogonal transform of the predictors used in ICK, which is analogous to using their principal components in the linear system of equations. This leads to a well-posed problem in which a minimum number of predictors are used to provide maximum information content in terms of the total variance explained. The technique is used to bias-correct precipitation ensemble forecasts from the NCEP Global Ensemble Forecast System (GEFS), for which independent validation results are presented. Extension to multimodel ensembles from the NCEP GFS and Short Range Ensemble Forecast (SREF) systems is also proposed.

  16. ENSO Prediction in the NASA GMAO GEOS-5 Seasonal Forecasting System

    NASA Astrophysics Data System (ADS)

    Kovach, R. M.; Borovikov, A.; Marshak, J.; Pawson, S.; Vernieres, G.

    2016-12-01

    Seasonal-to-interannual coupled forecasts are conducted in near-real time with the Goddard Earth Observing System (GEOS) Atmosphere-Ocean General Circulation Model (AOGCM). A 30-year suite of 9-month hindcasts is available, initialized with the MERRA-Ocean, MERRA-Land, and MERRA atmospheric fields. These forecasts are used to predict the timing and magnitude of ENSO and other short-term climate variability. The 2015 El Niño peaked in November 2015 and was considered a "very strong" event, with Equatorial Pacific Ocean sea-surface-temperature (SST) anomalies higher than 2.0 °C. These very strong temperature anomalies began in Sep/Oct/Nov (SON) of 2015 and persisted through Dec/Jan/Feb (DJF) of 2016. The other two very strong El Niño events recently recorded occurred in 1982/83 and 1997/98. The GEOS-5 system began predicting a very strong El Niño for SON starting with the March 2015 forecast. At this time, the GMAO forecast was an outlier in both the NMME and IRI multi-model ensemble prediction plumes. The GMAO May 2015 forecast for the November 2015 peak in temperature anomaly in the Niño3.4 region was in excellent agreement with the real event, but in May this forecast was still one of the outliers in the multi-model forecasts. The GEOS-5 May 2015 forecast also correctly predicted the weakening of the Eastern Pacific (Niño1+2) anomalies for SON. We will present a summary of the NASA GMAO GEOS-5 Seasonal Forecast System skill based on historical hindcasts. Initial conditions and the prediction of ocean surface and subsurface evolution for the 2015/16 El Niño will be compared to the 1997/98 event. The GEOS-5 capability to predict precipitation, i.e., to model the teleconnection patterns associated with El Niño, will also be shown. To conclude, we will highlight some new developments in the GEOS forecasting system.

  17. A formulation of multidimensional growth models for the assessment and forecast of technology attributes

    NASA Astrophysics Data System (ADS)

    Danner, Travis W.

    Developing technology systems requires all manner of investment---engineering talent, prototypes, test facilities, and more. Even for simple design problems the investment can be substantial; for complex technology systems, the development costs can be staggering. The profitability of a corporation in a technology-driven industry is crucially dependent on maximizing the effectiveness of research and development investment. Decision-makers charged with allocation of this investment are forced to choose between the further evolution of existing technologies and the pursuit of revolutionary technologies. At risk on the one hand is excessive investment in an evolutionary technology which has only limited availability for further improvement. On the other hand, the pursuit of a revolutionary technology may mean abandoning momentum and the potential for substantial evolutionary improvement resulting from the years of accumulated knowledge. The informed answer to this question, evolutionary or revolutionary, requires knowledge of the expected rate of improvement and the potential a technology offers for further improvement. This research is dedicated to formulating the assessment and forecasting tools necessary to acquire this knowledge. The same physical laws and principles that enable the development and improvement of specific technologies also limit the ultimate capability of those technologies. Researchers have long used this concept as the foundation for modeling technological advancement through extrapolation by analogy to biological growth models. These models are employed to depict technology development as it asymptotically approaches limits established by the fundamental principles on which the technological approach is based. This has proven an effective and accurate approach to modeling and forecasting simple single-attribute technologies. With increased system complexity and the introduction of multiple system objectives, however, the usefulness of this modeling technique begins to diminish. With the introduction of multiple objectives, researchers often abandon technology growth models for scoring models and technology frontiers. While both approaches possess advantages over current growth models for the assessment of multi-objective technologies, each lacks a necessary dimension for comprehensive technology assessment. By collapsing multiple system metrics into a single, non-intuitive technology measure, scoring models provide a succinct framework for multi-objective technology assessment and forecasting. Yet, with no consideration of physical limits, scoring models provide no insight as to the feasibility of a particular combination of system capabilities. They only indicate that a given combination of system capabilities yields a particular score. Conversely, technology frontiers are constructed with the distinct objective of providing insight into the feasibility of system capability combinations. Yet again, upper limits to overall system performance are ignored. Furthermore, the data required to forecast subsequent technology frontiers is often inhibitive. In an attempt to reincorporate the fundamental nature of technology advancement as bound by physical principles, researchers have sought to normalize multi-objective systems whereby the variability of a single system objective is eliminated as a result of changes in the remaining objectives. 
This drastically limits the applicability of the resulting technology model because it is only applicable for a single setting of all other system attributes. Attempts to maintain the interaction between the growth curves of each technical objective of a complex system have thus far been limited to qualitative and subjective consideration. This research proposes the formulation of multidimensional growth models as an approach to simulating the advancement of multi-objective technologies towards their upper limits. Multidimensional growth models were formulated by noticing and exploiting the correlation between technology growth models and technology frontiers. Both are frontiers in actuality. The technology growth curve is a frontier between capability levels of a single attribute and time, while a technology frontier is a frontier between the capability levels of two or more attributes. Multidimensional growth models are formulated by exploiting the mathematical significance of this correlation. The result is a model that can capture both the interaction between multiple system attributes and their expected rates of improvement over time. The fundamental nature of technology development is maintained, and interdependent growth curves are generated for each system metric with minimal data requirements. Being founded on the basic nature of technology advancement, relative to physical limits, the availability for further improvement can be determined for a single metric relative to other system measures of merit. A by-product of this modeling approach is a single n-dimensional technology frontier linking all n system attributes with time. This provides an environment capable of forecasting future system capability in the form of advancing technology frontiers. The ability of a multidimensional growth model to capture the expected improvement of a specific technological approach is dependent on accurately identifying the physical limitations to each pertinent attribute. This research investigates two potential approaches to identifying those physical limits, a physics-based approach and a regression-based approach. The regression-based approach has found limited acceptance among forecasters, although it does show potential for estimating upper limits with a specified degree of uncertainty. Forecasters have long favored physics-based approaches for establishing the upper limit to unidimensional growth models. The task of accurately identifying upper limits has become increasingly difficult with the extension of growth models into multiple dimensions. A lone researcher may be able to identify the physical limitation to a single attribute of a simple system; however, as system complexity and the number of attributes increases, the attention of researchers from multiple fields of study is required. Thus, limit identification is itself an area of research and development requiring some level of investment. Whether estimated by physics or regression-based approaches, predicted limits will always have some degree of uncertainty. This research takes the approach of quantifying the impact of that uncertainty on model forecasts rather than heavily endorsing a single technique to limit identification. In addition to formulating the multidimensional growth model, this research provides a systematic procedure for applying that model to specific technology architectures. 
Researchers and decision-makers are able to investigate the potential for additional improvement within that technology architecture and to estimate the expected cost of each incremental improvement relative to the cost of past improvements. In this manner, multidimensional growth models provide the necessary information to set reasonable program goals for the further evolution of a particular technological approach or to establish the need for revolutionary approaches in light of the constraining limits of conventional approaches.
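
    The single-attribute growth models that this work generalizes are often taken to be logistic (Pearl) curves approaching an upper limit; the sketch below fits such a curve to synthetic yearly data and reads off the estimated limit and the remaining headroom. The functional form, the data and the parameter values are illustrative assumptions only, not material from the dissertation.

    ```python
    # Hedged sketch: fit a logistic growth curve and estimate the remaining improvement headroom.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, L, k, t0):
        """Pearl curve: attribute level approaching the physical limit L."""
        return L / (1.0 + np.exp(-k * (t - t0)))

    years = np.arange(1970, 2010)
    true = logistic(years, L=100.0, k=0.25, t0=1990.0)
    data = true + np.random.default_rng(5).normal(scale=2.0, size=years.size)  # synthetic record

    (L, k, t0), _ = curve_fit(logistic, years, data, p0=[data.max(), 0.1, years.mean()])
    print(f"estimated limit L = {L:.1f}, midpoint year = {t0:.1f}")
    print(f"remaining headroom at {years[-1]}: {L - logistic(years[-1], L, k, t0):.1f}")
    ```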

  18. Integrability and chemical potential in the (3 + 1)-dimensional Skyrme model

    NASA Astrophysics Data System (ADS)

    Alvarez, P. D.; Canfora, F.; Dimakis, N.; Paliathanasis, A.

    2017-10-01

    Using a remarkable mapping from the original (3 + 1)-dimensional Skyrme model to the Sine-Gordon model, we construct the first analytic examples of Skyrmions as well as of Skyrmion-anti-Skyrmion bound states within a finite box in (3 + 1)-dimensional flat space-time. An analytic upper bound on the number of these Skyrmion-anti-Skyrmion bound states is derived. We compute the critical isospin chemical potential beyond which these Skyrmions cease to exist. With these tools, we also construct topologically protected time-crystals: time-periodic configurations whose time-dependence is protected by their non-trivial winding number. These are striking realizations of the ideas of Shapere and Wilczek. The critical isospin chemical potential for these time-crystals is determined.

  19. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R.

    2009-04-01

    Regional climate models (RCMs) have long been used to downscale global climate simulations. In contrast the ability of RCMs to downscale seasonal climate forecasts has received little attention. The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question, Does dynamical downscaling using RCMs provide additional useful information for seasonal forecasts made by global models? MRED is using a suite of RCMs to downscale seasonal forecasts produced by the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) seasonal forecast system and the NASA GEOS5 system. The initial focus is on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the usefulness of higher resolution for near-surface fields influenced by high resolution orography. Each RCM covers the conterminous U.S. at approximately 32 km resolution, comparable to the scale of the North American Regional Reanalysis (NARR) which will be used to evaluate the models. The forecast ensemble for each RCM is comprised of 15 members over a period of 22+ years (from 1982 to 2003+) for the forecast period 1 December - 30 April. Each RCM will create a 15-member lagged ensemble by starting on different dates in the preceding November. This results in a 120-member ensemble for each projection (8 RCMs by 15 members per RCM). The RCMs will be continually updated at their lateral boundaries using 6-hourly output from CFS or GEOS5. Hydrometeorological output will be produced in a standard netCDF-based format for a common analysis grid, which simplifies both model intercomparison and the generation of ensembles. MRED will compare individual RCM and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs). Metrics of ensemble spread will also be evaluated. Extensive process-oriented analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high resolution regional models, which we believe will define a strategy for more skillful and useful regional seasonal climate forecasts.

  20. Some theorems and properties of multi-dimensional fractional Laplace transforms

    NASA Astrophysics Data System (ADS)

    Ahmood, Wasan Ajeel; Kiliçman, Adem

    2016-06-01

    The aim of this work is to study theorems and properties of the one-dimensional fractional Laplace transform, to generalize some of these properties so that they remain valid for the multi-dimensional fractional Laplace transform, and to give the definition of the multi-dimensional fractional Laplace transform. The study treats the one-dimensional fractional Laplace transform for functions of a single independent variable, together with some of its important theorems and properties, and extends several of these properties to the multi-dimensional fractional Laplace transform. We also obtain a fractional Laplace inversion theorem after a short survey of fractional analysis based on the modified Riemann-Liouville derivative.

  1. Forecasting transitions in systems with high-dimensional stochastic complex dynamics: a linear stability analysis of the tangled nature model.

    PubMed

    Cairoli, Andrea; Piovani, Duccio; Jensen, Henrik Jeldtoft

    2014-12-31

    We propose a new procedure to monitor and forecast the onset of transitions in high-dimensional complex systems. We illustrate our procedure with an application to the tangled nature model of evolutionary ecology. The quasistable configurations of the full stochastic dynamics are taken as input for a stability analysis by means of the deterministic mean-field equations. Numerical analysis of the high-dimensional stability matrix allows us to identify unstable directions associated with eigenvalues with a positive real part. The overlap of the instantaneous configuration vector of the full stochastic system with the eigenvectors of the unstable directions of the deterministic mean-field approximation is found to be a good early-warning indicator of the intermittently occurring transitions.
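
    A bare-bones numerical sketch of the diagnostic is shown below: eigen-decompose a stability (Jacobian) matrix, collect the directions whose eigenvalues have positive real part, and project the instantaneous configuration onto that unstable subspace. The random matrix and state vector are placeholders, not the tangled nature model's mean-field equations.

    ```python
    # Hedged sketch: overlap of a configuration with the unstable subspace of a Jacobian.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 200
    J = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n)) - 0.05 * np.eye(n)  # stand-in stability matrix

    eigvals, eigvecs = np.linalg.eig(J)
    unstable = eigvecs[:, eigvals.real > 0]           # directions with positive growth rate
    print(f"{unstable.shape[1]} unstable directions out of {n}")

    state = rng.normal(size=n)
    state /= np.linalg.norm(state)                    # instantaneous configuration (unit norm)

    Q, _ = np.linalg.qr(unstable)                     # orthonormal basis of the unstable subspace
    overlap = np.linalg.norm(Q.conj().T @ state)      # early-warning indicator, 0 (none) to 1 (all)
    print(f"overlap with unstable subspace: {overlap:.3f}")
    ```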

  2. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study heavy precipitation processes will be presented.

  3. Upper bounds on the error probabilities and asymptotic error exponents in quantum multiple state discrimination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Audenaert, Koenraad M. R.; Mosonyi, Milán

    2014-10-01

    We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ₁, …, σᵣ. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, for which it has been hypothesized that it is given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ₁, …, σᵣ), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences C(σⱼ, σₖ) over all pairs j < k.

  4. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    PubMed

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

    Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently for one-step prediction, that is, predicting one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either use a one-step model recursively or treat the multi-step task as an independent model; they generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented into various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various prediction horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.

  5. Using Temperature Forecasts to Improve Seasonal Streamflow Forecasts in the Colorado and Rio Grande Basins

    NASA Astrophysics Data System (ADS)

    Lehner, F.; Wood, A.; Llewellyn, D.; Blatchford, D. B.; Goodbody, A. G.; Pappenberger, F.

    2017-12-01

    Recent studies have documented the influence of increasing temperature on streamflow across the American West, including snowmelt-driven rivers such as the Colorado or Rio Grande. At the same time, some basins are reporting decreasing skill in seasonal streamflow forecasts, termed water supply forecasts (WSFs), over the recent decade. While the skill of seasonal precipitation forecasts from dynamical models remains low, their skill in predicting seasonal temperature variations could potentially be harvested for WSFs to account for non-stationarity in regional temperatures. Here, we investigate whether WSF skill can be improved by incorporating seasonal temperature forecasts from dynamical forecasting models (from the North American Multi-Model Ensemble and the European Centre for Medium-Range Weather Forecasts System 4) into traditional statistical forecast models. We find improved streamflow forecast skill relative to traditional WSF approaches in a majority of headwater locations in the Colorado and Rio Grande basins. Incorporation of temperature into WSFs thus provides a promising avenue to increase the robustness of current forecasting techniques in the face of continued regional warming.
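
    A hedged sketch of the underlying idea, adding a seasonal temperature forecast as an extra predictor in a regression-based water supply forecast and comparing cross-validated skill, is given below; the synthetic predictors, the linear model and the skill metric are illustrative assumptions rather than the study's configuration.

    ```python
    # Hedged sketch: does a seasonal temperature forecast add skill to a regression-based WSF?
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n_years = 35
    swe = rng.normal(size=n_years)                         # April 1 snow water equivalent (synthetic)
    temp_fcst = rng.normal(size=n_years)                   # seasonal temperature forecast (synthetic)
    flow = 0.8 * swe - 0.4 * temp_fcst + rng.normal(scale=0.4, size=n_years)  # seasonal runoff

    base = cross_val_score(LinearRegression(), swe[:, None], flow, cv=5, scoring="r2").mean()
    aug = cross_val_score(LinearRegression(), np.column_stack([swe, temp_fcst]), flow,
                          cv=5, scoring="r2").mean()
    print(f"SWE only R^2: {base:.2f}   SWE + temperature forecast R^2: {aug:.2f}")
    ```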

  6. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are also only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random variable spaces.

  7. Short-term Wind Forecasting at Wind Farms using WRF-LES and Actuator Disk Model

    NASA Astrophysics Data System (ADS)

    Kirkil, Gokhan

    2017-04-01

    Short-term wind forecasts are obtained for a wind farm on mountainous terrain using WRF-LES. Multi-scale simulations are also performed using different PBL parameterizations. Turbines are parameterized using an actuator disk model. The LES models improved the forecasts. A statistical error analysis is performed and ramp events are analyzed. The complex topography of the study area affects model performance; in particular, the accuracy of the wind forecasts was poor for cross valley-mountain flows. By means of LES, we gain new knowledge about the sources of spatial and temporal variability of wind fluctuations, such as the configuration of the wind turbines.

  8. Butyrophenone on O-TiO2(110): one-dimensional motion in a weakly confined potential well.

    PubMed

    Jensen, Stephen C; Shank, Alex; Madix, Robert J; Friend, Cynthia M

    2012-04-24

    We demonstrate the one-dimensional confinement of weakly bound butyrophenone molecules between strongly bound complexes formed via reaction with oxygen on TiO₂(110). Butyrophenone weakly bound to Ti rows through the carbonyl oxygen diffuses freely in one dimension along the rows even at 55 K, persisting for many minutes before hopping out of the 1-D well. Quantitative analysis yields an estimate of the migration barrier of 0.11 eV and a frequency factor of 6.5 × 10⁹ Hz. These studies demonstrate that weakly bound organic molecules can be confined on a surface by creating molecular barriers, potentially altering their assembly.
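
    As a quick check of the reported kinetics, the sketch below plugs the quoted barrier and frequency factor into the standard Arrhenius hop-rate expression at the 55 K measurement temperature; this is illustrative arithmetic only, not an analysis taken from the paper.

    ```python
    # Hedged sketch: Arrhenius hop rate from the quoted barrier and frequency factor.
    import numpy as np

    k_B = 8.617e-5          # Boltzmann constant, eV/K
    E_a = 0.11              # migration barrier, eV (from the abstract)
    nu = 6.5e9              # attempt frequency, Hz (from the abstract)
    T = 55.0                # surface temperature, K

    rate = nu * np.exp(-E_a / (k_B * T))      # expected hops per second along the Ti row
    print(f"hop rate at {T:.0f} K: {rate:.2f} s^-1 (~{1 / rate:.1f} s per hop)")
    ```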

  9. Three Dimensional Flow and Pressure Patterns in a Single Pocket of a Hydrostatic Journal Bearing

    NASA Technical Reports Server (NTRS)

    Braun, M. Jack; Dzodzo, Milorad B.

    1996-01-01

    The flow in a hydrostatic pocket is described by a mathematical model that uses the three-dimensional Navier-Stokes equations written in terms of the primary variables, u, v, w, and p. Using a conservative formulation, a finite volume multi-block method is applied through a collocated, body-fitted grid. The flow is simulated in a shallow pocket with a depth/length ratio of 0.02. The flow structures obtained and described by the authors in their previous two-dimensional models are made visible in their three-dimensional aspect for the Couette flow. It has been found that the flow regimes formed central and secondary vortical cells with three-dimensional corkscrew-like structures that lead the fluid on an outward-bound path in the axial direction of the pocket. The position of the central vortical cell center is at the exit region of the capillary restrictor feedline. It has also been determined that a fluid turn-around zone occupies all the upstream space between the floor of the pocket and the runner, thus preventing any flow exit through the upstream port. The corresponding pressure distribution under the shaft is presented as well. It was clearly established that for the Couette-dominated case the pressure varies significantly in the pocket in the circumferential direction, while its variation is less pronounced axially.

  10. Recent developments and assessment of a three-dimensional PBL parameterization for improved wind forecasting over complex terrain

    NASA Astrophysics Data System (ADS)

    Kosovic, B.; Jimenez, P. A.; Haupt, S. E.; Martilli, A.; Olson, J.; Bao, J. W.

    2017-12-01

    At present, the planetary boundary layer (PBL) parameterizations available in most numerical weather prediction (NWP) models are one-dimensional. One-dimensional parameterizations are based on the assumption of horizontal homogeneity. This homogeneity assumption is appropriate for grid cell sizes greater than 10 km. However, for mesoscale simulations of flows in complex terrain with grid cell sizes below 1 km, the assumption of horizontal homogeneity is violated. Applying a one-dimensional PBL parameterization to high-resolution mesoscale simulations in complex terrain could result in significant error. For high-resolution mesoscale simulations of flows in complex terrain, we have therefore developed and implemented a three-dimensional (3D) PBL parameterization in the Weather Research and Forecasting (WRF) model. The implementation of the 3D PBL scheme is based on the developments outlined by Mellor and Yamada (1974, 1982). Our implementation in the Weather Research and Forecasting (WRF) model uses a purely algebraic (level 2) model to diagnose the turbulent fluxes. To evaluate the performance of the 3D PBL model, we use observations from the Wind Forecast Improvement Project 2 (WFIP2). The WFIP2 field study took place in the Columbia River Gorge area from 2015 to 2017. We focus on selected cases when physical phenomena of significance for wind energy applications, such as mountain waves, topographic wakes, and gap flows, were observed. Our assessment of the 3D PBL parameterization also considers a large-eddy simulation (LES). We carried out a nested LES with grid cell sizes of 30 m and 10 m covering a large fraction of the WFIP2 study area. Both LES domains were discretized using 6000 x 3000 x 200 grid cells in the zonal, meridional, and vertical directions, respectively. The LES results are used to assess the relative magnitude of horizontal gradients of turbulent stresses and fluxes in comparison to vertical gradients. The presentation will highlight the advantages of the 3D PBL scheme in regions of complex terrain.

  11. Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.; Mred Team

    2010-12-01

    The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models each are downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area mean precipitation is accurately simulated the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.

  12. Evaluation of Multi-Model Ensemble System for Seasonal and Monthly Prediction

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Van den Dool, H. M.

    2013-12-01

    Since August 2011, the realtime seasonal forecasts of the U.S. National Multi-Model Ensemble (NMME) have been made on the 8th of each month by the NCEP Climate Prediction Center (CPC). During the first year, the participating models in the realtime NMME forecast were NCEP/CFSv1&2, GFDL/CM2.2, NCAR/U.Miami/COLA/CCSM3, NASA/GEOS5, and IRI/ECHAM-a & ECHAM-f. The Canadian Meteorological Center models CanCM3 and CanCM4 replaced CFSv1 and the IRI models in the second year. The NMME team at CPC collects three variables (precipitation, 2-meter temperature and sea surface temperature) from each modeling center on a 1x1 degree global grid, removes systematic errors, makes the grand ensemble mean with equal weight for each model, and constructs a probability forecast with equal weight for each member. The team then provides the NMME forecast to the operational CPC forecaster responsible for the seasonal and monthly outlook each month. Verification of the seasonal and monthly prediction from NMME is conducted by calculating the anomaly correlation (AC) from the 30-year hindcasts (1982-2011) of the individual models and the NMME ensemble. The motivation of this study is to provide skill benchmarks for future improvements of the NMME seasonal and monthly prediction system. The experimental (Phase I) stage of the project already supplies routine guidance to users of the NMME forecasts.
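    As a rough illustration of the anomaly correlation (AC) verification step described above, the sketch below computes a centered AC between one hindcast field and the corresponding observations. It is not the CPC verification code; the array shapes, the explicit climatology argument and the function name are assumptions made for the example.

```python
import numpy as np

def anomaly_correlation(forecast, observed, climatology):
    """Centered anomaly correlation between a forecast field and observations.

    All three inputs are arrays of the same shape (e.g. a lat-lon grid of
    seasonal means); `climatology` is the long-term mean used to form anomalies.
    """
    fa = (forecast - climatology).ravel()
    oa = (observed - climatology).ravel()
    fa = fa - fa.mean()
    oa = oa - oa.mean()
    return float(np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2)))
```

    In a hindcast setting this quantity would be computed for each start month and lead time over 1982-2011 and then compared across the individual models and the grand ensemble mean.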

  13. Evaluation of NMME temperature and precipitation bias and forecast skill for South Asia

    NASA Astrophysics Data System (ADS)

    Cash, Benjamin A.; Manganello, Julia V.; Kinter, James L.

    2017-08-01

    Systematic error and forecast skill for temperature and precipitation in two regions of Southern Asia are investigated using hindcasts initialized May 1 from the North American Multi-Model Ensemble. We focus on two contiguous but geographically and dynamically diverse regions: the Extended Indian Monsoon Rainfall region (70-100E, 10-30N) and the nearby mountainous area of Pakistan and Afghanistan (60-75E, 23-39N). Forecast skill is assessed using the sign test framework, a rigorous statistical method that can be applied to non-Gaussian variables such as precipitation and to different ensemble sizes without introducing bias. We find that models show significant systematic error in both precipitation and temperature for both regions. The multi-model ensemble mean (MMEM) consistently yields the lowest systematic error and the highest forecast skill for both regions and variables. However, we also find that the MMEM consistently provides a statistically significant increase in skill over climatology only in the first month of the forecast. While the MMEM tends to provide higher overall skill than climatology later in the forecast, the differences are not significant at the 95% level. We also find that MMEMs constructed with a relatively small number of ensemble members per model can equal or exceed the skill of MMEMs constructed with more members. This suggests that some ensemble members either provide no contribution to overall skill or even detract from it.
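    The sign test mentioned above reduces to a binomial test on paired wins and losses between two competing forecasts. The sketch below is a generic version of that idea, not the authors' code; the error arrays and the function name are illustrative assumptions.

```python
import numpy as np
from scipy.stats import binomtest

def sign_test_pvalue(err_model, err_reference):
    """Two-sided sign test: does the model beat the reference more often than chance?

    `err_model` and `err_reference` are paired absolute errors (one pair per
    forecast case); ties are discarded because they carry no information.
    """
    diffs = np.asarray(err_reference, float) - np.asarray(err_model, float)
    diffs = diffs[diffs != 0]
    wins = int(np.sum(diffs > 0))      # cases where the model error is smaller
    return binomtest(wins, n=diffs.size, p=0.5, alternative="two-sided").pvalue
```

    Because the statistic uses only the sign of each paired difference, it is insensitive to the non-Gaussian distribution of precipitation errors and to differing ensemble sizes, which is the property the record highlights.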

  14. The ecological forecast horizon, and examples of its uses and determinants

    PubMed Central

    Petchey, Owen L; Pontarp, Mikael; Massie, Thomas M; Kéfi, Sonia; Ozgul, Arpat; Weilenmann, Maja; Palamara, Gian Marco; Altermatt, Florian; Matthews, Blake; Levine, Jonathan M; Childs, Dylan Z; McGill, Brian J; Schaepman, Michael E; Schmid, Bernhard; Spaak, Piet; Beckerman, Andrew P; Pennekamp, Frank; Pearse, Ian S; Vasseur, David

    2015-01-01

    Forecasts of ecological dynamics in changing environments are increasingly important, and are available for a plethora of variables, such as species abundance and distribution, community structure and ecosystem processes. There is, however, a general absence of knowledge about how far into the future, or other dimensions (space, temperature, phylogenetic distance), useful ecological forecasts can be made, and about how features of ecological systems relate to these distances. The ecological forecast horizon is the dimensional distance for which useful forecasts can be made. Five case studies illustrate the influence of various sources of uncertainty (e.g. parameter uncertainty, environmental variation, demographic stochasticity and evolution), level of ecological organisation (e.g. population or community), and organismal properties (e.g. body size or number of trophic links) on temporal, spatial and phylogenetic forecast horizons. Insights from these case studies demonstrate that the ecological forecast horizon is a flexible and powerful tool for researching and communicating ecological predictability. It also has potential for motivating and guiding agenda setting for ecological forecasting research and development. PMID:25960188

  15. Data Assimilation of AIRS Water Vapor Profiles: Impact on Precipitation Forecasts for Atmospheric River Cases Affecting the Western of the United States

    NASA Technical Reports Server (NTRS)

    Blankenship, Clay; Zavodsky, Bradley; Jedlovec, Gary; Wick, Gary; Neiman, Paul

    2013-01-01

    Atmospheric rivers are transient, narrow regions in the atmosphere responsible for the transport of large amounts of water vapor. These phenomena can have a large impact on precipitation. In particular, they can be responsible for intense rain events on the western coast of North America during the winter season. This paper focuses on attempts to improve forecasts of heavy precipitation events in the Western US due to atmospheric rivers. Profiles of water vapor derived from Atmospheric Infrared Sounder (AIRS) observations are combined with GFS forecasts by three-dimensional variational data assimilation in the Gridpoint Statistical Interpolation (GSI) system. Weather Research and Forecasting (WRF) forecasts initialized from the combined field are compared to forecasts initialized from the GFS forecast only for three test cases in the winter of 2011. Results will be presented showing the impact of the AIRS profile data on the water vapor and temperature fields, and on the resultant precipitation forecasts.

  16. The GISS sounding temperature impact test

    NASA Technical Reports Server (NTRS)

    Halem, M.; Ghil, M.; Atlas, R.; Susskind, J.; Quirk, W. J.

    1978-01-01

    The impact of DST 5 and DST 6 satellite sounding data on mid-range forecasting was studied. The GISS temperature sounding technique, the GISS time-continuous four-dimensional assimilation procedure based on optimal statistical analysis, the GISS forecast model, and the verification techniques developed, including impact on local precipitation forecasts, are described. It is found that the impact of sounding data was substantial and beneficial for the winter test period, Jan. 29 - Feb. 21, 1976. Forecasts started from initial states obtained with the aid of satellite data showed a mean improvement of about 4 points in the 48- and 72-hour S1 scores as verified over North America and Europe. This corresponds to an 8 to 12 hour improvement in the forecast range at 48 hours. An automated local precipitation forecast model applied to 128 cities in the United States showed on average a 15% improvement when satellite data were used for the numerical forecasts. The improvement was 75% in the Midwest.

  17. Development and Evaluation of a Gridded CrIS/ATMS Visualization for Operational Forecasting

    NASA Astrophysics Data System (ADS)

    Zavodsky, B.; Smith, N.; Dostalek, J.; Stevens, E.; Nelson, K.; Weisz, E.; Berndt, E.; Line, W.; Barnet, C.; Gambacorta, A.; Reale, A.; Hoese, D.

    2016-12-01

    Upper-air observations from radiosondes are limited in spatial coverage and are primarily launched only at synoptic times, potentially missing evolving air masses. For forecast challenges which require diagnosis of the three-dimensional extent of the atmosphere, these observations may not be enough for forecasters. Currently, forecasters rely on model output alongside the sparse network of radiosondes for characterizing the three-dimensional atmosphere. However, satellite information can help fill in the spatial and temporal gaps in radiosonde observations. In particular, the NOAA-Unique Combined Atmospheric Processing System (NUCAPS) combines infrared soundings from the Cross-track Infrared Sounder (CrIS) with the Advanced Technology Microwave Sounder (ATMS) to retrieve profiles of temperature and moisture. NUCAPS retrievals are available in a wide swath of observations with approximately 45-km spatial resolution at nadir and a local Equator crossing time of 1:30 A.M./P.M., enabling three-dimensional observations at asynoptic times. For forecasters to make the best use of these observations, these satellite-based soundings must be displayed in the National Weather Service's decision support system, the Advanced Weather Interactive Processing System (AWIPS). NUCAPS profiles are currently available in AWIPS as point observations that can be displayed on Skew-T diagrams. This presentation discusses the development of a new visualization capability for NUCAPS within AWIPS that will allow the data to be viewed in gridded horizontal maps or as vertical cross sections, giving forecasters additional tools for diagnosing atmospheric features. Forecaster feedback and examples of operational applications from two testbed activities will be highlighted. The first is a product evaluation at the Hazardous Weather Testbed for severe weather (high winds, large hail, tornadoes), in which the vertical distribution of temperature and moisture ahead of frontal boundaries was assessed. The second is a product evaluation with the Alaska Center Weather Service Unit for cold air aloft, in which detection of the three-dimensional extent of air temperatures lower than -65°C (temperatures at which jet fuel may begin to freeze) was assessed.

  18. Progress in multi-dimensional upwind differencing

    NASA Technical Reports Server (NTRS)

    Vanleer, Bram

    1992-01-01

    Multi-dimensional upwind-differencing schemes for the Euler equations are reviewed. On the basis of the first-order upwind scheme for a one-dimensional convection equation, the two approaches to upwind differencing are discussed: the fluctuation approach and the finite-volume approach. The usual extension of the finite-volume method to the multi-dimensional Euler equations is not entirely satisfactory, because the direction of wave propagation is always assumed to be normal to the cell faces. This leads to smearing of shock and shear waves when these are not grid-aligned. Multi-directional methods, in which upwind-biased fluxes are computed in a frame aligned with a dominant wave, overcome this problem, but at the expense of robustness. The same is true for the schemes incorporating a multi-dimensional wave model not based on multi-dimensional data but on an 'educated guess' of what they could be. The fluctuation approach offers the best possibilities for the development of genuinely multi-dimensional upwind schemes. Three building blocks are needed for such schemes: a wave model, a way to achieve conservation, and a compact convection scheme. Recent advances in each of these components are discussed; putting them all together is the present focus of a worldwide research effort. Some numerical results are presented, illustrating the potential of the new multi-dimensional schemes.
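    For reference, the one-dimensional building block mentioned at the start of the abstract is easy to state in code. The sketch below advances the linear convection equation u_t + a*u_x = 0 with the first-order upwind scheme on a periodic grid; it is only the 1D prototype, not one of the multi-dimensional schemes reviewed, and the grid setup and function name are assumptions.

```python
import numpy as np

def upwind_advect(u0, a, dx, dt, nsteps):
    """First-order upwind update for u_t + a*u_x = 0 on a periodic 1-D grid."""
    u = np.asarray(u0, dtype=float).copy()
    c = a * dt / dx                           # Courant number; stability needs |c| <= 1
    for _ in range(nsteps):
        if a >= 0.0:
            u = u - c * (u - np.roll(u, 1))   # flux taken from the left neighbour
        else:
            u = u - c * (np.roll(u, -1) - u)  # flux taken from the right neighbour
    return u
```

    The multi-dimensional difficulty discussed in the abstract is precisely that this upwind direction is no longer aligned with cell faces once waves propagate obliquely to the grid.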

  19. Quantum Bayesian networks with application to games displaying Parrondo's paradox

    NASA Astrophysics Data System (ADS)

    Pejic, Michael

    Bayesian networks and their accompanying graphical models are widely used for prediction and analysis across many disciplines. We will reformulate these in terms of linear maps. This reformulation will suggest a natural extension, which we will show is equivalent to standard textbook quantum mechanics. Therefore, this extension will be termed quantum. However, the term quantum should not be taken to imply that this extension is necessarily of utility only in situations traditionally thought of as in the domain of quantum mechanics. In principle, it may be employed in any modelling situation, say forecasting the weather or the stock market; it is up to experiment to determine if this extension is useful in practice. Even restricting to the domain of quantum mechanics, with this new formulation the advantages of Bayesian networks can be maintained for models incorporating quantum and mixed classical-quantum behavior. The use of these will be illustrated by various basic examples. Parrondo's paradox refers to the situation where two multi-round games with a fixed winning criterion, each giving one player a probability greater than one-half of winning, are combined. Using a possibly biased coin to determine the rule to employ for each round, paradoxically, the previously losing player now wins the combined game with probability greater than one-half. Using the extended Bayesian networks, we will formulate and analyze classical observed, classical hidden, and quantum versions of a game that displays this paradox, finding bounds for the discrepancy from naive expectations for the occurrence of the paradox. A quantum paradox inspired by Parrondo's paradox will also be analyzed. We will prove a bound for the discrepancy from naive expectations for this paradox as well. Games involving quantum walks that achieve this bound will be presented.
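    For readers unfamiliar with the paradox itself, the following sketch simulates the standard classical coin-flip version: game A is slightly losing, game B is capital-dependent and slightly losing, yet a random mixture of the two is winning. This is not the quantum Bayesian-network formulation of the record; the bias epsilon and the payoffs are the conventional textbook choices.

```python
import random

EPS = 0.005   # small bias that makes both individual games losing

def play_a(capital):
    return capital + (1 if random.random() < 0.5 - EPS else -1)

def play_b(capital):
    p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return capital + (1 if random.random() < p else -1)

def play_mix(capital):
    # an unbiased coin chooses which rule to apply this round
    return play_a(capital) if random.random() < 0.5 else play_b(capital)

def average_final_capital(game, rounds=5000, trials=200):
    total = 0
    for _ in range(trials):
        capital = 0
        for _ in range(rounds):
            capital = game(capital)
        total += capital
    return total / trials

print(average_final_capital(play_a))    # negative on average
print(average_final_capital(play_b))    # negative on average
print(average_final_capital(play_mix))  # positive on average
```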

  20. Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2013-04-01

    The objective of this research is to obtain the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51-member ensemble prediction system (EPS), which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and better calibrated than the EPS. The 51 members of the EPS are clustered to 8 weighted representative members (RMs), chosen to minimize the within-cluster spread while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14 km and 3 km. This process creates four distinguishable ensembles which are used as input to statistical post-processing methods requiring multi-model forecasts. Two such methods are presented here. The first, Bayesian Model Averaging, has been shown to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression, is also indicating positive results. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability integral transforms and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather-sensitive decision-making processes and, most importantly, the economic impact forecasts will have on their income. Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
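    Among the verification scores listed, the CRPS is the one most commonly computed directly from raw ensemble members. The sketch below uses the standard finite-ensemble estimator for a single forecast-observation pair; it illustrates the score only and is not the authors' verification code.

```python
import numpy as np

def ensemble_crps(members, obs):
    """CRPS of an ensemble forecast against one scalar observation.

    Uses the standard estimator: mean|x_i - y| - 0.5 * mean|x_i - x_j|.
    Lower is better; units are those of the forecast variable (here m/s).
    """
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

print(ensemble_crps([4.1, 5.3, 6.0, 5.5, 4.8], obs=5.9))
```

    Averaging this quantity over all stations and forecast times gives the aggregate CRPS used to compare the post-processing methods.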

  1. A Reduced Basis Method with Exact-Solution Certificates for Symmetric Coercive Equations

    DTIC Science & Technology

    2013-11-06

    the energy associated with the infinite-dimensional weak solution of parametrized symmetric coercive partial differential equations with piecewise ... builds bounds with respect to the infinite-dimensional weak solution, aims to entirely remove the issue of the "truth" within the certified reduced basis ... framework. We in particular introduce a reduced basis method that provides rigorous upper and lower bounds

  2. Mathematic simulation of mining company’s power demand forecast (by example of “Neryungri” coal strip mine)

    NASA Astrophysics Data System (ADS)

    Antonenkov, D. V.; Solovev, D. B.

    2017-10-01

    The article covers the aspects of forecasting and consideration of the wholesale market environment in generating the power demand forecast. Major mining companies that operate in conditions of the present-day power market have to provide a reliable energy demand request for a certain time period ahead, thus ensuring sufficient reduction of financial losses associated with deviations of the actual power demand from the expected figures. Normally, under the power supply agreement, the consumer is bound to provide a per-month and per-hour request annually. It means that the consumer has to generate one-month-ahead short-term and medium-term hourly forecasts. The authors discovered that the empiric distributions of the power demand of "Yakutugol", Holding Joint Stock Company, belong to the sustainable rank parameter H-distribution type used for generating forecasts based on extrapolation of such distribution parameters. For this reason they justify the need to apply mathematical rank analysis in short-term forecasting of the contracted power demand of the "Neryungri" coal strip mine, a component of the technocenosis-type system of the mining company "Yakutugol", Holding JSC.

  3. Data-driven forecasting algorithms for building energy consumption

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram

    2013-04-01

    This paper introduces two forecasting methods for building energy consumption data that are recorded from smart meters in high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods involve forecasting individual loads on the basis of their measurement history and weather data, without using complicated models of the building system. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process has been applied to forecast day-ahead load profiles and their uncertainty bounds. These methods are computationally simple and adaptive and thus suitable for analyzing a large set of data whose pattern changes over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict those energy consumption data with high accuracy.
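    As a rough sketch of the Gaussian-process idea described above (not the authors' implementation), the code below fits a GP to synthetic hourly load using hour of day and outdoor temperature as inputs, then produces a day-ahead forecast with uncertainty bounds. The feature choice, the kernel and the synthetic data are all assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical 30 days of hourly history: hour of day, outdoor temperature, load (kWh).
hours = np.tile(np.arange(24), 30)
temps = 15 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
load = 50 + 2 * temps + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

X = np.column_stack([hours, temps])
kernel = RBF(length_scale=[5.0, 3.0]) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, load)

# Day-ahead forecast for an assumed temperature profile, with 95% bounds.
next_hours = np.arange(24)
next_temps = 15 + 10 * np.sin(2 * np.pi * next_hours / 24)
mean, std = gp.predict(np.column_stack([next_hours, next_temps]), return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std
```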

  4. Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2016-02-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogous years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in SFV by ~4 %. This improvement is limited but potentially significant for e.g. energy trading.

  5. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Roberts, J. B.

    2014-12-01

    This work details the use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly to seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models; downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS), using NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts together with observations of precipitation and temperature for the target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (CCA) is used to link the model-projected states as predictors to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling. Temporal disaggregation of monthly seasonal forecasts is achieved through the use of a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.
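    A minimal sketch of a CCA-based MOS step of the kind described, using canonical correlation analysis from scikit-learn, is given below. The array shapes, the random placeholder data and the omission of the cross-validation loop and tercile bootstrapping are all simplifying assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)

# Placeholder hindcast archive: 30 start years of NMME predictors (flattened SST and
# precipitation fields) and observed precipitation over a target basin.
X = rng.normal(size=(30, 40))   # 30 years x 40 predictor grid points
Y = rng.normal(size=(30, 12))   # 30 years x 12 target grid points

cca = CCA(n_components=3).fit(X, Y)
Y_downscaled = cca.predict(X)   # in practice each year is predicted with it held out
```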

  6. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, Jason B.

    2014-01-01

    This work details the use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly to seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models; downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS), using NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts together with observations of precipitation and temperature for the target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (CCA) is used to link the model-projected states as predictors to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling. Temporal disaggregation of monthly seasonal forecasts is achieved through the use of a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.

  7. Perpetual motion of a mobile impurity in a one-dimensional quantum gas

    NASA Astrophysics Data System (ADS)

    Lychkovskiy, O.

    2014-03-01

    Consider an impurity particle injected into a degenerate one-dimensional gas of noninteracting fermions (or, equivalently, Tonks-Girardeau bosons) with some initial momentum p0. We examine the infinite-time value of the momentum of the impurity, p∞, as a function of p0. A lower bound on |p∞(p0)| is derived under fairly general conditions. The derivation, based on the existence of the lower edge of the spectrum of the host gas, does not resort to any approximations. The existence of such a bound implies perpetual motion of the impurity in a one-dimensional gas of noninteracting fermions or Tonks-Girardeau bosons at zero temperature. The bound admits an especially simple and useful form when the interaction between the impurity and host particles is everywhere repulsive.

  8. Application of Bred Vectors To Data Assimilation

    NASA Astrophysics Data System (ADS)

    Corazza, M.; Kalnay, E.; Patil, Dj

We introduced a statistic, the BV-dimension, to measure the effective local finite-time dimensionality of the atmosphere. We show that this dimension is often quite low, and suggest that this finding has important implications for data assimilation and the accuracy of weather forecasting (Patil et al, 2001). The original database for this study was the forecasts of the NCEP global ensemble forecasting system. The initial differences between the control forecast and the perturbed forecasts are called bred vectors. The control and perturbed initial conditions valid at time t = nΔt are evolved using the forecast model until time t = (n+1)Δt. The differences between the perturbed and the control forecasts are scaled down to their initial amplitude, and constitute the bred vectors valid at (n+1)Δt. Their growth rate is typically about 1.5/day. The bred vectors are similar by construction to leading Lyapunov vectors, except that they have small but finite amplitude and are valid at finite times. The original NCEP ensemble data set has 5 independent bred vectors. We define a local bred vector at each grid point by choosing the 5 by 5 grid points centered at the grid point (a region of about 1100 km by 1100 km), and using the north-south and east-west velocity components at the 500 mb pressure level to form a 50-dimensional column vector. Since we have k=5 global bred vectors, we also have k local bred vectors at each grid point. We estimate the effective dimensionality of the subspace spanned by the local bred vectors by performing a singular value decomposition (EOF analysis). The k local bred vector columns form a 50xk matrix M. The singular values s(i) of M measure the extent to which the k column unit vectors making up the matrix M point in the direction of v(i). We define the bred vector dimension as BVDIM = [Sum_i s(i)]^2 / Sum_i s(i)^2. For example, if 4 out of the 5 vectors lie along v(1) and one lies along v(2), the singular values are (2, 1, 0, 0, 0) and the BV-dimension is (2+1)^2/(2^2+1^2) = 1.8, less than 2 because one direction is more dominant than the other in representing the original data. The results (Patil et al, 2001) show that there are large regions where the bred vectors span a subspace of substantially lower dimension than that of the full space. These low-dimensionality regions are dominant in the baroclinic extratropics, typically have a lifetime of 3-7 days, have a well-defined horizontal and vertical structure that spans most of the atmosphere, and tend to move eastward. New results with a large number of ensemble members confirm these results and indicate that the low-dimensionality regions are quite robust, and depend only on the verification time (i.e., the underlying flow). Corazza et al (2001) have performed experiments with a data assimilation system based on a quasi-geostrophic model and simulated observations (Morss, 1999; Hamill et al, 2000). A 3D-variational data assimilation scheme for a quasi-geostrophic channel model is used to study the structure of the background error and its relationship to the corresponding bred vectors. The "true" evolution of the model atmosphere is defined by an integration of the model, and "rawinsonde observations" are simulated by randomly perturbing the true state at fixed locations. It is found that after 3-5 days the bred vectors develop well-organized structures which are very similar for the two different norms considered in this paper (potential vorticity norm and streamfunction norm).
The results show that the bred vectors do indeed represent well the characteristics of the data assimilation forecast errors, and that the subspace of bred vectors contains most of the forecast error, except in areas where the forecast errors are small. For example, the angle between the 6-hr forecast error and the subspace spanned by 10 bred vectors is less than 10° over 90% of the domain, indicating a pattern correlation of more than 98.5% between the forecast error and its projection onto the bred vector subspace. The presence of low-dimensional regions in the perturbations of the basic flow has important implications for data assimilation. At any given time, there is a difference between the true atmospheric state and the model forecast. Assuming that model errors are not the dominant source of errors, in a region of low BV-dimensionality the difference between the true state and the forecast should lie substantially in the low-dimensional unstable subspace of the few bred vectors that contribute most strongly to the low BV-dimension. This information should yield a substantial improvement in the forecast: the data assimilation algorithm should correct the model state by moving it closer to the observations along the unstable subspace, since this is where the true state most likely lies. Preliminary experiments have been conducted with the quasi-geostrophic data assimilation system testing whether it is possible to add "errors of the day" based on bred vectors to the standard (constant) 3D-Var background error covariance in order to capture these important errors. The results are extremely encouraging, indicating a significant reduction (about 40%) in the analysis errors at a very low computational cost. References: Corazza, M., E. Kalnay, D. J. Patil, R. Morss, M. Cai, I. Szunyogh, B. R. Hunt, E. Ott and J. A. Yorke, 2001: Use of the breeding technique to estimate the structure of the analysis "errors of the day". Submitted to Nonlinear Processes in Geophysics. Hamill, T. M., Snyder, C., and Morss, R. E., 2000: A Comparison of Probabilistic Forecasts from Bred, Singular-Vector and Perturbed Observation Ensembles, Mon. Wea. Rev., 128, 1835-1851. Kalnay, E., and Z. Toth, 1994: Removing growing errors in the analysis cycle. Preprints of the Tenth Conference on Numerical Weather Prediction, Amer. Meteor. Soc., 1994, 212-215. Morss, R. E., 1999: Adaptive observations: Idealized sampling strategies for improving numerical weather prediction. PhD thesis, Massachusetts Institute of Technology, 225pp. Patil, D. J. S., B. R. Hunt, E. Kalnay, J. A. Yorke, and E. Ott, 2001: Local Low Dimensionality of Atmospheric Dynamics. Phys. Rev. Lett., 86, 5878.
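The BV-dimension formula above is a one-line computation once the local bred vectors are assembled into a matrix. The sketch below reproduces the abstract's numerical example with a toy two-dimensional stand-in for the 50-dimensional local vectors; the function and variable names are assumptions.

```python
import numpy as np

def bv_dimension(M):
    """BV-dimension of the subspace spanned by the (unit-norm) columns of M."""
    s = np.linalg.svd(M, compute_uv=False)
    return s.sum() ** 2 / (s ** 2).sum()

# Abstract's example: four vectors along one direction and one along another give
# singular values (2, 1, 0, 0, 0), hence (2 + 1)**2 / (2**2 + 1**2) = 1.8.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
M = np.column_stack([e1, e1, e1, e1, e2])
print(bv_dimension(M))   # 1.8
```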

  9. Forecasting trends in outdoor recreation activities on multi-state basis

    Treesearch

    Vincent A. Scardino; Josef Schwalbe; Marianne Beauregard

    1980-01-01

    Since a substantial amount of recreation planning takes place on a statewide basis, it is essential to have reliable information and forecasts of recreational needs at the state level. However, most of the recreation research over the last few years has used national survey data, statewide data, or site-specific information.

  10. The CMEMS IBI-MFC Forecasting Service in 2017: Evolution and Novelties associated to the CMEMS service release

    NASA Astrophysics Data System (ADS)

    Lorente, Pablo; Sotillo, Marcos G.; Gutknecht, Elodie; Dabrowski, Tomasz; Aouf, Lotfi; Toledano, Cristina; Amo-Baladron, Arancha; Aznar, Roland; De Pascual, Alvaro; Levier, Bruno; Bowyer, Peter; Rainaud, Romain; Alvarez-Fanjul, Enrique

    2017-04-01

    The IBI-MFC (Iberia-Biscay-Ireland Monitoring & Forecasting Centre) has been providing daily ocean model estimates and forecasts of diverse physical parameters for the IBI regional seas since 2011, first in the frame of the MyOcean projects and later as part of the Copernicus Marine Environment Monitoring Service (CMEMS). By April 2017, coincident with the V3 CMEMS Service Release, the IBI-MFC will extend its near real time (NRT) forecast capabilities. Two new IBI forecast systems will be run operationally to generate high-resolution biogeochemical (BIO) and wave (WAV) products for the IBI area. The IBI-NRT-BIO forecast system, based on a 1/36° NEMO-PISCES model application, is run once a week, coupled with the IBI physical forecast solution and nested to the CMEMS GLOBAL-BIO solution. On the other hand, the IBI-NRT-WAV system, based on a MeteoFrance-WAM 10 km resolution model application, runs twice a day using ECMWF wind forcing. Among other novelties related to the evolution of the IBI physical (PHY) solution, it is worth mentioning the provision, as part of the daily updated IBI-NRT-PHY product, of three-dimensional hourly data for specific areas within the IBI domain. The delivery of these new hourly data along the whole water column was implemented at the request of IBI users, in order to foster downscaling approaches by providing coherent open boundary conditions to any potential high-resolution coastal model nested to the IBI regional solution. An extensive skill assessment of IBI-NRT forecast products has been conducted through the NARVAL (North Atlantic Regional VALidation) web tool, by means of the automatic computation of statistical metrics and quality indicators. Until now, this tool has focused on the validation of the IBI-NRT-PHY system. NARVAL is now undergoing a significant upgrade to validate the aforementioned new biogeochemical and wave IBI products. To this aim, satellite-derived observations of chlorophyll and significant wave height will be used, together with in-situ wave parameters measured by mooring buoys. Within this validation framework, special emphasis has been placed on the intercomparison of different forecast model solutions in overlapping areas in order to evaluate the models' performances and prognostic capabilities. This common uncertainty estimation for IBI and other model solutions is currently performed by NARVAL using both CMEMS forecast model sources (i.e. GLOBAL-MFC, MED-MFC and NWS-MFC) and non-CMEMS operational forecast solutions (mostly downstream applications nested to the IBI solution). With respect to the IBI multi-year (MY) products, it is worth mentioning that the current biogeochemical and physical reanalysis products will be re-run during 2017, extending their time coverage backwards to 1992. Based on these IBI-MY products, a variety of climatic indicators related to essential oceanographic processes (e.g. western coastal upwelling or the Mediterranean Outflow Water) are currently being computed.

  11. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
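    As an illustration of the ARIMA component mentioned above (a sketch only, not the study's multi-area implementation), the code below fits an ARIMA model to a single synthetic hourly load series and produces a 24-hour-ahead forecast with a confidence interval; the model order and the synthetic series are assumptions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)

# Hypothetical hourly load for one area: a daily cycle plus a slow random drift.
t = np.arange(24 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(0, 2, t.size).cumsum()

fit = ARIMA(load, order=(2, 1, 1)).fit()
fc = fit.get_forecast(steps=24)
mean = fc.predicted_mean          # point forecast for the next 24 hours
bounds = fc.conf_int(alpha=0.05)  # 95% interval per forecast hour
```

    In the study itself, sequential Gaussian simulation and principal component analysis are layered on top of such single-area models so that the forecast realizations of neighbouring areas remain cross-correlated.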

  12. Time series forecasting using ERNN and QR based on Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Pwasong, Augustine; Sathasivam, Saratha

    2017-08-01

    Bayesian model averaging is a multi-model combination technique. It was employed here to amalgamate the Elman recurrent neural network (ERNN) technique with the quadratic regression (QR) technique. The amalgamation produced a hybrid technique known as the hybrid ERNN-QR technique. The forecasting potential of the hybrid technique is compared with the forecasting capabilities of the individual ERNN and QR techniques. The outcome revealed that the hybrid technique is superior to the individual techniques in the mean square error sense.

  13. An assessment of a North American Multi-Model Ensemble (NMME) based global drought early warning forecast system

    NASA Astrophysics Data System (ADS)

    Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.

    2013-12-01

    One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged behind, in part due to the limited skill of both climate forecast models and global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system that is based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we will test its capability in seasonal forecasting of meteorological, agricultural and hydrologic droughts over global major river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is being conducted over global major river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast does not have uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Due to the high rate of false alarms from climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating the hydrologic drought forecasts is a significant challenge for a global drought early warning system.

  14. Interval forecasting of cyber-attacks on industrial control systems

    NASA Astrophysics Data System (ADS)

    Ivanyo, Y. M.; Krakovsky, Y. M.; Luzgin, A. N.

    2018-03-01

    At present, cyber-security issues of industrial control systems occupy one of the key niches in a state system of planning and management. Functional disruption of these systems via cyber-attacks may lead to emergencies related to loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. There is therefore an urgent need to develop protection methods against cyber-attacks. This paper studied the results of cyber-attack interval forecasting with a pre-set intensity level of cyber-attacks. Interval forecasting is the forecasting of which of two predetermined intervals a future value of the indicator will fall into. For this, probability estimates of these events were used. For interval forecasting, a probabilistic neural network with a dynamically updated smoothing parameter was used. The dividing bound between these intervals was determined by a calculation method based on statistical characteristics of the indicator. The number of cyber-attacks per hour received through a honeypot from March to September 2013 for the group 'zeppo-norcal' was selected as the indicator.

  15. An Evaluation of the NOAA Climate Forecast System Subseasonal Forecasts

    NASA Astrophysics Data System (ADS)

    Mass, C.; Weber, N.

    2016-12-01

    This talk will describe a multi-year evaluation of the 1-5 week forecasts of the NOAA Climate Forecast System (CFS) over the globe, North America, and the western U.S. Forecasts are evaluated both for specific times and for a variety of time-averaging periods. Initial results show a loss of predictability at approximately three weeks, with sea surface temperature retaining predictability longer than atmospheric variables. It is shown that a major CFS problem is an inability to realistically simulate propagating convection in the tropics, with substantial implications for midlatitude teleconnections and subseasonal predictability. The inability of CFS to deal with tropical convection will be discussed in connection with the prediction of extreme climatic events over the midlatitudes.

  16. Similarity-based multi-model ensemble approach for 1-15-day advance prediction of monsoon rainfall over India

    NASA Astrophysics Data System (ADS)

    Jaiswal, Neeru; Kishtawal, C. M.; Bhomia, Swati

    2018-04-01

    The southwest (SW) monsoon season (June, July, August and September) is the major period of rainfall over the Indian region. The present study focuses on the development of a new multi-model ensemble approach based on a similarity criterion (SMME) for the prediction of SW monsoon rainfall in the extended range. This approach is based on the assumption that training with similar conditions may provide better forecasts than the sequential training used in conventional MME approaches. In this approach, the training dataset is selected by matching the present-day condition to the archived dataset: the days with the most similar conditions are identified and used for training the model. The coefficients thus generated are used for the rainfall prediction. The precipitation forecasts from four general circulation models (GCMs), viz. the European Centre for Medium-Range Weather Forecasts (ECMWF), the United Kingdom Meteorological Office (UKMO), the National Centers for Environmental Prediction (NCEP) and the China Meteorological Administration (CMA), have been used for developing the SMME forecasts. Forecasts for days 1-5, 6-10 and 11-15 were generated using the newly developed approach for each pentad of June-September during the years 2008-2013, and the skill of the model was analysed using verification scores, viz. the equitable threat score (ETS), mean absolute error (MAE), Pearson's correlation coefficient and the Nash-Sutcliffe model efficiency index. Statistical analysis of the SMME forecasts shows superior forecast skill compared to the conventional MME and the individual models for all the forecast ranges, viz. 1-5, 6-10 and 11-15 days.
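    The core of the similarity-based selection can be sketched as a nearest-neighbour search in predictor space followed by a least-squares fit of model weights. The sketch below is a schematic reading of the approach, not the authors' algorithm; the distance metric, the number of analogue days k and all argument names are assumptions.

```python
import numpy as np

def smme_forecast(today_state, archive_states, archive_model_fcsts, archive_obs,
                  today_model_fcsts, k=30):
    """Similarity-based multi-model ensemble forecast for one grid point.

    archive_states:      (n_days, n_features) standardized large-scale predictors
    archive_model_fcsts: (n_days, n_models)   GCM rainfall forecasts for those days
    archive_obs:         (n_days,)            observed rainfall for those days
    """
    dist = np.linalg.norm(archive_states - today_state, axis=1)
    nearest = np.argsort(dist)[:k]                 # the k most similar archived days
    weights, *_ = np.linalg.lstsq(archive_model_fcsts[nearest],
                                  archive_obs[nearest], rcond=None)
    return float(today_model_fcsts @ weights)
```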

  17. Multi-spacecraft testing of time-dependent interplanetary MHD models for operational forecasting of geomagnetic storms

    NASA Technical Reports Server (NTRS)

    Dryer, M.; Smith, Z. K.

    1989-01-01

    An MHD 2-1/2D, time-dependent model is used, together with observations of six solar flares during February 3-7, 1986, to demonstrate global, large-scale, compound disturbances in the solar wind over a wide range of heliolongitudes. This scenario is one that is likely to occur many times during the cruise, possibly even encounter, phases of the Multi-Comet Mission. It is suggested that a model such as this one should be tested with multi-spacecraft data (such as the MCM and earth-based probes) with several goals in view: (1) utility of the model for operational real-time forecasting of geomagnetic storms, and (2) scientific interpretation of certain forms of cometary activities and their possible association with solar-generated activity.

  18. Demonstrating the Operational Value of Thermodynamic Hyperspectral Profiles in the Pre-Convective Environment

    NASA Technical Reports Server (NTRS)

    Kozlowski, Danielle; Zavodsky, Bradley T.; Jedlovec, Gary J.

    2011-01-01

    The Short-term Prediction Research and Transition Center (SPoRT) is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service (NWS) Weather Forecasting Offices (WFOs). As a part of the transition-to-operations process, SPoRT attempts to identify possible limitations in satellite observations and provide operational forecasters a product that will have the most impact on their forecasts. One operational forecast challenge that some NWS offices face is forecasting convection in data-void regions such as large bodies of water. The Atmospheric Infrared Sounder (AIRS) is a sounding instrument aboard NASA's Aqua satellite that provides temperature and moisture profiles of the atmosphere. This paper will demonstrate an approach to assimilate AIRS profile data into a regional configuration of the WRF model using its three-dimensional variational (3DVAR) assimilation component to be used as a proxy for the individual profiles.

  19. Development of a multi-sensor based urban discharge forecasting system using remotely sensed data: A case study of extreme rainfall in South Korea

    NASA Astrophysics Data System (ADS)

    Yoon, Sunkwon; Jang, Sangmin; Park, Kyungwon

    2017-04-01

    Extreme weather due to changing climate is a main source of water-related disasters such as flooding and inundation, and the associated damage is expected to accelerate worldwide. To prevent water-related disasters and mitigate their damage in urban areas in the future, we developed a multi-sensor based real-time discharge forecasting system using remotely sensed data such as radar and satellite observations. We used the Communication, Ocean and Meteorological Satellite (COMS) and Korea Meteorological Administration (KMA) weather radar for quantitative precipitation estimation. The Automatic Weather System (AWS) and the McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation (MAPLE) were used for verification of rainfall accuracy. The tropical Z-R relationship (Z = 32R^1.65) was adopted as the optimal Z-R relation, and it was confirmed that accuracy improves for extreme rainfall events. In addition, the performance of the blended multi-sensor rainfall was improved for rainfall of 60 mm/h and stronger heavy rainfall events. Moreover, we forecast urban discharge using the Storm Water Management Model (SWMM). Several statistical methods were used to assess the model simulation against observed discharge. In terms of the correlation coefficient and r-squared, observed and forecasted discharge were highly correlated. Based on this study, we demonstrated the feasibility of a real-time urban discharge forecasting system using remotely sensed data and its utilization for real-time flood warning. Acknowledgement: This research was supported by a grant (13AWMP-B066744-01) from the Advanced Water Management Research Program (AWMP) funded by the Ministry of Land, Infrastructure and Transport (MOLIT) of the Korean government.
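    The quoted tropical Z-R relationship is straightforward to apply in code. The sketch below inverts Z = 32 R^1.65 to convert radar reflectivity in dBZ to rain rate; the function name is an assumption, and operational processing would add quality control and gauge adjustment on top of this step.

```python
def rain_rate_from_dbz(dbz, a=32.0, b=1.65):
    """Invert the tropical Z-R relation Z = a * R**b for rain rate R in mm/h."""
    z = 10.0 ** (dbz / 10.0)        # reflectivity factor Z in mm**6 / m**3
    return (z / a) ** (1.0 / b)

print(rain_rate_from_dbz(45.0))     # roughly 65 mm/h with these coefficients
```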

  20. Application of multi-scale wavelet entropy and multi-resolution Volterra models for climatic downscaling

    NASA Astrophysics Data System (ADS)

    Sehgal, V.; Lakhanpal, A.; Maheswaran, R.; Khosa, R.; Sridhar, Venkataramana

    2018-01-01

    This study proposes a wavelet-based multi-resolution modeling approach for statistical downscaling of GCM variables to mean monthly precipitation for five locations in the Krishna Basin, India. Climatic data from NCEP are used for training the proposed models (Jan. '69 to Dec. '94), which are then applied to the corresponding CanCM4 GCM variables to simulate precipitation for the validation (Jan. '95-Dec. '05) and forecast (Jan. '06-Dec. '35) periods. The observed precipitation data are obtained from the India Meteorological Department (IMD) gridded precipitation product at 0.25 degree spatial resolution. This paper proposes a novel Multi-Scale Wavelet Entropy (MWE) based approach for clustering climatic variables into suitable clusters using the k-means methodology. Principal Component Analysis (PCA) is used to obtain representative Principal Components (PCs) explaining 90-95% of the variance for each cluster. A multi-resolution non-linear approach combining the Discrete Wavelet Transform (DWT) and Second Order Volterra (SoV) models is used to model the representative PCs and obtain the downscaled precipitation for each downscaling location (the W-P-SoV model). The results establish that wavelet-based multi-resolution SoV models perform significantly better than the traditional Multiple Linear Regression (MLR) and Artificial Neural Network (ANN) based frameworks. It is observed that the proposed MWE-based clustering and subsequent PCA help reduce the dimensionality of the input climatic variables, while capturing more variability compared to stand-alone k-means (no MWE). The proposed models perform better in estimating the number of precipitation events during the non-monsoon periods, whereas the models with clustering without MWE over-estimate the rainfall during the dry season.
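    The record does not give the exact definition of its Multi-Scale Wavelet Entropy, so the sketch below shows a common single-series wavelet entropy (Shannon entropy of relative wavelet energies) that could feed a k-means clustering step; the wavelet, the decomposition level and the function name are assumptions.

```python
import numpy as np
import pywt

def wavelet_entropy(series, wavelet="db4", level=4):
    """Shannon entropy of the relative wavelet energy across decomposition levels.

    `series` should be long enough to support the requested decomposition level.
    """
    coeffs = pywt.wavedec(np.asarray(series, float), wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))
```

    Computing such an entropy for every candidate climatic variable yields a feature on which the k-means clustering described above can operate.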

  1. A technique for determining cloud free versus cloud contaminated pixels in satellite imagery

    NASA Technical Reports Server (NTRS)

    Wohlman, Richard A.

    1994-01-01

    Weather forecasting has been called the second oldest profession. To forecast accurately and with some consistency requires an ability to understand the processes which create the clouds, drive the winds, and produce the ever-changing atmospheric conditions. Measurement of basic parameters such as temperature, water vapor content, pressure, windspeed and wind direction throughout the three-dimensional atmosphere forms the foundation upon which a modern forecast is created. Doppler radar and space-borne remote sensing have provided forecasters with new tools with which to ply their trade.

  2. Forecasting production in Liquid Rich Shale plays

    NASA Astrophysics Data System (ADS)

    Nikfarman, Hanieh

    Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHWs). There is no existing workflow that is applicable to forecasting multi-phase production from MFHWs in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHWs in LRS reservoirs. There has been much effort in developing workflows and methodology for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single-phase flow, and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for the complexities of multiphase flow in MFHWs, the only available technique is dynamic modeling in compositional numerical simulators. These are time-consuming and not practical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed and validated by compositional numerical simulation. The workflow honors the physics of flow, and is sufficiently accurate yet practical, so that an analyst can readily apply it to forecast production and estimate reserves for a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHWs, the workflow divides production into an initial period, where large production and pressure declines are expected, and a subsequent period, where production decline may converge into a common trend for a number of producers across an area of interest in the field. The initial period assumes production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history. Commercially available software can simulate flow and forecast production in this period. In the subsequent period, dimensionless rate and dimensionless time functions are introduced that help identify the transition from the initial period into the subsequent period. The production trends in terms of the dimensionless parameters converge for a range of rock permeability and stimulation intensity. This helps forecast production beyond the transition to the end of the well's life. This workflow is applicable to a single fluid system.

  3. Microscopic observation of magnon bound states and their dynamics.

    PubMed

    Fukuhara, Takeshi; Schauß, Peter; Endres, Manuel; Hild, Sebastian; Cheneau, Marc; Bloch, Immanuel; Gross, Christian

    2013-10-03

    The existence of bound states of elementary spin waves (magnons) in one-dimensional quantum magnets was predicted almost 80 years ago. Identifying signatures of magnon bound states has so far remained the subject of intense theoretical research, and their detection has proved challenging for experiments. Ultracold atoms offer an ideal setting in which to find such bound states by tracking the spin dynamics with single-spin and single-site resolution following a local excitation. Here we use in situ correlation measurements to observe two-magnon bound states directly in a one-dimensional Heisenberg spin chain comprising ultracold bosonic atoms in an optical lattice. We observe the quantum dynamics of free and bound magnon states through time-resolved measurements of two spin impurities. The increased effective mass of the compound magnon state results in slower spin dynamics as compared to single-magnon excitations. We also determine the decay time of bound magnons, which is probably limited by scattering on thermal fluctuations in the system. Our results provide a new way of studying fundamental properties of quantum magnets and, more generally, properties of interacting impurities in quantum many-body systems.

  4. Characterizing Time Series Data Diversity for Wind Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong

    Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve the forecasting accuracy. However, it is challenging to accurately compare the true forecasting performances from different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then the principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
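
    As a rough illustration of the idea (the paper's six indices are not listed in this record, so simple stand-ins are used), the sketch below computes a handful of per-series characteristics, projects them onto three principal components, and quantifies diversity as the volume of the convex hull of the projected points. The synthetic random-walk series merely stand in for wind time series.

      import numpy as np
      from scipy.spatial import ConvexHull
      from scipy.stats import kurtosis, skew
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      def characteristic_indices(x):
          # six simple per-series indices (illustrative stand-ins for the paper's choices)
          return np.array([
              np.mean(x),                        # level
              np.std(x),                         # variability
              skew(x),                           # asymmetry
              kurtosis(x),                       # tail weight
              np.corrcoef(x[:-1], x[1:])[0, 1],  # lag-1 autocorrelation
              np.mean(np.abs(np.diff(x))),       # mean ramp magnitude
          ])

      rng = np.random.default_rng(1)
      dataset = [rng.normal(size=1000).cumsum() for _ in range(40)]   # synthetic series
      features = StandardScaler().fit_transform(
          np.array([characteristic_indices(x) for x in dataset]))

      scores = PCA(n_components=3).fit_transform(features)   # 3-D principal component space
      diversity = ConvexHull(scores).volume                   # larger volume = more diverse set
      print(f"diversity (3-D convex hull volume): {diversity:.2f}")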

  5. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
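
    The decomposition described above can be mimicked with a toy calculation: bias-correct each model with its own hindcast-derived bias, then split the total forecast variance into a between-model part (model uncertainty) and a within-model part (spread from irreducible error growth). All numbers below are synthetic and only illustrate the bookkeeping, not the SIO models themselves.

      import numpy as np

      rng = np.random.default_rng(2)
      n_models, n_members = 6, 5
      # hypothetical hindcast biases and raw September sea-ice extent forecasts (10^6 km2)
      hindcast_bias = rng.normal(0.0, 0.4, n_models)
      raw = 4.5 + hindcast_bias[:, None] + rng.normal(0.0, 0.3, (n_models, n_members))

      # post-processing: remove each model's mean hindcast bias
      corrected = raw - hindcast_bias[:, None]

      def partition(forecasts):
          # between-model variance ~ model uncertainty; mean within-model variance ~ internal spread
          model_means = forecasts.mean(axis=1)
          between = model_means.var(ddof=1)
          within = forecasts.var(axis=1, ddof=1).mean()
          return between, within

      for name, f in [("raw", raw), ("bias-corrected", corrected)]:
          between, within = partition(f)
          print(f"{name:>14}: model uncertainty {between:.3f}, internal spread {within:.3f}")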

  6. Evaluating sub-seasonal skill in probabilistic forecasts of Atmospheric Rivers and associated extreme events

    NASA Astrophysics Data System (ADS)

    Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.

    2017-12-01

    Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
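
    One standard way to assess such probabilistic forecasts of extremes is the Brier skill score for threshold exceedance, with climatology as the reference. The sketch below derives event probabilities from ensemble relative frequencies; the ensemble data, the 25 mm/day threshold, and the member count are illustrative assumptions rather than values from the study.

      import numpy as np

      def brier_score(prob, obs):
          # mean squared error of probability forecasts against binary outcomes
          return np.mean((prob - obs) ** 2)

      rng = np.random.default_rng(3)
      n_cases, n_members = 200, 11
      # hypothetical ensemble precipitation forecasts and observations (mm/day)
      ens = rng.gamma(2.0, 5.0, size=(n_cases, n_members))
      obs = rng.gamma(2.0, 5.0, size=n_cases)

      threshold = 25.0                                   # "extreme" event threshold, illustrative
      event = (obs > threshold).astype(float)
      prob = (ens > threshold).mean(axis=1)              # ensemble relative frequency

      bs = brier_score(prob, event)
      bs_clim = brier_score(np.full_like(event, event.mean()), event)   # climatological reference
      print(f"Brier skill score: {1.0 - bs / bs_clim:.3f}")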

  7. New forecasting methodology indicates more disease and earlier mortality ahead for today's younger Americans.

    PubMed

    Reither, Eric N; Olshansky, S Jay; Yang, Yang

    2011-08-01

    Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.

  8. Ensemble and Bias-Correction Techniques for Air-Quality Model Forecasts of Surface O3 and PM2.5 during the TEXAQS-II Experiment of 2006

    EPA Science Inventory

    Several air quality forecasting ensembles were created from seven models, running in real-time during the 2006 Texas Air Quality (TEXAQS-II) experiment. These multi-model ensembles incorporated a diverse set of meteorological models, chemical mechanisms, and emission inventories...

  9. Bounding the space of holographic CFTs with chaos

    DOE PAGES

    Perlmutter, Eric

    2016-10-13

    In this study, thermal states of quantum systems with many degrees of freedom are subject to a bound on the rate of onset of chaos, including a bound on the Lyapunov exponent, λ_L ≤ 2π/β. We harness this bound to constrain the space of putative holographic CFTs and their would-be dual theories of AdS gravity. First, by studying out-of-time-order four-point functions, we discuss how λ_L = 2π/β in ordinary two-dimensional holographic CFTs is related to properties of the OPE at strong coupling. We then rule out the existence of unitary, sparse two-dimensional CFTs with large central charge and a set of higher spin currents of bounded spin; this implies the inconsistency of weakly coupled AdS_3 higher spin gravities without infinite towers of gauge fields, such as the SL(N) theories. This fits naturally with the structure of higher-dimensional gravity, where finite towers of higher spin fields lead to acausality. On the other hand, unitary CFTs with classical W_∞[λ] symmetry, dual to 3D Vasiliev or hs[λ] higher spin gravities, do not violate the chaos bound, instead exhibiting no chaos: λ_L = 0. Independently, we show that such theories violate unitarity for |λ| > 2. These results encourage a tensionless string theory interpretation of the 3D Vasiliev theory.

  10. A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty

    NASA Astrophysics Data System (ADS)

    Ohmi, Masataro; Mori, Hiroyuki

    In this paper, an efficient method is proposed to deal with short-term load forecasting using Gaussian Processes. Short-term load forecasting plays a key role in smooth power system operation, such as economic load dispatching, unit commitment, etc. Recently, the deregulated and competitive power market has increased the degree of uncertainty. As a result, it is more important to obtain better prediction results to save costs. One of the most important aspects is that the power system operator needs the upper and lower bounds of the predicted load to deal with the uncertainty, while also requiring more accurate predicted values. The proposed method is based on the Bayesian model, in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper proposes Gaussian Processes consisting of the Bayesian linear model and a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
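
    The record does not give the kernel or the predictors used, but the core idea, a Gaussian Process regression whose predictive standard deviation yields upper and lower load bounds, can be sketched with scikit-learn. The temperature/previous-load features, kernel settings, and synthetic data below are assumptions for illustration only.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(4)
      # hypothetical daily maximum load (MW) explained by temperature and previous-day load
      n = 200
      temp = rng.uniform(15, 35, n)
      prev_load = 900 + 8 * temp + rng.normal(0, 25, n)
      load = 920 + 7.5 * temp + 0.05 * prev_load + rng.normal(0, 20, n)

      X = np.column_stack([temp, prev_load])
      kernel = 1.0 * RBF(length_scale=[5.0, 100.0]) + WhiteKernel(noise_level=100.0)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:150], load[:150])

      mean, std = gp.predict(X[150:], return_std=True)
      lower, upper = mean - 1.96 * std, mean + 1.96 * std     # ~95% predictive bounds
      coverage = np.mean((load[150:] >= lower) & (load[150:] <= upper))
      print(f"empirical coverage of the 95% interval: {coverage:.2f}")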

  11. A new Bayesian recursive technique for parameter estimation

    NASA Astrophysics Data System (ADS)

    Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis

    2006-08-01

    The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper on two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are actually the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimum training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
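
    The full LOBARE/BARE machinery is not reproduced here, but its central loop, sampling within the current bounds, ranking candidate parameter sets by fitness, and shrinking the bounds around the fittest "parent" sets, can be sketched as follows. The quadratic objective stands in for a real SVM or SAC-SMA calibration run, and the sample sizes are arbitrary.

      import numpy as np

      def objective(theta):
          # hypothetical calibration error to minimise (stand-in for a model run)
          target = np.array([0.3, 1.7, -0.5])
          return np.sum((theta - target) ** 2)

      rng = np.random.default_rng(5)
      lower = np.array([-5.0, -5.0, -5.0])
      upper = np.array([5.0, 5.0, 5.0])

      for iteration in range(6):
          samples = rng.uniform(lower, upper, size=(200, 3))
          scores = np.array([objective(t) for t in samples])
          elite = samples[np.argsort(scores)[:20]]               # fittest "parent" sets
          lower, upper = elite.min(axis=0), elite.max(axis=0)     # shrink bounds around them
          print(f"iter {iteration}: best={scores.min():.4f}, width={np.mean(upper - lower):.3f}")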

  12. Fuzzy Multi-Objective Transportation Planning with Modified S-Curve Membership Function

    NASA Astrophysics Data System (ADS)

    Peidro, D.; Vasant, P.

    2009-08-01

    In this paper, the S-Curve membership function methodology is used in a transportation planning decision (TPD) problem. An interactive method for solving multi-objective TPD problems with fuzzy goals, available supply and forecast demand is developed. The proposed method attempts simultaneously to minimize the total production and transportation costs and the total delivery time with reference to budget constraints and available supply, machine capacities at each source, as well as forecast demand and warehouse space constraints at each destination. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in TPD problems, with linear membership functions.
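
    The modified S-curve family used in this line of work is a logistic-type membership function that stays near 1 close to the most desirable value of a fuzzy goal and drops towards 0 near the least acceptable value. The sketch below uses one common parameterisation; the constants (alpha, B, C) and the cost bounds are illustrative, not the values from the paper.

      import numpy as np

      def s_curve_membership(x, alpha=13.8, B=1.0, C=0.001, lo=0.0, hi=1.0):
          # logistic-type S-curve membership on [lo, hi]; constants are illustrative only
          z = (np.asarray(x, dtype=float) - lo) / (hi - lo)
          mu = B / (1.0 + C * np.exp(alpha * z))
          return np.clip(mu, 0.0, 1.0)

      # membership of a fuzzy transportation-cost goal: cost near the lower bound is
      # almost fully acceptable, cost near the upper bound is barely acceptable
      costs = np.linspace(10_000, 20_000, 5)
      print(np.round(s_curve_membership(costs, lo=10_000, hi=20_000), 3))

    In a fuzzy TPD model, one such membership value would be evaluated for each fuzzy goal or constraint and an overall satisfaction level maximised interactively.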

  13. Assimilation of total lightning data using the three-dimensional variational method at convection-allowing resolution

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Zhang, Yijun; Xu, Liangtao; Zheng, Dong; Yao, Wen

    2017-08-01

    A large number of observational analyses have shown that lightning data can be used to indicate areas of deep convection. It is important to assimilate observed lightning data into numerical models, so that more small-scale information can be incorporated to improve the quality of the initial condition and the subsequent forecasts. In this study, the empirical relationship between flash rate, water vapor mixing ratio, and graupel mixing ratio was used to adjust the model relative humidity, which was then assimilated by using the three-dimensional variational data assimilation system of the Weather Research and Forecasting model in cycling mode at 10-min intervals. To find the appropriate assimilation time-window length that yielded significant improvement in both the initial conditions and subsequent forecasts, four experiments with different assimilation time-window lengths were conducted for a squall line case that occurred on 10 July 2007 in North China. It was found that 60 min was the appropriate assimilation time-window length for this case, and a longer assimilation window was unnecessary since no further improvement was present. Forecasts of 1-h accumulated precipitation during the assimilation period and the subsequent 3-h accumulated precipitation were significantly improved compared with the control experiment without lightning data assimilation. The simulated reflectivity was optimal after 30 min of the forecast and remained optimal during the following 42 min; the positive effect from lightning data assimilation began to diminish after 72 min of the forecast. Overall, the improvement from lightning data assimilation can be maintained for about 3 h.

  14. The Impact of the Assimilation of AIRS Radiance Measurements on Short-term Weather Forecasts

    NASA Technical Reports Server (NTRS)

    McCarty, Will; Jedlovec, Gary; Miller, Timothy L.

    2009-01-01

    Advanced spaceborne instruments have the ability to improve the horizontal and vertical characterization of temperature and water vapor in the atmosphere through the explicit use of hyperspectral thermal infrared radiance measurements. The incorporation of these measurements into a data assimilation system provides a means to continuously characterize a three-dimensional, instantaneous atmospheric state necessary for the time integration of numerical weather forecasts. Measurements from the National Aeronautics and Space Administration (NASA) Atmospheric Infrared Sounder (AIRS) are incorporated into the gridpoint statistical interpolation (GSI) three-dimensional variational (3D-Var) assimilation system to provide improved initial conditions for use in a mesoscale modeling framework mimicking that of the operational North American Mesoscale (NAM) model. The methodologies for the incorporation of the measurements into the system are presented. Though the measurements have been shown to have a positive impact in global modeling systems, the measurements are further constrained in this system as the model top is physically lower than the global systems and there is no ozone characterization in the background state. For a study period, the measurements are shown to have positive impact on both the analysis state as well as subsequently spawned short-term (0-48 hr) forecasts, particularly in forecasted geopotential height and precipitation fields. At 48 hr, height anomaly correlations showed an improvement in forecast skill of 2.3 hours relative to a system without the AIRS measurements. Similarly, the equitable threat and bias scores of precipitation forecasts of 25 mm (6 hr)-1 were shown to be improved by 8% and 7%, respectively.
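
    The equitable threat score and the (frequency) bias score quoted above are standard contingency-table metrics for precipitation exceeding a threshold. A minimal sketch with synthetic forecast/observation pairs (the gamma-distributed accumulations and the 25 mm threshold are illustrative only):

      import numpy as np

      def contingency(forecast, observed, threshold):
          # 2x2 contingency counts for exceedance of a precipitation threshold
          f, o = forecast >= threshold, observed >= threshold
          hits = np.sum(f & o)
          false_alarms = np.sum(f & ~o)
          misses = np.sum(~f & o)
          return hits, false_alarms, misses, forecast.size

      def equitable_threat_score(hits, false_alarms, misses, total):
          hits_random = (hits + false_alarms) * (hits + misses) / total
          return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

      def bias_score(hits, false_alarms, misses):
          return (hits + false_alarms) / (hits + misses)

      rng = np.random.default_rng(6)
      obs = rng.gamma(0.8, 6.0, size=5000)              # hypothetical 6-h accumulations (mm)
      fcst = obs * rng.lognormal(0.0, 0.6, size=obs.size)

      h, fa, m, n = contingency(fcst, obs, threshold=25.0)
      print(f"ETS = {equitable_threat_score(h, fa, m, n):.3f}, bias = {bias_score(h, fa, m):.3f}")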

  15. Turbulent scaling laws as solutions of the multi-point correlation equation using statistical symmetries

    NASA Astrophysics Data System (ADS)

    Oberlack, Martin; Rosteck, Andreas; Avsarkisov, Victor

    2013-11-01

    Text-book knowledge proclaims that Lie symmetries such as Galilean transformation lie at the heart of fluid dynamics. These important properties also carry over to the statistical description of turbulence, i.e. to the Reynolds stress transport equations and its generalization, the multi-point correlation equations (MPCE). Interesting enough, the MPCE admit a much larger set of symmetries, in fact infinite dimensional, subsequently named statistical symmetries. Most important, theses new symmetries have important consequences for our understanding of turbulent scaling laws. The symmetries form the essential foundation to construct exact solutions to the infinite set of MPCE, which in turn are identified as classical and new turbulent scaling laws. Examples on various classical and new shear flow scaling laws including higher order moments will be presented. Even new scaling have been forecasted from these symmetries and in turn validated by DNS. Turbulence modellers have implicitly recognized at least one of the statistical symmetries as this is the basis for the usual log-law which has been employed for calibrating essentially all engineering turbulence models. An obvious conclusion is to generally make turbulence models consistent with the new statistical symmetries.

  16. Smoothing two-dimensional Malaysian mortality data using P-splines indexed by age and year

    NASA Astrophysics Data System (ADS)

    Kamaruddin, Halim Shukri; Ismail, Noriszura

    2014-06-01

    Nonparametric regression uses the data to derive the best coefficients of a model from a large class of flexible functions. Eilers and Marx (1996) introduced P-splines as a method of smoothing in generalized linear models (GLMs), in which ordinary B-splines with a difference roughness penalty on the coefficients are used for one-dimensional mortality data. Modeling and forecasting mortality rates is a problem of fundamental importance in insurance calculations, in which the accuracy of models and forecasts is the main concern of the industry. The original idea of P-splines is extended here to two-dimensional mortality data. The data are indexed by age of death and year of death, with the large dataset supplied by the Department of Statistics Malaysia. This extension constructs the best-fitted surface and provides sensible predictions of the underlying mortality rates for the Malaysian mortality case.
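
    For a one-dimensional slice of the problem (mortality by age for a single year), the P-spline recipe is a B-spline basis on equally spaced knots combined with a second-order difference penalty on the coefficients; the two-dimensional age-year case uses tensor products of two such bases. The sketch below shows the 1-D case with synthetic log-mortality data; the knot count, penalty order, and smoothing parameter are illustrative choices.

      import numpy as np
      from scipy.interpolate import BSpline

      def bspline_basis(x, n_segments=20, degree=3):
          # cubic B-spline basis on equally spaced knots, as used in P-splines
          xl, xr = x.min(), x.max()
          dx = (xr - xl) / n_segments
          knots = np.linspace(xl - degree * dx, xr + degree * dx, n_segments + 2 * degree + 1)
          n_basis = len(knots) - degree - 1
          return np.column_stack([
              BSpline(knots, np.eye(n_basis)[i], degree)(x) for i in range(n_basis)
          ])

      rng = np.random.default_rng(7)
      age = np.arange(0, 91)
      log_rate = -9.0 + 0.085 * age + rng.normal(0, 0.15, age.size)   # synthetic log mortality

      B = bspline_basis(age.astype(float))
      D = np.diff(np.eye(B.shape[1]), n=2, axis=0)      # second-order difference penalty
      lam = 10.0                                        # smoothing parameter (chosen by eye here)
      coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ log_rate)
      smooth = B @ coef
      print(np.round(smooth[:5], 3))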

  17. Investigating the Impact on Modeled Ozone Concentrations Using Meteorological Fields From WRF With an Updated Four-Dimensional Data Assimilation Approach

    EPA Science Inventory

    The four-dimensional data assimilation (FDDA) technique in the Weather Research and Forecasting (WRF) meteorological model has recently undergone an important update from the original version. Previous evaluation results have demonstrated that the updated FDDA approach in WRF pr...

  18. Planning Responses to Demographic Change. AIR 1986 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Taylor, Bryan J. R.; Taylor, Elizabeth A.

    A method for forecasting the number of college graduates in the United Kingdom is described, and suggestions are offered about ways that society should react to influence declining enrollments and potential reductions in technologically skilled graduates. Consideration is given to the implications of recruiting noncollege-bound individuals to…

  19. An application of a multi model approach for solar energy prediction in Southern Italy

    NASA Astrophysics Data System (ADS)

    Avolio, Elenio; Lo Feudo, Teresa; Calidonna, Claudia Roberta; Contini, Daniele; Torcasio, Rosa Claudia; Tiriolo, Luca; Montesanti, Stefania; Transerici, Claudio; Federico, Stefano

    2015-04-01

    The accuracy of the short and medium range forecast of solar irradiance is very important for solar energy integration into the grid. This issue is particularly important for Southern Italy, where a significant availability of solar energy is associated with a poor development of the grid. In this work we analyse the performance of two deterministic models for the prediction of surface temperature and short-wavelength radiance for two sites in southern Italy. Both parameters are needed to forecast the power production from solar power plants, so the performance of the forecast for these meteorological parameters is of paramount importance. The models considered in this work are RAMS (the Regional Atmospheric Modeling System) and WRF (the Weather Research and Forecasting model), and they were run for summer 2013 at 4 km horizontal resolution over Italy. The forecast lasts three days. Initial and dynamic boundary conditions are given by the 12 UTC deterministic forecast of the ECMWF-IFS (European Centre for Medium-Range Weather Forecasts - Integrated Forecasting System) model, and were available every 6 hours. Verification is performed against two surface stations located in Southern Italy, Lamezia Terme and Lecce, and is based on hourly output of the model forecasts. Results for the whole period for temperature show a positive bias for the RAMS model and a negative bias for the WRF model. RMSE is between 1 and 2 °C for both models. Results for the whole period for the short-wavelength radiance show a positive bias for both models (about 30 W/m2) and an RMSE of 100 W/m2. To reduce the model errors, a statistical post-processing technique, i.e. the multi-model approach, is adopted. In this approach the two models' outputs are weighted with a suitable set of weights computed over a training period. In general, the performance is improved by the application of the technique, and the RMSE is reduced by a sizeable fraction (i.e. larger than 10% of the initial RMSE) depending on the forecasting time and parameter. The performance of the multi-model is discussed as a function of the length of the training period and is compared with the performance of the MOS (Model Output Statistics) approach. ACKNOWLEDGMENTS This work is partially supported by projects PON04a2E Sinergreen-ResNovae - "Smart Energy Master for the energetic government of the territory" and PONa3_00363 "High Technology Infrastructure for Climate and Environment Monitoring" (I-AMICA), funded by the Italian Ministry of University and Research (MIUR) PON 2007-2013. The ECMWF and CNMCA (Centro Nazionale di Meteorologia e Climatologia Aeronautica) are acknowledged for the use of the MARS (Meteorological Archive and Retrieval System).
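
    The multi-model step described above amounts to learning combination weights on a training period and applying them to later forecasts. Below is a minimal version using ordinary least squares on two synthetic forecast series (labelled "RAMS-like" and "WRF-like" purely for illustration; the paper's actual weighting scheme may differ):

      import numpy as np

      rng = np.random.default_rng(8)
      n_train, n_test = 500, 200
      t = np.linspace(0, 20, n_train + n_test)
      truth = 25 + 5 * np.sin(t) + rng.normal(0, 1.0, t.size)
      rams = truth + 1.2 + rng.normal(0, 2.0, t.size)      # synthetic "RAMS-like" forecast (warm bias)
      wrf = truth - 0.8 + rng.normal(0, 2.5, t.size)       # synthetic "WRF-like" forecast (cold bias)

      # weights (and an intercept) estimated by least squares over the training period
      A = np.column_stack([np.ones(n_train), rams[:n_train], wrf[:n_train]])
      coeffs, *_ = np.linalg.lstsq(A, truth[:n_train], rcond=None)

      A_test = np.column_stack([np.ones(n_test), rams[n_train:], wrf[n_train:]])
      multi_model = A_test @ coeffs

      def rmse(pred, obs):
          return float(np.sqrt(np.mean((pred - obs) ** 2)))

      for name, pred in [("RAMS-like", rams[n_train:]), ("WRF-like", wrf[n_train:]),
                         ("multi-model", multi_model)]:
          print(f"{name:>12}: RMSE = {rmse(pred, truth[n_train:]):.2f}")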

  20. Evaluation of the North American Multi-Model Ensemble System for Monthly and Seasonal Prediction

    NASA Astrophysics Data System (ADS)

    Zhang, Q.

    2014-12-01

    Since August 2011, the real-time seasonal forecasts of the U.S. National Multi-Model Ensemble (NMME) have been made on the 8th of each month by the NCEP Climate Prediction Center (CPC). The participating models were NCEP/CFSv1&2, GFDL/CM2.2, NCAR/U.Miami/COLA/CCSM3, NASA/GEOS5, and IRI/ECHAM-a & ECHAM-f in the first year of the real-time NMME forecast. Two Canadian coupled models, CMC/CanCM3 and CM4, joined in, and CFSv1 and IRI's models dropped out in the second year. The NMME team at CPC collects monthly means of three variables, precipitation, temperature at 2 m, and sea surface temperature, from each modeling center on a 1x1 global grid, removes systematic errors, forms the grand ensemble mean with equal weight for each model mean, and produces probability forecasts with equal weight for each member of each model. This provides the NMME forecast, locked in schedule, for the CPC operational seasonal and monthly outlook. The basic verification metrics of seasonal and monthly prediction of NMME are calculated as an evaluation of skill, including both deterministic and probabilistic forecasts for the 3-year real-time (August 2011-July 2014) period and the 30-year retrospective forecast (1982-2011) of the individual models as well as the NMME ensemble. The motivation of this study is to provide skill benchmarks for future improvements of the NMME seasonal and monthly prediction system. We also want to establish whether the real-time and hindcast periods (used for bias correction in real time) are consistent. The experimental phase I of the project already supplies routine guidance to users of the NMME forecasts.

  1. Shared investment projects and forecasting errors: setting framework conditions for coordination and sequencing data quality activities.

    PubMed

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments' efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm's point of view, as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and show that, in particular for relatively good forecasters, most of our results are robust to changes in the parameters of our multi-agent simulation model.

  2. Initial assessment of a multi-model approach to spring flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2015-06-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring flood onset is crucial for optimal production. This requires useful forecasts of the accumulated discharge in the spring flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialised set-up of the HBV model. In this study, a number of new approaches to spring flood forecasting, which reflect the latest developments with respect to analysis and modelling on seasonal time scales, are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for three main Swedish rivers over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogous years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for specific locations and lead times improvements of 20-30 % are found. When combining all forecasts in a weighted multi-model approach, a mean improvement over all locations and lead times of nearly 10 % was indicated. This demonstrates the potential of the approach, and further development and optimisation into an operational system is ongoing.

  3. Quantitative precipitation forecasts in the Alps - an assessment from the Forecast Demonstration Project MAP D-PHASE

    NASA Astrophysics Data System (ADS)

    Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.

    2009-04-01

    The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the ability of today's models to forecast heavy precipitation and flood events in the Alpine region. For this purpose, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. Part of this system is a set of 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view, in terms of alerts, and from a quantitative perspective, in terms of precipitation rate. Various influencing factors such as lead time, accumulation time, selection of warning thresholds, or bias corrections will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics without requiring a point-to-point match will be described using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models compared to coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals a remarkably high variability between different models, which makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.
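
    A widely used fuzzy (neighbourhood) verification measure of the kind referred to above is the Fractions Skill Score, which compares forecast and observed exceedance fractions within a spatial window instead of requiring a point-to-point match. The sketch below applies it to a synthetic, spatially displaced rain field; the threshold and window sizes are illustrative, and the D-PHASE study may have used other fuzzy scores as well.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def fractions_skill_score(forecast, observed, threshold, window):
          # compare neighbourhood exceedance fractions instead of point-to-point matches
          f_frac = uniform_filter((forecast >= threshold).astype(float), size=window, mode="constant")
          o_frac = uniform_filter((observed >= threshold).astype(float), size=window, mode="constant")
          mse = np.mean((f_frac - o_frac) ** 2)
          mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
          return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

      rng = np.random.default_rng(9)
      obs = rng.gamma(0.6, 4.0, size=(200, 200))           # hypothetical radar accumulation field (mm)
      fcst = np.roll(obs, shift=(5, 5), axis=(0, 1))       # same rain, displaced by a few grid points

      for window in (1, 5, 15, 31):
          print(f"window {window:>2}: FSS = {fractions_skill_score(fcst, obs, 10.0, window):.3f}")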

  4. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    NASA Astrophysics Data System (ADS)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support better management of the marine environment (maritime security, environmental and resources protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system for the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger-scale Mediterranean Forecasting System (MFS) with a spatial resolution of 1.5-2 km. WMOP aims at reproducing both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role due to its strong interaction with the large-scale circulation in this region. An operational validation system has been developed to systematically assess the model outputs at daily, monthly, and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures make it possible to monitor and certify the general realism of the daily production of the ocean forecasting system before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day, both at the basin scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.

  5. Problems Related to Parallelization of CFD Algorithms on GPU, Multi-GPU and Hybrid Architectures

    NASA Astrophysics Data System (ADS)

    Biazewicz, Marek; Kurowski, Krzysztof; Ludwiczak, Bogdan; Napieraia, Krystyna

    2010-09-01

    Computational Fluid Dynamics (CFD) is one of the branches of fluid mechanics that uses numerical methods and algorithms to solve and analyze fluid flows. CFD is used in various domains, such as oil and gas reservoir uncertainty analysis, aerodynamic body shape optimization (e.g. planes, cars, ships, sport helmets, skis), natural phenomena analysis, numerical simulation for weather forecasting, or realistic visualizations. The CFD problem is very complex and needs a lot of computational power to obtain results in a reasonable time. We have implemented a parallel application for two-dimensional CFD simulation with a free surface approximation (MAC method) using new hardware architectures, in particular multi-GPU and hybrid computing environments. For this purpose we decided to use NVIDIA graphics cards with the CUDA environment due to its simplicity of programming and good computational performance. We used a finite difference discretization of the Navier-Stokes equations, where the fluid is propagated over an Eulerian grid. In this model, the behavior of the fluid inside a cell depends only on the properties of the local, surrounding cells, therefore it is well suited for GPU-based architectures. In this paper we demonstrate how to use the computing power of GPUs for CFD efficiently. Additionally, we present some best practices to help users analyze and improve the performance of CFD applications executed on GPUs. Finally, we discuss various challenges around the multi-GPU implementation using the example of matrix multiplication.

  6. Climatic Forecasting of Net Infiltration at Yucca Mountain Using Analogue Meteorological Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B. Faybishenko

    At Yucca Mountain, Nevada, future changes in climatic conditions will most likely alter net infiltration, or the drainage below the bottom of the evapotranspiration zone within the soil profile or flow across the interface between soil and the densely welded part of the Tiva Canyon Tuff. The objectives of this paper are to: (a) develop a semi-empirical model and forecast average net infiltration rates, using the limited meteorological data from analogue meteorological stations, for interglacial (present day), and future monsoon, glacial transition, and glacial climates over the Yucca Mountain region, and (b) corroborate the computed net-infiltration rates by comparing them with the empirically and numerically determined groundwater recharge and percolation rates through the unsaturated zone from published data. In this paper, the author presents an approach for calculations of net infiltration, aridity, and precipitation-effectiveness indices, using a modified Budyko's water-balance model, with reference-surface potential evapotranspiration determined from the radiation-based Penman (1948) formula. Results of calculations show that net infiltration rates are expected to generally increase from the present-day climate to monsoon climate, to glacial transition climate, and then to the glacial climate. The forecasting results indicate the overlap between the ranges of net infiltration for different climates. For example, the mean glacial net-infiltration rate corresponds to the upper-bound glacial transition net infiltration, and the lower-bound glacial net infiltration corresponds to the glacial transition mean net infiltration. Forecasting of net infiltration for different climate states is subject to numerous uncertainties associated with selecting climate analogue sites, using relatively short analogue meteorological records, neglecting the effects of vegetation and surface runoff and runon on a local scale, as well as possible anthropogenic climate changes.
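
    The record describes a modified Budyko water-balance model whose details are not given here, but the flavour of the calculation can be shown with the standard Budyko curve: actual evapotranspiration is estimated from precipitation and potential evapotranspiration through the aridity index, and net infiltration is approximated as the residual, with runoff neglected. The climate-state numbers below are invented for illustration and are not the Yucca Mountain values.

      import numpy as np

      def budyko_actual_et(precip, pet):
          # standard Budyko curve: AET/P as a function of the aridity index PET/P
          ai = pet / precip
          return precip * np.sqrt(ai * np.tanh(1.0 / ai) * (1.0 - np.exp(-ai)))

      # hypothetical annual means (mm/yr) loosely representing different climate states
      climates = {
          "present-day (interglacial)": (190.0, 1200.0),
          "monsoon":                    (300.0, 1100.0),
          "glacial transition":         (320.0,  900.0),
          "glacial":                    (350.0,  800.0),
      }

      for name, (p, pet) in climates.items():
          aet = budyko_actual_et(p, pet)
          net_infiltration = max(p - aet, 0.0)   # residual water balance, runoff neglected
          print(f"{name:>27}: AI = {pet / p:4.1f}, net infiltration ~ {net_infiltration:5.1f} mm/yr")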

  7. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    NASA Astrophysics Data System (ADS)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), and (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. How to use the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  8. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the recent developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitating systems and hurricanes/typhoons will be presented. High-resolution spatial and temporal visualization will be utilized to show the evolution of precipitation processes. How to use the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  9. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 sq km in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling systems to study the interactions between clouds, precipitation, and aerosols will be presented. How to use the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  10. Using Multi-Scale Modeling Systems to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. How to use the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  11. Ensembles vs. information theory: supporting science under uncertainty

    NASA Astrophysics Data System (ADS)

    Nearing, Grey S.; Gupta, Hoshin V.

    2018-05-01

    Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.

  12. The Advantages of Hybrid 4DEnVar in the Context of the Forecast Sensitivity to Initial Conditions

    NASA Astrophysics Data System (ADS)

    Song, Hyo-Jong; Shin, Seoleun; Ha, Ji-Hyun; Lim, Sujeong

    2017-11-01

    Hybrid four-dimensional ensemble variational data assimilation (hybrid 4DEnVar) is a prospective successor to three-dimensional variational data assimilation (3DVar) in operational weather prediction centers that are currently developing a new weather prediction model and in those that do not operate adjoint models. In experiments using real observations, hybrid 4DEnVar improved Northern Hemisphere (NH; 20°N-90°N) 500 hPa geopotential height forecasts up to 5 days in a NH summer month compared to 3DVar, with statistical significance. This result is verified against ERA-Interim through a Monte Carlo test. A regression analysis associates the sensitivity of the 5-day forecast with the quality of the initial condition. The increased analysis skill for mid-tropospheric midlatitude temperature and subtropical moisture has the most apparent effect on forecast skill in the NH, including a typhoon prediction case. By attributing the analysis improvements from hybrid 4DEnVar separately to the ensemble background error covariance (BEC), its four-dimensional (4-D) extension, and the climatological BEC, it is revealed that the ensemble BEC contributes to the subtropical moisture analysis, whereas the 4-D extension contributes to the mid-tropospheric midlatitude temperature. This result implies that the hourly wind-mass correlation within the 6 h analysis window is required to fully exploit the potential of hybrid 4DEnVar for the midlatitude temperature analysis. However, the temporal ensemble correlation, on an hourly time scale, between moisture and other variables is not valid, so it could not improve the hybrid 4DEnVar analysis.

  13. Cloud Forecasting and 3-D Radiative Transfer Model Validation using Citizen-Sourced Imagery

    NASA Astrophysics Data System (ADS)

    Gasiewski, A. J.; Heymsfield, A.; Newman Frey, K.; Davis, R.; Rapp, J.; Bansemer, A.; Coon, T.; Folsom, R.; Pfeufer, N.; Kalloor, J.

    2017-12-01

    Cloud radiative feedback mechanisms are one of the largest sources of uncertainty in global climate models. Variations in local 3D cloud structure impact the interpretation of NASA CERES and MODIS data for top-of-atmosphere radiation studies over clouds. Much of this uncertainty results from a lack of knowledge of cloud vertical and horizontal structure. Surface-based data on 3-D cloud structure from a multi-sensor array of low-latency ground-based cameras can be used to intercompare radiative transfer models based on MODIS and other satellite data with CERES data to improve the 3-D cloud parameterizations. Closely related, forecasting of solar insolation and associated cloud cover on time scales out to 1 hour and with spatial resolution of 100 meters is valuable for stabilizing power grids with high solar photovoltaic penetrations. Data for cloud-advection-based solar insolation forecasting, obtained from a bottom-up perspective with the spatial resolution and latency needed to predict high-ramp-rate events, are strongly correlated with cloud-induced fluctuations. The development of grid management practices for improved integration of renewable solar energy thus also benefits from a multi-sensor camera array. The data needs for both 3D cloud radiation modelling and solar forecasting are being addressed using a network of low-cost upward-looking visible light CCD sky cameras positioned at 2 km spacing over an area 30-60 km in size, acquiring imagery at 30-second intervals. Such cameras can be manufactured in quantity and deployed by citizen volunteers at a marginal cost of 200-400 and operated unattended using existing communications infrastructure. A trial phase to understand the potential utility of up-looking multi-sensor visible imagery is underway within this NASA Citizen Science project. To develop the initial data sets necessary to optimally design a multi-sensor cloud camera array, a team of 100 citizen scientists using self-owned PDA cameras is being organized to collect distributed cloud data sets suitable for MODIS-CERES cloud radiation science and solar forecasting algorithm development. A low-cost and robust sensor design suitable for large-scale fabrication and long-term deployment has been developed during the project prototyping phase.

  14. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
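
    A stripped-down version of the censored EMOS idea can be written directly from the description above: the forecast distribution is a normal left-censored at the threshold (a point mass at the threshold plus a continuous part), its location and scale are affine functions of the ensemble mean and variance, and the coefficients are found by minimising the average CRPS over a training set. The sketch below evaluates the CRPS by numerical integration rather than a closed form, and the data, the threshold, and the Box-Cox and multivariate steps are omitted or invented for illustration.

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.optimize import minimize
      from scipy.stats import norm

      def crps_censored_normal(mu, sigma, y, c, grid):
          # predictive CDF: zero below the censoring threshold c, normal CDF above it
          F = np.where(grid[None, :] < c, 0.0,
                       norm.cdf(grid[None, :], mu[:, None], sigma[:, None]))
          H = (grid[None, :] >= y[:, None]).astype(float)
          return trapezoid((F - H) ** 2, grid, axis=1)

      rng = np.random.default_rng(10)
      n, m, c = 400, 8, 50.0                     # cases, ensemble members, threshold (m3/s)
      ens = np.maximum(rng.normal(120, 40, (n, m)) + rng.normal(0, 15, (n, 1)), c)
      obs = np.maximum(ens.mean(axis=1) + rng.normal(0, 20, n), c)

      ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1, ddof=1)
      grid = np.linspace(c, obs.max() + 200.0, 400)

      def mean_crps(params):
          a, b, gamma, delta = params
          mu = a + b * ens_mean
          sigma = np.sqrt(np.maximum(gamma + delta * ens_var, 1e-6))
          return crps_censored_normal(mu, sigma, obs, c, grid).mean()

      res = minimize(mean_crps, x0=[0.0, 1.0, 10.0, 1.0], method="Nelder-Mead")
      print("EMOS coefficients [a, b, gamma, delta]:", np.round(res.x, 2),
            "mean CRPS:", round(res.fun, 2))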

  15. Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

    2016-04-01

    Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques receives growing attention, and a number of studies have shown their benefit. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region in Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, together with deterministic and probabilistic ECMWF forecasts with 50 ensemble members, are used to force the MGB-IPH hydrological model and generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data. It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.
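
    The deterministic variant of such a receding-horizon optimisation can be sketched as a small linear program: given a forecast inflow sequence, choose releases that maximise a power proxy while keeping storage within bounds and the downstream flow below a flood limit. The numbers, bounds, and linear objective below are invented for illustration; the actual scheme uses a full model predictive control formulation and, in the stochastic case, the ensemble members as scenarios.

      import numpy as np
      from scipy.optimize import linprog

      # hypothetical one-shot optimisation of daily releases over a 30-day horizon
      horizon = 30
      inflow = 800 + 300 * np.sin(np.linspace(0, np.pi, horizon))    # forecast daily inflow volumes
      s0, s_min, s_max = 15_000.0, 12_000.0, 19_000.0                # storage bounds (arbitrary units)
      r_min, r_max = 300.0, 1_500.0                                  # turbine release bounds
      flood_limit = 1_200.0                                          # max safe flow at downstream city

      # decision variables: releases r_t; storage follows s_t = s0 + cumsum(inflow - r)_t
      L = np.tril(np.ones((horizon, horizon)))          # cumulative-sum operator
      cum_in = np.cumsum(inflow)

      # maximise total release (proxy for energy) -> minimise -sum(r)
      c = -np.ones(horizon)
      # storage constraints: s_min <= s0 + cum_in - L r <= s_max
      A_ub = np.vstack([L, -L])
      b_ub = np.concatenate([s0 + cum_in - s_min, s_max - s0 - cum_in])
      bounds = [(r_min, min(r_max, flood_limit))] * horizon

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      print("optimal total release:", round(-res.fun, 1), "status:", res.message)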

  16. A Public-Private-Academic Partnership to Advance Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haupt, Sue Ellen

    The National Center for Atmospheric Research (NCAR) is pleased to have led a partnership to advance the state of the science of solar power forecasting by designing, developing, building, deploying, testing, and assessing the SunCast™ Solar Power Forecasting System. The project has included cutting-edge research, testing in several geographically and climatologically diverse high-penetration solar utilities and Independent System Operators (ISOs), and wide dissemination of the research results to raise the bar on solar power forecasting technology. The partners include three other national laboratories, six universities, and industry partners. This public-private-academic team has worked in concert to perform use-inspired research that advances both the necessary solar power forecasting technologies and the metrics for evaluating them. The project has culminated in a year-long, full-scale demonstration of providing irradiance and power forecasts to utilities and ISOs to use in their operations. The project focused on providing elements of a value chain, beginning with the weather that causes a deviation from clear-sky irradiance and progressing through monitoring and observations, modeling, forecasting, dissemination and communication of the forecasts, interpretation of the forecast, and decision-making, which produces outcomes that have an economic value. The system has been evaluated using metrics developed specifically for this project, which has provided rich information on model and system performance. Research was accomplished on the very short range (0-6 hours) Nowcasting system as well as on the longer-term (6-72 hour) forecasting system. The shortest-range forecasts are based on observations in the field. The shortest-range system, built by Brookhaven National Laboratory (BNL) and based on Total Sky Imagers (TSIs), is TSICast, which operates on the shortest time scale with a latency of only a few minutes and forecasts that currently go out to about 15 min. This project has facilitated research in improving the hardware and software so that the new high-definition cameras deployed at multiple nearby locations allow discernment of the clouds at varying levels and advection according to the winds observed at those levels. Improvements over “smart persistence” are about 29% even for these very short forecasts. StatCast is based on pyranometer data measured at the site as well as concurrent meteorological observations and forecasts; it uses regime-dependent artificial intelligence forecasting techniques and has been shown to improve on “smart persistence” forecasts by 15-50%. A second category of short-range forecasting systems employs satellite imagery and uses that information to discern clouds and their motion, allowing them to project the clouds, and the resulting blockage of irradiance, in time. CIRACast (the system produced by the Cooperative Institute for Research in the Atmosphere [CIRA] at Colorado State University) was already one of the more advanced cloud motion systems, which is the reason that team was brought to this project. During the project timeframe, the CIRA team was able to advance cloud shadowing, parallax removal, and the implementation of better advecting winds at different altitudes. CIRACast generally shows a 25-40% improvement over Smart Persistence between sunrise and approximately 1600 UTC (Coordinated Universal Time).
A second satellite-based system, MADCast (Multi-sensor Advective Diffusive foreCast system), assimilates data from multiple satellite imagers and profilers to build a fully three-dimensional picture of the cloud field within the dynamic core of WRF, which allows the clouds to be advected directly by the Weather Research and Forecasting (WRF) model dynamics. During 2015, MADCast provided at least a 70% improvement over Smart Persistence, with most of that skill being derived during partly cloudy conditions. After WRF-Solar™ showed initial success, it was also deployed in nowcasting mode, with coarser runs out to 6 hours made hourly. It provided improvements on the order of 50-60% over Smart Persistence for forecasts up to 1600 UTC. The advantages of WRF-Solar-Nowcasting and MADCast were then blended to develop the new MAD-WRF model, which incorporates the most important features of each of those models, both assimilating satellite cloud fields and using WRF-Solar physics to develop and dissipate clouds. MAE for MAD-WRF forecasts from 3-6 hours is improved over WRF-Solar-Nowcasting by 20%. While all the Nowcasting system components by themselves provide improvement over Smart Persistence, the largest benefit is derived when they are smartly blended together by the Nowcasting Integrator to produce an integrated forecast. The development of WRF-Solar™ under this project has provided the first numerical weather prediction (NWP) model specifically designed to meet the needs of irradiance forecasting. The first augmentation improved the solar tracking algorithm to account for deviations associated with the eccentricity of the Earth’s orbit and the obliquity of the Earth. Second, WRF-Solar™ added the direct normal irradiance (DNI) and diffuse (DIF) components from the radiation parameterization to the model output. Third, efficient parameterizations were implemented to either interpolate the irradiance in between calls to the expensive radiative transfer parameterization, or to use a fast radiative transfer code that avoids computing three-dimensional heating rates but provides the surface irradiance. Fourth, a new parameterization was developed to improve the representation of absorption and scattering of radiation by aerosols (the aerosol direct effect). A fifth advance is that the aerosols now interact with the cloud microphysics, altering the cloud evolution and radiative properties, an effect that has traditionally been implemented only in computationally costly atmospheric chemistry models. A sixth development accounts for the feedbacks that sub-grid-scale clouds produce in shortwave irradiance, as implemented in a shallow cumulus parameterization. Finally, WRF-Solar™ also allows assimilation of infrared irradiances from satellites to determine the three-dimensional cloud field, allowing for an improved initialization of the cloud field that increases the performance of short-range forecasts. We find that WRF-Solar™ can improve clear-sky irradiance prediction by 15-80% over a standard version of WRF, depending on location and cloud conditions. In a formal comparison to the NAM baseline, WRF-Solar™ showed improvements in the Day-Ahead forecast of 22-42%. The SunCast™ system requires substantial software engineering to blend all of the new model components as well as existing publicly available NWP model runs. To do this we use an expert system for the Nowcasting blender and the Dynamic Integrated foreCast (DICast®) system for the NWP models.
These two systems are then blended; an empirical power conversion method converts the irradiance predictions to power, and an analog ensemble (AnEn) approach is then applied to further tune the forecast as well as to estimate its uncertainty. The AnEn module decreased RMSE (root mean squared error) by 17% over the blended SunCast™ power forecasts and provided skill in the probabilistic forecast, with a Brier Skill Score of 0.55. In addition, we have also developed a Gridded Atmospheric Forecast System (GRAFS) in parallel, leveraging cost-share funds. An economic evaluation based on Production Cost Modeling for the Public Service Company of Colorado showed that the observed 50% improvement in forecast accuracy will save their customers $819,200 with the projected MW deployment for 2024. Using econometrics, NCAR has scaled these savings to a national level and shown that the expected annual savings for this 50% forecast error reduction range from $11M in 2015 to $43M expected in 2040 with increased solar deployment. This amounts to $455M in discounted savings over the 26-year period of analysis.
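
    The headline numbers above (a 17% RMSE reduction from the AnEn module and a Brier Skill Score of 0.55) correspond to standard verification metrics; the sketch below shows one plausible way to compute them, with illustrative names and no connection to the project's actual evaluation code.

```python
# Minimal sketch of two standard verification metrics: percentage RMSE
# reduction of one forecast relative to another, and a Brier skill score for a
# probabilistic forecast of a threshold event. All names are illustrative.
import numpy as np

def rmse_reduction(candidate, baseline, obs):
    """Percent reduction in RMSE of `candidate` relative to `baseline`."""
    rmse_c = np.sqrt(np.mean((candidate - obs) ** 2))
    rmse_b = np.sqrt(np.mean((baseline - obs) ** 2))
    return 100.0 * (rmse_b - rmse_c) / rmse_b

def brier_skill_score(prob_forecast, event_occurred, reference_prob):
    """BSS = 1 - BS_forecast / BS_reference (reference: e.g. climatology)."""
    bs_f = np.mean((prob_forecast - event_occurred) ** 2)
    bs_ref = np.mean((reference_prob - event_occurred) ** 2)
    return 1.0 - bs_f / bs_ref
```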

  17. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model produces a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
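
    A hedged sketch of the forecasting pipeline described above, assuming statsmodels for both the STL decomposition and the ARIMA fit; the model order, the seasonal period and the way the seasonal component is re-attached are illustrative assumptions rather than the authors' exact configuration.

```python
# Sketch of an STL + ARIMA forecast: decompose the utilisation series with
# STL, model the seasonally adjusted part with ARIMA, and add the most recent
# seasonal cycle back over the forecast horizon.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA

def stl_arima_forecast(series: pd.Series, period: int = 24, steps: int = 12):
    decomp = STL(series, period=period, robust=True).fit()
    seasonally_adjusted = series - decomp.seasonal
    model = ARIMA(seasonally_adjusted, order=(2, 1, 1)).fit()
    mean_forecast = np.asarray(model.forecast(steps=steps))
    # re-use the last observed seasonal cycle for the forecast horizon
    seasonal_future = np.resize(decomp.seasonal.iloc[-period:].to_numpy(), steps)
    return mean_forecast + seasonal_future
```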

  18. An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases

    NASA Astrophysics Data System (ADS)

    Ramaswamy, V.; Saleh, F.

    2017-12-01

    The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision-making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during the extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.

  19. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model produces a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  20. Global Modeling and Assimilation Office Annual Report and Research Highlights 2011-2012

    NASA Technical Reports Server (NTRS)

    Rienecker, Michele M.

    2012-01-01

    Over the last year, the Global Modeling and Assimilation Office (GMAO) has continued to advance our GEOS-5-based systems, updating products for both weather and climate applications. We contributed hindcasts and forecasts to the National Multi-Model Ensemble (NMME) of seasonal forecasts and the suite of decadal predictions to the Coupled Model Intercomparison Project (CMIP5).

  1. Multi-model assessment of the impact of soil moisture initialization on mid-latitude summer predictability

    NASA Astrophysics Data System (ADS)

    Ardilouze, Constantin; Batté, L.; Bunzel, F.; Decremer, D.; Déqué, M.; Doblas-Reyes, F. J.; Douville, H.; Fereday, D.; Guemas, V.; MacLachlan, C.; Müller, W.; Prodhomme, C.

    2017-12-01

    Land surface initial conditions have been recognized as a potential source of predictability in sub-seasonal to seasonal forecast systems, at least for near-surface air temperature prediction over the mid-latitude continents. Yet, few studies have systematically explored such an influence over a sufficient hindcast period and in a multi-model framework to produce a robust quantitative assessment. Here, a dedicated set of twin experiments has been carried out with boreal summer retrospective forecasts over the 1992-2010 period performed by five different global coupled ocean-atmosphere models. The impact of a realistic versus climatological soil moisture initialization is assessed in two regions with high potential previously identified as hotspots of land-atmosphere coupling, namely the North American Great Plains and South-Eastern Europe. Over the latter region, temperature predictions show a significant improvement, especially over the Balkans. Forecast systems better simulate the warmest summers if they follow pronounced dry initial anomalies. It is hypothesized that models manage to capture a positive feedback between high temperature and low soil moisture content prone to dominate over other processes during the warmest summers in this region. Over the Great Plains, however, improving the soil moisture initialization does not lead to any robust gain of forecast quality for near-surface temperature. It is suggested that model biases prevent the forecast systems from making the most of the improved initial conditions.

  2. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting.

    PubMed

    Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.
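
    A minimal sketch of the ensemble idea described above: several networks differing only in their random weight initialization are trained, and the spread of their predictions is used as an uncertainty band. scikit-learn's gradient-based training stands in here for the particle swarm optimization used in the paper, and all names are illustrative.

```python
# Sketch of an ensemble of ANNs whose spread quantifies prediction uncertainty.
import numpy as np
from sklearn.neural_network import MLPRegressor

def ensemble_ann_forecast(X_train, y_train, X_new, n_members=20):
    preds = []
    for seed in range(n_members):
        member = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                              random_state=seed)
        member.fit(X_train, y_train)
        preds.append(member.predict(X_new))
    preds = np.stack(preds)                        # (n_members, n_samples)
    mean = preds.mean(axis=0)
    lower, upper = np.percentile(preds, [5, 95], axis=0)
    return mean, lower, upper                      # forecast and 90% band
```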

  3. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting

    PubMed Central

    Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an “optimal” weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds. PMID:27382627

  4. A New Time-varying Concept of Risk in a Changing Climate.

    PubMed

    Sarhadi, Ali; Ausín, María Concepción; Wiper, Michael P

    2016-10-20

    In a changing climate arising from anthropogenic global warming, the nature of extreme climatic events is changing over time. Existing analytical stationary-based risk methods, however, assume multi-dimensional extreme climate phenomena will not significantly vary over time. To strengthen the reliability of infrastructure designs and the management of water systems in the changing environment, multidimensional stationary risk studies should be replaced with a new adaptive perspective. The results of a comparison indicate that current multi-dimensional stationary risk frameworks are no longer applicable to projecting the changing behaviour of multi-dimensional extreme climate processes. Using static stationary-based multivariate risk methods may lead to undesirable consequences in designing water system infrastructures. The static stationary concept should be replaced with a flexible multi-dimensional time-varying risk framework. The present study introduces a new multi-dimensional time-varying risk concept to be incorporated in updating infrastructure design strategies under changing environments arising from human-induced climate change. The proposed generalized time-varying risk concept can be applied for all stochastic multi-dimensional systems that are under the influence of changing environments.

  5. Absolute Lower Bound on the Bounce Action

    NASA Astrophysics Data System (ADS)

    Sato, Ryosuke; Takimoto, Masahiro

    2018-03-01

    The decay rate of a false vacuum is determined by the minimal action solution of the tunneling field: the bounce. In this Letter, we focus on models with scalar fields which have a canonical kinetic term in N (> 2)-dimensional Euclidean space, and derive an absolute lower bound on the bounce action. In the case of four-dimensional space, we show the bounce action is generically larger than 24/λcr, where λcr ≡ max[-4V(ϕ)/|ϕ|⁴] with the false vacuum being at ϕ = 0 and V(0) = 0. We derive this bound on the bounce action without solving the equation of motion explicitly. Our bound is derived by a quite simple discussion, and it provides useful information even if it is difficult to obtain the explicit form of the bounce solution. Our bound offers a sufficient condition for the stability of a false vacuum, and it is useful as a quick check on the vacuum stability for given models. Our bound can be applied to a broad class of scalar potentials with any number of scalar fields. We also discuss a necessary condition for the bounce action taking a value close to this lower bound.

  6. The Rényi entropy H2 as a rigorous, measurable lower bound for the entropy of the interaction region in multi-particle production processes

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.; Zalewski, K.

    2006-10-01

    A model-independent lower bound on the entropy S of the multi-particle system produced in high energy collisions, provided by the measurable Rényi entropy H2, is shown to be very effective. Estimates show that the ratio H2/S remains close to one half for all realistic values of the parameters.

  7. Central Schemes for Multi-Dimensional Hamilton-Jacobi Equations

    NASA Technical Reports Server (NTRS)

    Bryson, Steve; Levy, Doron; Biegel, Bryan (Technical Monitor)

    2002-01-01

    We present new, efficient central schemes for multi-dimensional Hamilton-Jacobi equations. These non-oscillatory, non-staggered schemes are first- and second-order accurate and are designed to scale well with an increasing dimension. Efficiency is obtained by carefully choosing the location of the evolution points and by using a one-dimensional projection step. First- and second-order accuracy is verified for a variety of multi-dimensional, convex and non-convex problems.

  8. Impact of single-point GPS integrated water vapor estimates on short-range WRF model forecasts over southern India

    NASA Astrophysics Data System (ADS)

    Kumar, Prashant; Gopalan, Kaushik; Shukla, Bipasha Paul; Shyam, Abhineet

    2017-11-01

    Specifying physically consistent and accurate initial conditions is one of the major challenges of numerical weather prediction (NWP) models. In this study, ground-based global positioning system (GPS) integrated water vapor (IWV) measurements available from the International Global Navigation Satellite Systems (GNSS) Service (IGS) station in Bangalore, India, are used to assess the impact of GPS data on NWP model forecasts over southern India. Two experiments are performed with and without assimilation of GPS-retrieved IWV observations during the Indian winter monsoon period (November-December, 2012) using a four-dimensional variational (4D-Var) data assimilation method. Assimilation of GPS data improved the model IWV analysis as well as the subsequent forecasts. There is a positive impact of ˜10 % over Bangalore and nearby regions. The Weather Research and Forecasting (WRF) model-predicted 24-h surface temperature forecasts have also improved when compared with observations. Small but significant improvements were found in the rainfall forecasts compared to control experiments.

  9. Upper bounds on the error probabilities and asymptotic error exponents in quantum multiple state discrimination

    NASA Astrophysics Data System (ADS)

    Audenaert, Koenraad M. R.; Mosonyi, Milán

    2014-10-01

    We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ1, …, σr. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, for which it has been hypothesized that it is given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ1, …, σr), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences, C(σ1, …, σr) = min_{j<k} C(σj, σk).

  10. Improving of local ozone forecasting by integrated models.

    PubMed

    Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš

    2016-09-01

    This paper discusses the problem of forecasting the maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, the methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large three-dimensional geographical space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
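
    A hedged sketch of the integrated-model idea: forecasts from the air-quality and weather models are stacked with on-site measurements into one feature matrix and a multilayer perceptron predicts the next-day maximum hourly ozone. The column names and network size are illustrative assumptions, not the authors' actual configuration.

```python
# Sketch of an "integrated model": model forecasts + local measurements -> MLP.
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fit_integrated_ozone_model(df: pd.DataFrame):
    # hypothetical feature columns combining QualeAria, WRF and station data
    features = ["qualearia_o3_max", "wrf_tmax", "wrf_wind", "wrf_radiation",
                "obs_o3_max_today", "obs_no2_today"]
    X = df[features].to_numpy()
    y = df["obs_o3_max_tomorrow"].to_numpy()   # target: next-day max ozone
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000))
    return model.fit(X, y)
```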

  11. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in its decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide large quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key elements that enable operators to evaluate the forecast in real time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. This tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
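
    A minimal sketch of the kind of interactive summary page described above, assuming Bokeh and a simple ensemble layout with one column per member; the data schema is illustrative and not the DEP's actual database structure.

```python
# Sketch of a standalone HTML page showing ensemble traces and the observation.
from bokeh.plotting import figure
from bokeh.io import output_file, save

def ensemble_summary_page(times, members, obs, path="forecast_summary.html"):
    """times: sequence of datetimes; members: DataFrame, one column per member."""
    p = figure(title="Ensemble streamflow forecast", x_axis_type="datetime",
               x_axis_label="time", y_axis_label="flow",
               tools="pan,box_zoom,hover,reset")
    for col in members.columns:
        p.line(times, members[col], line_alpha=0.3, line_color="gray")
    p.line(times, obs, line_width=2, line_color="navy", legend_label="observed")
    output_file(path)   # write a self-contained HTML page
    save(p)
```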

  12. Development of a decision support system for monitoring, reporting and forecasting ecological conditions of the Appalachian Trail

    USGS Publications Warehouse

    Wang, Yeqiao; Nemani, Ramakrishna; Dieffenbach, Fred; Stolte, Kenneth; Holcomb, Glenn B.; Robinson, Matt; Reese, Casey C.; McNiff, Marcia; Duhaime, Roland; Tierney, Geri; Mitchell, Brian; August, Peter; Paton, Peter; LaBash, Charles

    2010-01-01

    This paper introduces a collaborative multi-agency effort to develop an Appalachian Trail (A.T.) MEGA-Transect Decision Support System (DSS) for monitoring, reporting and forecasting ecological conditions of the A.T. and the surrounding lands. The project aims to improve decision-making on management of the A.T. by providing a coherent framework for data integration, status reporting and trend analysis. The A.T. MEGA-Transect DSS is designed to integrate NASA multi-platform sensor data and modeling through the Terrestrial Observation and Prediction System (TOPS) and in situ measurements from A.T. MEGA-Transect partners to address identified natural resource priorities and improve resource management decisions.

  13. Impact of AIRS Thermodynamic Profiles on Precipitation Forecasts for Atmospheric River Cases Affecting the Western United States

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley T.; Jedlovec, Gary J.; Blakenship, Clay B.; Wick, Gary A.; Neiman, Paul J.

    2013-01-01

    This project is a collaborative activity between the NASA Short-term Prediction Research and Transition (SPoRT) Center and the NOAA Hydrometeorology Testbed (HMT) to evaluate a SPoRT Atmospheric Infrared Sounder (AIRS; Aumann et al. 2003) enhanced moisture analysis product. We test the impact of assimilating AIRS temperature and humidity profiles above clouds and in partly cloudy regions, using the three-dimensional variational Gridpoint Statistical Interpolation (GSI) data assimilation (DA) system (Developmental Testbed Center 2012) to produce a new analysis. Forecasts of the Weather Research and Forecasting (WRF) model initialized from the new analysis are compared to control forecasts without the additional AIRS data. We focus on cases where atmospheric rivers caused heavy precipitation on the US West Coast. We verify the forecasts by comparison with dropsondes and the Cooperative Institute for Research in the Atmosphere (CIRA) Blended Total Precipitable Water product.

  14. Forecast model applications of retrieved three dimensional liquid water fields

    NASA Technical Reports Server (NTRS)

    Raymond, William H.; Olson, William S.

    1990-01-01

    Forecasts are made for tropical storm Emily using heating rates derived from the SSM/I physical retrievals described in chapters 2 and 3. Average values of the latent heating rates from the convective and stratiform cloud simulations, used in the physical retrieval, are obtained for individual 1.1 km thick vertical layers. Then, the layer-mean latent heating rates are regressed against the slant-path-integrated liquid and ice precipitation water contents to determine the best-fit two-parameter regression coefficients for each layer. The regression formulae and retrieved precipitation water contents are utilized to infer the vertical distribution of heating rates for forecast model applications. In the forecast model, diabatic temperature contributions are calculated and used in a diabatic initialization, or in a diabatic initialization combined with a diabatic forcing procedure. Our forecasts show that the spin-up of precipitation processes in tropical storm Emily is greatly accelerated through the application of the data.

  15. Low-frequency seismic events in a wider volcanological context

    NASA Astrophysics Data System (ADS)

    Neuberg, J. W.; Collombet, M.

    2006-12-01

    Low-frequency seismic events have been at the centre of attention for several years, particularly on volcanoes with highly viscous magmas. The ultimate aim is to detect changes in volcanic activity by identifying changes in the seismic behaviour, in order to forecast an eruption or, in the case of an ongoing eruption, to forecast the short- and long-term behaviour of the volcanic system. A major boost in recent years came from several multi-parameter volcanic monitoring and modelling programs, which allowed multi-disciplinary groups of volcanologists to interpret seismic signals together with, e.g., ground deformation, stress field analysis and petrological information. This talk will give several examples of such multi-disciplinary projects, focussing on the joint modelling of seismic source processes for low-frequency events together with advanced magma flow models, and on the signs of magma movement in the deformation and stress field at the surface.

  16. Revealing the membrane-bound structure of neurokinin A using neutron diffraction

    NASA Astrophysics Data System (ADS)

    Darkes, Malcolm J. M.; Hauss, Thomas; Dante, Silvia; Bradshaw, Jeremy P.

    2000-03-01

    Neurokinin A (or substance K) belongs to the tachykinin family, a group of small amphipathic peptides that bind to specific membrane-embedded, G-protein coupled receptors. The agonist/receptor complex is quaternary in nature because the receptor binding sites are thought to be located within the lipid bilayer and because the role of water cannot be ignored. The cell membrane acts as a solvent to accumulate peptide and as an inducer of peptide secondary structure. The three-dimensional shape that the peptide assumes when associated with the cell membrane will be an important parameter with regard to receptor selectivity and affinity. Neutron diffraction measurements were carried out in order to define the location of the N-terminus of the peptide in synthetic phospholipid multi-bilayer stacks.

  17. Image matrix processor for fast multi-dimensional computations

    DOEpatents

    Roberson, George P.; Skeate, Michael F.

    1996-01-01

    An apparatus for multi-dimensional computation which comprises a computation engine, including a plurality of processing modules. The processing modules are configured in parallel and compute respective contributions to a computed multi-dimensional image of respective two dimensional data sets. A high-speed, parallel access storage system is provided which stores the multi-dimensional data sets, and a switching circuit routes the data among the processing modules in the computation engine and the storage system. A data acquisition port receives the two dimensional data sets representing projections through an image, for reconstruction algorithms such as encountered in computerized tomography. The processing modules include a programmable local host, by which they may be configured to execute a plurality of different types of multi-dimensional algorithms. The processing modules thus include an image manipulation processor, which includes a source cache, a target cache, a coefficient table, and control software for executing image transformation routines using data in the source cache and the coefficient table and loading resulting data in the target cache. The local host processor operates to load the source cache with a two dimensional data set, loads the coefficient table, and transfers resulting data out of the target cache to the storage system, or to another destination.

  18. Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications

    PubMed Central

    Huang, Jian; Zhang, Cun-Hui

    2013-01-01

    The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including the generalized linear models. We study the estimation, prediction, selection and sparsity properties of the weighted ℓ1-penalized estimator in sparse, high-dimensional settings where the number of predictors p can be much larger than the sample size n. Adaptive Lasso is considered as a special case. A multistage method is developed to approximate concave regularized estimation by applying an adaptive Lasso recursively. We provide prediction and estimation oracle inequalities for single- and multi-stage estimators, a general selection consistency theorem, and an upper bound for the dimension of the Lasso estimator. Important models including the linear regression, logistic regression and log-linear models are used throughout to illustrate the applications of the general results. PMID:24348100
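
    A hedged sketch of the multistage idea described above: a concave-regularized estimate is approximated by solving a sequence of weighted ℓ1 problems, each weighted Lasso being implemented by rescaling the columns of the design matrix. The penalty level, the number of stages and the weight update are illustrative choices, not the authors' exact algorithm.

```python
# Sketch of a multistage adaptive Lasso via column rescaling.
import numpy as np
from sklearn.linear_model import Lasso

def multistage_adaptive_lasso(X, y, alpha=0.1, n_stages=3, eps=1e-4):
    n, p = X.shape
    weights = np.ones(p)
    beta = np.zeros(p)
    for _ in range(n_stages):
        Xw = X / weights                       # column j scaled by 1/weight_j
        fit = Lasso(alpha=alpha, max_iter=10000).fit(Xw, y)
        beta = fit.coef_ / weights             # map back to the original scale
        weights = 1.0 / (np.abs(beta) + eps)   # heavier penalty on small coefficients
    return beta
```

    The rescaling works because the weighted penalty sum_j w_j |beta_j| becomes a plain ℓ1 penalty after the change of variables gamma_j = w_j * beta_j, so each stage can reuse an ordinary Lasso solver.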

  19. Optical transitions in two-dimensional topological insulators with point defects

    NASA Astrophysics Data System (ADS)

    Sablikov, Vladimir A.; Sukhanov, Aleksei A.

    2016-12-01

    Nontrivial properties of electronic states in topological insulators are inherent not only to the surface and boundary states, but to bound states localized at structure defects as well. We clarify how the unusual properties of the defect-induced bound states are manifested in optical absorption spectra in two-dimensional topological insulators. The calculations are carried out for defects with short-range potential. We find that the defects give rise to the appearance of specific features in the absorption spectrum, which are an inherent property of topological insulators. They have the form of two or three absorption peaks that are due to intracenter transitions between electron-like and hole-like bound states.

  20. Adaptive Numerical Algorithms in Space Weather Modeling

    NASA Technical Reports Server (NTRS)

    Toth, Gabor; vanderHolst, Bart; Sokolov, Igor V.; DeZeeuw, Darren; Gombosi, Tamas I.; Fang, Fang; Manchester, Ward B.; Meng, Xing; Nakib, Dalal; Powell, Kenneth G.

    2010-01-01

    Space weather describes the various processes in the Sun-Earth system that present danger to human health and technology. The goal of space weather forecasting is to provide an opportunity to mitigate these negative effects. Physics-based space weather modeling is characterized by disparate temporal and spatial scales as well as by different physics in different domains. A multi-physics system can be modeled by a software framework comprising several components. Each component corresponds to a physics domain, and each component is represented by one or more numerical models. The publicly available Space Weather Modeling Framework (SWMF) can execute and couple together several components distributed over a parallel machine in a flexible and efficient manner. The framework also allows resolving disparate spatial and temporal scales with independent spatial and temporal discretizations in the various models. Several of the computationally most expensive domains of the framework are modeled by the Block-Adaptive Tree Solar wind Roe Upwind Scheme (BATS-R-US) code that can solve various forms of the magnetohydrodynamics (MHD) equations, including Hall, semi-relativistic, multi-species and multi-fluid MHD, anisotropic pressure, radiative transport and heat conduction. Modeling disparate scales within BATS-R-US is achieved by a block-adaptive mesh both in Cartesian and generalized coordinates. Most recently we have created a new core for BATS-R-US: the Block-Adaptive Tree Library (BATL) that provides a general toolkit for creating, load balancing and message passing in a 1-, 2- or 3-dimensional block-adaptive grid. We describe the algorithms of BATL and demonstrate its efficiency and scaling properties for various problems. BATS-R-US uses several time-integration schemes to address multiple time-scales: explicit time stepping with fixed or local time steps, partially steady-state evolution, point-implicit, semi-implicit, explicit/implicit, and fully implicit numerical schemes. Depending on the application, we find that different time stepping methods are optimal. Several of the time integration schemes exploit the block-based granularity of the grid structure. The framework and the adaptive algorithms enable physics-based space weather modeling and even forecasting.

  1. Modeling, Simulation, and Forecasting of Subseasonal Variability

    NASA Technical Reports Server (NTRS)

    Waliser, Duane; Schubert, Siegfried; Kumar, Arun; Weickmann, Klaus; Dole, Randall

    2003-01-01

    A planning workshop on "Modeling, Simulation and Forecasting of Subseasonal Variability" was held in June 2003. This workshop was the first of a number of meetings planned to follow the NASA-sponsored workshop entitled "Prospects For Improved Forecasts Of Weather And Short-Term Climate Variability On Sub-Seasonal Time Scales" that was held April 2002. The 2002 workshop highlighted a number of key sources of unrealized predictability on subseasonal time scales including tropical heating, soil wetness, the Madden Julian Oscillation (MJO) [a.k.a. Intraseasonal Oscillation (ISO)], the Arctic Oscillation (AO) and the Pacific/North American (PNA) pattern. The overarching objective of the 2003 follow-up workshop was to proceed with a number of recommendations made from the 2002 workshop, as well as to set an agenda and collate efforts in the areas of modeling, simulation and forecasting intraseasonal and short-term climate variability. More specifically, the aims of the 2003 workshop were to: 1) develop a baseline of the "state of the art" in subseasonal prediction capabilities, 2) implement a program to carry out experimental subseasonal forecasts, and 3) develop strategies for tapping the above sources of predictability by focusing research, model development, and the development/acquisition of new observations on the subseasonal problem. The workshop was held over two days and was attended by over 80 scientists, modelers, forecasters and agency personnel. The agenda of the workshop focused on issues related to the MJO and tropical-extratropical interactions as they relate to the subseasonal simulation and prediction problem. This included the development of plans for a coordinated set of GCM hindcast experiments to assess current model subseasonal prediction capabilities and shortcomings, an emphasis on developing a strategy to rectify shortcomings associated with tropical intraseasonal variability, namely diabatic processes, and continuing the implementation of an experimental forecast and model development program that focuses on one of the key sources of untapped predictability, namely the MJO. The tangible outcomes of the meeting included: 1) the development of a recommended framework for a set of multi-year ensembles of 45-day hindcasts to be carried out by a number of GCMs so that they can be analyzed with regard to their representations of subseasonal variability, predictability and forecast skill, 2) an assessment of the present status of GCM representations of the MJO and recommendations for future steps to take in order to remedy the remaining shortcomings in these representations, and 3) a final implementation plan for a multi-institute/multi-nation Experimental MJO Prediction Program.

  2. Development of a Decision Support System for Monitoring, Reporting, Forecasting Ecological Conditions of the Appalachian Trail

    Treesearch

    Y. Wang; R. Nemani; F. Dieffenbach; K. Stolte; G. Holcomb

    2010-01-01

    This paper introduces a collaborative multi-agency effort to develop an Appalachian Trail (A.T.) MEGA-Transect Decision Support System (DSS) for monitoring, reporting and forecasting ecological conditions of the A.T. and the surrounding lands. The project is to improve decision-making on management of the A.T. by providing a coherent framework for data integration,...

  3. Multi-dimensional scores to predict mortality in patients with idiopathic pulmonary fibrosis undergoing lung transplantation assessment.

    PubMed

    Fisher, Jolene H; Al-Hejaili, Faris; Kandel, Sonja; Hirji, Alim; Shapera, Shane; Mura, Marco

    2017-04-01

    The heterogeneous progression of idiopathic pulmonary fibrosis (IPF) makes prognostication difficult and contributes to high mortality on the waitlist for lung transplantation (LTx). Multi-dimensional scores (Composite Physiologic Index [CPI], Gender-Age-Physiology [GAP], RIsk Stratification scorE [RISE]) demonstrated enhanced predictive power towards outcome in IPF. The lung allocation score (LAS) is a multi-dimensional tool commonly used to stratify patients assessed for LTx. We sought to investigate whether IPF-specific multi-dimensional scores predict mortality in patients with IPF assessed for LTx. The study included 302 patients with IPF who underwent a LTx assessment (2003-2014). Multi-dimensional scores were calculated. The primary outcome was 12-month mortality after assessment. LTx was considered as a competing event in all analyses. At the end of the observation period, there were 134 transplants, 63 deaths, and 105 patients alive without LTx. Multi-dimensional scores predicted mortality with accuracy similar to the LAS, and superior to that of individual variables: the area under the curve (AUC) for LAS was 0.78 (sensitivity 71%, specificity 86%); CPI 0.75 (sensitivity 67%, specificity 82%); GAP 0.67 (sensitivity 59%, specificity 74%); RISE 0.78 (sensitivity 71%, specificity 84%). A separate analysis conducted only in patients actively listed for LTx (n = 247; 50 deaths) yielded similar results. In patients with IPF assessed for LTx, as well as in those actually listed, multi-dimensional scores predict mortality better than individual variables, and with accuracy similar to the LAS. If validated, multi-dimensional scores may serve as inexpensive tools to guide decisions on the timing of referral and listing for LTx.
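
    A minimal sketch, under simplifying assumptions, of the kind of discrimination analysis reported above: area under the ROC curve for 12-month mortality together with sensitivity and specificity at a single cut-point. The handling of lung transplantation as a competing event is not shown, and all names are illustrative.

```python
# Sketch of AUC plus sensitivity/specificity at a Youden-optimal cut-point.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def score_performance(score_values, died_within_12m):
    auc = roc_auc_score(died_within_12m, score_values)
    fpr, tpr, thresholds = roc_curve(died_within_12m, score_values)
    j = np.argmax(tpr - fpr)                 # one illustrative operating point
    sensitivity, specificity = tpr[j], 1.0 - fpr[j]
    return auc, sensitivity, specificity, thresholds[j]
```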

  4. An application of ensemble/multi model approach for wind power production forecast.

    NASA Astrophysics Data System (ADS)

    Alessandrini, S.; Decimi, G.; Hagedorn, R.; Sperati, S.

    2010-09-01

    Wind power forecasts for the 3-days-ahead period are becoming ever more useful and important for reducing the problems of grid integration and energy price trading as wind power penetration increases. Therefore it is clear that the accuracy of this forecast is one of the most important requirements for a successful application. The wind power forecast is based on mesoscale meteorological models that provide the 3-days-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic error caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. The corrected wind data are then used as input in the wind farm power curve to obtain the power forecast. These computations require historical time series of measured wind data (by an anemometer located in the wind farm or on the nacelle) and power data in order to be able to perform the statistical analysis on the past. For this purpose a Neural Network (NN) is trained on the past data and then applied in the forecast task. Considering that anemometer measurements are not always available in a wind farm, a different approach has also been adopted: the NN is trained to link the forecasted meteorological data directly to the power data. The normalized RMSE forecast error seems to be lower in most cases with this second approach. We have examined two wind farms, one located in Denmark on flat terrain and one located in a mountain area in the south of Italy (Sicily). In both cases we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by using two or more models (RAMS, ECMWF deterministic, LAMI, HIRLAM). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error by at least 1% compared to the single-model approach. Moreover, the use of a deterministic global model (e.g. the ECMWF deterministic model) seems to reach a level of accuracy similar to that of the mesoscale models (LAMI and RAMS). Finally we have focused on the possibility of using the ensemble model (ECMWF) to estimate the accuracy of the hourly power forecast up to three days ahead. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble wind forecast have been produced. From this first analysis it seems that the ensemble spread could be used as an indicator of the forecast's accuracy, at least for the first day-ahead period; in fact, low spreads often correspond to low forecast errors. For longer forecast horizons the correlation between RMSE and ensemble spread decreases, becoming too low to be used for this purpose.
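
    A minimal sketch of the "direct" variant described above, in which forecasted meteorological data are mapped straight to power with a neural network, together with the normalized RMSE used to compare approaches; feature names and the rated-capacity normalization are illustrative assumptions.

```python
# Sketch of a direct NWP-to-power model and the normalized RMSE metric.
import numpy as np
from sklearn.neural_network import MLPRegressor

def normalized_rmse(pred, obs, rated_power):
    """RMSE divided by the wind farm's rated capacity (one common convention)."""
    return np.sqrt(np.mean((pred - obs) ** 2)) / rated_power

def fit_direct_power_model(nwp_features, observed_power):
    """nwp_features: array (n_times, n_vars) of forecast wind speed, direction, etc."""
    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000)
    return model.fit(nwp_features, observed_power)
```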

  5. Sample-Based Motion Planning in High-Dimensional and Differentially-Constrained Systems

    DTIC Science & Technology

    2010-02-01

    A sample-based motion planning algorithm was implemented on LittleDog, a quadruped robot, and successfully planned bounding trajectories, including bounding up stairs.

  6. 3D cloud detection and tracking system for solar forecast using multiple sky imagers

    DOE PAGES

    Peng, Zhenzhou; Yu, Dantong; Huang, Dong; ...

    2015-06-23

    We propose a system for forecasting short-term solar irradiance based on multiple total sky imagers (TSIs). The system utilizes a novel method of identifying and tracking clouds in three-dimensional space and an innovative pipeline for forecasting surface solar irradiance based on the image features of clouds. First, we develop a supervised classifier to detect clouds at the pixel level and output a cloud mask. In the next step, we design intelligent algorithms to estimate the block-wise base height and motion of each cloud layer based on images from multiple TSIs. This information is then applied to stitch images together into larger views, which are then used for solar forecasting. We examine the system's ability to track clouds under various cloud conditions and investigate different irradiance forecast models at various sites. We confirm that this system can 1) robustly detect clouds and track layers, and 2) extract the significant global and local features for obtaining stable irradiance forecasts with short forecast horizons from the obtained images. Finally, we vet our forecasting system at the 32-megawatt Long Island Solar Farm (LISF). Compared with the persistence model, our system achieves at least a 26% improvement for all irradiance forecasts between one and fifteen minutes.
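
    A hedged sketch of the first stage only, the supervised pixel-level cloud classifier that outputs a cloud mask; the choice of a random forest and of a red/blue-ratio feature are illustrative assumptions, not the authors' actual classifier.

```python
# Sketch of a pixel-level cloud/clear classifier producing a cloud mask.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def _features(pixels):
    """pixels: (n, 3) RGB array -> RGB plus a red/blue-ratio feature."""
    rb = pixels[:, 0] / np.maximum(pixels[:, 2], 1e-6)
    return np.column_stack([pixels, rb])

def train_cloud_mask_classifier(rgb_pixels, labels):
    """labels: 1 = cloud, 0 = clear sky, for each training pixel."""
    return RandomForestClassifier(n_estimators=100).fit(_features(rgb_pixels), labels)

def cloud_mask(classifier, image):
    """image: (H, W, 3) array -> boolean cloud mask of shape (H, W)."""
    pixels = image.reshape(-1, 3).astype(float)
    return classifier.predict(_features(pixels)).reshape(image.shape[:2]).astype(bool)
```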

  7. Evaluation of precipitation forecasts from 3D-Var and hybrid GSI-based system during Indian summer monsoon 2015

    NASA Astrophysics Data System (ADS)

    Singh, Sanjeev Kumar; Prasad, V. S.

    2018-02-01

    This paper presents a systematic investigation of medium-range rainfall forecasts from two versions of the National Centre for Medium Range Weather Forecasting (NCMRWF) Global Forecast System based on three-dimensional variational (3D-Var) and hybrid analysis systems, namely NGFS and HNGFS, respectively, during the Indian summer monsoon (June-September) 2015. The NGFS uses the gridpoint statistical interpolation (GSI) 3D-Var data assimilation system, whereas the HNGFS uses a hybrid 3D ensemble-variational scheme. The analysis includes the evaluation of rainfall fields and comparisons of rainfall using statistical scores such as mean precipitation, bias, correlation coefficient, root mean square error and forecast improvement factor. In addition to these, categorical scores like the Peirce skill score and bias score are also computed to describe particular aspects of forecast performance. The comparison of mean precipitation reveals that both versions of the model produced similar large-scale features of Indian summer monsoon rainfall for day-1 through day-5 forecasts. The inclusion of fully flow-dependent background error covariance significantly improved the wet biases in the HNGFS over the Indian Ocean. The forecast improvement factor and Peirce skill score in the HNGFS have also been found to be better than in the NGFS for day-1 through day-5 forecasts.
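
    The two categorical scores mentioned above can be computed from a 2x2 rain/no-rain contingency table; the sketch below shows the standard formulas, with an illustrative rainfall threshold.

```python
# Sketch of the Peirce skill score and frequency bias from a contingency table.
import numpy as np

def categorical_scores(forecast, observed, threshold=1.0):
    f = forecast >= threshold        # forecast "rain" events
    o = observed >= threshold        # observed "rain" events
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    correct_negs = np.sum(~f & ~o)
    peirce = hits / (hits + misses) - false_alarms / (false_alarms + correct_negs)
    bias = (hits + false_alarms) / (hits + misses)   # frequency bias score
    return peirce, bias
```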

  8. Bias correction of satellite precipitation products for flood forecasting application at the Upper Mahanadi River Basin in Eastern India

    NASA Astrophysics Data System (ADS)

    Beria, H.; Nanda, T., Sr.; Chatterjee, C.

    2015-12-01

    High-resolution satellite precipitation products such as the Tropical Rainfall Measuring Mission (TRMM), Climate Forecast System Reanalysis (CFSR), European Centre for Medium-Range Weather Forecasts (ECMWF), etc., offer a promising alternative for flood forecasting in data-scarce regions. At the current state of the art, these products cannot be used in raw form for flood forecasting, even at smaller lead times. In the current study, these precipitation products are bias corrected using statistical techniques, such as additive and multiplicative bias corrections, and wavelet multi-resolution analysis (MRA) with the India Meteorological Department (IMD) gridded precipitation product, obtained from gauge-based rainfall estimates. Neural-network-based rainfall-runoff modeling using these bias-corrected products provides encouraging results for flood forecasting up to a 48-hour lead time. We will present various statistical and graphical interpretations of catchment response to high rainfall events using both the raw and bias-corrected precipitation products at different lead times.
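
    A minimal sketch of the two simplest corrections named above, an additive shift and a multiplicative scaling that match the satellite product to the IMD gauge-based reference over a calibration period; the wavelet multi-resolution correction is not sketched here, and all names are illustrative.

```python
# Sketch of additive and multiplicative bias corrections against a reference.
import numpy as np

def additive_correction(satellite, reference):
    """Shift the satellite series by the mean difference to the reference."""
    bias = np.mean(reference - satellite)
    return satellite + bias

def multiplicative_correction(satellite, reference):
    """Scale the satellite series so its total matches the reference total."""
    factor = np.sum(reference) / np.sum(satellite)
    return satellite * factor
```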

  9. How fast can a black hole rotate?

    NASA Astrophysics Data System (ADS)

    Herdeiro, Carlos A. R.; Radu, Eugen

    2015-11-01

    Kerr black holes (BHs) have their angular momentum, J, bounded by their mass, M: Jc ≤ GM². There are, however, known BH solutions violating this Kerr bound. We propose a very simple universal bound on the rotation, rather than on the angular momentum, of four-dimensional, stationary and axisymmetric, asymptotically flat BHs, given in terms of an appropriately defined horizon linear velocity, vH. The vH bound is simply that vH cannot exceed the velocity of light. We verify the vH bound for known BH solutions, including some that violate the Kerr bound, and conjecture that only extremal Kerr BHs saturate the vH bound.

  10. Design and implementation of space physics multi-model application integration based on web

    NASA Astrophysics Data System (ADS)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, providing a networked online computing environment for space weather, space environment and space physics models for the Chinese scientific community has become increasingly important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands a team or workshop spanning many disciplines and specialties to build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes access to the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that helps users quickly reach the computing resources of space physics models through a terminal, for conducting space science research and forecasting the space environment. The SPMAIS is developed in B/S mode around high-performance, first-principles computational models of the space environment and uses these models to predict "space weather", to understand space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF and T96 models as well as a solar proton prediction model, a geomagnetic transmission model and others developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for online high-speed model computing. In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of the physical independence of multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the system, and JSP + servlet + JavaBean technology is used to integrate the web application programs of the space physics multi-model. This solves the problem of multiple users requesting the same model-computing job and effectively balances the computing tasks across servers. In addition, we complete the following tasks: establishing a standard graphical user interface based on a Java Applet application; designing the interface between model computing and the visualization of model results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve interactive three-dimensional network scenes; and improving the ability to interact with web pages and dynamic execution capabilities, including the rendering of three-dimensional graphics, fonts and color control. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-model applications. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.

  11. Using subseasonal-to-seasonal (S2S) extreme rainfall forecasts for extended-range flood prediction in Australia

    NASA Astrophysics Data System (ADS)

    White, C. J.; Franks, S. W.; McEvoy, D.

    2015-06-01

    Meteorological and hydrological centres around the world are looking at ways to improve their capacity to produce and deliver skilful and reliable forecasts of high-impact extreme rainfall and flooding events on a range of prediction timescales (e.g. sub-daily, daily, multi-week, seasonal). Making improvements to extended-range rainfall and flood forecast models, assessing forecast skill and uncertainty, and exploring how to apply flood forecasts and communicate their benefits to decision-makers are significant challenges facing the forecasting and water resources management communities. This paper presents some of the latest science and initiatives from Australia on the development, application and communication of extreme rainfall and flood forecasts on the extended-range "subseasonal-to-seasonal" (S2S) forecasting timescale, with a focus on risk-based decision-making, increasing flood risk awareness and preparedness, capturing uncertainty, understanding human responses to flood forecasts and warnings, and the growing adoption of "climate services". The paper also demonstrates how forecasts of flood events across a range of prediction timescales could be beneficial to a range of sectors and society, most notably for disaster risk reduction (DRR) activities, emergency management and response, and strengthening community resilience. Extended-range S2S extreme flood forecasts, if presented as easily accessible, timely and relevant information, are a valuable resource to help society better prepare for, and subsequently cope with, extreme flood events.

  12. Pedagogical Factors Stimulating the Self-Development of Students' Multi-Dimensional Thinking in Terms of Subject-Oriented Teaching

    ERIC Educational Resources Information Center

    Andreev, Valentin I.

    2014-01-01

    The main aim of this research is to disclose the essence of students' multi-dimensional thinking and to reveal the rating of factors that stimulate increases in the effectiveness of the self-development of students' multi-dimensional thinking in terms of subject-oriented teaching. Subject-oriented learning is characterized as a type of learning where…

  13. Should One Use the Ray-by-Ray Approximation in Core-Collapse Supernova Simulations?

    DOE PAGES

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-10-28

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12-, 15-, 20-, and 25-M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25-M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  14. Should One Use the Ray-by-Ray Approximation in Core-collapse Supernova Simulations?

    NASA Astrophysics Data System (ADS)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M ⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M ⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  15. Total Lightning and Radar Storm Characteristics Associated with Severe Storms in Central Florida

    NASA Technical Reports Server (NTRS)

    Goodman, Steven J; Raghavan, R.; Buechler, Dennis; Hodanish, S.; Sharp, D.; Williams, E.; Boldi, B.; Matlin, A.; Weber, M.

    1998-01-01

    This paper examines the three-dimensional characteristics of lightning flashes and severe storms observed in Central Florida during 1997-1998. The lightning time history of severe and tornadic storms was captured during the on-going ground validation campaign supporting the Lightning Imaging Sensor (LIS) experiment on the Tropical Rainfall Measuring Mission (TRMM). The ground validation campaign is a collaborative experiment that began in 1997 and involves scientists at the Global Hydrology and Climate Center, MIT/Lincoln Laboratories, and the NWS Forecast Office at Melbourne, FL. Lightning signatures that may provide potential early warning of severe storms are being evaluated by the forecasters at the NWS/MLB office. Severe storms with extreme flash rates sometimes exceeding 300 per minute and accompanying rapid increases in flash rate prior to the onset of the severe weather (hail, damaging winds, tornadoes) have been reported by Hodanish et al. and Williams et al. (1998-this conference). We examine the co-evolving changes in storm structure (mass, echo top, shear, latent heat release) and kinematics associated with these extreme and rapid flash rate changes over time. The flash frequency and density are compared with the three-dimensional radar reflectivity structure of the storm to help interpret the possible mechanisms producing the extreme and rapidly increasing flash rates. For two tornadic storms examined thus far, we find that the burst of lightning is associated with the development of upper-level rotation in the storm. In one case, the lightning burst follows the formation of a bounded weak echo region (BWER). The flash rates diminish with time as the rotation develops to the ground in conjunction with the descent of the reflectivity core. Our initial findings suggest the dramatic increase of flash rates is associated with a sudden and dramatic increase in storm updraft intensity, which we hypothesize is stretching vertical vorticity as well as enhancing the development of the mixed-phase region of the storm. We discuss the importance of these factors in producing both the observed extreme flash rates and the severe weather that follows in these storms and in others to be presented.

  16. Interval forecasting of cyberattack intensity on informatization objects of industry using probability cluster model

    NASA Astrophysics Data System (ADS)

    Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.

    2018-05-01

    At present, cyber-security issues associated with the informatization objects of industry occupy one of the key niches in the state management system. Functional disruption of these systems via cyberattacks may lead to emergencies involving loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, protection against them needs to be developed on the basis of machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out with a probabilistic cluster model: the method forecasts which of two predetermined intervals a future value of the indicator will fall into, using probability estimates for this purpose. The dividing bound between these intervals is determined by a calculation method based on the statistical characteristics of the indicator. The source data consist of hourly numbers of cyberattacks recorded by a honeypot from March to September 2013.
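
    The sketch below illustrates the two-interval idea only: the dividing bound is taken as the empirical median of the training series (an assumption), and the exceedance probability is estimated from conditional relative frequencies as a simple stand-in for the paper's probabilistic cluster model.

```python
# Minimal sketch of two-interval forecasting of an intensity indicator.
# Assumptions: median dividing bound; Markov-style conditional frequencies
# replace the authors' probabilistic cluster model; data are synthetic.
import numpy as np

def fit_two_interval_model(series):
    bound = np.median(series)                  # dividing bound from a statistical characteristic
    above = series > bound
    p_up_given_up = np.mean(above[1:][above[:-1]]) if above[:-1].any() else 0.5
    p_up_given_down = np.mean(above[1:][~above[:-1]]) if (~above[:-1]).any() else 0.5
    return bound, p_up_given_up, p_up_given_down

def forecast_interval(model, current_value):
    bound, p_uu, p_ud = model
    p_upper = p_uu if current_value > bound else p_ud
    return ("upper" if p_upper >= 0.5 else "lower"), p_upper

rng = np.random.default_rng(0)
counts = rng.poisson(lam=20, size=500).astype(float)   # synthetic hourly attack counts
model = fit_two_interval_model(counts)
print(forecast_interval(model, counts[-1]))
```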

  17. Mathematical model comparing of the multi-level economics systems

    NASA Astrophysics Data System (ADS)

    Brykalov, S. M.; Kryanev, A. V.

    2017-12-01

    A mathematical model (scheme) for the multi-level comparison of economic systems characterized by a system of indices is worked out. In this model, indicators obtained from peer review and from forecasting of the economic system under consideration can be used. The model can take into account uncertainty in the estimated parameter values or in the expert estimations, and it uses a multi-criteria approach based on Pareto solutions.

  18. On the balancing of structural and acoustic performance of a sandwich panel based on topology, property, and size optimization

    NASA Astrophysics Data System (ADS)

    Cameron, Christopher J.; Lind Nordgren, Eleonora; Wennhage, Per; Göransson, Peter

    2014-06-01

    Balancing structural and acoustic performance of a multi-layered sandwich panel is a formidable undertaking. Frequently the gains achieved in terms of reduced weight, while still meeting the structural design requirements, are lost through the changes necessary to regain acceptable acoustic performance. To alleviate this, a design method for a multifunctional load-bearing vehicle body panel is proposed which attempts to achieve a balance between structural and acoustic performance. The approach is based on numerical modelling of the structural and acoustic behaviour in a combined topology, size, and property optimization in order to achieve a three-dimensional optimal distribution of structural and acoustic foam materials within the bounding surfaces of a sandwich panel. In particular, the effects of the coupling between one of the bounding-surface face sheets and the acoustic foam are examined for their impact on both the overall structural and acoustic performance of the panel. The results suggest a potential benefit in introducing an air gap between the acoustic foam parts and one of the face sheets, provided that the structural design constraints are met without prejudicing the layout of the different foam types.

  19. On decentralized adaptive full-order sliding mode control of multiple UAVs.

    PubMed

    Xiang, Xianbo; Liu, Chao; Su, Housheng; Zhang, Qin

    2017-11-01

    In this study, a novel decentralized adaptive full-order sliding mode control framework is proposed for the robust synchronized formation motion of multiple unmanned aerial vehicles (UAVs) subject to system uncertainty. First, a full-order sliding mode surface is designed in a decentralized manner to incorporate both the individual position tracking error and the synchronized formation error while the UAV group is engaged in building a certain desired geometric pattern in three-dimensional space. Second, a decentralized virtual plant controller is constructed which allows the embedded low-pass filter to attain the chattering-free property of the sliding mode controller. In addition, a robust adaptive technique is integrated into the decentralized chattering-free sliding mode control design in order to handle unknown bounded uncertainties, without requiring a priori knowledge of the bounds on the system uncertainties as assumed in conventional chattering-free control methods. Subsequently, the system robustness as well as the stability of the decentralized full-order sliding mode control of multiple UAVs is synthesized. Numerical simulation results illustrate the effectiveness of the proposed control framework in achieving robust 3D formation flight of the multi-UAV system. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Evidencing Learning Outcomes: A Multi-Level, Multi-Dimensional Course Alignment Model

    ERIC Educational Resources Information Center

    Sridharan, Bhavani; Leitch, Shona; Watty, Kim

    2015-01-01

    This conceptual framework proposes a multi-level, multi-dimensional course alignment model to implement a contextualised constructive alignment of rubric design that authentically evidences and assesses learning outcomes. By embedding quality control mechanisms at each level for each dimension, this model facilitates the development of an aligned…

  1. Generalization of information-based concepts in forecast verification

    NASA Astrophysics Data System (ADS)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, and the generalization to continuous forecasts is then shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can likewise be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
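
    As a minimal illustration of the two scores being related here, the sketch below computes the Brier Score and a binary Ignorance Score for event probabilities derived from an ensemble as relative frequencies. The threshold, ensemble size, and the epsilon guarding the logarithm are illustrative assumptions, not values from the paper.

```python
# Sketch: Brier Score and (binary) Ignorance Score for ensemble-derived probabilities.
import numpy as np

def ensemble_probability(ensemble, threshold):
    """P(event) = fraction of members exceeding the threshold (members on axis 1)."""
    return np.mean(ensemble > threshold, axis=1)

def brier_score(p, obs):
    return np.mean((p - obs) ** 2)                 # obs is 0/1 event occurrence

def ignorance_score(p, obs, eps=1e-6):
    p = np.clip(p, eps, 1.0 - eps)                 # avoid log(0)
    return np.mean(-obs * np.log2(p) - (1 - obs) * np.log2(1.0 - p))

rng = np.random.default_rng(1)
ens = rng.normal(size=(1000, 20))                  # 1000 forecasts, 20 members
truth = rng.normal(size=1000)
p = ensemble_probability(ens, threshold=1.0)
o = (truth > 1.0).astype(float)
print("BS:", brier_score(p, o), "IGN:", ignorance_score(p, o))
```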

  2. Relating anomaly correlation to lead time: Clustering analysis of CFSv2 forecasts of summer precipitation in China

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Liu, Pan; Zhang, Yongyong; Ruan, Chengqing

    2017-09-01

    Global climate model (GCM) forecasts are an integral part of long-range hydroclimatic forecasting. We propose to use clustering to explore anomaly correlation, which indicates the performance of raw GCM forecasts, in the three-dimensional space of latitude, longitude, and initialization time. Focusing on a certain period of the year, correlations for forecasts initialized at different preceding periods form a vector. The vectors of anomaly correlation across different GCM grid cells are clustered to reveal how GCM forecasts perform as time progresses. Through the case study of Climate Forecast System Version 2 (CFSv2) forecasts of summer precipitation in China, we observe that the correlation at a certain cell oscillates with lead time and can become negative. The use of clustering reveals two meaningful patterns that characterize the relationship between anomaly correlation and lead time. For some grid cells in Central and Southwest China, CFSv2 forecasts exhibit positive correlations with observations and they tend to improve as time progresses. This result suggests that CFSv2 forecasts tend to capture the summer precipitation induced by the East Asian monsoon and the South Asian monsoon. It also indicates that CFSv2 forecasts can potentially be applied to improving hydrological forecasts in these regions. For some other cells, the correlations are generally close to zero at different lead times. This outcome implies that CFSv2 forecasts still have plenty of room for further improvement. The robustness of the patterns has been tested using both hierarchical clustering and k-means clustering and examined with the Silhouette score.
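
    A minimal sketch of the clustering step described above: per-grid-cell vectors of anomaly correlation versus lead time are grouped with k-means and assessed with the Silhouette score. The synthetic vectors stand in for the CFSv2 correlation maps, and the choice of two clusters mirrors the two reported patterns but is an assumption here.

```python
# Sketch: k-means clustering of correlation-vs-lead-time vectors per grid cell.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(2)
n_leads = 9
# Two synthetic regimes: correlations decaying with lead time vs. near-zero correlations
decaying = 0.6 * np.exp(-np.arange(n_leads) / 4.0) + 0.05 * rng.normal(size=(300, n_leads))
near_zero = 0.05 * rng.normal(size=(200, n_leads))
corr_vectors = np.vstack([decaying, near_zero])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(corr_vectors)
print("silhouette:", silhouette_score(corr_vectors, km.labels_))
print("cluster means:\n", km.cluster_centers_.round(2))
```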

  3. Mesoscale Modeling, Forecasting and Remote Sensing Research.

    DTIC Science & Technology

    remote sensing, cyclonic scale diagnostic studies and mesoscale numerical modeling and forecasting are summarized. Mechanisms involved in the release of potential instability are discussed and simulated quantitatively, giving particular attention to the convective formulation. The basic mesoscale model is documented including the equations, boundary condition, finite differences and initialization through an idealized frontal zone. Results of tests including a three-dimensional test with real data, tests of convective/mesoscale interaction and tests with a detailed

  4. Assessing probabilistic predictions of ENSO phase and intensity from the North American Multimodel Ensemble

    NASA Astrophysics Data System (ADS)

    Tippett, Michael K.; Ranganathan, Meghana; L'Heureux, Michelle; Barnston, Anthony G.; DelSole, Timothy

    2017-05-01

    Here we examine the skill of three-, five-, and seven-category monthly ENSO probability forecasts (1982-2015) from single and multi-model ensemble integrations of the North American Multimodel Ensemble (NMME) project. Three-category forecasts are typical and provide probabilities for the ENSO phase (El Niño, La Niña or neutral). Additional forecast categories indicate the likelihood of ENSO conditions being weak, moderate or strong. The level of skill observed for differing numbers of forecast categories can help to determine the appropriate degree of forecast precision. However, the dependence of the skill score itself on the number of forecast categories must be taken into account. For reliable forecasts of the same quality, the ranked probability skill score (RPSS) is fairly insensitive to the number of categories, while the logarithmic skill score (LSS) is an information measure and increases as categories are added. The ignorance skill score decreases to zero as forecast categories are added, regardless of skill level. For all models, forecast formats and skill scores, the northern spring predictability barrier explains much of the dependence of skill on target month and forecast lead. RPSS values for monthly ENSO forecasts show little dependence on the number of categories. However, the LSS of multimodel ensemble forecasts with five and seven categories shows statistically significant advantages over the three-category forecasts for the targets and leads that are least affected by the spring predictability barrier. These findings indicate that current prediction systems are capable of providing more detailed probabilistic forecasts of ENSO phase and amplitude than are typically provided.
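
    The sketch below shows how a ranked probability score and its skill score against climatology can be computed for categorical probability forecasts of the kind discussed above. The three equal-probability climatology categories and the synthetic forecasts are assumptions for illustration.

```python
# Sketch: ranked probability score (RPS) and skill score (RPSS) vs. climatology.
import numpy as np

def rps(prob, obs_cat):
    """prob: (n, k) forecast probabilities; obs_cat: (n,) observed category index."""
    n, k = prob.shape
    obs = np.zeros((n, k))
    obs[np.arange(n), obs_cat] = 1.0
    cum_f, cum_o = np.cumsum(prob, axis=1), np.cumsum(obs, axis=1)
    return np.mean(np.sum((cum_f - cum_o) ** 2, axis=1))

def rpss(prob, obs_cat, clim_prob):
    ref = np.tile(clim_prob, (len(obs_cat), 1))
    return 1.0 - rps(prob, obs_cat) / rps(ref, obs_cat)

rng = np.random.default_rng(3)
n, k = 400, 3
forecast = rng.dirichlet(alpha=[2, 2, 2], size=n)      # three-category probabilities
observed = rng.integers(0, k, size=n)
print("RPSS:", rpss(forecast, observed, clim_prob=np.full(k, 1.0 / k)))
```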

  5. Application of a medium-range global hydrologic probabilistic forecast scheme to the Ohio River Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Nathalie; Pappenberger, Florian; Lettenmaier, D. P.

    2011-08-15

    A 10-day globally applicable flood prediction scheme was evaluated using the Ohio River basin as a test site for the period 2003-2007. The Variable Infiltration Capacity (VIC) hydrology model was initialized with the European Centre for Medium Range Weather Forecasts (ECMWF) analysis temperatures and wind, and Tropical Rainfall Measuring Mission Multi-satellite Precipitation Analysis (TMPA) precipitation up to the day of forecast. In forecast mode, the VIC model was then forced with a calibrated and statistically downscaled ECMWF ensemble prediction system (EPS) 10-day ensemble forecast. A parallel setup was used where ECMWF EPS forecasts were interpolated to the spatial scale of the hydrology model. Each set of forecasts was extended by 5 days using monthly mean climatological variables and zero precipitation in order to account for the effect of initial conditions. The 15-day spatially distributed ensemble runoff forecasts were then routed to four locations in the basin, each with different drainage areas. Surrogates for observed daily runoff and flow were provided by the reference run, specifically the VIC simulation forced with ECMWF analysis fields and TMPA precipitation fields. The flood prediction scheme using the calibrated and downscaled ECMWF EPS forecasts was shown to be more accurate and reliable than the interpolated forecasts for both daily distributed runoff forecasts and daily flow forecasts. Initial and antecedent conditions dominated the flow forecasts for lead times shorter than the time of concentration, depending on the flow forecast amounts and the drainage area sizes. The flood prediction scheme had useful skill for the 10 following days at all sites.

  6. Interacting quantum walkers: two-body bosonic and fermionic bound states

    NASA Astrophysics Data System (ADS)

    Krapivsky, P. L.; Luck, J. M.; Mallick, K.

    2015-11-01

    We investigate the dynamics of bound states of two interacting particles, either bosons or fermions, performing a continuous-time quantum walk on a one-dimensional lattice. We consider the situation where the distance between both particles has a hard bound, and the richer situation where the particles are bound by a smooth confining potential. The main emphasis is on the velocity characterizing the ballistic spreading of these bound states, and on the structure of the asymptotic distribution profile of their center-of-mass coordinate. The latter profile generically exhibits many internal fronts.

  7. Image matrix processor for fast multi-dimensional computations

    DOEpatents

    Roberson, G.P.; Skeate, M.F.

    1996-10-15

    An apparatus for multi-dimensional computation is disclosed which comprises a computation engine, including a plurality of processing modules. The processing modules are configured in parallel and compute respective contributions to a computed multi-dimensional image of respective two dimensional data sets. A high-speed, parallel access storage system is provided which stores the multi-dimensional data sets, and a switching circuit routes the data among the processing modules in the computation engine and the storage system. A data acquisition port receives the two dimensional data sets representing projections through an image, for reconstruction algorithms such as encountered in computerized tomography. The processing modules include a programmable local host, by which they may be configured to execute a plurality of different types of multi-dimensional algorithms. The processing modules thus include an image manipulation processor, which includes a source cache, a target cache, a coefficient table, and control software for executing image transformation routines using data in the source cache and the coefficient table and loading resulting data in the target cache. The local host processor operates to load the source cache with a two dimensional data set, loads the coefficient table, and transfers resulting data out of the target cache to the storage system, or to another destination. 10 figs.

  8. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors that impact stock prices, this paper proposes a comprehensive fuzzy time-series model, which factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series into the forecasting process. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen’s (1996), Yu’s (2005), Cheng’s (2006) and Chen’s (2007), are used as comparison models. In addition, to compare with a conventional statistical method, the method of least squares is utilized to estimate auto-regressive models for the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models that only factor fuzzy logical relationships into the forecasting process. Both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
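
    A minimal sketch of the conventional statistical baseline mentioned above: an auto-regressive model fitted by ordinary least squares and used for a one-step-ahead forecast. The lag order p=3 and the synthetic price series are illustrative assumptions, not the settings used in the paper.

```python
# Sketch: least-squares AR(p) fit and one-step-ahead forecast.
import numpy as np

def fit_ar(series, p=3):
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    X = np.column_stack([np.ones(len(X)), X])          # intercept + p lagged values
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_next(series, coef):
    p = len(coef) - 1
    return coef[0] + coef[1:] @ series[-p:]

rng = np.random.default_rng(4)
prices = np.cumsum(rng.normal(size=300)) + 100.0       # synthetic index levels
coef = fit_ar(prices, p=3)
print("next-step forecast:", forecast_next(prices, coef))
```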

  9. Development of Parallel Code for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Knight, W. R.; Whitmore, P.

    2014-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communication between the domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation, the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest-resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results, with the long-term aim of tsunami forecasts from source to high-resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs and will make possible the future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.

  10. Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities

    PubMed Central

    Leitner, Stephan; Brauneis, Alexander; Rausch, Alexandra

    2015-01-01

    In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm’s point of view, as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness. Finally, we conduct a comprehensive sensitivity analysis and prove that, in particular for relatively good forecasters, most of our results are robust to changes in the parameter settings of our multi-agent simulation model. PMID:25803736

  11. Evaluation of the skill of North-American Multi-Model Ensemble (NMME) Global Climate Models in predicting average and extreme precipitation and temperature over the continental USA

    NASA Astrophysics Data System (ADS)

    Slater, Louise J.; Villarini, Gabriele; Bradley, Allen A.

    2016-08-01

    This paper examines the forecasting skill of eight Global Climate Models from the North-American Multi-Model Ensemble project (CCSM3, CCSM4, CanCM3, CanCM4, GFDL2.1, FLORb01, GEOS5, and CFSv2) over seven major regions of the continental United States. The skill of the monthly forecasts is quantified using the mean square error skill score. This score is decomposed to assess the accuracy of the forecast in the absence of biases (potential skill) and in the presence of conditional (slope reliability) and unconditional (standardized mean error) biases. We summarize the forecasting skill of each model according to the initialization month of the forecast and lead time, and test the models' ability to predict extended periods of extreme climate conducive to eight 'billion-dollar' historical flood and drought events. Results indicate that the most skillful predictions occur at the shortest lead times and decline rapidly thereafter. Spatially, potential skill varies little, while actual model skill scores exhibit strong spatial and seasonal patterns primarily due to the unconditional biases in the models. The conditional biases vary little by model, lead time, month, or region. Overall, we find that the skill of the ensemble mean is equal to or greater than that of any of the individual models. At the seasonal scale, the drought events are better forecast than the flood events, and are predicted equally well in terms of high temperature and low precipitation. Overall, our findings provide a systematic diagnosis of the strengths and weaknesses of the eight models over a wide range of temporal and spatial scales.
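
    A minimal sketch of a standard mean-square-error skill score decomposition of the kind described above, following Murphy (1988): potential skill (squared correlation) minus a conditional-bias term and an unconditional-bias term. Whether this matches the paper's exact definitions is not verified here, and the data are synthetic.

```python
# Sketch: MSE skill score vs. climatology, decomposed into potential skill,
# conditional (slope-reliability) bias, and unconditional (mean) bias.
import numpy as np

def msess_decomposition(forecast, obs):
    r = np.corrcoef(forecast, obs)[0, 1]
    s_ratio = np.std(forecast) / np.std(obs)
    cond_bias = (r - s_ratio) ** 2
    uncond_bias = ((np.mean(forecast) - np.mean(obs)) / np.std(obs)) ** 2
    skill = r ** 2 - cond_bias - uncond_bias
    return skill, r ** 2, cond_bias, uncond_bias

rng = np.random.default_rng(5)
obs = rng.normal(size=360)                              # monthly anomalies
fcst = 0.6 * obs + 0.8 * rng.normal(size=360) + 0.2     # biased, noisy forecast
print(msess_decomposition(fcst, obs))
```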

  12. Volcanic Ash Data Assimilation System for Atmospheric Transport Model

    NASA Astrophysics Data System (ADS)

    Ishii, K.; Shimbori, T.; Sato, E.; Tokumoto, T.; Hayashi, Y.; Hashimoto, A.

    2017-12-01

    The Japan Meteorological Agency (JMA) has two operations for volcanic ash forecasts: the Volcanic Ash Fall Forecast (VAFF) and the Volcanic Ash Advisory (VAA). In these operations, the forecasts are calculated by atmospheric transport models that include the advection process, the turbulent diffusion process, the gravitational fall process and the (wet/dry) deposition process. The initial distribution of volcanic ash in the models is the most important but also the most uncertain factor. In operations, the model of Suzuki (1983), with many empirical assumptions, is adopted for the initial distribution. This adversely affects the reconstruction of actual eruption plumes. We are developing a volcanic ash data assimilation system using weather radars and meteorological satellite observations, in order to improve the initial distribution of the atmospheric transport models. Our data assimilation system is based on the three-dimensional variational data assimilation method (3D-Var). The analysis variables are ash concentration and size distribution parameters, which are mutually independent. The radar observations are expected to provide three-dimensional parameters such as ash concentration and parameters of the ash particle size distribution. On the other hand, the satellite observations are anticipated to provide two-dimensional parameters of ash clouds such as mass loading, top height and particle effective radius. In this study, we estimate the thickness of ash clouds using the vertical wind shear from JMA numerical weather prediction, and apply it in the volcanic ash data assimilation system.
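
    For orientation, the sketch below minimizes the generic 3D-Var cost function underlying systems of this type, J(x) = (x - xb)^T B^-1 (x - xb) + (y - Hx)^T R^-1 (y - Hx), with a general-purpose optimizer. The tiny state vector, linear observation operator, and diagonal covariances are synthetic stand-ins, not the ash-concentration fields or error statistics of the paper.

```python
# Sketch: minimizing a small 3D-Var cost function.
import numpy as np
from scipy.optimize import minimize

n, m = 8, 4                                   # state size, number of observations
rng = np.random.default_rng(6)
xb = rng.normal(size=n)                       # background (first guess)
H = rng.normal(size=(m, n))                   # linear observation operator
y = H @ (xb + 0.5 * rng.normal(size=n))       # synthetic observations
B_inv = np.eye(n) / 0.5**2                    # inverse background-error covariance
R_inv = np.eye(m) / 0.2**2                    # inverse observation-error covariance

def cost(x):
    dxb, dy = x - xb, y - H @ x
    return dxb @ B_inv @ dxb + dy @ R_inv @ dy

xa = minimize(cost, xb, method="L-BFGS-B").x  # analysis state
print("background cost:", cost(xb), "analysis cost:", cost(xa))
```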

  13. A stock market forecasting model combining two-directional two-dimensional principal component analysis and radial basis function neural network.

    PubMed

    Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J

    2015-01-01

    In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron.

  14. A Stock Market Forecasting Model Combining Two-Directional Two-Dimensional Principal Component Analysis and Radial Basis Function Neural Network

    PubMed Central

    Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J.

    2015-01-01

    In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron. PMID:25849483
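
    The simplified sketch below reproduces only the sliding-window pipeline described above, with ordinary PCA and an RBF-kernel ridge regressor standing in for (2D)2PCA and the RBFNN. The window length, component count, train/test split, and synthetic price series are illustrative assumptions.

```python
# Sketch: sliding window -> dimension reduction -> RBF regressor for next-day price.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(7)
prices = np.cumsum(rng.normal(size=600)) + 100.0
window = 30
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]                               # next-day price targets

pca = PCA(n_components=5).fit(X[:-100])           # stand-in for (2D)2PCA feature extraction
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.01)
model.fit(pca.transform(X[:-100]), y[:-100])
pred = model.predict(pca.transform(X[-100:]))
print("test RMSE:", np.sqrt(np.mean((pred - y[-100:]) ** 2)))
```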

  15. A Multi-Armed Bandit Approach to Following a Markov Chain

    DTIC Science & Technology

    2017-06-01

    focus on the House to Café transition (p1,4). We develop a Multi-Armed Bandit approach for efficiently following this target, where each state takes the...and longitude (each state corresponding to a physical location and a small set of activities). The searcher would then apply our approach on this...the target’s transition probability and the true probability over time. Further, we seek to provide upper bounds (i.e., worst case bounds) on the

  16. Stochastic analysis of three-dimensional flow in a bounded domain

    USGS Publications Warehouse

    Naff, R.L.; Vecchia, A.V.

    1986-01-01

    A commonly accepted first-order approximation of the equation for steady state flow in a fully saturated spatially random medium has the form of Poisson's equation. This form allows for the advantageous use of Green's functions to solve for the random output (hydraulic heads) in terms of a convolution over the random input (the logarithm of hydraulic conductivity). A solution for steady state three-dimensional flow in an aquifer bounded above and below is presented; consideration of these boundaries is made possible by use of Green's functions to solve Poisson's equation. Within the bounded domain the medium hydraulic conductivity is assumed to be a second-order stationary random process as represented by a simple three-dimensional covariance function. Upper and lower boundaries are taken to be no-flow boundaries; the mean flow vector lies entirely in the horizontal dimensions. The resulting hydraulic head covariance function exhibits nonstationary effects resulting from the imposition of boundary conditions. Comparisons are made with existing infinite domain solutions.

  17. Content modification attacks on consensus seeking multi-agent system with double-integrator dynamics.

    PubMed

    Dong, Yimeng; Gupta, Nirupam; Chopra, Nikhil

    2016-11-01

    In this paper, vulnerability of a distributed consensus seeking multi-agent system (MAS) with double-integrator dynamics against edge-bound content modification cyber attacks is studied. In particular, we define a specific edge-bound content modification cyber attack called malignant content modification attack (MCoMA), which results in unbounded growth of an appropriately defined group disagreement vector. Properties of MCoMA are utilized to design detection and mitigation algorithms so as to impart resilience in the considered MAS against MCoMA. Additionally, the proposed detection mechanism is extended to detect the general edge-bound content modification attacks (not just MCoMA). Finally, the efficacies of the proposed results are illustrated through numerical simulations.

  18. Content modification attacks on consensus seeking multi-agent system with double-integrator dynamics

    NASA Astrophysics Data System (ADS)

    Dong, Yimeng; Gupta, Nirupam; Chopra, Nikhil

    2016-11-01

    In this paper, vulnerability of a distributed consensus seeking multi-agent system (MAS) with double-integrator dynamics against edge-bound content modification cyber attacks is studied. In particular, we define a specific edge-bound content modification cyber attack called malignant content modification attack (MCoMA), which results in unbounded growth of an appropriately defined group disagreement vector. Properties of MCoMA are utilized to design detection and mitigation algorithms so as to impart resilience in the considered MAS against MCoMA. Additionally, the proposed detection mechanism is extended to detect the general edge-bound content modification attacks (not just MCoMA). Finally, the efficacies of the proposed results are illustrated through numerical simulations.

  19. Minimum Dimension of a Hilbert Space Needed to Generate a Quantum Correlation.

    PubMed

    Sikora, Jamie; Varvitsiotis, Antonios; Wei, Zhaohui

    2016-08-05

    Consider a two-party correlation that can be generated by performing local measurements on a bipartite quantum system. A question of fundamental importance is to understand how many resources, which we quantify by the dimension of the underlying quantum system, are needed to reproduce this correlation. In this Letter, we identify an easy-to-compute lower bound on the smallest Hilbert space dimension needed to generate a given two-party quantum correlation. We show that our bound is tight on many well-known correlations and discuss how it can rule out a finite-dimensional quantum representation for certain correlations. We show that our bound is multiplicative under product correlations and also that it can witness the nonconvexity of certain restricted-dimensional quantum correlations.

  20. Optimal one-dimensional inversion and bounding of magnetotelluric apparent resistivity and phase measurements

    NASA Astrophysics Data System (ADS)

    Parker, Robert L.; Booker, John R.

    1996-12-01

    The properties of the log of the admittance in the complex frequency plane lead to an integral representation for one-dimensional magnetotelluric (MT) apparent resistivity and impedance phase similar to that found previously for complex admittance. The inverse problem of finding a one-dimensional model for MT data can then be solved using the same techniques as for complex admittance, with similar results. For instance, the one-dimensional conductivity model that minimizes the χ2 misfit statistic for noisy apparent resistivity and phase is a series of delta functions. One of the most important applications of the delta function solution to the inverse problem for complex admittance has been answering the question of whether or not a given set of measurements is consistent with the modeling assumption of one-dimensionality. The new solution allows this test to be performed directly on standard MT data. Recently, it has been shown that induction data must pass the same one-dimensional consistency test if they correspond to the polarization in which the electric field is perpendicular to the strike of two-dimensional structure. This greatly magnifies the utility of the consistency test. The new solution also allows one to compute the upper and lower bounds permitted on phase or apparent resistivity at any frequency given a collection of MT data. Applications include testing the mutual consistency of apparent resistivity and phase data and placing bounds on missing phase or resistivity data. Examples presented demonstrate detection and correction of equipment and processing problems and verification of compatibility with two-dimensional B-polarization for MT data after impedance tensor decomposition and for continuous electromagnetic profiling data.

  1. Multi-dimensional quantum state sharing based on quantum Fourier transform

    NASA Astrophysics Data System (ADS)

    Qin, Huawang; Tso, Raylin; Dai, Yuewei

    2018-03-01

    A scheme of multi-dimensional quantum state sharing is proposed. The dealer performs the quantum SUM gate and the quantum Fourier transform to encode a multi-dimensional quantum state into an entangled state. The dealer then distributes one particle of the entangled state to each participant, so as to share the quantum state among n participants. In the recovery, n-1 participants measure their particles and supply their measurement results; the last participant performs a unitary operation on his particle according to these measurement results and can reconstruct the initial quantum state. The proposed scheme has two merits: it can share a multi-dimensional quantum state, and it does not require entanglement measurement.

  2. Improving the effectiveness of real-time flood forecasting through Predictive Uncertainty estimation: the multi-temporal approach

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio

    2015-04-01

    The negative effects of severe flood events are usually countered through structural measures which, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Decision-makers seek accurate predictions of future stage/discharge with appropriate forecast lead time in order to implement strategies that mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached with rainfall-runoff and/or flood routing modelling. Neither type of forecast can be considered a perfect representation of future outcomes, because knowledge of the processes involved is incomplete (Todini, 2004). Nonetheless, although aware that model forecasts do not perfectly represent future outcomes, decision makers de facto implicitly treat the forecast of water level/discharge/volume, etc. as "deterministic" and as coinciding with what is going to occur. Recently, the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors were developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, and ii) the available information on the future value, typically provided by one or more forecast models. Unfortunately, PU has frequently been interpreted as a measure of lack of accuracy rather than as the appropriate tool for taking the best decisions given one or several models' forecasts. With the aim of shedding light on the benefits of appropriately using PU, a multi-temporal approach based on the MCP approach (Todini, 2008; Coccia and Todini, 2011) is applied here to stage forecasts at sites along the Upper Tiber River. Specifically, the Muskingum-based STAge FOrecasting-Rating Curve Model (STAFOM-RCM) (Barbetta et al., 2014) and the Rating-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models, without explicitly considering rainfall information, account at each forecast time for the estimated lateral contribution along the river reach for which the stage forecast is issued at the downstream end. The analysis is performed for several reaches using different lead times according to the channel length. References: Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274, doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model. Water Resources Research, 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes, 18(14), 2743-2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. International Journal of River Basin Management, 6(2), 123-137.

  3. Seasonal drought ensemble predictions based on multiple climate models in the upper Han River Basin, China

    NASA Astrophysics Data System (ADS)

    Ma, Feng; Ye, Aizhong; Duan, Qingyun

    2017-03-01

    An experimental seasonal drought forecasting system is developed based on 29-year (1982-2010) seasonal meteorological hindcasts generated by the climate models from the North American Multi-Model Ensemble (NMME) project. This system made use of a bias correction and spatial downscaling method, and a distributed time-variant gain model (DTVGM) hydrologic model. DTVGM was calibrated using observed daily hydrological data and its streamflow simulations achieved Nash-Sutcliffe efficiency values of 0.727 and 0.724 during calibration (1978-1995) and validation (1996-2005) periods, respectively, at the Danjiangkou reservoir station. The experimental seasonal drought forecasting system (known as NMME-DTVGM) is used to generate seasonal drought forecasts. The forecasts were evaluated against the reference forecasts (i.e., persistence forecast and climatological forecast). The NMME-DTVGM drought forecasts have higher detectability and accuracy and lower false alarm rate than the reference forecasts at different lead times (from 1 to 4 months) during the cold-dry season. No apparent advantage is shown in drought predictions during spring and summer seasons because of a long memory of the initial conditions in spring and a lower predictive skill for precipitation in summer. Overall, the NMME-based seasonal drought forecasting system has meaningful skill in predicting drought several months in advance, which can provide critical information for drought preparedness and response planning as well as the sustainable practice of water resource conservation over the basin.
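
    As a small aside on the evaluation metric reported above, the sketch below computes the Nash-Sutcliffe efficiency used to judge the DTVGM streamflow simulations; values approach 1 for a good fit and drop to 0 at the skill of the observed mean. The flow series here is synthetic.

```python
# Sketch: Nash-Sutcliffe efficiency of a simulated streamflow series.
import numpy as np

def nash_sutcliffe(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(8)
obs = np.abs(50.0 + np.cumsum(rng.normal(size=365)))   # synthetic daily flows
sim = obs + 5.0 * rng.normal(size=365)                  # imperfect simulation
print("NSE:", nash_sutcliffe(sim, obs))
```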

  4. New Aspects of Probabilistic Forecast Verification Using Information Theory

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are shortly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.

  5. Short-term load and wind power forecasting using neural network-based prediction intervals.

    PubMed

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2014-02-01

    Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. Penetrations of renewable energies, such as wind and solar power, significantly increase the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important for management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
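
    The sketch below computes two interval-quality measures commonly reported alongside the LUBE method, PI coverage probability (PICP) and PI normalized average width (PINAW); the demand series and the fixed-width bounds are synthetic stand-ins for NN-generated intervals, and the exact quality criteria of the paper are not reproduced here.

```python
# Sketch: coverage and normalized width of prediction intervals.
import numpy as np

def picp(lower, upper, obs):
    return np.mean((obs >= lower) & (obs <= upper))      # fraction of targets covered

def pinaw(lower, upper, obs):
    return np.mean(upper - lower) / (obs.max() - obs.min())

rng = np.random.default_rng(9)
demand = 500.0 + 50.0 * np.sin(np.linspace(0, 20, 480)) + 10.0 * rng.normal(size=480)
lower = demand - 25.0 + 5.0 * rng.normal(size=480)       # stand-in lower bounds
upper = demand + 25.0 + 5.0 * rng.normal(size=480)       # stand-in upper bounds
print("PICP:", picp(lower, upper, demand), "PINAW:", pinaw(lower, upper, demand))
```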

  6. Five-Dimensional Gauged Supergravity with Higher Derivatives

    NASA Astrophysics Data System (ADS)

    Hanaki, Kentaro

    This thesis summarizes the recent developments on the study of five-dimensional gauged supergravity with higher derivative terms, emphasizing in particular the application to understanding the hydrodynamic properties of gauge theory plasma via the AdS/CFT correspondence. We first review how the ungauged and gauged five-dimensional supergravity actions with higher derivative terms can be constructed using the off-shell superconformal formalism. Then we relate the gauged supergravity to four-dimensional gauge theory using the AdS/CFT correspondence and extract the physical quantities associated with gauge theory plasma from the dual classical supergravity computations. We put a particular emphasis on the discussion of the conjectured lower bound for the shear viscosity over entropy density ratio proposed by Kovtun, Son and Starinets, and discuss how higher derivative terms in supergravity and the introduction of chemical potential for the R-charge affect this bound.

  7. Ray tracing a three dimensional scene using a grid

    DOEpatents

    Wald, Ingo; Ize, Santiago; Parker, Steven G; Knoll, Aaron

    2013-02-26

    Ray tracing a three-dimensional scene using a grid. One example embodiment is a method for ray tracing a three-dimensional scene using a grid. In this example method, the three-dimensional scene is made up of objects that are spatially partitioned into a plurality of cells that make up the grid. The method includes a first act of computing a bounding frustum of a packet of rays, and a second act of traversing the grid slice by slice along a major traversal axis. Each slice traversal includes a first act of determining one or more cells in the slice that are overlapped by the frustum and a second act of testing the rays in the packet for intersection with any objects at least partially bounded by the one or more cells overlapped by the frustum.

  8. Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher

    1998-01-01

    We proposed a novel characterization of errors for numerical weather predictions. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has several important applications, including model assessment and objective analysis. In this project, we have focused on the assessment application, restricted to a realistic but univariate two-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP), the 500 hPa geopotential height, and the 315 K potential vorticity fields for forecasts of the short and medium range. The forecasts are generated by the Goddard Earth Observing System (GEOS) data assimilation system with and without ERS-1 scatterometer data. A great deal of novel work has been accomplished under the current contract. In broad terms, we have developed and tested an efficient algorithm for determining distortions. The algorithm and constraints are now ready for application to larger data sets, to be used to determine the statistics of the distortion as outlined above, and to be applied in data analysis by using GEOS water vapor imagery to correct short-term forecast errors.

  9. Stochastic Forcing for High-Resolution Regional and Global Ocean and Atmosphere-Ocean Coupled Ensemble Forecast System

    NASA Astrophysics Data System (ADS)

    Rowley, C. D.; Hogan, P. J.; Martin, P.; Thoppil, P.; Wei, M.

    2017-12-01

    An extended range ensemble forecast system is being developed in the US Navy Earth System Prediction Capability (ESPC), and a global ocean ensemble generation capability to represent uncertainty in the ocean initial conditions has been developed. At extended forecast times, the uncertainty due to the model error overtakes the initial condition as the primary source of forecast uncertainty. Recently, stochastic parameterization or stochastic forcing techniques have been applied to represent the model error in research and operational atmospheric, ocean, and coupled ensemble forecasts. A simple stochastic forcing technique has been developed for application to US Navy high resolution regional and global ocean models, for use in ocean-only and coupled atmosphere-ocean-ice-wave ensemble forecast systems. Perturbation forcing is added to the tendency equations for state variables, with the forcing defined by random 3- or 4-dimensional fields with horizontal, vertical, and temporal correlations specified to characterize different possible kinds of error. Here, we demonstrate the stochastic forcing in regional and global ensemble forecasts with varying perturbation amplitudes and length and time scales, and assess the change in ensemble skill measured by a range of deterministic and probabilistic metrics.
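
    As a toy illustration of stochastic forcing with prescribed correlations, the sketch below evolves a 2-D random field, smoothed to a chosen horizontal length scale and correlated in time as an AR(1) process, and adds it to a state tendency. The amplitude, scales, and toy dynamics are illustrative assumptions, not the Navy system's settings, and the real scheme uses 3- or 4-dimensional fields.

```python
# Sketch: spatially smoothed, AR(1)-in-time stochastic forcing added to a tendency.
import numpy as np
from scipy.ndimage import gaussian_filter

def correlated_field(shape, length_scale, rng):
    field = gaussian_filter(rng.normal(size=shape), sigma=length_scale)
    return field / field.std()                           # unit-variance pattern

rng = np.random.default_rng(10)
shape, alpha, amplitude = (64, 64), 0.9, 0.05            # AR(1) memory, forcing strength
pert = correlated_field(shape, 4.0, rng)
state = np.zeros(shape)

for step in range(100):
    pert = alpha * pert + np.sqrt(1 - alpha**2) * correlated_field(shape, 4.0, rng)
    tendency = -0.01 * state + amplitude * pert          # toy damping + stochastic forcing
    state = state + tendency                             # unit time step
print("state std after 100 steps:", state.std())
```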

  10. Interference Lattice-based Loop Nest Tilings for Stencil Computations

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Frumkin, Michael

    2000-01-01

    A common method for improving performance of stencil operations on structured multi-dimensional discretization grids is loop tiling. Tile shapes and sizes are usually determined heuristically, based on the size of the primary data cache. We provide a lower bound on the number of cache misses that must be incurred by any tiling, and a close achievable bound using a particular tiling based on the grid interference lattice. The latter tiling is used to derive highly efficient loop orderings. The total number of cache misses of a code is the sum of (necessary) cold misses and misses caused by elements being dropped from the cache between successive loads (replacement misses). Maximizing temporal locality is equivalent to minimizing replacement misses. Temporal locality of loop nests implementing stencil operations is optimized by tilings that avoid data conflicts. We divide the loop nest iteration space into conflict-free tiles, derived from the cache miss equation. The tiling involves the definition of the grid interference lattice (an equivalence class of grid points whose images in main memory map to the same location in the cache) and the construction of a special basis for the lattice. Conflicts only occur on the boundaries of the tiles, unless the tiles are too thin. We show that the surface area of the tiles is bounded for grids of any dimensionality, and for caches of any associativity, provided the eccentricity of the fundamental parallelepiped (the tile spanned by the basis) of the lattice is bounded. Eccentricity is determined by two factors, aspect ratio and skewness. The aspect ratio of the parallelepiped can be bounded by appropriate array padding. The skewness can be bounded by the choice of a proper basis. Combining these two strategies ensures that pathologically thin tiles are avoided. They do not, however, minimize replacement misses per se. The reason is that tile visitation order influences the number of data conflicts on the tile boundaries. If two adjacent tiles are visited successively, there will be no replacement misses on the shared boundary. The iteration space may be covered with pencils larger than the size of the cache while avoiding data conflicts if the pencils are traversed by a scanning-face method. Replacement misses are incurred only on the boundaries of the pencils, and the number of misses is minimized by maximizing the volume of the scanning face, not the volume of the tile. We present an algorithm for constructing the most efficient scanning face for a given grid and stencil operator. In two dimensions it is based on a continued fraction algorithm. In three dimensions it follows Voronoi's successive minima algorithm. We show experimental results of using the scanning face, and compare with canonical loop orderings.
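
    For readers unfamiliar with loop tiling, the sketch below shows the traversal order being optimized, using an ordinary 5-point stencil sweep; the tile sizes are arbitrary placeholders, whereas in the paper they would be derived from the grid interference lattice and the cache parameters.

    ```python
    import numpy as np

    def jacobi_tiled(a, tile_i=64, tile_j=64, sweeps=1):
        """Loop-tiled 5-point stencil sweep (traversal-order illustration only)."""
        n, m = a.shape
        b = a.copy()
        for _ in range(sweeps):
            for ii in range(1, n - 1, tile_i):        # visit the iteration space
                for jj in range(1, m - 1, tile_j):    # tile by tile
                    for i in range(ii, min(ii + tile_i, n - 1)):
                        for j in range(jj, min(jj + tile_j, m - 1)):
                            b[i, j] = 0.25 * (a[i - 1, j] + a[i + 1, j]
                                              + a[i, j - 1] + a[i, j + 1])
            a, b = b, a
        return a
    ```

    In the scanning-face variant described above, the two outer loops would instead walk pencils in an order chosen so that successively visited tiles share a boundary, which removes replacement misses on that shared boundary.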

  11. Improvement of forecast skill for severe weather by merging radar-based extrapolation and storm-scale NWP corrected forecast

    NASA Astrophysics Data System (ADS)

    Wang, Gaili; Wong, Wai-Kin; Hong, Yang; Liu, Liping; Dong, Jili; Xue, Ming

    2015-03-01

    The primary objective of this study is to improve the performance of deterministic high resolution rainfall forecasts caused by severe storms by merging an extrapolation radar-based scheme with a storm-scale Numerical Weather Prediction (NWP) model. Effectiveness of Multi-scale Tracking and Forecasting Radar Echoes (MTaRE) model was compared with that of a storm-scale NWP model named Advanced Regional Prediction System (ARPS) for forecasting a violent tornado event that developed over parts of western and much of central Oklahoma on May 24, 2011. Then the bias corrections were performed to improve the forecast accuracy of ARPS forecasts. Finally, the corrected ARPS forecast and radar-based extrapolation were optimally merged by using a hyperbolic tangent weight scheme. The comparison of forecast skill between MTaRE and ARPS in high spatial resolution of 0.01° × 0.01° and high temporal resolution of 5 min showed that MTaRE outperformed ARPS in terms of index of agreement and mean absolute error (MAE). MTaRE had a better Critical Success Index (CSI) for less than 20-min lead times and was comparable to ARPS for 20- to 50-min lead times, while ARPS had a better CSI for more than 50-min lead times. Bias correction significantly improved ARPS forecasts in terms of MAE and index of agreement, although the CSI of corrected ARPS forecasts was similar to that of the uncorrected ARPS forecasts. Moreover, optimally merging results using hyperbolic tangent weight scheme further improved the forecast accuracy and became more stable.
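
    A minimal sketch of a hyperbolic-tangent weighting of the kind described above, with the crossover lead time and steepness chosen arbitrarily for illustration (the study's fitted values are not given in the abstract):

    ```python
    import numpy as np

    def blend_nowcast(extrapolation, nwp, lead_minutes, t_mid=60.0, tau=20.0):
        """Blend a radar extrapolation field with a (bias-corrected) NWP field.

        The NWP weight grows smoothly with lead time: w ~ 0 at short leads,
        w ~ 1 at long leads.  t_mid and tau (minutes) are illustrative.
        """
        w = 0.5 * (1.0 + np.tanh((lead_minutes - t_mid) / tau))
        return (1.0 - w) * extrapolation + w * nwp
    ```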

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler

    This paper outlines a data-driven, distributionally robust approach to solve chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance-constraints; particularly, the mean and covariance matrix of the forecast errors are updated online, and leveraged to enforce voltage regulation with predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
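
    The Chebyshev-based tightening mentioned above can be sketched directly from the one-sided (Cantelli) inequality; the linearized voltage model and the variable names below are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def chebyshev_voltage_margin(mean_v, cov_e, sens, v_max, eps=0.05):
        """Distributionally robust margin for a voltage limit.

        For a voltage depending linearly on uncertain injections, v = mean_v + sens^T e,
        enforcing   mean_v + sqrt((1-eps)/eps) * sqrt(sens^T Cov sens) <= v_max
        guarantees  P(v <= v_max) >= 1 - eps for any distribution with that
        mean and covariance (one-sided Chebyshev/Cantelli bound).
        """
        std = np.sqrt(sens @ cov_e @ sens)
        kappa = np.sqrt((1.0 - eps) / eps)      # distributionally robust multiplier
        return v_max - (mean_v + kappa * std)   # >= 0 means the constraint holds

    # Example with two uncertain injections (illustrative numbers only)
    print(chebyshev_voltage_margin(1.03, np.diag([1e-4, 4e-4]),
                                   np.array([0.5, 0.3]), v_max=1.05))
    ```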

  13. Lifting primordial non-Gaussianity above the noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welling, Yvette; Woude, Drian van der; Pajer, Enrico, E-mail: welling@strw.leidenuniv.nl, E-mail: D.C.vanderWoude@uu.nl, E-mail: enrico.pajer@gmail.com

    2016-08-01

    Primordial non-Gaussianity (PNG) in Large Scale Structures is obfuscated by the many additional sources of non-linearity. Within the Effective Field Theory approach to Standard Perturbation Theory, we show that matter non-linearities in the bispectrum can be modeled sufficiently well to strengthen current bounds with near future surveys, such as Euclid. We find that the EFT corrections are crucial to this improvement in sensitivity. Yet, our understanding of non-linearities is still insufficient to reach important theoretical benchmarks for equilateral PNG, while, for local PNG, our forecast is more optimistic. We consistently account for the theoretical error intrinsic to the perturbative approach and discuss the details of its implementation in Fisher forecasts.

  14. Flight Departure Delay and Rerouting Under Uncertainty in En Route Convective Weather

    NASA Technical Reports Server (NTRS)

    Mukherjee, Avijit; Grabbe, Shon; Sridhar, Banavar

    2011-01-01

    Delays caused by uncertainty in weather forecasts can be reduced by improving traffic flow management decisions. This paper presents a methodology for traffic flow management under uncertainty in convective weather forecasts. An algorithm for assigning departure delays and reroutes to aircraft is presented. Departure delay and route assignment are executed at multiple stages, during which, updated weather forecasts and flight schedules are used. At each stage, weather forecasts up to a certain look-ahead time are treated as deterministic and flight scheduling is done to mitigate the impact of weather on four-dimensional flight trajectories. Uncertainty in weather forecasts during departure scheduling results in tactical airborne holding of flights. The amount of airborne holding depends on the accuracy of forecasts as well as the look-ahead time included in the departure scheduling. The weather forecast look-ahead time is varied systematically within the experiments performed in this paper to analyze its effect on flight delays. Based on the results, longer look-ahead times cause higher departure delays and additional flying time due to reroutes. However, the amount of airborne holding necessary to prevent weather incursions reduces when the forecast look-ahead times are higher. For the chosen day of traffic and weather, setting the look-ahead time to 90 minutes yields the lowest total delay cost.

  15. Multi-Reanalysis Comparison of Variability in Analysis Increment of Column-Integrated Water Vapor Associated with Madden-Julian Oscillation

    NASA Astrophysics Data System (ADS)

    Yokoi, S.

    2014-12-01

    This study conducts a comparison of three reanalysis products (JRA-55, JRA-25, and ERA-Interim) in representation of Madden-Julian Oscillation (MJO), focusing on column-integrated water vapor (CWV) that is considered as an essential variable for discussing MJO dynamics. Besides the analysis fields of CWV, which exhibit spatio-temporal distributions that are quite similar to satellite observations, CWV tendency simulated by forecast models and analysis increment calculated by data assimilation are examined. For JRA-55, it is revealed that, while its forecast model is able to simulate eastward propagation of the CWV anomaly, it tends to weaken the amplitude, and data assimilation process sustains the amplitude. The multi-reanalysis comparison of the analysis increment further reveals that this weakening bias is probably caused by excessively weak cloud-radiative feedback represented by the model. This bias in the feedback strength makes anomalous moisture supply by the vertical advection term in the CWV budget equation too insensitive to precipitation anomaly, resulting in reduction of the amplitude of CWV anomaly. ERA-Interim has a nearly opposite feature; the forecast model represents excessively strong feedback and unrealistically strengthens the amplitude, while the data assimilation weakens it. These results imply the necessity of accurate representation of the cloud-radiative feedback strength for a short-term MJO forecast, and may be evidence to support the argument that this feedback is essential for the existence of MJO. Furthermore, this study demonstrates that the multi-reanalysis comparison of the analysis increment will provide useful information for identifying model biases and, potentially, for estimating parameters that are difficult to estimate solely from observation data, such as gross moist stability.

  16. A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.

    2017-12-01

    Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).

  17. Multi-centennial upper-ocean heat content reconstruction using online data assimilation

    NASA Astrophysics Data System (ADS)

    Perkins, W. A.; Hakim, G. J.

    2017-12-01

    The Last Millennium Reanalysis (LMR) provides an advanced paleoclimate ensemble data assimilation framework for multi-variate climate field reconstructions over the Common Era. Although reconstructions in this framework with full Earth system models remain prohibitively expensive, recent work has shown improved ensemble reconstruction validation using computationally inexpensive linear inverse models (LIMs). Here we leverage these techniques in pursuit of a new multi-centennial field reconstruction of upper-ocean heat content (OHC), synthesizing model dynamics with observational constraints from proxy records. OHC is an important indicator of internal climate variability and responds to planetary energy imbalances. Therefore, a consistent extension of the OHC record in time will help inform aspects of low-frequency climate variability. We use the Community Climate System Model version 4 (CCSM4) and Max Planck Institute (MPI) last millennium simulations to derive the LIMs, and the PAGES2K v.2.0 proxy database to perform annually resolved reconstructions of upper-OHC, surface air temperature, and wind stress over the last 500 years. Annual OHC reconstructions and uncertainties for both the global mean and regional basins are compared against observational and reanalysis data. We then investigate differences in dynamical behavior at decadal and longer time scales between the reconstruction and simulations in the last-millennium Coupled Model Intercomparison Project version 5 (CMIP5). Preliminary investigation of 1-year forecast skill for an OHC-only LIM shows largely positive spatial grid point local anomaly correlations (LAC) with a global average LAC of 0.37. Compared to 1-year OHC persistence forecast LAC (global average LAC of 0.30), the LIM outperforms the persistence forecasts in the tropical Indo-Pacific region, the equatorial Atlantic, and in certain regions near the Antarctic Circumpolar Current. In other regions, the forecast correlations are less than the persistence case but still positive overall.

  18. Measuring the Perception of the Teachers' Autonomy-Supportive Behavior in Physical Education: Development and Initial Validation of a Multi-Dimensional Instrument

    ERIC Educational Resources Information Center

    Tilga, Henri; Hein, Vello; Koka, Andre

    2017-01-01

    This research aimed to develop and validate an instrument to assess the students' perceptions of the teachers' autonomy-supportive behavior by the multi-dimensional scale (Multi-Dimensional Perceived Autonomy Support Scale for Physical Education). The participants were 1,476 students aged 12- to 15-years-old. In Study 1, a pool of 37 items was…

  19. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    The dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO Reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high quality field measurements (e.g. ARM data).

  20. Synthesis of surface bound silver nanoparticles on cellulose fibers using lignin as multi-functional agent.

    PubMed

    Hu, Sixiao; Hsieh, You-Lo

    2015-10-20

    Lignin has proven to be a highly effective "green" multi-functional binding, complexing and reducing agent for silver cations, as well as a capping agent for the synthesis of silver nanoparticles on ultra-fine cellulose fibrous membranes. Silver nanoparticles could be synthesized in 10 min to be densely distributed and stably bound on the cellulose fiber surfaces at up to 2.9% in mass. Silver nanoparticles increased in size from 5 to 100 nm and became more polydispersed in size distribution on larger fibers and with longer synthesis times. These cellulose fiber bound silver nanoparticles did not agglomerate under elevated temperatures and showed improved thermal stability. The presence of alkali lignin conferred moderate UV absorbing ability in both UV-B and UV-C regions, whereas the bound silver nanoparticles exhibited excellent antibacterial activities toward Escherichia coli. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretic framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 till now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation; it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome limitations of a single model or fixed model weights and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on assimilation of various sources of newly available information and on improvement of predictive performance assessment methods.

  2. Controlling bi-partite entanglement in multi-qubit systems

    NASA Astrophysics Data System (ADS)

    Plesch, Martin; Novotný, Jaroslav; Dzuráková, Zuzana; Buzek, Vladimír

    2004-02-01

    Bi-partite entanglement in multi-qubit systems cannot be shared freely. The rules of quantum mechanics impose bounds on how multi-qubit systems can be correlated. In this paper, we utilize a concept of entangled graphs with weighted edges in order to analyse pure quantum states of multi-qubit systems. Here qubits are represented by vertexes of the graph, while the presence of bi-partite entanglement is represented by an edge between corresponding vertexes. The weight of each edge is defined to be the entanglement between the two qubits connected by the edge, as measured by the concurrence. We prove that each entangled graph with entanglement bounded by a specific value of the concurrence can be represented by a pure multi-qubit state. In addition, we present a logic network with O(N2) elementary gates that can be used for preparation of the weighted entangled graphs of N qubits.
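
    A minimal numerical illustration of the edge weight used above: the Wootters concurrence of a two-qubit density matrix, computed with NumPy. The Werner-state example at the end is purely illustrative and is not taken from the paper.

    ```python
    import numpy as np

    def concurrence(rho):
        """Wootters concurrence of a two-qubit density matrix."""
        sy = np.array([[0, -1j], [1j, 0]])
        yy = np.kron(sy, sy)
        rho_tilde = yy @ rho.conj() @ yy
        lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde).real))
        lam = np.sort(lam)[::-1]
        return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

    # Example: Werner state p|Phi+><Phi+| + (1-p) I/4 has concurrence max(0, (3p-1)/2)
    phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
    for p in (0.2, 0.5, 0.9):
        rho = p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4
        print(p, round(concurrence(rho), 3))
    ```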

  3. Improved Use of Satellite Imagery to Forecast Hurricanes

    NASA Technical Reports Server (NTRS)

    Louis, Jean-Francois

    2001-01-01

    This project tested a novel method that uses satellite imagery to correct phase errors in the initial state for numerical weather prediction, applied to hurricane forecasts. The system was tested on hurricanes Guillermo (1997), Felicia (1997) and Iniki (1992). We compared the performance of the system with and without phase correction to a procedure that uses bogus data in the initial state, similar to current operational procedures. The phase correction keeps the hurricane on track in the analysis and is far superior to a system without phase correction. Compared to the operational procedure, phase correction generates somewhat worse 3-day forecasts of the hurricane track, but better forecasts of intensity. It is believed that the phase correction module would work best in the context of 4-dimensional variational data assimilation. Very little modification to 4DVar would be required.

  4. Bounded-Angle Iterative Decoding of LDPC Codes

    NASA Technical Reports Server (NTRS)

    Dolinar, Samuel; Andrews, Kenneth; Pollara, Fabrizio; Divsalar, Dariush

    2009-01-01

    Bounded-angle iterative decoding is a modified version of conventional iterative decoding, conceived as a means of reducing undetected-error rates for short low-density parity-check (LDPC) codes. For a given code, bounded-angle iterative decoding can be implemented by means of a simple modification of the decoder algorithm, without redesigning the code. Bounded-angle iterative decoding is based on a representation of received words and code words as vectors in an n-dimensional Euclidean space (where n is an integer).

  5. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks.

    PubMed

    Liu, Da; Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

    Traditional forecasting models fit a function approximation from dependent variables to independent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Due to the computing capability that was originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated by a power load prediction dataset from the Global Energy Forecasting Competition 2012.

  6. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks

    PubMed Central

    Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

    Traditional forecasting models fit a function approximation from dependent variables to independent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Due to the computing capability that was originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated by a power load prediction dataset from the Global Energy Forecasting Competition 2012. PMID:27281032
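
    As a rough illustration of the encoding step the two records above describe, the sketch below converts a window of decimal values into a binary digital 2-D image; the min-max scaling and the 8-bit depth are assumptions for illustration, not the authors' exact recipe.

    ```python
    import numpy as np

    def encode_window_as_binary_image(values):
        """Encode a 1-D window of decimal values as an L x 8 binary image.

        Each value is min-max scaled to an 8-bit integer and written as one
        row of bits (a sketch of the encoding idea only).
        """
        v = np.asarray(values, dtype=float)
        scaled = np.round((v - v.min()) / (np.ptp(v) + 1e-12) * 255).astype(np.uint8)
        return np.unpackbits(scaled[:, None], axis=1)      # shape (len(values), 8)

    # Example: a 24-hour load curve becomes a 24 x 8 black-and-white image
    window = 1000 + 200 * np.sin(np.linspace(0, 2 * np.pi, 24))
    print(encode_window_as_binary_image(window).shape)     # (24, 8)
    ```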

  7. New watershed-based climate forecast products for hydrologists and water managers

    NASA Astrophysics Data System (ADS)

    Baker, S. A.; Wood, A.; Rajagopalan, B.; Lehner, F.; Peng, P.; Ray, A. J.; Barsugli, J. J.; Werner, K.

    2017-12-01

    Operational sub-seasonal to seasonal (S2S) climate predictions have advanced in skill in recent years but are yet to be broadly utilized by stakeholders in the water management sector. While some of the challenges that relate to fundamental predictability are difficult or impossible to surmount, other hurdles related to forecast product formulation, translation, relevance, and accessibility can be directly addressed. These include products being misaligned with users' space-time needs, products disseminated in formats users cannot easily process, and products based on raw model outputs that are biased relative to user climatologies. In each of these areas, more can be done to bridge the gap by enhancing the usability, quality, and relevance of water-oriented predictions. In addition, water stakeholders can benefit from short-range extremes predictions (such as 2-3 day storms or 1-week heat waves) at S2S time-scales, for which few products exist. We present interim results of a Research to Operations (R2O) effort sponsored by the NOAA MAPP Climate Testbed to (1) formulate climate prediction products so as to reduce hurdles to adoption by water stakeholders, and to (2) explore opportunities for extremes prediction at S2S time scales. The project is currently using CFSv2 and National Multi-Model Ensemble (NMME) reforecasts and forecasts to develop real-time watershed-based climate forecast products, and to train post-processing approaches to enhance the skill and reliability of raw real-time S2S forecasts. Prototype S2S climate data products (forecasts and associated skill analyses) are now being operationally staged at NCAR on a public website to facilitate further product development through interactions with water managers. Initial demonstration products include CFSv2-based bi-weekly climate forecasts (weeks 1-2, 2-3, and 3-4) for sub-regional scale hydrologic units, and NMME-based monthly and seasonal prediction products. Raw model mean skill at these time-space resolutions for some periods (e.g., weeks 3-4) is unusably low, but for other periods, and for multi-month leads with NMME, precipitation and particularly temperature forecasts exhibit useful skill. Website: http://hydro.rap.ucar.edu/s2s/

  8. The Effect of Model Grid Resolution on the Distributed Hydrologic Simulations for Forecasting Stream Flows and Reservoir Storage

    NASA Astrophysics Data System (ADS)

    Turnbull, S. J.

    2017-12-01

    Within the US Army Corps of Engineers (USACE), reservoirs are typically operated according to a rule curve that specifies target water levels based on the time of year. The rule curve is intended to maximize flood protection by specifying releases of water before the dominant rainfall period for a region. While some operating allowances are permissible, generally the rule curve elevations must be maintained. While this operational approach provides for the required flood control purpose, it may not result in optimal reservoir operations for multi-use impoundments. In the Russian River Valley of California a multi-agency research effort called Forecast-Informed Reservoir Operations (FIRO) is assessing the application of forecast weather and streamflow predictions to potentially enhance the operation of reservoirs in the watershed. The focus of the study has been on Lake Mendocino, a USACE project important for flood control, water supply, power generation and ecological flows. As part of this effort the Engineer Research and Development Center is assessing the ability of the physics-based, distributed Gridded Surface Subsurface Hydrologic Analysis (GSSHA) watershed model to simulate stream flows, reservoir stages, and discharges while being driven by weather forecast products. A key question in this application is the effect of watershed model resolution on forecasted stream flows. To help resolve this question, GSSHA models at multiple grid resolutions (30, 50, and 270 m) were developed for the upper Russian River, which includes Lake Mendocino. The models were derived from common inputs: DEM, soils, land use, stream network, reservoir characteristics, and specified inflows and discharges. All the models were calibrated in both event and continuous simulation mode using measured precipitation gages and then driven with the West-WRF atmospheric model in prediction mode to assess the ability of the model to function in short-term (less than one week) forecasting mode. In this presentation we will discuss the effect the grid resolution has on model development, parameter assignment, streamflow prediction and forecasting capability utilizing the West-WRF forecast hydro-meteorology.

  9. A new scoring method for evaluating the performance of earthquake forecasts and predictions

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2009-12-01

    This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures that require a regular scheme of forecasts and treat each earthquake equally, regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model for usual cases or the Omori-Utsu formula for the case of forecasting aftershocks, which gives probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, similar to a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes a NA-prediction. If the forecaster bets 1 reputation point on "Yes" and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return that he gains from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No". In this way, the forecaster's expected pay-off based on the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
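
    The scoring rule lends itself to a direct implementation. The sketch below follows the abstract's reward ratio (1-p0)/p0 for successful "Yes" bets and assumes the symmetric ratio p0/(1-p0) for successful "No" bets, which is what makes the expected return zero under the reference model; the function and argument names are illustrative.

    ```python
    def gambling_score(bets, outcomes, p0):
        """Total gambling score over a set of space-time-magnitude windows.

        bets     : forecast probabilities p in [0, 1]; p = 1 is a hard "Yes",
                   p = 0 a hard "No" (NA-predictions are simply omitted).
        outcomes : 0/1 flags, 1 if at least one event occurred in the window.
        p0       : occurrence probabilities under the reference (e.g. Poisson) model.
        """
        score = 0.0
        for p, y, q0 in zip(bets, outcomes, p0):
            if y:   # the "Yes" part of the bet pays (1 - q0)/q0, the "No" part is lost
                score += p * (1.0 - q0) / q0 - (1.0 - p)
            else:   # the "No" part pays q0/(1 - q0), the "Yes" part is lost
                score += (1.0 - p) * q0 / (1.0 - q0) - p
        return score

    # Example with invented numbers: two confident hits, one probabilistic miss
    print(gambling_score([1.0, 0.8, 0.6], [1, 1, 0], [0.1, 0.2, 0.3]))
    ```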

  10. Demonstrating the Operational Value of Atmospheric Infrared Sounder (AIRS) Retrieved Profiles in the Pre-Convective Environment

    NASA Technical Reports Server (NTRS)

    Kozlowski, Danielle M.; Zavodsky, T.; Jedloved, Gary J.

    2011-01-01

    The Short-term Prediction Research and Transition Center (SPoRT) is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service offices. SPoRT provides real-time NASA products and capabilities to its partners to address specific operational forecast challenges. One operational forecast challenge is forecasting convective weather in data-void regions such as large bodies of water (e.g. Gulf of Mexico). To address this forecast challenge, SPoRT produces a twice-daily three-dimensional analysis that blends a model first-guess from the Advanced Research Weather Research and Forecasting (WRF-ARW) model with retrieved profiles from the Atmospheric Infrared Sounder (AIRS) -- a hyperspectral sounding instrument aboard NASA's Aqua satellite that provides temperature and moisture profiles of the atmosphere. AIRS profiles are unique in that they give a three-dimensional view of the atmosphere that is not available through the current rawinsonde network. AIRS has two overpass swaths across North America each day, one valid in the 0700-0900 UTC timeframe and the other in the 1900-2100 UTC timeframe. This is helpful because the rawinsonde network only has data from 0000 UTC and 1200 UTC at specific land-based locations. Comparing the AIRS analysis product with control analyses that include no AIRS data demonstrates the value of the retrieved profiles to situational awareness for the pre-convective (and convective) environment. In an attempt to verify that the AIRS analysis was a good representation of the vertical structure of the atmosphere, both the AIRS and control analyses are compared to a Rapid Update Cycle (RUC) analysis used by operational forecasters. Using guidance from operational forecasters, convective available potential energy (CAPE) was determined to be a vital variable in making convective forecasts and is used herein to demonstrate the utility of the AIRS profiles in changing the vertical thermodynamic structure of the atmosphere in the pre-convective and convective environment. CAPE is an important metric because it is a quantitative measure of atmospheric stability, which is necessary information when forecasting convective weather. Case studies from the summer of 2010 were examined, and most impact from the AIRS retrieved profiles occurred over the data-void Gulf of Mexico with fields of convective potential closer to the RUC than the CNTL. Mixed results were found when AIRS retrieved profiles were used over land, so more cases need to be examined to determine whether AIRS would be an effective tool over land. Additional analyses of problematic convective forecasts over the Gulf Coast will be needed to determine the operational impact of AIRS. SPoRT eventually plans to transition the AIRS product to select Weather Forecast Office (WFO) partners, pending the outcome of these additional analyses.

  11. Surface Pressure Dependencies in the GEOS-Chem-Adjoint System and the Impact of the GEOS-5 Surface Pressure on CO2 Model Forecast

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Weidner, Richard

    2016-01-01

    In the GEOS-Chem Adjoint (GCA) system, the total (wet) surface pressure of the GEOS meteorology is employed as dry surface pressure, ignoring the presence of water vapor. The Jet Propulsion Laboratory (JPL) Carbon Monitoring System (CMS) research team has been evaluating the impact of the above discrepancy on the CO2 model forecast and the CO2 flux inversion. The JPL CMS research utilizes a multi-mission assimilation framework developed by the Multi-Mission Observation Operator (M2O2) research team at JPL extending the GCA system. The GCA-M2O2 framework facilitates mission-generic 3D and 4D-variational assimilations streamlining the interfaces to the satellite data products and prior emission inventories. The GCA-M2O2 framework currently integrates the GCA system version 35h and provides a dry surface pressure setup to allow the CO2 model forecast to be performed with the GEOS-5 surface pressure directly or after converting it to dry surface pressure.

  12. Surface Pressure Dependencies in the Geos-Chem-Adjoint System and the Impact of the GEOS-5 Surface Pressure on CO2 Model Forecast

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Weidner, Richard

    2016-01-01

    In the GEOS-Chem Adjoint (GCA) system, the total (wet) surface pressure of the GEOS meteorology is employed as dry surface pressure, ignoring the presence of water vapor. The Jet Propulsion Laboratory (JPL) Carbon Monitoring System (CMS) research team has been evaluating the impact of the above discrepancy on the CO2 model forecast and the CO2 flux inversion. The JPL CMS research utilizes a multi-mission assimilation framework developed by the Multi-Mission Observation Operator (M2O2) research team at JPL extending the GCA system. The GCA-M2O2 framework facilitates mission-generic 3D and 4D-variational assimilations streamlining the interfaces to the satellite data products and prior emission inventories. The GCA-M2O2 framework currently integrates the GCA system version 35h and provides a dry surface pressure setup to allow the CO2 model forecast to be performed with the GEOS-5 surface pressure directly or after converting it to dry surface pressure.
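
    The wet-to-dry conversion mentioned in the two records above amounts to removing the weight of the water-vapor column from the total surface pressure. A minimal sketch, with illustrative variable names (the GCA system's actual implementation is not shown in the abstract):

    ```python
    G = 9.80665   # gravitational acceleration, m s^-2

    def dry_surface_pressure(p_surface_pa, column_water_vapor_kg_m2):
        """Subtract the weight of the water-vapor column (g * CWV, in Pa) from the
        total (wet) surface pressure to obtain the dry-air surface pressure."""
        return p_surface_pa - G * column_water_vapor_kg_m2

    # Example: 1013.25 hPa with 25 kg m^-2 of precipitable water
    print(dry_surface_pressure(101325.0, 25.0) / 100.0)   # ~1010.8 hPa
    ```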

  13. Nonlinear Conservation Laws and Finite Volume Methods

    NASA Astrophysics Data System (ADS)

    Leveque, Randall J.

    Introduction; Software; Notation; Classification of Differential Equations; Derivation of Conservation Laws; The Euler Equations of Gas Dynamics; Dissipative Fluxes; Source Terms; Radiative Transfer and Isothermal Equations; Multi-dimensional Conservation Laws; The Shock Tube Problem; Mathematical Theory of Hyperbolic Systems; Scalar Equations; Linear Hyperbolic Systems; Nonlinear Systems; The Riemann Problem for the Euler Equations; Numerical Methods in One Dimension; Finite Difference Theory; Finite Volume Methods; Importance of Conservation Form - Incorrect Shock Speeds; Numerical Flux Functions; Godunov's Method; Approximate Riemann Solvers; High-Resolution Methods; Other Approaches; Boundary Conditions; Source Terms and Fractional Steps; Unsplit Methods; Fractional Step Methods; General Formulation of Fractional Step Methods; Stiff Source Terms; Quasi-stationary Flow and Gravity; Multi-dimensional Problems; Dimensional Splitting; Multi-dimensional Finite Volume Methods; Grids and Adaptive Refinement; Computational Difficulties; Low-Density Flows; Discrete Shocks and Viscous Profiles; Start-Up Errors; Wall Heating; Slow-Moving Shocks; Grid Orientation Effects; Grid-Aligned Shocks; Magnetohydrodynamics; The MHD Equations; One-Dimensional MHD; Solving the Riemann Problem; Nonstrict Hyperbolicity; Stiffness; The Divergence of B; Riemann Problems in Multi-dimensional MHD; Staggered Grids; The 8-Wave Riemann Solver; Relativistic Hydrodynamics; Conservation Laws in Spacetime; The Continuity Equation; The 4-Momentum of a Particle; The Stress-Energy Tensor; Finite Volume Methods; Multi-dimensional Relativistic Flow; Gravitation and General Relativity; References

  14. An advance forecasting system for ship originated oil spills in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Zodiatis, G.; Lardner, R.; De Dominicis, M.; Coppini, G.; Pinardi, N.

    2012-04-01

    One of the permanent risks from an oil spill incident in the Mediterranean is associated with the heavy traffic in maritime transport, as well as, nowadays, with the coastal and offshore installations related to the oil and gas industry. Such dense activity imposes on the coastal countries the need for preparing an operational response to major oil spill incidents. In the recent past, several policies related to oil spill response have been adopted internationally. At the regional level the Barcelona convention, recognizing pollution from oil spills as one of the major threats to the marine environment of the Mediterranean, initiated the preparedness for responding to major oil spill incidents, through various national and sub-regional contingency plans. At the European level, Member States were obliged to implement the EU Directive 2005/35, aimed at identifying polluters and bringing them to prosecution. The response to an oil spill incident employs various measures and equipment. However, the success of such response depends greatly on the prediction of the movement and weathering of the oil spills. Such predictions may be obtained through the operational application of advanced numerical oil spill models integrated with met-ocean forecasting data. A well established operational system for oil spill predictions in the Mediterranean is the MEDSLIK three-dimensional model, which predicts the transport, diffusion and spreading of oil spills and incorporates the fate processes of evaporation, emulsification, viscosity changes, dispersion into the water column, and coastal impact and adhesion. MEDSLIK is integrated with the MyOCEAN regional and several downscaled ocean forecasting systems in the Mediterranean, contributing to the development of the GMES marine services. Moreover, MEDSLIK has been coupled with EMSA-CSN and ESA ASAR imageries, for short forward and backward predictions, to assist the response agencies in the implementation of the EU Directive 2005/35. From 2007 to 2011 more than a thousand possible ship originated oil slicks were detected by ASAR imageries in the Levantine Basin and then used for operational predictions by MEDSLIK. The successful operation of the MEDSLIK oil spill prediction system in the Levantine Basin has initiated efforts to implement a multi-model approach to oil spill predictions in the Mediterranean through the realization of the recently approved project known as MEDESS4MS (Mediterranean Decision Support System for Maritime Safety), funded under the MED program. The MEDESS4MS project is dedicated to the prevention of maritime risks and strengthening of maritime safety related to oil spill pollution in the Mediterranean. MEDESS4MS will deliver an integrated operational multi-model oil spill prediction service in the Mediterranean, connected to existing monitoring platforms (EMSA-CSN, REMPEC, AIS), using the well established oil spill modeling systems, the data from the GMES Marine Core Services and the national ocean forecasting systems.

  15. Bounded solutions in a T-shaped waveguide and the spectral properties of the Dirichlet ladder

    NASA Astrophysics Data System (ADS)

    Nazarov, S. A.

    2014-08-01

    The Dirichlet problem is considered on the junction of thin quantum waveguides (of thickness h ≪ 1) in the shape of an infinite two-dimensional ladder. Passage to the limit as h → +0 is discussed. It is shown that the asymptotically correct transmission conditions at nodes of the corresponding one-dimensional quantum graph are Dirichlet conditions rather than the conventional Kirchhoff transmission conditions. The result is obtained by analyzing bounded solutions of a problem in the T-shaped waveguide that describes the boundary layer phenomenon.

  16. Multi-dimensional Fokker-Planck equation analysis using the modified finite element method

    NASA Astrophysics Data System (ADS)

    Náprstek, J.; Král, R.

    2016-09-01

    The Fokker-Planck equation (FPE) is a frequently used tool for the solution of the cross probability density function (PDF) of a dynamic system response excited by a vector of random processes. FEM represents a very effective solution possibility, particularly when transition processes are investigated or a more detailed solution is needed. Existing papers deal with single degree of freedom (SDOF) systems only, so the respective FPE includes only two independent space variables. Stepping beyond this limit to MDOF systems, a number of specific problems related to true multi-dimensionality must be overcome. Unlike earlier studies, multi-dimensional simplex elements in any arbitrary dimension should be deployed and rectangular (multi-brick) elements abandoned. Simple closed formulae for integration over a multi-dimensional domain have been derived. Another specific problem is the generation of the multi-dimensional finite element mesh. Assembly of the system global matrices must follow newly composed algorithms due to the multi-dimensionality. The system matrices are quite full, so none of the advantages of sparsity commonly exploited in conventional 2D/3D FEM applications can be used. After verification of the partial algorithms, an illustrative example dealing with a 2DOF non-linear aeroelastic system in combination with random and deterministic excitations is discussed.
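
    The "simple closed formulae of integration" over simplex elements are presumably of the standard barycentric-monomial type; for an n-simplex T with volume |T| and barycentric coordinates λ_i, one such closed-form rule (quoted here as a standard result for illustration, not reproduced from the paper) is

    $$\int_T \lambda_1^{a_1}\lambda_2^{a_2}\cdots\lambda_{n+1}^{a_{n+1}}\,\mathrm{d}\mathbf{x} \;=\; |T|\,\frac{n!\;a_1!\,a_2!\cdots a_{n+1}!}{\bigl(n+\sum_{i=1}^{n+1} a_i\bigr)!},$$

    which is sufficient to assemble the FPE mass and advection matrices element by element in any spatial dimension.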

  17. Simplification of the Kalman filter for meteorological data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1991-01-01

    The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.

  18. Tropical Pacific moisture variability: Its detection, synoptic structure and consequences in the general circulation

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1990-01-01

    Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multi-variate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in-situ observations; satellite observations from VAS (moisture, infrared and visible), NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids), and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied and objective quantitative results are emphasized.

  19. Observed and forecast flood-inundation mapping application-A pilot study of an eleven-mile reach of the White River, Indianapolis, Indiana

    USGS Publications Warehouse

    Kim, Moon H.; Morlock, Scott E.; Arihood, Leslie D.; Kiesler, James L.

    2011-01-01

    Near-real-time and forecast flood-inundation mapping products resulted from a pilot study for an 11-mile reach of the White River in Indianapolis. The study was done by the U.S. Geological Survey (USGS), Indiana Silver Jackets hazard mitigation taskforce members, the National Weather Service (NWS), the Polis Center, and Indiana University, in cooperation with the City of Indianapolis, the Indianapolis Museum of Art, the Indiana Department of Homeland Security, and the Indiana Department of Natural Resources, Division of Water. The pilot project showed that it is technically feasible to create a flood-inundation map library by means of a two-dimensional hydraulic model, use a map from the library to quickly complete a moderately detailed local flood-loss estimate, and automatically run the hydraulic model during a flood event to provide the maps and flood-damage information through a Web graphical user interface. A library of static digital flood-inundation maps was created by means of a calibrated two-dimensional hydraulic model. Estimated water-surface elevations were developed for a range of river stages referenced to a USGS streamgage and NWS flood forecast point colocated within the study reach. These maps were made available through the Internet in several formats, including geographic information system, Keyhole Markup Language, and Portable Document Format. A flood-loss estimate was completed for part of the study reach by using one of the flood-inundation maps from the static library. The Federal Emergency Management Agency natural disaster-loss estimation program HAZUS-MH, in conjunction with local building information, was used to complete a level 2 analysis of flood-loss estimation. A Service-Oriented Architecture-based dynamic flood-inundation application was developed and was designed to start automatically during a flood, obtain near real-time and forecast data (from the colocated USGS streamgage and NWS flood forecast point within the study reach), run the two-dimensional hydraulic model, and produce flood-inundation maps. The application used local building data and depth-damage curves to estimate flood losses based on the maps, and it served inundation maps and flood-loss estimates through a Web-based graphical user interface.

  20. Long-term ensemble forecast of snowmelt inflow into the Cheboksary Reservoir under two different weather scenarios

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.

    2018-04-01

    A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.

  1. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    NASA Astrophysics Data System (ADS)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
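
    Since the verification hinges on the multi-categorical Heidke skill score, a small sketch of that computation may be useful; the 3-category contingency table below is invented for illustration and is not data from SNOW-V10.

    ```python
    import numpy as np

    def heidke_skill_score(contingency):
        """Multi-category Heidke skill score from a K x K contingency table
        (rows = forecast category, columns = observed category).

        HSS = (fraction correct - fraction correct expected by chance)
              / (1 - fraction correct expected by chance)
        """
        p = contingency / contingency.sum()
        hits = np.trace(p)                              # observed fraction correct
        chance = np.sum(p.sum(axis=1) * p.sum(axis=0))  # expected correct by chance
        return (hits - chance) / (1.0 - chance)

    # Example: three categories (e.g. light / moderate / heavy precipitation rate)
    table = np.array([[50, 10, 2],
                      [12, 30, 8],
                      [ 3,  9, 20]], dtype=float)
    print(round(heidke_skill_score(table), 3))
    ```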

  2. Future Weather Forecasting in the Year 2020-Investing in Technology Today: Improving Weather and Environmental Predictions

    NASA Technical Reports Server (NTRS)

    Anthes, Richard; Schoeberl, Mark

    2000-01-01

    Fast-forward twenty years to the nightly simultaneous TV/webcast. Accurate 8-14 day regional forecasts will be available as will be a whole host of linked products including economic impact, travel, energy usage, etc. On-demand, personalized street-level forecasts will be downloaded into your PDA. Your home system will automatically update the products of interest to you (e.g. severe storm forecasts, hurricane predictions, etc). Short and long range climate forecasts will be used by your "Quicken 2020" to suggest changes in your "futures" investment portfolio. Through a lively and informative multi-media presentation, leading Space-Earth Science Researchers and Technologists will share their vision for the year 2020, offering a possible futuristic forecast enabled through the application of new technologies under development today. Copies of the 'broadcast' will be available on Beta Tape for your own future use. If sufficient interest exists, the program may also be made available for broadcasters wishing to do stand-ups with roll-ins from the San Francisco meeting for their viewers back home.

  3. Multi-stage decoding for multi-level block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1991-01-01

    In this paper, we investigate various types of multi-stage decoding for multi-level block modulation codes, in which the decoding of a component code at each stage can be either soft-decision or hard-decision, maximum likelihood or bounded-distance. Error performance of codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. Based on our study and computation results, we find that, if component codes of a multi-level modulation code and types of decoding at various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, we find that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a block error probability of 10^-6. Multi-stage decoding of multi-level modulation codes really offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.

  4. A one-dimensional model of solid-earth electrical resistivity beneath Florida

    USGS Publications Warehouse

    Blum, Cletus; Love, Jeffrey J.; Pedrie, Kolby; Bedrosian, Paul A.; Rigler, E. Joshua

    2015-11-19

    An estimated one-dimensional layered model of electrical resistivity beneath Florida was developed from published geological and geophysical information. The resistivity of each layer is represented by plausible upper and lower bounds as well as a geometric mean resistivity. Corresponding impedance transfer functions, Schmucker-Weidelt transfer functions, apparent resistivity, and phase responses are calculated for inducing geomagnetic frequencies ranging from 10^-5 to 10^0 hertz. The resulting one-dimensional model and response functions can be used to make general estimates of time-varying electric fields associated with geomagnetic storms such as might represent induction hazards for electric-power grid operation. The plausible upper- and lower-bound resistivity structures show the uncertainty, giving a wide range of plausible time-varying electric fields.
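
    For readers who want to reproduce this kind of response calculation, the standard one-dimensional magnetotelluric recursion below sketches how a layered resistivity column yields surface impedance, apparent resistivity and phase. The layer values are purely illustrative and are not the published Florida model.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def surface_impedance(resistivities, thicknesses, freq):
    """Surface impedance of a 1-D layered half-space (standard MT recursion).

    resistivities : ohm-m, top layer first, last entry is the half-space
    thicknesses   : m, one entry per layer above the half-space
    """
    omega = 2.0 * np.pi * freq
    # Intrinsic impedance of the bottom half-space
    Z = np.sqrt(1j * omega * MU0 * resistivities[-1])
    # Recurse upward through the finite layers
    for rho, h in zip(resistivities[-2::-1], thicknesses[::-1]):
        k = np.sqrt(1j * omega * MU0 / rho)   # propagation constant of the layer
        z = 1j * omega * MU0 / k              # intrinsic impedance of the layer
        Z = z * (Z + z * np.tanh(k * h)) / (z + Z * np.tanh(k * h))
    return Z

# Illustrative three-layer column (not the Florida model): 100, 1000, 10 ohm-m
rho = [100.0, 1000.0, 10.0]
thk = [2.0e3, 30.0e3]
for f in [1e-4, 1e-2, 1.0]:
    Z = surface_impedance(rho, thk, f)
    rho_a = abs(Z) ** 2 / (2 * np.pi * f * MU0)    # apparent resistivity
    phase = np.degrees(np.angle(Z))                # impedance phase
    print(f"{f:8.0e} Hz  rho_a = {rho_a:8.1f} ohm-m  phase = {phase:5.1f} deg")
```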

  5. Localization of massless Dirac particles via spatial modulations of the Fermi velocity

    NASA Astrophysics Data System (ADS)

    Downing, C. A.; Portnoi, M. E.

    2017-08-01

    The electrons found in Dirac materials are notorious for being difficult to manipulate due to the Klein phenomenon and absence of backscattering. Here we investigate how spatial modulations of the Fermi velocity in two-dimensional Dirac materials can give rise to localization effects, with either full (zero-dimensional) confinement or partial (one-dimensional) confinement possible depending on the geometry of the velocity modulation. We present several exactly solvable models illustrating the nature of the bound states which arise, revealing how the gradient of the Fermi velocity is crucial for determining fundamental properties of the bound states such as the zero-point energy. We discuss the implications for guiding electronic waves in few-mode waveguides formed by Fermi velocity modulation.

  6. Multi-time scale Climate Informed Stochastic Hybrid Simulation-Optimization Model (McISH model) for Multi-Purpose Reservoir System

    NASA Astrophysics Data System (ADS)

    Lu, M.; Lall, U.

    2013-12-01

    In order to mitigate the impacts of climate change, proactive management strategies to operate reservoirs and dams are needed. A multi-time scale climate informed stochastic model is developed to optimize the operations for a multi-purpose single reservoir by simulating decadal, interannual, seasonal and sub-seasonal variability. We apply the model to a setting motivated by the largest multi-purpose dam in N. India, the Bhakhra reservoir on the Sutlej River, a tributary of the Indus. This leads to a focus on the timing and amplitude of the flows for the monsoon and snowmelt periods. The flow simulations are constrained by multiple sources of historical data and GCM future projections that are being developed through an NSF-funded project titled 'Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoon Asia'. The model presented is a multilevel, nonlinear programming model that aims to optimize the reservoir operating policy on a decadal horizon and the operation strategy on an updated annual basis. The model is hierarchical, in that two optimization models designated for different time scales are nested like a matryoshka doll. The two optimization models have similar mathematical formulations, with some modifications to meet the constraints within each time frame. The first level of the model provides an optimization solution for policy makers to determine contracted annual releases to different uses with a prescribed reliability; the second level is a within-the-period (e.g., year) operation optimization scheme that allocates the contracted annual releases on a subperiod (e.g., monthly) basis, with additional benefit for extra release and a penalty for failure. The model maximizes the net benefit of irrigation, hydropower generation and flood control in each of the periods. The model design thus facilitates the consistent application of weather and climate forecasts to improve operations of reservoir systems. The decadal flow simulations are re-initialized every year with updated climate projections to improve the reliability of the operation rules for the next year, within which the seasonal operation strategies are nested. The multi-level structure can be repeated for monthly operation with weekly subperiods to take advantage of evolving weather forecasts and seasonal climate forecasts. As a result of the hierarchical structure, sub-seasonal and even weather time scale updates and adjustments can be achieved. Given an ensemble of these scenarios, the McISH reservoir simulation-optimization model is able to derive the desired reservoir storage levels, including minimum and maximum, as a function of calendar date, and the associated release patterns. The multi-time scale approach allows adaptive management of water supplies acknowledging the changing risks, meeting the objectives over the decade in expected value and controlling the near-term and planning-period risk through probabilistic reliability constraints. For the applications presented, the target season is the monsoon season from June to September. The model also includes a monthly flood volume forecast model, based on a copula density fit to the monthly flow and the flood volume. This is used to guide dynamic allocation of the flood control volume given the forecasts.
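
    To make the "matryoshka" nesting concrete, the following is a hypothetical toy sketch of a two-level structure in which an outer annual-policy problem wraps an inner monthly-allocation problem. All functions, constraints and numbers are placeholders for illustration only and are not the McISH formulation.

```python
import numpy as np
from scipy.optimize import minimize

def inner_allocation(annual_release, inflows, s0=50.0, smax=100.0):
    """Inner level: allocate a contracted annual release over months."""
    n = len(inflows)

    def neg_benefit(r):
        s, benefit = s0, 0.0
        for t in range(n):
            s = min(max(s + inflows[t] - r[t], 0.0), smax)   # storage balance
            benefit += np.sqrt(r[t] + 1e-9)                  # diminishing returns on release
            benefit -= 0.1 * max(s - 0.9 * smax, 0.0)        # flood-control penalty
        return -benefit

    cons = [{"type": "eq", "fun": lambda r: r.sum() - annual_release}]
    res = minimize(neg_benefit, np.full(n, annual_release / n),
                   bounds=[(0.0, None)] * n, constraints=cons, method="SLSQP")
    return -res.fun

def outer_policy(inflow_scenarios, candidates=np.linspace(20.0, 80.0, 7)):
    """Outer level: pick the contracted annual release with best expected benefit."""
    expected = [np.mean([inner_allocation(a, q) for q in inflow_scenarios])
                for a in candidates]
    return candidates[int(np.argmax(expected))]

scenarios = np.random.default_rng(0).gamma(2.0, 3.0, size=(5, 12))  # synthetic monthly inflows
print("contracted annual release:", outer_policy(scenarios))
```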

  7. Progress and challenges with Warn-on-Forecast

    NASA Astrophysics Data System (ADS)

    Stensrud, David J.; Wicker, Louis J.; Xue, Ming; Dawson, Daniel T.; Yussouf, Nusrat; Wheatley, Dustan M.; Thompson, Therese E.; Snook, Nathan A.; Smith, Travis M.; Schenkman, Alexander D.; Potvin, Corey K.; Mansell, Edward R.; Lei, Ting; Kuhlman, Kristin M.; Jung, Youngsun; Jones, Thomas A.; Gao, Jidong; Coniglio, Michael C.; Brooks, Harold E.; Brewster, Keith A.

    2013-04-01

    The current status and challenges associated with two aspects of Warn-on-Forecast, a National Oceanic and Atmospheric Administration research project exploring the use of a convective-scale ensemble analysis and forecast system to support hazardous weather warning operations, are outlined. These two project aspects are the production of a rapidly-updating assimilation system to incorporate data from multiple radars into a single analysis, and the ability of short-range ensemble forecasts of hazardous convective weather events to provide guidance that could be used to extend warning lead times for tornadoes, hailstorms, damaging windstorms and flash floods. Results indicate that a three-dimensional variational assimilation system that blends observations from multiple radars into a single analysis shows utility when evaluated by forecasters in the Hazardous Weather Testbed and may help increase confidence in a warning decision. The ability of short-range convective-scale ensemble forecasts to provide guidance that could be used in warning operations is explored for five events: two tornadic supercell thunderstorms, a macroburst, a damaging windstorm and a flash flood. Results show that the ensemble forecasts of the three individual severe thunderstorm events are very good, while the forecasts for the damaging windstorm and flash flood events, associated with mesoscale convective systems, are mixed. Important interactions between mesoscale and convective-scale features occur for the mesoscale convective system events that strongly influence the quality of the convective-scale forecasts. The development of a successful Warn-on-Forecast system will take many years and require the collaborative efforts of researchers and operational forecasters to succeed.

  8. Comparison of the UAF Ionosphere Model with Incoherent-Scatter Radar Data

    NASA Astrophysics Data System (ADS)

    McAllister, J.; Maurits, S.; Kulchitsky, A.; Watkins, B.

    2004-12-01

    The UAF Eulerian Parallel Polar Ionosphere Model (UAF EPPIM) is a first-principles three-dimensional time-dependent representation of the northern polar ionosphere (>50 degrees north latitude). The model routinely generates short-term (~2 hours) ionospheric forecasts in real-time. It may also be run in post-processing/batch mode for specific time periods, including long-term (multi-year) simulations. The model code has been extensively validated (~100k comparisons/model year) against ionosonde foF2 data during quiet and moderate solar activity in 2002-2004 with reasonable fidelity (typical relative RMS 10-20% for summer daytime, 30-50% winter nighttime). However, ionosonde data is frequently not available during geomagnetic disturbances. The objective of the work reported here is to compare model outputs with available incoherent-scatter radar data during the storm period of October-November 2003. Model accuracy is examined for this period and compared to model performance during geomagnetically quiet and moderate circumstances. Possible improvements are suggested which are likely to boost model fidelity during storm conditions.

  9. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.

  10. South Pole of the Sun, March 20, 2007

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting.

  11. North Pole of the Sun, March 20, 2007

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting.

  12. Regional early flood warning system: design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.

    2017-12-01

    This study proposes a prototype of a regional early flood inundation warning system for Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize data and information for building the real-time forecasting models, maintaining the relations among forecasted points, and displaying forecasted results, while real-time data acquisition is another key task because the models require immediate access to rain gauge information to provide forecast services. All database-related programs are built on Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted data and provide the information to the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecast flood inundation depths in the study area. The user-friendly interface sequentially shows the inundated area on Google Maps along with the maximum inundation depth and its location, and provides a KMZ file download of the results, which can be viewed in Google Earth. The developed system can provide all the relevant information and on-line forecast results, helping city authorities make decisions during typhoon events and take actions to mitigate losses.

  13. Evaluation of CMAQ and CAMx Ensemble Air Quality Forecasts during the 2015 MAPS-Seoul Field Campaign

    NASA Astrophysics Data System (ADS)

    Kim, E.; Kim, S.; Bae, C.; Kim, H. C.; Kim, B. U.

    2015-12-01

    The performance of air quality forecasts during the 2015 MAPS-Seoul Field Campaign was evaluated. A forecast system was operated to support the campaign's daily aircraft route decisions for airborne measurements of long-range transported plumes. We utilized two real-time ensemble systems based on the Weather Research and Forecasting (WRF)-Sparse Matrix Operator Kernel Emissions (SMOKE)-Comprehensive Air quality Model with extensions (CAMx) modeling framework and the WRF-SMOKE-Community Multiscale Air Quality (CMAQ) framework over northeastern Asia to simulate PM10 concentrations. The Global Forecast System (GFS) from the National Centers for Environmental Prediction (NCEP) was used to provide meteorological inputs for the forecasts. For an additional set of retrospective simulations, the ERA-Interim Reanalysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) was also utilized to assess forecast uncertainties arising from the meteorological data used. The Model Inter-Comparison Study for Asia (MICS-Asia) and National Institute of Environmental Research (NIER) Clean Air Policy Support System (CAPSS) emission inventories are used for foreign and domestic emissions, respectively. In this study, we evaluate the CMAQ and CAMx model performance during the campaign by comparing the results to the airborne and surface measurements. Contributions of foreign and domestic emissions are estimated using a brute-force method. Analyses of model performance and emissions will be utilized to improve air quality forecasts for the upcoming KORUS-AQ field campaign planned for 2016.

  14. Remarkable response with pembrolizumab plus albumin-bound paclitaxel in 2 cases of HER2-positive metastatic breast cancer who have failed to multi-anti-HER2 targeted therapy.

    PubMed

    Li, Bian; Tao, Wang; Shao-Hua, Zhang; Ze-Rui, Qu; Fu-Quan, Jin; Fan, Li; Ze-Fei, Jiang

    2018-04-03

    In clinical practice, a subgroup of breast cancer patients may have developed resistance to multiple anti-HER2 targeted drugs (trastuzumab, lapatinib and/or T-DM1) and can no longer benefit from continued anti-HER2 targeted therapy. We attempted to find an alternative treatment for these patients. Two patients with metastatic breast cancer who had failed multiple anti-HER2 targeted therapies were treated with pembrolizumab (2 mg/kg, day 1) plus albumin-bound paclitaxel (125 mg/m^2, days 1 and 8) every 3 weeks. CT evaluation and HER2 ECD tests were performed every 2 cycles. Both patients achieved a remarkable response with partial remission (PR), while serum HER2 ECD levels (upper normal limit 15 ng/ml) decreased markedly (by 75% and 60%, respectively, compared with baseline). The results indicate that a regimen of pembrolizumab combined with albumin-bound paclitaxel might produce a response in patients with HER2-positive metastatic breast cancer who have failed multiple anti-HER2 targeted therapies.

  15. Topological invariant and cotranslational symmetry in strongly interacting multi-magnon systems

    NASA Astrophysics Data System (ADS)

    Qin, Xizhou; Mei, Feng; Ke, Yongguan; Zhang, Li; Lee, Chaohong

    2018-01-01

    It is still an outstanding challenge to characterize and understand the topological features of strongly interacting states such as bound states in interacting quantum systems. Here, by introducing a cotranslational symmetry in an interacting multi-particle quantum system, we systematically develop a method to define a Chern invariant, which is a generalization of the well-known Thouless-Kohmoto-Nightingale-den Nijs invariant, for identifying strongly interacting topological states. As an example, we study the topological multi-magnon states in a generalized Heisenberg XXZ model, which can be realized by the currently available experiment techniques of cold atoms (Aidelsburger et al 2013 Phys. Rev. Lett. 111, 185301; Miyake et al 2013 Phys. Rev. Lett. 111, 185302). Through calculating the two-magnon excitation spectrum and the defined Chern number, we explore the emergence of topological edge bound states and give their topological phase diagram. We also analytically derive an effective single-particle Hofstadter superlattice model for a better understanding of the topological bound states. Our results not only provide a new approach to defining a topological invariant for interacting multi-particle systems, but also give insights into the characterization and understanding of strongly interacting topological states.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bena, Iosif; Bobev, Nikolay; Warner, Nicholas P.

    We discuss 'spectral-flow' coordinate transformations that take asymptotically four-dimensional solutions into other asymptotically four-dimensional solutions. We find that spectral flow can relate smooth three-charge solutions with a multicenter Taub-NUT base to solutions where one or several Taub-NUT centers are replaced by two-charge supertubes, and vice versa. We further show that multiparameter spectral flows can map such Taub-NUT centers to more singular centers that are either D2-D0 or pure D0-brane sources. Since supertubes can depend on arbitrary functions, we establish that the moduli space of smooth horizonless black-hole microstate solutions is classically of infinite dimension. We also use the physics of supertubes to argue that some multicenter solutions that appear to be bound states from a four-dimensional perspective are in fact not bound states when considered from a five- or six-dimensional perspective.

  17. Multi-stage decoding for multi-level block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    Various types of multistage decoding for multilevel block modulation codes, in which the decoding of a component code at each stage can be either soft decision or hard decision, maximum likelihood or bounded distance, are discussed. Error performance of codes is analyzed for a memoryless additive channel based on various types of multi-stage decoding, and upper bounds on the probability of an incorrect decoding are derived. It was found that, if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. It was also found that the difference in performance between the suboptimum multi-stage soft decision maximum likelihood decoding of a modulation code and the single-stage optimum decoding of the overall code is very small: only a fraction of a dB loss in SNR at a block error probability of 10^-6. Multi-stage decoding of multi-level modulation codes thus offers a way to achieve the best of three worlds: bandwidth efficiency, coding gain, and decoding complexity.

  18. Large enhancement of second harmonic generation from transition-metal dichalcogenide monolayer on grating near bound states in the continuum.

    PubMed

    Wang, Tiecheng; Zhang, Shihao

    2018-01-08

    Second harmonic generation from a two-layer structure in which a transition-metal dichalcogenide monolayer is placed on a one-dimensional grating has been studied. This grating supports bound states in the continuum, which have no leakage despite lying within the continuum of radiation modes; based on the critical field enhancement near these bound states in the continuum, we can enhance the second harmonic generation from the transition-metal dichalcogenide monolayer by more than four orders of magnitude. To complete this calculation, the scattering matrix theory has been extended to include the nonlinear effect and the scattering matrix of a two-dimensional material with nonlinear terms. Furthermore, two methods to observe the bound states in the continuum are considered: one is tuning the thickness of the grating and the other is changing the incident angle of the electromagnetic wave. We have also discussed various modulations of the second harmonic generation enhancement obtained by adjusting the azimuthal angle of the transition-metal dichalcogenide monolayer.

  19. Remote and Local Influences in Forecasting Pacific SST: a Linear Inverse Model and a Multimodel Ensemble Study

    NASA Astrophysics Data System (ADS)

    Faggiani Dias, D.; Subramanian, A. C.; Zanna, L.; Miller, A. J.

    2017-12-01

    Sea surface temperature (SST) in the Pacific sector is well known to vary on time scales from seasonal to decadal, and the ability to predict these SST fluctuations has many societal and economic benefits. Therefore, we use a suite of statistical linear inverse models (LIMs) to understand the remote and local SST variability that influences SST predictions over the North Pacific region and to further improve our understanding of how the long observed SST record can help better guide multi-model ensemble forecasts. Observed monthly SST anomalies in the Pacific sector (between 15°S and 60°N) are used to construct different regional LIMs for seasonal to decadal prediction. The forecast skills of the LIMs are compared to those from two operational forecast systems in the North American Multi-Model Ensemble (NMME), revealing that the LIM has better skill in the Northeastern Pacific than the NMME models. The LIM is also found to have forecast skill for SST in the Tropical Pacific comparable to the NMME models. This skill, however, is highly dependent on the initialization month, with forecasts initialized during the summer having better skill than those initialized during the winter. The forecast skill of the LIM is also influenced by the verification period utilized to make the predictions, likely due to the changing character of El Niño in the 20th century. The North Pacific seems to be a source of predictability for the Tropics on seasonal to interannual time scales, while the Tropics act to worsen the skill of the forecast in the North Pacific. The data were also bandpassed into seasonal, interannual and decadal time scales to identify the relationships between time scales using the structure of the propagator matrix. For the decadal component, this coupling occurs the other way around: the Tropics seem to be a source of predictability for the Extratropics, but the Extratropics do not improve the predictability for the Tropics. These results indicate the importance of temporal scale interactions in improving predictability on decadal timescales. Hence, we show that LIMs are useful not only as benchmarks for estimates of statistical skill, but also to isolate contributions to the forecast skills from different timescales, spatial scales or even model components.
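
    The core of a LIM of this kind is a propagator estimated from lagged covariances of the anomaly field, which is then applied to the current state to make forecasts. A minimal sketch (synthetic two-variable data standing in for the SST principal components actually used in such studies) might look like:

```python
import numpy as np

def fit_lim_propagator(X, lag):
    """Fit a linear inverse model propagator G(tau) from an anomaly series.

    X   : array (time, space) of anomalies (e.g., leading PCs of SST)
    lag : training lag tau in time steps
    G(tau) = C(tau) @ C(0)^{-1}, so that x(t + tau) ~ G(tau) @ x(t).
    """
    X0, Xlag = X[:-lag], X[lag:]
    C0 = X0.T @ X0 / len(X0)     # zero-lag covariance
    Ct = Xlag.T @ X0 / len(X0)   # lag-tau covariance
    return Ct @ np.linalg.inv(C0)

def lim_forecast(G, x0):
    """One-lag-ahead LIM forecast from the current anomaly state x0."""
    return G @ x0

# Synthetic demo: a damped two-variable system observed with noise
rng = np.random.default_rng(1)
A = np.array([[0.9, 0.1], [-0.2, 0.8]])
x, series = np.zeros(2), []
for _ in range(2000):
    x = A @ x + rng.normal(scale=0.5, size=2)
    series.append(x.copy())
G = fit_lim_propagator(np.array(series), lag=1)
print(np.round(G, 2))   # should be close to A
```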

  20. Development of multi-dimensional body image scale for malaysian female adolescents

    PubMed Central

    Taib, Mohd Nasir Mohd; Shariff, Zalilah Mohd; Khor, Geok Lin

    2008-01-01

    The present study was conducted to develop a Multi-dimensional Body Image Scale for Malaysian female adolescents. Data were collected from 328 female adolescents at a secondary school in Kuantan district, state of Pahang, Malaysia, using a self-administered questionnaire and anthropometric measurements. The self-administered questionnaire comprised multiple measures of body image, the Eating Attitude Test (EAT-26; Garner & Garfinkel, 1979) and the Rosenberg Self-esteem Inventory (Rosenberg, 1965). The 152 items from the selected multiple measures of body image were examined through factor analysis and for internal consistency. Correlations between the Multi-dimensional Body Image Scale and body mass index (BMI), risk of eating disorders and self-esteem were assessed for construct validity. A seven-factor model of a 62-item Multi-dimensional Body Image Scale for Malaysian female adolescents with construct validity and good internal consistency was developed. The scale encompasses 1) preoccupation with thinness and dieting behavior, 2) appearance and body satisfaction, 3) body importance, 4) muscle-increasing behavior, 5) extreme dieting behavior, 6) appearance importance, and 7) perception of size and shape dimensions. In addition, a multi-dimensional body image composite score was proposed to screen for negative body image risk in female adolescents. The results showed that body image was correlated with BMI, risk of eating disorders and self-esteem in female adolescents. In short, the present study supports a multi-dimensional concept of body image and provides new insight into its multi-dimensionality in Malaysian female adolescents, with preliminary validity and reliability of the scale. The Multi-dimensional Body Image Scale can be used to identify female adolescents who are potentially at risk of developing body image disturbance, for use in future intervention programs. PMID:20126371

  1. Development of multi-dimensional body image scale for malaysian female adolescents.

    PubMed

    Chin, Yit Siew; Taib, Mohd Nasir Mohd; Shariff, Zalilah Mohd; Khor, Geok Lin

    2008-01-01

    The present study was conducted to develop a Multi-dimensional Body Image Scale for Malaysian female adolescents. Data were collected from 328 female adolescents at a secondary school in Kuantan district, state of Pahang, Malaysia, using a self-administered questionnaire and anthropometric measurements. The self-administered questionnaire comprised multiple measures of body image, the Eating Attitude Test (EAT-26; Garner & Garfinkel, 1979) and the Rosenberg Self-esteem Inventory (Rosenberg, 1965). The 152 items from the selected multiple measures of body image were examined through factor analysis and for internal consistency. Correlations between the Multi-dimensional Body Image Scale and body mass index (BMI), risk of eating disorders and self-esteem were assessed for construct validity. A seven-factor model of a 62-item Multi-dimensional Body Image Scale for Malaysian female adolescents with construct validity and good internal consistency was developed. The scale encompasses 1) preoccupation with thinness and dieting behavior, 2) appearance and body satisfaction, 3) body importance, 4) muscle-increasing behavior, 5) extreme dieting behavior, 6) appearance importance, and 7) perception of size and shape dimensions. In addition, a multi-dimensional body image composite score was proposed to screen for negative body image risk in female adolescents. The results showed that body image was correlated with BMI, risk of eating disorders and self-esteem in female adolescents. In short, the present study supports a multi-dimensional concept of body image and provides new insight into its multi-dimensionality in Malaysian female adolescents, with preliminary validity and reliability of the scale. The Multi-dimensional Body Image Scale can be used to identify female adolescents who are potentially at risk of developing body image disturbance, for use in future intervention programs.

  2. Spider-web inspired multi-resolution graphene tactile sensor.

    PubMed

    Liu, Lu; Huang, Yu; Li, Fengyu; Ma, Ying; Li, Wenbo; Su, Meng; Qian, Xin; Ren, Wanjie; Tang, Kanglai; Song, Yanlin

    2018-05-08

    Multi-dimensional accurate response and smooth signal transmission are critical challenges in the advancement of multi-resolution recognition and complex environment analysis. Inspired by the structure-activity relationship between the discrepant microstructures of the spiral and radial threads in a spider web, we designed and printed graphene with porous and densely-packed microstructures and integrated them into a multi-resolution graphene tactile sensor. The three-dimensional (3D) porous graphene structure provides multi-dimensional deformation responses. The laminar densely-packed graphene structure contributes excellent conductivity with flexible stability. The spider-web inspired printed pattern enables orientational and locational kinesis tracking. The multi-structure construction with a homo-graphene material can integrate discrepant electronic properties with remarkable flexibility, which will attract enormous attention for electronic skin, wearable devices and human-machine interactions.

  3. Towards an integrated forecasting system for fisheries on habitat-bound stocks

    NASA Astrophysics Data System (ADS)

    Christensen, A.; Butenschön, M.; Gürkan, Z.; Allen, I. J.

    2013-03-01

    First results of a coupled modelling and forecasting system for fisheries on habitat-bound stocks are presented. The system currently consists of three mathematically fundamentally different model subsystems coupled offline: POLCOMS, providing the physical environment, implemented in the domain of the north-west European shelf; the SPAM model, which describes sandeel stocks in the North Sea; and the third component, the SLAM model, which connects POLCOMS and SPAM by computing the physical-biological interaction. Our main experience from coupling the model subsystems is that well-defined and generic model interfaces are very important for a successful and extendable coupled model framework. The integrated approach, simulating ecosystem dynamics from physics to fish, allows for analysis of the pathways in the ecosystem to investigate the propagation of changes in the ocean climate and to quantify the impacts on the higher trophic level, in this case the sandeel population, demonstrated here on the basis of hindcast data. The coupled forecasting system is tested for some typical scientific questions arising in spatial fish stock management and marine spatial planning, including determination of local and basin-scale maximum sustainable yield, stock connectivity and source/sink structure. Our simulations indicate that sandeel stocks are currently exploited close to the maximum sustainable yield, even though periodic overfishing seems to have occurred, but large uncertainty is associated with determining stock maximum sustainable yield due to stock-inherent dynamics and climatic variability. Our statistical ensemble simulations indicate that the predictive horizon set by climate interannual variability is 2-6 yr, after which only an asymptotic probability distribution of stock properties, like biomass, is predictable.

  4. Symmetry-breaking instability of quadratic soliton bound states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delque, Michaeel; Departement d'Optique P.M. Duffieux, Institut FEMTO-ST, Universite de Franche-Comte, CNRS UMR 6174, F-25030 Besancon; Fanjoux, Gil

    We study both numerically and experimentally two-dimensional soliton bound states in quadratic media and demonstrate their symmetry-breaking instability. The experiment is performed in a potassium titanyl phosphate crystal in a type-II configuration. The bound state is generated by the copropagation of the antisymmetric fundamental beam locked in phase with the symmetrical second harmonic one. Experimental results are in good agreement with numerical simulations of the nonlinear wave equations.

  5. Evaluating the spatio-temporal performance of sky imager based solar irradiance analysis and forecasts

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Kalisch, J.; Lorenz, E.; Heinemann, D.

    2015-10-01

    Clouds are the dominant source of variability in surface solar radiation and of uncertainty in its prediction. However, the increasing share of solar energy in the world-wide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a shortest-term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A two-month dataset with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over 10 km by 12 km is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min ahead with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky imager based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depend strongly on the predominant cloud conditions. Especially convective type clouds lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by a single pyranometer used representatively for the whole area at distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher for these conditions compared to overcast or clear sky situations, which cause low GHI variability that is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.
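
    The forecast skill referred to above is measured relative to persistence; a positive score means the sky imager forecast beats simply carrying the last observation forward. A minimal sketch of such a skill score (illustrative numbers only, not the campaign data) is:

```python
import numpy as np

def rmse(forecast, observed):
    return np.sqrt(np.mean((np.asarray(forecast) - np.asarray(observed)) ** 2))

def forecast_skill(forecast, persistence, observed):
    """Skill relative to persistence: 1 - RMSE_forecast / RMSE_persistence.

    Positive values mean the forecast beats persistence; the variability
    threshold discussed above corresponds to conditions where this score
    crosses zero.
    """
    return 1.0 - rmse(forecast, observed) / rmse(persistence, observed)

# Toy GHI series (W/m^2): illustrative numbers only
obs         = [620, 450, 700, 380, 540]
sky_imager  = [600, 480, 660, 420, 520]
persistence = [650, 620, 450, 700, 380]   # previous observation carried forward
print(round(forecast_skill(sky_imager, persistence, obs), 2))
```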

  6. Using NMME in Region-Specific Operational Seasonal Climate Forecasts

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Bolinger, R. A.; Fry, L. M.; Kompoltowicz, K.

    2015-12-01

    The National Oceanic and Atmospheric Administration's Climate Prediction Center (NOAA/CPC) provides access to a suite of real-time monthly climate forecasts that comprise the North American Multi-Model Ensemble (NMME) in an attempt to meet increasing demands for monthly to seasonal climate prediction. While the graphical map forecasts of the NMME are informative, there is a need to provide decision-makers with probabilistic forecasts specific to their region of interest. Here, we demonstrate the potential application of the NMME to address regional climate projection needs by developing new forecasts of temperature and precipitation for the North American Great Lakes, the largest system of lakes on Earth. Regional operational water budget forecasts rely on these outlooks to initiate monthly forecasts not only of the water budget, but of monthly lake water levels as well. More specifically, we present an alternative for improving existing operational protocols that currently involve a relatively time-consuming and subjective procedure based on interpreting the maps of the NMME. In addition, all forecasts are currently presented in the NMME in a probabilistic format, with equal weighting given to each member of the ensemble. In our new evolution of this product, we provide historical context for the forecasts by superimposing them (in an on-line graphical user interface) on the historical range of observations. Implementation of this new tool has already led to noticeable advantages in regional water budget forecasting, and it has the potential to be transferred to other regional decision-making authorities as well.

  7. Implementation of a Multi-Robot Coverage Algorithm on a Two-Dimensional, Grid-Based Environment

    DTIC Science & Technology

    2017-06-01

    two planar laser range finders with a 180-degree field of view, color camera, vision beacons, and wireless communicator. In their system, the robots... Master's thesis... path planning coverage algorithm for a multi-robot system in a two-dimensional, grid-based environment. We assess the applicability of a topology...

  8. A simple new filter for nonlinear high-dimensional data assimilation

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Kirchgessner, Paul; Ahrens, Bodo

    2015-04-01

    The ensemble Kalman filter (EnKF) and its deterministic variants, mostly square root filters such as the ensemble transform Kalman filter (ETKF), represent a popular alternative to variational data assimilation schemes and are applied in a wide range of operational and research activities. Their forecast step employs an ensemble integration that fully respects the nonlinear nature of the analyzed system. In the analysis step, they implicitly assume the prior state and observation errors to be Gaussian. Consequently, in nonlinear systems, the analysis mean and covariance are biased, and these filters remain suboptimal. In contrast, the fully nonlinear, non-Gaussian particle filter (PF) only relies on Bayes' theorem, which guarantees an exact asymptotic behavior, but because of the so-called curse of dimensionality it is exposed to weight collapse. This work shows how to obtain a new analysis ensemble whose mean and covariance exactly match the Bayesian estimates. This is achieved by a deterministic matrix square root transformation of the forecast ensemble, and subsequently a suitable random rotation that significantly contributes to filter stability while preserving the required second-order statistics. The forecast step remains as in the ETKF. The proposed algorithm, which is fairly easy to implement and computationally efficient, is referred to as the nonlinear ensemble transform filter (NETF). The properties and performance of the proposed algorithm are investigated via a set of Lorenz experiments. They indicate that such a filter formulation can increase the analysis quality, even for relatively small ensemble sizes, compared to other ensemble filters in nonlinear, non-Gaussian scenarios. Furthermore, localization enhances the potential applicability of this PF-inspired scheme in larger-dimensional systems. Finally, the novel algorithm is coupled to a large-scale ocean general circulation model. The NETF is stable, behaves reasonably and shows good performance with a realistic ensemble size. The results confirm that, in principle, it can be applied as successfully and as simply as the ETKF in high-dimensional problems without further modifications of the algorithm, even though it is only based on the particle weights. This proves that the suggested method constitutes a useful filter for nonlinear, high-dimensional data assimilation, and is able to overcome the curse of dimensionality even in deterministic systems.
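
    As a concrete reference for the transform step described above, the sketch below builds an analysis ensemble whose mean and sample covariance match the weighted (Bayesian) particle estimates. The random rotation and localization discussed in the abstract are omitted for brevity, and the observation setup is a made-up toy example rather than the published configuration.

```python
import numpy as np

def netf_analysis(ens, weights):
    """NETF-style analysis (sketch): transform the forecast ensemble so that
    its mean and sample covariance match the weighted particle estimates.

    ens     : array (n_state, m) of forecast ensemble members (columns)
    weights : length-m particle weights, w_i proportional to p(y | x_i)
    Note: the random rotation used in the full NETF is omitted here.
    """
    n, m = ens.shape
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean_a = ens @ w                                  # weighted analysis mean
    M = np.diag(w) - np.outer(w, w)                   # weighted covariance factor
    vals, vecs = np.linalg.eigh(M)
    Msqrt = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    pert_a = np.sqrt(m - 1) * ens @ Msqrt             # zero-mean analysis perturbations
    return mean_a[:, None] + pert_a

# Toy example: 3-dimensional state, 5 members, Gaussian likelihood weights
rng = np.random.default_rng(0)
ens = rng.normal(size=(3, 5))
obs, H, sigma = 0.5, np.array([1.0, 0.0, 0.0]), 0.3
w = np.exp(-0.5 * ((H @ ens - obs) / sigma) ** 2)
analysis = netf_analysis(ens, w)
print(analysis.shape)
```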

  9. Bound Magnon Dominance of the Magnetic Susceptibility of the One-Dimensional Heisenberg Spin One-Half Ferromagnet Cyclohexylammonium Trichlorocuprate

    NASA Astrophysics Data System (ADS)

    Haines, Donald Noble

    1987-09-01

    This study is an experimental investigation of the differential magnetic susceptibility of the spin one-half, one-dimensional, Ising-Heisenberg ferromagnet (S = 1/2, 1d, HIF). Recent theoretical work predicts the existence of magnon bound states in this model system, and that these bound spin wave states dominate its thermodynamic properties. Further, the theories indicate that classical linearized spin wave theory fails completely in such systems, and may also be intrinsically incorrect in certain higher dimensional systems. The purpose of this research is to confirm the existence of bound magnons in the S = 1/2, 1d, HIF for the nearly Heisenberg case, and to demonstrate the dominance of the bound states over the spin wave states in determining thermodynamic behavior. A preliminary numerical study was performed to determine the ranges of magnetic field and temperature at which bound magnons might be expected to make a significant contribution to the magnetic susceptibility and specific heat of the S = 1/2, 1d, HIF. It was found that bound magnons dominate at low and high fields, and spin waves dominate at intermediate fields. For anisotropies less than 2%, bound magnons dominate the low temperature regime for all fields. To test the theoretical predictions, cyclohexylammonium trichlorocuprate(II) (CHAC) was chosen as a model S = 1/2, 1d, HIF compound for experimental study. The differential susceptibility of a powder sample of CHAC was measured as a function of temperature in fields of 0, 1, 2, and 3 T. The temperature range for these studies was 4.2 K to 40 K. Susceptibility measurements were performed using an ac mutual inductance bridge which employs a SQUID (Superconducting Quantum Interference Device) as a null detector. The design, calibration, and operation of this instrument are described. Data from the experiments compare favorably with the theoretical predictions, confirming the existence of bound magnons in the nearly Heisenberg S = 1/2, 1d, HIF. Further, the experimental results clearly show that bound magnons are the dominant excitation determining the susceptibility for all fields and temperatures studied. Spin wave theory cannot describe the data for any values of the adjustable parameters.

  10. State estimation and prediction using clustered particle filters.

    PubMed

    Lee, Yoonsang; Majda, Andrew J

    2016-12-20

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors.

  11. State estimation and prediction using clustered particle filters

    PubMed Central

    Lee, Yoonsang; Majda, Andrew J.

    2016-01-01

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors. PMID:27930332

  12. Enhancing Nursing Staffing Forecasting With Safety Stock Over Lead Time Modeling.

    PubMed

    McNair, Douglas S

    2015-01-01

    In balancing competing priorities, it is essential that nursing staffing provide enough nurses to safely and effectively care for the patients. Mathematical models to predict optimal "safety stocks" have been routine in supply chain management for many years but have up to now not been applied in nursing workforce management. There are various aspects that exhibit similarities between the 2 disciplines, such as an evolving demand forecast according to acuity and the fact that provisioning "stock" to meet demand in a future period has nonzero variable lead time. Under assumptions about the forecasts (eg, the demand process is well fit as an autoregressive process) and about the labor supply process (≥1 shifts' lead time), we show that safety stock over lead time for such systems is effectively equivalent to the corresponding well-studied problem for systems with stationary demand bounds and base stock policies. Hence, we can apply existing models from supply chain analytics to find the optimal safety levels of nurse staffing. We use a case study with real data to demonstrate that there are significant benefits from the inclusion of the forecast process when determining the optimal safety stocks.
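
    The "safety stock over lead time" idea follows the standard supply-chain formulation, in which safety stock covers demand variability over a variable lead time at a target service level. A minimal sketch using the classic formula (the numbers are hypothetical and not taken from the case study) is:

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(service_level, mean_demand, sd_demand, mean_lead, sd_lead):
    """Classic safety-stock-over-lead-time formula from supply chain analytics.

    safety stock = z * sqrt(L * sigma_D^2 + D^2 * sigma_L^2),
    where z is the normal quantile for the target service level, L the mean
    lead time (in periods) and D the mean demand per period.
    """
    z = NormalDist().inv_cdf(service_level)
    return z * sqrt(mean_lead * sd_demand**2 + mean_demand**2 * sd_lead**2)

# Hypothetical nurse-staffing reading: demand is the acuity-adjusted nurse
# requirement per shift, lead time is shifts needed to bring a nurse on duty.
print(round(safety_stock(service_level=0.95,
                         mean_demand=12.0, sd_demand=2.5,
                         mean_lead=1.5, sd_lead=0.5), 1))
```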

  13. Apo And Calcium-Bound Crystal Structures of Alpha-11 Giardin, An Unusual Annexin From 'Giardia Lamblia'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathuri, P.; Nguyen, E.T.; Svard, S.G.

    2007-07-12

    Alpha-11 giardin is a member of the multi-gene alpha-giardin family in the intestinal protozoan, Giardia lamblia. This gene family shares an ancestry with the annexin super family, whose common characteristic is calcium-dependent binding to membranes that contain acidic phospholipids. Several alpha giardins are highly expressed during parasite-induced diarrhea in humans. Despite being a member of a large family of proteins, little is known about the function and cellular localization of alpha-11 giardin, although giardins are often associated with the cytoskeleton. It has been shown that Giardia exhibits high levels of alpha-11 giardin mRNA transcript throughout its life cycle; however, constitutive over-expression of this protein is lethal to the parasite. Determining the three-dimensional structure of an alpha-giardin is essential to identifying functional domains shared in the alpha-giardin family. Here we report the crystal structures of the apo and Ca2+-bound forms of alpha-11 giardin, the first alpha giardin to be characterized structurally. Crystals of apo and Ca2+-bound alpha-11 giardin diffracted to 1.1 angstroms and 2.93 angstroms, respectively. The crystal structure of selenium-substituted apo alpha-11 giardin reveals a planar array of four tandem repeats of predominantly alpha-helical domains, reminiscent of previously determined annexin structures, making this the highest-resolution structure of an annexin to date. The apo alpha-11 giardin structure also reveals a hydrophobic core formed between repeats I/IV and II/III, a region typically hydrophilic in other annexins. Surprisingly, the Ca2+-bound structure contains only a single calcium ion, located in the DE loop of repeat I and coordinated differently from the two types of calcium sites observed in previous annexin structures. The apo and Ca2+-bound alpha-11 giardin structures assume overall similar conformations; however, Ca2+-bound alpha-11 giardin crystallized in a lower-symmetry space group with four molecules in the asymmetric unit. Vesicle-binding studies suggest that alpha-11 giardin, unlike most other annexins, does not bind to vesicles composed of acidic phospholipids in a calcium-dependent manner.

  14. Application of high-resolution, two-dimensional 1H and 13C nuclear magnetic resonance techniques to the characterization of lipid oxidation products in autoxidized linoleoyl/linolenoylglycerols.

    PubMed

    Silwood, C J; Grootveld, M

    1999-07-01

    Subjection of polyunsaturated fatty acid (PUFA)-rich culinary oils to standard frying episodes generates a range of lipid oxidation products (LOP), including saturated and alpha,beta-unsaturated aldehydes which arise from the thermally induced fragmentation of conjugated hydroperoxydiene precursors. Since such LOP are damaging to human health, we have employed high-resolution, two-dimensional 1H-1H relayed coherence transfer, 1H-1H total correlation, 1H-13C heteronuclear multiple quantum correlation, and 1H-1H J-resolved nuclear magnetic resonance (NMR) spectroscopic techniques to further elucidate the molecular structures of these components present in (i) a model linoleoylglycerol compound (1,3-dilinolein) allowed to autoxidize at ambient temperature and (ii) PUFA-rich culinary oils subjected to repeated frying episodes. The above techniques readily facilitate the resolution of selected vinylic and aldehydic resonances of LOP which appear as complex overlapping patterns in conventional one-dimensional spectra, particularly when employed in combination with solvent-induced spectral shift modifications. Hence, much useful multi-component information regarding the identity and/or classification of glycerol-bound conjugated hydroperoxydiene and hydroxydiene adducts, and saturated and alpha,beta-unsaturated aldehydes, present in autoxidized PUFA matrices is provided by these NMR methods. Such molecular information is of much value to researchers investigating the deleterious health effects of LOP available in the diet.

  15. Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA

    NASA Astrophysics Data System (ADS)

    Messer, O. E. B.; Harris, J. A.; Hix, W. R.; Lentz, E. J.; Bruenn, S. W.; Mezzacappa, A.

    2018-04-01

    Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirokov, M. E.

    We analyse two possible definitions of the squashed entanglement in an infinite-dimensional bipartite system: a direct translation of the finite-dimensional definition and its universal extension. It is shown that both definitions produce the same lower semicontinuous entanglement measure possessing all basic properties of the squashed entanglement on the set of states having at least one finite marginal entropy. It is also shown that the second definition gives an adequate lower semicontinuous extension of this measure to all states of the infinite-dimensional bipartite system. A general condition relating continuity of the squashed entanglement to continuity of the quantum mutual information is proved and its corollaries are considered. A continuity bound for the squashed entanglement under the energy constraint on one subsystem is obtained by using the tight continuity bound for quantum conditional mutual information (proved in the Appendix by using Winter's technique). It is shown that the same continuity bound is valid for the entanglement of formation. As a result, the asymptotic continuity of both entanglement measures under the energy constraint on one subsystem is proved.

  17. Strongly bound excitons in anatase TiO2 single crystals and nanoparticles

    DOE PAGES

    Baldini, E.; Chiodo, L.; Dominguez, A.; ...

    2017-04-13

    Anatase TiO2 is among the most studied materials for light-energy conversion applications, but the nature of its fundamental charge excitations is still unknown. Yet it is crucial to establish whether light absorption creates uncorrelated electron-hole pairs or bound excitons and, in the latter case, to determine their character. Here, by combining steady-state angle-resolved photoemission spectroscopy and spectroscopic ellipsometry with state-of-the-art ab initio calculations, we demonstrate that the direct optical gap of single crystals is dominated by a strongly bound exciton rising over the continuum of indirect interband transitions. This exciton possesses an intermediate character between the Wannier-Mott and Frenkel regimes and displays a peculiar two-dimensional wavefunction in the three-dimensional lattice. The nature of the higher-energy excitations is also identified. Furthermore, the universal validity of our results is confirmed up to room temperature by observing the same elementary excitations in defect-rich samples (doped single crystals and nanoparticles) via ultrafast two-dimensional deep-ultraviolet spectroscopy.

  18. Wronskian Method for Bound States

    ERIC Educational Resources Information Center

    Fernandez, Francisco M.

    2011-01-01

    We propose a simple and straightforward method based on Wronskians for the calculation of bound-state energies and wavefunctions of one-dimensional quantum-mechanical problems. We explicitly discuss the asymptotic behaviour of the wavefunction and show that the allowed energies make the divergent part vanish. As illustrative examples we consider…

  19. Real-time demonstration and evaluation of over-the-loop short to medium-range ensemble streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, E.; Newman, A. J.; Nijssen, B.; Clark, M. P.; Gangopadhyay, S.; Arnold, J. R.

    2015-12-01

    The US National Weather Service River Forecast Centers are beginning to operationalize short-range to medium-range ensemble predictions that have been in development for several years. This practice contrasts with the traditional single-value forecast practice at these lead times not only because the ensemble forecasts offer a basis for quantifying forecast uncertainty, but also because the use of ensembles requires a greater degree of automation in the forecast workflow than is currently used. For instance, individual ensemble member forcings cannot practically be manually adjusted, a step not uncommon within the current single-value paradigm, thus the forecaster is required to adopt a more 'over-the-loop' role than before. The relative lack of experience among operational forecasters and forecast users (e.g., water managers) in the US with over-the-loop approaches motivates the creation of a real-time demonstration and evaluation platform for exploring the potential of over-the-loop workflows to produce usable ensemble short-to-medium range forecasts, as well as long-range predictions. We describe the development and early results of such an effort by a collaboration between NCAR and two water agencies, the US Army Corps of Engineers and the US Bureau of Reclamation. Focusing on small to medium sized headwater basins around the US, and using multi-decade series of ensemble streamflow hindcasts, we also describe early results assessing the skill of daily-updating, over-the-loop forecasts driven by a set of ensemble atmospheric outputs from the NCEP GEFS for lead times from 1 to 15 days.
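
    The abstract does not state which verification metric is used for the hindcast assessment; a common choice for ensemble streamflow forecasts is the continuous ranked probability skill score against a climatological ensemble, sketched below with illustrative numbers.

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS of one ensemble forecast against a single observation,
    using the standard estimator: mean|x_i - y| - 0.5 * mean|x_i - x_j|."""
    x = np.asarray(members, dtype=float)
    return np.mean(np.abs(x - obs)) - 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))

def crpss(forecast_crps, reference_crps):
    """Skill score relative to a reference (e.g., a climatological ensemble)."""
    return 1.0 - forecast_crps / reference_crps

# Toy streamflow example (m^3/s), illustrative only
fcst = [42, 55, 48, 60, 51]      # ensemble members for one basin/lead time
clim = [20, 35, 80, 110, 65]     # climatological ensemble for that date
obs = 50.0
print(round(crpss(crps_ensemble(fcst, obs), crps_ensemble(clim, obs)), 2))
```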

  20. Near-real-time simulation and internet-based delivery of forecast-flood inundation maps using two-dimensional hydraulic modeling--A pilot study for the Snoqualmie River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Fulford, Janice M.; Voss, Frank D.

    2002-01-01

    A system of numerical hydraulic modeling, geographic information system processing, and Internet map serving, supported by new data sources and application automation, was developed that generates inundation maps for forecast floods in near real time and makes them available through the Internet. Forecasts for flooding are generated by the National Weather Service (NWS) River Forecast Center (RFC); these forecasts are retrieved automatically by the system and prepared for input to a hydraulic model. The model, TrimR2D, is a new, robust, two-dimensional model capable of simulating wide varieties of discharge hydrographs and relatively long stream reaches. TrimR2D was calibrated for a 28-kilometer reach of the Snoqualmie River in Washington State, and is used to estimate flood extent, depth, arrival time, and peak time for the RFC forecast. The results of the model are processed automatically by a Geographic Information System (GIS) into maps of flood extent, depth, and arrival and peak times. These maps subsequently are processed into formats acceptable by an Internet map server (IMS). The IMS application is a user-friendly interface to access the maps over the Internet; it allows users to select what information they wish to see presented and allows the authors to define scale-dependent availability of map layers and their symbology (appearance of map features). For example, the IMS presents a background of a digital USGS 1:100,000-scale quadrangle at smaller scales, and automatically switches to an ortho-rectified aerial photograph (a digital photograph that has camera angle and tilt distortions removed) at larger scales so viewers can see ground features that help them identify their area of interest more effectively. For the user, the option exists to select either background at any scale. Similar options are provided for both the map creator and the viewer for the various flood maps. This combination of a robust model, emerging IMS software, and application interface programming should allow the technology developed in the pilot study to be applied to other river systems where NWS forecasts are provided routinely.

  1. Assessing a 3D smoothed seismicity model of induced earthquakes

    NASA Astrophysics Data System (ADS)

    Zechar, Jeremy; Király, Eszter; Gischig, Valentin; Wiemer, Stefan

    2016-04-01

    As more energy exploration and extraction efforts cause earthquakes, it becomes increasingly important to control induced seismicity. Risk management schemes must be improved and should ultimately be based on near-real-time forecasting systems. With this goal in mind, we propose a test bench to evaluate models of induced seismicity based on metrics developed by the CSEP community. To illustrate the test bench, we consider a model based on the so-called seismogenic index and a rate decay; to produce three-dimensional forecasts, we smooth past earthquakes in space and time. We explore four variants of this model using the Basel 2006 and Soultz-sous-Forêts 2004 datasets to make short-term forecasts, test their consistency, and rank the model variants. Our results suggest that such a smoothed seismicity model is useful for forecasting induced seismicity within three days, and giving more weight to recent events improves forecast performance. Moreover, the location of the largest induced earthquake is forecast well by this model. Despite the good spatial performance, the model does not estimate the seismicity rate well: it frequently overestimates during stimulation and during the early post-stimulation period, and it systematically underestimates around shut-in. In this presentation, we also describe a robust estimate of information gain, a modification that can also benefit forecast experiments involving tectonic earthquakes.
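
    A minimal sketch of the general space-time smoothing idea, not the calibrated model tested in the abstract: past events are smoothed with a three-dimensional Gaussian kernel and down-weighted by an exponential time decay so that recent events contribute more to the forecast rate; all parameter values and array names are illustrative assumptions.

    ```python
    import numpy as np

    def smoothed_rate(cat_xyz, cat_t, grid_xyz, t_now, sigma=100.0, tau=1.0):
        """Forecast a seismicity rate density on a 3-D grid by smoothing past events.

        cat_xyz : (N, 3) event coordinates in metres
        cat_t   : (N,)  event times in days
        grid_xyz: (M, 3) forecast grid nodes in metres
        sigma   : spatial Gaussian bandwidth (m); tau : e-folding decay time (days)
        """
        rate = np.zeros(len(grid_xyz))
        for xyz, t in zip(cat_xyz, cat_t):
            w_time = np.exp(-(t_now - t) / tau)          # recent events weigh more
            d2 = np.sum((grid_xyz - xyz) ** 2, axis=1)   # squared distances to each node
            kernel = np.exp(-d2 / (2.0 * sigma ** 2))
            kernel /= (2.0 * np.pi * sigma ** 2) ** 1.5  # 3-D Gaussian normalisation
            rate += w_time * kernel
        return rate

    # toy usage: 50 injection-induced events clustered around a borehole at the origin
    rng = np.random.default_rng(0)
    events = rng.normal(0.0, 150.0, size=(50, 3))
    times = np.sort(rng.uniform(0.0, 10.0, size=50))
    grid = np.stack(np.meshgrid(np.linspace(-500, 500, 11),
                                np.linspace(-500, 500, 11),
                                np.linspace(-500, 500, 11)), -1).reshape(-1, 3)
    print(smoothed_rate(events, times, grid, t_now=10.0).max())
    ```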

  2. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate settings. A support vector machine (SVM) is a machine learning system that can provide optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
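
    A minimal, hedged sketch of the approach using scikit-learn rather than the authors' system: an RBF-kernel support vector classifier with class re-weighting is trained on a synthetic, strongly imbalanced data set standing in for the substorm or market features.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Synthetic stand-in for a high-dimensional feature set with a few percent
    # of "extreme" events; real inputs would be solar-wind or market variables.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(2000, 20))
    y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=2000) > 3.0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)

    # class_weight="balanced" re-weights the scarce positive class, a common way
    # to keep an SVM usable when events are rare and training data are limited.
    clf = SVC(kernel="rbf", C=10.0, gamma="scale", class_weight="balanced")
    clf.fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te), digits=3))
    ```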

  3. Evaluation of radar and automatic weather station data assimilation for a heavy rainfall event in southern China

    NASA Astrophysics Data System (ADS)

    Hou, Tuanjie; Kong, Fanyou; Chen, Xunlai; Lei, Hengchi; Hu, Zhaoxia

    2015-07-01

    To improve the accuracy of short-term (0-12 h) forecasts of severe weather in southern China, a real-time storm-scale forecasting system, the Hourly Assimilation and Prediction System (HAPS), has been implemented in Shenzhen, China. The forecasting system is characterized by combining the Advanced Research Weather Research and Forecasting (WRF-ARW) model and the Advanced Regional Prediction System (ARPS) three-dimensional variational data assimilation (3DVAR) package. It is capable of assimilating radar reflectivity and radial velocity data from multiple Doppler radars as well as surface automatic weather station (AWS) data. Experiments are designed to evaluate the impacts of data assimilation on quantitative precipitation forecasting (QPF) by studying a heavy rainfall event in southern China. The forecasts from these experiments are verified against radar, surface, and precipitation observations. Comparison of echo structure and accumulated precipitation suggests that radar data assimilation is useful in improving the short-term forecast by capturing the location and orientation of the band of accumulated rainfall. The assimilation of radar data improves the short-term precipitation forecast skill by up to 9 hours by producing more convection. The slight but generally positive impact that surface AWS data has on the forecast of near-surface variables can last up to 6-9 hours. The assimilation of AWS observations alone has some benefit for improving the Fractions Skill Score (FSS) and bias scores; when radar data are assimilated, the additional AWS data may increase the degree of rainfall overprediction.
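
    The variational step at the heart of such a system can be sketched in its simplest form: a toy 3DVAR analysis that minimizes the usual background-plus-observation cost function with a linear observation operator. This is not the ARPS 3DVAR package; the dimensions, covariances and observation operator below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Minimal 3DVAR: x_a = argmin (x-xb)' B^-1 (x-xb) + (y-Hx)' R^-1 (y-Hx)
    n, m = 10, 4                       # state and observation sizes (toy)
    rng = np.random.default_rng(1)
    xb = rng.normal(size=n)            # background state
    B = 0.5 * np.eye(n)                # background error covariance
    H = rng.normal(size=(m, n))        # linear observation operator
    R = 0.1 * np.eye(m)                # observation error covariance
    y = H @ (xb + rng.normal(scale=0.3, size=n))  # synthetic observations

    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

    def cost(x):
        db, dy = x - xb, y - H @ x
        return db @ Binv @ db + dy @ Rinv @ dy

    def grad(x):
        return 2.0 * Binv @ (x - xb) - 2.0 * H.T @ Rinv @ (y - H @ x)

    xa = minimize(cost, xb, jac=grad, method="L-BFGS-B").x
    print("background misfit:", np.linalg.norm(y - H @ xb))
    print("analysis   misfit:", np.linalg.norm(y - H @ xa))
    ```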

  4. Oregon Washington Coastal Ocean Forecast System: Real-time Modeling and Data Assimilation

    NASA Astrophysics Data System (ADS)

    Erofeeva, S.; Kurapov, A. L.; Pasmans, I.

    2016-02-01

    Three-day forecasts of ocean currents, temperature and salinity along the Oregon and Washington coasts are produced daily by a numerical ROMS-based ocean circulation model. NAM is used to derive atmospheric forcing for the model. Freshwater discharge from the Columbia River, the Fraser River, and small rivers in Puget Sound is included. The forecast is constrained by open boundary conditions derived from the global Navy HYCOM model and, once every 3 days, by assimilation of recent data, including HF radar surface currents, sea surface temperature from the GOES satellite, and SSH from several satellite altimetry missions. Four-dimensional variational data assimilation is implemented in 3-day time windows using the tangent linear and adjoint codes developed at OSU. The system is semi-autonomous: all the data, including the NAM and HYCOM fields, are automatically updated, and the daily operational forecast is automatically initiated. The pre-assimilation data quality control and post-assimilation forecast quality control require the operator's involvement. The daily forecast and 60 days of hindcast fields are available to the public via OPeNDAP. As part of the system, model validation plots against various satellites and SEAGLIDER data are also automatically updated and available on the web (http://ingria.coas.oregonstate.edu/rtdavow/). Lessons learned in this pilot real-time coastal ocean forecasting project help develop and test metrics for forecast skill assessment for the West Coast Operational Forecast System (WCOFS), currently in the testing and development phase at the National Oceanic and Atmospheric Administration (NOAA).

  5. HEPS4Power - Extended-range Hydrometeorological Ensemble Predictions for Improved Hydropower Operations and Revenues

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spririg, Christoph; Jordan, Fred; Zappa, Massimiliano

    2015-04-01

    In recent years large progress has been achieved in the operational prediction of floods and hydrological droughts with up to ten days of lead time. Both the public and the private sectors currently use probabilistic runoff forecasts to monitor water resources and take action when critical conditions are expected. The use of extended-range predictions with lead times exceeding 10 days is not yet established. The hydropower sector in particular might benefit greatly from using hydrometeorological forecasts for the next 15 to 60 days to optimize the operations and revenues of its watersheds, dams, intakes, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) target boosting research related to energy issues in Switzerland. The objective of HEPS4POWER is to demonstrate that operational extended-range hydrometeorological forecasts have the potential to become very valuable tools for fine-tuning the production of energy from hydropower systems. The project team covers a specific system-oriented value chain, starting from the collection and forecasting of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL), and ending with experience in data presentation and power-production forecasts for end-users (e-dric.ch). The first task of HEPS4POWER will be the downscaling and post-processing of ensemble extended-range meteorological forecasts (EPS). The goal is to provide well-tailored probabilistic forecasts that are statistically reliable and localized at the catchment or even station level. The hydrology-related task will consist of feeding the post-processed meteorological forecasts into a HEPS using a multi-model approach, implementing models of different complexity. For the hydrological ensemble predictions as well, post-processing techniques need to be tested in order to improve the quality of the forecasts against observed discharge. The analysis should be specifically oriented toward maximising hydroelectricity production; thus, verification metrics should include economic measures such as cost-loss approaches. The final step will include the transfer of the HEPS system to several hydropower systems, the connection with energy market prices, and the development of probabilistic multi-reservoir production and management optimization guidelines. The baseline model chain yielding three-day forecasts, established for a hydropower system in southern Switzerland, will be presented alongside the work plan for achieving seasonal ensemble predictions.

  6. An analytical solution of groundwater level fluctuation in a U-shaped leaky coastal aquifer

    NASA Astrophysics Data System (ADS)

    Huang, Fu-Kuo; Chuang, Mo-Hsiung; Wang, Shu-chuan

    2017-04-01

    Tide-induced groundwater level fluctuations in coastal aquifers have attracted much attention in past years, especially for issues associated with the impact of the coastline shape, multi-layered leaky aquifer systems, and aquifer anisotropy. In this study, a homogeneous but anisotropic multi-layered leaky aquifer system with a U-shaped coastline is considered, where the subsurface system consists of an unconfined aquifer, a leaky confined aquifer, and a semi-permeable layer between them. The analytical solution of the model obtained herein may be considered an extension of two earlier solutions; one was developed by Huang et al. (Huang et al. Tide-induced groundwater level fluctuation in a U-shaped coastal aquifer, J. Hydrol. 2015; 530: 291-305) for two-dimensional interacting tidal waves bounded by three water-land boundaries, while the other was by Li and Jiao (Li and Jiao. Tidal groundwater level fluctuations in L-shaped leaky coastal aquifer system, J. Hydrol. 2002; 268: 234-243) for two-dimensional interacting tidal waves in a leaky coastal aquifer system adjacent to a cross-shore estuary. In this research, the effects of leakage and storativity of the semi-permeable layer on the amplitude and phase shift of the tidal head fluctuation, and the influence of aquifer anisotropy, are all examined for the U-shaped leaky coastal aquifer. Some existing solutions in the literature can be regarded as special cases of the present solution if the aquifer system is isotropic and non-leaky. The results will be beneficial to coastal development and the management of water resources.
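
    For orientation, the classical one-dimensional, single-aquifer (Jacob/Ferris-type) limit of such tidal-propagation solutions is easy to evaluate; the sketch below is that simplified limit, not the U-shaped leaky-aquifer solution derived in the paper, and all parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def tidal_head(x, t, A=1.0, period_h=12.42, S=1e-4, T=100.0):
        """Classical 1-D confined-aquifer response to a sinusoidal tide (Jacob/Ferris).

        x : distance inland from the coastline (m)
        t : time (hours)
        A : tidal amplitude at the coast (m); period_h : tidal period (hours)
        S : storativity (-); T : transmissivity (m^2/h)
        """
        omega = 2.0 * np.pi / period_h
        a = np.sqrt(omega * S / (2.0 * T))       # damping / phase-lag coefficient (1/m)
        return A * np.exp(-a * x) * np.cos(omega * t - a * x)

    # amplitude damping and phase lag 200 m inland, plus sample head values
    x, S, T, period = 200.0, 1e-4, 100.0, 12.42
    omega = 2.0 * np.pi / period
    a = np.sqrt(omega * S / (2.0 * T))
    print("amplitude ratio:", np.exp(-a * x))
    print("phase lag (h):  ", a * x / omega)
    print("head at t = 0, 6, 12 h:", tidal_head(x, np.array([0.0, 6.0, 12.0])))
    ```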

  7. Dynamic State Estimation of Terrestrial and Solar Plasmas

    NASA Astrophysics Data System (ADS)

    Kamalabadi, Farzad

    A pervasive problem in virtually all branches of space science is the estimation of multi-dimensional state parameters of a dynamical system from a collection of indirect, often incomplete, and imprecise measurements. Subsequent scientific inference is predicated on rigorous analysis, interpretation, and understanding of physical observations and on the reliability of the associated quantitative statistical bounds and performance characteristics of the algorithms used. In this work, we focus on these dynamic state estimation problems and illustrate their importance in the context of two timely activities in space remote sensing. First, we discuss the estimation of multi-dimensional ionospheric state parameters from UV spectral imaging measurements anticipated to be acquired by the recently selected NASA Heliophysics mission, the Ionospheric Connection Explorer (ICON). Next, we illustrate that similar state-space formulations provide the means for the estimation of 3D, time-dependent densities and temperatures in the solar corona from a series of white-light and EUV measurements. We demonstrate that, while a general framework for the stochastic formulation of the state estimation problem is suited for systematic inference of the parameters of a hidden Markov process, several challenges must be addressed in the assimilation of an increasing volume and diversity of space observations. These challenges are: (1) the computational tractability when faced with voluminous and multimodal data, (2) the inherent limitations of the underlying models which assume, often incorrectly, linear dynamics and Gaussian noise, and (3) the unavailability or inaccuracy of transition probabilities and noise statistics. We argue that pursuing answers to these questions necessitates cross-disciplinary research that enables progress toward systematically reconciling observational and theoretical understanding of the space environment.

  8. Finite-time consensus for multi-agent systems with globally bounded convergence time under directed communication graphs

    NASA Astrophysics Data System (ADS)

    Fu, Junjie; Wang, Jin-zhi

    2017-09-01

    In this paper, we study finite-time consensus problems with globally bounded convergence time, also known as fixed-time consensus problems, for multi-agent systems subject to directed communication graphs. Two new distributed control strategies are proposed such that leaderless and leader-follower consensus are achieved with convergence time independent of the initial conditions of the agents. Fixed-time formation generation and formation tracking problems are also solved as generalizations. Simulation examples are provided to demonstrate the performance of the new controllers.
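
    As a hedged illustration of the fixed-time property rather than the specific controllers proposed in the paper (and over an undirected ring instead of a directed graph), the sketch below simulates a standard signed-power consensus protocol for single integrators; the final disagreement is driven to near zero within a similar time regardless of how large the initial states are.

    ```python
    import numpy as np

    def sig(z, r):
        """Signed power: sign(z) * |z|**r, applied element-wise."""
        return np.sign(z) * np.abs(z) ** r

    # Simple ring graph of 5 single-integrator agents (adjacency matrix A)
    A = np.array([[0, 1, 0, 0, 1],
                  [1, 0, 1, 0, 0],
                  [0, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [1, 0, 0, 1, 0]], dtype=float)

    alpha, beta, p, q = 2.0, 2.0, 0.5, 1.5   # gains and exponents (0 < p < 1 < q)
    dt, steps = 1e-3, 8000

    def simulate(x0):
        x = x0.astype(float).copy()
        for _ in range(steps):
            e = A * (x[None, :] - x[:, None])          # e[i, j] = a_ij * (x_j - x_i)
            s = e.sum(axis=1)                          # consensus error seen by agent i
            u = alpha * sig(s, p) + beta * sig(s, q)   # signed-power fixed-time protocol
            x += dt * u
        return x

    # The spread of the final states shrinks to near zero (down to the integration
    # tolerance) regardless of how large the initial states are.
    for scale in (1.0, 100.0):
        xf = simulate(scale * np.arange(5))
        print(f"initial scale {scale:6.1f} -> final spread {xf.max() - xf.min():.2e}")
    ```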

  9. SHOULD ONE USE THE RAY-BY-RAY APPROXIMATION IN CORE-COLLAPSE SUPERNOVA SIMULATIONS?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C., E-mail: burrows@astro.princeton.edu, E-mail: askinner@astro.princeton.edu, E-mail: jdolence@lanl.gov

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  10. Use of High-resolution WRF Simulations to Forecast Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, William E.; LaCasse, K.; Goodman, S. J.

    2006-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of recent forecast models such as WRF, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Six-hour simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data yields the most realistic simulations. An array of subjective and objective statistical metrics is employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.
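
    A minimal sketch of the proxy itself, assuming generic gridded arrays rather than actual WRF output variables: the product of vertical velocity and graupel mixing ratio is interpolated column by column onto the -15 C isotherm.

    ```python
    import numpy as np

    def graupel_flux_at_minus15(w, qg, temp_c):
        """Proxy lightning-threat field: vertical graupel flux at the -15 C level.

        w, qg, temp_c : (nz, ny, nx) arrays of vertical velocity (m/s), graupel
                        mixing ratio (kg/kg) and temperature (deg C) on model
                        levels ordered bottom to top.
        Returns a (ny, nx) field of w*qg linearly interpolated to T = -15 C.
        """
        nz, ny, nx = w.shape
        flux = w * qg
        out = np.zeros((ny, nx))
        for j in range(ny):
            for i in range(nx):
                col_t = temp_c[:, j, i]
                # np.interp needs an increasing abscissa, so flip the decreasing profile
                out[j, i] = np.interp(-15.0, col_t[::-1], flux[::-1, j, i])
        return out

    # Toy profile: 30 levels over a 5x5 grid with a 6.5 K/km lapse rate from 20 C
    nz, ny, nx = 30, 5, 5
    z = np.linspace(0.0, 12000.0, nz)
    temp = (20.0 - 6.5e-3 * z)[:, None, None] * np.ones((1, ny, nx))
    w = 5.0 * np.ones((nz, ny, nx))
    qg = (1e-3 * np.exp(-((z - 6000.0) / 2000.0) ** 2))[:, None, None] * np.ones((1, ny, nx))
    print(graupel_flux_at_minus15(w, qg, temp).mean())
    ```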

  11. High-Resolution WRF Forecasts of Lightning Threat

    NASA Technical Reports Server (NTRS)

    Goodman, S. J.; McCaul, E. W., Jr.; LaCasse, K.

    2007-01-01

    Tropical Rainfall Measuring Mission (TRMM) lightning and precipitation observations have confirmed the existence of a robust relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of the Weather Research and Forecast (WRF) model, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Initial experiments using 6-h simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. The WRF has been initialized on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data. An array of subjective and objective statistical metrics is employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.

  12. Aerosol analysis and forecast in the European Centre for Medium-Range Weather Forecasts Integrated Forecast System: 2. Data assimilation

    NASA Astrophysics Data System (ADS)

    Benedetti, A.; Morcrette, J.-J.; Boucher, O.; Dethof, A.; Engelen, R. J.; Fisher, M.; Flentje, H.; Huneeus, N.; Jones, L.; Kaiser, J. W.; Kinne, S.; Mangold, A.; Razinger, M.; Simmons, A. J.; Suttie, M.

    2009-07-01

    This study presents the new aerosol assimilation system, developed at the European Centre for Medium-Range Weather Forecasts, for the Global and regional Earth-system Monitoring using Satellite and in-situ data (GEMS) project. The aerosol modeling and analysis system is fully integrated in the operational four-dimensional assimilation apparatus. Its purpose is to produce aerosol forecasts and reanalyses of aerosol fields using optical depth data from satellite sensors. This paper is the second of a series which describes the GEMS aerosol effort. It focuses on the theoretical architecture and practical implementation of the aerosol assimilation system. It also provides a discussion of the background errors and observations errors for the aerosol fields, and presents a subset of results from the 2-year reanalysis which has been run for 2003 and 2004 using data from the Moderate Resolution Imaging Spectroradiometer on the Aqua and Terra satellites. Independent data sets are used to show that despite some compromises that have been made for feasibility reasons in regards to the choice of control variable and error characteristics, the analysis is very skillful in drawing to the observations and in improving the forecasts of aerosol optical depth.

  13. Forecasting the quality of water-suppressed 1 H MR spectra based on a single-shot water scan.

    PubMed

    Kyathanahally, Sreenath P; Kreis, Roland

    2017-08-01

    To investigate whether an initial non-water-suppressed acquisition that provides information about the signal-to-noise ratio (SNR) and linewidth is enough to forecast the maximally achievable final spectral quality and thus inform the operator whether the foreseen number of averages and achieved field homogeneity is adequate. A large range of spectra with varying SNR and linewidth was simulated and fitted with popular fitting programs to determine the dependence of fitting errors on linewidth and SNR. A tool to forecast variance based on a single acquisition was developed and its performance evaluated on simulated and in vivo data obtained at 3 Tesla from various brain regions and acquisition settings. A strong correlation to real uncertainties in estimated metabolite contents was found for the forecast values and the Cramer-Rao lower bounds obtained from the water-suppressed spectra. It appears to be possible to forecast the best-case errors associated with specific metabolites to be found in model fits of water-suppressed spectra based on a single water scan. Thus, nonspecialist operators will be able to judge ahead of time whether the planned acquisition can possibly be of sufficient quality to answer the targeted clinical question or whether it needs more averages or improved shimming. Magn Reson Med 78:441-451, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  14. Left Limb of North Pole of the Sun, March 20, 2007

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting.

  15. Right Limb of the South Pole of the Sun, March 18, 2007

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting.

  16. Closer View of the Equatorial Region of the Sun, March 24, 2007

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting.

  17. Studies for the 3-Dimensional Structure, Composition, and Dynamic of Io's Atmosphere

    NASA Technical Reports Server (NTRS)

    Smyth, William H.

    2001-01-01

    Research work is discussed for the following: (1) the exploration of new H and Cl chemistry in Io's atmosphere using the already developed two-dimensional multi-species hydrodynamic model of Wong and Smyth; and (2) the development of a new three-dimensional multi-species hydrodynamic model for Io's atmosphere.

  18. Lower and upper bounds for entanglement of Rényi-α entropy.

    PubMed

    Song, Wei; Chen, Lin; Cao, Zhuo-Liang

    2016-12-23

    Entanglement Rényi-α entropy is an entanglement measure. It reduces to the standard entanglement of formation when α tends to 1. We derive analytical lower and upper bounds for the entanglement Rényi-α entropy of arbitrary dimensional bipartite quantum systems. We also demonstrate the application of our bound to some concrete examples. Moreover, we establish the relation between entanglement Rényi-α entropy and some other entanglement measures.
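
    For a bipartite pure state, the entanglement Rényi-α entropy follows directly from the eigenvalues of the reduced density matrix; the sketch below evaluates it for a two-qubit example (the paper's analytical bounds address the general mixed-state case and are not reproduced here).

    ```python
    import numpy as np

    def renyi_entanglement(psi, dA, dB, alpha):
        """Entanglement Renyi-alpha entropy of a bipartite pure state.

        psi : state vector of length dA*dB; alpha > 0, alpha != 1.
        Computed from the eigenvalues of the reduced density matrix of subsystem A:
            E_alpha = log2(sum_i lambda_i**alpha) / (1 - alpha)
        """
        psi = np.asarray(psi, dtype=complex).reshape(dA, dB)
        rho_A = psi @ psi.conj().T                 # reduced density matrix (trace over B)
        lam = np.linalg.eigvalsh(rho_A)
        lam = lam[lam > 1e-12]                     # drop numerical zeros
        return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)

    # Two-qubit example: |psi> = cos(t)|00> + sin(t)|11>
    t = np.pi / 6
    psi = np.zeros(4)
    psi[0], psi[3] = np.cos(t), np.sin(t)
    for a in (0.5, 2.0, 0.999):                    # alpha -> 1 approaches the von Neumann value
        print(f"alpha={a}: E={renyi_entanglement(psi, 2, 2, a):.4f}")
    ```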

  19. Data on Support Vector Machines (SVM) model to forecast photovoltaic power.

    PubMed

    Malvoni, M; De Giorgi, M G; Congedo, P M

    2016-12-01

    The data concern the photovoltaic (PV) power forecasted by a hybrid model that considers weather variations and applies a technique to reduce the input data size, as presented in the paper entitled "Photovoltaic forecast based on hybrid pca-lssvm using dimensionality reducted data" (M. Malvoni, M.G. De Giorgi, P.M. Congedo, 2015) [1]. The quadratic Renyi entropy criterion, together with principal component analysis (PCA), is applied to Least Squares Support Vector Machines (LS-SVM) to predict the PV power in the day-ahead time frame. The data shared here represent the results of the proposed approach. Hourly PV power predictions for 1, 3, 6, 12 and 24 hours ahead and for different data reduction sizes are provided in the Supplementary material.
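
    A hedged sketch of the overall pipeline using scikit-learn: because LS-SVM is not available there, an ordinary ε-SVR stands in for it after a PCA dimensionality-reduction step; the synthetic features and all hyperparameters are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # Synthetic hourly dataset: lagged PV power plus weather covariates as inputs,
    # PV power one day ahead as the target (placeholders for the real measurements).
    rng = np.random.default_rng(7)
    n_hours, n_features = 3000, 12
    X = rng.normal(size=(n_hours, n_features))
    y = 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * rng.normal(size=n_hours)

    # Chronological split: train on the first 80%, test on the remainder.
    split = int(0.8 * n_hours)
    X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

    # PCA reduces the input dimensionality before the kernel regressor,
    # mirroring the data-reduction step described in the record.
    model = make_pipeline(StandardScaler(), PCA(n_components=5),
                          SVR(kernel="rbf", C=10.0, epsilon=0.05))
    model.fit(X_tr, y_tr)
    print("test R^2:", model.score(X_te, y_te))
    ```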

  20. Teaching a Machine to Feel Postoperative Pain: Combining High-Dimensional Clinical Data with Machine Learning Algorithms to Forecast Acute Postoperative Pain

    PubMed Central

    Tighe, Patrick J.; Harle, Christopher A.; Hurley, Robert W.; Aytug, Haldun; Boezaart, Andre P.; Fillingim, Roger B.

    2015-01-01

    Background Given their ability to process highly dimensional datasets with hundreds of variables, machine learning algorithms may offer one solution to the vexing challenge of predicting postoperative pain. Methods Here, we report on the application of machine learning algorithms to predict postoperative pain outcomes in a retrospective cohort of 8071 surgical patients using 796 clinical variables. Five algorithms were compared in terms of their ability to forecast moderate to severe postoperative pain: Least Absolute Shrinkage and Selection Operator (LASSO), gradient-boosted decision tree, support vector machine, neural network, and k-nearest neighbor, with logistic regression included for baseline comparison. Results In forecasting moderate to severe postoperative pain for postoperative day (POD) 1, the LASSO algorithm, using all 796 variables, had the highest accuracy with an area under the receiver-operating curve (ROC) of 0.704. Next, the gradient-boosted decision tree had an ROC of 0.665 and the k-nearest neighbor algorithm had an ROC of 0.643. For POD 3, the LASSO algorithm, using all variables, again had the highest accuracy, with an ROC of 0.727. Logistic regression had a lower ROC of 0.5 for predicting pain outcomes on POD 1 and 3. Conclusions Machine learning algorithms, when combined with complex and heterogeneous data from electronic medical record systems, can forecast acute postoperative pain outcomes with accuracies similar to methods that rely only on variables specifically collected for pain outcome prediction. PMID:26031220
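
    A minimal sketch of the LASSO-style step on synthetic data, assuming an L1-penalized logistic regression as the classification analogue (the clinical features, preprocessing and tuning of the paper are not reproduced): the penalty drives most of the many coefficients to zero and the AUC is evaluated on held-out data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score
    from sklearn.preprocessing import StandardScaler

    # Placeholder stand-in for a wide clinical feature matrix (hundreds of columns);
    # the L1 penalty performs the LASSO-style selection of informative variables.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(2000, 300))
    logit = X[:, :5] @ np.array([1.0, -0.8, 0.6, 0.5, -0.4]) - 0.5
    y = (rng.uniform(size=2000) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    scaler = StandardScaler().fit(X_tr)

    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1, max_iter=1000)
    clf.fit(scaler.transform(X_tr), y_tr)

    probs = clf.predict_proba(scaler.transform(X_te))[:, 1]
    print("AUC:", round(roc_auc_score(y_te, probs), 3))
    print("non-zero coefficients:", int(np.sum(clf.coef_ != 0)))
    ```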

  1. Turbulence-driven Coronal Heating and Improvements to Empirical Forecasting of the Solar Wind

    NASA Astrophysics Data System (ADS)

    Woolsey, Lauren N.; Cranmer, Steven R.

    2014-06-01

    Forecasting models of the solar wind often rely on simple parameterizations of the magnetic field that ignore the effects of the full magnetic field geometry. In this paper, we present the results of two solar wind prediction models that consider the full magnetic field profile and include the effects of Alfvén waves on coronal heating and wind acceleration. The one-dimensional magnetohydrodynamic code ZEPHYR self-consistently finds solar wind solutions without the need for empirical heating functions. Another one-dimensional code, introduced in this paper (The Efficient Modified-Parker-Equation-Solving Tool, TEMPEST), can act as a smaller, stand-alone code for use in forecasting pipelines. TEMPEST is written in Python and will become a publicly available library of functions that is easy to adapt and expand. We discuss important relations between the magnetic field profile and properties of the solar wind that can be used to independently validate prediction models. ZEPHYR provides the foundation and calibration for TEMPEST, and ultimately we will use these models to predict observations and explain space weather created by the bulk solar wind. We are able to reproduce with both models the general anticorrelation seen in comparisons of observed wind speed at 1 AU and the flux tube expansion factor. There is significantly less spread when comparing the results of the two models with each other than when comparing ZEPHYR with a traditional flux tube expansion relation. We suggest that the new code, TEMPEST, will become a valuable tool in the forecasting of space weather.

  2. Customization of Discriminant Function Analysis for Prediction of Solar Flares

    DTIC Science & Technology

    2005-03-01

    lives such as telecommunication, commercial airlines, electrical power, wireless services, and terrestrial weather tracking and forecasting...the 1800's can wreak havoc on today's power, fuel, and telecommunication lines and finds its origin in solar activity. Enormous amounts of solar...inducing potential differences across large areas of the surface. Earth-bound power, fuel, and telecommunication lines grounded to the Earth provide an

  3. Multi-Model Validation in the Chesapeake Bay Region During Frontier Sentinel 2010

    DTIC Science & Technology

    2012-09-28

    which a 72-hr forecast took approximately 1 hr. Identical runs were performed on the DoD Supercomputing Resources Center (DSRC) host "DaVinci" at the...performance Navy DSRC host DaVinci. Products of water level and horizontal current maps as well as station time series, identical to those produced by the...forecast meteorological fields. The NCOM simulations were run daily on 128 CPUs at the Navy DSRC host DaVinci and required approximately 5 hrs of wall

  4. Multi-Model Ensemble Approaches to Data Assimilation Using the 4D-Local Ensemble Transform Kalman Filter

    DTIC Science & Technology

    2013-09-30

    accuracy of the analysis. Root mean square difference (RMSD) is much smaller for RIP than for either Simple Ocean Data Assimilation or Incremental...Analysis Update globally for temperature as well as salinity. Regionally the same results were found, with only one exception in which the salinity RMSD...short-term forecast using a numerical model with the observations taken within the forecast time window. The resulting state is the so-called "analysis

  5. Genetic algorithm for neural networks optimization

    NASA Astrophysics Data System (ADS)

    Setyawati, Bina R.; Creese, Robert C.; Sahirman, Sidharta

    2004-11-01

    This paper examines the forecasting performance of multi-layer feed-forward neural networks in modeling a particular foreign exchange rate, i.e. Japanese Yen/US Dollar. The effects of two learning methods, Back Propagation and Genetic Algorithm, in which the neural network topology and other parameters were fixed, were investigated. The early results indicate that the application of this hybrid system seems to be well suited for the forecasting of foreign exchange rates. The Neural Networks and Genetic Algorithm were programmed using MATLAB.

  6. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koulouri, Alexandra, E-mail: koulouri@uni-muenster.de; Department of Electrical and Electronic Engineering, Imperial College London, Exhibition Road, London SW7 2BT; Brookes, Mike

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field. - Highlights: • Vector tomography is used to reconstruct electric fields generated by dipole sources. • Inverse solutions are based on longitudinal and transverse line integral measurements. • Transverse line integral measurements are used as a sparsity constraint. • Numerical procedure to approximate the line integrals is described in detail. • Patterns of the studied electric fields are correctly estimated.

  7. Space Weather Models and Their Validation and Verification at the CCMC

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  8. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE PAGES

    Buitrago, Jaime; Asfour, Shihab

    2017-01-01

    Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then, the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike the existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load, making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, which can result in large savings by avoiding commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.
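
    A hedged sketch of the open-loop-training / closed-loop-forecasting idea using a small scikit-learn multilayer perceptron instead of the authors' NARX network; the synthetic load series, the exogenous temperature input and all settings are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic hourly "load" with a daily cycle plus noise (placeholder data).
    rng = np.random.default_rng(11)
    hours = np.arange(24 * 200)
    load = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
    temp = 15 + 8 * np.sin(2 * np.pi * (hours - 6) / 24)         # exogenous input

    lags = 24
    X, y = [], []
    for t in range(lags, hours.size):
        X.append(np.r_[load[t - lags:t], temp[t]])               # past loads + exogenous temp
        y.append(load[t])
    X, y = np.array(X), np.array(y)

    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))
    net.fit(X[:-48], y[:-48])                                    # open loop: train on actual load

    # Closed loop: feed the network's own predictions back in as lagged inputs.
    history = list(load[-48 - lags:-48])
    forecast = []
    for t in range(hours.size - 48, hours.size - 24):            # forecast 24 h ahead
        x = np.r_[history[-lags:], temp[t]]
        yhat = net.predict(x.reshape(1, -1))[0]
        forecast.append(yhat)
        history.append(yhat)

    mape = 100 * np.mean(np.abs((load[-48:-24] - forecast) / load[-48:-24]))
    print(f"closed-loop MAPE over 24 h: {mape:.2f}%")
    ```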

  10. Sub-seasonal Predictability of Heavy Precipitation Events: Implication for Real-time Flood Management in Iran

    NASA Astrophysics Data System (ADS)

    Najafi, H.; Shahbazi, A.; Zohrabi, N.; Robertson, A. W.; Mofidi, A.; Massah Bavani, A. R.

    2016-12-01

    Each year, a number of high-impact weather events occur worldwide. Since any level of predictability at sub-seasonal to seasonal timescales is highly beneficial to society, international efforts are now in progress to promote reliable ensemble prediction systems for monthly forecasts within the WWRP/WCRP Subseasonal to Seasonal (S2S) project and the North American Multi-Model Ensemble (NMME). For water resources managers facing extreme events, reliable forecasts of high-impact weather events can not only prevent catastrophic losses caused by floods but also contribute to benefits gained from hydropower generation and water markets. The aim of this paper is to analyze the predictability of recent severe weather events over Iran. Two recent heavy precipitation events are considered as illustrations to examine whether S2S forecasts can be used for developing flood alert systems, especially where large cascades of dams are in operation. Both events caused major damage to cities and infrastructure. The first severe precipitation event occurred in early November 2015, when heavy precipitation (more than 50 mm) fell within 2 days. More recently, up to 300 mm of precipitation was observed within less than a week in April 2016, causing a flash flood. Over some stations, the observed precipitation was even more than the total annual mean precipitation. To analyze the predictive capability, ensemble forecasts from several operational centers, including the European Centre for Medium-Range Weather Forecasts (ECMWF) system, the Climate Forecast System Version 2 (CFSv2) and the China Meteorological Administration (CMA), are evaluated. Significant changes in precipitation anomalies were likely to be predicted days in advance. The next step will be to conduct a thorough analysis based on comparing multi-model outputs over the full hindcast dataset, toward developing real-time high-impact weather prediction systems.

  11. Development and application of an atmospheric-hydrologic-hydraulic flood forecasting model driven by TIGGE ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Bao, Hongjun; Zhao, Linna

    2012-02-01

    A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecast. The Xinanjiang model used for hydrological rainfall-runoff modeling and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to flood forecasting of the 2007 flood season as a test case. The test case is chosen over the upper reaches of the Huaihe River above Lutaizi station with flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with the fixed split ratio of the main channel discharge. The flood flow inside the flood retarding area is calculated as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts. The results demonstrate satisfactory flood forecasting with clear signals of probability of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting of flood inundation, comparable with that driven by raingauge observations.
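
    The Muskingum routing step mentioned for the flood diversion area can be sketched compactly; the reach parameters and the inflow hydrograph below are illustrative assumptions, not values from the Huaihe River application.

    ```python
    import numpy as np

    def muskingum_route(inflow, K=12.0, x=0.2, dt=6.0):
        """Route an inflow hydrograph through a reach with the Muskingum method.

        inflow : inflow discharges (m^3/s) at interval dt
        K      : storage time constant (h); x : weighting factor (0 <= x <= 0.5)
        dt     : time step (h)
        """
        denom = 2.0 * K * (1.0 - x) + dt
        c0 = (dt - 2.0 * K * x) / denom
        c1 = (dt + 2.0 * K * x) / denom
        c2 = (2.0 * K * (1.0 - x) - dt) / denom        # c0 + c1 + c2 = 1
        outflow = np.empty_like(np.asarray(inflow, dtype=float))
        outflow[0] = inflow[0]
        for t in range(1, len(inflow)):
            outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
        return outflow

    # Toy flood wave: the routed peak is attenuated and delayed relative to the inflow peak
    t = np.arange(0, 120, 6.0)
    inflow = 50 + 450 * np.exp(-((t - 36) / 18.0) ** 2)
    out = muskingum_route(inflow)
    print("inflow peak:", round(inflow.max(), 1), " routed peak:", round(out.max(), 1))
    ```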

  12. Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messer, Bronson; Harris, James Austin; Hix, William Raphael

    Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.

  13. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.
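
    One of the listed techniques, empirical quantile mapping, can be sketched compactly: each raw ensemble member is assigned its non-exceedance probability in a forecast climatology and then mapped onto the corresponding quantile of the observed climatology. This is a generic illustration, not the SHARP implementation, and the synthetic flows are assumptions.

    ```python
    import numpy as np

    def quantile_map(fcst, fcst_clim, obs_clim):
        """Map forecast values through the empirical forecast CDF onto observed quantiles.

        fcst      : ensemble members to correct
        fcst_clim : historical forecasts defining the forecast climatology (1-D)
        obs_clim  : historical observations defining the target climatology (1-D)
        """
        fcst_sorted = np.sort(fcst_clim)
        obs_sorted = np.sort(obs_clim)
        # non-exceedance probability of each forecast value in the forecast climatology
        p = np.searchsorted(fcst_sorted, fcst, side="right") / fcst_sorted.size
        p = np.clip(p, 1e-6, 1 - 1e-6)
        # read the same probability off the observed climatology
        return np.quantile(obs_sorted, p)

    # toy example: hindcasts biased high and over-dispersed relative to observations
    rng = np.random.default_rng(5)
    obs_clim = rng.gamma(shape=2.0, scale=50.0, size=1000)       # "observed" flows (m^3/s)
    fcst_clim = 1.3 * obs_clim + rng.normal(0, 20, size=1000)    # biased hindcasts
    raw_ens = 1.3 * rng.gamma(2.0, 50.0, size=30) + rng.normal(0, 20, 30)
    print("raw ensemble mean:      ", round(raw_ens.mean(), 1))
    print("corrected ensemble mean:", round(quantile_map(raw_ens, fcst_clim, obs_clim).mean(), 1))
    print("observed climatology mean:", round(obs_clim.mean(), 1))
    ```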

  14. Hourly runoff forecasting for flood risk management: Application of various computational intelligence models

    NASA Astrophysics Data System (ADS)

    Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.

    2015-10-01

    Reliable river flow forecasts play a key role in flood risk mitigation. Among different approaches to river flow forecasting, data-driven approaches have become increasingly popular in recent years due to their minimal information requirements and ability to simulate the nonlinear and non-stationary characteristics of hydrological processes. In this study, attempts are made to apply four different types of data-driven approaches, namely traditional artificial neural networks (ANN), adaptive neuro-fuzzy inference systems (ANFIS), wavelet neural networks (WNN), and hybrid ANFIS with multi-resolution analysis using wavelets (WNF). The developed models were applied for real-time flood forecasting at Casino station on the Richmond River, Australia, which is highly prone to flooding. Hourly rainfall and runoff data were used to drive the models, which were used for forecasting with 1, 6, 12, 24, 36 and 48 h lead times. The performance of the models was further improved by adding upstream river flow data (Wiangaree station) as another effective input. All models perform satisfactorily up to a 12 h lead time. However, the hybrid wavelet-based models significantly outperform the ANFIS and ANN models in longer lead-time forecasting. The results confirm the robustness of the proposed structure of the hybrid models for real-time runoff forecasting in the study area.

  15. Simultaneous Estimation of Model State Variables and Observation and Forecast Biases Using a Two-Stage Hybrid Kalman Filter

    NASA Technical Reports Server (NTRS)

    Pauwels, V. R. N.; DeLannoy, G. J. M.; Hendricks Franssen, H.-J.; Vereecken, H.

    2013-01-01

    In this paper, we present a two-stage hybrid Kalman filter to estimate both observation and forecast bias in hydrologic models, in addition to state variables. The biases are estimated using the discrete Kalman filter, and the state variables using the ensemble Kalman filter. A key issue in this multi-component assimilation scheme is the exact partitioning of the difference between observation and forecasts into state, forecast bias and observation bias updates. Here, the error covariances of the forecast bias and the unbiased states are calculated as constant fractions of the biased state error covariance, and the observation bias error covariance is a function of the observation prediction error covariance. In a series of synthetic experiments, focusing on the assimilation of discharge into a rainfall-runoff model, it is shown that both static and dynamic observation and forecast biases can be successfully estimated. The results indicate a strong improvement in the estimation of the state variables and resulting discharge as opposed to the use of a bias-unaware ensemble Kalman filter. Furthermore, minimal code modification in existing data assimilation software is needed to implement the method. The results suggest that a better performance of data assimilation methods should be possible if both forecast and observation biases are taken into account.
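
    A minimal sketch of the underlying idea in its simplest form: a single Kalman filter whose state is augmented with a constant forecast-bias term, rather than the paper's two-stage hybrid of a discrete bias filter and an ensemble Kalman filter; the scalar dynamics and noise levels are illustrative assumptions.

    ```python
    import numpy as np

    # Augmented-state Kalman filter: jointly estimate a scalar storage state x and
    # an unknown, constant forecast bias b from unbiased discharge-like observations.
    rng = np.random.default_rng(2)
    n_steps, true_bias = 300, 1.5
    a_dyn, q_x, r_obs = 0.9, 0.05, 0.2

    F = np.array([[a_dyn, 1.0],            # filter model: x_k+1 = a*x_k + b_k + w
                  [0.0,   1.0]])           # bias treated as a (nearly) constant state
    H = np.array([[1.0, 0.0]])             # we observe the state only
    Q = np.diag([q_x, 1e-4])
    R = np.array([[r_obs]])

    x_true = 5.0
    z, P = np.array([5.0, 0.0]), np.eye(2) # initial bias guess is zero
    for _ in range(n_steps):
        # synthetic truth: the real system contains the bias the filter must learn
        x_true = a_dyn * x_true + true_bias + rng.normal(0, np.sqrt(q_x))
        y = x_true + rng.normal(0, np.sqrt(r_obs))
        # forecast step
        z = F @ z
        P = F @ P @ F.T + Q
        # update step
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        z = z + (K @ (np.array([y]) - H @ z)).ravel()
        P = (np.eye(2) - K @ H) @ P

    print(f"estimated bias: {z[1]:.3f} (true value {true_bias})")
    ```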

  16. Advances in air quality prediction with the use of integrated systems

    NASA Astrophysics Data System (ADS)

    Dragani, R.; Benedetti, A.; Engelen, R. J.; Peuch, V. H.

    2017-12-01

    Recent years have seen the rise of global operational atmospheric composition forecasting systems for several applications including climate monitoring, provision of boundary conditions for regional air quality forecasting, energy sector applications, to mention a few. Typically, global forecasts are provided in the medium-range up to five days ahead and are initialized with an analysis based on satellite data. In this work we present the latest advances in data assimilation using the ECMWF's 4D-Var system extended to atmospheric composition which is currently operational under the Copernicus Atmosphere Monitoring Service of the European Commission. The service is based on acquisition of all relevant data available in near-real-time, the processing of these datasets in the assimilation and the subsequent dissemination of global forecasts at ECMWF. The global forecasts are used by the CAMS regional models as boundary conditions for the European forecasts based on a multi-model ensemble. The global forecasts are also used to provide boundary conditions for other parts of the world (e.g., China) and are freely available to all interested entities. Some of the regional models also perform assimilation of satellite and ground-based observations. All products are assessed, validated and made publicly available on https://atmosphere.copernicus.eu/.

  17. Statistical and dynamical forecast of regional precipitation after mature phase of ENSO

    NASA Astrophysics Data System (ADS)

    Sohn, S.; Min, Y.; Lee, J.; Tam, C.; Ahn, J.

    2010-12-01

    While the seasonal predictability of general circulation models (GCMs) has improved, the current model atmosphere in the mid-latitudes does not respond correctly to external forcing such as tropical sea surface temperature (SST), particularly over the East Asian and western North Pacific summer monsoon regions. In addition, the time scale of the prediction scope is considerably limited, and model forecast skill is still very poor beyond two weeks. Although recent studies indicate that coupled-model-based multi-model ensemble (MME) forecasts show better performance, long-lead forecasts exceeding 9 months still show a dramatic decrease in seasonal predictability. This study aims at diagnosing the dynamical MME forecasts comprised of state-of-the-art 1-tier models as well as comparing them with statistical model forecasts, focusing on East Asian summer precipitation predictions after the mature phase of ENSO. The lagged impact of El Nino, as a major climate contributor, on the summer monsoon in model environments is also evaluated in the sense of conditional probabilities. To evaluate the probability forecast skills, the reliability (attributes) diagram and the relative operating characteristics, following the recommendations of the World Meteorological Organization (WMO) Standardized Verification System for Long-Range Forecasts, are used in this study. The results should shed light on the prediction skill of the dynamical models, and also of the statistical model, in forecasting East Asian summer monsoon rainfall at long lead times.

  18. Operational Monitoring and Forecasting in Regional Seas: the Aegean Sea example

    NASA Astrophysics Data System (ADS)

    Nittis, K.; Perivoliotis, L.; Zervakis, V.; Papadopoulos, A.; Tziavos, C.

    2003-04-01

    The increasing economic activities in the coastal zone and the associated pressure on the marine environment have raised interest in monitoring systems able to provide supporting information for its effective management and protection. Such an integrated monitoring, forecasting and information system has been developed over the past years in the Aegean Sea. Its main component is the POSEIDON network, which provides real-time data for meteorological and surface oceanographic parameters (waves, currents, hydrological and biochemical data) from 11 fixed oceanographic buoys. The numerical forecasting system is composed of an ETA atmospheric model, a WAM wave model and a POM hydrodynamic model that provide 72-hour forecasts every day. The system has been operational since May 2000 and its products are published through the Internet, while a subset is also available through cellular telephony. New types of observing platforms will become available in the near future through a number of EU-funded research projects. The Mediterranean Moored Multi-sensor Array (M3A), which was developed for the needs of the Mediterranean Forecasting System and was tested during 2000-2001, will be operational in 2004 during the MFSTEP project. The M3A system incorporates sensors for optical and chemical measurements (Oxygen, Turbidity, Chlorophyll-a, Nutrients and PAR) in the euphotic zone (0-100 m) together with sensors for physical parameters (Temperature, Salinity, Current speed and direction) in the 0-500 m layer. A Ferry-Box system will also operate during 2004 in the southern Aegean Sea, providing surface data on physical and bio-chemical properties. The ongoing modeling efforts include coupling with larger-scale circulation models of the Mediterranean, high-resolution downscaling to coastal areas of the Aegean Sea, and the development of multi-variate data assimilation methods.

  19. Improving seasonal forecasts of hydroclimatic variables through the state of multiple large-scale climate signals

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Giuliani, M.; Block, P. J.

    2017-12-01

    Increasingly uncertain hydrologic regimes combined with more frequent and intense extreme events are challenging water systems management worldwide, emphasizing the need for accurate medium- to long-term predictions to prompt timely anticipatory operations. Although modern forecasts are skillful at short lead times (from hours to days), predictability generally decreases at longer lead times. Global climate teleconnections, such as the El Niño Southern Oscillation (ENSO), may contribute to extending forecast lead times. However, the ENSO teleconnection is well defined in some locations, such as the Western USA and Australia, while there is no consensus on how it can be detected and used in other regions, particularly in Europe, Africa, and Asia. In this work, we generalize the Niño Index Phase Analysis (NIPA) framework by introducing the Multi Variate Niño Index Phase Analysis (MV-NIPA), which captures the state of multiple large-scale climate signals (i.e. ENSO, North Atlantic Oscillation, Pacific Decadal Oscillation, Atlantic Multi-decadal Oscillation, Indian Ocean Dipole) to forecast hydroclimatic variables on a seasonal time scale. Specifically, our approach distinguishes the different phases of the considered climate signals and, for each phase, identifies relevant anomalies in Sea Surface Temperature (SST) that influence the local hydrologic conditions. The potential of the MV-NIPA framework is demonstrated through an application to the Lake Como system, a regulated lake in northern Italy which is mainly operated for flood control and irrigation supply. Numerical results show high correlations between seasonal SST values and one-season-ahead precipitation in the Lake Como basin. The skill of the resulting MV-NIPA forecasts outperforms that of ECMWF products. This information represents a valuable contribution toward partially anticipating summer water availability, especially during drought events, ultimately supporting the improvement of the Lake Como operations.
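
    As a rough illustration of the phase-conditioned correlation step described above, the sketch below (hypothetical variable names, not the authors' MV-NIPA code) splits years by the phase of a single climate index and correlates seasonal SST anomalies with one-season-ahead precipitation:

        import numpy as np

        def phase_conditioned_correlation(index, sst_anom, precip_next):
            """Correlate SST anomalies with next-season precipitation by index phase.

            index       : (n_years,) seasonal climate index (e.g. Nino3.4)
            sst_anom    : (n_years, n_gridpoints) seasonal SST anomalies
            precip_next : (n_years,) precipitation in the following season
            Returns per-gridpoint Pearson correlations for each phase.
            """
            out = {}
            for name, mask in {"positive": index > 0, "negative": index <= 0}.items():
                s, p = sst_anom[mask], precip_next[mask]
                s_c, p_c = s - s.mean(axis=0), p - p.mean()
                denom = np.sqrt((s_c ** 2).sum(axis=0) * (p_c ** 2).sum()) + 1e-12
                out[name] = (s_c * p_c[:, None]).sum(axis=0) / denom
            return out

    In an MV-NIPA-like setting this would be repeated for each climate signal and phase combination, keeping only the SST regions whose correlations pass a significance screen.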

  20. Searching for effective forces in laboratory insect swarms

    NASA Astrophysics Data System (ADS)

    Puckett, James G.; Kelley, Douglas H.; Ouellette, Nicholas T.

    2014-04-01

    Collective animal behaviour is often modeled by systems of agents that interact via effective social forces, including short-range repulsion and long-range attraction. We search for evidence of such effective forces by studying laboratory swarms of the flying midge Chironomus riparius. Using multi-camera stereoimaging and particle-tracking techniques, we record three-dimensional trajectories for all the individuals in the swarm. Acceleration measurements show a clear short-range repulsion, which we confirm by considering the spatial statistics of the midges, but no conclusive long-range interactions. Measurements of the mean free path of the insects also suggest that individuals are on average very weakly coupled, but that they are also tightly bound to the swarm itself. Our results therefore suggest that some attractive interaction maintains cohesion of the swarms, but that this interaction is not as simple as an attraction to nearest neighbours.

  1. Self-Organizing Maps method in recent Adriatic Sea environmental studies: applications and perspectives

    NASA Astrophysics Data System (ADS)

    Mihanovic, H.; Vilibic, I.

    2014-12-01

    Herein we present three recent oceanographic studies performed in the Adriatic Sea (the northernmost arm of the Mediterranean Sea), where the Self-Organizing Map (SOM) method, an unsupervised neural network method capable of recognizing patterns in various types of datasets, was applied to environmental data. The first study applied the SOM method to a long (50-year) series of thermohaline, dissolved oxygen and nutrient data measured over the deep (1200 m) Southern Adriatic Pit, in order to extract characteristic deep water mass patterns and their temporal variability. Low-dimensional SOM solutions revealed that the patterns were not sensitive to nutrients but were determined mostly by temperature, salinity and DO content; therefore, the water masses in the region can be traced without using nutrient data. The second study encompassed the classification of surface current patterns measured by HF radars over the northernmost part of the Adriatic, by applying the SOM method to the HF radar data and operational mesoscale meteorological model surface wind fields. The major output of this study was the high correlation found between characteristic ocean current distribution patterns obtained with and without wind data introduced to the SOM, implying dominant wind-driven dynamics on the local scale. This nominates the SOM method as a basis for generating very fast real-time forecast models over limited domains, based on existing atmospheric forecasts and basin-oriented ocean experiments. The last study classified the sea ambient noise distributions in a habitat area of the bottlenose dolphin, connecting them to the man-made noise generated by different types of vessels. Altogether, the SOM method has proven useful in different aspects of basin-scale ocean environmental studies and may be a valuable tool in future investigations of multi-disciplinary dynamics over a basin, including the creation of operational environmental forecasting systems.
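
    For readers unfamiliar with the technique, a minimal self-organizing map can be written in a few lines of NumPy. The sketch below is only a generic illustration (not the configuration used in the Adriatic studies): it maps multivariate samples onto a small two-dimensional grid of prototype patterns.

        import numpy as np

        def train_som(data, grid=(4, 3), n_iter=2000, lr0=0.5, sigma0=1.5, seed=0):
            """Train a tiny rectangular SOM; data has shape (n_samples, n_features)."""
            rng = np.random.default_rng(seed)
            nx, ny = grid
            nodes = np.array([(i, j) for i in range(nx) for j in range(ny)], float)
            w = rng.normal(size=(nx * ny, data.shape[1]))       # prototype vectors
            for t in range(n_iter):
                x = data[rng.integers(len(data))]
                frac = t / n_iter
                lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
                bmu = np.argmin(((w - x) ** 2).sum(axis=1))     # best-matching unit
                d2 = ((nodes - nodes[bmu]) ** 2).sum(axis=1)    # grid distance to the BMU
                h = np.exp(-d2 / (2 * sigma ** 2))              # neighbourhood kernel
                w += lr * h[:, None] * (x - w)                  # pull neighbours toward x
            return w, nodes

    Each row of the returned weight matrix is one characteristic pattern, and any sample can be assigned to its nearest row, which is how datasets such as thermohaline profiles or HF-radar current fields are reduced to a small set of modes.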

  2. The Worldwide Interplanetary Scintillation (IPS) Stations (WIPSS) Network in support of Space-Weather Science and Forecasting

    NASA Astrophysics Data System (ADS)

    Bisi, M. M.; Gonzalez-Esparza, A.; Jackson, B. V.; Aguilar-Rodriguez, E.; Tokumaru, M.; Chashei, I. V.; Tyul'bashev, S. A.; Manoharan, P. K.; Fallows, R. A.; Chang, O.; Mejia-Ambriz, J. C.; Yu, H. S.; Fujiki, K.; Shishov, V.

    2016-12-01

    The phenomenon of space weather - analogous to terrestrial weather, which describes the changing low-altitude atmospheric conditions on Earth - is essentially a description of the changes in the plasma environment at and near the Earth. Key parameters driving space weather at the Earth include velocity, density, magnetic field, high-energy particles, and radiation coming into and within the near-Earth space environment. Interplanetary scintillation (IPS) can be used to provide a global measure of velocity and density as well as indications of changes in the plasma and magnetic-field rotations along each observational line of sight. If the observations are formally inverted into a three-dimensional (3-D) tomographic reconstruction (such as using the University of California, San Diego - UCSD - kinematic model and reconstruction technique), then source-surface magnetic fields can also be propagated out to the Earth (and beyond), and in-situ data can be incorporated into the reconstruction. Currently, this has been done using IPS data only from the Institute for Space-Earth Environmental Research (ISEE); it has been used scientifically since the 1990s, and in a forecast mode since around 2000. There is now a defined IPS Common Data Format (IPSCDFv1.0) which is being implemented by the majority of the IPS community (this also feeds into the tomography). The Worldwide IPS Stations (WIPSS) Network aims to bring together, using IPSCDFv1.0, the worldwide real-time capable IPS observatories with well-developed and tested analysis techniques being unified across all single-site systems (such as MEXART, Pushchino, and Ooty) and cross-calibrated to the multi-site ISEE system (as well as learning from the scientific-based systems such as EISCAT, LOFAR, and the MWA), into the UCSD 3-D tomography to improve the accuracy, spatial and temporal data coverage, and both the spatial and temporal resolution for improved space-weather science and forecast capabilities.

  3. The Worldwide Interplanetary Scintillation (IPS) Stations (WIPSS) Network in support of Space-Weather Science and Forecasting

    NASA Astrophysics Data System (ADS)

    Bisi, Mario Mark; Americo Gonzalez-Esparza, J.; Jackson, Bernard; Aguilar-Rodriguez, Ernesto; Tokumaru, Munetoshi; Chashei, Igor; Tyul'bashev, Sergey; Manoharan, Periasamy; Fallows, Richard; Chang, Oyuki; Yu, Hsiu-Shan; Fujiki, Ken'ichi; Shishov, Vladimir; Barnes, David

    2017-04-01

    The phenomenon of space weather - analogous to terrestrial weather, which describes the changing low-altitude atmospheric conditions on Earth - is essentially a description of the changes in the plasma environment at and near the Earth. Key parameters driving space weather at the Earth include velocity, density, magnetic field, high-energy particles, and radiation coming into and within the near-Earth space environment. Interplanetary scintillation (IPS) can be used to provide a global measure of velocity and density as well as indications of changes in the plasma and magnetic-field rotations along each observational line of sight. If the observations are formally inverted into a three-dimensional (3-D) tomographic reconstruction (such as using the University of California, San Diego - UCSD - kinematic model and reconstruction technique), then source-surface magnetic fields can also be propagated out to the Earth (and beyond), and in-situ data can be incorporated into the reconstruction. Currently, this has been done using IPS data only from the Institute for Space-Earth Environmental Research (ISEE); it has been used scientifically since the 1990s, and in a forecast mode since around 2000. There is now a defined (and updated) IPS Common Data Format (IPSCDFv1.1) which is being implemented by the majority of the IPS community (this also feeds into the UCSD tomography). The Worldwide IPS Stations (WIPSS) Network aims to bring together, using IPSCDFv1.1, the worldwide real-time capable IPS observatories with well-developed and tested analysis techniques being unified across all single-site systems (such as MEXART, Pushchino, and Ooty) and cross-calibrated to the multi-site ISEE system (as well as learning from the scientific-based systems such as EISCAT, LOFAR, and the MWA), into the UCSD 3-D tomography to improve the accuracy, spatial and temporal data coverage, and both the spatial and temporal resolution for improved space-weather science and forecast capabilities.

  4. Teaching ocean wave forecasting using computer-generated visualization and animation—Part 2: swell forecasting

    NASA Astrophysics Data System (ADS)

    Whitford, Dennis J.

    2002-05-01

    This paper, the second of a two-part series, introduces undergraduate students to ocean wave forecasting using interactive computer-generated visualization and animation. Verbal descriptions and two-dimensional illustrations are often insufficient for student comprehension. Fortunately, the introduction of computers in the geosciences provides a tool for addressing this problem. Computer-generated visualization and animation, accompanied by oral explanation, have been shown to be a pedagogical improvement over more traditional methods of instruction. Cartographic science and other disciplines using geographical information systems have been especially aggressive in pioneering the use of visualization and animation, whereas oceanography has not. This paper will focus on the teaching of ocean swell wave forecasting, often considered a difficult oceanographic topic due to the mathematics and physics required, as well as its interdependence on time and space. Several MATLAB® software programs are described and offered to visualize and animate group speed, frequency dispersion, angular dispersion, propagation, and wave height forecasting of deep water ocean swell waves. Teachers may use these interactive visualizations and animations without requiring an extensive background in computer programming.
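
    The underlying deep-water relations are simple enough to state: phase speed c = gT/(2*pi) and group speed c_g = c/2 = gT/(4*pi), so swell of period T generated a distance D away arrives after roughly D/c_g. The short Python sketch below (independent of the MATLAB programs the paper distributes) illustrates the arithmetic:

        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def deep_water_group_speed(period_s):
            """Group speed of deep-water waves, c_g = g*T/(4*pi), in m/s."""
            return G * period_s / (4 * math.pi)

        def swell_arrival_hours(distance_km, period_s):
            """Travel time (hours) of swell of a given period over a given distance."""
            return distance_km * 1000.0 / deep_water_group_speed(period_s) / 3600.0

        # Example: 15 s swell generated 3000 km away travels at c_g ~ 11.7 m/s and
        # arrives after ~71 hours; shorter-period swell arrives later, which is the
        # frequency dispersion the animations are meant to convey.
        print(swell_arrival_hours(3000, 15))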

  5. A versatile data-visualization application for the Norwegian flood forecasting service

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Langsholt, Elin G.; Hamududu, Byman H.; Engeland, Kolbjørn

    2017-04-01

    - General motivation: A graphical user interface has been developed to visualize multi-model hydrological forecasts at the flood forecasting service of the Norwegian water and energy directorate. It is based on the R 'shiny' package, with which interactive web applications can quickly be prototyped. The app queries multiple data sources, building a comprehensive infographics dashboard for the decision maker. - Main features of the app: The visualization application comprises several tabs, each built with different functionality and focus. A map of forecast stations gives rapid insight into the flood situation and serves, concurrently, as a station selector (based on the 'leaflet' package). The map selection is linked to multi-panel forecast plots which can present input, state or runoff parameters. Another tab focuses on past model performance and calibration runs. - Software design choices: The application was programmed with a focus on flexibility regarding data sources. The parsing of text-based model results was explicitly separated from the app (in the separate R package 'NVEDATA'), so that it only loads standardized RData binary files. We focused on allowing re-usability in other contexts by structuring the app into specific 'shiny' modules. The code was bundled into an R package, which is available on GitHub. - Documentation efforts: A documentation website is under development. For easier collaboration, we chose to host it on the 'GitHub Pages' branch of the repository and build it automatically with a continuous integration service. The aim is to gather all information about the flood forecasting methodology at NVE in one location. This encompasses details on each hydrological model used as well as the documentation of the data-visualization application. - Outlook for further development: The ability to select a group of stations by filtering a table (i.e. past performance, past major flood events, catchment parameters) and exporting it to the forecast tab could be of interest for detailed model analysis. The design choices for this app were motivated by a need for extensibility and modularity, and those qualities will be tested and improved as new datasets need integrating into this tool.

  6. Formation of Spiral-Arm Spurs and Bound Clouds in Vertically Stratified Galactic Gas Disks

    NASA Astrophysics Data System (ADS)

    Kim, Woong-Tae; Ostriker, Eve C.

    2006-07-01

    We investigate the growth of spiral-arm substructure in vertically stratified, self-gravitating, galactic gas disks, using local numerical MHD simulations. Our new models extend our previous two-dimensional studies, which showed that a magnetized spiral shock in a thin disk can undergo magneto-Jeans instability (MJI), resulting in regularly spaced interarm spur structures and massive gravitationally bound fragments. Similar spur (or ``feather'') features have recently been seen in high-resolution observations of several galaxies. Here we consider two sets of numerical models: two-dimensional simulations that use a ``thick-disk'' gravitational kernel, and three-dimensional simulations with explicit vertical stratification. Both models adopt an isothermal equation of state with c_s = 7 km s^-1. When disks are sufficiently magnetized and self-gravitating, the result in both sorts of models is the growth of spiral-arm substructure similar to that in our previous razor-thin models. Reduced self-gravity due to nonzero disk thickness increases the spur spacing to ~10 times the Jeans length at the arm peak. Bound clouds that form from spur fragmentation have masses ~(1-3)×10^7 M_solar each, similar to the largest observed GMCs. The mass-to-flux ratios and specific angular momenta of the bound condensations are lower than large-scale galactic values, as is true for observed GMCs. We find that unmagnetized or weakly magnetized two-dimensional models are unstable to the ``wiggle instability'' previously identified by Wada & Koda. However, our fully three-dimensional models do not show this effect. Nonsteady motions and strong vertical shear prevent coherent vortical structures from forming, evidently suppressing the wiggle instability. We also find no clear traces of Parker instability in the nonlinear spiral arm substructures that emerge, although conceivably Parker modes may help seed the MJI at early stages since azimuthal wavelengths are similar.

  7. Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.

    2006-01-01

    Since numerical weather prediction models are unable to accurately forecast the severity and location of storm cells several hours into the future when compared with observation data, there has been growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time. This step is readily recognized as the process update step of the Kalman Filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used for defining the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method. The main advantage of the Monte Carlo method is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convection precipitation exceeding a specified threshold. The main limitation of this method is that the results are dependent on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe the technique for computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box. This technique is the same as that described in Ref. 5. Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sounder than characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations. References 5 through 7 only use the forecast data and not the observations. The method for computing the probability of detection, false alarm ratio and several forecast quality metrics (Skill Scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of the severe weather cell in the forecast data. Probabilities that the observation data show higher or lower severity than the forecast data, at a grid location and in its neighborhood, are also examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of existence of gaps in the observation data in the neighborhood of severe weather cells in forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination.
The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities. This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
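
    For reference, the categorical scores mentioned above (probability of detection, false alarm ratio, and skill scores such as the critical success index) follow directly from a 2 x 2 contingency table of forecast versus observed exceedances. A generic sketch, not the paper's implementation, is:

        import numpy as np

        def categorical_scores(forecast, observed, threshold):
            """POD, FAR and CSI from gridded forecast/observation fields.

            forecast, observed : arrays of the same shape (e.g. precipitation intensity)
            threshold          : value defining a 'severe' grid cell
            """
            f, o = forecast >= threshold, observed >= threshold
            hits = np.sum(f & o)
            misses = np.sum(~f & o)
            false_alarms = np.sum(f & ~o)
            pod = hits / (hits + misses + 1e-12)                  # probability of detection
            far = false_alarms / (hits + false_alarms + 1e-12)    # false alarm ratio
            csi = hits / (hits + misses + false_alarms + 1e-12)   # critical success index
            return pod, far, csi

    Co-occurrence probabilities of the kind introduced in the paper are obtained analogously, except that a forecast cell counts as a hit if a severe observed cell exists anywhere within a specified neighborhood rather than only at the same grid point.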

  8. Regional Development Impacts Multi-Regional - Multi-Industry Model (MRMI) Users Manual,

    DTIC Science & Technology

    1982-09-01

    indicators, described in Chapter 2, are estimated as well. Finally, MRMI is flexible, as it can incorporate alternative macroeconomic, national inter...national and regional economic contexts and data sources for estimating macroeconomic and direct impacts data. Considerations for ensuring consistency...Chapter 4 is devoted to model execution and the interpretation of its output. As MRMI forecasts are based upon macroeconomic, national inter-industry

  9. Improving medium-range and seasonal hydroclimate forecasts in the southeast USA

    NASA Astrophysics Data System (ADS)

    Tian, Di

    Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P) and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Forecast skill for both the GEFS-based ETo forecasts and the irrigation scheduling driven by them was generally positive up to one week ahead throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind. The CFSv2 model could better predict ETo in cold seasons during El Nino Southern Oscillation (ENSO) events only when the forecast was initialized during an ENSO event. Downscaled P and T2M forecasts were produced by directly downscaling the NMME P and T2M output or indirectly by using the NMME forecasts of Nino3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform the best single model.

  10. Evaluating the spatio-temporal performance of sky-imager-based solar irradiance analysis and forecasts

    NASA Astrophysics Data System (ADS)

    Schmidt, Thomas; Kalisch, John; Lorenz, Elke; Heinemann, Detlev

    2016-03-01

    Clouds are the dominant source of small-scale variability in surface solar radiation and uncertainty in its prediction. However, the increasing share of solar energy in the worldwide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a very short term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A 2-month data set with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over 10 km by 12 km is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky-imager-based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depends strongly on the predominant cloud conditions. Especially convective type clouds lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by a single pyranometer if it is used representatively for the whole area in distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher for these conditions compared to overcast or clear sky situations causing low GHI variability, which is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.
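
    Forecast skill in this context is usually reported relative to persistence. A minimal sketch of that comparison (generic, not the authors' processing chain) is:

        import numpy as np

        def rmse(pred, obs):
            return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

        def forecast_skill(ghi_forecast, ghi_observed, ghi_persistence):
            """Skill score 1 - RMSE_forecast / RMSE_persistence; positive values
            mean the sky-imager forecast beats the persistence reference."""
            return 1.0 - rmse(ghi_forecast, ghi_observed) / rmse(ghi_persistence, ghi_observed)

    Evaluating this score separately for each cloud class (e.g. cumulus versus overcast or clear sky) reproduces the kind of condition-dependent skill assessment described above.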

  11. Global scale predictability of floods

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Gijsbers, Peter; Sperna Weiland, Frederiek

    2016-04-01

    Flood (and storm surge) forecasting at the continental and global scale has only become possible in recent years (Emmerton et al., 2016; Verlaan et al., 2015) due to the availability of meteorological forecasts, global-scale precipitation products and global-scale hydrologic and hydrodynamic models. Deltares has set up GLOFFIS, a research-oriented multi-model operational flood forecasting system based on Delft-FEWS, in an open experimental ICT facility called Id-Lab. In GLOFFIS both the W3RA and PCRGLOB-WB models are run in ensemble mode using GEFS and ECMWF-EPS (latency 2 days). GLOFFIS will be used for experiments into the predictability of floods (and droughts) and their dependency on initial state estimation, meteorological forcing and the hydrologic model used. Here we present initial results of verification of the ensemble flood forecasts derived with the GLOFFIS system. Emmerton, R., Stephens, L., Pappenberger, F., Pagano, T., Weerts, A., Wood, A., Salamon, P., Brown, J., Hjerdt, N., Donnelly, C., Cloke, H. Continental and Global Scale Flood Forecasting Systems, WIREs Water (accepted), 2016. Verlaan, M., De Kleermaeker, S., Buckman, L. GLOSSIS: Global storm surge forecasting and information system, Australasian Coasts & Ports Conference, 15-18 September 2015, Auckland, New Zealand.

  12. Daily rainfall forecasting for one year in a single run using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, Poornima; Jothiprakash, V.

    2018-06-01

    Effective modelling and prediction of rainfall at small time steps is reported to be very difficult owing to its highly erratic nature. Accurate forecasts of daily rainfall over longer durations (multiple time steps) may be exceptionally helpful in the efficient planning and management of water resources systems. Identification of inherent patterns in a rainfall time series is also important for an effective water resources planning and management system. In the present study, Singular Spectrum Analysis (SSA) is utilized to forecast the daily rainfall time series pertaining to the Koyna watershed in Maharashtra, India, for 365 days after extracting the various components of the rainfall time series, such as the trend, periodic, cyclic and noise components. In order to forecast the time series over a longer horizon (365 days, i.e. one window length), the signal and noise components of the time series are forecast separately and then added together. The results of the study show that SSA could extract the various components of the time series effectively and could also forecast the daily rainfall time series for durations as long as one year in a single run with reasonable accuracy.
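
    A compact numerical sketch of the basic SSA decomposition (embedding, singular value decomposition, and diagonal averaging back to reconstructed components) is given below; the window length and any grouping of components are illustrative assumptions, not the settings used for the Koyna series.

        import numpy as np

        def ssa_components(x, window):
            """Decompose a 1-D series into elementary reconstructed components via SSA."""
            x = np.asarray(x, float)
            n, k = len(x), len(x) - window + 1
            traj = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            comps = []
            for i in range(len(s)):
                elem = s[i] * np.outer(u[:, i], vt[i])                   # rank-1 piece
                rc = np.array([np.mean(np.diag(elem[:, ::-1], j))        # diagonal averaging
                               for j in range(-(window - 1), k)])[::-1]
                comps.append(rc)
            return np.array(comps)  # rows sum back to the original series

    Trend and periodic components (typically the leading rows) and the residual noise can then be forecast separately, for instance by recurrent continuation, and the forecasts added back together as described above.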

  13. Communicating uncertainty in seasonal and interannual climate forecasts in Europe.

    PubMed

    Taylor, Andrea L; Dessai, Suraje; de Bruin, Wändi Bruine

    2015-11-28

    Across Europe, organizations in different sectors are sensitive to climate variability and change, at a range of temporal scales from the seasonal to the interannual to the multi-decadal. Climate forecast providers face the challenge of communicating the uncertainty inherent in these forecasts to these decision-makers in a way that is transparent, understandable and does not lead to a false sense of certainty. This article reports the findings of a user-needs survey, conducted with 50 representatives of organizations in Europe from a variety of sectors (e.g. water management, forestry, energy, tourism, health) interested in seasonal and interannual climate forecasts. We find that while many participating organizations perform their own 'in house' risk analysis most require some form of processing and interpretation by forecast providers. However, we also find that while users tend to perceive seasonal and interannual forecasts to be useful, they often find them difficult to understand, highlighting the need for communication formats suitable for both expert and non-expert users. In addition, our results show that people tend to prefer familiar formats for receiving information about uncertainty. The implications of these findings for both the providers and users of climate information are discussed. © 2015 The Authors.

  14. Communicating uncertainty in seasonal and interannual climate forecasts in Europe

    PubMed Central

    Taylor, Andrea L.; Dessai, Suraje; de Bruin, Wändi Bruine

    2015-01-01

    Across Europe, organizations in different sectors are sensitive to climate variability and change, at a range of temporal scales from the seasonal to the interannual to the multi-decadal. Climate forecast providers face the challenge of communicating the uncertainty inherent in these forecasts to these decision-makers in a way that is transparent, understandable and does not lead to a false sense of certainty. This article reports the findings of a user-needs survey, conducted with 50 representatives of organizations in Europe from a variety of sectors (e.g. water management, forestry, energy, tourism, health) interested in seasonal and interannual climate forecasts. We find that while many participating organizations perform their own ‘in house’ risk analysis most require some form of processing and interpretation by forecast providers. However, we also find that while users tend to perceive seasonal and interannual forecasts to be useful, they often find them difficult to understand, highlighting the need for communication formats suitable for both expert and non-expert users. In addition, our results show that people tend to prefer familiar formats for receiving information about uncertainty. The implications of these findings for both the providers and users of climate information are discussed. PMID:26460115

  15. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    NASA Astrophysics Data System (ADS)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of `Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time, accessing historical and forecast data to support scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored on distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based Graphic User Interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebook), desktop GIS tools and command-line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurers in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  16. Convective Weather Forecast Accuracy Analysis at Center and Sector Levels

    NASA Technical Reports Server (NTRS)

    Wang, Yao; Sridhar, Banavar

    2010-01-01

    This paper presents a detailed convective forecast accuracy analysis at the center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in weather translation capacity models. In general, the vast majority of forecast verification efforts over the past decades have focused on the calculation of traditional standard verification scores from forecast and observation data analyzed onto grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We first calculate the standard forecast verification scores for en-route air traffic centers and sectors, followed by forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage from multiple sector forecasts is then developed. The severe weather intensity assessment was carried out using the correlations between forecast and observed airspace weather coverage. The forecast accuracy in horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and its prediction, using observed and forecasted Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities. The forecast accuracy analysis for lead times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, 1-hour forecast intensity and horizontal location errors for the ZDC center were about 0.12 and 0.13. However, the correlation between the sector 1-hour forecast and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation in observed weather intensity was unexplained by the forecast, and the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage by using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between actual observations and the multiple-sector forecast model prediction improved by several percent at the 95% confidence level in comparison with the single-sector forecast.
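
    The multiple-sector regression step can be illustrated with a generic least-squares fit in which the forecast weather coverage of a sector and its neighbours are the predictors and the sector's observed coverage is the response (variable names are illustrative, not the study's implementation):

        import numpy as np

        def fit_sector_coverage_model(neighbour_forecasts, observed_coverage):
            """Ordinary least squares: observed sector coverage ~ forecasts of several sectors.

            neighbour_forecasts : (n_times, n_sectors) forecast weather coverage fractions
            observed_coverage   : (n_times,) observed coverage of the target sector
            """
            X = np.column_stack([np.ones(len(observed_coverage)), neighbour_forecasts])
            beta, *_ = np.linalg.lstsq(X, observed_coverage, rcond=None)
            return beta[0], beta[1:]  # intercept, per-sector coefficients

        def predict_sector_coverage(intercept, coefs, neighbour_forecasts):
            return intercept + np.asarray(neighbour_forecasts) @ np.asarray(coefs)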

  17. South Pole of the Sun, March 20, 2007 Anaglyph

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting. 3D glasses are necessary.

  18. North Pole of the Sun, March 20, 2007 Anaglyph

    NASA Image and Video Library

    2007-04-27

    NASA Solar TErrestrial RElations Observatory satellites have provided the first 3-dimensional images of the Sun. This view will aid scientists' ability to understand solar physics and improve space weather forecasting. 3D glasses are necessary.

  19. Investigation and evaluation of a computer program to minimize three-dimensional flight time tracks

    NASA Technical Reports Server (NTRS)

    Parke, F. I.

    1981-01-01

    The program for the DC 8-D3 flight planning was slightly modified for three-dimensional flight planning for DC-10 aircraft. Several test runs of the modified program over the North Atlantic and North America were made to verify the program. While geopotential height and temperature were used as meteorological data in a previous program, the modified program uses wind direction and speed and temperature received from the National Weather Service. A scanning program was written to collect the required weather information from the raw data received in a packed decimal format. Two sets of weather data, the 12-hour forecast and the 24-hour forecast based on 0000 GMT, are used for dynamic processing in test runs. In order to save computing time, only the weather data for the North Atlantic and North America are stored in advance in a PCF file and then scanned one by one.

  20. An enhanced data visualization method for diesel engine malfunction classification using multi-sensor signals.

    PubMed

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-10-21

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimensional space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine.
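
    A generic two-dimensional t-SNE embedding of a feature matrix can be obtained with scikit-learn (this is not the authors' FSS-t-SNE implementation; the feature-subset scoring step is omitted here):

        import numpy as np
        from sklearn.manifold import TSNE

        def embed_2d(features, perplexity=30, seed=0):
            """Project an (n_samples, n_features) array to 2-D with t-SNE."""
            return TSNE(n_components=2, perplexity=perplexity,
                        random_state=seed).fit_transform(np.asarray(features))

    In an FSS-like workflow one would first drop low-scoring (irrelevant) feature columns, then embed the reduced matrix and plot the result coloured by malfunction class.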

  1. An Enhanced Data Visualization Method for Diesel Engine Malfunction Classification Using Multi-Sensor Signals

    PubMed Central

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-01-01

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimensional space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine. PMID:26506347

  2. Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy

    NASA Astrophysics Data System (ADS)

    Klotz, S.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.

    2013-12-01

    The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography. These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC. Distribution Statement A: Approved for Public Release; distribution is unlimited
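
    Two of the probabilistic metrics named above are easy to state explicitly; a generic sketch (not NEFVS code) of the Brier score and of the binned points that populate a reliability diagram is:

        import numpy as np

        def brier_score(prob_forecast, event_occurred):
            """Mean squared error of probability forecasts against binary outcomes."""
            p = np.asarray(prob_forecast, float)
            o = np.asarray(event_occurred, float)   # 1 if the event occurred, else 0
            return float(np.mean((p - o) ** 2))

        def reliability_points(prob_forecast, event_occurred, n_bins=10):
            """Per-bin mean forecast probability and observed relative frequency."""
            p = np.asarray(prob_forecast, float)
            o = np.asarray(event_occurred, float)
            edges = np.linspace(0.0, 1.0, n_bins + 1)
            idx = np.clip(np.digitize(p, edges) - 1, 0, n_bins - 1)
            return [(p[idx == b].mean(), o[idx == b].mean())
                    for b in range(n_bins) if np.any(idx == b)]

    Plotting the reliability points against the diagonal gives the reliability diagram; the closer the points lie to the diagonal, the better calibrated the ensemble probabilities are.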

  3. Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy

    NASA Astrophysics Data System (ADS)

    Klotz, S. P.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.

    2012-12-01

    The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography (METOC). These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC.

  4. The Wind Forecast Improvement Project (WFIP): A Public/Private Partnership for Improving Short Term Wind Energy Forecasts and Quantifying the Benefits of Utility Operations. The Southern Study Area, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Jeffrey M.; Manobianco, John; Schroeder, John

    This Final Report presents a comprehensive description, findings, and conclusions for the Wind Forecast Improvement Project (WFIP) -- Southern Study Area (SSA) work led by AWS Truepower (AWST). This multi-year effort, sponsored by the Department of Energy (DOE) and National Oceanographic and Atmospheric Administration (NOAA), focused on improving short-term (15-minute - 6 hour) wind power production forecasts through the deployment of an enhanced observation network of surface and remote sensing instrumentation and the use of a state-of-the-art forecast modeling system. Key findings from the SSA modeling and forecast effort include: 1. The AWST WFIP modeling system produced an overall 10 - 20% improvement in wind power production forecasts over the existing Baseline system, especially during the first three forecast hours; 2. Improvements in ramp forecast skill, particularly for larger up and down ramps; 3. The AWST WFIP data denial experiments showed mixed results in the forecasts incorporating the experimental network instrumentation; however, ramp forecasts showed significant benefit from the additional observations, indicating that the enhanced observations were key to the model systems’ ability to capture phenomena responsible for producing large short-term excursions in power production; 4. The OU CAPS ARPS simulations showed that the additional WFIP instrument data had a small impact on their 3-km forecasts that lasted for the first 5-6 hours, and increasing the vertical model resolution in the boundary layer had a greater impact, also in the first 5 hours; and 5. The TTU simulations were inconclusive as to which assimilation scheme (3DVAR versus EnKF) provided better forecasts, and the additional observations resulted in some improvement to the forecasts in the first 1 - 3 hours.

  5. Obtaining lower bounds from the progressive hedging algorithm for stochastic mixed-integer programs

    DOE PAGES

    Gade, Dinakar; Hackebeil, Gabriel; Ryan, Sarah M.; ...

    2016-04-02

    We present a method for computing lower bounds in the progressive hedging algorithm (PHA) for two-stage and multi-stage stochastic mixed-integer programs. Computing lower bounds in the PHA allows one to assess the quality of the solutions generated by the algorithm contemporaneously. The lower bounds can be computed in any iteration of the algorithm by using dual prices that are calculated during execution of the standard PHA. In conclusion, we report computational results on stochastic unit commitment and stochastic server location problem instances, and explore the relationship between key PHA parameters and the quality of the resulting lower bounds.

  6. Statistical and Hydrological evaluation of precipitation forecasts from IMD MME and ECMWF numerical weather forecasts for Indian River basins

    NASA Astrophysics Data System (ADS)

    Mohite, A. R.; Beria, H.; Behera, A. K.; Chatterjee, C.; Singh, R.

    2016-12-01

    Flood forecasting using hydrological models is an important and cost-effective non-structural flood management measure. For forecasting at short lead times, empirical models using real-time precipitation estimates have proven to be reliable. However, their skill depreciates with increasing lead time. Coupling a hydrologic model with real-time rainfall forecasts issued from numerical weather prediction (NWP) systems could increase the lead time substantially. In this study, we compared 1-5 day precipitation forecasts from the India Meteorological Department (IMD) Multi-Model Ensemble (MME) with European Centre for Medium-Range Weather Forecasts (ECMWF) NWP forecasts for over 86 major river basins in India. We then evaluated the hydrologic utility of these forecasts over the Basantpur catchment (approx. 59,000 km2) of the Mahanadi River basin. Coupled MIKE 11 RR (NAM) and MIKE 11 hydrodynamic (HD) models were used for the development of the flood forecast system (FFS). The RR model was calibrated using IMD station rainfall data. Cross-sections extracted from SRTM 30 were used as input to the MIKE 11 HD model. IMD started issuing operational MME forecasts in 2008, and hence both the statistical and hydrologic evaluations were carried out for 2008-2014. The performance of the FFS was evaluated using both NWP datasets separately for the year 2011, which was a large flood year in the Mahanadi River basin. We will present figures and metrics for the statistical (threshold-based statistics, skill in terms of correlation and bias) and hydrologic (Nash-Sutcliffe efficiency, mean and peak error statistics) evaluations. The statistical evaluation will be at the pan-India scale for all the major river basins, and the hydrologic evaluation will be for the Basantpur catchment of the Mahanadi River basin.
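
    The hydrologic metrics referred to above (Nash-Sutcliffe efficiency and bias) are standard; a minimal sketch, independent of the MIKE 11 setup, is:

        import numpy as np

        def nash_sutcliffe(simulated, observed):
            """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
            sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def percent_bias(simulated, observed):
            """Relative volume error of simulated versus observed flows, in percent."""
            sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
            return 100.0 * (sim.sum() - obs.sum()) / obs.sum()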

  7. Two-polariton bound states in the Jaynes-Cummings-Hubbard model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Max T. C.; Law, C. K.

    2011-05-15

    We examine the eigenstates of the one-dimensional Jaynes-Cummings-Hubbard model in the two-excitation subspace. We discover that two-excitation bound states emerge when the ratio of vacuum Rabi frequency to the tunneling rate between cavities exceeds a critical value. We determine the critical value as a function of the quasimomentum quantum number, and indicate that the bound states carry a strong correlation in which the two polaritons appear to be spatially confined together.

  8. Regularization by Functions of Bounded Variation and Applications to Image Enhancement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casas, E.; Kunisch, K.; Pola, C.

    1999-09-15

    Optimization problems regularized by bounded variation seminorms are analyzed. The optimality system is obtained and finite-dimensional approximations of bounded variation function spaces as well as of the optimization problems are studied. It is demonstrated that the choice of the vector norm in the definition of the bounded variation seminorm is of special importance for approximating subspaces consisting of piecewise constant functions. Algorithms based on a primal-dual framework that exploit the structure of these nondifferentiable optimization problems are proposed. Numerical examples are given for denoising of blocky images with very high noise.
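
    A widely used practical relative of the bounded-variation regularization discussed here is total-variation denoising; the short sketch below uses scikit-image's Chambolle solver (an off-the-shelf implementation, not the primal-dual algorithms proposed in this work):

        import numpy as np
        from skimage import data, util
        from skimage.restoration import denoise_tv_chambolle

        # Build a noisy test image and denoise it with a total-variation (BV-type) prior;
        # a larger weight gives stronger smoothing while preserving sharp edges.
        clean = util.img_as_float(data.camera())
        noisy = clean + 0.3 * np.random.default_rng(0).normal(size=clean.shape)
        denoised = denoise_tv_chambolle(noisy, weight=0.15)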

  9. Spatio-temporal behaviour of medium-range ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Kipling, Zak; Primo, Cristina; Charlton-Perez, Andrew

    2010-05-01

    Using the recently-developed mean-variance of logarithms (MVL) diagram, together with the TIGGE archive of medium-range ensemble forecasts from nine different centres, we present an analysis of the spatio-temporal dynamics of their perturbations, and show how the differences between models and perturbation techniques can explain the shape of their characteristic MVL curves. We also consider the use of the MVL diagram to compare the growth of perturbations within the ensemble with the growth of the forecast error, showing that there is a much closer correspondence for some models than others. We conclude by looking at how the MVL technique might assist in selecting models for inclusion in a multi-model ensemble, and suggest an experiment to test its potential in this context.
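
    The MVL construction itself is straightforward to sketch: at each lead time one takes the logarithm of the local perturbation amplitude and plots its spatial mean against its spatial variance. A generic version (variable names are illustrative) is:

        import numpy as np

        def mvl_point(control, perturbed, eps=1e-12):
            """Mean and variance of the log perturbation amplitude at one lead time.

            control, perturbed : model fields of identical shape (e.g. geopotential maps)
            Returns (mean, variance) of log|perturbed - control| over all grid points.
            """
            log_amp = np.log(np.abs(np.asarray(perturbed) - np.asarray(control)) + eps)
            return float(log_amp.mean()), float(log_amp.var())

    Evaluating mvl_point at successive forecast times for each ensemble member traces out the characteristic MVL curve of a given model and perturbation scheme, which is what the diagram compares across the TIGGE centres.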

  10. Capturing rogue waves by multi-point statistics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows the grasping of extreme rogue wave events in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.

  11. Development of fine-resolution analyses and expanded large-scale forcing properties. Part II: Scale-awareness and application to single-column model experiments

    DOE PAGES

    Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...

    2015-01-20

    Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy’s Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component over the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmospheric Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.

  12. The multi-layer multi-configuration time-dependent Hartree method for bosons: theory, implementation, and applications.

    PubMed

    Cao, Lushuai; Krönke, Sven; Vendrell, Oriol; Schmelcher, Peter

    2013-10-07

    We develop the multi-layer multi-configuration time-dependent Hartree method for bosons (ML-MCTDHB), a variational, numerically exact ab initio method for studying the quantum dynamics and stationary properties of general bosonic systems. ML-MCTDHB takes advantage of the permutation symmetry of identical bosons, which allows for investigations of the quantum dynamics from few- to many-body systems. Moreover, the multi-layer feature enables ML-MCTDHB to describe mixed bosonic systems consisting of arbitrarily many species. Multi-dimensional as well as mixed-dimensional systems can be accurately and efficiently simulated via the multi-layer expansion scheme. We provide a detailed account of the underlying theory and the corresponding implementation. We also demonstrate the method's superior performance by applying it to the tunneling dynamics of bosonic ensembles in a one-dimensional double well potential, where a single-species bosonic ensemble of various correlation strengths and a weakly interacting two-species bosonic ensemble are considered.

  13. On the upper bound in the Bohm sheath criterion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotelnikov, I. A., E-mail: I.A.Kotelnikov@inp.nsk.su; Skovorodin, D. I., E-mail: D.I.Skovorodin@inp.nsk.su

    2016-02-15

    The question of the existence of an upper bound in the Bohm sheath criterion is discussed. According to this criterion, the Debye sheath at the interface between a plasma and a negatively charged electrode is stable only if the ion flow velocity in the plasma exceeds the ion sound velocity. It is stated that, with the exception of some artificial ionization models, the Bohm sheath criterion is satisfied as an equality at the lower bound and the ion flow velocity is equal to the speed of sound. In the one-dimensional theory, a supersonic flow appears only in an unrealistic model of a localized ion source whose size is less than the Debye length; however, supersonic flows seem to be possible in the two- and three-dimensional cases. The available numerical codes used to simulate charged-particle sources with a plasma emitter do not assume the presence of an upper bound in the Bohm sheath criterion; however, correspondence with experimental data is usually achieved when the ion flow velocity in the plasma is close to the ion sound velocity.
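
    As a purely illustrative numerical check of the marginal (lower-bound) Bohm criterion discussed above, the standard cold-ion expression for the ion sound speed can be evaluated directly; the constants and function names below are ours.

      import numpy as np

      EV_TO_J = 1.602177e-19    # electron-volt in joules
      AMU_TO_KG = 1.660539e-27  # atomic mass unit in kilograms

      def ion_sound_speed(t_e_ev, ion_mass_amu):
          # c_s = sqrt(k_B * T_e / m_i) for cold ions, with T_e given in eV.
          return np.sqrt(t_e_ev * EV_TO_J / (ion_mass_amu * AMU_TO_KG))

      def satisfies_bohm(u_i, t_e_ev, ion_mass_amu):
          # Marginal Bohm criterion: the ion flow speed at the sheath edge
          # must be at least the ion sound speed.
          return u_i >= ion_sound_speed(t_e_ev, ion_mass_amu)

      # Example: a hydrogen plasma with T_e = 5 eV gives c_s of about 2.2e4 m/s.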

  14. Building regional early flood warning systems by AI techniques

    NASA Astrophysics Data System (ADS)

    Chang, F. J.; Chang, L. C.; Amin, M. Z. B. M.

    2017-12-01

    Building an early flood warning system is essential for protecting residents against flood hazards and for taking action to mitigate losses. This study implements AI technology for forecasting multi-step-ahead regional flood inundation maps during storm events. The methodology includes three major schemes: (1) configuring the self-organizing map (SOM) to categorize a large number of regional inundation maps into a meaningful topology; (2) building dynamic neural networks to forecast multi-step-ahead average inundated depths (AID); and (3) adjusting the weights of the selected neuron in the constructed SOM based on the forecasted AID to obtain real-time regional inundation maps. The proposed models are trained and tested on a large number of inundation data sets collected in the regions of the river basin with the most frequent and serious flooding. The results show that the SOM topological relationships between individual neurons and their neighbouring neurons are visible and clearly distinguishable, and that the hybrid model can continuously provide high-resolution multi-step-ahead regional inundation maps during storm events, with relatively small RMSE values and high R2 compared with numerical simulation data sets. The computing time is only a few seconds, enabling real-time regional flood inundation forecasting and early flood warning. We demonstrate that the proposed hybrid ANN-based model has a robust and reliable predictive ability and can be used for early warning to mitigate flood disasters.
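
    As a minimal, generic illustration of the first scheme (not the authors' implementation), a small Kohonen SOM can be trained on flattened inundation maps so that each neuron becomes a prototype map within a 2-D topology. The grid size, learning-rate schedule, and function names are assumptions.

      import numpy as np

      def train_som(samples, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
          # samples: (n_maps, n_cells) array of flattened inundation maps.
          rng = np.random.default_rng(seed)
          n_units = grid[0] * grid[1]
          weights = rng.normal(size=(n_units, samples.shape[1]))
          coords = np.array([(i, j) for i in range(grid[0])
                             for j in range(grid[1])], dtype=float)
          n_steps, step = epochs * len(samples), 0
          for _ in range(epochs):
              for x in rng.permutation(samples):
                  frac = step / n_steps
                  lr = lr0 * (1.0 - frac)                 # decaying learning rate
                  sigma = sigma0 * (1.0 - frac) + 1e-3    # shrinking neighbourhood
                  bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
                  dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                  h = np.exp(-dist2 / (2.0 * sigma ** 2))
                  weights += lr * h[:, None] * (x - weights)
                  step += 1
          return weights.reshape(grid[0], grid[1], -1)    # prototype maps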

  15. Optimization of radioactive sources to achieve the highest precision in three-phase flow meters using Jaya algorithm.

    PubMed

    Roshani, G H; Karami, A; Khazaei, A; Olfateh, A; Nazemi, E; Omidi, M

    2018-05-17

    The gamma-ray source plays a very important role in the precision of multi-phase flow metering. In this study, different combinations of gamma-ray sources (133Ba-137Cs, 133Ba-60Co, 241Am-137Cs, 241Am-60Co, 133Ba-241Am, and 60Co-137Cs) were investigated in order to optimize the three-phase flow meter. The three phases were water, oil, and gas, and the flow regime was considered annular. The required data were generated numerically using the Monte Carlo code MCNP-X. The present study forecasts the volume fractions in the annular three-phase flow, based on a multi-energy metering system comprising various radiation sources and one NaI detector, using a hybrid model of an artificial neural network and the Jaya optimization algorithm. Since the volume fractions sum to a constant, the problem is constrained and the hybrid model needs to forecast only two of them. Six hybrid models, one for each combination of radiation sources, are designed. The models are employed to forecast the gas and water volume fractions. The next step is to train the hybrid models on the numerically obtained data. The results show that the best forecasts of the gas and water volume fractions are obtained for the system with 241Am-137Cs as the radiation source. Copyright © 2018 Elsevier Ltd. All rights reserved.
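
    The Jaya algorithm referenced above is essentially parameter-free apart from population size and iteration count; below is a minimal, generic sketch of its update rule (not the authors' coupling of Jaya to the neural network). The objective, bounds, and names are placeholders.

      import numpy as np

      def jaya(objective, bounds, pop_size=20, iters=200, seed=0):
          # Jaya update (Rao, 2016): move every candidate toward the current
          # best solution and away from the worst, then keep improvements.
          rng = np.random.default_rng(seed)
          lo = np.array([b[0] for b in bounds], dtype=float)
          hi = np.array([b[1] for b in bounds], dtype=float)
          pop = lo + rng.random((pop_size, len(bounds))) * (hi - lo)
          fit = np.array([objective(x) for x in pop])
          for _ in range(iters):
              best, worst = pop[fit.argmin()], pop[fit.argmax()]
              r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
              cand = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
              cand = np.clip(cand, lo, hi)
              cand_fit = np.array([objective(x) for x in cand])
              improved = cand_fit < fit
              pop[improved], fit[improved] = cand[improved], cand_fit[improved]
          return pop[fit.argmin()], fit.min()

      # Example: jaya(lambda x: ((x - 1.0) ** 2).sum(), [(-5.0, 5.0)] * 3)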

  16. Simulation Based Earthquake Forecasting with RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.

    2016-12-01

    We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates, and the catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the effect of the uncertainties in physical parameters, observed in the field and measured in the lab, on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.

  17. Large time behavior of entropy solutions to one-dimensional unipolar hydrodynamic model for semiconductor devices

    NASA Astrophysics Data System (ADS)

    Huang, Feimin; Li, Tianhong; Yu, Huimin; Yuan, Difan

    2018-06-01

    We are concerned with the global existence and large time behavior of entropy solutions to the one-dimensional unipolar hydrodynamic model for semiconductors, in the form of Euler-Poisson equations on a bounded interval. In this paper, we first prove the global existence of entropy solutions by the vanishing viscosity method and the compensated compactness framework. In particular, the solutions are uniformly bounded with respect to the space and time variables by introducing modified Riemann invariants and the theory of invariant regions. Based on the uniform estimates of the density, we further show that the entropy solution converges to the corresponding unique stationary solution exponentially in time. No smallness condition is assumed on the initial data or the doping profile. Moreover, the novelty of this paper is the uniform-in-time bound for the weak solutions of the isentropic Euler-Poisson system.
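
    For reference, one standard form of the one-dimensional unipolar hydrodynamic (Euler-Poisson) model studied in this class of problems, with the momentum relaxation time normalized to one (the precise scaling in the paper may differ), is

      \begin{cases}
        \rho_t + (\rho u)_x = 0, \\
        (\rho u)_t + \left(\rho u^2 + p(\rho)\right)_x = \rho E - \rho u, \\
        E_x = \rho - b(x),
      \end{cases}
      \qquad p(\rho) = \rho^{\gamma}, \quad \gamma > 1,

    posed on a bounded interval, where \rho is the electron density, u the velocity, E the electric field, and b(x) the doping profile.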

  18. Gauge field localization on brane worlds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerrero, Rommel; Rodriguez, R. Omar; Melfo, Alejandra

    2010-04-15

    We consider the effects of spacetime curvature and brane thickness on the localization of gauge fields on a brane via kinetic terms induced by localized fermions. We find that in a warped geometry with an infinitely thin brane, both the infrared and the ultraviolet behavior of the electromagnetic propagator are affected, providing a more stringent bound on the brane's tension than that coming from the requirement of four-dimensional gravity on the brane. On the other hand, for a thick wall in a flat spacetime, where the fermions are localized by means of a Yukawa coupling, we find that four-dimensional electromagnetism is recovered in a region bounded from above by the same critical distance appearing in the thin case, but also from below by a new scale related to the brane's thickness and the electromagnetic couplings. This imposes very stringent bounds on the brane's thickness which seem to invalidate the localization mechanism for this case.

  19. Teaching ocean wave forecasting using computer-generated visualization and animation—Part 1: sea forecasting

    NASA Astrophysics Data System (ADS)

    Whitford, Dennis J.

    2002-05-01

    Ocean waves are the most recognized phenomena in oceanography. Unfortunately, undergraduate study of ocean wave dynamics and forecasting involves mathematics and physics and can therefore pose difficulties for some students because of the subject's interrelated dependence on time and space. Verbal descriptions and two-dimensional illustrations are often insufficient for student comprehension. Computer-generated visualization and animation offer a visually intuitive and pedagogically sound medium for presenting geoscience, yet there are very few oceanographic examples. A two-part article series is offered to explain ocean wave forecasting using computer-generated visualization and animation. This paper, Part 1, addresses forecasting of sea wave conditions and serves as the basis for the more difficult topic of swell wave forecasting addressed in Part 2. Computer-aided visualization and animation, accompanied by oral explanation, are a welcome pedagogical supplement to more traditional methods of instruction. In this article, several MATLAB® software programs have been written to visualize and animate the development and comparison of wave spectra, wave interference, and the forecasting of sea conditions. These programs also set the stage for the more advanced and difficult animation topics in Part 2. The programs are user-friendly, interactive, easy to modify, and developed as instructional tools. By using these software programs, teachers can enhance their instruction of these topics with colorful visualizations and animation without requiring an extensive background in computer programming.
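
    The article's programs are written in MATLAB; as an indicative sketch in Python of the same kind of spectrum visualization (not the author's code), the following plots the Pierson-Moskowitz spectrum of a fully developed sea for several wind speeds.

      import numpy as np
      import matplotlib.pyplot as plt

      def pierson_moskowitz(omega, wind_speed, g=9.81, alpha=8.1e-3, beta=0.74):
          # Pierson-Moskowitz spectrum S(omega) for a fully developed sea;
          # wind_speed is taken at 19.5 m height (m/s), omega in rad/s.
          return (alpha * g**2 / omega**5) * np.exp(-beta * (g / (wind_speed * omega))**4)

      omega = np.linspace(0.2, 2.0, 400)
      for u in (10.0, 15.0, 20.0):
          plt.plot(omega, pierson_moskowitz(omega, u), label=f"U = {u:.0f} m/s")
      plt.xlabel("angular frequency (rad/s)")
      plt.ylabel("spectral density (m^2 s)")
      plt.legend()
      plt.show()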

  20. Nonlinear problems in data-assimilation : Can synchronization help?

    NASA Astrophysics Data System (ADS)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

    Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and the medium range. The ensemble techniques used are based on linear methods. This technique has been shown to be a useful indicator of skill in the linear range, where forecast errors are small relative to the climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, such as the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. The simplest prototypical example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of the linear methodology can be treated consistently using the paradigm of synchronization, which views assimilation and forecasting as the problem of optimizing the forecast model state with respect to the future evolution of the atmosphere.
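
    As a toy illustration of the synchronization paradigm (not the authors' formulation), a "model" copy of the Lorenz-63 system can be nudged toward a "truth" trajectory through its observed x component only; for sufficiently strong coupling the two trajectories converge. The coupling constant, time step, and initial conditions below are arbitrary choices.

      import numpy as np

      def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = state
          return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

      def synchronize(n_steps=20000, dt=0.001, k=30.0):
          # Nudge the model toward the truth through the observed x component
          # and track the distance between the two trajectories.
          truth = np.array([1.0, 1.0, 1.0])
          model = np.array([-5.0, 0.0, 20.0])
          errors = []
          for _ in range(n_steps):
              truth = truth + dt * lorenz63(truth)
              coupling = np.array([k * (truth[0] - model[0]), 0.0, 0.0])
              model = model + dt * (lorenz63(model) + coupling)
              errors.append(np.linalg.norm(truth - model))
          return np.array(errors)

      # The final error should be orders of magnitude smaller than the initial one.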
