Halkett, G K B; McKay, J; Hegney, D G; Breen, Lauren J; Berg, M; Ebert, M A; Davis, M; Kearvell, R
2017-09-01
Workforce recruitment and retention are issues in radiation oncology. The working environment is likely to have an impact on retention; however, there is a lack of research in this area. The objectives of this study were to: investigate radiation therapists' (RTs) and radiation oncology medical physicists' (ROMPs) perceptions of work and the working environment; and determine the factors that influence the ability of RTs and ROMPs to undertake their work, and how these factors affect recruitment and retention. Semi-structured interviews were conducted and thematic analysis was used. Twenty-eight RTs and 21 ROMPs participated. The overarching themes were delivering care, support in work, working conditions and lifestyle. These themes were mostly consistent across both groups; however, the exemplars reflected the different roles and perspectives of RTs and ROMPs. Participants described the importance they placed on treating patients and improving their lives. Working conditions were sometimes difficult, with participants reporting pressure at work, heavy workloads, long hours and overtime. Insufficient staff numbers affected staff effectiveness, the working environment and intentions to stay. Staff satisfaction is likely to improve if changes are made to the working environment. We make recommendations that may assist departments to support RTs and ROMPs. © 2016 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Voellmer, George
1992-01-01
The Robotics Branch of the Goddard Space Flight Center is developing a robot that fits inside a Get Away Special can. In the RObotic Materials Processing System (ROMPS) HitchHiker experiment, this robot transports pallets containing wafers of different materials from their storage rack to a halogen lamp furnace for rapid thermal processing in a microgravity environment, and then returns them to the rack. A large part of the mechanical design of the robot dealt with the potential misalignment between the various components that are repeatedly mated and demated. A system of tapered guides and compliant springs was designed to work within the robot's force and accuracy capabilities. This paper discusses these and other robot design issues in detail, and presents examples of ROMPS robot analyses that are applicable to other HitchHiker materials handling missions.
On the Hitchhiker Robot Operated Materials Processing System: Experiment data system
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Jenstrom, Del
1995-01-01
The Space Shuttle Discovery STS-64 mission carried the first American autonomous robot into space, the Robot Operated Materials Processing System (ROMPS). On this mission ROMPS was the only Hitchhiker experiment and had a unique opportunity to utilize all Hitchhiker space carrier capabilities. ROMPS conducted rapid thermal processing of one hundred semiconductor material samples to study how microgravity affects the resulting material properties. The experiment was designed, built and operated by a small GSFC team in cooperation with industry- and university-based principal investigators who provided the material samples and data interpretation. ROMPS' success presents some valuable lessons in such cooperation, as well as in the utilization of the Hitchhiker carrier for complex applications. The motivation of this paper is to share these lessons with the scientific community interested in attached payload experiments. ROMPS has a versatile and intelligent material processing control data system. This paper uses the ROMPS data system as the guiding thread to present the ROMPS mission experience. It presents an overview of the ROMPS experiment followed by considerations of the flight and ground data subsystems and their architecture, data products generation during mission operations, and post-mission data utilization. It then presents the lessons learned from the development and operation of the ROMPS data system, as well as those learned during post-flight data processing.
End-Functionalized ROMP Polymers for Biomedical Applications
Madkour, Ahmad E.; Koch, Amelie H. R.; Lienkamp, Karen; Tew, Gregory N.
2010-01-01
We present two novel allyl-based terminating agents that can be used to end-functionalize living polymer chains obtained by ring-opening metathesis polymerization (ROMP) using Grubbs’ third generation catalyst. Both terminating agents can be easily synthesized and yield ROMP polymers with stable, storable activated ester groups at the chain-end. These end-functionalized ROMP polymers are attractive building blocks for advanced polymeric materials, especially in the biomedical field. Dye-labeling and surface-coupling of antimicrobially active polymers using these end-groups were demonstrated. PMID:21499549
RoMPS concept review automatic control of space robot
NASA Technical Reports Server (NTRS)
1991-01-01
The Robot operated Material Processing in Space (RoMPS) experiment is being performed to explore the marriage of two emerging space commercialization technologies: materials processing in microgravity and robotics. This concept review presents engineering drawings and limited technical descriptions of the RoMPS program's electrical and software systems.
Toward chemical propulsion: synthesis of ROMP-propelled nanocars.
Godoy, Jazmin; Vives, Guillaume; Tour, James M
2011-01-25
The synthesis and ring-opening metathesis polymerization (ROMP) activity of two nanocars functionalized with an olefin metathesis catalyst is reported. The nanocars were attached to a Hoveyda-Grubbs first- or second-generation metathesis catalyst via a benzylidene moiety. The catalytic activity of these nanocars toward ROMP of 1,5-cyclooctadiene was similar to that of their parent catalysts. The activity of the Hoveyda-Grubbs first-generation catalyst-functionalized nanocar was further tested with polymerization of norbornene. These results heighten the prospect of propelling nanocars across a surface by using a ROMP process to provide the translational force.
Fursule, Ishan A; Abtahi, Ashkan; Watkins, Charles B; Graham, Kenneth R; Berron, Brad J
2018-01-15
In situ crosslinking is expected to increase the solvent stability of coatings formed by surface-initiated ring-opening metathesis polymerization (SI ROMP). Solvent-associated degradation limits the utility of SI ROMP coatings. SI ROMP coatings have a unique capacity for post-functionalization through reaction of the unsaturated site on the polymer backbone. Any post-reaction scheme which requires a liquid solvent has the potential to degrade the coating and lower the thickness of the resulting film. We designed a macromolecular crosslinking group based on PEG dinorbornene. The PEG length is tailored to the expected mean chain-to-chain distance during surface-initiated polymerization. This crosslinking macromer is randomly copolymerized with norbornene through SI ROMP on a gold-coated substrate. The solvent stability of polynorbornene coatings with and without PEG dinorbornene is quantitatively determined, and the mechanism of degradation is further supported through XPS and AFM analyses. The addition of 0.25 mol% PEG dinorbornene significantly increases the solvent stability of the SI ROMP coatings. The crosslinker presence in the more stable films is supported by observable PEG absorbances in FTIR and an increase in contact angle hysteresis when compared to non-crosslinked coatings. The oxidation of the SI ROMP coatings is supported by the observation of carbonyl oxygen in the polynorbornene coatings. The rapid loss of the non-crosslinked SI ROMP coating corresponds to nanoscale pitting across the surface and micron-scale regions of widespread film loss. The crosslinked coatings have uniform nanoscale pitting, but show no evidence of micron-scale film damage. In all, the incorporation of minimal crosslinking content is a simple strategy for improving the solvent stability of SI ROMP coatings. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Aïssa, B.; Nechache, R.; Haddad, E.; Jamroz, W.; Merle, P. G.; Rosei, F.
2012-10-01
A self-healing composite material consisting of 5-ethylidene-2-norbornene (5E2N) monomer reacted with ruthenium Grubbs' catalyst (RGC) was prepared. First, the kinetics of the 5E2N ring-opening metathesis polymerization (ROMP) reaction with RGC was studied as a function of temperature. We show that the polymerization reaction remains effective over a large temperature range (−15 to 45 °C), occurring at short time scales (less than 1 min at 40 °C). Second, the amount of RGC required for the ROMP reaction was significantly decreased through its nanostructuration by means of a UV-excimer laser ablation process. RGC nanostructures a few nanometers in size were successfully obtained directly on silicon substrates. The X-ray photoelectron spectroscopy data strongly suggest that the RGC retains its original stoichiometry after nanostructuration. More importantly, the associated ROMP reaction was successfully achieved at an extremely low RGC concentration of (11.16 ± 1.28) × 10⁻⁴ vol.%, with very short reaction times. This approach opens new prospects for using healing-agent nanocomposite materials for self-repair functionality, thereby obtaining a higher catalytic efficiency per unit mass.
NASA Astrophysics Data System (ADS)
Aïssa, B.; Haddad, E.; Jamroz, W.; Hassani, S.; Farahani, R. D.; Merle, P. G.; Therriault, D.
2012-10-01
We report on the fabrication of self-healing nanocomposite materials, consisting of single-walled carbon nanotube (SWCNT)-reinforced 5-ethylidene-2-norbornene (5E2N) healing agent reacted with ruthenium Grubbs catalyst, prepared by means of ultrasonication followed by a three-roll mixing mill process. The kinetics of the 5E2N ring-opening metathesis polymerization (ROMP) was studied as a function of the reaction temperature and the SWCNT loads. Our results demonstrated that the ROMP reaction remained effective over a large temperature domain (−15 to 45 °C), occurring at very short time scales (less than 1 min at 40 °C). On the other hand, micro-indentation analysis performed on the SWCNT/5E2N nanocomposite material after ROMP polymerization showed a clear increase in both the hardness and the Young's modulus, up to nine times higher than that of the virgin polymer, with SWCNT loads of only 0.1 to 2 wt%. The approach demonstrated here opens new prospects for using carbon nanotube and healing-agent nanocomposite materials for self-repair functionality, especially in a space environment.
Surveying trends in radiation oncology medical physics in the Asia Pacific Region.
Kron, Tomas; Healy, Brendan; Ng, Kwan Hoong
2016-07-01
Our study aims to assess and track the workload, working conditions and professional recognition of radiation oncology medical physicists (ROMPs) in the Asia Pacific region over time. A structured questionnaire was mailed in 2008, 2011 and 2014 to senior medical physicists representing 23 countries. The questionnaire covers seven themes: education and training including certification; staffing; typical tasks; professional organisations; resources; research and teaching; and job satisfaction. Across all surveys the response rate was >85%, with the replies representing practice affecting more than half of the world's population. The expectation of ROMP qualifications (MSc and between 1 and 3 years of clinical experience) has not changed much over the years. However, compared to 2008, the number of medical physicists in many countries has doubled. Formal professional certification is only available in a small number of countries. The number of experienced ROMPs is small, in particular in low- and middle-income countries. The increase in staff numbers from 2008 to 2014 is matched by a similar increase in the number of treatment units, which is accompanied by an increase in treatment complexity. Many ROMPs are required to work overtime and not many find time for research. Resource availability has only improved marginally, and ROMPs still feel generally overworked, but professional recognition, while varying widely, appears to be improving slowly. While the number of physicists and the complexity of treatment techniques and technologies have increased significantly, ROMP practice remains essentially unchanged over the last 6 years in the Asia Pacific Region. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Adrift in the Gray Zone: IRB Perspectives on Research in the Learning Health System.
Lee, Sandra Soo-Jin; Kelley, Maureen; Cho, Mildred K; Kraft, Stephanie Alessi; James, Cyan; Constantine, Melissa; Meyer, Adrienne N; Diekema, Douglas; Capron, Alexander M; Wilfond, Benjamin S; Magnus, David
2016-01-01
Human subjects protection in healthcare contexts rests on the premise that a principled boundary distinguishes clinical research and clinical practice. However, growing use of evidence-based clinical practices by health systems makes it increasingly difficult to disentangle research from a wide range of clinical activities that are sometimes called "research on medical practice" (ROMP), including quality improvement activities and comparative effectiveness research. The recent growth of ROMP activities has created an ethical and regulatory gray zone with significant implications for the oversight of human subjects research. We conducted six semi-structured, open-ended focus group discussions with IRB members to understand their experiences and perspectives on ethical oversight of ROMP, including randomization of patients to standard treatments. Our study revealed that IRB members are unclear or divided on the central questions at stake in the current policy debate over ethical oversight of ROMP: IRB members struggle to make a clear distinction between clinical research and medical practice improvement, lack consensus on when ROMP requires IRB review and oversight, and are uncertain about what constitutes incremental risk when patients are randomized to different treatments, any of which may be offered in usual care. They characterized the central challenge as a balancing act, between, on the one hand, making information fully transparent to patients and providing adequate oversight, and on the other hand, avoiding a chilling effect on the research process or harming the physician-patient relationship. Evidence-based guidance that supports IRB members in providing adequate and effective oversight of ROMP without impeding the research process or harming the physician-patient relationship is necessary to realize the full benefits of the learning health system.
Radzinski, Scott C; Foster, Jeffrey C; Matson, John B
2016-04-01
Bottlebrush polymers are synthesized using a tandem ring-opening polymerization (ROP) and ring-opening metathesis polymerization (ROMP) strategy. For the first time, ROP and ROMP are conducted sequentially in the same pot to yield well-defined bottlebrush polymers with molecular weights in excess of 10⁶ Da. The first step of this process involves the synthesis of a polylactide macromonomer (MM) via ROP of D,L-lactide initiated by an alcohol-functionalized norbornene. ROMP grafting-through is then carried out in the same pot to produce the bottlebrush polymer. The applicability of this methodology is evaluated for different MM molecular weights and bottlebrush backbone degrees of polymerization. Size-exclusion chromatographic and ¹H NMR spectroscopic analyses confirm excellent control over both polymerization steps. In addition, bottlebrush polymers are imaged using atomic force microscopy and stain-free transmission electron microscopy on graphene oxide. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A study of space-rated connectors using a robot end-effector
NASA Technical Reports Server (NTRS)
Nguyen, Charles C.
1995-01-01
The main research activities have been directed toward the study of the Robot Operated Materials Processing System (ROMPS), developed at GSFC under a flight project to investigate commercially promising in-space material processes and to design reflyable, robot-automated systems for low-cost operation of those processes. The research activities can be divided into two phases: Phase 1 dealt with testing of the ROMPS robot mechanical interfaces and compliant device using a Stewart Platform testbed, and Phase 2 with a computer simulation study of the ROMPS robot control system. This report provides a summary of the results obtained in Phases 1 and 2.
Medical physics aspects of cancer care in the Asia Pacific region
Kron, T; Cheung, KY; Dai, J; Ravindran, P; Soejoko, D; Inamura, K; Song, JY; Bold, L; Srivastava, R; Rodriguez, L; Wong, TJ; Kumara, A; Lee, CC; Krisanachinda, A; Nguyen, XC; Ng, KH
2008-01-01
Medical physics plays an essential role in modern medicine. This is particularly evident in cancer care where medical physicists are involved in radiotherapy treatment planning and quality assurance as well as in imaging and radiation protection. Due to the large variety of tasks and interests, medical physics is often subdivided into specialties such as radiology, nuclear medicine and radiation oncology medical physics. However, even within their specialty, the role of radiation oncology medical physicists (ROMPs) is diverse and varies between different societies. Therefore, a questionnaire was sent to leading medical physicists in most countries/areas in the Asia/Pacific region to determine the education, role and status of medical physicists. Answers were received from 17 countries/areas representing nearly 2800 radiation oncology medical physicists. There was general agreement that medical physicists should have both academic (typically at MSc level) and clinical (typically at least 2 years) training. ROMPs spent most of their time working in radiotherapy treatment planning (average 17 hours per week); however radiation protection and engineering tasks were also common. Typically, only physicists in large centres are involved in research and teaching. Most respondents thought that the workload of physicists was high, with more than 500 patients per year per physicist, less than one ROMP per two oncologists being the norm, and on average, one megavoltage treatment unit per medical physicist. There was also a clear indication of increased complexity of technology in the region with many countries/areas reporting to have installed helical tomotherapy, IMRT (Intensity Modulated Radiation Therapy), IGRT (Image Guided Radiation Therapy), Gamma-knife and Cyber-knife units. This and the continued workload from brachytherapy will require growing expertise and numbers in the medical physics workforce. Addressing these needs will be an important challenge for the future. 
PMID:21611001
ROMPS critical design review. Volume 2: Robot module design documentation
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
The robot module design documentation for the Remote Operated Materials Processing in Space (ROMPS) experiment is compiled. This volume presents the following information: robot module modifications; Easylab commands definitions and flowcharts; Easylab program definitions and flowcharts; robot module fault conditions and structure charts; and C-DOC flow structure and cross references.
RoMPS concept review automatic control of space robot, volume 2
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1991-01-01
Topics related to robot operated materials processing in space (RoMPS) are presented in view graph form and include: (1) system concept; (2) Hitchhiker Interface Requirements; (3) robot axis control concepts; (4) Autonomous Experiment Management System; (5) Zymate Robot Controller; (6) Southwest SC-4 Computer; (7) oven control housekeeping data; and (8) power distribution.
ROMPS critical design review data package
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
The design elements of the Robot-Operated Material Processing in Space (ROMPS) system are described in outline and graphical form. The following subsystems/topics are addressed: servo system, testbed and simulation results, System V Controller, robot module, furnace module, SCL experiment supervisor and script sample processing control, battery system, watchdog timers, mechanical/thermal considerations, and fault conditions and recovery.
Labelling Polymers and Micellar Nanoparticles via Initiation, Propagation and Termination with ROMP
Thompson, Matthew P.; Randolph, Lyndsay M.; James, Carrie R.; Davalos, Ashley N.; Hahn, Michael E.
2014-01-01
In this paper we compare and contrast three approaches for labelling polymers with functional groups via ring-opening metathesis polymerization (ROMP). We explored the incorporation of functionality via initiation, termination and propagation employing an array of novel initiators, termination agents and monomers. The goal was to allow the generation of selectively labelled and well-defined polymers that would in turn lead to the formation of labelled nanomaterials. Norbornene analogues, prepared as functionalized monomers for ROMP, included fluorescent dyes (rhodamine, fluorescein, EDANS, and coumarin), quenchers (DABCYL), conjugatable moieties (NHS esters, pentafluorophenyl esters), and protected amines. In addition, a set of symmetrical olefins for terminally labelling polymers, and for the generation of initiators in situ is described. PMID:24855496
Study and Analysis of The Robot-Operated Material Processing Systems (ROMPS)
NASA Technical Reports Server (NTRS)
Nguyen, Charles C.
1996-01-01
This report presents the progress of a research grant funded by NASA for work performed during 1 Oct. 1994 - 30 Sep. 1995. It deals with the development and investigation of the potential use of software for data processing for the Robot Operated Material Processing System (ROMPS), and reports on the progress of data processing of calibration samples processed by ROMPS in space and on earth. First, data were retrieved using the I/O software and manually processed using Microsoft Excel. The retrieval and processing were then automated with a program written in C that reads the telemetry data and produces plots of the time responses of sample temperatures and other desired variables. LabVIEW was also employed to automatically retrieve and process the telemetry data.
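The workflow described above (retrieve telemetry records, then plot per-sample time responses) can be sketched in a few lines. The CSV column names below are assumptions for illustration only; the abstract does not describe the actual ROMPS telemetry format.

```python
import csv
import io
from collections import defaultdict

def load_telemetry(text):
    """Group telemetry records into per-sample time series.

    The layout (time_s, sample_id, temperature_C) is a hypothetical
    stand-in for the real ROMPS telemetry format.
    """
    series = defaultdict(list)
    for row in csv.DictReader(io.StringIO(text)):
        series[row["sample_id"]].append(
            (float(row["time_s"]), float(row["temperature_C"]))
        )
    # sort each series by time so it can be plotted directly
    return {k: sorted(v) for k, v in series.items()}

example = """time_s,sample_id,temperature_C
0.0,wafer-01,25.0
1.0,wafer-01,310.5
0.0,wafer-02,25.0
"""

curves = load_telemetry(example)
```

Each sorted per-sample series can then be handed directly to a plotting library to produce the temperature time-response plots the report mentions.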
ROMPS critical design review. Volume 1: Hardware
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
Topics concerning the Robot-Operated Material Processing in Space (ROMPS) Program are presented in viewgraph form and include the following: a systems overview; servocontrol and servomechanisms; testbed and simulation results; system V controller; robot module; furnace module; SCL experiment supervisor; SCL script sample processing control; SCL experiment supervisor fault handling; block diagrams; hitchhiker interfaces; battery systems; watchdog timers; mechanical/thermal systems; and fault conditions and recovery.
Healing efficiency of epoxy-based materials for structural application
NASA Astrophysics Data System (ADS)
Raimondo, Marialuigia; Guadagno, Liberata
2012-07-01
This paper describes a self-healing composite exhibiting high levels of healing efficiency under working conditions typical of aeronautic applications. The self-healing material is composed of a thermosetting epoxy matrix in which a ring-opening metathesis polymerization (ROMP) catalyst and nanocapsules are dispersed. The nanocapsules contain a monomer able to polymerize via ROMP. Preliminary results demonstrate an efficient self-repair function which is also active at very low temperatures.
Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook
2015-01-01
Image super-resolution (SR) plays a vital role in medical imaging, allowing a more efficient and effective diagnosis process. Diagnosis from low-resolution (LR) and noisy images is usually difficult and inaccurate. Resolution enhancement through conventional interpolation methods strongly affects the precision of subsequent processing steps, such as segmentation and registration. Therefore, we propose an efficient sparse-coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of the sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhances the dictionary learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting OMP with ROMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization leads to a high-quality, high-resolution output. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is better than that of other state-of-the-art schemes.
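As a rough illustration of the greedy sparse-recovery step this abstract relies on, here is a minimal sketch of regularized OMP in the spirit of Needell and Vershynin's algorithm. The contiguous-window regularization and stopping rule are simplifications of my own, and the K-SVD integration used in the paper is not shown; the dictionary is assumed to have unit-norm columns.

```python
import numpy as np

def romp(A, y, s, tol=1e-6):
    """Regularized Orthogonal Matching Pursuit (simplified sketch).

    A   : (m, n) dictionary with (approximately) unit-norm columns
    y   : (m,) measurement vector
    s   : target sparsity level
    Returns an (n,) sparse coefficient vector.
    """
    m, n = A.shape
    support = set()
    x = np.zeros(n)
    residual = y.astype(float).copy()
    while len(support) < 2 * s and np.linalg.norm(residual) > tol:
        u = A.T @ residual
        # candidate set: indices of the s largest correlations, descending
        order = np.argsort(-np.abs(u))[:s]
        mags = np.abs(u[order])
        if mags[0] <= tol:
            break
        # regularization: among contiguous windows whose magnitudes are
        # within a factor of 2 of the window maximum, keep the one with
        # maximal energy
        best, best_energy, i = order[:1], -1.0, 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and mags[j + 1] >= mags[i] / 2:
                j += 1
            energy = float(np.sum(mags[i:j + 1] ** 2))
            if energy > best_energy:
                best_energy, best = energy, order[i:j + 1]
            i = j + 1
        support |= set(int(k) for k in best)
        idx = sorted(support)
        # least-squares fit on the current support, then update residual
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        x = np.zeros(n)
        x[idx] = coef
        residual = y - A @ x
    return x
```

In the paper's setting, each LR image patch would play the role of `y` and the trained K-SVD dictionary the role of `A`.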
Medical physics aspects of cancer care in the Asia Pacific region: 2011 survey results
Kron, T; Azhari, HA; Voon, EO; Cheung, KY; Ravindran, P; Soejoko, D; Inamura, K; Han, Y; Ung, NM; Bold, L; Win, UM; Srivastava, R; Meyer, J; Farrukh, S; Rodriguez, L; Kuo, M; Lee, JCL; Kumara, A; Lee, CC; Krisanachinda, A; Nguyen, XC; Ng, KH
2012-01-01
Background: Medical physicists are essential members of the radiation oncology team. Given the increasing complexity of radiotherapy delivery, it is important to ensure adequate training and staffing. The aim of the present study was to update a similar survey from 2008 and assess the situation of medical physicists in the large and diverse Asia Pacific region. Methods: Between March and July 2011, a survey on the profession and practice of radiation oncology medical physicists (ROMPs) in the Asia Pacific region was performed. The survey was sent to senior physicists in 22 countries. Replies were received from countries that collectively represent more than half of the world's population. The survey questions explored five areas: education, staffing, work patterns including research and teaching, resources available, and job satisfaction. Results and discussion: Compared to data from a similar survey conducted three years earlier, the number of medical physicists in participating countries increased by 29% on average. This increase is similar to the increase in the number of linear accelerators, showing that previously identified staff shortages have yet to be substantially addressed. This is also highlighted by the fact that most ROMPs are expected to work overtime often and without adequate compensation. While job satisfaction has stayed similar compared to the previous survey, expectations for education and training have increased somewhat. This is in line with a trend towards certification of ROMPs. Conclusion: As organisations such as the International Labour Organization (ILO) start to recognise medical physics as a profession, it is evident that despite some encouraging signs there is still a lot of work required towards establishing an adequately trained and resourced medical physics workforce in the Asia Pacific region. PMID:22970066
Tiwari, Sapana; Kumar, Ashu; Mangalgi, Smita; Rathod, Vedika; Prakash, Archana; Barua, Anita; Arora, Sonia; Sathyaseelan, Kannusamy
2013-01-01
Brucellosis is an important zoonotic infectious disease of humans and livestock with worldwide distribution and is caused by bacteria of the genus Brucella. The diagnosis of brucellosis always requires laboratory confirmation by either isolation of pathogens or detection of specific antibodies. The conventional serological tests available for the diagnosis of brucellosis are less specific and show cross-reactivity with other closely related organisms. These tests also necessitate the handling of Brucella species for antigen preparation. Therefore, there is a need to develop reliable, rapid, and user-friendly systems for disease diagnosis and alternatives to vaccine approaches. Keeping in mind the importance of brucellosis as an emerging infection and the prevalence in India, we carried out the present study to compare the recombinant antigens with the native antigens (cell envelope and sonicated antigen) of Brucella for diagnosis of human brucellosis by an indirect plate enzyme-linked immunosorbent assay (ELISA). Recombinant outer membrane protein 28 (rOmp28) and rOmp31 antigens were cloned, expressed, and purified in the bacterial expression system, and the purified proteins were used as antigens. Indirect plate ELISAs were then performed and standardized for comparison of the reactivities of recombinant and native antigens against the 433 clinical samples submitted for brucellosis testing, 15 culture-positive samples, and 20 healthy donor samples. The samples were separated into four groups based on their positivity to Rose Bengal plate agglutination tests (RBPTs), standard tube agglutination tests (STATs), and 2-mercaptoethanol (2ME) tests. The sensitivities and specificities of all the antigens were calculated, and the rOmp28 antigen was found to be more suitable for the clinical diagnosis of brucellosis than the rOmp31 antigen and native antigens. The rOmp28-based ELISA showed a very high degree of agreement with the conventional agglutination tests and promising results for further use in clinical screening and serodiagnosis of human brucellosis. PMID:23761658
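The sensitivity and specificity figures mentioned above follow the standard 2x2 confusion-table definitions, which can be sketched as follows. The counts in the example are hypothetical, chosen only to illustrate the arithmetic; they are not the study's data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard diagnostic-test metrics from a 2x2 confusion table.

    sensitivity = TP / (TP + FN)   (true-positive rate)
    specificity = TN / (TN + FP)   (true-negative rate)
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only: an assay that flags
# 14 of 15 true positives and clears 19 of 20 true negatives.
sens, spec = sensitivity_specificity(tp=14, fn=1, tn=19, fp=1)
```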
Medical physics aspects of cancer care in the Asia Pacific region: 2014 survey results.
Kron, Tomas; Azhari, H A; Voon, E O; Cheung, K Y; Ravindran, P; Soejoko, D; Inamura, K; Han, Y; Ung, N M; TsedenIsh, Bolortuya; Win, U M; Srivastava, R; Marsh, S; Farrukh, S; Rodriguez, L; Kuo, Men; Baggarley, S; DilipKumara, A H; Lee, C C; Krisanachinda, A; Nguyen, X C; Ng, K H
2015-09-01
It was the aim of this work to assess and track the workload, working conditions and professional recognition of radiation oncology medical physicists (ROMPs) in the Asia Pacific region over time. In this third survey since 2008, a structured questionnaire was mailed in 2014 to 22 senior medical physicists representing 23 countries. As in previous surveys, the questionnaire covered seven themes: (1) education, training and professional certification; (2) staffing; (3) typical tasks; (4) professional organisations; (5) resources; (6) research and teaching; and (7) job satisfaction. The response rate of 100% is a result of performing the survey through a network, which allows easy follow-up. The replies cover 4841 ROMPs in 23 countries. Compared to 2008, the number of medical physicists in many countries has doubled. However, the number of experienced ROMPs compared to the overall workforce is still small, especially in low- and middle-income countries. The increase in staff is matched by a similar increase in the number of treatment units over the years. Furthermore, the number of countries using complex techniques (IMRT, IGRT) or installing high-end equipment (tomotherapy, robotic linear accelerators) is increasing. Overall, ROMPs still feel generally overworked, and professional recognition, while varying widely, appears to be improving only slightly. Radiation oncology medical physics practice has not changed significantly over the last 6 years in the Asia Pacific Region even though the number of physicists and the number and complexity of treatment techniques and technologies have increased dramatically.
NASA Goddard Space Flight Center Robotic Processing System Program Automation Systems, volume 2
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1991-01-01
Topics related to robot operated materials processing in space (RoMPS) are presented in view graph form. Some of the areas covered include: (1) mission requirements; (2) automation management system; (3) Space Transportation System (STS) Hitchhiker Payload; (4) Spacecraft Command Language (SCL) scripts; (5) SCL software components; (6) RoMPS EasyLab Command & Variable summary for rack stations and annealer module; (7) support electronics assembly; (8) SCL uplink packet definition; (9) SC-4 EasyLab System Memory Map; (10) Servo Axis Control Logic Suppliers; and (11) annealing oven control subsystem.
Outschoorn, Ingrid M; Rose, Noel R; Burek, C Lynne; Jones, Tim W; Mackay, Ian R; Rowley, Merrill J
2005-06-01
The genetic control of the levels of autoantibodies has rarely been examined. We examined the heritability of autoantibodies to glutamic acid decarboxylase (GAD65) in type 1 diabetes, and to thyroglobulin (Tg) in chronic lymphocytic thyroiditis and thyrotoxicosis, using regression of offspring on midparent (ROMP) methods. Levels of autoantibodies in patients and their parents were significantly correlated in thyrotoxicosis (R2 = 0.569, p = 0.001), consistent with the reported Gm association, but not in chronic lymphocytic thyroiditis or type 1 diabetes. Extension of the procedure to other autoantibody disorders could be informative.
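The ROMP method described in this record reduces to an ordinary least-squares regression of each offspring's autoantibody level on the mean of the two parental levels, with the slope taken as the heritability estimate and R2 as the reported fit statistic. A minimal sketch of that calculation (the data below are synthetic, not the study's):

```python
import numpy as np

def romp_heritability(offspring, midparent):
    """Regression of Offspring on MidParent (ROMP): the OLS slope of
    offspring values on midparent means estimates heritability."""
    slope, _intercept = np.polyfit(midparent, offspring, 1)
    r = np.corrcoef(midparent, offspring)[0, 1]
    return slope, r ** 2

# Synthetic antibody levels with a built-in parent-offspring slope of 0.6.
rng = np.random.default_rng(0)
midparent = rng.normal(100.0, 15.0, size=200)
offspring = 0.6 * midparent + 40.0 + rng.normal(0.0, 5.0, size=200)
h2, r2 = romp_heritability(offspring, midparent)
```

With real family data, a slope significantly greater than zero (as in the thyrotoxicosis group here) is what signals positive heritability.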
NASA Technical Reports Server (NTRS)
Voellmer, George
1997-01-01
The Goddard Space Flight Center has developed the Robot Operated Materials Processing System (ROMPS) that flew aboard STS-64 in September 1994. The ROMPS robot transported pallets containing wafers of different materials from their storage racks to a furnace for thermal processing. A system of tapered guides and compliant springs was designed to deal with the potential misalignments. The robot and all the sample pallets were locked down for launch and landing. The design of the passive lockdown system, and the interplay between it and the alignment system, are presented.
Novel self-healing materials chemistries for targeted applications
NASA Astrophysics Data System (ADS)
Wilson, Gerald O.
Self-healing materials of the type developed by White and co-workers [1] were designed to autonomically heal themselves when damaged, thereby extending the lifetime of various applications in which such material systems are employed. The system was based on urea-formaldehyde microcapsules containing dicyclopentadiene (DCPD) and Grubbs' catalyst particles embedded together in an epoxy matrix. When a crack propagates through the material, it ruptures the microcapsules, releasing DCPD into the crack plane, where it comes in contact and reacts with the catalyst to initiate a ring opening metathesis polymerization (ROMP), bonding the crack and restoring structural continuity. The present work builds on this concept in several ways. Firstly, it expands the scope and versatility of the ROMP self-healing chemistry by incorporation into epoxy vinyl ester matrices. Major technical challenges in this application include protection of the catalyst from deactivation by aggressive curing agents, and optimization of the concentration of healing agents in the matrix. Secondly, new ruthenium catalysts are evaluated for application in ROMP-based self-healing materials. The use of alternative derivatives of Grubbs' catalyst gave rise to self-healing systems with improved healing efficiencies and thermal properties. Evaluation of the stability of these new catalysts to primary amine curing agents used in the curing of common epoxy matrices also led to the discovery and characterization of new ruthenium catalysts which exhibited ROMP initiation kinetics superior to those of first and second generation Grubbs' catalysts. Finally, free radical polymerization was evaluated for application in the development of bio-compatible self-healing materials. [1] White, S. R.; Sottos, N. R.; Geubelle, P. R.; Moore, J. S.; Kessler, M. R.; Sriram, S. R.; Brown, E. N.; Viswanathan, S. Nature 2001, 409, 794.
Outschoorn, Ingrid M; Hoffman, William H; Rose, Noel R; Burek, C Lynne
2007-07-01
Only a few methods can be applied in a simple manner to estimate the genetic control of autoimmunity in humans. Here we examined the heritability of autoantibodies to two thyroid antigens; thyroglobulin (Tg) and thyroperoxidase (TPO, formerly known as thyroid microsomal antigen), using methods of regression of offspring on mid-parental values (ROMP). With the data sets available, affected and unaffected siblings were compared by this rapid screening method using results determined by hemagglutination (HA). The presence of both types of autoantibodies showed positive heritability in patients with Graves' thyrotoxicosis (TT), but it was not observed in chronic lymphocytic or Hashimoto's thyroiditis (CLT) patients. Since these assays have been extensively used over the years by most diagnostic and research laboratories, they should provide some insight as to which quantifiable parameters may be usefully accumulated to help select groups of patients and their families for further genetic study. ROMP may also be useful to determine the sequential appearance of different types of antibody in predicting disease onset in other family members, and in distinguishing maternal and paternal effects on imprinting. The method may be extended to study epitope spreading and other measures of disease progression.
Simulation study of the ROMPS robot control system
NASA Technical Reports Server (NTRS)
Nguyen, Charles C.; Liu, Hui-I.
1994-01-01
This is a report presenting the progress of a research grant funded by NASA for work performed from June 1, 1993 to August 1, 1993. The report deals with the Robot Operated Material Processing System (ROMPS). It presents results of a computer simulation study conducted to investigate the performance of the control systems controlling the azimuth, elevation, and radial axes of the ROMPS and its gripper. Four study cases are conducted. The first case investigates the control of free motion along the three axes. In the second case, the compliant motion in the elevation axis with the wrist compliant device is studied in terms of position accuracy and impact forces. The third case focuses on the behavior of the control system in controlling the robot motion along the radial axis when pulling the pallet out of the rack. In the fourth case, the compliant motion of the gripper grasping a solid object under the effect of the gripper passive compliance is studied in terms of position accuracy and contact forces. For each of the above cases, a set of PID gains was selected to optimize controller performance, and computer simulation results are presented and discussed.
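This summary does not give the actual ROMPS controller structure or gains, so purely as an illustration of the kind of simulation described, here is a discrete PID position loop for a single axis modeled as a double-integrator (pure inertia) plant; the plant model, gains, and time step are all invented, not ROMPS values:

```python
# Hypothetical single-axis PID position loop; inertia, gains, and dt
# are illustrative assumptions, not values from the ROMPS study.
def simulate_pid(kp, ki, kd, setpoint=1.0, inertia=1.0, dt=0.001, steps=20000):
    pos, vel, integral, prev_err = 0.0, 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - pos
        integral += err * dt
        torque = kp * err + ki * integral + kd * (err - prev_err) / dt
        prev_err = err
        vel += (torque / inertia) * dt   # double-integrator plant
        pos += vel * dt
    return pos

final = simulate_pid(kp=100.0, ki=5.0, kd=20.0)
```

In a study like the one described, such a loop would be rerun over a grid of (kp, ki, kd) values, and the gains that minimize settling time, overshoot, and contact forces would be retained.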
Du, Z Q; Wang, J Y
2015-10-27
Brucella, an intracellular parasite that infects some livestock and humans, can damage or destroy the reproductive system of livestock. The syndrome is referred to as brucellosis and often occurs in pastoral areas; it is contagious from livestock to humans. In this study, the intact Brucella suis outer membrane protein 31 (omp31) gene was cloned, recombinantly expressed, and examined as a subunit vaccine candidate. The intact Brucella lumazine synthase (bls) gene was cloned and recombinantly expressed to study polymerization function in vitro. Non-reducing gel electrophoresis showed that rBs-BLS existed in different forms in vitro, including as a dimer and a pentamer. An enzyme-linked immunosorbent assay result showed that rOmp31 protein could induce production of an antibody in rabbits. However, the rOmp31-BLS fusion protein could elicit a much higher antibody titer in rabbits; this construct involved fusion of the Omp31 molecule with the BLS molecule. Our results indicate that Omp31 is involved in immune stimulation, while BLS has a polymerizing function based on rOmp31-BLS fusion protein immunogenicity. These data suggest that Omp31 is an ideal subunit vaccine candidate and that the BLS molecule is a favorable transport vector for antigenic proteins.
Mapping Optimal Charge Density and Length of ROMP-Based PTDMs for siRNA Internalization.
Caffrey, Leah M; deRonde, Brittany M; Minter, Lisa M; Tew, Gregory N
2016-10-10
A fundamental understanding of how polymer structure impacts internalization and delivery of biologically relevant cargoes, particularly small interfering ribonucleic acid (siRNA), is of critical importance to the successful design of improved delivery reagents. Herein we report the use of ring-opening metathesis polymerization (ROMP) methods to synthesize two series of guanidinium-rich protein transduction domain mimics (PTDMs): one based on an imide scaffold that contains one guanidinium moiety per repeat unit, and another based on a diester scaffold that contains two guanidinium moieties per repeat unit. By varying both the degree of polymerization and, in effect, the relative number of cationic charges in each PTDM, the performances of the two ROMP backbones for siRNA internalization were evaluated and compared. Internalization of fluorescently labeled siRNA into Jurkat T cells demonstrated that fluorescein isothiocyanate (FITC)-siRNA internalization had a charge content dependence, with PTDMs containing approximately 40 to 60 cationic charges facilitating the most internalization. Despite this charge content dependence, the imide scaffold yielded much lower viabilities in Jurkat T cells than the corresponding diester PTDMs with similar numbers of cationic charges, suggesting that the diester scaffold is preferred for siRNA internalization and delivery applications. These developments will not only improve our understanding of the structural factors necessary for optimal siRNA internalization, but will also guide the future development of optimized PTDMs for siRNA internalization and delivery.
de Groot, Florentine P; Robertson, Narelle M; Swinburn, Boyd A; de Silva-Sanigorski, Andrea M
2010-08-31
Obesity is a major public health issue; however, only limited evidence is available about effective ways to prevent obesity, particularly in early childhood. Romp & Chomp was a community-wide obesity prevention intervention conducted in Geelong Australia with a target group of 12,000 children aged 0-5 years. The intervention had an environmental and capacity building focus and we have recently demonstrated that the prevalence of overweight/obesity was lower in intervention children, post-intervention. Capacity building is defined as the development of knowledge, skills, commitment, structures, systems and leadership to enable effective health promotion and the aim of this study was to determine if the capacity of the Geelong community, represented by key stakeholder organisations, to support healthy eating and physical activity for young children was increased after Romp & Chomp. A mixed methods evaluation with three data sources was utilised. 1) Document analysis comprised assessment of the documented formative and intervention activities against a capacity building framework (five domains: Partnerships, Leadership, Resource Allocation, Workforce Development, and Organisational Development); 2) Thematic analysis of key informant interviews (n = 16); and 3) the quantitative Community Capacity Index Survey. Document analysis showed that the majority of the capacity building activities addressed the Partnerships, Resource Allocation and Organisational Development domains of capacity building, with a lack of activity in the Leadership and Workforce Development domains. The thematic analysis revealed the establishment of sustainable partnerships, use of specialist advice, and integration of activities into ongoing formal training for early childhood workers. 
Complex issues also emerged from the key informant interviews regarding the challenges of limited funding, high staff turnover, changing governance structures, lack of high level leadership and unclear communication strategies. The Community Capacity Index provided further evidence that the project implementation network achieved a moderate level of capacity. Romp & Chomp increased the capacity of organisations, settings and services in the Geelong community to support healthy eating and physical activity for young children. Despite this success there are important learnings from this mixed methods evaluation that should inform current and future community-based public health and health promotion initiatives. ANZCTRN12607000374460.
ROMPS critical design review. Volume 3: Furnace module design documentation
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
As part of the furnace module design documentation, the furnace module Easylab programs definitions and command variables are described. Also included are Easylab commands flow charts and fault conditions.
Romping through Summer in a Wheelchair.
ERIC Educational Resources Information Center
Nolan, Karen
1981-01-01
Children with physical handicaps can participate in many of the same summer camp activities as non-disabled persons. Described are the programs at Camp Merry Heart, operated by New Jersey's Easter Seal Society. (WB)
ROMP-based thermosetting polymers from modified castor oil with various cross-linking agents
NASA Astrophysics Data System (ADS)
Ding, Rui
Polymers derived from bio-renewable resources are seeing increasing global demand. In addition, polymers with distinctive functionalities are required in certain advanced fields, such as aerospace and civil engineering. To meet both needs, this work aims to develop a range of bio-based thermosetting matrix polymers for potential applications in multifunctional composites. Ring-opening metathesis polymerization (ROMP), which recently has been explored as a powerful method in polymer chemistry, was employed as a unique pathway to polymerize agricultural oil-based reactants. Specifically, a novel norbornyl-functionalized castor oil alcohol (NCA) was copolymerized with different cross-linking agents via ROMP. The effects of incorporating dicyclopentadiene (DCPD) and a norbornene-based crosslinker (CL) were systematically evaluated with respect to curing behavior and thermal mechanical properties of the polymers. Isothermal differential scanning calorimetry (DSC) was used to investigate the conversion during cure. Dynamic DSC scans at multiple heating rates revealed conversion-dependent activation energy by Ozawa-Flynn-Wall analysis. The glass transition temperature, storage modulus, and loss modulus for NCA/DCPD and NCA/CL copolymers with different cross-linking agent loadings were compared using dynamic mechanical analysis. Cross-link density was examined to explain the very different dynamic mechanical behavior. Mechanical stress-strain curves were developed through tensile testing, and thermal stability of the cross-linked polymers was evaluated by thermogravimetric analysis to further investigate the structure-property relationships in these systems.
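The Ozawa-Flynn-Wall analysis mentioned here extracts an apparent activation energy at fixed conversion from the slope of ln(heating rate) versus 1/T across runs at several heating rates; in the common Doyle approximation, Ea = -slope * R / 1.052. A sketch of the arithmetic (the heating rates and temperatures below are invented to satisfy a known Ea, not measured DSC data):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def ofw_activation_energy(heating_rates, temps_K):
    """Ozawa-Flynn-Wall (Doyle approximation): at constant conversion,
    ln(beta) = const - 1.052*Ea/(R*T), so the slope of ln(beta) vs 1/T
    gives Ea = -slope*R/1.052."""
    slope, _ = np.polyfit(1.0 / np.asarray(temps_K),
                          np.log(np.asarray(heating_rates)), 1)
    return -slope * R / 1.052  # Ea in J/mol

# Invented isoconversional data constructed to satisfy Ea = 80 kJ/mol.
Ea_true = 80e3
betas = np.array([2.0, 5.0, 10.0, 20.0])                 # heating rates, K/min
temps = (1.052 * Ea_true / R) / (25.8 - np.log(betas))   # temps at fixed conversion
Ea_est = ofw_activation_energy(betas, temps)
```

Repeating this regression at a series of conversion values is what yields the conversion-dependent activation energy profile the abstract refers to.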
Ring-Opening Metathesis Polymerization in Aqueous Media using a Macroinitiator Approach.
Foster, Jeffery; Varlas, Spyridon; Blackman, Lewis; Arkinstall, Lucy; O'Reilly, Rachel Kerry
2018-06-26
Water-soluble and amphiphilic polymers are of great interest to industry and academia, as they can be used in applications such as biomaterials and drug delivery. Whilst ring-opening metathesis polymerization (ROMP) is a fast and functional group tolerant methodology for the synthesis of a wide range of polymers, its full potential for the synthesis of water-soluble polymers has yet to be realized. To address this we report a general strategy for the synthesis of block copolymers in aqueous milieu using a commercially available ROMP catalyst and a macroinitiator approach. This allows for excellent control in the preparation of block copolymers in water. If the second monomer is chosen such that it forms a water-insoluble polymer, polymerization-induced self-assembly (PISA) occurs and a variety of self-assembled nano-object morphologies can be accessed. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lienkamp, Karen; Madkour, Ahmad E.; Musante, Ashlan; Nelson, Christopher F.; Nüsslein, Klaus
2014-01-01
Synthetic Mimics of Antimicrobial Peptides (SMAMPs) imitate natural host-defense peptides, a vital component of the body's immune system. This work presents a molecular construction kit that allows the easy and versatile synthesis of a broad variety of facially amphiphilic oxanorbornene-derived monomers. Their ring-opening metathesis polymerization (ROMP) and deprotection provide several series of SMAMPs. Using amphiphilicity, monomer feed ratio, and molecular weight as parameters, polymers with 533 times higher selectivity (selectivity = hemolytic concentration/minimum inhibitory concentration) for bacteria over mammalian cells were discovered. Some of these polymers were 50 times more selective for Gram-positive over Gram-negative bacteria, while other polymers surprisingly showed the opposite preference. This kind of "double selectivity" (bacteria over mammalian cells, and one bacterial type over another) is unprecedented in other polymer systems and is attributed to the monomers' facial amphiphilicity. PMID:18593128
Kronström, Mats H; Johnson, Glen H; Hompesch, Richard W
2010-01-01
A new elastomeric impression material has been formulated with a ring-opening metathesis chemistry. In addition to other properties of clinical significance, the impression accuracy must be confirmed. The purpose of this study was to compare the accuracy of the new elastomeric impression material with vinyl polysiloxane and polyether following both spray and immersion disinfection. Impressions of a modified dentoform with a stainless steel crown preparation in the lower right quadrant were made, and type IV gypsum working casts and dies were formed. Anteroposterior (AP), cross-arch (CA), buccolingual (BL), mesiodistal (MD), occlusogingivobuccal (OGB), and occlusogingivolingual (OGL) dimensions were measured using a microscope. Working cast and die dimensions were compared to those of the master model. The impression materials were a newly formulated, ring-opening metathesis-polymerization impression material (ROMP Cartridge Tray and ROMP Volume Wash), vinyl polysiloxane (VPS, Aquasil Ultra Monophase/LV), and a polyether (PE, Impregum Penta Soft/Permadyne Garant L). Fifteen impressions with each material were made, of which 5 were disinfected by spray for 10 minutes (CaviCide), 5 were disinfected by immersion for 90 minutes (ProCide D), and 5 were not disinfected. There were significant cross-product interactions with a 2-way ANOVA, so a 1-way ANOVA and Dunnett's T3 multiple comparison test were used to compare the dimensional changes of the 3 impression materials, by disinfection status and for each location (alpha=.05). For ROMP, there were no significant differences from the master, for any dimension, when comparing the control and 2 disinfectant conditions. No significant differences were detected among the 3 impression materials for CA, BL, and MD. The working die dimensions of OGB and OGL for VPS with immersion disinfection were significantly shorter than with PE and ROMP (P<.05). 
Overall, the AP dimension was more accurate than CA, and the BL of working dies was 0.040 mm greater in diameter than MD. The accuracy of gypsum working casts and working dies from the new and 2 existing types of impression material were similar, for both spray and immersion disinfection. Judicious application of a die spacer can compensate for the small differences observed. VPS may require additional laboratory accommodation to compensate for a shorter working die. Copyright 2010 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Korn, Ariella R; Hennessy, Erin; Hammond, Ross A; Allender, Steven; Gillman, Matthew W; Kasman, Matt; McGlashan, Jaimie; Millar, Lynne; Owen, Brynle; Pachucki, Mark C; Swinburn, Boyd; Tovar, Alison; Economos, Christina D
2018-05-31
Involving groups of community stakeholders (e.g., steering committees) to lead community-wide health interventions appears to support multiple outcomes ranging from policy and systems change to individual biology. While numerous tools are available to measure stakeholder characteristics, many lack detail on reliability and validity, are not context specific, and may not be sensitive enough to capture change over time. This study describes the development and reliability of a novel survey to measure Stakeholder-driven Community Diffusion via assessment of stakeholders' social networks, knowledge, and engagement about childhood obesity prevention. This study was completed in three phases. Phase 1 included conceptualization and online survey development through literature reviews and expert input. Phase 2 included a retrospective study with stakeholders from two completed whole-of-community interventions. Between May-October 2015, 21 stakeholders from the Shape Up Somerville and Romp & Chomp interventions recalled their social networks, knowledge, and engagement pre-post intervention. We also assessed one-week test-retest reliability of knowledge and engagement survey modules among Shape Up Somerville respondents. Phase 3 included survey modifications and a second prospective reliability assessment. Test-retest reliability was assessed in May 2016 among 13 stakeholders involved in ongoing interventions in Victoria, Australia. In Phase 1, we developed a survey with 7, 20 and 50 items for the social networks, knowledge, and engagement survey modules, respectively. In the Phase 2 retrospective study, the Shape Up Somerville and Romp & Chomp networks included 99 and 54 individuals. Pre-post Shape Up Somerville and Romp & Chomp mean knowledge scores increased by 3.5 points (95% CI: 0.35 to 6.72) and (95% CI: -0.42 to 7.42), respectively. Engagement scores did not change significantly (Shape Up Somerville: 1.1 points, 95% CI: -0.55 to 2.73; Romp & Chomp: 0.7 points, 95% CI: -0.43 to 1.73).
Intraclass correlation coefficients (ICCs) for knowledge and engagement were 0.88 (0.67-0.97) and 0.97 (0.89-0.99). In Phase 3, the modified knowledge and engagement survey modules included 18 and 25 items, respectively. Knowledge and engagement ICCs were 0.84 (0.62-0.95) and 0.58 (0.23-0.86). The survey measures upstream stakeholder properties-social networks, knowledge, and engagement-with good test-retest reliability. Future research related to Stakeholder-driven Community Diffusion should focus on prospective change and survey validation for intervention effectiveness.
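The test-retest reliabilities reported here are intraclass correlation coefficients; a common choice for two-session designs like this one is the two-way random-effects, absolute-agreement, single-measure form, ICC(2,1) in the Shrout and Fleiss notation. A minimal sketch of that computation (the survey scores below are invented, and the specific ICC form is an assumption, as the abstract does not name one):

```python
import numpy as np

def icc_2_1(scores):
    """Shrout & Fleiss ICC(2,1): two-way random effects, absolute
    agreement, single measure. `scores` is (n subjects x k sessions)."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-session means
    ssr = k * ((row_means - grand) ** 2).sum()   # between subjects
    ssc = n * ((col_means - grand) ** 2).sum()   # between sessions
    sse = ((scores - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    msr, msc, mse = ssr / (n - 1), ssc / (k - 1), sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented knowledge scores for 6 respondents, two sessions one week apart.
data = [[20, 21], [15, 15], [18, 17], [25, 26], [10, 11], [22, 22]]
icc = icc_2_1(data)
```

Stable scores across sessions with large between-respondent spread, as in this toy example, drive the ICC toward 1, matching the "good test-retest reliability" reading of the 0.84-0.97 values above.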
Using Lanthanide Nanoparticles as Isotopic Tags for Biomarker Detection by Mass Cytometry
NASA Astrophysics Data System (ADS)
Cao, Pengpeng
The development of robust, versatile, and high-throughput biosensing techniques has widespread implications for early disease detection and accurate diagnosis. An innovative technology, mass cytometry, has been developed to use isotopically-labelled antibodies to simultaneously study multiple parameters of single cells. The current detection sensitivity of mass cytometry is limited by the number of copies of a given isotope that can be attached to a given antibody. This thesis describes research on the synthesis, characterization, and bioconjugation of a new class of nanoparticle-based labelling agents to be employed for the detection of low-abundance biomarkers by mass cytometry. Hydrophobic lanthanide nanoparticles (Ln NPs) have been prepared by the Winnik group. To render the NPs water-soluble for biological applications, we coated the NP surface with a first generation of multidentate poly(ethylene glycol) (PEG)-based ligands via ligand exchange. We measured the size, morphology, and polydispersity of these hydrophilic NPs by transmission electron microscopy (TEM) and dynamic light scattering (DLS). The colloidal stability of the NPs was determined at various pH and in phosphate buffered saline (PBS) solutions. Tetradentate-PEG-coated NPs (Tetra-NPs) exhibited the best stability at pH 3 to 9, and in PBS. However, when cells were treated with Tetra-NPs in preliminary in vitro studies, significant undesirable non-specific binding (NSB) was observed. In order to tackle the NSB issue presented in the Tetra-NPs, we prepared a second generation of polymer-based ligands using ring-opening metathesis polymerization (ROMP). A small library of ROMP polymers was synthesized, characterized, and used to stabilize NPs in aqueous solutions. The ROMP-NPs were found to have significantly reduced NSB to cells by inductively coupled plasma-mass spectrometry (ICP-MS). 
To further modify the NPs, amine groups were introduced as functional handles to both the tetradentate-PEG and ROMP polymer ligands. These amine groups on the NP surface were used to conjugate to the antibodies via maleimide-thiol chemistry. The antigen-recognizing abilities of the antibody-NP conjugates were assessed using two cell lines (CD34-positive KG1a and CD34-negative HL60 cells) by ICP-MS and mass cytometry. It is hoped that the lessons learned from these studies will ultimately support the development of a new biosensing technique for early disease detection.
Ward, W. C.; Cunningham, K.J.; Renken, R.A.; Wacker, M.A.; Carlson, J.I.
2003-01-01
An analysis was made to describe and interpret the lithology of a part of the Upper Floridan aquifer penetrated by the Regional Observation Monitoring Program (ROMP) 29A test corehole in Highlands County, Florida. This information was integrated into a one-dimensional hydrostratigraphic model that delineates candidate flow zones and confining units in the context of sequence stratigraphy. Results from this test corehole will serve as a starting point to build a robust three-dimensional sequence-stratigraphic framework of the Floridan aquifer system. The ROMP 29A test corehole penetrated the Avon Park Formation, Ocala Limestone, Suwannee Limestone, and Hawthorn Group of middle Eocene to Pliocene age. The part of the Avon Park Formation penetrated in the ROMP 29A test corehole contains two composite depositional sequences. A transgressive systems tract and a highstand systems tract were interpreted for the upper composite sequence; however, only a highstand systems tract was interpreted for the lower composite sequence of the deeper Avon Park stratigraphic section. The composite depositional sequences are composed of at least five high-frequency depositional sequences. These sequences contain high-frequency cycle sets that are an amalgamation of vertically stacked high-frequency cycles. Three types of high-frequency cycles have been identified in the Avon Park Formation: peritidal, shallow subtidal, and deeper subtidal high-frequency cycles. The vertical distribution of carbonate-rock diffuse flow zones within the Avon Park Formation is heterogeneous. Porous vuggy intervals are less than 10 feet, and most are much thinner. The volumetric arrangement of the diffuse flow zones shows that most occur in the highstand systems tract of the lower composite sequence of the Avon Park Formation as compared to the upper composite sequence, which contains both a backstepping transgressive systems tract and a prograding highstand systems tract. 
Although the porous and permeable layers are not thick, some intervals may exhibit lateral continuity because of their deposition on a broad low-relief ramp. A thick interval of thin vuggy zones and open faults forms thin conduit flow zones mixed with relatively thicker carbonate-rock diffuse flow zones between depths of 1,070 and 1,244 feet below land surface (bottom of the test corehole). This interval is the most transmissive part of the Avon Park Formation penetrated in the ROMP 29A test corehole and is included in the highstand systems tract of the lower composite sequence. The Ocala Limestone is considered to be a semiconfining unit and contains three depositional sequences penetrated by the ROMP 29A test corehole. Because the Ocala Limestone was deposited within deeper subtidal depositional cycles, no zones of enhanced porosity and permeability are expected in it. A thin erosional remnant of the shallow marine Suwannee Limestone overlies the Ocala Limestone, and permeability seems to be comparatively low because moldic porosity is poorly connected. Rocks that comprise the lower Hawthorn Group, Suwannee Limestone, and Ocala Limestone form a permeable upper zone of the Upper Floridan aquifer, and rocks of the lower Ocala Limestone and Avon Park Formation form a permeable lower zone of the Upper Floridan aquifer. On the basis of a preliminary analysis of transmissivity estimates for wells located north of Lake Okeechobee, spatial relations among groups of relatively high and low transmissivity values within the upper zone are evident. Upper zone transmissivity is generally less than 10,000 feet squared per day in areas located south of a line that extends through Charlotte, Sarasota, DeSoto, Highlands, Polk, Osceola, Okeechobee, and St. Lucie Counties. Transmissivity patterns within the lower zone of the Avon Park Formation cannot be regionally assessed because sufficient data over a wide areal extent have not been compiled.
Measuring the Electro-Optic Coefficients of Bulk-poled Polymers
2012-09-01
polymethylmethacrylate (PMMA) was produced by CYRO Industries (Acrylite H15) and distributed by AMCO Plastics. All other chemicals were obtained from Sigma... Abbreviations: PMMA, polymethylmethacrylate; RMS, root-mean-square; ROMP, ring-opening metathesis polymer; Tg, glass transition temperature.
de Silva-Sanigorski, A; Elea, D; Bell, C; Kremer, P; Carpenter, L; Nichols, M; Smith, M; Sharp, S; Boak, R; Swinburn, B
2011-05-01
The Romp & Chomp intervention reduced the prevalence of overweight/obesity in pre-school children in Geelong, Victoria, Australia through an intervention promoting healthy eating and active play in early childhood settings. This study aims to determine if the intervention successfully created more health promoting family day care (FDC) environments. The evaluation had a cross-sectional, quasi-experimental design with the intervention FDC service in Geelong and a comparison sample from 17 FDC services across Victoria. A 45-item questionnaire capturing nutrition- and physical activity-related aspects of the policy, socio-cultural and physical environments of the FDC service was completed by FDC care providers (in 2008) in the intervention (n= 28) and comparison (n= 223) samples. Select results showed intervention children spent less time in screen-based activities (P= 0.03), organized active play (P < 0.001) and free inside play (P= 0.03) than comparison children. There were more rules related to healthy eating (P < 0.001), more care provider practices that supported children's positive meal experiences (P < 0.001), fewer unhealthy food items allowed (P= 0.05), higher odds of staff being trained in nutrition (P= 0.04) and physical activity (P < 0.001), lower odds of having set minimum times for outside (P < 0.001) and organized (P= 0.01) active play, and of rewarding children with food (P < 0.001). Romp & Chomp improved the FDC service to one that discourages sedentary behaviours and promotes opportunities for children to eat nutritious foods. Ongoing investment to increase children's physical activity within the setting and improving the capacity and health literacy of care providers is required to extend and sustain the improvements. © 2011 Blackwell Publishing Ltd.
Pinaud, Julien; Trinh, Thi Kim Hoang; Sauvanier, David; Placet, Emeline; Songsee, Sriprapai; Lacroix-Desmazes, Patrick; Becht, Jean-Michel; Tarablsi, Bassam; Lalevée, Jacques; Pichavant, Loïc; Héroguez, Valérie; Chemtob, Abraham
2018-01-09
1,3-Bis(mesityl)imidazolium tetraphenylborate (IMesH+ BPh4-) can be synthesized in one step by anion metathesis between the corresponding imidazolium chloride and sodium tetraphenylborate. In the presence of 2-isopropylthioxanthone (sensitizer), an IMes N-heterocyclic carbene (NHC) ligand can be photogenerated under irradiation at 365 nm through coupled electron/proton transfer reactions. By combining this tandem NHC photogenerator system with the metathesis-inactive [RuCl2(p-cymene)]2 precatalyst, the highly active RuCl2(p-cymene)(IMes) complex can be formed in situ, enabling complete ring-opening metathesis polymerization (ROMP) of norbornene in a matter of minutes at room temperature. To the best of our knowledge, this is the first example of a photogenerated NHC. Its exploitation in photoROMP has resulted in a simplified process compared to current photocatalysts, because only stable commercial or easily synthesized reagents are required. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Three-dimensional organization of block copolymers on "DNA-minimal" scaffolds.
McLaughlin, Christopher K; Hamblin, Graham D; Hänni, Kevin D; Conway, Justin W; Nayak, Manoj K; Carneiro, Karina M M; Bazzi, Hassan S; Sleiman, Hanadi F
2012-03-07
Here, we introduce a 3D-DNA construction method that assembles a minimum number of DNA strands in quantitative yield, to give a scaffold with a large number of single-stranded arms. This DNA frame is used as a core structure to organize other functional materials in 3D as the shell. We use the ring-opening metathesis polymerization (ROMP) to generate block copolymers that are covalently attached to DNA strands. Site-specific hybridization of these DNA-polymer chains on the single-stranded arms of the 3D-DNA scaffold gives efficient access to DNA-block copolymer cages. These biohybrid cages possess polymer chains that are programmably positioned in three dimensions on a DNA core and display increased nuclease resistance as compared to unfunctionalized DNA cages. © 2012 American Chemical Society
Near-term probabilistic forecast of significant wildfire events for the Western United States
Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly
2016-01-01
Fire danger and potential for large fires in the United States (US) is currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...
Self-consistency tests of large-scale dynamics parameterizations for single-column modeling
Edman, Jacob P.; Romps, David M.
2015-03-18
Large-scale dynamics parameterizations are tested numerically in cloud-resolving simulations, including a new version of the weak-pressure-gradient approximation (WPG) introduced by Edman and Romps (2014), the weak-temperature-gradient approximation (WTG), and a prior implementation of WPG. We perform a series of self-consistency tests with each large-scale dynamics parameterization, in which we compare the result of a cloud-resolving simulation coupled to WTG or WPG with an otherwise identical simulation with prescribed large-scale convergence. In self-consistency tests based on radiative-convective equilibrium (RCE; i.e., no large-scale convergence), we find that simulations either weakly coupled or strongly coupled to either WPG or WTG are self-consistent, but WPG-coupled simulations exhibit a nonmonotonic behavior as the strength of the coupling to WPG is varied. We also perform self-consistency tests based on observed forcings from two observational campaigns: the Tropical Warm Pool International Cloud Experiment (TWP-ICE) and the ARM Southern Great Plains (SGP) Summer 1995 IOP. In these tests, we show that the new version of WPG improves upon prior versions of WPG by eliminating a potentially troublesome gravity-wave resonance.
Autoantibody heritability in thyroiditis: IgG subclass contributions.
Outschoorn, Ingrid M; Talor, Monica V; Hoffman, William H; Rowley, Merrill J; Mackay, Ian R; Rose, Noel R; Burek, C Lynne
2011-05-01
Using a simple screening technique called regression of offspring on mid-parent (ROMP) to examine the role of IgG subclasses in affected and unaffected siblings of children and adolescents with autoimmune thyroid disease and their parents, both total-restricted and subclass-restricted autoantibodies to thyroglobulin (Tg) were assayed quantitatively for each of the IgG subclasses. There was a significant correlation of anti-Tg titer of probands with parental titers in thyrotoxicosis (TT) (R(2) = 0.569, p = 0.001), but not in chronic lymphocytic thyroiditis. The most striking correlation was in TT patients of African-American ancestry (R(2) = 0.9863, p = 0.0007). Additional insight is provided by examining the contributions of the IgG subclasses individually, particularly those whose concentrations appear not to have direct influence on the total IgG titers. Thus, using small numbers of patients, and assaying the IgG subclass distributions, as well as any other immunoglobulin isotypes that are significantly altered in autoantibody assays, ROMP can be performed rapidly to ascertain which quantifiable parameters may be usefully extended to predict disease onset and progression.
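At its core, the ROMP screen described above is an ordinary least-squares regression of each proband's autoantibody titer on the mid-parent (parental average) titer, with the slope read as an estimate of heritability. A minimal sketch of that calculation, using made-up illustrative titers rather than the study's data:

```python
# Regression of offspring on mid-parent (ROMP): the slope of an ordinary
# least-squares fit of offspring titer on the parental average estimates
# narrow-sense heritability.
def romp_slope(mid_parent, offspring):
    n = len(mid_parent)
    mx = sum(mid_parent) / n
    my = sum(offspring) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(mid_parent, offspring))
    sxx = sum((x - mx) ** 2 for x in mid_parent)
    return sxy / sxx

# Illustrative anti-Tg titers (hypothetical values, not study data).
parents_avg = [1.0, 2.0, 3.0, 4.0, 5.0]
probands = [1.2, 1.9, 3.1, 3.8, 5.0]
h2 = romp_slope(parents_avg, probands)
```

A slope near 1 would indicate strong parent-offspring resemblance; the R-squared values quoted in the abstract measure how tightly the points cluster around that fitted line.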
NASA Technical Reports Server (NTRS)
Koster, Randal; Walker, Greg; Mahanama, Sarith; Reichle, Rolf
2012-01-01
Continental-scale offline simulations with a land surface model are used to address two important issues in the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which the downscaling of seasonal precipitation forecasts, if it could be done accurately, would improve streamflow forecasts. The reduction in streamflow forecast skill (with forecasted streamflow measured against observations) associated with adding noise to a soil moisture field is found to be, to first order, proportional to the average reduction in the accuracy of the soil moisture field itself. This result has implications for streamflow forecast improvement under satellite-based soil moisture measurement programs. In the second and more idealized ("perfect model") analysis, precipitation downscaling is found to have an impact on large-scale streamflow forecasts only if two conditions are met: (i) evaporation variance is significant relative to the precipitation variance, and (ii) the subgrid spatial variance of precipitation is adequately large. In the large-scale continental region studied (the conterminous United States), these two conditions are met in only a somewhat limited area.
Maniac Talk - Pawan K. Bhartia
2014-08-27
Pawan K. Bhartia Maniac Lecture, August 27, 2014 NASA climate scientist Dr. P.K. Bhartia presented a Maniac Talk entitled "Maxwell Demon, Black Swan and a Romp in Scientific Hinterlands." PK discussed his roller coaster career, which got nearly derailed after a brief tryst with history and his obsession for understanding esoteric details of measurements that once in a while leads to something interesting.
2008-12-01
attached DR1 to a tunable high glass transition temperature (Tg) polymeric backbone prepared by ROMP. Figure 1. Standard and required poling...approximately 13-15 g of polymer. The remainder of the mixed polymer adhered to the screw or barrel. Norbornyl-DR1 monomer (1). 5-norbornene-2-carboxylic acid
Adhesion strength of norbornene-based self-healing agents to an amine-cured epoxy
NASA Astrophysics Data System (ADS)
Huang, Guang Chun; Lee, Jong Keun; Kessler, Michael R.; Yoon, Sungho
2009-07-01
Self-healing is triggered by crack propagation through embedded microcapsules in an epoxy matrix, which then release the liquid healing agent into the crack plane. Subsequent exposure of the healing agent to the chemical catalyst initiates ring-opening metathesis polymerization (ROMP) and bonding of the crack faces. In order to improve self-healing functionality, it is necessary to enhance adhesion of polymerized healing agent within the crack to the matrix resin. In this study, shear bond strength between different norbornene-based healing agents and an amine-cured epoxy resin was evaluated using the single lap shear test method (ASTM D3163, modified). The healing agents tested include endo-dicyclopentadiene (endo-DCPD), 5-ethylidene-2-norbornene (ENB) and DCPD/ENB blends. 5-Norbornene-2-methanol (NBM) was used as an adhesion promoter, containing hydroxyl groups to form hydrogen bonds with the amine-cured epoxy. A custom synthesized norbornene-based crosslinking agent was also added to improve adhesion for ENB by increasing the crosslinking density of the adhesive after ROMP. The healing agents were polymerized with varying loadings of the 1st generation Grubbs' catalyst at different reaction times and temperatures.
NASA Astrophysics Data System (ADS)
Diot-Néant, Florian; Migeot, Loïs; Hollande, Louis; Reano, Felix A.; Domenek, Sandra; Allais, Florent
2017-12-01
Antioxidant norbornene-based monomers bearing biobased sterically hindered phenols (SHP) - NDF (norbornene dihydroferulate) and NDS (norbornene dihydrosinapate) - have been successfully prepared through biocatalysis from naturally occurring ferulic and sinapic acids, respectively, in the presence of Candida antarctica Lipase B (Cal-B). The ring opening metathesis polymerization (ROMP) of these monomers was investigated according to ruthenium catalyst type (GI vs. HGII) and monomer to catalyst molar ratio ([M]/[C]). The co-polymerization of antioxidant functionalized monomer (NDF or NDS) and non-active norbornene (N) has also been performed in order to adjust the number of SHP groups present per weight unit and tune the antioxidant activity of the copolymers. The polydispersity of the resulting copolymers was readily improved by a simple acetone wash to provide antioxidant polymers with well-defined structures. After hydrogenation with p-toluenesulfonylhydrazine (p-TSH), the radical scavenging ability of the resulting saturated polymers was evaluated using α,α-diphenyl-β-picrylhydrazyl (DPPH) analysis. Results demonstrated that polymers bearing sinapic acid SHP exhibited higher antiradical activity than the polymer bearing ferulic acid SHP. In addition, it was shown that only a small SHP content was needed in the copolymers to exhibit a potent antioxidant activity.
NASA Astrophysics Data System (ADS)
Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.
2017-12-01
Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards 'over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but often comes at the substantial cost of forecast system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short to medium range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecasting System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs.
The system produces not only streamflow forecasts (using the MizuRoute channel routing tool) but also distributed model states such as soil moisture and snow water equivalent. We also describe challenges in distributed model-based forecasting, including the application and early results of real-time hydrologic data assimilation.
NASA Astrophysics Data System (ADS)
Li, Ji; Chen, Yangbo; Wang, Huanyu; Qin, Jianming; Li, Jie; Chiao, Sen
2017-03-01
Long lead time flood forecasting is very important for large watershed flood mitigation as it provides more time for flood warning and emergency responses. The latest numerical weather forecast models can provide 1-15-day quantitative precipitation forecasting products in grid format, and coupling these products with a distributed hydrological model can produce long lead time watershed flood forecasting products. This paper studied the feasibility of coupling the Liuxihe model with the Weather Research and Forecasting quantitative precipitation forecast (WRF QPF) for large watershed flood forecasting in southern China. The WRF QPF products have three lead times, 24, 48 and 72 h, with a grid resolution of 20 km × 20 km. The Liuxihe model is set up with freely downloaded terrain properties; the model parameters were first optimized with rain gauge observed precipitation, and re-optimized with the WRF QPF. Results show that the WRF QPF is biased relative to the rain gauge precipitation, and a post-processing method is proposed to correct the WRF QPF products, which improves the flood forecasting capability. With model parameter re-optimization, the model's performance also improves. This suggests that the model parameters should be optimized with the QPF, not the rain gauge precipitation. As lead time increases, the accuracy of the WRF QPF decreases, as does the flood forecasting capability. Flood forecasting products produced by coupling the Liuxihe model with the WRF QPF provide a good reference for large watershed flood warning due to their long lead time and rational results.
Controlled Ring-Opening Metathesis Polymerization by Molybdenum and Tungsten Alkylidene Complexes
1988-07-29
weights and low polydispersities (as low as 1.03) consistent with a living catalyst system employing 50, 100, 200, and 400 eq of monomer. The reactions are...secondary metathesis of polymer chains Bulky alkoxide ligands Wittig-like reaction Ring-opening metathesis polymerization (ROMP) Feast monomer Cyclic...olefins Retro Diels-Alder reaction Norbornene (NBE) Low temperature column chromatography Endo-,endo-5,6-dicarbomethoxynorbornene Discrete, soluble
Skipping toward Seniority: One Queer Scholar's Romp through the Weeds of Academe
ERIC Educational Resources Information Center
Lugg, Catherine A.
2017-01-01
This reflective essay, which is both autobiographical and historical in nature, is framed by answering the questions posed by the editors regarding my work: What values inform it, how I actually do it, and why do I do it? Quite simply, I am writing to encourage social change for all queer people, be it merely the little corner of my own social…
NASA Astrophysics Data System (ADS)
Shankar, Chandrashekar
The goal of this research was to gain a fundamental understanding of the properties of networks created by the ring opening metathesis polymerization (ROMP) of dicyclopentadiene (DCPD) used in self-healing materials. To this end we used molecular simulation methods to generate realistic structures of DCPD networks, characterize their structures, and determine their mechanical properties. Density functional theory (DFT) calculations, complemented by structural information derived from molecular dynamics simulations, were used to reconstruct experimental Raman spectra and differential scanning calorimetry (DSC) data. We performed coarse-grained simulations comparing networks generated via the ROMP reaction process to those generated via a RANDOM process, which led to the fundamental realization that the polymer topology has a unique influence on the network properties. We carried out fully atomistic simulations of DCPD using a novel algorithm for recreating ROMP reactions of DCPD molecules. Mechanical properties derived from these atomistic networks are in excellent agreement with those obtained from coarse-grained simulations in which interactions between nodes are subject to angular constraints. This comparison provides self-consistent validation of our simulation results and helps to identify the level of detail necessary for the coarse-grained interaction model. Simulations suggest networks can be classified into three stages: fluid-like, rubber-like or glass-like, delineated by two thresholds in degree of reaction alpha: the onset of finite magnitudes for the Young's modulus, alphaY, and the departure of the Poisson ratio from 0.5, alphaP. In each stage the polymer exhibits a different predominant mechanical response to deformation. At low alpha < alphaY it flows. At alphaY < alpha < alphaP the response is entropic, with no change in internal energy. At alpha > alphaP the response is enthalpic, with a change in internal energy.
We developed graph theory-based network characterizations to correlate network topology with the simulated mechanical properties: (1) eigenvector centrality, (2) graph fractal dimension, (3) Fiedler partitioning, and (4) cross-link fraction (Q3+Q4). Of these quantities, the Fiedler partition is the best characteristic for the prediction of Young's modulus. The new computational tools developed in this research are of great fundamental and practical interest.
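Among the descriptors listed, the cross-link fraction (Q3+Q4) is the simplest: the share of monomer nodes bonded to three or four neighbors. A minimal stdlib-only sketch on a toy edge list (a hypothetical graph, not an actual DCPD network):

```python
from collections import Counter

def crosslink_fraction(edges, n):
    """Fraction of the n network nodes with degree 3 or 4 (Q3 + Q4 sites)."""
    deg = Counter()
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return sum(1 for node in range(n) if deg[node] in (3, 4)) / n

# Toy 5-node network: node 0 is bonded to four neighbors, which are chain ends,
# so exactly 1 of the 5 nodes counts as a cross-link site.
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
q34 = crosslink_fraction(edges, 5)
```

The Fiedler quantities mentioned in the abstract require an eigensolver for the graph Laplacian and are omitted here for brevity.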
Curtis, Elana; Wikaire, Erena; Jiang, Yannan; McMillan, Louise; Loto, Robert; Poole, Phillippa; Barrow, Mark; Bagg, Warwick; Reid, Papaarangi
2017-01-01
Objective To determine associations between admission markers of socioeconomic status, transitioning, bridging programme attendance and prior academic preparation on academic outcomes for indigenous Māori, Pacific and rural students admitted into medicine under access pathways designed to widen participation. Findings were compared with students admitted via the general (usual) admission pathway. Design Retrospective observational study using secondary data. Setting 6-year medical programme (MBChB), University of Auckland, Aotearoa New Zealand. Students are selected and admitted into Year 2 following a first year (undergraduate) or prior degree (graduate). Participants 1676 domestic students admitted into Year 2 between 2002 and 2012 via three pathways: GENERAL admission (1167), Māori and Pacific Admission Scheme—MAPAS (317) or Rural Origin Medical Preferential Entry—ROMPE (192). Of these, 1082 students completed the programme in the study period. Main outcome measures Graduated from medical programme (yes/no), academic scores in Years 2–3 (Grade Point Average (GPA), scored 0–9). Results 735/778 (95%) of GENERAL, 111/121 (92%) of ROMPE and 146/183 (80%) of MAPAS students graduated from intended programme. The graduation rate was significantly lower in the MAPAS students (p<0.0001). The average Year 2–3 GPA was 6.35 (SD 1.52) for GENERAL, which was higher than 5.82 (SD 1.65, p=0.0013) for ROMPE and 4.33 (SD 1.56, p<0.0001) for MAPAS. Multiple regression analyses identified three key predictors of better academic outcomes: bridging programme attendance, admission as an undergraduate and admission GPA/Grade Point Equivalent (GPE). Attending local urban schools and higher school deciles were also associated with a greater likelihood of graduation. All regression models have controlled for predefined baseline confounders (gender, age and year of admission). 
Conclusions There were varied associations between admission variables and academic outcomes across the three admission pathways. Equity-targeted admission programmes inclusive of variations in academic threshold for entry may support a widening participation agenda, however, additional academic and pastoral supports are recommended. PMID:28847768
Solar flare predictions and warnings
NASA Technical Reports Server (NTRS)
White, K. P., III
1972-01-01
The real-time solar monitoring information supplied to support SPARCS equipped rocket launches, the routine collection and analysis of 3.3-mm solar radio maps, short-term flare forecasts based on these maps, longer-term forecasts based on the recurrence of active regions, and an extension of the flare forecasting technique are summarized. Forecasts for expectation of a solar flare of class ≥ 2F are given and compared with observed flares. A total of 52 plage regions produced all the flares of class ≥ 1N during the study period. The following results are indicated: of the total of 21 positive forecasts, 3 were correct and 18 were incorrect; of the total of 31 negative forecasts, 3 were incorrect and 28 were correct; of a total of 6 plage regions producing large flares, 3 were correctly forecast and 3 were missed; and of 46 regions not producing any large flares, 18 were incorrectly forecast and 28 were correctly forecast.
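The tallies above form a standard 2x2 contingency table (hits = 3, false alarms = 18, misses = 3, correct negatives = 28), from which common verification scores follow directly. A small sketch using those reported counts (the score names are standard verification terminology, not from the original report):

```python
def verification_scores(hits, false_alarms, misses, correct_negs):
    """Standard 2x2 contingency-table scores for a binary forecast."""
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    total = hits + false_alarms + misses + correct_negs
    accuracy = (hits + correct_negs) / total    # fraction of correct forecasts
    return pod, far, accuracy

# Counts reported for the flare forecasts in the abstract.
pod, far, acc = verification_scores(3, 18, 3, 28)
```

With these numbers, half of the flaring regions were detected (POD = 0.5) but most positive forecasts were false alarms (FAR ≈ 0.86), illustrating why both scores are needed.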
DOE R&D Accomplishments Database
Schrock, R. R.
1993-12-01
Four studies are reported: living cyclopolymerization of diethyl dipropargylmalonate by Mo(CH-t-Bu)(NAr)[OCMe(CF3)2]2 in dimethoxyethane, effect of chain length on conductivity of polyacetylene, nonlinear optical analysis of a series of triblock copolymers containing model polyenes, and synthesis of bifunctional hexafluoro-t-butoxide Mo species and their use as initiators in ROMP reactions.
Vatansever, Fatma; Hamblin, Michael R.
2016-01-01
Core–shell CdSe/ZnS quantum dots (QDs) are useful as tunable photostable fluorophores for multiple applications in industry, biology, and medicine. However, to achieve the optimum optical properties, the surface of the QDs must be passivated to remove charged sites that might bind extraneous substances and allow aggregation. Here we describe a method of growing an organic polymer corona onto the QD surface using the bottom-up approach of surface-initiated ring-opening metathesis polymerization (SI-ROMP) with Grubbs catalyst. CdSe/ZnS QDs were first coated with mercaptopropionic acid by displacing the original trioctylphosphine oxide layer, and then reacted with 7-octenyl dimethyl chlorosilane. The resulting octenyl double bonds allowed the attachment of ruthenium alkylidene groups as a catalyst. A subsequent metathesis reaction with strained bicyclic monomers (norbornene-dicarbonyl chloride (NDC), and a mixture of NDC and norbornenylethylisobutyl-polyhedral oligomeric silsesquioxane (norbornoPOSS)) allowed the construction of tethered organic homo-polymer or co-polymer layers onto the QD. Compounds were characterized by FT-IR, 1H-NMR, X-ray photoelectron spectroscopy, differential scanning calorimetry, and transmission electron microscopy. Atomic force microscopy showed that the coated QDs were separate and non-aggregated with a range of diameter of 48–53 nm. PMID:28360819
United States Geological Survey fire science: fire danger monitoring and forecasting
Eidenshink, Jeff C.; Howard, Stephen M.
2012-01-01
Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands of the distributions of the number of ignitions, the number of fires above a given size, and the conditional probabilities of fires growing larger than a specified size. The large fire probability map is an estimate of the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on Federal lands exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is the probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.
2016-09-01
Laboratory Change in Weather Research and Forecasting (WRF) Model Accuracy with Age of Input Data from the Global Forecast System (GFS) by JL Cogan...analysis. As expected, accuracy generally tended to decline as the large-scale data aged, but appeared to improve slightly as the age of the large...19 Table 7 Minimum and maximum mean RMDs for each WRF time (or GFS data age) category. Minimum and
Statistical model for forecasting monthly large wildfire events in western United States
Haiganoush K. Preisler; Anthony L. Westerling
2006-01-01
The ability to forecast the number and location of large wildfire events (with specified confidence bounds) is important to fire managers attempting to allocate and distribute suppression efforts during severe fire seasons. This paper describes the development of a statistical model for assessing the forecasting skills of fire-danger predictors and producing 1-month-...
Student Enrollment Forecasting Techniques for Higher Education.
ERIC Educational Resources Information Center
Ahrens, Stephen W.
Various techniques used by state agencies, secondary schools, community colleges, and large universities to forecast enrollments are described and guidelines for constructing forecasting procedures are outlined. The forecasting techniques are divided into three categories: (1) quantitative techniques based on historical data that attempt curve…
NASA Astrophysics Data System (ADS)
Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.
2005-12-01
Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
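The ROC evaluation described amounts to sweeping a threshold over each forecast's scores, plotting hit rate against false alarm rate, and preferring the forecast with the larger area under the curve. A generic sketch with synthetic scores (not the actual PI or RI maps):

```python
def roc_points(scores, outcomes):
    """Hit rate vs. false alarm rate as the decision threshold descends."""
    pairs = sorted(zip(scores, outcomes), reverse=True)
    pos = sum(outcomes)
    neg = len(outcomes) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, outcome in pairs:
        if outcome:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Synthetic example: a skillful forecast ranks the two event cells highest,
# so the ROC curve hugs the top-left corner.
pts = roc_points([0.9, 0.8, 0.4, 0.2], [1, 1, 0, 0])
```

Comparing two forecasts then reduces to comparing their AUC values on the same set of cells and outcomes.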
SNAr-Based, facile synthesis of a library of Benzothiaoxazepine-1,1’-dioxides
Rolfe, Alan; Samarakoon, Thiwanka B.; Klimberg, Sarra V.; Brzozowski, Marek; Neuenswander, Benjamin; Lushington, Gerald H.
2011-01-01
The construction of a library of benzothiaoxazepine-1,1’-dioxides utilizing a one-pot, SNAr diversification – ODCT50 scavenging protocol is reported. This protocol combines microwave irradiation to facilitate the reaction, in conjunction with a soluble ROMP-derived scavenger (ODCT) to afford the desired products in good overall purity. Utilizing this protocol, a 78-member library was successfully synthesized and submitted for biological evaluation. PMID:20879738
NASA Technical Reports Server (NTRS)
Blankenship, Clay; Zavodsky, Bradley; Jedlovec, Gary; Wick, Gary; Neiman, Paul
2013-01-01
Atmospheric rivers are transient, narrow regions in the atmosphere responsible for the transport of large amounts of water vapor. These phenomena can have a large impact on precipitation. In particular, they can be responsible for intense rain events on the western coast of North America during the winter season. This paper focuses on attempts to improve forecasts of heavy precipitation events in the Western US due to atmospheric rivers. Profiles of water vapor derived from Atmospheric Infrared Sounder (AIRS) observations are combined with GFS forecasts by a three-dimensional variational data assimilation in the Gridpoint Statistical Interpolation (GSI). Weather Research and Forecasting (WRF) forecasts initialized from the combined field are compared to forecasts initialized from the GFS forecast only for three test cases in the winter of 2011. Results will be presented showing the impact of the AIRS profile data on water vapor and temperature fields, and on the resultant precipitation forecasts.
NASA Astrophysics Data System (ADS)
Wanders, Niko; Wood, Eric
2016-04-01
Sub-seasonal to seasonal weather and hydrological forecasts have the potential to provide vital information for a variety of water-related decision makers. For example, seasonal forecasts of drought risk can enable farmers to make adaptive choices on crop varieties, labour usage, and technology investments. Seasonal and sub-seasonal predictions can increase preparedness to hydrological extremes that regularly occur in all regions of the world with large impacts on society. We investigated the skill of six seasonal forecast models from the NMME-2 ensemble coupled to two global hydrological models (VIC and PCRGLOBWB) for the period 1982-2012. The 31 years of NMME-2 hindcast data are used in combination with an ensemble mean and an ESP forecast, to forecast important hydrological variables (e.g. soil moisture, groundwater storage, snow, reservoir levels and river discharge). By using two global hydrological models we are able to quantify both the uncertainty in the meteorological input and the uncertainty created by the different hydrological models. We show that the NMME-2 forecast outperforms the ESP forecasts in terms of anomaly correlation and Brier skill score for all forecasted hydrological variables, with a low uncertainty in the performance amongst the hydrological models. However, the continuous ranked probability score (CRPS) of the NMME-2 ensemble is inferior to the ESP due to a large spread between the individual ensemble members. We use a cost analysis to show that the damage caused by floods and droughts in large scale rivers can globally be reduced by 48% (for leads from 1-2 months) to 20% (for leads between 6-9 months) when precautions are taken based on the NMME-2 ensemble instead of an ESP forecast. In collaboration with our local partner in West Africa (AGHRYMET), we looked at the performance of the sub-seasonal forecasts for crop planting dates and high flow season in West Africa.
We show that the uncertainty in the optimal planting date is reduced from 30 days to 12 days (2.5-month lead) and the predictability of the high flow season is increased from 45 days to 20 days (3-4 month lead). Additionally, we show that snow accumulation and melt onset in the Northern hemisphere can be forecasted with an uncertainty of 10 days (2.5-month lead). Both the overall skill and the skill found in these last two examples indicate that the new NMME-2 forecast dataset is valuable for sub-seasonal forecast applications. The high temporal resolution (daily), long leads (one-year leads) and large hindcast archive enable new sub-seasonal forecasting applications to be explored. We show that the NMME-2 has a large potential for sub-seasonal hydrological forecasting and other potential hydrological applications (e.g. reservoir management), which could benefit from these new forecasts.
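Skill comparisons like the one above rest on standard probabilistic scores; the Brier skill score, for example, measures a probability forecast's mean squared error relative to a reference such as ESP, with positive values indicating the forecast beats the reference. A minimal sketch with illustrative numbers (hypothetical probabilities, not NMME-2 data):

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against binary outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(forecast, reference, outcomes):
    """BSS = 1 - BS_forecast / BS_reference; BSS > 0 beats the reference."""
    return 1.0 - brier_score(forecast, outcomes) / brier_score(reference, outcomes)

# Hypothetical flood-occurrence probabilities from a model vs. a flat
# climatological (ESP-like) reference of 0.5.
outcomes = [1, 0, 1, 0]
model = [0.8, 0.2, 0.7, 0.3]
reference = [0.5, 0.5, 0.5, 0.5]
bss = brier_skill_score(model, reference, outcomes)
```

The CRPS mentioned in the abstract generalizes this idea from a single event probability to a full forecast distribution.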
NASA Astrophysics Data System (ADS)
LI, J.; Chen, Y.; Wang, H. Y.
2016-12-01
In large basin flood forecasting, the forecasting lead time is very important. Advances in numerical weather forecasting in the past decades provide new input to extend flood forecasting lead time in large rivers. The current challenge in fulfilling this goal is that the uncertainty of QPF from these NWP models is still high, so controlling QPF uncertainty is an emerging technical requirement. The Weather Research and Forecasting (WRF) model is one of these NWPs, and how to control the QPF uncertainty of WRF is the research topic of many researchers in the meteorological community. In this study, the QPF products in the Liujiang river basin, a big river with a drainage area of 56,000 km2, were first compared with observed precipitation from a rain gauge network, and the results show that the uncertainty of the WRF QPF is relatively high. A post-processing algorithm that correlates the QPF with the observed precipitation is therefore proposed to remove the systematic bias in QPF. With this algorithm, the post-processed WRF QPF is close to the observed area-averaged precipitation. The precipitation is then coupled with the Liuxihe model, a physically based distributed hydrological model that is widely used in small watershed flash flood forecasting. The Liuxihe model readily accepts gridded precipitation from NWP, can optimize model parameters even when only limited observed hydrological data are available, has very high model resolution to improve model performance, and runs on high-performance supercomputers with a parallel algorithm when applied to large rivers. Two flood events in the Liujiang River were collected; one was used to optimize the model parameters and the other to validate the model. The results show that the river flow simulation has been improved largely, and could be used in real-time flood forecasting trials to extend the flood forecasting lead time.
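The post-processing step described, removing systematic bias by relating WRF QPF to gauge observations, can be as simple as a multiplicative bias factor fitted on past events. A hedged sketch of such a correction (toy values, not the Liujiang record; the study's actual algorithm may differ):

```python
def bias_factor(qpf, observed):
    """Multiplicative bias: total observed over total forecast precipitation."""
    return sum(observed) / sum(qpf)

def correct_qpf(qpf_field, factor):
    """Scale every grid-cell QPF value by the fitted bias factor."""
    return [v * factor for v in qpf_field]

# Toy historical record where WRF QPF runs systematically high vs. gauges.
factor = bias_factor([10.0, 20.0, 30.0], [8.0, 16.0, 24.0])
new_field = correct_qpf([5.0, 12.5], factor)
```

The corrected field then drives the hydrological model in place of the raw QPF; a regression-based correction would follow the same pattern with a fitted slope and intercept.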
A Unified Data Assimilation Strategy for Regional Coupled Atmosphere-Ocean Prediction Systems
NASA Astrophysics Data System (ADS)
Xie, Lian; Liu, Bin; Zhang, Fuqing; Weng, Yonghui
2014-05-01
Improving tropical cyclone (TC) forecasts is a top priority in weather forecasting. Assimilating various observational data to produce better initial conditions for numerical models using advanced data assimilation techniques has been shown to benefit TC intensity forecasts, whereas assimilating large-scale environmental circulation into regional models by spectral nudging or Scale-Selective Data Assimilation (SSDA) has been demonstrated to improve TC track forecasts. Meanwhile, accounting for various air-sea interaction processes with high-resolution coupled air-sea modelling systems has also been shown to improve TC intensity forecasts. Despite these advances in data assimilation and air-sea coupled models, large errors in TC intensity and track forecasting remain. For example, Hurricane Nate (2011) posed a considerable challenge for the TC operational forecasting community, with very large official intensity forecast errors (27, 25, and 40 kts for 48, 72, and 96 h, respectively). Considering the slow-moving nature of Hurricane Nate, it is reasonable to hypothesize that air-sea interaction processes played a critical role in the intensity change of the storm, and that an accurate representation of upper-ocean dynamics and thermodynamics is necessary to quantitatively describe those processes. Currently, data assimilation techniques are generally applied to hurricane forecasting only in stand-alone atmospheric or oceanic models; indeed, most regional hurricane forecasting models include data assimilation only for improving the initial condition of the atmospheric model. In such a situation, the benefit of adjustments in one model (atmospheric or oceanic) obtained by assimilating observational data can be compromised by errors from the other model.
Thus, unified data assimilation techniques for coupled air-sea modelling systems, which not only simultaneously assimilate atmospheric and oceanic observations into the coupled system but also nudge the large-scale environmental flow in the regional model towards global model forecasts, are increasingly necessary. In this presentation, we outline a strategy for an integrated approach to air-sea coupled data assimilation and discuss its benefits and feasibility based on preliminary results for selected historical hurricane cases.
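The idea behind scale-selective nudging can be sketched in one dimension: relax only the large-scale part of the regional field toward the global forecast while keeping the regional small-scale detail. This is a conceptual illustration only; a crude moving-average filter stands in for the spectral filtering used in actual SSDA implementations, and all names and parameter values are assumptions:

```python
def running_mean(field, window=5):
    """Crude 'large-scale' filter: centered moving average, edges clamped."""
    half = window // 2
    n = len(field)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(field[lo:hi]) / (hi - lo))
    return out

def scale_selective_nudge(regional, global_fc, window=5, alpha=0.5):
    """Relax only the large-scale component of the regional field toward
    the global forecast; the regional small-scale detail is untouched."""
    large_r = running_mean(regional, window)
    large_g = running_mean(global_fc, window)
    return [r + alpha * (lg - lr)
            for r, lr, lg in zip(regional, large_r, large_g)]

# A flat regional field nudged halfway toward a flat global field:
nudged = scale_selective_nudge([1.0] * 10, [3.0] * 10)
```

The relaxation coefficient alpha plays the role of the nudging strength; real systems apply it per wavenumber and per model level.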
Curtis, Elana; Wikaire, Erena; Jiang, Yannan; McMillan, Louise; Loto, Robert; Poole, Phillippa; Barrow, Mark; Bagg, Warwick; Reid, Papaarangi
2017-08-27
To determine associations between admission markers of socioeconomic status, transitioning, bridging programme attendance and prior academic preparation on academic outcomes for indigenous Māori, Pacific and rural students admitted into medicine under access pathways designed to widen participation. Findings were compared with students admitted via the general (usual) admission pathway. Retrospective observational study using secondary data. 6-year medical programme (MBChB), University of Auckland, Aotearoa New Zealand. Students are selected and admitted into Year 2 following a first year (undergraduate) or prior degree (graduate). 1676 domestic students admitted into Year 2 between 2002 and 2012 via three pathways: GENERAL admission (1167), Māori and Pacific Admission Scheme-MAPAS (317) or Rural Origin Medical Preferential Entry-ROMPE (192). Of these, 1082 students completed the programme in the study period. Graduated from medical programme (yes/no), academic scores in Years 2-3 (Grade Point Average (GPA), scored 0-9). 735/778 (95%) of GENERAL, 111/121 (92%) of ROMPE and 146/183 (80%) of MAPAS students graduated from intended programme. The graduation rate was significantly lower in the MAPAS students (p<0.0001). The average Year 2-3 GPA was 6.35 (SD 1.52) for GENERAL, which was higher than 5.82 (SD 1.65, p=0.0013) for ROMPE and 4.33 (SD 1.56, p<0.0001) for MAPAS. Multiple regression analyses identified three key predictors of better academic outcomes: bridging programme attendance, admission as an undergraduate and admission GPA/Grade Point Equivalent (GPE). Attending local urban schools and higher school deciles were also associated with a greater likelihood of graduation. All regression models have controlled for predefined baseline confounders (gender, age and year of admission). There were varied associations between admission variables and academic outcomes across the three admission pathways. 
Equity-targeted admission programmes inclusive of variations in academic threshold for entry may support a widening participation agenda; however, additional academic and pastoral supports are recommended. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Preisler, H.K.; Burgan, R.E.; Eidenshink, J.C.; Klaver, Jacqueline M.; Klaver, R.W.
2009-01-01
The current study presents a statistical model for assessing the skill of fire danger indices and for forecasting the distribution of the expected numbers of large fires over a given region for the upcoming week. The procedure permits development of daily maps that forecast, for the forthcoming week and within federal lands, percentiles of the distributions of (i) number of ignitions; (ii) number of fires above a given size; (iii) conditional probabilities of fires greater than a specified size, given ignition. As an illustration, we used the methods to study the skill of the Fire Potential Index, an index that incorporates satellite and surface observations to map fire potential at a national scale, in forecasting distributions of large fires. © 2009 IAWF.
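The percentile maps described above can be illustrated with a toy count-distribution sketch. Assuming, purely for illustration, that the weekly number of large fires in a region follows a Poisson distribution with a rate tied to the fire danger index (the paper's actual statistical model is richer than this), a forecast percentile is:

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson count with rate lam, summed directly."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

def count_percentile(lam, q):
    """Smallest count whose cumulative probability reaches q, e.g. the
    90th percentile of the number of large fires for the coming week."""
    k = 0
    while poisson_cdf(k, lam) < q:
        k += 1
    return k
```

Mapping such percentiles cell by cell over a grid yields exactly the kind of daily forecast map the abstract describes.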
Stratospheric wind errors, initial states and forecast skill in the GLAS general circulation model
NASA Technical Reports Server (NTRS)
Tenenbaum, J.
1983-01-01
Relations between stratospheric wind errors, initial states and 500 mb skill are investigated using the GLAS general circulation model initialized with FGGE data. Erroneous stratospheric winds are seen in all current general circulation models, appearing also as weak shear above the subtropical jet and as cold polar stratospheres. In this study it is shown that the more anticyclonic large-scale flows are correlated with large forecast stratospheric winds. In addition, it is found that for North America the resulting errors are correlated with initial state jet stream accelerations while for East Asia the forecast winds are correlated with initial state jet strength. Using 500 mb skill scores over Europe at day 5 to measure forecast performance, it is found that both poor forecast skill and excessive stratospheric winds are correlated with more anticyclonic large-scale flows over North America. It is hypothesized that the resulting erroneous kinetic energy contributes to the poor forecast skill, and that the problem is caused by a failure in the modeling of the stratospheric energy cycle in current general circulation models independent of vertical resolution.
2012-08-23
are referred to as aerogels, and despite an impressive collection of attractive macroscopic properties, fragility has been the primary drawback to...applications. In that regard, polymer-cross-linked silica aerogels have emerged as strong lightweight nanostructured alternatives rendering new...applications unrelated to aerogels before, as in ballistic protection, possible. However, the exact location of the polymer in the elementary structure of
1993-01-15
electrochemical cis-trans isomerization on the first voltammetric sweep through either reductive or oxidative doping. Spectroelectrochemical studies...predominantly-cis poly-RCOT films was irreversible, and indicated the presence of an electrochemical cis-trans isomerization on the first voltammetric sweep...electrochemical measurements were performed under N2(g) in a Vacuum Atmospheres dry box. Cyclic voltammetry was performed using a 3-electrode configuration in a
Statistical Earthquake Focal Mechanism Forecasts
NASA Astrophysics Data System (ADS)
Kagan, Y. Y.; Jackson, D. D.
2013-12-01
A new whole-Earth focal mechanism forecast, based on the GCMT catalog, has been created. In the present forecast, the sum of normalized seismic moment tensors within a 1000 km radius is calculated, and the P- and T-axes for the focal mechanism are evaluated on the basis of this sum. Simultaneously we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms. This average angle reflects the tectonic complexity of a region and indicates the accuracy of the prediction. The method was originally proposed by Kagan and Jackson (1994, JGR). Recent interest from CSEP and GEM has motivated some improvements, particularly extending the previous forecast to polar and near-polar regions. The major problem in extending the forecast is the focal mechanism calculation on a spherical surface. In the previous forecast, when the average focal mechanism was computed, longitude lines were assumed to be approximately parallel within the 1000 km radius. This is largely accurate in equatorial and near-equatorial areas. However, approaching 75 degrees latitude, the longitude lines are no longer parallel: the bearing (azimuthal) difference at points separated by 1000 km reaches about 35 degrees. In most situations a forecast point where we calculate an average focal mechanism is surrounded by earthquakes, so the bias should not be strong because the bearing differences largely cancel. But moving into polar regions, the bearing difference can approach 180 degrees. In a modified program, focal mechanisms are projected onto a plane tangent to the sphere at the forecast point. New longitude axes, which are parallel in the tangent plane, are corrected for the bearing difference. A comparison with the old 75S-75N forecast shows that in equatorial regions the forecasted focal mechanisms are almost the same, and the difference in the forecasted rotation angle is close to zero.
However, although the forecasted focal mechanisms are similar, closer to 75 degrees latitude the difference in the rotation angle is large (around a factor of 1.5 in some places). The Gamma-index was calculated for the average focal mechanism moment. A non-zero index indicates that earthquake focal mechanisms around the forecast point have different orientations. Thus deformation complexity displays itself both in the average rotation angle and in the index. However, sometimes the rotation angle is close to zero whereas the index is large, testifying to a large CLVD presence. Both new 0.5x0.5 and 0.1x0.1 degree forecasts are posted at http://eq.ess.ucla.edu/~kagan/glob_gcmt_index.html.
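The quoted ~35-degree bearing difference at 75 degrees latitude can be checked with a back-of-envelope spherical approximation. This is only a sketch of the geometric effect (meridian convergence along a parallel); the forecast code itself uses full tangent-plane projections:

```python
import math

def meridian_convergence_deg(lat_deg, separation_km):
    """Approximate bearing (azimuthal) difference between the local
    'north' directions at two points on the same parallel separated
    east-west by separation_km: delta_lon * sin(latitude)."""
    km_per_deg = 111.32  # approximate length of one degree of arc, km
    dlon = separation_km / (km_per_deg * math.cos(math.radians(lat_deg)))
    return dlon * math.sin(math.radians(lat_deg))

equatorial = meridian_convergence_deg(5.0, 1000.0)   # meridians ~parallel
polar = meridian_convergence_deg(75.0, 1000.0)       # ~35 degrees, as quoted
```

The contrast between the two values is precisely why the parallel-meridian assumption fails near the poles but is harmless near the equator.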
Kilbinger, Andreas F M
2012-01-01
This article reviews our recent results in one of our areas of research. All research efforts in our group focus on functional polymers and on new ways of gaining greater control over the placement of functional groups within these polymers. Here, the living ring-opening metathesis polymerization (ROMP) is reviewed, for which end-functionalization methods had been rare until very recently. Polymers carrying particular functional groups only at the chain ends are, however, very interesting for a variety of industrial and academic applications: polymeric surfactants and polymer-protein conjugates are two examples of the former, and polymer-β-sheet-peptide conjugates one example of the latter. The functionalization of macroscopic or nanoscopic surfaces often relies on mono-end-functional polymers, and complex macromolecular architectures are often constructed from macromolecules carrying exactly one functional group at their chain end. Ring-opening metathesis polymerization is particularly interesting in this context as it is one of the most functional-group-tolerant polymerization methods known. Additionally, high-molecular-weight polymers are readily accessible with this technique, a feature that living radical polymerizations often struggle to achieve. Finding new ways of functionalizing the chain ends of ROMP polymers has therefore been a task long overdue. Here, we present our contribution to this area of research.
An Insight Into the Microbiome of the Amblyomma maculatum (Acari: Ixodidae)
BUDACHETRI, KHEMRAJ; BROWNING, REBECCA E.; ADAMSON, STEVEN W.; DOWD, SCOT E.; CHAO, CHIEN-CHUNG; CHING, WEI-MEI; KARIM, SHAHID
2014-01-01
The aim of this study was to survey the bacterial diversity of Amblyomma maculatum Koch, 1844, and characterize its infection with Rickettsia parkeri. Pyrosequencing of the bacterial 16S rRNA was used to determine the total bacterial population in A. maculatum. Pyrosequencing analysis identified Rickettsia in A. maculatum midguts, salivary glands, and saliva, which indicates successful trafficking in the arthropod vector. The identity of Rickettsia spp. was determined based on sequencing the rickettsial outer membrane protein A (rompA) gene. The sequence homology search revealed the presence of R. parkeri, Rickettsia amblyommii, and Rickettsia endosymbiont of A. maculatum in midgut tissues, whereas the only rickettsia detected in salivary glands was R. parkeri, suggesting it is unique in its ability to migrate from midgut to salivary glands, and colonize this tissue before dissemination to the host. Owing to its importance as an emerging infectious disease, the R. parkeri pathogen burden was quantified by a rompB-based quantitative polymerase chain reaction (qPCR) assay and the diagnostic effectiveness of using R. parkeri polyclonal antibodies in tick tissues was tested. Together, these data indicate that field-collected A. maculatum had a R. parkeri infection rate of 12–32%. This study provides an insight into the A. maculatum microbiome and confirms the presence of R. parkeri, which will serve as the basis for future tick and microbiome interaction studies. PMID:24605461
Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Lionello, Piero
2014-12-01
In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecasting is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, a standard single-layer nonlinear shallow water model whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Centre for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has an rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty at short forecast lead times and for small storm surges is mainly caused by uncertainty in the initial condition of the hydrodynamical model. Uncertainty at long lead times and for large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. Conversely, it overestimates the rms error for large surge values. 
The PF (probability forecast) of the EPS has a clear skill in predicting the actual probability distribution of sea level, and it outperforms simple "dressed" PF methods. A probability estimate based on the single DF is shown to be inadequate. However, a PF obtained with a prescribed Gaussian distribution and centered on the DF value performs very similarly to the EPS-based PF.
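The EMF, spread, and rms-error comparisons described above reduce to simple ensemble statistics; a minimal sketch (all names and the toy numbers are illustrative):

```python
import math
import statistics

def ensemble_stats(members, observation):
    """Per-time-step ensemble mean forecast (EMF), ensemble spread
    (stdev across members), and absolute EMF error."""
    emf = [statistics.mean(step) for step in zip(*members)]
    spread = [statistics.stdev(step) for step in zip(*members)]
    err = [abs(f - o) for f, o in zip(emf, observation)]
    return emf, spread, err

def rmse(forecast, observation):
    return math.sqrt(statistics.mean([(f - o) ** 2
                                      for f, o in zip(forecast, observation)]))

# Toy 2-member, 2-step example; here the EMF happens to be perfect.
emf, spread, err = ensemble_stats([[1.0, 2.0], [3.0, 4.0]], [2.0, 3.0])
```

A spread-error comparison of the kind reported in the abstract amounts to checking whether `spread` and `err` grow together across lead times.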
Time-Series Forecast Modeling on High-Bandwidth Network Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex
2016-06-24
With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and scheduling of data movements on high-bandwidth networks to accommodate ever-increasing data volumes for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
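The two-stage structure (seasonal decomposition, then an autoregressive model on the remainder) can be sketched with standard-library code. In practice one would use statsmodels' STL and ARIMA implementations; the per-phase-mean decomposition and least-squares AR(1) below are deliberately simplified stand-ins, and all names are illustrative:

```python
import statistics

def seasonal_decompose(series, period):
    """Tiny stand-in for STL: seasonal component = per-phase mean,
    remainder = series minus its seasonal component."""
    seasonal = [statistics.mean(series[i::period]) for i in range(period)]
    remainder = [x - seasonal[i % period] for i, x in enumerate(series)]
    return seasonal, remainder

def ar1_forecast(remainder, steps):
    """Least-squares AR(1) on the remainder (stand-in for ARIMA),
    iterated forward; falls back to zero if the remainder is flat."""
    x, y = remainder[:-1], remainder[1:]
    var = statistics.variance(x) if len(set(x)) > 1 else 0.0
    phi = statistics.covariance(x, y) / var if var else 0.0
    out, last = [], remainder[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

def forecast(series, period, steps):
    seasonal, remainder = seasonal_decompose(series, period)
    ar = ar1_forecast(remainder, steps)
    n = len(series)
    return [seasonal[(n + k) % period] + ar[k] for k in range(steps)]

# A purely periodic toy series: the forecast continues the cycle.
pred = forecast([10.0, 20.0, 10.0, 20.0, 10.0, 20.0], period=2, steps=2)
```

Multi-step forecasts fall out naturally: the seasonal part repeats while the AR component decays toward zero.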
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.
2013-01-01
Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Cui, Mingjian; Hodge, Bri-Mathias
The large variability and uncertainty in wind power generation present a concern to power system operators, especially given the increasing amounts of wind power being integrated into the electric power system. Large ramps, one of the biggest concerns, can significantly influence system economics and reliability. The Wind Forecast Improvement Project (WFIP) aimed to improve the accuracy of forecasts and to evaluate the economic benefits of these improvements to grid operators. This paper evaluates the ramp forecasting accuracy gained by improving the performance of short-term wind power forecasting. The study focuses on the WFIP southern study region, which encompasses most of the Electric Reliability Council of Texas (ERCOT) territory, to compare the experimental WFIP forecasts to the existing short-term wind power forecasts (used at ERCOT) at multiple spatial and temporal scales. The study employs four significant wind power ramping definitions according to the power change magnitude, direction, and duration. The optimized swinging door algorithm is adopted to extract ramp events from actual and forecasted wind power time series. The results show that the experimental WFIP forecasts improve the accuracy of wind power ramp forecasting. This improvement can result in substantial cost savings and power system reliability enhancements.
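Ramp extraction by magnitude, direction, and duration can be sketched as follows. The paper uses the optimized swinging door algorithm; the brute-force threshold scan below is a simplified stand-in that only conveys what a ramp-event definition looks like, and all names and numbers are illustrative:

```python
def find_ramps(power, min_change, max_duration):
    """Flag (start, end, change) intervals where power changes by at least
    min_change within max_duration steps, in either direction.  A
    simplified stand-in for the optimized swinging door algorithm;
    events from nearby start points may overlap."""
    events = []
    n = len(power)
    for start in range(n - 1):
        for end in range(start + 1, min(n, start + max_duration + 1)):
            change = power[end] - power[start]
            if abs(change) >= min_change:
                events.append((start, end, change))
                break  # record the earliest qualifying end point
    return events

# Toy series (MW): an up-ramp to ~50 MW followed by a down-ramp.
events = find_ramps([0.0, 10.0, 50.0, 55.0, 10.0], min_change=40.0,
                    max_duration=2)
```

Comparing the events extracted from actual versus forecasted power series is then the basis for scoring ramp forecasting accuracy.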
Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics
NASA Astrophysics Data System (ADS)
Kuchment, L.
2012-04-01
Long-range forecasts of snowmelt flood characteristics with a lead time of 2-3 months are of great importance for regulating flood runoff and mitigating flood damage on almost all large Russian rivers. At the same time, current forecasting techniques based on regression relationships between runoff volume and indexes of river basin conditions can lead to serious forecast errors, resulting in large economic losses caused by incorrect flood regulation. Forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, an excessively high snowmelt rate, large liquid precipitation before snowmelt, or large differences between meteorological conditions during the lead-time period and climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. The development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed physically based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of physically based models of snowmelt runoff generation gives an essential improvement in the statistical performance of deterministic forecasts of flood volume in comparison with forecasts obtained from regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during the lead-time period from the available historical daily series and from series simulated using a weather generator and a Monte Carlo procedure.
The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. Monte Carlo simulation using the weather generator gave better results than using the historical meteorological series.
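A weather generator of the kind described (stochastic models of daily temperature and precipitation) can be sketched as follows. All parameter values are illustrative placeholders, not the calibrated values used in the study:

```python
import random

def generate_weather(days, seed, p_wet_after_wet=0.6, p_wet_after_dry=0.25,
                     mean_wet_amount=6.0, t_ar=0.8, t_sigma=3.0):
    """Toy daily weather generator: first-order Markov chain for wet/dry
    occurrence, exponential wet-day amounts (mm), AR(1) temperature
    anomalies (deg C)."""
    rng = random.Random(seed)
    precip, temp = [], []
    wet, t = False, 0.0
    for _ in range(days):
        wet = rng.random() < (p_wet_after_wet if wet else p_wet_after_dry)
        precip.append(rng.expovariate(1.0 / mean_wet_amount) if wet else 0.0)
        t = t_ar * t + rng.gauss(0.0, t_sigma)
        temp.append(t)
    return precip, temp

precip, temp = generate_weather(1000, seed=1)

# A Monte Carlo ensemble of 90-day lead-time realizations:
ensemble = [generate_weather(90, seed=s) for s in range(20)]
```

Feeding each ensemble member through the runoff model yields the distribution of flood characteristics from which the probabilistic forecast is read off.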
Stochastic demographic forecasting.
Lee, R D
1992-11-01
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt
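The flavor of such stochastic forecasts can be conveyed with a minimal random-walk-with-drift sketch for a single vital-rate index. This is a generic textbook device, not the paper's actual model, and all names are illustrative:

```python
import math
import statistics

def rw_drift_forecast(series, horizon, z=1.96):
    """Random walk with drift for an index (e.g. log mortality): returns
    (lower, point, upper) per step, with a ~95% interval that widens
    like sqrt(h) as innovations accumulate."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    drift = statistics.mean(diffs)
    sigma = statistics.stdev(diffs) if len(diffs) > 1 else 0.0
    last = series[-1]
    bands = []
    for h in range(1, horizon + 1):
        point = last + drift * h
        half = z * sigma * math.sqrt(h)
        bands.append((point - half, point, point + half))
    return bands

# A deterministic toy series: constant decline, zero innovation variance.
bands = rw_drift_forecast([10.0, 9.0, 8.0, 7.0], horizon=2)
```

The probability distributions the abstract refers to are exactly such widening interval forecasts, propagated through the demographic accounting.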
Computers and Technological Forecasting
ERIC Educational Resources Information Center
Martino, Joseph P.
1971-01-01
Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)
Model Forecast Skill and Sensitivity to Initial Conditions in the Seasonal Sea Ice Outlook
NASA Technical Reports Server (NTRS)
Blanchard-Wrigglesworth, E.; Cullather, R. I.; Wang, W.; Zhang, J.; Bitz, C. M.
2015-01-01
We explore the skill of predictions of September Arctic sea ice extent from dynamical models participating in the Sea Ice Outlook (SIO). Forecasts submitted in August, at roughly 2 month lead times, are skillful. However, skill is lower in forecasts submitted to SIO, which began in 2008, than in hindcasts (retrospective forecasts) of the last few decades. The multimodel mean SIO predictions offer slightly higher skill than the single-model SIO predictions, but neither beats a damped persistence forecast at longer than 2 month lead times. The models are largely unsuccessful at predicting each other, indicating a large difference in model physics and/or initial conditions. Motivated by this, we perform an initial condition sensitivity experiment with four SIO models, applying a fixed -1 m perturbation to the initial sea ice thickness. The significant range of the response among the models suggests that different model physics make a significant contribution to forecast uncertainty.
Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.
2013-12-01
Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
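The core of the NTW idea, a probability that grows with the count of small earthquakes since the last large one, can be sketched with a Weibull distribution in natural time. The shape and scale choices below are illustrative assumptions, not the fitted values in the cited paper:

```python
import math

def ntw_probability(n_small, n_mean, beta=1.5):
    """Weibull CDF in 'natural time' (count of small quakes since the last
    large one): P(next large quake before count n_small).  The scale is
    set so the Weibull mean equals the historical mean count n_mean;
    beta=1.5 is an illustrative shape parameter."""
    scale = n_mean / math.gamma(1.0 + 1.0 / beta)
    return 1.0 - math.exp(-((n_small / scale) ** beta))

p_early = ntw_probability(10, n_mean=100.0)   # few small quakes so far
p_late = ntw_probability(200, n_mean=100.0)   # well past the mean count
```

Evaluating this per grid cell against the live catalog is what makes the real-time, localized computation algorithmically demanding.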
Drought forecasting in Luanhe River basin involving climatic indices
NASA Astrophysics Data System (ADS)
Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.
2017-11-01
Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so as to incorporate two large-scale climatic indices at the same time, and apply the model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI rest on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. The controlling climatic indices for every gauge are then selected by the Pearson correlation test, and the multivariate normality of the SPI, the corresponding climatic indices for the current month, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models that forecast transition probabilities from a current to a future SPI class.
The results show that the three proposed models outperform the two traditional models and that involving large-scale climatic indices can improve forecasting accuracy.
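Under the paper's normality hypothesis, the SPI computation reduces to aggregation plus standardization; a minimal sketch follows. Operational SPI typically fits a gamma distribution first, so this is a simplification consistent with the abstract's assumption, and the class thresholds are illustrative:

```python
import statistics

def spi_series(monthly_precip, scale):
    """Aggregate precipitation over `scale` months, then standardize.
    Under the normality hypothesis the aggregated totals are ~normal,
    so the z-score is the SPI directly."""
    agg = [sum(monthly_precip[i - scale + 1:i + 1])
           for i in range(scale - 1, len(monthly_precip))]
    mu, sd = statistics.mean(agg), statistics.stdev(agg)
    return [(x - mu) / sd for x in agg]

def spi_class(value):
    """Coarse SPI classes of the kind used for transition probabilities."""
    if value <= -1.0:
        return "dry"
    if value >= 1.0:
        return "wet"
    return "near normal"

s = spi_series([1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0, 4.0], scale=3)
```

The forecasting models then condition the distribution of a future SPI class on the current SPI value together with the selected climatic indices.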
NASA Astrophysics Data System (ADS)
Federico, Ivan; Pinardi, Nadia; Coppini, Giovanni; Oddo, Paolo; Lecci, Rita; Mossa, Michele
2017-01-01
SANIFS (Southern Adriatic Northern Ionian coastal Forecasting System) is a coastal-ocean operational system based on the unstructured-grid finite-element three-dimensional hydrodynamic model SHYFEM, providing short-term forecasts. The operational chain is based on a downscaling approach starting from the large-scale system for the entire Mediterranean Basin (MFS, Mediterranean Forecasting System), which provides initial and boundary condition fields to the nested system. The model is configured to provide hydrodynamics and active-tracer forecasts both in the open ocean and in the coastal waters of southeastern Italy, using a variable horizontal resolution from the open sea (3-4 km) to coastal areas (50-500 m). Given that the coastal fields are driven by a combination of local (also known as coastal) and deep-ocean forcings propagating along the shelf, the performance of SANIFS was verified in both forecast and simulation mode, first (i) on the large and shelf-coastal scales by comparison with a large-scale CTD (conductivity-temperature-depth) survey in the Gulf of Taranto and then (ii) on the coastal-harbour scale (Mar Grande of Taranto) by comparison with CTD, ADCP (acoustic Doppler current profiler) and tide gauge data. Sensitivity tests were performed on initialization conditions (mainly focused on spin-up procedures) and on surface boundary conditions by assessing the reliability of two alternative datasets at different horizontal resolutions (12.5 and 6.5 km). The SANIFS forecasts at a lead time of 1 day were compared with the MFS forecasts, highlighting that SANIFS is able to retain the large-scale dynamics of MFS. These dynamics are correctly propagated to the shelf-coastal scale, improving the forecast accuracy (+17 % for temperature and +6 % for salinity compared to MFS).
Moreover, the added value of SANIFS was assessed on the coastal-harbour scale, which is not covered by the coarse resolution of MFS and where the fields forecasted by SANIFS reproduced the observations well (temperature RMSE equal to 0.11 °C). Furthermore, SANIFS simulations were compared with hourly time series of temperature, sea level and velocity measured on the coastal-harbour scale, showing good agreement. Simulations in the Gulf of Taranto described a circulation mainly characterized by an anticyclonic gyre, with cyclonic vortexes in shelf-coastal areas. A surface water inflow from the open sea to Mar Grande characterizes the coastal-harbour scale.
A quality assessment of the MARS crop yield forecasting system for the European Union
NASA Astrophysics Data System (ADS)
van der Velde, Marijn; Bareuth, Bettina
2015-04-01
Timely information on crop production forecasts is of increasing importance as commodity markets become more and more interconnected. Impacts across large crop production areas due to, for example, extreme weather or pest outbreaks can create ripple effects that affect food prices and availability elsewhere. The MARS Unit (Monitoring Agricultural ResourceS), DG Joint Research Centre, European Commission, has been providing forecasts of European crop production levels since 1993. The operational crop production forecasting is carried out with the MARS Crop Yield Forecasting System (M-CYFS). The M-CYFS is used to monitor crop growth and development, evaluate short-term effects of anomalous meteorological events, and provide monthly forecasts of crop yield at national and European Union level. The crop production forecasts are published in the so-called MARS bulletins. Forecasting crop yield over large areas in an operational context requires quality benchmarks. Here we present an analysis of the accuracy and skill of past crop yield forecasts of the main crops (e.g. soft wheat, grain maize) throughout the growing season, and specifically of the final forecast before harvest. Two simple benchmarks were defined to assess the skill of the forecasts: comparing each forecast to (1) a forecast equal to the average yield and (2) a forecast using a linear trend fitted through the crop yield time series. These reveal variability in performance as a function of crop and Member State. In terms of production, yields accounting for 67% of EU-28 soft wheat production and 80% of EU-28 maize production were forecast with skill superior to both benchmarks during the 1993-2013 period. In a changing and increasingly variable climate, crop yield forecasts can become increasingly valuable, provided they are used wisely. We end our presentation by discussing research activities that could contribute to this goal.
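The two benchmarks can be made concrete with a short sketch: a forecast beats the climatology benchmark if its mean absolute error is below that of the historical average yield, and beats the trend benchmark if it also outperforms a linear fit to the yield time series. The data below are synthetic, not MARS figures:

```python
import numpy as np

def benchmark_skill(years, yields, forecasts):
    """Compare forecast MAE against two naive benchmarks: the historical
    average yield and a linear trend fitted to the yield time series.
    Returns (beats_average, beats_trend)."""
    years = np.asarray(years, float)
    yields = np.asarray(yields, float)
    mae_fc = np.mean(np.abs(np.asarray(forecasts) - yields))
    mae_avg = np.mean(np.abs(yields.mean() - yields))
    slope, intercept = np.polyfit(years, yields, 1)
    mae_trend = np.mean(np.abs(slope * years + intercept - yields))
    return bool(mae_fc < mae_avg), bool(mae_fc < mae_trend)

# Synthetic Member-State yield series, 1993-2013 (t/ha), with a trend and
# occasional bad years; the "forecast" here is nearly perfect
years = np.arange(1993, 2014)
true_yield = 5.0 + 0.05 * (years - 1993) + np.where(years % 7 == 0, -0.4, 0.1)
good_fc = true_yield + 0.05
print(benchmark_skill(years, true_yield, good_fc))   # -> (True, True)
```

A forecast that merely reproduces the long-term trend would fail the second test, which is what makes the trend benchmark the stricter of the two.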
A Delphi forecast of technology in education
NASA Technical Reports Server (NTRS)
Robinson, B. E.
1973-01-01
The results of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology are reported. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and a review and critical analysis of previous forecasts and studies are presented. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.
Forecasting distribution of numbers of large fires
Eidenshink, Jeffery C.; Preisler, Haiganoush K.; Howard, Stephen; Burgan, Robert E.
2014-01-01
Systems to estimate forest fire potential commonly utilize one or more indexes that relate to expected fire behavior; however, they indicate neither the chance that a large fire will occur nor the expected number of large fires. That is, they do not quantify the probabilistic nature of fire danger. In this work we use large fire occurrence information from the Monitoring Trends in Burn Severity project, together with satellite and surface observations of fuel conditions in the form of the Fire Potential Index, to estimate two aspects of fire danger: 1) the probability that a 1-acre ignition will result in a 100+ acre fire, and 2) the probabilities of having at least 1, 2, 3, or 4 large fires within a Predictive Services Area in the forthcoming week. These statistical processes are the main thrust of the paper and are used to produce two daily national forecasts that are available from the U.S. Geological Survey, Earth Resources Observation and Science Center and via the Wildland Fire Assessment System. A validation study of our forecasts for the 2013 fire season demonstrated good agreement between observed and forecasted values.
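The second product, the probability of at least k large fires in a week, can be illustrated under a simple Poisson-count assumption. The paper's own statistical model is more elaborate; the ignition count and growth probability below are invented for illustration:

```python
import math

def prob_at_least(k, lam):
    """P(at least k large fires) for a Poisson count with mean lam."""
    p_below = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - p_below

# Hypothetical Predictive Services Area for one week: 40 one-acre ignitions,
# each with a 3% chance (from fuel conditions) of growing to 100+ acres
lam = 40 * 0.03                       # expected number of large fires
probs = [prob_at_least(k, lam) for k in (1, 2, 3, 4)]
print([round(p, 3) for p in probs])   # -> [0.699, 0.337, 0.121, 0.034]
```

Even with fewer than two large fires expected, the chance of at least one is near 70%, which is exactly the kind of probabilistic statement a behavior index alone cannot provide.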
NASA Astrophysics Data System (ADS)
Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald
2017-04-01
Seasonal precipitation forecasts are a crucial source of information for early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts at 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations while preserving the signal from the model. The technique also has the advantage that it can be easily implemented at national weather services with limited capacities. It is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for early warning of large-scale droughts and floods in West Africa. The evaluation of the statistical technique is done using CFS hindcasts (1982 to 2009) in a cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events depicted over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e. GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy and skill.
In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the possibilities of this technique to complement PRESAO. We also highlight the performance of the technique for extremes such as the Sahel droughts of the 1980s, and compare it against the various reference data sets (e.g. CFS2, PRESAO, observational data sets) used in this study.
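An empirical quantile-quantile transformation of this kind maps each forecast value through the hindcast climatology onto the observed climatology. A minimal sketch with synthetic rainfall follows; the gamma distributions and the 50% dry bias are assumptions of this illustration, not properties of CFS2:

```python
import numpy as np

def qq_correct(fcst, hind, obs):
    """Empirical quantile-quantile transformation: map each forecast value to
    the observation at the same quantile of the hindcast climatology."""
    hind_sorted = np.sort(hind)
    q = np.searchsorted(hind_sorted, fcst, side="right") / hind_sorted.size
    return np.quantile(obs, np.clip(q, 0.0, 1.0))

# Synthetic monthly rainfall (mm): the model has a large dry bias
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 60.0, 1000)           # observed climatology
hind = 0.5 * rng.gamma(2.0, 60.0, 1000)    # biased model hindcasts
fcst = 0.5 * rng.gamma(2.0, 60.0, 200)     # new biased forecasts
corrected = qq_correct(fcst, hind, obs)
print(round(obs.mean()), round(fcst.mean()), round(corrected.mean()))
```

Because the mapping uses only two sorted climatologies, it is cheap to run, which matches the abstract's point about deployability at weather services with limited computing capacity.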
Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data
NASA Astrophysics Data System (ADS)
Fries, K. J.; Kerkez, B.
2017-12-01
We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM provides a good example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, many questions remain about how to improve forecasts at the street level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using Dynamical System Identification. We apply the methodology across a sensor network of 182 gages in Iowa. Of these sites, approximately one third have been shown to perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using Principal Component Analysis and Random Forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to most benefit from flood forecasts of the NWM.
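The flow-to-stage mapping can be sketched in its simplest form as a fitted power-law rating curve. This is a stand-in for the Dynamical System Identification used in the study, and all data below are synthetic:

```python
import numpy as np

# Synthetic record: local stage sensor h (m) vs. modeled flow Q (m^3/s),
# generated from an assumed power-law rating curve h = 0.4 * Q**0.55
rng = np.random.default_rng(1)
Q = rng.uniform(5.0, 200.0, 300)
h = 0.4 * Q**0.55 * np.exp(rng.normal(0.0, 0.02, Q.size))

# Log-log least squares recovers the rating-curve parameters
b, log_a = np.polyfit(np.log(Q), np.log(h), 1)
a = float(np.exp(log_a))

def stage_from_flow(q):
    """Map an NWM flow forecast to a local stage estimate (m)."""
    return a * q**b

print(round(a, 2), round(b, 2))   # close to the assumed (0.4, 0.55)
```

Once fitted offline against the sensor archive, such a mapping lets a municipality translate the supercomputer-hosted NWM flow forecast into a street-level stage forecast without assimilating anything into the model itself.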
NASA Astrophysics Data System (ADS)
Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim
2017-07-01
Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data are assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To exchange information laterally between subregions and improve the forecasting skill, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems in other large marine ecosystems.
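The domain-definition step, clustering pixels by their Chl-a statistics with a Gaussian mixture model, can be sketched as follows. The one-dimensional synthetic data and the scikit-learn implementation are assumptions of this illustration, not the study's exact setup:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic log10(Chl-a) values: an oligotrophic open-sea regime and a
# productive coastal regime (means and spreads invented for illustration)
rng = np.random.default_rng(2)
log_chl = np.concatenate([rng.normal(-1.5, 0.2, 500),
                          rng.normal(0.5, 0.3, 200)]).reshape(-1, 1)
gmm = GaussianMixture(n_components=2, random_state=0).fit(log_chl)
labels = gmm.predict(log_chl)
# Each cluster would define the domain of one 1D water-column model
means = sorted(gmm.means_.ravel().tolist())
print([round(m, 1) for m in means])
```

In the actual system each cluster of satellite pixels becomes one subregion, and the clusters' lateral neighbours define which regions feed the statistical correction step.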
Kumar, Abhinendra; Yogisharadhya, Revanaiah; Ramakrishnan, Muthannan A; Viswas, K N; Shivachandra, Sathish B
2013-12-01
Pasteurella multocida serogroup B:2, a causative agent of haemorrhagic septicaemia (HS) in cattle and buffalo, especially in tropical regions of Asian and African countries, is known to possess several outer membrane proteins (OMPs) as immunogenic antigens. In the present study, the omp87 gene encoding the 87 kDa OMP (Omp87) of P. multocida serogroup B:2 strain P52 was amplified (∼2304 bp), cloned into the pET32a vector and over-expressed in recombinant Escherichia coli as a fusion protein. The recombinant Omp87 protein (∼102 kDa), including an N-terminal hexa-histidine tag, was purified under denaturing conditions. Immunization of mice with rOmp87 resulted in increased antigen-specific IgG titres in serum and provided protection of 66.6 and 83.3% following homologous (B:2) and heterologous (A:1) challenge, respectively. A homology model of Omp87 revealed the presence of two distinct domains: an N-terminal domain with four POTRA repeats in the periplasmic space and a pore-forming C-terminal β-barrel domain (β1-β16) in the outer membrane of P. multocida, which belong to the Omp85-TpsB transporter superfamily of OMPs. The study indicated the potential of using the rOmp87 protein along with a suitable adjuvant in developing a subunit vaccine against haemorrhagic septicaemia and pasteurellosis in livestock. Copyright © 2013 Elsevier Ltd. All rights reserved.
Melt Miscibility in Block Copolymers Containing Polyethylene and Substituted Polynorbornenes
NASA Astrophysics Data System (ADS)
Mulhearn, William; Register, Richard
Very few polymer species exhibit a sufficiently weak repulsive interaction against polyethylene (PE), characterized by a low Flory parameter χ or interaction energy density X, to be useful for preparing PE-containing block copolymers with disordered melts at high molecular weights. Most suitably miscible polymers are chemically similar to PE, such as copolymers of ethylene with a minority content of an α-olefin, and so are only marginally useful for property modification because their physical properties, such as the glass transition temperature (Tg), are also similar. However, the family of polymers consisting of substituted norbornenes prepared via ring-opening metathesis polymerization (ROMP) and subsequent hydrogenation is unique in that many of its members exhibit very low X against PE (comparable with the interaction energy between poly(ethylene-alt-propylene) and PE), and some of these also exhibit high Tg. The miscibility between PE and a substituted, hydrogenated ROMP polynorbornene, or between two dissimilar hydrogenated polynorbornenes, is a strong function of the substituent appended to the norbornene monomer. The mixing thermodynamics of this polymer series are irregular, in that the interaction energies do not follow X = (δ1 - δ2)^2, where δ is the solubility parameter. However, other systematic trends do apply, and we develop a set of mixing rules to quantitatively describe the experimental miscibility behavior. We also investigate statistical copolymerization of two norbornene monomers as a means to continuously tune miscibility with a homopolymer of a third monomer.
Page, Morgan T.; Van Der Elst, Nicholas; Hardebeck, Jeanne L.; Felzer, Karen; Michael, Andrew J.
2016-01-01
Following a large earthquake, seismic hazard can be orders of magnitude higher than the long‐term average as a result of aftershock triggering. Because of this heightened hazard, emergency managers and the public demand rapid, authoritative, and reliable aftershock forecasts. In the past, U.S. Geological Survey (USGS) aftershock forecasts following large global earthquakes have been released on an ad hoc basis with inconsistent methods, and in some cases aftershock parameters adapted from California. To remedy this, the USGS is currently developing an automated aftershock product based on the Reasenberg and Jones (1989) method that will generate more accurate forecasts. To better capture spatial variations in aftershock productivity and decay, we estimate regional aftershock parameters for sequences within the García et al. (2012) tectonic regions. We find that regional variations for mean aftershock productivity reach almost a factor of 10. We also develop a method to account for the time‐dependent magnitude of completeness following large events in the catalog. In addition to estimating average sequence parameters within regions, we develop an inverse method to estimate the intersequence parameter variability. This allows for a more complete quantification of the forecast uncertainties and Bayesian updating of the forecast as sequence‐specific information becomes available.
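The Reasenberg and Jones (1989) model named above combines Gutenberg-Richter magnitude scaling with Omori-Utsu temporal decay. A minimal sketch follows, with generic parameter values (a, b, c, p) standing in for the regional values the USGS estimates:

```python
import math

def rj_expected_count(t1, t2, main_mag, m_min, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Expected number of aftershocks with magnitude >= m_min between t1 and t2
    days after a mainshock of magnitude main_mag, per Reasenberg-Jones (1989):
    rate(t, M >= m_min) = 10**(a + b*(main_mag - m_min)) * (t + c)**-p.
    The default a, b, c, p are generic illustrations, not USGS regional values."""
    k = 10.0 ** (a + b * (main_mag - m_min))
    if abs(p - 1.0) < 1e-12:
        integral = math.log((t2 + c) / (t1 + c))
    else:
        integral = ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)
    return k * integral

# Expected M>=5 aftershock counts in weeks 1 and 2 after an M7.0 mainshock
n_week1 = rj_expected_count(0.0, 7.0, main_mag=7.0, m_min=5.0)
n_week2 = rj_expected_count(7.0, 14.0, main_mag=7.0, m_min=5.0)
# Under a Poisson assumption, P(at least one M>=5) = 1 - exp(-N)
print(round(n_week1, 1), round(n_week2, 1), round(1.0 - math.exp(-n_week1), 2))
```

Regionalizing the parameters, as the abstract describes, amounts to refitting a, b, c and p to sequences within each tectonic region rather than reusing one global or California-derived set.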
Pathak, Prachi; Kumar, Ashu; Thavaselvam, Duraipandian
2017-07-11
Brucellosis is an important zoonotic disease caused by different Brucella species, and human brucellosis is commonly prevalent in different states of India. Among the various Brucella species, B. melitensis is the most pathogenic to humans and is included as a category B biothreat agent; it can cause infection through aerosols, cuts and wounds in the skin, and contact with infected animals. The diagnosis of human brucellosis is very important for proper treatment and management of the disease, as there is no vaccine available for human use. The present study was designed to clone, express and purify the immunodominant recombinant Omp2a (rOmp2a) porin protein of B. melitensis and to evaluate this new antigen candidate for specific serodiagnosis of human brucellosis by a highly sensitive iELISA (indirect enzyme-linked immunosorbent assay). The omp2a gene of B. melitensis strain 16M was cloned and expressed in the pET-SUMO expression system. The recombinant protein was purified under denaturing conditions using 8 M urea and confirmed by western blotting with an anti-HIS antibody. The sero-reactivity of the recombinant protein was also checked against antisera from mice experimentally infected with B. melitensis 16M, collected at different time points. The serodiagnostic potential of the recombinant porin antigen was tested by iELISA against 185 clinical serum samples collected from regions endemic for brucellosis in southern India. The samples were divided into five groups: group 1 contained culture-confirmed positive serum samples of brucellosis (n = 15); group 2 contained sera from positive cases of brucellosis previously tested by the conventional methods RBPT (n = 28) and STAT (n = 26); group 3 contained sera negative by RBPT (n = 36) and STAT (n = 32); group 4 contained sera from other febrile illness and PUO cases (n = 35); and group 5 contained confirmed negative sera from healthy donors (n = 23).
The rOmp2a was found to be immunoreactive by iELISA and western blotting. The test showed a sensitivity of 93.75% and a specificity of 95.83% when tested against the 185 serum samples. Student's t test was performed on the data to determine statistical significance between experimental and control groups. Omp2a emerges as a potential antigen candidate for serodiagnosis of human brucellosis.
Flood Forecasting in Wales: Challenges and Solutions
NASA Astrophysics Data System (ADS)
How, Andrew; Williams, Christopher
2015-04-01
With steep, fast-responding river catchments, exposed coastal reaches with large tidal ranges, and large population densities in some of the most at-risk areas, flood forecasting in Wales presents many varied challenges. Utilising advances in computing power and learning from best practice within the United Kingdom and abroad has brought significant improvements in recent years; however, many challenges still remain. Developments in computing and increased processing power come with a significant price tag; greater numbers of data sources and ensemble feeds bring a better understanding of uncertainty, but the wealth of data needs careful management to ensure a clear message of risk is disseminated; new modelling techniques utilise better and faster computation, but lack the history of record and experience gained from the continued use of more established forecasting models. As a flood forecasting team we develop coastal and fluvial forecasting models, set them up for operational use and manage the duty role that runs the models in real time. An overview of our current operational flood forecasting system will be presented, along with a discussion of some of the solutions we have in place to address the challenges we face. These include:
• real-time updating of fluvial models
• rainfall forecasting verification
• ensemble forecast data
• longer range forecast data
• contingency models
• offshore to nearshore wave transformation
• calculation of wave overtopping
Hydrological Forecasting Practices in Brazil
NASA Astrophysics Data System (ADS)
Fan, Fernando; Paiva, Rodrigo; Collischonn, Walter; Ramos, Maria-Helena
2016-04-01
This work reviews current hydrological and flood forecasting practices in Brazil, including the main forecast applications, the different kinds of techniques currently employed and the institutions involved in forecast generation. A brief overview of Brazil is provided, covering aspects of its geography, climate, hydrology and flood hazards. A general discussion of Brazilian practice in short- and medium-range hydrological forecasting is presented. Detailed examples of hydrological forecasting systems that are operational or in a research/pre-operational phase using the large-scale hydrological model MGB-IPH are also presented. Finally, some suggestions are given on how forecasting practices in Brazil stand today and what the perspectives are for the future.
Making large amounts of meteorological plots easily accessible to users
NASA Astrophysics Data System (ADS)
Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin
2015-04-01
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts is a highly interactive application for displaying and manipulating recent numerical forecasts, used by forecasters in national weather services and by ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, and as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products. It was entirely redesigned last year: it now shares the same infrastructure as ecCharts and can benefit from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being developed further.
In its first implementation, it presents the user's products in a single interface with fast access to the original product and possibilities for synchronous animation between them. Its functionality is being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.
NASA Astrophysics Data System (ADS)
Higgins, S. M. W.; Du, H. L.; Smith, L. A.
2012-04-01
Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight; and long-range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project illustrate the importance of these results with fairly small archives. The robustness of these results across small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
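Blending with climatology, in the sense of Bröcker and Smith (2008), replaces the raw ensemble probability p_ens with alpha*p_ens + (1-alpha)*p_clim, with alpha chosen on a training archive. A minimal sketch with synthetic probabilities follows; the Brier score and the grid search are illustrative choices, not the paper's exact procedure:

```python
import numpy as np

def brier(p, o):
    """Brier score: mean squared error of probability forecasts."""
    return float(np.mean((p - o) ** 2))

def best_blend_weight(p_ens, outcomes, p_clim, grid=np.linspace(0.0, 1.0, 101)):
    """Choose the climatology-blending weight alpha minimizing the Brier
    score of alpha * p_ens + (1 - alpha) * p_clim on a training archive."""
    scores = [brier(a * p_ens + (1 - a) * p_clim, outcomes) for a in grid]
    return float(grid[int(np.argmin(scores))])

rng = np.random.default_rng(3)
truth_p = rng.uniform(0.1, 0.9, 400)                        # true event probabilities
outcomes = (rng.uniform(size=400) < truth_p).astype(float)  # binary outcomes
p_ens = np.clip(truth_p + rng.normal(0.0, 0.3, 400), 0, 1)  # noisy raw EPS probabilities
p_clim = np.full(400, outcomes.mean())                      # climatological forecast
alpha = best_blend_weight(p_ens, outcomes, p_clim)
blended = alpha * p_ens + (1 - alpha) * p_clim
print(alpha, brier(blended, outcomes) <= brier(p_ens, outcomes))
```

Because alpha = 1 (pure EPS) lies on the search grid, the blended forecast can never score worse than the raw ensemble on the training archive; the robustness question the abstract raises is how well that choice of alpha holds up when the archive is small.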
Nowcasting and Forecasting the Monthly Food Stamps Data in the US Using Online Search Data
Fantazzini, Dean
2014-01-01
We propose the use of Google online search data for nowcasting and forecasting the number of food stamps recipients. We perform a large out-of-sample forecasting exercise with almost 3000 competing models and forecast horizons up to 2 years ahead, and we show that models including Google search data statistically outperform the competing models at all considered horizons. These results also hold under several robustness checks considering alternative keywords, a falsification test, different out-of-sample periods, directional accuracy and forecasts at the state level. PMID:25369315
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... prices will likely be forecasted using trends from the Energy Information Administration's most recent... forecasted energy prices, using shipment projections and average energy efficiency projections. DOE... DEPARTMENT OF ENERGY 10 CFR Part 431 [Docket No. EERE-2013-BT-STD-0007] RIN 1904-AC95 Energy...
2016-01-01
The development of new ROMP-derived silica-immobilized heterocyclic phosphate reagents and their application in purification-free protocols is reported. Grafting of norbornenyl-functionalized (Nb-tagged) silica particles with functionalized Nb-tagged heterocyclic phosphate monomers efficiently yields high-load, hybrid silica-immobilized oligomeric heterobenzyl phosphates (Si–OHBP) and heterotriazolyl phosphates (Si–OHTP) as efficient alkylation agents. Applications of these reagents for the diversification of N-, O-, and S-nucleophilic species, and for efficient heterobenzylation and hetero(triazolyl)methylation, have been validated. PMID:27300761
1993-10-29
menthol, R-(-)-pantolactone, BBr3 (1 M in CH2Cl2), PhMe2SiCl and t-BuLi were purchased from Aldrich and used as received. Mo(CHCMe2Ph)(NAr)(O-t-Bu)2 was...2,3-(COCl)2norbornadiene (9.05 g, 41.7 mmol) in THF (50 mL) was added dropwise to a stirred THF (200 mL) solution of 1R,2S,5R-(-)-menthol (14.33 g
Skill of a global seasonal ensemble streamflow forecasting system
NASA Astrophysics Data System (ADS)
Candogan Yossef, Naze; Winsemius, Hessel; Weerts, Albrecht; van Beek, Rens; Bierkens, Marc
2013-04-01
Forecasting of water availability and scarcity is a prerequisite for managing the risks and opportunities caused by the inter-annual variability of streamflow. Reliable seasonal streamflow forecasts are necessary to prepare an appropriate response in disaster relief, management of hydropower reservoirs, water supply, agriculture and navigation. Seasonal hydrological forecasting on a global scale could be valuable especially for developing regions of the world, where effective hydrological forecasting systems are scarce. In this study, we investigate the forecasting skill of the global seasonal streamflow forecasting system FEWS-World, using the global hydrological model PCR-GLOBWB. FEWS-World has been set up within the European Commission 7th Framework Programme project Global Water Scarcity Information Service (GLOWASIS). Skill is assessed in historical simulation mode as well as in retroactive forecasting mode. The assessment in historical simulation mode used a meteorological forcing based on observations from the Climatic Research Unit of the University of East Anglia and the ERA-40 reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF). We assessed the skill of PCR-GLOBWB in reproducing past discharge extremes in 20 large rivers of the world. This preliminary assessment concluded that the prospects for seasonal forecasting with PCR-GLOBWB or comparable models are positive. However, it did not include actual meteorological forecasts, so errors in the meteorological forcing were not assessed. Yet, in a forecasting setup, the predictive skill of a hydrological forecasting system is affected by errors due to uncertainty from numerical weather prediction models. For the assessment in retroactive forecasting mode, the model is forced with actual ensemble forecasts from the seasonal forecast archives of ECMWF.
Skill is assessed at 78 stations on large river basins across the globe, for all months of the year and for lead times up to 6 months. The forecasted discharges are compared with observed monthly streamflow records using the ensemble verification measures Brier Skill Score (BSS) and Continuous Ranked Probability Score (CRPS). The eventual goal is to transfer FEWS-World to operational forecasting mode, where the system will use operational seasonal forecasts from ECMWF. The results will be disseminated on the internet, and will hopefully provide information that is valuable for users in data- and model-poor regions of the world.
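The Brier Skill Score used here measures the ensemble's probability forecasts of threshold exceedance against a climatological reference. A minimal sketch with synthetic monthly discharges follows; the distributions and the 400 m^3/s threshold are assumptions of this illustration:

```python
import numpy as np

def brier_skill_score(ens, obs, threshold):
    """BSS for ensemble forecasts of discharge exceeding a threshold, with the
    climatological exceedance frequency as the reference forecast."""
    event = (obs > threshold).astype(float)
    p_fc = (ens > threshold).mean(axis=1)        # ensemble exceedance fraction
    bs = np.mean((p_fc - event) ** 2)
    bs_ref = np.mean((event.mean() - event) ** 2)
    return float(1.0 - bs / bs_ref)

# Synthetic archive: 240 months of "observed" discharge and a 15-member
# ensemble scattered around the truth
rng = np.random.default_rng(4)
obs = rng.gamma(3.0, 100.0, 240)                          # m^3/s
ens = obs[:, None] * rng.lognormal(0.0, 0.3, (240, 15))   # informative ensemble
bss = brier_skill_score(ens, obs, threshold=400.0)
print(round(bss, 2))
```

A BSS above zero means the ensemble beats the climatological reference; an uninformative ensemble would score near or below zero, which is the behaviour such verification is designed to expose.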
NASA Astrophysics Data System (ADS)
Lavers, David A.; Pappenberger, Florian; Richardson, David S.; Zsoter, Ervin
2016-11-01
In winter, heavy precipitation and floods along the west coasts of midlatitude continents are largely caused by intense water vapor transport (integrated vapor transport (IVT)) within the atmospheric rivers of extratropical cyclones. This study builds on previous findings that forecasts of IVT have higher predictability than forecasts of precipitation, by applying and evaluating the European Centre for Medium-Range Weather Forecasts Extreme Forecast Index (EFI) for IVT in ensemble forecasts during three winters across Europe. We show that the IVT EFI is better able than the precipitation EFI to capture extreme precipitation in forecast week 2 for forecasts initialized in a positive North Atlantic Oscillation (NAO) phase; conversely, the precipitation EFI is better during the negative NAO phase and at shorter lead times. An IVT EFI example for storm Desmond in December 2015 highlights its potential to identify upcoming hydrometeorological extremes, which may prove useful to the user and forecasting communities.
Rate/state Coulomb stress transfer model for the CSEP Japan seismicity forecast
NASA Astrophysics Data System (ADS)
Toda, Shinji; Enescu, Bogdan
2011-03-01
Numerous studies have retrospectively found that seismicity rates jump (drop) following a coseismic Coulomb stress increase (decrease). The Collaboratory for the Study of Earthquake Predictability (CSEP) provides an opportunity for prospective testing of the Coulomb hypothesis. Here we adapt our stress transfer model, which incorporates the rate- and state-dependent friction law, to the CSEP Japan seismicity forecast. We demonstrate how to compute the forecast rates of large shocks in 2009 using the large earthquakes of the past 120 years. The time-dependent impact of the coseismic stress perturbations explains qualitatively well the occurrence of recent moderate-size shocks. This ability is partly shared with statistical earthquake clustering models. However, our model differs from them as follows: off-fault aftershock zones can be simulated using finite fault sources; the regional areal patterns of triggered seismicity are modified by the dominant mechanisms of the potential sources; and the stresses imparted by large earthquakes produce stress shadows that reduce the forecasted number of earthquakes. Although the model relies on several unknown parameters, it is the first physics-based model submitted to the CSEP Japan test center and has the potential to be tuned for short-term earthquake forecasts.
Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.
2015-01-01
Background: Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods: The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts’ opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon’s signed-rank statistic. Findings: Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher’s information. Each individual expert’s forecast was poorer than the sum of experts. Interpretation: Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380
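The paired comparison of forecast scores described above can be sketched with SciPy's signed-rank test. The numbers below are synthetic stand-ins for per-community log-likelihood scores (the community count of 24 comes from the trial design; the score distributions are invented for illustration):

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n_communities = 24  # one forecast score per community, as in the trial

# hypothetical paired log-likelihood scores for two forecasting methods;
# the second method is constructed to score systematically worse
ll_model = rng.normal(loc=-1.0, scale=0.3, size=n_communities)
ll_expert = ll_model - rng.gamma(shape=2.0, scale=0.15, size=n_communities)

# paired, two-sided Wilcoxon signed-rank test on the score differences
stat, p_value = wilcoxon(ll_model, ll_expert)
# a small p-value indicates one method's scores are systematically better
```

Scoring by log-likelihood rather than raw likelihood keeps the paired differences numerically stable without changing their ranks.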
Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M
2015-08-01
Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
Exploratory studies into seasonal flow forecasting potential for large lakes
NASA Astrophysics Data System (ADS)
Sene, Kevin; Tych, Wlodek; Beven, Keith
2018-01-01
In seasonal flow forecasting applications, one factor which can help predictability is a significant hydrological response time between rainfall and flows. On account of storage influences, large lakes therefore provide a useful test case although, due to the spatial scales involved, there are a number of modelling challenges related to data availability and understanding the individual components in the water balance. Here some possible model structures are investigated using a range of stochastic regression and transfer function techniques with additional insights gained from simple analytical approximations. The methods were evaluated using records for two of the largest lakes in the world - Lake Malawi and Lake Victoria - with forecast skill demonstrated several months ahead using water balance models formulated in terms of net inflows. In both cases slight improvements were obtained for lead times up to 4-5 months from including climate indices in the data assimilation component. The paper concludes with a discussion of the relevance of the results to operational flow forecasting systems for other large lakes.
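A lumped net-inflow water balance of the kind described above can be written as a short recursion: lake level rises with net inflow and drains through a linear outflow term. This is a hedged sketch, not the paper's calibrated transfer-function model; the coefficient `k`, the area normalization and the monthly stepping are illustrative assumptions:

```python
def forecast_lake_level(h0, net_inflows, k, area):
    """Step a lake water balance forward in time: the level change each
    step equals net inflow volume per unit lake area minus a linear
    outflow term k * level (a simple linear-reservoir closure)."""
    levels = []
    h = h0
    for q in net_inflows:   # one forecast net-inflow volume per month
        h = h + q / area - k * h
        levels.append(h)
    return levels
```

With `k = 0` the lake simply integrates net inflow, which illustrates why a long storage response time aids seasonal predictability: this month's level already contains much of next season's signal.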
Weather assessment and forecasting
NASA Technical Reports Server (NTRS)
1977-01-01
Data management program activities centered around the analyses of selected far-term Office of Applications (OA) objectives, with the intent of determining if significant data-related problems would be encountered and if so what alternative solutions would be possible. Three far-term (1985 and beyond) OA objectives selected for analyses as having potential significant data problems were large-scale weather forecasting, local weather and severe storms forecasting, and global marine weather forecasting. An overview of general weather forecasting activities and their implications upon the ground based data system is provided. Selected topics were specifically oriented to the use of satellites.
Predictability of Bristol Bay, Alaska, sockeye salmon returns one to four years in the future
Adkison, Milo D.; Peterson, R.M.
2000-01-01
Historically, forecast error for returns of sockeye salmon Oncorhynchus nerka to Bristol Bay, Alaska, has been large. Using cross-validation forecast error as our criterion, we selected forecast models for each of the nine principal Bristol Bay drainages. Competing forecast models included stock-recruitment relationships, environmental variables, prior returns of siblings, or combinations of these predictors. For most stocks, we found prior returns of siblings to be the best single predictor of returns; however, forecast accuracy was low even when multiple predictors were considered. For a typical drainage, an 80% confidence interval ranged from one half to double the point forecast. These confidence intervals appeared to be appropriately wide.
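Cross-validation forecast error as a model selection criterion, as used above, can be sketched as follows. The data are synthetic, and the single linear sibling-return predictor stands in for one of the several candidate models the paper compares:

```python
import numpy as np

def loocv_rmse(X, y, fit, predict):
    """Leave-one-out cross-validation RMSE of a forecast model:
    refit without each observation, then score the held-out forecast."""
    n = len(y)
    errs = []
    for i in range(n):
        keep = np.arange(n) != i
        params = fit(X[keep], y[keep])
        errs.append(y[i] - predict(X[i], params))
    return float(np.sqrt(np.mean(np.square(errs))))

# hypothetical data: prior sibling returns as a linear predictor of returns
rng = np.random.default_rng(1)
siblings = rng.lognormal(mean=0.0, sigma=0.5, size=20)
returns = 2.0 * siblings + rng.normal(scale=0.2, size=20)

fit_lin = lambda X, y: np.polyfit(X, y, 1)     # slope + intercept
pred_lin = lambda x, p: np.polyval(p, x)
fit_mean = lambda X, y: np.mean(y)             # climatology-style baseline
pred_mean = lambda x, p: p

rmse_sibling = loocv_rmse(siblings, returns, fit_lin, pred_lin)
rmse_mean = loocv_rmse(siblings, returns, fit_mean, pred_mean)
# the model with the smaller cross-validation error would be selected
```

Scoring on held-out years rather than in-sample fit is what prevents the richer model from winning merely by overfitting.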
Accuracy of 24- and 48-Hour Forecasts of Haines' Index
Brian E. Potter; Jonathan E. Martin
2001-01-01
The University of Wisconsin-Madison produces Web-accessible, 24- and 48-hour forecasts of the Haines Index (a tool used to measure the atmospheric potential for large wildfire development) for most of North America using its nonhydrostatic modeling system. The authors examined the accuracy of these forecasts using data from 1999 and 2000. Measures used include root-...
Probabilistic Forecasting of Arctic Sea Ice Extent
NASA Astrophysics Data System (ADS)
Slater, A. G.
2013-12-01
Sea ice in the Arctic is changing rapidly. Most noticeable has been the series of record, or near-record, annual minimums in sea ice extent in the past six years. The changing regime of sea ice has prompted much interest in seasonal prediction of sea ice extent, particularly as opportunities for Arctic shipping and resource exploration or extraction increase. This study presents a daily sea ice extent probabilistic forecast method with a 50-day lead time. A base projection is made from historical data and near-real-time sea ice concentration is assimilated on the issue date of the forecast. When considering the September mean ice extent for the period 1995-2012, the performance of the 50-day lead time forecast is very good: correlation = 0.94, bias = 0.14 × 10^6 km^2 and RMSE = 0.36 × 10^6 km^2. Forecasts of the daily minimum contain equal skill. The system is highly competitive with any of the SEARCH Sea Ice Outlook estimates. The primary finding of this study is that large amounts of forecast skill can be gained from knowledge of the initial conditions of concentration (perhaps more than previously thought). Given the simplicity of the forecast model, improved skill should be available from system refinement and with suitable proxies for large scale atmosphere and ocean circulation.
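The deterministic summary statistics quoted above (correlation, bias, RMSE) are straightforward to compute. A small helper, with the caveat that the paper's values come from its own 1995-2012 hindcasts, not from this sketch:

```python
import numpy as np

def skill_summary(forecast, observed):
    """Correlation, bias (forecast minus observed) and RMSE of a
    series of deterministic forecasts against observations."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    corr = float(np.corrcoef(f, o)[0, 1])
    bias = float(np.mean(f - o))
    rmse = float(np.sqrt(np.mean((f - o) ** 2)))
    return corr, bias, rmse
```

Reporting bias alongside RMSE separates a systematic offset (removable by calibration) from irreducible scatter, which is why the abstract quotes both.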
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Hock-Eam, Lim
2012-09-01
Our empirical results show that GDP growth rates can be predicted more accurately in continents with fewer large economies than in smaller economies such as Malaysia. This difficulty is very likely positively correlated with subsidy or social security policies. The stage of economic development and the level of competitiveness also appear to have interactive effects on forecast stability. These results are generally independent of the forecasting procedure. For countries with highly stable economic growth, forecasting by model selection is better than model averaging. Overall, forecast weight averaging (FWA) is the better forecasting procedure in most countries; FWA also outperforms simple model averaging (SMA) and matches the forecasting ability of Bayesian model averaging (BMA) in almost all countries.
Using ensembles in water management: forecasting dry and wet episodes
NASA Astrophysics Data System (ADS)
van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco
2015-04-01
Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in terms of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique which presents a number of weather scenarios for a dynamical water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, which is located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system is developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. To forecast drought and extensive precipitation, the difference 'precipitation minus evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference takes larger negative values; in case of a wet episode, the difference is positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project five scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead. 
The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and Hirlam. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution for the first 48 hours, followed by the lower-resolution long-term forecast.
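One simple way to reduce an ensemble to a fixed set of scenarios with likelihoods, as described above, is to sort the members and average quantile groups. This is an assumed implementation; MeteoGroup's actual reduction technique is not specified in the abstract:

```python
import numpy as np

def scenario_reduce(members, n_scenarios=5):
    """Reduce ensemble members (e.g. accumulated precipitation minus
    evaporation) to a few averaged scenarios, each carrying the
    fraction of members it represents as its likelihood."""
    members = np.sort(np.asarray(members, dtype=float))
    groups = np.array_split(members, n_scenarios)   # dry ... wet
    scenarios = [float(np.mean(g)) for g in groups]
    likelihoods = [len(g) / len(members) for g in groups]
    return scenarios, likelihoods

# illustrative P - E totals in mm for ten members of one forecast
members = np.array([-40., -22., -15., -8., -3., 0., 2., 6., 11., 30.])
scenarios, likelihoods = scenario_reduce(members)
# five scenarios from extreme dry to extreme wet, each with likelihood 0.2
```

Equal-count quantile groups give uniform likelihoods; clustering the members instead would let the likelihoods vary with how the ensemble actually bunches.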
NASA Astrophysics Data System (ADS)
Zhou, Qunfei
First-principles calculations based on quantum mechanics have proven powerful for accurately reproducing experimental results, uncovering the physics underlying experimental phenomena, and accelerating the design of innovative materials. This work was motivated by the demand to design next-generation thermionic emitting cathodes and techniques that allow for the synthesis of photo-responsive polymers on complex surfaces with controlled thickness and patterns. For Os-coated tungsten thermionic dispenser cathodes, we used first-principles methods to explore the bulk and surface properties of W-Os alloys in order to explain the previously observed experimental phenomenon that thermionic emission varies significantly with W-Os alloy composition. Meanwhile, we developed a new quantum mechanical approach to quantitatively predict the thermionic emission current density from a materials perspective without any semi-empirical approximations or complicated analytical models, which leads to a better understanding of the thermionic emission mechanism. The methods from this work could be used to accelerate the design of next-generation thermionic cathodes. For photo-responsive materials, we designed a novel type of azobenzene-containing monomer for light-mediated ring-opening metathesis polymerization (ROMP) toward the fabrication of patterned, photo-responsive polymers by controlling the ring strain energy (RSE) of the monomer that drives ROMP. This allows for unprecedented remote, noninvasive, instantaneous spatial and temporal control of photo-responsive polymer deposition on complex surfaces. This work on the above two different materials systems shows the power of quantum mechanical calculations in predicting, understanding and discovering the structures and properties of both known and unknown materials in a fast, efficient and reliable way.
1994-09-09
KENNEDY SPACE CENTER, FLA. - The turbulent weather common to a Florida afternoon in the summer subsides into a serene canopy of cornflower blue, and a manmade "bird" takes flight. The Space Shuttle Discovery soars skyward from Launch Pad 39B on Mission STS-64 at 6:22:35 p.m. EDT, Sept. 9. On board are a crew of six: Commander Richard N. Richards; Pilot L. Blaine Hammond Jr.; and Mission Specialists Mark C. Lee, Carl J. Meade, Susan J. Helms and Dr. J.M. Linenger. Payloads for the flight include the Lidar In-Space Technology Experiment (LITE), the Shuttle Pointed Autonomous Research Tool for Astronomy-201 (SPARTAN-201) and the Robot Operated Material Processing System (ROMPS). Mission Specialists Lee and Meade also are scheduled to perform an extravehicular activity during the 64th Shuttle mission.
NASA Technical Reports Server (NTRS)
Kalnay, Eugenia; Dalcher, Amnon
1987-01-01
It is shown that it is possible to predict the skill of numerical weather forecasts - a quantity which is variable from day to day and region to region. This has been accomplished using as predictor the dispersion (measured by the average correlation) between members of an ensemble of forecasts started from five different analyses. The analyses had been previously derived for satellite-data-impact studies and included, in the Northern Hemisphere, moderate perturbations associated with the use of different observing systems. When the Northern Hemisphere was used as a verification region, the prediction of skill was rather poor. This is due to the fact that such a large area usually contains regions with excellent forecasts as well as regions with poor forecasts, and does not allow for discrimination between them. However, when regional verifications were used, the ensemble forecast dispersion provided a very good prediction of the quality of the individual forecasts.
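The dispersion predictor described above, the average correlation between members of an ensemble of forecasts, can be sketched directly. The fields here are tiny synthetic vectors; the original study used five analyses and regional verification domains:

```python
import numpy as np

def ensemble_agreement(fields):
    """Average pairwise correlation between ensemble member forecast
    fields; high agreement (low dispersion) is used as a predictor
    of the skill of the individual forecasts."""
    fields = np.asarray(fields, dtype=float)
    m = len(fields)
    cors = [np.corrcoef(fields[i], fields[j])[0, 1]
            for i in range(m) for j in range(i + 1, m)]
    return float(np.mean(cors))
```

Applied per region rather than hemisphere-wide, the measure can separate areas where the members agree (and the forecast is likely good) from areas where they diverge, which is exactly the paper's finding.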
European Wintertime Windstorms and Their Links to Large-Scale Variability Modes
NASA Astrophysics Data System (ADS)
Befort, D. J.; Wild, S.; Walz, M. A.; Knight, J. R.; Lockwood, J. F.; Thornton, H. E.; Hermanson, L.; Bett, P.; Weisheimer, A.; Leckebusch, G. C.
2017-12-01
Winter storms associated with extreme wind speeds and heavy precipitation are the most costly natural hazard in several European countries. Improved understanding and seasonal forecast skill of winter storms will thus help society, policy-makers and the (re)insurance industry to be better prepared for such events. We first assess the ability of three seasonal forecast ensemble suites (ECMWF System3, ECMWF System4 and GloSea5) to represent extra-tropical windstorms over the Northern Hemisphere. Our results show significant skill for the inter-annual variability of windstorm frequency over parts of Europe in two of these forecast suites (ECMWF-S4 and GloSea5), indicating the potential use of current seasonal forecast systems. In a regression model we further derive windstorm variability using the NAO forecasted by the seasonal model suites, thus estimating the suitability of the NAO as the only predictor. We find that the NAO, as the main large-scale mode over Europe, can explain some of the achieved skill and is therefore an important source of variability in the seasonal models. However, our results show that the regression model fails to reproduce the skill level of the directly forecasted windstorm frequency over large areas of central Europe. This suggests that the seasonal models also capture sources of windstorm variability/predictability other than the NAO. In order to investigate which other large-scale variability modes steer the interannual variability of windstorms, we develop a statistical model using a Poisson GLM. We find that the Scandinavian Pattern (SCA) in fact explains a larger amount of variability for Central Europe during the 20th century than the NAO. This statistical model skilfully reproduces the interannual variability of windstorm frequency, especially for the British Isles and Central Europe, with correlations of up to 0.8.
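A Poisson GLM of the kind used above for windstorm counts can be sketched with a small iteratively reweighted least squares (IRLS) fit. The index series and coefficients below are synthetic stand-ins for a standardized NAO- or SCA-like predictor, not the paper's data:

```python
import numpy as np

def poisson_glm_fit(X, y, n_iter=50):
    """Fit a Poisson GLM with log link by IRLS; X must include an
    intercept column. Returns the coefficient vector beta."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)           # mean counts under current fit
        w = mu                          # Poisson variance equals the mean
        z = X @ beta + (y - mu) / mu    # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

# synthetic seasons: windstorm counts driven by a standardized climate index
rng = np.random.default_rng(2)
index = rng.normal(size=500)
X = np.column_stack([np.ones(500), index])
y = rng.poisson(np.exp(1.0 + 0.4 * index))   # true coefficients (1.0, 0.4)
beta = poisson_glm_fit(X, y)                 # recovers them approximately
```

The log link keeps predicted counts positive, and the fitted slope directly quantifies how strongly the large-scale mode modulates expected windstorm frequency.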
NASA Astrophysics Data System (ADS)
Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles
2017-04-01
An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to end users. This information can match end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French national and regional flood forecasting services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of the past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach resting on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, we attempt to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). 
Exploratory tests on some operational forecasts issued during the recent floods experienced in France (the major spring floods of June 2016 on the Loire river tributaries and the flash floods of fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds. AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973
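The log-sinh transformation of Wang et al. (2012) referenced above has a simple closed form and inverse. The parameters a and b would be fitted to the error data in practice; the values used below are illustrative only:

```python
import numpy as np

def log_sinh(y, a, b):
    """Log-sinh transform of Wang et al. (2012): z = log(sinh(a + b*y)) / b.
    For small flows sinh(a + b*y) ~ a + b*y, so z behaves like a log
    transform; for large flows sinh ~ exp(.)/2, so z grows linearly in y.
    This compresses the large-flow errors and stabilizes their variance."""
    return np.log(np.sinh(a + b * np.asarray(y, dtype=float))) / b

def log_sinh_inv(z, a, b):
    """Inverse transform: y = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * np.asarray(z, dtype=float))) - a) / b
```

Modelling the forecast error in z-space and back-transforming the resulting quantiles is what lets a single error distribution remain plausible from small floods up to major ones.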
Non-seismic tsunamis: filling the forecast gap
NASA Astrophysics Data System (ADS)
Moore, C. W.; Titov, V. V.; Spillane, M. C.
2015-12-01
Earthquakes are the generation mechanism in over 85% of tsunamis. However, non-seismic tsunamis, including those generated by meteorological events, landslides, volcanoes, and asteroid impacts, can inundate significant areas and have large far-field effects. The current National Oceanic and Atmospheric Administration (NOAA) tsunami forecast system falls short in detecting these phenomena. This study attempts to classify the range of effects possible from these non-seismic threats, and to investigate detection methods appropriate for use in a forecast system. Typical observation platforms are assessed, including DART bottom pressure recorders and tide gauges. Other detection paths include atmospheric pressure anomaly algorithms for detecting meteotsunamis and the early identification of asteroids large enough to produce a regional hazard. Real-time assessment of observations for forecast use can provide guidance to mitigate the effects of a non-seismic tsunami.
ERIC Educational Resources Information Center
Richards, R.; Reeder, A. I.; Bulliard, J.-L.
2004-01-01
Melanoma and skin cancer are largely attributable to over-exposure to solar ultraviolet radiation (UVR). Reports of UVR levels within media weather forecasts appear to be well received by the public and have good potential to communicate the need for appropriate sun protection to a broad audience. This study describes provision of UVR messages by…
NASA Astrophysics Data System (ADS)
Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.
2017-12-01
Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Forecasts
Harris, Ruth A.
1998-01-01
The magnitude (Mw) 6.9 Loma Prieta earthquake struck the San Francisco Bay region of central California at 5:04 p.m. PDT on October 17, 1989, killing 62 people and generating billions of dollars in property damage. Scientists were not surprised by the occurrence of a destructive earthquake in this region and had, in fact, been attempting to forecast the location of the next large earthquake in the San Francisco Bay region for decades. This paper summarizes more than 20 scientifically based forecasts made before the 1989 Loma Prieta earthquake for a large earthquake that might occur in the Loma Prieta area. The forecasts geographically closest to the actual earthquake primarily consisted of right-lateral strike-slip motion on the San Andreas Fault northwest of San Juan Bautista. Several of the forecasts did encompass the magnitude of the actual earthquake, and at least one approximately encompassed the along-strike rupture length. The 1989 Loma Prieta earthquake differed from most of the forecasted events in two ways: (1) it occurred with considerable dip-slip in addition to strike-slip motion, and (2) it was much deeper than expected.
Impact of Reservoir Operation to the Inflow Flood - a Case Study of Xinfengjiang Reservoir
NASA Astrophysics Data System (ADS)
Chen, L.
2017-12-01
The construction of a reservoir alters runoff production and routing characteristics and thus changes how floods form. This impact, called the reservoir flood effect, can be divided into three parts (a routing effect, a volume effect and a peak flow effect) and must be evaluated as a whole using a hydrological model. After analyzing reservoir flood formation, the Liuxihe Model for reservoir flood forecasting is proposed, with the Xinfengjiang Reservoir studied as a case. Results show that the routing effect makes the peak flow appear 4 to 6 hours earlier; that the volume effect is larger for large floods than for small ones and, when rainfall is concentrated over the reservoir area, also increases the peak flow substantially; and that the peak flow effect increases the peak flow by 6.63% to 8.95%. The reservoir flood effect is pronounced and has a significant impact on reservoir floods; if it is not represented in the flood forecasting model, floods, and particularly peak flows, cannot be forecast accurately. The Liuxihe Model proposed for Xinfengjiang Reservoir flood forecasting performs well and could be used for real-time flood forecasting of the Xinfengjiang Reservoir. Key words: reservoir flood effect, reservoir flood forecasting, physically based distributed hydrological model, Liuxihe Model, parameter optimization
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainty always accompanies a forecast; it may affect the forecast results and lead to large variations. Therefore, uncertainties must be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties contributed by the precipitation input, the forecasting model and the forecast results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated through these processes.
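The entropy measure the study applies to forecast uncertainty can be illustrated with the discrete Shannon entropy of an empirical error distribution. This is a minimal sketch; the histogram binning is our assumption, not the paper's estimator:

```python
import numpy as np

def shannon_entropy(samples, bins=20):
    """Shannon entropy (in bits) of an empirical distribution estimated
    by histogram; here applied to forecast error samples, where a
    narrower error distribution yields lower entropy, i.e. less
    forecast uncertainty."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()   # drop empty bins
    return float(-np.sum(p * np.log2(p)))
```

Computing this separately for the precipitation input errors, the model errors and the final forecast errors gives comparable numbers for how each stage contributes uncertainty.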
Utilizing Climate Forecasts for Improving Water and Power Systems Coordination
NASA Astrophysics Data System (ADS)
Arumugam, S.; Queiroz, A.; Patskoski, J.; Mahinthakumar, K.; DeCarolis, J.
2016-12-01
Climate forecasts, typically monthly-to-seasonal precipitation forecasts, are commonly used to develop streamflow forecasts for improving reservoir management. Despite their high skill, temperature forecasts are seldom considered alongside streamflow forecasts for developing power demand forecasts to improve water and power systems coordination. In this study, we consider a prototype system to analyze the utility of climate forecasts, both precipitation and temperature, for improving water and power systems coordination. The prototype system, a unit-commitment model that schedules power generation from various sources, is considered, and its performance is compared with that of an energy system model having an equivalent reservoir representation. Different skill sets of streamflow forecasts and power demand forecasts are forced on both the water and power systems representations to understand the level of model complexity required for utilizing monthly-to-seasonal climate forecasts to improve coordination between the two systems. The analyses also identify various decision-making strategies (forward purchasing of fuel stocks, scheduled maintenance of various power systems and trade-offs in water appropriation between hydropower and other uses) in the context of various water and power systems configurations. The potential application of such analyses for integrating large power systems with multiple river basins is also discussed.
NASA Astrophysics Data System (ADS)
Owens, Mathew J.; Riley, Pete
2017-11-01
Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).
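The one-dimensional "upwind" scheme referred to above advances the solar wind speed with first-order upwind differencing; here is a minimal sketch of a single step (grid spacing, time step, and boundary treatment are illustrative assumptions, not the authors' configuration):

```python
import numpy as np

def upwind_step(v, dr, dt):
    """One first-order upwind step of dv/dt + v dv/dr = 0 (valid for
    v > 0, i.e. outward-flowing wind); the inner boundary v[0] is held
    fixed as the coronal-model inflow condition."""
    v_new = v.copy()
    v_new[1:] = v[1:] - v[1:] * dt / dr * (v[1:] - v[:-1])
    return v_new

# Sanity check: a uniform 400 km/s wind is an exact steady solution,
# so the scheme must leave it unchanged (dr in km, dt in s; CFL = 0.4)
v = np.full(50, 400.0)
for _ in range(100):
    v = upwind_step(v, dr=1.0e6, dt=1.0e3)
print(v.max() - v.min())  # → 0.0
```

Running such a cheap scheme over hundreds of latitudinally sampled boundary series is what makes the N = 576 ensemble computationally feasible.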
A channel dynamics model for real-time flood forecasting
Hoos, Anne B.; Koussis, Antonis D.; Beale, Guy O.
1989-01-01
A new channel dynamics scheme (alternative system predictor in real time (ASPIRE)), designed specifically for real-time river flow forecasting, is introduced to reduce uncertainty in the forecast. ASPIRE is a storage routing model that limits the influence of catchment model forecast errors to the downstream station closest to the catchment. Comparisons with the Muskingum routing scheme in field tests suggest that the ASPIRE scheme can provide more accurate forecasts, probably because discharge observations are used to a maximum advantage and routing reaches (and model errors in each reach) are uncoupled. Using ASPIRE in conjunction with the Kalman filter did not improve forecast accuracy relative to a deterministic updating procedure. Theoretical analysis suggests that this is due to a large process noise to measurement noise ratio.
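The Muskingum scheme that ASPIRE is benchmarked against is a standard storage-routing recursion, O2 = c0*I2 + c1*I1 + c2*O1; a compact sketch (K, X, and the inflow series are illustrative values, not from the field tests):

```python
def muskingum_route(inflow, K, X, dt, O0):
    """Classic Muskingum storage routing. K: storage constant (hours),
    X: weighting factor, dt: time step (hours), O0: initial outflow."""
    denom = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / denom
    c1 = (dt + 2 * K * X) / denom
    c2 = (2 * K * (1 - X) - dt) / denom    # note c0 + c1 + c2 = 1
    out = [O0]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

# A steady inflow must route to the same steady outflow
q = muskingum_route([100.0] * 10, K=12.0, X=0.2, dt=6.0, O0=100.0)
print(round(q[-1], 6))  # → 100.0
```

ASPIRE's distinguishing feature, per the abstract, is decoupling the reaches so a catchment-model error contaminates only the nearest downstream station, which a chained recursion like this does not do.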
Activities of the Japanese space weather forecast center at Communications Research Laboratory.
Watari, Shinichi; Tomita, Fumihiko
2002-12-01
The International Space Environment Service (ISES) is an international organization for space weather forecasting and belongs to the International Union of Radio Science (URSI). There are eleven ISES forecast centers in the world, and the Communications Research Laboratory (CRL) runs the Japanese one. We make forecasts of the space environment and deliver them by phone and over the Internet. Our forecasts can be useful for human activities in space. Solar activity is currently near the maximum phase of solar cycle 23. We report several large space-environment disturbances that occurred in 2001, during which low-latitude auroras were observed several times in Japan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anghileri, Daniela; Voisin, Nathalie; Castelletti, Andrea F.
In this study, we develop a forecast-based adaptive control framework for Oroville reservoir, California, to assess the value of seasonal and inter-annual forecasts for reservoir operation. We use an Ensemble Streamflow Prediction (ESP) approach to generate retrospective, one-year-long streamflow forecasts based on the Variable Infiltration Capacity hydrology model. The optimal sequence of daily release decisions from the reservoir is then determined by Model Predictive Control, a flexible and adaptive optimization scheme. We assess the forecast value by comparing system performance based on the ESP forecasts with that based on climatology and a perfect forecast. In addition, we evaluate system performance based on a synthetic forecast, which is designed to isolate the contribution of seasonal and inter-annual forecast skill to the overall value of the ESP forecasts. Using the same ESP forecasts, we generalize our results by evaluating forecast value as a function of forecast skill, reservoir features, and demand. Our results show that perfect forecasts are valuable when the water demand is high and the reservoir is sufficiently large to allow for annual carry-over. Conversely, ESP forecast value is highest when the reservoir can shift water on a seasonal basis. On average, for the system evaluated here, the overall ESP value is 35% less than the perfect forecast value. The inter-annual component of the ESP forecast contributes 20-60% of the total forecast value. Improvements in the seasonal component of the ESP forecast would increase the overall ESP forecast value between 15 and 20%.
NASA Astrophysics Data System (ADS)
Bosart, L. F.; Wallace, B. C.
2017-12-01
Two high-impact convective storm forecast challenges occurred between 17-20 May 2016 during NOAA's Hazardous Weather Testbed Spring Forecast Experiment (SFE) at the Storm Prediction Center. The first forecast challenge was 286 mm of unexpected record-breaking rain that fell on Vero Beach (VRB), Florida, between 1500 UTC 17 May and 0600 UTC 18 May, more than doubling the previous May daily rainfall record. The record rains in VRB occurred subsequent to the formation of a massive MCS over the central Gulf of Mexico between 0900-1000 UTC 17 May. This MCS, linked to the earlier convection associated with an anomalously strong subtropical jet (STJ) over the Gulf of Mexico, moved east-northeastward toward Florida. The second forecast challenge was a large MCS that formed over the Mexican mountains near the Texas-Mexican border, moved eastward and grew upscale prior to 1200 UTC 19 May. This MCS further strengthened offshore after 1800 UTC 19 May beneath the STJ. SPC SFE participants expected this MCS to move east-northeastward and bring heavy rain due to training echoes along the Gulf coast as far eastward as the Florida panhandle. Instead, this MCS transitioned into a bowing MCS that resembled a low-end derecho and produced a 4-6 hPa cold pool with widespread surface wind gusts between 35-50 kt. Both MCS events occurred in a large-scale baroclinic environment along the northern Gulf coast. Both MCS events responded to antecedent convection within this favorable large-scale environment. Rainfall amounts with the first heavy rain-producing MCS were severely underestimated by models and forecasters alike. The second MCS produced the greatest forecaster angst because rainfall totals were forecast too high (MCS propagated too fast) and severe wind reports were much more widespread than anticipated (because of cold pool formation). This presentation will attempt to untangle what happened and why it happened.
Interactive Forecasting with the National Weather Service River Forecast System
NASA Technical Reports Server (NTRS)
Smith, George F.; Page, Donna
1993-01-01
The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.
Species-area relationships and extinction forecasts.
Halley, John M; Sgardeli, Vasiliki; Monokrousos, Nikolaos
2013-05-01
The species-area relationship (SAR) predicts that smaller areas contain fewer species. This is the basis of the SAR method that has been used to forecast large numbers of species committed to extinction every year due to deforestation. The method has a number of issues that must be handled with care to avoid error. These include the functional form of the SAR, the choice of equation parameters, the sampling procedure used, extinction debt, and forest regeneration. Concerns about the accuracy of the SAR technique often cite errors not much larger than the natural scatter of the SAR itself. Such errors do not undermine the credibility of forecasts predicting large numbers of extinctions, although they may be a serious obstacle in other SAR applications. Very large errors can arise from misinterpretation of extinction debt, inappropriate functional form, and ignoring forest regeneration. Major challenges remain to understand better the relationship between sampling protocol and the functional form of SARs and the dynamics of relaxation, especially in continental areas, and to widen the testing of extinction forecasts. © 2013 New York Academy of Sciences.
Web-Based Real Time Earthquake Forecasting and Personal Risk Management
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.
2012-12-01
Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°.
We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
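The core idea, converting the count of small earthquakes since the last large one into a large-event probability, can be sketched with a Weibull hazard in "natural time". This is only a hedged illustration of the general approach: the functional form and every parameter below are our assumptions, not the published NTW model.

```python
import math

def ntw_probability(n_small, b=1.0, dm=3.0, beta=1.2):
    """Hypothetical Natural-Time-Weibull-style estimate: probability
    that the next large event has occurred by "natural time" n_small,
    the number of small earthquakes since the last large one.
    n_bar is the Gutenberg-Richter expected count of small events
    per large event for a magnitude gap dm and b-value b."""
    n_bar = 10 ** (b * dm)
    return 1.0 - math.exp(-((n_small / n_bar) ** beta))

print(round(ntw_probability(0), 2))     # → 0.0  (clock just reset)
print(0 < ntw_probability(1500) < 1)    # → True (probability rises with count)
```

Because the probability depends only on an event count from the ANSS catalog, it can be updated on a 0.1° grid in real time, which is what makes the web service feasible.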
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
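The resampling at the heart of bootstrap position analysis amounts to driving a fitted streamflow model with innovations drawn, with replacement, from the historical residuals. Here we assume an AR(1) flow model purely for illustration; the model form and all values are ours, not the paper's.

```python
import numpy as np

def bootstrap_traces(q0, phi, mu, residuals, horizon, n_traces, seed=0):
    """Generate many equally likely flow sequences, conditioned on the
    current flow q0, by bootstrapping historical innovations instead
    of assuming they are normally distributed."""
    rng = np.random.default_rng(seed)
    traces = np.empty((n_traces, horizon))
    for i in range(n_traces):
        q = q0
        innov = rng.choice(residuals, size=horizon, replace=True)
        for t in range(horizon):
            q = mu + phi * (q - mu) + innov[t]   # AR(1) update
            traces[i, t] = q
    return traces

resid = np.array([-5.0, -2.0, 0.0, 2.0, 5.0])    # historical innovations
tr = bootstrap_traces(q0=80.0, phi=0.7, mu=100.0, residuals=resid,
                      horizon=6, n_traces=1000)
# Drought-risk estimate: chance flow is still below 95 three months out
print(float(np.mean(tr[:, 2] < 95.0)))
```

Each trace would then be fed through the storage-and-delivery simulation, and the fraction of traces breaching a reservoir level or statutory passing flow gives the forecast risk.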
Evaluation of precipitation nowcasting techniques for the Alpine region
NASA Astrophysics Data System (ADS)
Panziera, L.; Mandapaka, P.; Atencia, A.; Hering, A.; Germann, U.; Gabella, M.; Buzzi, M.
2010-09-01
This study presents a large-sample evaluation of different nowcasting systems over the Southern Swiss Alps. Radar observations are taken as a reference against which to assess the performance of the following short-term quantitative precipitation forecasting methods:
- Eulerian persistence: the current radar image is taken as the forecast.
- Lagrangian persistence: precipitation patterns are advected following the field of storm motion (the MAPLE algorithm is used).
- NORA: a novel nowcasting system which exploits the presence of orographic forcing; by comparing meteorological predictors estimated in real time with those from a large historical data set, the events with the highest resemblance are picked to produce the forecast.
- COSMO2: the limited-area numerical model operationally used at MeteoSwiss.
- Blending of the precipitation forecasts from the aforementioned nowcasting tools.
The investigation is aimed at setting up a probabilistic radar rainfall-runoff model experiment for steep Alpine catchments as part of the European research project IMPRINTS.
Arctic sea ice trends, variability and implications for seasonal ice forecasting
Serreze, Mark C.; Stroeve, Julienne
2015-01-01
September Arctic sea ice extent over the period of satellite observations has a strong downward trend, accompanied by pronounced interannual variability with a detrended 1 year lag autocorrelation of essentially zero. We argue that through a combination of thinning and associated processes related to a warming climate (a stronger albedo feedback, a longer melt season, the lack of especially cold winters) the downward trend itself is steepening. The lack of autocorrelation manifests both the inherent large variability in summer atmospheric circulation patterns and that oceanic heat loss in winter acts as a negative (stabilizing) feedback, albeit insufficient to counter the steepening trend. These findings have implications for seasonal ice forecasting. In particular, while advances in observing sea ice thickness and assimilating thickness into coupled forecast systems have improved forecast skill, there remains an inherent limit to predictability owing to the largely chaotic nature of atmospheric variability. PMID:26032315
Forecasting wildland fire behavior using high-resolution large-eddy simulations
NASA Astrophysics Data System (ADS)
Munoz-Esparza, D.; Kosovic, B.; Jimenez, P. A.; Anderson, A.; DeCastro, A.; Brown, B.
2016-12-01
Wildland fires are responsible for large socio-economic impacts. Fires affect the environment, damage structures, threaten lives, cause health issues, and involve large suppression costs. These impacts can be mitigated via accurate fire spread forecast to inform the incident management team. To this end, the state of Colorado is funding the development of the Colorado Fire Prediction System (CO-FPS). The system is based on the Weather Research and Forecasting (WRF) model enhanced with a fire behavior module (WRF-Fire). Realistic representation of wildland fire behavior requires explicit representation of small scale weather phenomena to properly account for coupled atmosphere-wildfire interactions. Moreover, transport and dispersion of biomass burning emissions from wildfires is controlled by turbulent processes in the atmospheric boundary layer, which are difficult to parameterize and typically lead to large errors when simplified source estimation and injection height methods are used. Therefore, we utilize turbulence-resolving large-eddy simulations at a resolution of 111 m to forecast fire spread and smoke distribution using a coupled atmosphere-wildfire model. This presentation will describe our improvements to the level-set based fire-spread algorithm in WRF-Fire and an evaluation of the operational system using 12 wildfire events that occurred in Colorado in 2016, as well as other historical fires. In addition, the benefits of explicit representation of turbulence for smoke transport and dispersion will be demonstrated.
NASA Astrophysics Data System (ADS)
Hidalgo-Muñoz, J. M.; Gámiz-Fortis, S. R.; Castro-Díez, Y.; Argüeso, D.; Esteban-Parra, M. J.
2015-05-01
Identifying the relationship between large-scale climate signals and seasonal streamflow may provide a valuable tool for long-range seasonal forecasting in regions under water stress, such as the Iberian Peninsula (IP). The skill of the main teleconnection indices as predictors of seasonal streamflow in the IP was evaluated. The streamflow database used was composed of 382 stations, covering the period 1975-2008. Predictions were made using a leave-one-out cross-validation approach based on multiple linear regression, combining the Variance Inflation Factor and Stepwise Backward selection to avoid multicollinearity and select the best subset of predictors. Predictions were made for four forecasting scenarios, from one to four seasons in advance. The correlation coefficient (RHO), Root Mean Square Error Skill Score (RMSESS), and the Gerrity Skill Score (GSS) were used to evaluate the forecasting skill. For autumn streamflow, good forecasting skill (RHO>0.5, RMSESS>20%, GSS>0.4) was found for a third of the stations located in the Mediterranean Andalusian Basin, the North Atlantic Oscillation of the previous winter being the main predictor. Also, fair forecasting skill (RHO>0.44, RMSESS>10%, GSS>0.2) was found in stations in the northwestern IP (16 of these located in the Douro and Tagus Basins) two seasons in advance. For winter streamflow, fair forecasting skill was found one season in advance in 168 stations, with the Snow Advance Index as the main predictor. Finally, forecasting was poorer for spring streamflow than for autumn and winter, since only 16 stations showed fair forecasting skill one season in advance, particularly in the northwestern IP.
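The Variance Inflation Factor screen used in the predictor-selection step follows directly from its definition, VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing predictor j on the remaining predictors; a minimal sketch (the data are synthetic and the common VIF threshold of 5 is an illustrative choice):

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor of each column of the predictor
    matrix X; large values flag multicollinearity."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # intercept + rest
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(1)
a = rng.normal(size=200)
b = rng.normal(size=200)             # independent of a: VIF near 1
c = a + 0.1 * rng.normal(size=200)   # nearly collinear with a: large VIF
v = vif(np.column_stack([a, b, c]))
print(v[1] < 5 < v[2])  # → True
```

Predictors flagged this way would be dropped before the stepwise backward selection and leave-one-out cross-validation described above.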
NASA Astrophysics Data System (ADS)
Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki
2015-04-01
Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity, including secondary or higher-order aftershocks, and can be employed for the forecasting. However, because we cannot always expect accurate parameter estimation from incomplete early aftershock data in which many events are missing, forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of aftershocks over a 1 month period based on the first 1 day of data after the main shock, as an example of early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in other cases. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable unbiased intermediate-term assessment of aftershock probabilities.
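The ETAS model builds on the modified Omori law n(t) = K/(t + c)^p; integrating it over a forecast window gives the expected aftershock count, the simplest ingredient of the intermediate-term forecasts discussed above (the closed form below is standard, but the parameter values are illustrative, not estimates from the paper):

```python
import math

def omori_expected_count(K, c, p, t1, t2):
    """Expected aftershock count between t1 and t2 days after the
    main shock under the modified Omori law n(t) = K / (t + c)**p,
    obtained by integrating n(t) in closed form."""
    if p == 1.0:
        return K * (math.log(t2 + c) - math.log(t1 + c))
    return K / (1 - p) * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p))

# Forecasting days 1-30 from parameters fitted to the first day of data
n = omori_expected_count(K=100.0, c=0.05, p=1.1, t1=1.0, t2=30.0)
print(n > 0)  # → True
```

Bayesian forecasting in the paper's sense averages such forecasts over many probable (K, c, p, ...) parameter sets rather than plugging in a single maximum likelihood estimate.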
The Value, Protocols, and Scientific Ethics of Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, Thomas H.
2013-04-01
Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should provide public sources of information on short-term probabilities that are authoritative, scientific, open, and timely. Alert procedures should be negotiated with end-users to facilitate decisions at different levels of society, based in part on objective analysis of costs and benefits but also on less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Unfortunately, in most countries, operational forecasting systems do not conform to such high standards, and earthquake scientists are often called upon to advise the public in roles that exceed their civic authority, expertise in risk communication, and situational knowledge. Certain ethical principles are well established; e.g., announcing unreliable predictions in public forums should be avoided, because bad information can be dangerous. But what are the professional responsibilities of earthquake scientists during seismic crises, especially when the public information through official channels is thought to be inadequate or incorrect? How much should these responsibilities be discounted in the face of personal liability? How should scientists contend with highly uncertain forecasts? To what degree should the public be involved in controversies about forecasting results? No simple answers to these questions can be offered, but the need for answers can be reduced by improving operational forecasting systems. This will require more substantial, and more trustful, collaborations between scientists, civil authorities, and public stakeholders.
Load Forecasting of Central Urban Area Power Grid Based on Saturated Load Density Index
NASA Astrophysics Data System (ADS)
Huping, Yang; Chengyi, Tang; Meng, Yu
2018-03-01
In today's society, coordination between urban power grid development and city development has become increasingly prominent. Saturated load forecasting plays an important role in the planning and development of power grids; it is a concept put forward in China in recent years in the field of grid planning. Urban saturation load forecasting differs from traditional load forecasting for specific years: its time span is often relatively large, and it involves a wide range of aspects. Taking a county in eastern Jiangxi as an example, this paper applies a variety of load forecasting methods to near-term load forecasting for the central urban area. At the same time, it uses the load density index method to produce a long-term forecast of the saturated electric load of the central urban area through 2030. Further study shows the general spatial distribution of the urban saturation load.
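The load density index method referred to above amounts to multiplying each planning zone's area by its saturated load density and summing; a minimal sketch (the zone areas and densities are invented for illustration, not data from the study):

```python
def saturated_load(zones):
    """Spatial load density method: the total saturated load is the
    sum over planning zones of area (km^2) times saturated load
    density (MW/km^2)."""
    return sum(area * density for area, density in zones)

# Hypothetical zones: (area in km^2, saturated density in MW/km^2),
# e.g. residential, commercial core, and industrial land
print(saturated_load([(12.0, 8.0), (5.5, 20.0), (30.0, 2.5)]))  # → 281.0
```

In practice each zone's density index is chosen from planning norms for its land-use type, which is what lets the forecast extend to a saturation year like 2030 without year-by-year trend extrapolation.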
Wind power forecasting: IEA Wind Task 36 & future research issues
NASA Astrophysics Data System (ADS)
Giebel, G.; Cline, J.; Frank, H.; Shaw, W.; Pinson, P.; Hodge, B.-M.; Kariniotakis, G.; Madsen, J.; Möhrlen, C.
2016-09-01
This paper presents the new International Energy Agency (IEA) Wind Task 36 on Forecasting and invites collaboration within the group. Wind power forecasts have been used operationally for over 20 years. Despite this, there are still several opportunities to improve the forecasts, both on the weather prediction side and in the usage of the forecasts. The new IEA Task on Forecasting for Wind Energy aims to organise international collaboration among national meteorological centres with an interest and/or large projects in wind forecast improvement (NOAA, DWD, MetOffice, met.no, DMI, ...), operational forecasters, and forecast users. The Task is divided into three work packages. First, collaboration on improving the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets. Second, an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts; this work package will also organise benchmarks, in cooperation with the IEA Task WakeBench. Third, engaging end users with the aim of disseminating best practice in the usage of wind power predictions. As a first result, an overview of current issues in short-term wind power forecasting research is presented.
Forecasting surface-layer atmospheric parameters at the Large Binocular Telescope site
NASA Astrophysics Data System (ADS)
Turchi, Alessio; Masciadri, Elena; Fini, Luca
2017-04-01
In this paper, we quantify the performance of an automated weather forecast system implemented at the Large Binocular Telescope (LBT) site at Mt Graham (Arizona) in forecasting the main atmospheric parameters close to the ground. The system employs a mesoscale non-hydrostatic numerical model (Meso-NH). To validate the model, we compare forecasts of wind speed, wind direction, temperature and relative humidity close to the ground with the respective values measured by instrumentation installed on the telescope dome. The study is performed over a large sample of nights uniformly distributed over 2 yr. The quantitative analysis is done using classical statistical operators [bias, root-mean-square error (RMSE) and σ] and contingency tables, which allow us to extract complementary key information, such as the percentage of correct detections (PC) and the probability of obtaining a correct detection within a defined interval of values (POD). The results of our study indicate that the model performance in forecasting these atmospheric parameters is very good, in some cases excellent: the RMSE for temperature is below 1°C, for relative humidity it is 14 per cent and for wind speed it is around 2.5 m s-1. The relative error of the RMSE for wind direction varies from 9 to 17 per cent depending on the wind speed conditions. This work is performed in the context of the ALTA (Advanced LBT Turbulence and Atmosphere) Center project, whose final goal is to provide forecasts of all the atmospheric parameters and the optical turbulence to support LBT observations, adaptive optics facilities and interferometric facilities.
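The statistical operators named above (bias, RMSE, σ) and a tolerance-based percent-correct score can be sketched in a few lines; the forecast/observation pairs and the 0.5 °C tolerance below are invented for illustration:

```python
import numpy as np

# Invented forecast/observation pairs for near-ground temperature (deg C)
fcst = np.array([10.2, 8.1, 12.4, 9.0, 11.3])
obs = np.array([9.8, 8.5, 12.0, 9.6, 10.9])

bias = np.mean(fcst - obs)                  # systematic error
rmse = np.sqrt(np.mean((fcst - obs) ** 2))  # total error
sigma = np.sqrt(rmse ** 2 - bias ** 2)      # bias-removed spread

# Contingency-table style "percentage of correct detections": a forecast
# counts as correct if it falls within a chosen tolerance of the observation.
tolerance = 0.5
pc = np.mean(np.abs(fcst - obs) <= tolerance) * 100.0
print(bias, rmse, sigma, pc)
```

The POD used in the paper generalizes the same counting idea to predefined intervals of values rather than a single tolerance.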
NASA Astrophysics Data System (ADS)
Nauslar, Nicholas J.
This dissertation comprises three papers that all pertain to wildland fire applications. The first paper performs a verification analysis of mixing height, transport winds, and Haines Index from National Weather Service (NWS) spot forecasts across the United States. The final two papers, which are closely related, examine atmospheric and ecological drivers of wildfire for the Southwest Area (SWA) (Arizona, New Mexico, west Texas, and the Oklahoma panhandle) to better equip operational fire meteorologists and managers to make informed decisions on wildfire potential in this region. The verification analysis utilizes NWS spot forecasts of mixing height, transport winds and Haines Index from 2009-2013 issued for locations within 50 km of an upper-air sounding location and valid for the day of the fire event. Mixing height was calculated from the 0000 UTC sounding via the Stull, Holzworth, and Richardson methods. Transport wind speeds were determined by averaging the wind speed through the boundary layer as determined by the three mixing height methods from the 0000 UTC sounding. Haines Index was calculated at low, mid, and high elevation based on the elevation of the sounding and spot forecast locations. Mixing height forecasts exhibited large mean absolute errors and were biased toward over-forecasting. Forecasts of transport wind speeds and Haines Index outperformed mixing height forecasts, with smaller errors relative to their respective means. The rainfall and lightning associated with the North American Monsoon (NAM) can vary greatly intra- and inter-annually and have a large impact on wildfire activity across the SWA by igniting or suppressing wildfires. NAM onset thresholds and subsequent dates are determined for the SWA and each Predictive Service Area (PSA), the sub-regions used by operational fire meteorologists to predict wildfire potential within the SWA, for April through September from 1995-2013.
Various wildfire activity thresholds, based on the number of wildfires and large wildfires, identified days or time periods with increased wildfire activity for each PSA and the SWA. Self-organizing maps (SOMs) utilizing 500 and 700 hPa geopotential heights and precipitable water were implemented to identify atmospheric patterns contributing to the NAM onset and busy days/periods for each PSA and the SWA. The resulting SOM map types also showed the transitions into, during, and out of the NAM. Northward and eastward displacements of the subtropical ridge (i.e., the four-corners high) over the SWA were associated with NAM onset, while suppressed-subtropical-ridge and subtropical-ridge-breakdown map types over the SWA were associated with increased wildfire activity. We implemented boosted regression trees (BRTs) to model wildfire occurrence for all and large wildfires for different wildfire types (i.e., lightning, human) across the SWA by PSA. BRT models for all wildfires demonstrated relatively small mean and mean absolute errors and showed better predictability on days with wildfires. Cross-validated accuracy assessments for large wildfires demonstrated the ability to discriminate between large-wildfire and non-large-wildfire days across all wildfire types. Measurements describing fuel conditions (i.e., 100- and 1000-hour dead fuel moisture, energy release component) were the most important predictors when considering all wildfire types and sizes. However, a combination of fuel and atmospheric predictors (i.e., lightning, temperature) proved most predictive for large wildfire occurrence, and the number of relevant predictors increases for large wildfires, indicating that more conditions need to align to support large wildfires.
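As a sketch of the BRT modelling step, the toy example below fits a boosted-tree classifier to synthetic fuel-moisture and lightning predictors and reads off variable importances. scikit-learn's GradientBoostingClassifier stands in for whatever BRT implementation the dissertation used, and all data, ranges, and coefficients are invented:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(2, 25, n),     # 100-hr dead fuel moisture (%), invented range
    rng.uniform(5, 30, n),     # 1000-hr dead fuel moisture (%)
    rng.uniform(0, 100, n),    # energy release component
    rng.poisson(3, n),         # daily lightning strike count
])
# Assumed relationship: drier fuels and more lightning raise fire probability
p = 1.0 / (1.0 + np.exp(0.3 * X[:, 0] - 0.02 * X[:, 2] - 0.4 * X[:, 3]))
y = (rng.random(n) < p).astype(int)

model = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.05, max_depth=3, random_state=0
).fit(X, y)

# Relative influence of each predictor, analogous to BRT variable importance
importance = dict(zip(["fm100", "fm1000", "erc", "lightning"],
                      model.feature_importances_.round(3)))
print(importance)
```

The importances play the role of the fuel-versus-atmosphere predictor ranking discussed in the abstract.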
Comparison of Observation Impacts in Two Forecast Systems using Adjoint Methods
NASA Technical Reports Server (NTRS)
Gelaro, Ronald; Langland, Rolf; Todling, Ricardo
2009-01-01
An experiment is being conducted to compare directly the impact of all assimilated observations on short-range forecast errors in different operational forecast systems. We use the adjoint-based method developed by Langland and Baker (2004), which allows these impacts to be calculated efficiently. This presentation describes preliminary results for a "baseline" set of observations, including both satellite radiances and conventional observations, used by the Navy/NOGAPS and NASA/GEOS-5 forecast systems for the month of January 2007. In each system, about 65% of the total reduction in 24-h forecast error is provided by satellite observations, although the impact of rawinsonde, aircraft, land, and ship-based observations remains significant. Only a small majority (50-55%) of all assimilated observations improves the forecast, while the rest degrade it. It is found that most of the total forecast error reduction comes from observations with moderate-size innovations providing small to moderate impacts, not from outliers with very large positive or negative innovations. In a global context, the relative impacts of the major observation types are fairly similar in each system, although regional differences in observation impact can be significant. Of particular interest is the fact that while satellite radiances have a large positive impact overall, they degrade the forecast in certain locations common to both systems, especially over land and ice surfaces. Ongoing comparisons of this type, with results expected from other operational centers, should lead to more robust conclusions about the impacts of the various components of the observing system, as well as about the strengths and weaknesses of the methodologies used to assimilate them.
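The adjoint method yields one scalar impact per assimilated observation, which is then aggregated by observation type; in the Langland-Baker convention a negative value means the observation reduced 24-h forecast error. A toy aggregation with invented, unitless numbers (real impacts are energy-norm values):

```python
# Toy per-observation adjoint impact values; all numbers invented.
# Negative = beneficial (reduces forecast error), per Langland-Baker.
impacts = {
    "satellite_radiance": [-0.8, -0.5, 0.2, -0.3],
    "rawinsonde": [-0.4, 0.1, -0.2],
    "aircraft": [-0.3, -0.1, 0.05],
}

all_vals = [v for vals in impacts.values() for v in vals]
total = sum(all_vals)                                  # net error reduction
frac_beneficial = sum(v < 0 for v in all_vals) / len(all_vals)
by_type = {k: sum(v) for k, v in impacts.items()}      # impact per obs type
sat_share = by_type["satellite_radiance"] / total      # cf. the ~65% satellite
                                                       # share reported above
print(by_type, frac_beneficial, sat_share)
```

The "small majority of observations improves the forecast" statistic in the abstract corresponds to frac_beneficial here.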
Predictability and possible earlier awareness of extreme precipitation across Europe
NASA Astrophysics Data System (ADS)
Lavers, David; Pappenberger, Florian; Richardson, David; Zsoter, Ervin
2017-04-01
Extreme hydrological events can cause large socioeconomic damages in Europe. In winter, a large proportion of these flood episodes are associated with atmospheric rivers, a region of intense water vapour transport within the warm sector of extratropical cyclones. When preparing for such extreme events, forecasts of precipitation from numerical weather prediction models or river discharge forecasts from hydrological models are generally used. Given the strong link between water vapour transport (integrated vapour transport IVT) and heavy precipitation, it is possible that IVT could be used to warn of extreme events. Furthermore, as IVT is located in extratropical cyclones, it is hypothesized to be a more predictable variable due to its link with synoptic-scale atmospheric dynamics. In this research, we firstly provide an overview of the predictability of IVT and precipitation forecasts, and secondly introduce and evaluate the ECMWF Extreme Forecast Index (EFI) for IVT. The EFI is a tool that has been developed to evaluate how ensemble forecasts differ from the model climate, thus revealing the extremeness of the forecast. The ability of the IVT EFI to capture extreme precipitation across Europe during winter 2013/14, 2014/15, and 2015/16 is presented. The results show that the IVT EFI is more capable than the precipitation EFI of identifying extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase. However, the precipitation EFI is superior during the negative NAO phase and at shorter lead times. An IVT EFI example is shown for storm Desmond in December 2015 highlighting its potential to identify upcoming hydrometeorological extremes.
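The EFI integrates the difference between the ensemble forecast CDF and the model-climate CDF, weighted toward the climatological tails. A numerical sketch of the published ECMWF formula, applied to invented gamma-distributed IVT-like values (the distributions and shift are assumptions, not ECMWF data):

```python
import numpy as np

def efi(climate_sample, ensemble, n_q=99):
    """Extreme Forecast Index, evaluated numerically at n_q quantile levels.

    Follows the published ECMWF definition
        EFI = 2/pi * int_0^1 (p - F_f(p)) / sqrt(p (1 - p)) dp,
    where F_f(p) is the fraction of ensemble members below the model-climate
    quantile at probability p. Ranges from -1 to +1; +1 means every member
    lies beyond the whole climate record.
    """
    p = (np.arange(n_q) + 0.5) / n_q                 # midpoint quantile levels
    q_clim = np.quantile(climate_sample, p)          # model-climate quantiles
    ens_sorted = np.sort(ensemble)
    Ff = np.searchsorted(ens_sorted, q_clim, side="right") / len(ensemble)
    integrand = (p - Ff) / np.sqrt(p * (1.0 - p))
    return 2.0 / np.pi * integrand.mean()            # midpoint-rule integral

# Invented samples: a gamma "model climate" of IVT and an ensemble shifted
# toward extreme moisture transport (arbitrary units).
rng = np.random.default_rng(1)
clim = rng.gamma(2.0, 5.0, 5000)
ens = rng.gamma(2.0, 5.0, 51) + 25.0
print(round(efi(clim, ens), 2))
```

An ensemble identical in distribution to the climate gives an EFI near zero, which is the sense in which the index flags "extremeness" of a forecast.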
Karin L. Riley; Crystal Stonesifer; Haiganoush Preisler; Dave Calkin
2014-01-01
Can fire potential forecasts assist with pre-positioning of fire suppression resources, which could result in a cost savings to the United States government? Here, we present a preliminary assessment of the 7-Day Fire Potential Outlook forecasts made by the Predictive Services program. We utilized historical fire occurrence data and archived forecasts to assess how...
Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast
NASA Technical Reports Server (NTRS)
Zhu, Jiang; Stevens, E.; Zhang, X.; Zavodsky, B. T.; Heinrichs, T.; Broderson, D.
2014-01-01
A case study and a monthly statistical analysis using sounder data assimilation to improve the Alaska regional weather forecast model are presented. Weather forecasting in Alaska faces challenges as well as opportunities. Alaska has a large land area with multiple types of topography and an extensive coastline, so forecast models must be finely tuned to predict Alaska weather accurately. Its high latitude gives Alaska greater coverage by polar-orbiting satellites, for integration into forecasting models, than the lower 48 states. Forecasting marine low stratus clouds is critical to the Alaska aviation and oil industries and is the focus of the current case study. NASA AIRS/CrIS sounder profiles are assimilated into the Alaska regional weather forecast model to improve Arctic marine stratus cloud forecasts. The choice of physics options for the WRF model is discussed, and preprocessing of the AIRS/CrIS sounder data for assimilation is described. Local observation data, satellite data, and global data assimilation data are used to verify and evaluate the forecast results with the Model Evaluation Tools (MET).
Extended Range Prediction of Indian Summer Monsoon: Current status
NASA Astrophysics Data System (ADS)
Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.
2014-12-01
The main focus of this study is to develop a forecast consensus for the extended-range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based Grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial condition and using different configurations of CFSv2, to address the role of different physical mechanisms known to control error growth in ERP on the 15-20 day time scale. The final formulation of CGMME is based on 21 ensembles of the standalone Global Forecast System (GFS) forced with bias-corrected forecasted SST from CFS, 11 low-resolution CFST126, and 11 high-resolution CFST382. Thus, we develop the multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated effort has led to the development of specific tailor-made regional forecast products over the Indian region. The skill of deterministic and probabilistic categorical rainfall forecasts, as well as the verification of large-scale low-frequency monsoon intraseasonal oscillations, has been assessed using hindcasts from 2001-2012 for the monsoon season, in which all models are initialized every five days from 16 May to 28 September. The skill of the deterministic forecast from CGMME is better than that of the best participating single model ensemble configuration (SME). The CGMME approach quantifies the uncertainty in both initial conditions and model formulation. The main improvement is in the probabilistic forecast, owing to an increase in ensemble spread that reduces the error due to over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile ranges are determined by a ranking method based on the percentage of ensemble members from all participating models that fall into each category.
CGMME adds value to both the deterministic and probabilistic forecasts compared with the raw SMEs; this improved skill probably flows from the larger spread and the improved spread-error relationship. The CGMME system is currently capable of generating ER predictions in real time and has successfully delivered experimental operational ER forecasts of the ISM for the last few years.
Earthquake cycles and physical modeling of the process leading up to a large earthquake
NASA Astrophysics Data System (ADS)
Ohnaka, Mitiyasu
2004-08-01
A thorough discussion is made of what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture, on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology for forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step in establishing the methodology for forecasting large earthquakes.
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
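The core of the scenario-generation chain described above (fit a GMM to historical forecast errors, deduce the CDF, inverse-transform uniform samples) can be sketched as follows. The error sample is synthetic and the two-component mixture is an arbitrary choice, not the paper's configuration:

```python
import numpy as np
from math import erf
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for historical wind power forecasting errors
# (normalized units); the two-component structure is an assumption.
rng = np.random.default_rng(0)
errors = np.concatenate([rng.normal(-0.05, 0.02, 400),
                         rng.normal(0.08, 0.04, 200)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(errors.reshape(-1, 1))

# Analytic CDF of the fitted mixture, evaluated on a grid
grid = np.linspace(errors.min() - 0.1, errors.max() + 0.1, 2000)
cdf = np.zeros_like(grid)
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(),
                      gmm.covariances_.ravel()):
    z = (grid - mu) / np.sqrt(2.0 * var)
    cdf += w * 0.5 * (1.0 + np.vectorize(erf)(z))

# Inverse transform sampling: push uniforms through the inverted CDF
u = rng.random(10000)
scenarios = np.interp(u, cdf, grid)
print(scenarios.mean(), errors.mean())
```

Each sampled error scenario would then be added to the base forecast before ramp extraction with the swinging door algorithm.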
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
2017-08-31
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
Optimising seasonal streamflow forecast lead time for operational decision making in Australia
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Q. J.; Zhou, Senlin; Feikema, Paul
2016-10-01
Statistical seasonal forecasts of 3-month streamflow totals are released in Australia by the Bureau of Meteorology and updated on a monthly basis. The forecasts are often released in the second week of the forecast period, owing to the onerous forecast production process. The current service relies on models built using data for complete calendar months, meaning the forecast production process cannot begin until the first day of the forecast period. The bureau therefore needs to transition to a service that provides forecasts before the beginning of the forecast period; timelier forecast release will become critical as sub-seasonal (monthly) forecasts are developed. Increasing the forecast lead time to one month ahead is not considered a viable option for Australian catchments, which typically lack any predictability associated with snowmelt. The bureau's forecasts are built around Bayesian joint probability models that have antecedent streamflow, rainfall and climate indices as predictors. In this study, we adapt the modelling approach so that forecasts can have any number of days of lead time. Daily streamflow and sea surface temperatures are used to develop predictors based on 28-day sliding windows. Forecasts are produced for 23 forecast locations with 0-14- and 21-day lead times. The forecasts are assessed in terms of continuous ranked probability score (CRPS) skill score and reliability metrics. CRPS skill scores, on average, reduce monotonically with increasing lead time, although both positive and negative differences are observed. Considering only skilful forecast locations, CRPS skill scores at 7-day lead time are reduced on average by 4 percentage points, with differences largely contained within +5 to -15 percentage points.
A flexible forecasting system that allows for any number of days of lead time could benefit Australian seasonal streamflow forecast users by allowing more time for forecasts to be disseminated, comprehended and made use of prior to the commencement of a forecast season. The system would allow for forecasts to be updated if necessary.
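The CRPS for an ensemble forecast, and the skill score against a climatological reference, can be computed directly from the energy form of the score. The streamflow numbers below are invented, and Gaussian ensembles stand in for the Bayesian joint probability model output:

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS via the energy (kernel) form:
    CRPS = E|X - y| - 0.5 * E|X - X'|   (lower is better)."""
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# Invented 3-month streamflow totals (GL): a sharp forecast ensemble vs a
# broad climatology ensemble, verified against one observation.
rng = np.random.default_rng(2)
obs = 120.0
fcst_ens = rng.normal(110.0, 15.0, 100)
clim_ens = rng.normal(100.0, 40.0, 100)

# CRPS skill score in percent, relative to the climatology reference
ss = (1.0 - crps_ensemble(fcst_ens, obs) / crps_ensemble(clim_ens, obs)) * 100.0
print(round(ss, 1))
```

In practice the score is averaged over many forecast dates before the skill score is formed; a single verification, as here, is only illustrative.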
Trout Fryxell, R T; Steelman, C D; Szalanski, A L; Billingsley, P M; Williamson, P C
2015-05-01
Rocky Mountain spotted fever (RMSF), caused by the etiological agent Rickettsia rickettsii, is the most severe and frequently reported rickettsial illness in the United States, and is commonly diagnosed throughout the southeast. With the discoveries of Rickettsia parkeri and other spotted fever group rickettsiae (SFGR) in ticks, it remains inconclusive whether the cases reported as RMSF are truly caused by R. rickettsii or by other SFGR. Arkansas reports one of the highest incidence rates of RMSF in the country; consequently, to identify the rickettsiae in Arkansas, 1,731 ticks, 250 white-tailed deer, and 189 canines were screened by polymerase chain reaction (PCR) for the rickettsial genes gltA, rompB, and ompA. None of the white-tailed deer were positive, while two of the canines (1.1%) and 502 (29.0%) of the ticks were PCR positive. Five different tick species were PCR positive: 244 (37%) Amblyomma americanum L., 130 (38%) Ixodes scapularis Say, 65 (39%) Amblyomma maculatum (Koch), 30 (9%) Rhipicephalus sanguineus Latreille, 7 (4%) Dermacentor variabilis Say, and 26 (44%) unidentified Amblyomma ticks. None of the sequenced products were homologous to R. rickettsii. The most common rickettsiae detected via rompB amplification were Rickettsia montanensis and the nonpathogenic Candidatus Rickettsia amblyommii, whereas via ompA amplification the most common was Ca. R. amblyommii. Many tick specimens collected in northwest Arkansas were PCR positive; these were commonly A. americanum harboring Ca. R. amblyommii, a currently nonpathogenic Rickettsia. Data reported here indicate that pathogenic R. rickettsii was absent from these ticks and suggest by extension that other SFGR are likely the causative agents of RMSF cases diagnosed in Arkansas. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Florida Model Information eXchange System (MIXS).
DOT National Transportation Integrated Search
2013-08-01
Transportation planning largely relies on travel demand forecasting, which estimates the number and type of vehicles that will use a roadway at some point in the future. Forecasting estimates are made by computer models that use a wide variety of dat...
NASA Astrophysics Data System (ADS)
Bouya, Zahra; Terkildsen, Michael
2016-07-01
The Australian Space Forecast Centre (ASFC) provides space weather forecasts to a diverse group of customers. Space Weather Services (SWS) within the Australian Bureau of Meteorology is focussed both on developing tailored products and services for key customer groups and on supporting ASFC operations. Research in SWS is largely centred on the development of data-driven models using a range of solar-terrestrial data. This paper covers data requirements, approaches and recent SWS activities in data-driven modelling, with a focus on regional ionospheric specification and forecasting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giebel, G.; Cline, J.; Frank, H.
Here, this paper presents the new International Energy Agency (IEA) Wind Task 36 on Forecasting and invites collaboration within the group. Wind power forecasts have been used operationally for over 20 years; despite this, there are still many opportunities to improve the forecasts, both on the weather prediction side and in how the forecasts are used. The new IEA Task on Forecasting for Wind Energy organises international collaboration among national meteorological centres with an interest in and/or large projects on wind forecast improvements (NOAA, DWD, MetOffice, met.no, DMI,...), operational forecasters and forecast users. The Task is divided into three work packages. Firstly, collaboration on improving the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets. Secondly, an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts; this work package will also organise benchmarks in cooperation with the IEA Task WakeBench. Thirdly, engagement with end users aiming at dissemination of best practice in the usage of wind power predictions. As a first result, an overview of current issues for research in short-term forecasting of wind power is presented.
NASA Technical Reports Server (NTRS)
Bretherton, Christopher S.
2002-01-01
The goal of this project was to compare observations of marine and arctic boundary layers with: (1) parameterization systems used in climate and weather forecast models; and (2) two- and three-dimensional eddy-resolving (LES) models of turbulent fluid flow. Based on this comparison, we hoped to better understand, predict, and parameterize boundary layer structure and cloud amount, type, and thickness as functions of the large-scale conditions predicted by global climate models. The principal achievements of the project were as follows: (1) development of a novel boundary layer parameterization for large-scale models that better represents the physical processes in marine boundary layer clouds; and (2) comparison of column output from the ECMWF global forecast model with observations from the SHEBA experiment. Overall, the forecast model predicted most of the major precipitation events and synoptic variability observed over the year of observation at the SHEBA ice camp.
NASA Astrophysics Data System (ADS)
Tsai, Hsiao-Chung; Chen, Pang-Cheng; Elsberry, Russell L.
2017-04-01
The objective of this study is to evaluate the predictability of extended-range forecasts of tropical cyclones (TCs) in the western North Pacific using reforecasts from the National Centers for Environmental Prediction (NCEP) Global Ensemble Forecast System (GEFS) during 1996-2015 and from the Climate Forecast System (CFS) during 1999-2010. Tsai and Elsberry have demonstrated that an opportunity exists to support hydrological operations by using the extended-range TC formation and track forecasts in the western North Pacific from the ECMWF 32-day ensemble. To demonstrate this potential for decision-making processes regarding water resource management and hydrological operations in Taiwan reservoir watershed areas, special attention is given to the skill of the NCEP GEFS and CFS models in predicting the TCs affecting the Taiwan area. The first objective of this study is to analyze the skill of NCEP GEFS and CFS TC forecasts and quantify the forecast uncertainties via verification of categorical binary forecasts and probabilistic forecasts. The second objective is to investigate the relationships between large-scale environmental factors [e.g., El Niño-Southern Oscillation (ENSO), Madden-Julian Oscillation (MJO), etc.] and the model forecast errors by using the reforecasts. Preliminary results indicate that the skill of the TC activity forecasts based on the raw forecasts can be further improved if the model biases are minimized by utilizing these reforecasts.
Wind power forecasting: IEA Wind Task 36 & future research issues
Giebel, G.; Cline, J.; Frank, H.; ...
2016-10-03
Here, this paper presents the new International Energy Agency (IEA) Wind Task 36 on Forecasting and invites collaboration within the group. Wind power forecasts have been used operationally for over 20 years; despite this, there are still many opportunities to improve the forecasts, both on the weather prediction side and in how the forecasts are used. The new IEA Task on Forecasting for Wind Energy organises international collaboration among national meteorological centres with an interest in and/or large projects on wind forecast improvements (NOAA, DWD, MetOffice, met.no, DMI,...), operational forecasters and forecast users. The Task is divided into three work packages. Firstly, collaboration on improving the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets. Secondly, an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts; this work package will also organise benchmarks in cooperation with the IEA Task WakeBench. Thirdly, engagement with end users aiming at dissemination of best practice in the usage of wind power predictions. As a first result, an overview of current issues for research in short-term forecasting of wind power is presented.
NASA Astrophysics Data System (ADS)
Akanda, A. S.; Jutla, A. S.; Islam, S.
2009-12-01
Despite ravaging the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.
Mesoscale Predictability and Error Growth in Short Range Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Gingrich, Mark
Although it was originally suggested that small-scale, unresolved errors corrupt forecasts at all scales through an inverse error cascade, some authors have proposed that those mesoscale circulations resulting from stationary forcing on the larger scale may inherit the predictability of the large-scale motions. Further, the relative contributions of large- and small-scale uncertainties in producing error growth in the mesoscales remain largely unknown. Here, 100 member ensemble forecasts are initialized from an ensemble Kalman filter (EnKF) to simulate two winter storms impacting the East Coast of the United States in 2010. Four verification metrics are considered: the local snow water equivalence, total liquid water, and 850 hPa temperatures representing mesoscale features; and the sea level pressure field representing a synoptic feature. It is found that while the predictability of the mesoscale features can be tied to the synoptic forecast, significant uncertainty existed on the synoptic scale at lead times as short as 18 hours. Therefore, mesoscale details remained uncertain in both storms due to uncertainties at the large scale. Additionally, the ensemble perturbation kinetic energy did not show an appreciable upscale propagation of error for either case. Instead, the initial condition perturbations from the cycling EnKF were maximized at large scales and immediately amplified at all scales without requiring initial upscale propagation. This suggests that relatively small errors in the synoptic-scale initialization may have more importance in limiting predictability than errors in the unresolved, small-scale initial conditions.
Extra-tropical Cyclones and Windstorms in Seasonal Forecasts
NASA Astrophysics Data System (ADS)
Leckebusch, Gregor C.; Befort, Daniel J.; Weisheimer, Antje; Knight, Jeff; Thornton, Hazel; Roberts, Julia; Hermanson, Leon
2015-04-01
Severe damage and large insured losses over Europe related to natural phenomena are mostly caused by extra-tropical cyclones and their related windstorm fields. Thus, an adequate representation of these events in seasonal prediction systems and reliable forecasts up to a season in advance would be of high value to society and the economy. In this study, state-of-the-art seasonal prediction systems (ECMWF, UK Met Office) are analysed regarding the general climatological representation and the seasonal prediction of extra-tropical cyclones and windstorms during the core winter season (DJF) with a lead time of up to four months. Two different algorithms are used to identify cyclones and windstorm events in these datasets. Firstly, we apply a cyclone identification and tracking algorithm based on the Laplacian of MSLP and, secondly, we use an objective wind field tracking algorithm to identify and track continuous areas of extremely high wind speeds (cf. Leckebusch et al., 2008), which can be related to extra-tropical winter cyclones. Thus, for the first time, we can analyse the forecast of severe wind events near the surface caused by extra-tropical cyclones. First results suggest that the spatial climatological distributions of windstorm and cyclone occurrence in the seasonal forecast systems generally validate well against reanalysis data (ERA-40 & ERA-Interim); however, large biases are found for some areas. The skill of the seasonal forecast systems in simulating the year-to-year variability of the frequency of severe windstorm events and cyclones is investigated using the ranked probability skill score. Positive skill is found over large parts of the Northern Hemisphere, including for the most intense extra-tropical cyclones and their related wind fields.
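The ranked probability skill score used above has a compact definition: the squared distance between cumulative forecast probabilities and the cumulative observation, summed over ordered categories and referenced to a climatological forecast. The tercile probabilities below are invented for illustration.

```python
# Hedged sketch (not the authors' code) of the ranked probability skill
# score (RPSS) for probabilistic tercile forecasts of windstorm frequency.
def rps(probs, obs_category):
    """Ranked probability score for one forecast over ordered categories."""
    cum_f = cum_o = score = 0.0
    for k, p in enumerate(probs):
        cum_f += p
        if k == obs_category:
            cum_o = 1.0  # cumulative observation steps to 1 at the observed class
        score += (cum_f - cum_o) ** 2
    return score

def rpss(forecasts, observations, climatology):
    """Skill relative to a constant climatological forecast (1 = perfect)."""
    rps_f = sum(rps(p, o) for p, o in zip(forecasts, observations))
    rps_c = sum(rps(climatology, o) for o in observations)
    return 1.0 - rps_f / rps_c

# Tercile categories: 0 = below, 1 = near, 2 = above normal (made-up values)
forecasts = [[0.6, 0.3, 0.1], [0.1, 0.2, 0.7], [0.2, 0.5, 0.3]]
observations = [0, 2, 1]                # observed category each winter
climatology = [1 / 3, 1 / 3, 1 / 3]     # uniform reference forecast
skill = rpss(forecasts, observations, climatology)
```

Positive values of `skill` indicate that the probabilistic forecasts beat climatology, which is the criterion applied in the study.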
NASA Astrophysics Data System (ADS)
Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi
2017-01-01
This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and the large-scale key predictors forecasted by the APCC MME for the same season. The cross-validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observations for 1982-2008. Cross-validation of the hybrid model with the individual models participating in the MME indicates that no single model consistently outperforms the others in predicting typhoon number. Although the forecast skill of the MME is not always the highest compared with that of each individual model, the MME shows a higher average correlation and a smaller variance of correlations. Given the large set of ensemble members from multiple models, a relative operating characteristic score reveals skills of 82 % and 78 % for the probabilistic prediction of above-normal and below-normal TY numbers, respectively. This implies an 82 % (78 %) probability that the forecasts can successfully discriminate above-normal (below-normal) years from other years. The forecast skill of the hybrid model for the past 7 years (2002-2008) is higher than that of the forecast from the Tropical Storm Risk consortium. Using the large set of ensemble members from multiple models, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to end-users, in particular the residents of tropical cyclone-prone areas in the Asia-Pacific region.
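The cross-validation behind the reported correlation can be sketched generically: drop one year, refit the statistical model on the remaining years, hindcast the withheld year, and correlate the resulting hindcasts with observations. The single synthetic predictor and simple linear fit below are assumptions for illustration, not the APCC MME predictors.

```python
# Generic leave-one-out cross-validation of a one-predictor statistical
# model, scored by the hindcast-observation correlation (synthetic data).
import random

random.seed(1)
years = 27  # e.g., hindcasts for 1982-2008
predictor = [random.gauss(0, 1) for _ in range(years)]          # synthetic large-scale index
observed = [0.8 * x + random.gauss(0, 0.5) for x in predictor]  # synthetic TY anomalies

def fit_predict(x_train, y_train, x_new):
    # Ordinary least-squares line fitted on the training years only
    mx = sum(x_train) / len(x_train)
    my = sum(y_train) / len(y_train)
    slope = (sum((x - mx) * (y - my) for x, y in zip(x_train, y_train))
             / sum((x - mx) ** 2 for x in x_train))
    return my + slope * (x_new - mx)

hindcasts = []
for i in range(years):  # leave-one-out: withhold year i, predict it
    x_tr = predictor[:i] + predictor[i + 1:]
    y_tr = observed[:i] + observed[i + 1:]
    hindcasts.append(fit_predict(x_tr, y_tr, predictor[i]))

def pearson(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

skill = pearson(hindcasts, observed)
```

Because every hindcast is made without the target year in the training set, this correlation is an honest estimate of out-of-sample skill, unlike an in-sample fit.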
NASA Astrophysics Data System (ADS)
Aberson, Sim D.; Franklin, James L.
1999-03-01
In 1997, the Tropical Prediction Center (TPC) began operational Gulfstream-IV jet aircraft missions to improve the numerical guidance for hurricanes threatening the continental United States, Puerto Rico, and the Virgin Islands. During these missions, the new generation of Global Positioning System dropwindsondes were released from the aircraft at 150-200-km intervals along the flight track in the environment of the tropical cyclone to obtain profiles of wind, temperature, and humidity from flight level to the surface. The observations were ingested into the global model at the National Centers for Environmental Prediction, which subsequently serves as initial and boundary conditions to other numerical tropical cyclone models. Because of a lack of tropical cyclone activity in the Atlantic basin, only five such missions were conducted during the inaugural 1997 hurricane season. Due to logistical constraints, sampling in all quadrants of the storm environment was accomplished in only one of the five cases during 1997. Nonetheless, the dropwindsonde observations improved mean track forecasts from the Geophysical Fluid Dynamics Laboratory hurricane model by as much as 32%, and the intensity forecasts by as much as 20% during the hurricane watch period (within 48 h of projected landfall). Forecasts from another dynamical tropical cyclone model (VICBAR) also showed modest improvements with the dropwindsonde observations. These improvements, if confirmed by a larger sample, represent a large step toward the forecast accuracy goals of TPC. The forecast track improvements are as large as those accumulated over the past 20-25 years, and those for forecast intensity provide further evidence that better synoptic-scale data can lead to more skillful dynamical tropical cyclone intensity forecasts.
Large Scale Skill in Regional Climate Modeling and the Lateral Boundary Condition Scheme
NASA Astrophysics Data System (ADS)
Veljović, K.; Rajković, B.; Mesinger, F.
2009-04-01
Several points are made concerning the somewhat controversial issue of regional climate modeling: should a regional climate model (RCM) be expected to maintain the large-scale skill of the driver global model that is supplying its lateral boundary condition (LBC)? Given that this is normally desired, is it able to do so without help via the fairly popular large-scale nudging? Specifically, without such nudging, will the RCM kinetic energy necessarily decrease with time compared to that of the driver model or analysis data, as suggested by a study using the Regional Atmospheric Modeling System (RAMS)? Finally, can the lateral boundary condition scheme make a difference: is the almost universally used but somewhat costly relaxation scheme necessary for desirable RCM performance? Experiments are made to explore these questions by running the Eta model in two versions differing in the lateral boundary scheme used. One of these schemes is the traditional relaxation scheme; the other is the Eta model scheme, in which information is used at the outermost boundary only and not all variables are prescribed at the outflow boundary. Forecast lateral boundary conditions are used, and results are verified against the analyses. Thus, the skill of the two RCM forecasts can be and is compared not only against each other but also against that of the driver global forecast. A novel verification method is used in the manner of customary precipitation verification, in that the forecast spatial wind speed distribution is verified against analyses by calculating bias-adjusted equitable threat scores and bias scores for wind speeds greater than chosen thresholds. In this way, focusing on a high wind speed value in the upper troposphere, we suggest that verification of large-scale features can be done in a manner that may be more physically meaningful than the spectral-decomposition verifications that are a standard RCM verification method.
The results we have at this point are somewhat limited, in view of the integrations having been done only for 10-day forecasts. Even so, one should note that they are among the very few done using forecast, as opposed to reanalysis or analysis, global driving data. Our results suggest that (1) when running the Eta as an RCM, no significant loss of large-scale kinetic energy with time seems to take place; (2) no disadvantage from using the Eta LBC scheme compared to the relaxation scheme is seen, while enjoying the advantage of a scheme that is significantly less demanding than relaxation, given that it needs driver model fields at the outermost domain boundary only; and (3) the Eta RCM skill in forecasting large scales, with no large-scale nudging, seems to be just about the same as that of the driver model, or, in the terminology of Castro et al., the Eta RCM does not lose the "value of the large scale" which exists in the larger global analyses used for the initial condition and for verification.
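The threshold-based wind-speed verification described above can be sketched as follows. Note that the study uses bias-adjusted equitable threat scores, whereas this illustration computes the plain ETS and frequency bias on made-up gridpoint values.

```python
# Sketch of threshold exceedance verification (our formulas, not the
# authors' code): equitable threat score and frequency bias for wind
# speeds above a threshold, on forecast/analysis fields sharing a grid.
def ets_and_bias(forecast, analysis, threshold):
    hits = misses = false_alarms = 0
    total = len(forecast)
    for f, a in zip(forecast, analysis):
        fe, ae = f >= threshold, a >= threshold
        if fe and ae:
            hits += 1
        elif ae:
            misses += 1
        elif fe:
            false_alarms += 1
    # Hits expected by chance for a random forecast with the same event counts
    chance = (hits + false_alarms) * (hits + misses) / total
    ets = (hits - chance) / (hits + misses + false_alarms - chance)
    bias = (hits + false_alarms) / (hits + misses)  # forecast/observed frequency
    return ets, bias

# Made-up upper-tropospheric wind speeds (m/s) at 8 grid points
forecast = [52, 48, 61, 35, 55, 40, 70, 30]
analysis = [50, 55, 58, 33, 41, 45, 66, 29]
ets, bias = ets_and_bias(forecast, analysis, 50)
```

ETS ranges up to 1 (perfect), with 0 meaning no skill beyond chance; a bias of 1 means the exceedance area is forecast at the observed frequency.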
NASA Technical Reports Server (NTRS)
Regonda, Satish K.; Zaitchik, Benjamin F.; Badr, Hamada S.; Rodell, Matthew
2016-01-01
Dynamically based seasonal forecasts are prone to systematic spatial biases due to imperfections in the underlying global climate model (GCM). This can result in low forecast skill when the GCM misplaces teleconnections or fails to resolve geographic barriers, even if the prediction of large-scale dynamics is accurate. To characterize and address this issue, this study applies objective climate regionalization to identify discrepancies between the Climate Forecast System Version 2 (CFSv2) and precipitation observations across the Contiguous United States (CONUS). Regionalization shows that CFSv2 1-month forecasts capture the general spatial character of warm season precipitation variability, but that forecast regions systematically differ from observations in some transition zones. CFSv2 predictive skill for these misclassified areas is systematically reduced relative to correctly regionalized areas and CONUS as a whole. In these incorrectly regionalized areas, higher skill can be obtained by using a regional-scale forecast in place of the local grid cell prediction.
Forecasting the spatial transmission of influenza in the United States.
Pei, Sen; Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey
2018-03-13
Recurrent outbreaks of seasonal and pandemic influenza create a need for forecasts of the geographic spread of this pathogen. Although it is well established that the spatial progression of infection is largely attributable to human mobility, difficulty obtaining real-time information on human movement has limited its incorporation into existing infectious disease forecasting techniques. In this study, we develop and validate an ensemble forecast system for predicting the spatiotemporal spread of influenza that uses readily accessible human mobility data and a metapopulation model. In retrospective state-level forecasts for 35 US states, the system accurately predicts local influenza outbreak onset, i.e., spatial spread, defined as the week that local incidence increases above a baseline threshold, up to 6 wk in advance of this event. In addition, the metapopulation prediction system forecasts influenza outbreak onset, peak timing, and peak intensity more accurately than isolated location-specific forecasts. The proposed framework could be applied to emergent respiratory viruses and, with appropriate modifications, other infectious diseases.
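A metapopulation model of the general kind used in such forecast systems can be sketched with two patches coupled by a mixing (commuting) matrix. All parameters, the mixing matrix, and the population sizes below are invented; the study's actual model, mobility data, and data assimilation are far richer.

```python
# Minimal two-patch SIR metapopulation sketch (illustrative values only).
# Commuting couples the force of infection between patches: residents of
# patch i acquire infection in proportion to prevalence where they mix.
def step(S, I, R, N, beta, gamma, mix, dt=1.0):
    lam = [0.0, 0.0]  # force of infection felt by residents of each patch
    for i in range(2):
        for j in range(2):
            lam[i] += beta * mix[i][j] * I[j] / N[j]
    new_inf = [lam[i] * S[i] * dt for i in range(2)]
    new_rec = [gamma * I[i] * dt for i in range(2)]
    S = [S[i] - new_inf[i] for i in range(2)]
    I = [I[i] + new_inf[i] - new_rec[i] for i in range(2)]
    R = [R[i] + new_rec[i] for i in range(2)]
    return S, I, R

N = [1e6, 5e5]                          # hypothetical patch populations
S, I, R = [N[0] - 10, N[1]], [10.0, 0.0], [0.0, 0.0]  # seed patch 1 only
mix = [[0.95, 0.05], [0.05, 0.95]]      # row i: where residents of i mix
for _ in range(200):                    # 50 days at dt = 0.25
    S, I, R = step(S, I, R, N, beta=0.5, gamma=1 / 3, mix=mix, dt=0.25)
```

Even this toy version reproduces the qualitative behavior the forecast system exploits: the outbreak seeded in one patch spreads to the other with a delay set by the mobility coupling.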
Evaluation of NU-WRF Rainfall Forecasts for IFloodS
NASA Technical Reports Server (NTRS)
Wu, Di; Peters-Lidard, Christa; Tao, Wei-Kuo; Petersen, Walter
2016-01-01
The Iowa Flood Studies (IFloodS) campaign was conducted in eastern Iowa as a pre-GPM-launch campaign from 1 May to 15 June 2013. During the campaign period, real-time forecasts were conducted with the NASA-Unified Weather Research and Forecasting (NU-WRF) model to support the everyday weather briefing. In this study, two sets of NU-WRF rainfall forecasts are evaluated against Stage IV and Multi-Radar Multi-Sensor (MRMS) Quantitative Precipitation Estimation (QPE), with the objective of understanding the impact of land surface initialization on the predicted precipitation. NU-WRF is also compared with the North American Mesoscale Forecast System (NAM) 12-km forecast. In general, NU-WRF did a good job of capturing individual precipitation events, and it reproduces the rainfall spatial distribution better than NAM. Further sensitivity tests show that the high resolution has a positive impact on the rainfall forecast. The two sets of NU-WRF simulations produce very similar rainfall characteristics. Land surface initialization does not show a significant impact on the short-term rainfall forecast, largely because of the soil conditions during the field campaign period.
Stochastic Convection Parameterizations
NASA Technical Reports Server (NTRS)
Teixeira, Joao; Reynolds, Carolyn; Suselj, Kay; Matheou, Georgios
2012-01-01
Keywords: computational fluid dynamics, radiation, clouds, turbulence, convection, gravity waves, surface interaction, radiation interaction, cloud and aerosol microphysics, complexity (vegetation, biogeochemistry), radiation versus turbulence/convection stochastic approach, non-linearities, Monte Carlo, high resolutions, large-eddy simulations, cloud structure, plumes, saturation in tropics, forecasting, parameterizations, stochastic, radiation-cloud interaction, hurricane forecasts
Short-term forecasting of emergency inpatient flow.
Abraham, Gad; Byrnes, Graham B; Bain, Christopher A
2009-05-01
Hospital managers have to manage resources effectively while maintaining a high quality of care. For hospitals where admissions from the emergency department to the wards represent a large proportion of admissions, the ability to forecast these admissions and the resultant ward occupancy is especially useful for resource planning purposes. Since emergency admissions often compete with planned elective admissions, modeling emergency demand may result in improved elective planning as well. We compare several models for forecasting daily emergency inpatient admissions and occupancy. The models are applied to three years of daily data. By measuring their mean square error in a cross-validation framework, we find that emergency admissions are largely random and, hence, unpredictable, whereas emergency occupancy can be forecasted up to one week ahead using a model combining regression and an autoregressive integrated moving average (ARIMA) model, or a seasonal ARIMA model. Faced with variable admissions and occupancy, hospitals must prepare a reserve capacity of beds and staff. Our approach allows estimation of the required reserve capacity.
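As a simplified stand-in for the regression-plus-ARIMA approach (not the authors' specification), one can forecast occupancy with a weekly seasonal mean plus an AR(1) correction on its residuals, and compare out-of-sample mean square errors. The weekday profile and noise process below are synthetic.

```python
# Synthetic occupancy: hypothetical weekday profile + persistent AR(1)
# departures, mimicking the structure the paper's models exploit.
import random

random.seed(2)
weekly = [30, 28, 27, 27, 29, 33, 35]   # hypothetical mean beds per weekday
occ, resid = [], 0.0
for t in range(7 * 60):
    resid = 0.7 * resid + random.gauss(0, 2)
    occ.append(weekly[t % 7] + resid)

train, test = occ[:7 * 52], occ[7 * 52:]  # one year train, eight weeks test

# Seasonal component: mean occupancy per weekday from the training data
season = [sum(train[t] for t in range(d, len(train), 7)) / (len(train) // 7)
          for d in range(7)]
res = [train[t] - season[t % 7] for t in range(len(train))]
# AR(1) coefficient of the residuals via lag-1 least squares
phi = (sum(res[t] * res[t - 1] for t in range(1, len(res)))
       / sum(r * r for r in res[:-1]))

# One-day-ahead forecasts on the test period, with and without AR(1)
err_seasonal = err_ar = 0.0
prev_res = res[-1]
for t, y in enumerate(test):
    base = season[(len(train) + t) % 7]
    err_seasonal += (y - base) ** 2
    err_ar += (y - (base + phi * prev_res)) ** 2
    prev_res = y - base
mse_seasonal = err_seasonal / len(test)
mse_ar = err_ar / len(test)
```

Because the departures are persistent, correcting the seasonal forecast with the previous day's residual reduces the one-day-ahead error, which is the same qualitative gain the paper reports for its regression-ARIMA combination.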
Arctic sea ice trends, variability and implications for seasonal ice forecasting.
Serreze, Mark C; Stroeve, Julienne
2015-07-13
September Arctic sea ice extent over the period of satellite observations has a strong downward trend, accompanied by pronounced interannual variability with a detrended 1-year-lag autocorrelation of essentially zero. We argue that through a combination of thinning and associated processes related to a warming climate (a stronger albedo feedback, a longer melt season, the lack of especially cold winters) the downward trend itself is steepening. The lack of autocorrelation reflects both the large inherent variability in summer atmospheric circulation patterns and the fact that oceanic heat loss in winter acts as a negative (stabilizing) feedback, albeit one insufficient to counter the steepening trend. These findings have implications for seasonal ice forecasting. In particular, while advances in observing sea ice thickness and assimilating thickness into coupled forecast systems have improved forecast skill, there remains an inherent limit to predictability owing to the largely chaotic nature of atmospheric variability. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
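The detrended 1-year-lag autocorrelation quoted above is straightforward to compute: remove a linear trend, then correlate the residual series with itself at lag 1. The extent series below is synthetic (linear trend plus white noise), chosen only to mimic the qualitative behaviour described.

```python
# Sketch of the diagnostic: lag-1 autocorrelation of linearly detrended
# September ice extent (synthetic stand-in for the satellite record).
import random

def detrended_lag1_autocorr(series):
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
             / sum((t - t_mean) ** 2 for t in range(n)))
    resid = [y - (y_mean + slope * (t - t_mean)) for t, y in enumerate(series)]
    num = sum(resid[t] * resid[t - 1] for t in range(1, n))
    den = sum(r * r for r in resid)
    return num / den

random.seed(3)
# Downward trend plus white interannual noise: autocorrelation near zero
extent = [7.5 - 0.08 * t + random.gauss(0, 0.5) for t in range(36)]
r1 = detrended_lag1_autocorr(extent)
```

A near-zero value of `r1`, as in the observed record, means last September's anomaly carries essentially no memory into the next, which is the predictability limit the abstract emphasizes.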
NASA Technical Reports Server (NTRS)
Kozlowski, Danielle; Zavodsky, Bradley T.; Jedlovec, Gary J.
2011-01-01
The Short-term Prediction Research and Transition Center (SPoRT) is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service (NWS) Weather Forecasting Offices (WFO). As part of the transition-to-operations process, SPoRT attempts to identify possible limitations in satellite observations and provide operational forecasters with a product that will have the most impact on their forecasts. One operational forecast challenge that some NWS offices face is forecasting convection in data-void regions such as large bodies of water. The Atmospheric Infrared Sounder (AIRS) is a sounding instrument aboard NASA's Aqua satellite that provides temperature and moisture profiles of the atmosphere. This paper demonstrates an approach to assimilating AIRS profile data into a regional configuration of the WRF model using its three-dimensional variational (3DVAR) assimilation component to be used as a proxy for the individual profiles.
A simplified real time method to forecast semi-enclosed basins storm surge
NASA Astrophysics Data System (ADS)
Pasquali, D.; Di Risio, M.; De Girolamo, P.
2015-11-01
Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf, and their shape can lead to strong sea level set-up. A real-time system aimed at forecasting storm surge may be of great help in protecting human activities (i.e. forecasting flooding due to storm surge events), managing ports, and safeguarding coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
Monthly mean forecast experiments with the GISS model
NASA Technical Reports Server (NTRS)
Spar, J.; Atlas, R. M.; Kuo, E.
1976-01-01
The GISS general circulation model was used to compute global monthly mean forecasts for January 1973, 1974, and 1975 from initial conditions on the first day of each month and constant sea surface temperatures. Forecasts were evaluated in terms of global and hemispheric energetics, zonally averaged meridional and vertical profiles, forecast error statistics, and monthly mean synoptic fields. Although it generated a realistic mean meridional structure, the model did not adequately reproduce the observed interannual variations in the large scale monthly mean energetics and zonally averaged circulation. The monthly mean sea level pressure field was not predicted satisfactorily, but annual changes in the Icelandic low were simulated. The impact of temporal sea surface temperature variations on the forecasts was investigated by comparing two parallel forecasts for January 1974, one using climatological ocean temperatures and the other observed daily ocean temperatures. The use of daily updated sea surface temperatures produced no discernible beneficial effect.
Comparison of Wind Power and Load Forecasting Error Distributions: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, B. M.; Florita, A.; Orwig, K.
2012-07-01
The introduction of large amounts of variable and uncertain power sources, such as wind power, into the electricity grid presents a number of challenges for system operations. One issue involves the uncertainty associated with scheduling power that wind will supply in future timeframes. However, this is not an entirely new challenge; load is also variable and uncertain, and is strongly influenced by weather patterns. In this work we make a comparison between the day-ahead forecasting errors encountered in wind power forecasting and load forecasting. The study examines the distribution of errors from operational forecasting systems in two different Independent System Operator (ISO) regions for both wind power and load forecasts at the day-ahead timeframe. The day-ahead timescale is critical in power system operations because it serves the unit commitment function for slow-starting conventional generators.
NASA Technical Reports Server (NTRS)
Atlas, R.
1984-01-01
Results are presented from a series of forecast experiments which were conducted to assess the importance of large-scale dynamical processes, diabatic heating, and initial data to the prediction of the President's Day cyclone. The synoptic situation and NMC model forecasts for this case are summarized, and the analysis/forecast system and experiments are described. The GLAS Model forecast from the GLAS analysis at 0000 GMT 18 February is found to have correctly predicted intense coastal cyclogenesis and heavy precipitation. A forecast with surface heat and moisture fluxes eliminated failed to predict any cyclogenesis while a similar forecast with only the surface moisture flux excluded showed weak development. Diabatic heating resulting from oceanic fluxes significantly contributed to the generation of low-level cyclonic vorticity and the intensification and slow rate of movement of an upper level ridge over the western Atlantic.
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
Sources of Wind Variability at a Single Station in Complex Terrain During Tropical Cyclone Passage
2013-12-01
Acronym glossary fragment: ... Mesoscale Prediction System; CPA, closest point of approach; ET, extratropical transition; FNMOC, Fleet Numerical Meteorology and Oceanography Center. ... forecasts. However, the TC forecast tracks and warnings they issue necessarily focus on the large-scale structure of the storm, and are not ... winds at one station. Also, this technique is a storm-centered forecast and, even if the grid spacing is on the order of one kilometer, it is unlikely ...
Regime-dependence of Impacts of Radar Rainfall Data Assimilation
NASA Astrophysics Data System (ADS)
Craig, G. C.; Keil, C.
2009-04-01
Experience from the first operational trials of assimilating radar data in kilometre-scale numerical weather prediction models (operating without cumulus parameterisation) shows that the positive impact of the radar data on convective precipitation forecasts typically decays within a few hours, although certain cases show much longer impacts. Here the impact time of radar data assimilation is related to characteristics of the meteorological environment. The resulting quantitative precipitation forecast (QPF) uncertainty is investigated using an ensemble of 10 forecasts at 2.8 km horizontal resolution based on different initial and boundary conditions from a global forecast ensemble. Control forecasts are compared with forecasts in which radar reflectivity data are assimilated using latent heat nudging. Examination of different cases of convection in southern Germany suggests that the forecasts can be separated into two regimes using a convective timescale. Short impact times are associated with short convective timescales that are characteristic of equilibrium convection. In this regime the statistical properties of the convection are constrained by the large-scale forcing, and effects of the radar data are lost within a few hours as the convection rapidly returns to equilibrium. When the convective timescale is large (non-equilibrium conditions), the impact of the radar data is longer, since convective systems are triggered by the latent heat nudging and are able to persist for many hours in the very unstable conditions present in these cases.
New ROMP Synthesis of Ferrocenyl Dendronized Polymers.
Liu, Xiong; Ling, Qiangjun; Zhao, Li; Qiu, Guirong; Wang, Yinghong; Song, Lianxiang; Zhang, Ying; Ruiz, Jaime; Astruc, Didier; Gu, Haibin
2017-10-01
First- and second-generation Percec-type dendronized ferrocenyl norbornene macromonomers containing, respectively, three and nine ferrocenyl termini are synthesized and polymerized by ring-opening metathesis polymerization (ROMP) using Grubbs' third-generation olefin metathesis catalyst with several monomer/catalyst feed ratios between 10 and 50. The rate of polymerization is highly dependent on the generation of the dendronized macromonomers, but all these ROMP reactions are controlled, and near-quantitative monomer conversions are achieved. The numbers of ferrocenyl groups, determined by cyclic voltammetry using the Bard-Anson method, agree with the theoretical values. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Using Landslide Failure Forecast Models in Near Real Time: the Mt. de La Saxe case-study
NASA Astrophysics Data System (ADS)
Manconi, Andrea; Giordan, Daniele
2014-05-01
Forecasting the occurrence of landslide phenomena in space and time is a major scientific challenge. The approaches used to forecast landslides mainly depend on the spatial scale analyzed (regional vs. local), the temporal range of forecast (long- vs. short-term), as well as the triggering factor and the landslide typology considered. Focusing on short-term forecast methods for large, deep-seated slope instabilities, the potential time of failure (ToF) can be estimated by studying the evolution of the landslide deformation over time (i.e., strain rate), provided that, under constant stress conditions, landslide materials follow a creep mechanism before reaching rupture. In the last decades, different procedures have been proposed to estimate the ToF by applying simplified empirical and/or graphical methods to time series of deformation data. Fukuzono (1985) proposed a failure forecast method based on large-scale laboratory experiments aimed at observing the kinematic evolution of a landslide induced by rain. This approach, also known as the inverse-velocity method, considers the evolution over time of the inverse of the surface velocity (v) as an indicator of the ToF, by assuming that failure approaches as 1/v tends to zero. Here we present an innovative method aimed at forecasting landslide failure from near-real-time monitoring data. Starting from the inverse-velocity theory, we analyze landslide surface displacements over different temporal windows, and then apply straightforward statistical methods to obtain confidence intervals on the time of failure. Our results can be relevant to support the management of early warning systems during landslide emergency conditions, including when predefined displacement and/or velocity thresholds are exceeded.
In addition, our statistical approach for the definition of confidence intervals and forecast reliability can also be applied to different failure forecast methods. We applied the approach presented here for the first time in near real time during the emergency scenario relevant to the reactivation of the La Saxe rockslide, a large mass movement threatening the population of Courmayeur, northern Italy, and the important European route E25. We show how the application of simplified but robust forecast models can be a convenient way to manage and support early warning systems during critical situations. References: Fukuzono T. (1985), A New Method for Predicting the Failure Time of a Slope, Proc. IVth International Conference and Field Workshop on Landslides, Tokyo.
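The inverse-velocity method admits a compact sketch: fit a straight line to 1/v against time and take its zero crossing as the estimated time of failure. The velocity data below are synthetic and follow the idealized accelerating-creep form exactly, so the extrapolation recovers the assumed failure time.

```python
# Sketch of Fukuzono's inverse-velocity method (illustrative, synthetic
# data): as failure approaches, 1/v tends to zero, so a least-squares
# line through 1/v vs. time extrapolated to zero estimates the ToF.
def inverse_velocity_tof(times, velocities):
    inv = [1.0 / v for v in velocities]
    n = len(times)
    tm = sum(times) / n
    im = sum(inv) / n
    slope = (sum((t - tm) * (i - im) for t, i in zip(times, inv))
             / sum((t - tm) ** 2 for t in times))
    intercept = im - slope * tm
    return -intercept / slope  # time at which the fitted 1/v line crosses zero

# Synthetic monitoring record: velocity accelerating toward failure at t = 10
times = [0, 2, 4, 6, 8]
velocities = [1.0 / (10 - t) for t in times]
tof = inverse_velocity_tof(times, velocities)
```

Repeating the fit over sliding temporal windows, as the abstract describes, yields a distribution of ToF estimates from which confidence intervals can be derived.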
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Donnellan, A.; Graves, W.; Tiampo, K. F.; Klein, W.
2009-12-01
Risks from natural and financial catastrophes are currently managed by a combination of large public and private institutions. Public institutions usually comprise government agencies that conduct studies, formulate policies and guidelines, enforce regulations, and make “official” forecasts. Private institutions include insurance and reinsurance companies, and financial service companies that underwrite catastrophe (“cat”) bonds and make private forecasts. Although decisions about allocating resources and developing solutions are made by large institutions, the costs of dealing with catastrophes fall for the most part on businesses and the general public. Information on potential risks is generally available to the public for some hazards but not others. For example, in the case of weather, private forecast services are provided by www.weather.com and www.wunderground.com. For earthquakes in California (only), the official forecast is the WGCEP-USGS forecast, but it is provided in a format that is difficult for the public to use. Other privately made forecasts are currently available, for example by the JPL QuakeSim and Russian groups, but these efforts are limited. As more of the world’s population moves into major seismic zones, new strategies are needed to allow individuals to manage their personal risk from large and damaging earthquakes. Examples include individual mitigation measures such as retrofitting, microinsurance in both developing and developed countries, and other financial strategies. We argue that the “long tail” of the internet offers an ideal, and greatly underutilized, mechanism to reach out to consumers and provide them with the information and tools they need to confront and manage seismic hazard and risk on an individual, personalized basis. Information of this type includes not only global hazard forecasts, which are now possible, but also global risk estimation.
Additionally, social networking tools are available that will allow self-organizing, disaster-resilient communities to arise as emergent structures from the underlying nonlinear social dynamics. In this talk, we argue that the current style of risk management is not making adequate use of modern internet technology, and that significantly more can be done. We suggest several avenues to proceed, in particular making use of the internet for earthquake forecast and information delivery, as well as tracking forecast validation and verification on a real-time basis. We also show examples of forecasts delivered over the internet, and describe how these are made.
Improving Assimilated Global Data Sets using TMI Rainfall and Columnar Moisture Observations
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.; daSilva, Arlindo M.; Olson, William S.
1999-01-01
A global analysis that optimally combines observations from diverse sources with physical models of atmospheric and land processes can provide a comprehensive description of the climate system. Currently, such data products contain significant errors in primary hydrological fields such as precipitation and evaporation, especially in the tropics. In this study, we show that assimilating precipitation and total precipitable water (TPW) retrievals derived from the TRMM Microwave Imager (TMI) improves not only the hydrological cycle but also key climate parameters such as clouds, radiation, and the large-scale circulation produced by the Goddard Earth Observing System (GEOS) data assimilation system (DAS). In particular, assimilating TMI rain improves clouds and radiation in areas of active convection, as well as the latent heating distribution and the large-scale motion field in the tropics, while assimilating TMI TPW retrievals leads to reduced moisture biases and improved radiative fluxes in clear-sky regions. The improved analysis also improves short-range forecasts in the tropics. Ensemble forecasts initialized with the GEOS analysis incorporating TMI rain rates and TPW yield smaller biases in tropical precipitation forecasts beyond 1 day and better 500 hPa geopotential height forecasts up to 5 days. Results of this study demonstrate the potential of using high-quality space-borne rainfall and moisture observations to improve the quality of assimilated global data for climate analysis and weather forecasting applications.
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in Bayesian flood forecasting should focus on the assimilation of newly available sources of information and on improving predictive performance assessment methods.
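As a minimal illustration of the Bayesian updating at the core of BFS, the Gaussian (conjugate-normal) special case below combines a climatological prior with a deterministic model forecast into a predictive distribution; the function name and all numbers are illustrative assumptions, not the full BFS formulation:

```python
def bfs_predictive(prior_mean, prior_var, model_forecast, model_error_var):
    """Gaussian Bayesian processor: combine a climatological prior with a
    deterministic model forecast into a predictive (posterior) distribution.
    Precisions (inverse variances) add; the mean is precision-weighted."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / model_error_var)
    post_mean = post_var * (prior_mean / prior_var
                            + model_forecast / model_error_var)
    return post_mean, post_var

# Illustrative numbers: climatological stage prior N(100, 400),
# deterministic model forecast of 160 with error variance 100.
mean, var = bfs_predictive(100.0, 400.0, 160.0, 100.0)
```

With these numbers the model forecast, being more precise than climatology, pulls the predictive mean toward the model value while the predictive variance drops below both input variances.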
A Case Study of the Impact of AIRS Temperature Retrievals on Numerical Weather Prediction
NASA Technical Reports Server (NTRS)
Reale, O.; Atlas, R.; Jusem, J. C.
2004-01-01
Large errors in numerical weather prediction are often associated with explosive cyclogenesis. Most studies focus on the under-forecasting error, i.e. cases of rapidly developing cyclones which are poorly predicted in numerical models. However, the over-forecasting error (i.e., predicting an explosively developing cyclone which does not occur in reality) is a very common error that severely impacts the forecasting skill of all models and may also carry economic costs when associated with operational forecasting: unnecessary precautions taken by marine activities can result in severe economic loss. Moreover, frequent occurrence of over-forecasting can undermine reliance on operational weather forecasting. Therefore, it is important to understand and reduce predictions of extreme weather associated with explosive cyclones which do not actually develop. In this study we choose a very prominent case of over-forecasting error in the northwestern Pacific. A 960 hPa cyclone develops in less than 24 hours in the 5-day forecast, with a deepening rate of about 30 hPa in one day. The cyclone is not observed in the analyses and is thus a case of severe over-forecasting. By assimilating AIRS data, the error is largely eliminated. By following the propagation of the anomaly that generates the spurious cyclone, it is found that a small mid-tropospheric geopotential height negative anomaly over the northern part of the Indian subcontinent in the initial conditions propagates eastward, is amplified by orography, and generates a very intense jet streak in the subtropical jet stream, with consequent explosive cyclogenesis over the Pacific. The AIRS assimilation eliminates this anomaly, which may have been caused by erroneous upper-air data, and represents the jet stream more correctly. The energy associated with the jet is distributed over a much broader area and, as a consequence, multiple but much more moderate cyclogenesis is observed.
Fennec dust forecast intercomparison over the Sahara in June 2011
NASA Astrophysics Data System (ADS)
Chaboureau, Jean-Pierre; Flamant, Cyrille; Dauhut, Thibaut; Kocha, Cécile; Lafore, Jean-Philippe; Lavaysse, Christophe; Marnas, Fabien; Mokhtari, Mohamed; Pelon, Jacques; Reinares Martínez, Irene; Schepanski, Kerstin; Tulet, Pierre
2016-06-01
In the framework of the Fennec international programme, a field campaign was conducted in June 2011 over the western Sahara. It led to the first observational data set ever obtained that documents the dynamics, thermodynamics and composition of the Saharan atmospheric boundary layer (SABL) under the influence of the heat low. In support of the aircraft operations, four dust forecasts were run daily at low and high resolutions with convection-parameterizing and convection-permitting models, respectively. The unique airborne and ground-based data sets allowed the first ever intercomparison of dust forecasts over the western Sahara. At the monthly scale, large aerosol optical depths (AODs) were forecast over the Sahara, a feature observed by satellite retrievals but with different magnitudes. The AOD intensity was correctly predicted by the high-resolution models, while it was underestimated by the low-resolution models. This was partly because of the generation of strong near-surface winds associated with thunderstorm-related density currents that could only be reproduced by models representing convection explicitly. Such models yield emissions mainly in the afternoon that dominate the total emission over the western fringes of the Adrar des Iforas and the Aïr Mountains in the high-resolution forecasts. Over the western Sahara, where the harmattan contributes up to 80 % of dust emission, all the models were successful in forecasting the deep well-mixed SABL. Some of them, however, missed the large near-surface dust concentration generated by density currents and low-level winds. This feature, observed repeatedly by the airborne lidar, was partly forecast by one high-resolution model only.
NASA Astrophysics Data System (ADS)
Naulin, Jean-Philippe; Payrastre, Olivier; Gaume, Eric; Delrieu, Guy
2013-04-01
Accurate flood forecasts are crucial for efficient flood event management. Until now, hydro-meteorological forecasts have mainly been used for early warnings in France (meteorological and flood vigilance maps) or elsewhere in the world (flash-flood guidance). These forecasts are generally limited to the main streams covered by the flood forecasting services or to specific watersheds with particular assets like check dams, which are in most cases well-gauged river sections, leaving aside large parts of the territory. A distributed hydro-meteorological forecasting approach will be presented, able to take advantage of the high spatial and temporal resolution rainfall estimates that are now available to provide information at ungauged sites. The proposed system, aimed at detecting road inundation risks, was initially developed and tested in areas of limited size. Its extension to a whole region (the Gard region in the south of France) will be presented, including over 2000 crossing points between rivers and roads, along with its validation against a large data set of actually reported road inundations observed during recent flash-flood events. These first validation results appear promising. Such a tool would provide the necessary information for flood event management services to identify the areas at risk and to take the appropriate safety and rescue measures: pre-positioning of rescue means, stopping traffic on exposed roads, and determination of safe accesses or evacuation routes. Moreover, beyond the specific application to the supervision of a road network, this work also provides results concerning the performance of hydro-meteorological forecasts for ungauged headwaters.
Code of Federal Regulations, 2012 CFR
2012-01-01
... forecast. The forecast should be used by the board of directors and the manager to guide the system towards... projected results of future actions planned by the borrower's board of directors; (2) The financial goals... type of large power loads, projections of future borrowings and the associated interest, projected...
NASA Astrophysics Data System (ADS)
Spennemann, Pablo; Rivera, Juan Antonio; Osman, Marisol; Saulo, Celeste; Penalba, Olga
2017-04-01
The importance of forecasting extreme wet and dry conditions from weeks to months in advance stems from the need to prevent considerable socio-economic losses, mainly in regions with large populations and where agriculture is key to the economy, such as Southern South America (SSA). Therefore, to improve understanding of the performance and uncertainties of seasonal soil moisture and precipitation forecasts over SSA, this study aims to: 1) perform a general assessment of the Climate Forecast System version 2 (CFSv2) soil moisture and precipitation forecasts; and 2) evaluate the ability of CFSv2 to represent an extreme drought event by merging observations with the forecasted Standardized Precipitation Index (SPI) and Standardized Soil Moisture Anomalies (SSMA) based on GLDAS-2.0 simulations. Results show that both SPI and SSMA forecast skill are regionally and seasonally dependent. In general, a fast degradation of forecast skill is observed as lead time increases, with no significant metrics for forecast lead times longer than 2 months. Based on the assessment of the 2008-2009 extreme drought event, it is evident that the CFSv2 forecasts have limitations in identifying drought onset, duration, severity and demise, considering both meteorological (SPI) and agricultural (SSMA) drought conditions. These results have implications for the use of seasonal forecasts to assist agricultural practices in SSA, given that forecast skill is still too low to be useful at lead times longer than 2 months.
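The SPI used above standardizes accumulated precipitation against a climatology. A minimal sketch is shown below using a plain z-score; the operational index first fits a Gamma distribution and maps it to a standard normal, and the numbers here are made up:

```python
import statistics

def spi_zscore(precip_climatology, current):
    """Simplified SPI: standardize the current accumulation against the
    climatological mean and standard deviation. (The operational SPI fits
    a Gamma distribution and maps it to a standard normal first.)"""
    mu = statistics.mean(precip_climatology)
    sigma = statistics.stdev(precip_climatology)
    return (current - mu) / sigma

# Illustrative monthly precipitation climatology (mm) and a dry month.
climatology = [80, 95, 110, 70, 120, 100, 90, 105]
index = spi_zscore(climatology, 60.0)  # negative: drier than normal
```

Values around -1 or below indicate drought conditions in the usual SPI classification.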
The state of the art of flood forecasting - Hydrological Ensemble Prediction Systems
NASA Astrophysics Data System (ADS)
Thielen-Del Pozo, J.; Pappenberger, F.; Salamon, P.; Bogner, K.; Burek, P.; de Roo, A.
2010-09-01
Flood forecasting systems form a key part of ‘preparedness' strategies for disastrous floods and provide hydrological services, civil protection authorities and the public with information on upcoming events. Provided the warning lead time is sufficiently long, adequate preparatory actions can be taken to efficiently reduce the impacts of the flooding. Because of the specific characteristics of each catchment, varying data availability and end-user demands, the design of the best flood forecasting system may differ from catchment to catchment. However, despite the differences in concept and data needs, there is one underlying issue that spans all systems. There has been a growing awareness and acceptance that uncertainty is a fundamental issue in flood forecasting and needs to be dealt with at different spatial and temporal scales as well as at the different stages of the flood-generating processes. Today, operational flood forecasting centres are increasingly changing from single deterministic forecasts to probabilistic forecasts with various representations of the different contributions of uncertainty. The move towards these so-called Hydrological Ensemble Prediction Systems (HEPS) in flood forecasting represents the state of the art in forecasting science, following on the success of the use of ensembles for weather forecasting (Buizza et al., 2005) and paralleling the move towards ensemble forecasting in other related disciplines such as climate change predictions. The use of HEPS has been internationally fostered by initiatives such as "The Hydrologic Ensemble Prediction Experiment" (HEPEX), created with the aim of investigating how best to produce, communicate and use hydrologic ensemble forecasts in short-, medium- and long-term prediction of hydrological processes.
The advantages of quantifying the different contributions of uncertainty, as well as the overall uncertainty, to obtain reliable and useful flood forecasts even for extreme events have become evident. However, despite the demonstrated advantages, the incorporation of HEPS in operational flood forecasting is still limited worldwide. The applicability of HEPS to smaller river basins was tested in MAP D-Phase, an acronym for "Demonstration of Probabilistic Hydrological and Atmospheric Simulation of flood Events in the Alpine region", which was launched in 2005 as a Forecast Demonstration Project of the World Weather Research Programme of WMO and entered a pre-operational, still active testing phase in 2007. In Europe, a comparatively high number of EPS-driven systems for medium-to-large rivers exist. The national flood forecasting centres of Sweden, Finland and the Netherlands have already implemented HEPS in their operational forecasting chains, while in other countries, including France, Germany, the Czech Republic and Hungary, hybrid or experimental chains have been installed. As an example of HEPS, the European Flood Alert System (EFAS) is presented. EFAS provides medium-range probabilistic flood forecasting information for large trans-national river basins. It incorporates multiple sets of weather forecasts, including different types of EPS and deterministic forecasts from different providers. EFAS products are evaluated and visualised as exceedances of critical levels only, both in the form of maps and time series. Different sources of uncertainty and their impact on flood forecasting performance have been tested offline for every grid cell but not yet incorporated operationally into the forecasting chain, for computational reasons. However, at stations where real-time discharges are available, a hydrological uncertainty processor is applied to estimate the total predictive uncertainty from the hydrological and input uncertainties.
Research on long-term EFAS results has shown the need for complementing statistical analysis with case studies for which examples will be shown.
An Investigation of Marine Fog Forecast Concepts.
1981-01-01
(Abstract garbled in the source scan; recoverable fragments concern forecasting West Coast marine fog, the role of downslope motion in lowering the inversion along a large portion of the coast, and time sequences of vertical wind profiles at radiosonde stations.)
Seasonal-to-Interannual Variability and Land Surface Processes
NASA Technical Reports Server (NTRS)
Koster, Randal
2004-01-01
Atmospheric chaos severely limits the predictability of precipitation on subseasonal to interannual timescales. Hope for accurate long-term precipitation forecasts lies with simulating atmospheric response to components of the Earth system, such as the ocean, that can be predicted beyond a couple of weeks. Indeed, seasonal forecast centers now rely heavily on forecasts of ocean circulation. Soil moisture, another slow component of the Earth system, is relatively ignored by the operational seasonal forecasting community. It is starting, however, to garner more attention. Soil moisture anomalies can persist for months. Because these anomalies can have a strong impact on evaporation and other surface energy fluxes, and because the atmosphere may respond consistently to anomalies in the surface fluxes, an accurate soil moisture initialization in a forecast system has the potential to provide additional forecast skill. This potential has motivated a number of atmospheric general circulation model (AGCM) studies of soil moisture and its contribution to variability in the climate system. Some of these studies even suggest that in continental midlatitudes during summer, oceanic impacts on precipitation are quite small relative to soil moisture impacts. The model results, though, are strongly model-dependent, with some models showing large impacts and others showing almost none at all. A validation of the model results with observations thus naturally suggests itself, but this is exceedingly difficult. The necessary contemporaneous soil moisture, evaporation, and precipitation measurements at the large scale are virtually non-existent, and even if they did exist, showing statistically that soil moisture affects rainfall would be difficult because the other direction of causality - wherein rainfall affects soil moisture - is unquestionably active and is almost certainly dominant.
Nevertheless, joint analyses of observations and AGCM results do reveal some suggestions of land-atmosphere feedback in the observational record, suggestions that soil moisture can affect precipitation over seasonal timescales and across certain large continental areas. The strength of this observed feedback in nature is not large but is still significant enough to be potentially useful, e.g., for forecasts. This talk will address all of these issues. It will begin with a brief overview of land surface modeling in atmospheric models but will then focus on recent research - using both observations and models - into the impact of land surface processes on variability in the climate system.
Predicting spatio-temporal failure in large scale observational and micro scale experimental systems
NASA Astrophysics Data System (ADS)
de las Heras, Alejandro; Hu, Yong
2006-10-01
Forecasting has become an essential part of modern thought, but its practical limitations are still manifold. We addressed future rates of change by comparing models that take into account time and models that focus more on space. Cox regression confirmed that linear change can be safely assumed in the short term. Spatially explicit Poisson regression provided a ceiling value for the number of deforestation spots. With several observed and estimated rates, it was decided to forecast using the more robust assumptions. A Markov-chain cellular automaton thus projected 5-year deforestation in the Amazonian Arc of Deforestation, showing that even a stable rate of change would largely deplete the forest area. More generally, the resolution and implementation of the existing models could explain many of the modelling difficulties still affecting forecasting.
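A Markov-chain cellular automaton of the kind used for the deforestation projection can be sketched as a grid whose forested cells are cleared with some probability when they border a deforested cell; the grid size, neighbourhood and transition probability below are illustrative assumptions:

```python
import random

def step(grid, p_clear):
    """One cellular-automaton step: a forested cell (1) bordering a
    deforested cell (0) is cleared with probability p_clear. The update
    is synchronous: transitions are evaluated on the old grid."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                neighbours = [grid[r + dr][c + dc]
                              for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                              if 0 <= r + dr < rows and 0 <= c + dc < cols]
                if 0 in neighbours and random.random() < p_clear:
                    new[r][c] = 0
    return new

random.seed(1)
grid = [[1, 1, 1],
        [1, 0, 1],   # one deforestation spot seeds the spread
        [1, 1, 1]]
for _ in range(5):
    grid = step(grid, 0.5)
forest_left = sum(sum(row) for row in grid)
```

Even this toy run illustrates the abstract's point: with a stable per-step clearing probability, forest loss compounds as the cleared frontier grows.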
NASA Astrophysics Data System (ADS)
Manconi, A.; Giordan, D.
2015-07-01
We apply failure forecast models by exploiting near-real-time monitoring data for the La Saxe rockslide, a large unstable slope threatening the Aosta Valley in northern Italy. Starting from the inverse velocity theory, we analyze landslide surface displacements automatically and in near real time over different temporal windows, and apply straightforward statistical methods to obtain confidence intervals on the estimated time of failure. Based on this case study, we identify operational thresholds that are established on the reliability of the forecast models. Our approach is aimed at supporting the management of early warning systems in the most critical phases of a landslide emergency.
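The inverse velocity theory mentioned above fits a line to 1/v versus time during accelerating creep; the time at which the fitted line reaches zero is the estimated failure time. A sketch with synthetic data (the displacement series is made up):

```python
def inverse_velocity_failure_time(times, velocities):
    """Fit a least-squares line to 1/v versus t; its zero crossing is the
    estimated time of failure (1/v -> 0 as velocity diverges)."""
    inv_v = [1.0 / v for v in velocities]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(inv_v) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, inv_v))
             / sum((t - t_mean) ** 2 for t in times))
    intercept = y_mean - slope * t_mean
    return -intercept / slope

# Synthetic accelerating creep: 1/v decays linearly (10, 8, 6, 4),
# so the extrapolated failure time is t = 5.
times = [0.0, 1.0, 2.0, 3.0]
velocities = [1.0 / 10.0, 1.0 / 8.0, 1.0 / 6.0, 1.0 / 4.0]
t_failure = inverse_velocity_failure_time(times, velocities)
```

Confidence intervals of the kind described in the abstract would come from the regression uncertainty on the slope and intercept, which this sketch omits.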
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin Wilde, Principal Investigator
2012-12-31
Application of Real-Time Offsite Measurements in Improved Short-Term Wind Ramp Prediction Skill. Improved forecasting performance immediately preceding wind ramp events is of preeminent concern to most wind energy companies, system operators, and balancing authorities. The value of near real-time hub-height-level wind data and more general meteorological measurements to short-term wind power forecasting is well understood. For some sites, access to onsite measured wind data - even historical - can reduce forecast error in the short-range to medium-range horizons by as much as 50%. Unfortunately, valuable free-stream wind measurements at tall towers are not typically available at most wind plants, thereby forcing wind forecasters to rely upon wind measurements below hub height and/or turbine nacelle anemometry. Free-stream measurements can be appropriately scaled to hub-height levels, using existing empirically derived relationships that account for surface roughness and turbulence. But there is large uncertainty in these relationships for a given time of day and state of the boundary layer. Alternatively, forecasts can rely entirely on turbine anemometry measurements, though such measurements are themselves subject to wake effects that are not stationary. The void in free-stream hub-height-level measurements of wind can be filled by remote sensing (e.g., sodar, lidar, and radar). However, the expense of such equipment may not be sustainable. There is a growing market for traditional anemometry on tall tower networks, maintained by third parties to the forecasting process (i.e., independent of forecasters and the forecast users). This study examines the value of offsite tall-tower data from the WINDataNOW Technology network for short-horizon wind power predictions at a wind farm in northern Montana.
The presentation shall describe successful physical and statistical techniques for its application and the practicality of its application in an operational setting. It shall be demonstrated that when used properly, the real-time offsite measurements materially improve wind ramp capture and prediction statistics, when compared to traditional wind forecasting techniques and to a simple persistence model.
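The empirical scaling of below-hub measurements to hub height is often done with a power-law shear profile; the sketch below uses the common 1/7 exponent, but as the abstract notes, this exponent carries large uncertainty with time of day and boundary-layer state (the heights and speed are illustrative):

```python
def power_law_scale(u_ref, z_ref, z_hub, alpha=1.0 / 7.0):
    """Scale a wind speed measured at height z_ref up to hub height z_hub
    with the empirical power law u(z) = u_ref * (z / z_ref)**alpha.
    The shear exponent alpha depends on roughness and stability."""
    return u_ref * (z_hub / z_ref) ** alpha

# Illustrative: 6 m/s measured at 10 m, scaled to an 80 m hub.
u_hub = power_law_scale(6.0, 10.0, 80.0)
```

In practice alpha is re-estimated from paired measurement levels rather than fixed, which is exactly where the stability-dependent uncertainty enters.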
NASA Astrophysics Data System (ADS)
Kozel, Tomas; Stary, Milos
2017-12-01
The main advantage of stochastic forecasting is the fan of possible values that a deterministic forecasting method cannot provide. The future development of a random process is described better by stochastic than by deterministic forecasting, and the discharge at a measurement profile can be categorized as a random process. The content of this article is the construction and application of a forecasting model for a managed large open water reservoir with a supply function. The model is based on neural networks (NS) and zone models, which forecast values of average monthly flow from input values of average monthly flow, the learned neural network and random numbers. Part of the data was sorted into one moving zone, created around the last measured average monthly flow, and the correlation matrix was assembled only from data belonging to the zone. The model was compiled for forecasts of 1 to 12 months, using backward monthly flows (NS inputs) from 2 to 11 months for model construction. The data were rid of asymmetry with the help of the Box-Cox rule (Box, Cox, 1964), with the value r found by optimization; in the next step the data were transformed to a standard normal distribution. The data have a monthly step and the forecast is not recurring. A 90-year-long real flow series was used to compile the model: the first 75 years were used for calibration of the model (the matrix of input-output relationships), and the last 15 years were used only for validation. Outputs of the model were compared with the real flow series. For the comparison between the real flow series (a 100% successful forecast) and the forecasts, both were applied to the management of an artificial reservoir. The course of water reservoir management using a genetic algorithm (GE) plus the real flow series was compared with a fuzzy model (Fuzzy) plus the forecast made by the moving zone model. During the evaluation process, the best size of the zone was sought.
Results show that the highest number of inputs did not give the best results, and that the ideal zone size lies in the interval from 25 to 35, within which the course of management was almost the same for all zone sizes. The resulting course of management was compared with the course obtained using GE plus the real flow series. The comparison showed that the fuzzy model with forecasted values was able to manage the main malfunctions, and the artificial disorders introduced by the model were found to be essential once the water volumes during management were evaluated. The forecasting model in combination with the fuzzy model provides very good results in the management of a water reservoir with a storage function and can be recommended for this purpose.
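The Box-Cox step described above can be sketched as follows; the exponent (called r in the abstract, fixed here at an illustrative 0.3 rather than found by optimization) and the flow values are assumptions:

```python
import math

def box_cox(x, lam):
    """Box-Cox transform used to remove asymmetry before forecasting."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def box_cox_inverse(y, lam):
    """Inverse transform, used to map forecasts back to flows."""
    if lam == 0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)

flows = [12.0, 30.0, 7.5, 55.0, 21.0]           # illustrative monthly flows
transformed = [box_cox(q, 0.3) for q in flows]  # reduced skewness
recovered = [box_cox_inverse(y, 0.3) for y in transformed]
```

After the transform, the series can be standardized to a normal distribution as described, forecast, and mapped back through the inverse.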
NASA Astrophysics Data System (ADS)
Singh, Sanjeev Kumar; Prasad, V. S.
2018-02-01
This paper presents a systematic investigation of medium-range rainfall forecasts from two versions of the National Centre for Medium Range Weather Forecasting (NCMRWF) Global Forecast System, based on a three-dimensional variational (3D-Var) and a hybrid analysis system, namely NGFS and HNGFS, respectively, during the Indian summer monsoon (June-September) of 2015. The NGFS uses the gridpoint statistical interpolation (GSI) 3D-Var data assimilation system, whereas HNGFS uses a hybrid 3D ensemble-variational scheme. The analysis includes the evaluation of rainfall fields and comparisons of rainfall using statistical scores such as mean precipitation, bias, correlation coefficient, root mean square error and forecast improvement factor. In addition, categorical scores like the Peirce skill score and bias score are computed to describe particular aspects of forecast performance. The comparison of mean precipitation reveals that both versions of the model produced similar large-scale features of Indian summer monsoon rainfall for day-1 through day-5 forecasts. The inclusion of fully flow-dependent background error covariance significantly improved the wet biases in HNGFS over the Indian Ocean. The forecast improvement factor and Peirce skill score of HNGFS have also been found to be better than those of NGFS for day-1 through day-5 forecasts.
On the Dominant Factor Controlling Seasonal Hydrological Forecast Skill in China
Zhang, Xuejun; Tang, Qiuhong; Leng, Guoyong; ...
2017-11-20
Initial conditions (ICs) and climate forecasts (CFs) are the two primary sources of seasonal hydrological forecast skill. However, their relative contribution to predictive skill remains unclear in China. In this study, we investigate the relative roles of ICs and CFs in cumulative runoff (CR) and soil moisture (SM) forecasts using 31-year (1980–2010) ensemble streamflow prediction (ESP) and reverse-ESP (revESP) simulations with the Variable Infiltration Capacity (VIC) hydrologic model. The results show that the relative importance of ICs and CFs largely depends on climate regimes. The influence of ICs is stronger in a dry or wet-to-dry climate regime that covers the northern and western interior regions during the late fall to early summer. In particular, ICs may dominate the forecast skill for up to three months or even six months during late fall and winter months, probably due to the low precipitation value and variability in the dry period. In contrast, CFs become more important for most of southern China or during summer months. The impact of ICs on SM forecasts tends to cover larger domains than on CR forecasts. These findings will greatly benefit future work that will target efforts towards improving current forecast levels for particular regions and forecast periods.
[Improved Euler algorithm for a trend forecast model and its application to oil spectrum analysis].
Zheng, Chang-song; Ma, Biao
2009-04-01
Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, and the gray method is well suited to trend forecasting. Using oil atomic spectrometric analysis results and the gray forecast theory, the present paper establishes a gray forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. Addressing the shortcomings of the gray method in trend forecasting, the improved Euler algorithm is put forward for the first time to resolve the problem of the gray model and avoid the imprecision caused by the old gray model's dependence of the forecast value on the first test value. The new method makes the forecast value more precise, as shown in the example. Combined with the threshold values of oil atomic spectrometric analysis, the new method was applied to Fe/Cu concentration forecasting and premonitory fault information was obtained, so that steps can be taken to prevent the fault; the algorithm can be popularized to state monitoring in industry.
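The combination described above, a standard GM(1,1) gray-model fit followed by an improved Euler (Heun) integration of the whitening equation dx1/dt = -a*x1 + b, can be sketched as below; the wear-metal series is illustrative, and the code is a sketch of the idea rather than the paper's exact algorithm:

```python
def gm11_fit(x0):
    """Fit a GM(1,1) gray model: estimate (a, b) in dx1/dt + a*x1 = b,
    where x1 is the cumulative sum of the observed series x0."""
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, len(x1))]  # background
    y = x0[1:]
    # Least squares for y = -a*z + b via the 2x2 normal equations.
    n = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    det = n * szz - sz * sz
    a = -(n * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    return a, b

def heun_forecast(x1_start, a, b, steps, h=1.0):
    """Integrate dx1/dt = -a*x1 + b with the improved Euler (Heun) scheme,
    avoiding the analytic solution's dependence on the first sample.
    Differences of successive path values forecast the original series."""
    x, path = x1_start, [x1_start]
    for _ in range(steps):
        f0 = -a * x + b             # slope at the current point
        f1 = -a * (x + h * f0) + b  # slope at the Euler predictor
        x += 0.5 * h * (f0 + f1)    # corrector: average of the two slopes
        path.append(x)
    return path

series = [10.0, 12.0, 14.4, 17.28]  # illustrative Fe concentrations
a, b = gm11_fit(series)
path = heun_forecast(sum(series), a, b, steps=3)
```

For a growing series the fitted development coefficient a is negative, so the integrated cumulative curve keeps rising and its increments give the concentration forecasts.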
Impact of Lidar Wind Sounding on Mesoscale Forecast
NASA Technical Reports Server (NTRS)
Miller, Timothy L.; Chou, Shih-Hung; Goodman, H. Michael (Technical Monitor)
2001-01-01
An Observing System Simulation Experiment (OSSE) was conducted to study the impact of airborne lidar wind sounding on mesoscale weather forecasts. A wind retrieval scheme, which interpolates wind data from a gridded data system, simulates the retrieval of wind profiles from a satellite lidar system. A mesoscale forecast system based on the PSU/NCAR MM5 model was developed that incorporates assimilation of the retrieved line-of-sight winds. To avoid the "identical twin" problem, the NCEP reanalysis data are used as the reference "nature" atmosphere. The simulated space-based lidar wind observations were retrieved by interpolating the NCEP values to the observation locations. A modified dataset, obtained by smoothing the NCEP dataset, was used as the initial state whose forecast was to be improved by assimilating the retrieved lidar observations. Forecasts using wind profiles with various lidar instrument parameters have been conducted. The results show that to significantly improve the mesoscale forecast the satellite should fly near the storm center with a large scanning radius. Increasing the lidar firing rate also improves the forecast. Cloud cover and lack of aerosol degrade the quality of the lidar wind data and, subsequently, the forecast.
Potential predictability and forecast skill in ensemble climate forecast: the skill-persistence rule
NASA Astrophysics Data System (ADS)
Jin, Y.; Rong, X.; Liu, Z.
2017-12-01
This study investigates the factors that impact the forecast skill for the real world (actual skill) and perfect model (perfect skill) in ensemble climate model forecast with a series of fully coupled general circulation model forecast experiments. It is found that the actual skill of sea surface temperature (SST) in seasonal forecast is substantially higher than the perfect skill on a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model could overwhelm the model bias to produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further examined using a first-order autoregressive model (AR1) analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by the sampling error and non-AR1 processes.
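The skill-persistence rule can be reproduced with exactly the kind of AR1 analogue experiment the abstract describes: when observed persistence r_obs exceeds the model's r_mod, the actual skill (about r_obs^lead) exceeds the perfect-model skill (about r_mod^lead). A numerical sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(r, n, x0):
    """One AR1 trajectory x(t) = r*x(t-1) + noise, stationary unit variance."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] + np.sqrt(1 - r**2) * rng.standard_normal()
    return x

r_obs, r_mod, lead, cases = 0.9, 0.7, 3, 4000
truth0 = rng.standard_normal(cases)                 # shared initial conditions
# Observed evolution has the higher persistence r_obs
obs = np.array([ar1(r_obs, lead + 1, x0)[-1] for x0 in truth0])
# Model forecast (large-ensemble mean) damps the IC with the model's r_mod
fcst = r_mod**lead * truth0
# Perfect-model "truth": one model trajectory started from the same IC
mod = np.array([ar1(r_mod, lead + 1, x0)[-1] for x0 in truth0])

actual = np.corrcoef(fcst, obs)[0, 1]    # ~ r_obs**lead = 0.73
perfect = np.corrcoef(fcst, mod)[0, 1]   # ~ r_mod**lead = 0.34
print(f"actual skill {actual:.2f} > perfect skill {perfect:.2f}")
```

With a finite ensemble or non-AR1 processes, sampling error can distort this ordering, which is the caveat the abstract notes.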
Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M
2014-12-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.
Precipitation and floodiness: forecasts of flood hazard at the regional scale
NASA Astrophysics Data System (ADS)
Stephens, Liz; Day, Jonny; Pappenberger, Florian; Cloke, Hannah
2016-04-01
In 2008, a seasonal forecast of an increased likelihood of above-normal rainfall in West Africa led the Red Cross to take early humanitarian action (such as prepositioning of relief items) on the basis that this forecast implied heightened flood risk. However, there are a number of factors that lead to non-linearity between precipitation anomalies and flood hazard, so in this presentation we use a recently developed global-scale hydrological model driven by the ERA-Interim/Land precipitation reanalysis (1980-2010) to quantify this non-linearity. Using these data, we introduce the concept of floodiness to measure the incidence of floods over a large area, and quantify the link between monthly precipitation, river discharge and floodiness anomalies. Our analysis shows that floodiness is not well correlated with precipitation, demonstrating the problem of using seasonal precipitation forecasts as a proxy for forecasting flood hazard. This analysis demonstrates the value of developing hydrometeorological forecasts of floodiness for decision-makers. As a result, we are now working with the European Centre for Medium-Range Weather Forecasts and the Joint Research Centre, as partners of the operational Global Flood Awareness System (GloFAS), to implement floodiness forecasts in real-time.
NASA Astrophysics Data System (ADS)
Nanda, Trushnamayee; Beria, Harsh; Sahoo, Bhabagrahi; Chatterjee, Chandranath
2016-04-01
The increasing frequency of hydrologic extremes in a warming climate calls for the development of reliable flood forecasting systems. The unavailability of meteorological parameters in real time, especially in developing parts of the world, makes it challenging to accurately predict floods, even at short lead times. The satellite-based Tropical Rainfall Measuring Mission (TRMM) provides an alternative to real-time precipitation data scarcity. Moreover, rainfall forecasts from numerical weather prediction models, such as the medium-range forecasts issued by the European Centre for Medium-Range Weather Forecasts (ECMWF), are promising for multistep-ahead flow forecasts. We systematically evaluate these rainfall products over a large catchment in Eastern India (the Mahanadi River basin). We found spatially coherent trends, with both the real-time TRMM rainfall and ECMWF rainfall forecast products overestimating low rainfall events and underestimating high rainfall events. However, no significant bias was found for medium rainfall events. Another key finding was that these rainfall products captured the phase of the storms well but suffered from consistent under-prediction. The utility of the real-time TRMM and ECMWF forecast products is evaluated by rainfall-runoff modeling using different artificial neural network (ANN)-based models up to 3 days ahead. Keywords: TRMM; ECMWF; forecast; ANN; rainfall-runoff modeling
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds are surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that take as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large 3-dimensional geographical space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly ozone value within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio
2016-09-26
Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
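A seasonal autoregressive model of the kind evaluated above can be sketched on synthetic monthly data; the Mexican dengue series is not available here, so a series with lag-1 and lag-12 memory stands in, and the forecast is validated out-of-sample against a persistence reference model, as the framework recommends:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly log-incidence with short-term (lag-1) and seasonal (lag-12) memory
n = 240
y = np.zeros(n)
for t in range(n):
    y[t] = (0.5 * y[t - 1] + 0.4 * y[t - 12] if t >= 12 else 0.0) \
           + 0.3 * rng.standard_normal()

# Fit y_t = c + a*y_{t-1} + b*y_{t-12} by least squares on a training split
train = 200
X = np.column_stack([np.ones(train - 12),
                     y[11:train - 1],          # lag 1
                     y[:train - 12]])          # lag 12
coef = np.linalg.lstsq(X, y[12:train], rcond=None)[0]

# One-step-ahead forecasts on the held-out months (out-of-sample testing)
Xtest = np.column_stack([np.ones(n - train),
                         y[train - 1:n - 1],
                         y[train - 12:n - 12]])
pred = Xtest @ coef
rmse = np.sqrt(np.mean((pred - y[train:])**2))
# Reference model: persistence (forecast = last observed month)
ref = np.sqrt(np.mean((y[train - 1:n - 1] - y[train:])**2))
print(f"seasonal-AR RMSE {rmse:.2f} vs persistence reference {ref:.2f}")
```

Adding climate covariates would mean extra columns in `X`; the study's point is that such columns did not significantly reduce the out-of-sample error once the autoregressive terms were included.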
Environmental noise forecasting based on support vector machine
NASA Astrophysics Data System (ADS)
Fu, Yumei; Zan, Xinwu; Chen, Tianyi; Xiang, Shihan
2018-01-01
As an important pollution source, noise pollution has long been a focus of research, and in recent years its harm to the human environment has made it a particularly active topic. Noise monitoring technologies and systems are applied to environmental noise testing, measurement and evaluation, but research on environmental noise forecasting remains weak. This paper briefly introduces a real-time environmental noise monitoring system operating in Mianyang City, Sichuan Province, which monitors and collects environmental noise data from more than 20 enterprises in the district. Based on this large body of noise data, noise forecasting with the Support Vector Machine (SVM) is studied in detail. Compared with time-series and artificial neural network forecasting models, the SVM forecasting model offers advantages such as a smaller required data set and higher precision and stability. The SVM-based noise forecasts can provide an important and accurate reference for the prevention and control of environmental noise.
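A minimal version of SVM-based noise forecasting can be sketched with scikit-learn's SVR on synthetic hourly noise levels; the daily-cycle data, lag features and hyperparameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Hypothetical hourly equivalent noise levels in dB(A): daily cycle plus noise
hours = np.arange(24 * 60)
leq = 55 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1.5, hours.size)

# Lag features: predict the next hour from the previous 24 hours
lags = 24
X = np.array([leq[i:i + lags] for i in range(leq.size - lags)])
y = leq[lags:]
split = 24 * 50
scaler = StandardScaler().fit(X[:split])

model = SVR(kernel="rbf", C=10.0, epsilon=0.5)
model.fit(scaler.transform(X[:split]), y[:split])
pred = model.predict(scaler.transform(X[split:]))
rmse = np.sqrt(np.mean((pred - y[split:])**2))
print(f"SVR 1-hour-ahead RMSE: {rmse:.2f} dB")
```

The smaller-data-size advantage cited in the abstract shows up here: an SVR with a few hundred training windows already tracks the daily cycle, where a neural network would typically need more data and tuning.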
NASA Astrophysics Data System (ADS)
Thompson, R. J.; Cole, D. G.; Wilkinson, P. J.; Shea, M. A.; Smart, D.
1990-11-01
Volume 1: The following subject areas are covered: the magnetosphere environment; forecasting magnetically quiet periods; radiation hazards to humans in deep space (a summary with special reference to large solar particle events); solar proton events (review and status); problems of the physics of solar-terrestrial interactions; prediction of solar proton fluxes from x-ray signatures; rhythms in solar activity and the prediction of episodes of large flares; the role of persistence in the 24-hour flare forecast; the relationship between the observed sunspot number and the number of solar flares; the latitudinal distribution of coronal holes and geomagnetic storms due to coronal holes; and the signatures of flares in the interplanetary medium at 1 AU. Volume 2: The following subject areas are covered: a probability forecast for geomagnetic activity; cost recovery in solar-terrestrial predictions; magnetospheric specification and forecasting models; a geomagnetic forecast and monitoring system for power system operation; some aspects of predicting magnetospheric storms; some similarities in ionospheric disturbance characteristics in equatorial, mid-latitude, and sub-auroral regions; ionospheric support for low-VHF radio transmission; a new approach to prediction of ionospheric storms; a comparison of the total electron content of the ionosphere around L=4 at low sunspot numbers with the IRI model; the French ionospheric radio propagation predictions; behavior of the F2 layer at mid-latitudes; and the design of modern ionosondes.
Short term load forecasting using a self-supervised adaptive neural network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, H.; Pimmel, R.L.
The authors developed a self-supervised adaptive neural network to perform short term load forecasts (STLF) for a large power system covering a wide service area with several heavy load centers. They used the self-supervised network to extract correlational features from temperature and load data. In using data from the calendar year 1993 as a test case, they found a 0.90 percent error for hour-ahead forecasting and 1.92 percent error for day-ahead forecasting. These levels of error compare favorably with those obtained by other techniques. The algorithm ran in a couple of minutes on a PC containing an Intel Pentium 120 MHz CPU. Since the algorithm included searching the historical database, training the network, and actually performing the forecasts, this approach provides a real-time, portable, and adaptable STLF.
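The percent errors quoted above are mean absolute percentage errors (MAPE). A sketch of the metric, applied to the naive persistence baselines that any STLF network must beat, on a synthetic load curve (the load model is illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic hourly load in MW for one year: daily and weekly cycles plus noise
h = np.arange(24 * 365)
load = (1000 + 150 * np.sin(2 * np.pi * h / 24 - np.pi / 2)
        + 60 * np.sin(2 * np.pi * h / (24 * 7)) + rng.normal(0, 10, h.size))

def mape(pred, actual):
    """Mean absolute percentage error, the STLF score quoted in the abstract."""
    return 100 * np.mean(np.abs(pred - actual) / actual)

# Persistence baselines: last hour's load, and the same hour yesterday
hour_ahead = mape(load[:-1], load[1:])
day_ahead = mape(load[:-24], load[24:])
print(f"persistence MAPE: {hour_ahead:.2f}% hour-ahead, {day_ahead:.2f}% day-ahead")
```

The 0.90% and 1.92% figures reported for the network sit well below what such persistence baselines typically achieve, which is the sense in which the results "compare favorably".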
NASA Astrophysics Data System (ADS)
Pavlovic, Radenko; Chen, Jack; Beaulieu, Paul-Andre; Anselmo, David; Gravel, Sylvie; Moran, Mike; Menard, Sylvain; Davignon, Didier
2014-05-01
A wildfire emissions processing system has been developed to incorporate near-real-time emissions from wildfires and large prescribed burns into Environment Canada's real-time GEM-MACH air quality (AQ) forecast system. Since the GEM-MACH forecast domain covers Canada and most of the U.S.A., including Alaska, fire location information is needed for both of these large countries. During AQ model runs, emissions from individual fire sources are injected into elevated model layers based on plume-rise calculations and then transport and chemistry calculations are performed. This "on the fly" approach to the insertion of the fire emissions provides flexibility and efficiency since on-line meteorology is used and computational overhead in emissions pre-processing is reduced. GEM-MACH-FireWork, an experimental wildfire version of GEM-MACH, was run in real-time mode for the summers of 2012 and 2013 in parallel with the normal operational version. 48-hour forecasts were generated every 12 hours (at 00 and 12 UTC). Noticeable improvements in the AQ forecasts for PM2.5 were seen in numerous regions where fire activity was high. Case studies evaluating model performance for specific regions and computed objective scores will be included in this presentation. Using the lessons learned from the last two summers, Environment Canada will continue to work towards the goal of incorporating near-real-time intermittent wildfire emissions into the operational air quality forecast system.
A Structured and Unstructured grid Relocatable ocean platform for Forecasting (SURF)
NASA Astrophysics Data System (ADS)
Trotta, Francesco; Fenu, Elisa; Pinardi, Nadia; Bruciaferri, Diego; Giacomelli, Luca; Federico, Ivan; Coppini, Giovanni
2016-11-01
We present a numerical platform named the Structured and Unstructured grid Relocatable ocean platform for Forecasting (SURF). The platform is developed for short-term forecasts and is designed to be embedded in any region of the large-scale Mediterranean Forecasting System (MFS) via downscaling. We employ CTD data collected during a campaign around the island of Elba to calibrate and validate SURF. The model requires an initial spin-up period of a few days in order to adapt the initial interpolated fields and the subsequent solutions to the higher-resolution nested grids adopted by SURF. Through a comparison with the CTD data, we quantify the improvement obtained by the SURF model relative to the coarse-resolution MFS model.
Forecasting eruption size: what we know, what we don't know
NASA Astrophysics Data System (ADS)
Papale, Paolo
2017-04-01
Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated with given size classes. Such an evaluation is mostly based on the previous volcanic history at the specific volcano, or it refers to a broader class of volcanoes constituting "analogues" of the one under consideration. In any case, use of knowledge from past eruptions implies considering the completeness of the reference catalogue and, most importantly, the existence of systematic biases in the catalogue, which may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with major reference to the catalogue of the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scales: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. That situation strictly requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, a <0.1% incidence of eruptions of VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected to derive from applications to individual volcanoes or to classes of analogue volcanoes, suggesting large to very large uncertainties in volcanic hazard forecasts at virtually any individual volcano worldwide.
Forecasting production in Liquid Rich Shale plays
NASA Astrophysics Data System (ADS)
Nikfarman, Hanieh
Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHWs). There is no existing workflow applicable to forecasting multi-phase production from MFHWs in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHWs in LRS reservoirs. There has been much effort in developing workflows and methodologies for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single-phase flow and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for the complexities of multiphase flow in MFHWs, the only available technique is dynamic modeling in compositional numerical simulators, which is time consuming and impractical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed and validated by compositional numerical simulation. The workflow honors the physics of flow and is sufficiently accurate yet practical, so that an analyst can readily apply it to forecast production and estimate reserves for a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHWs, the workflow divides production into an initial period, where large production and pressure declines are expected, and a subsequent period, where production decline may converge to a common trend for a number of producers across an area of interest in the field. The initial period assumes that production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history.
Readily available commercial software can simulate flow and forecast production in this period. In the subsequent period, dimensionless rate and dimensionless time functions are introduced that help identify the transition from the initial period into the subsequent period. The production trends in terms of the dimensionless parameters converge for a range of rock permeabilities and stimulation intensities. This helps forecast production beyond the transition to the end of the well's life. This workflow is applicable to a single-fluid system.
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.
2018-04-01
A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
NASA Astrophysics Data System (ADS)
Naulin, J.-P.; Payrastre, O.; Gaume, E.
2013-04-01
Accurate flood forecasts are critical to an efficient flood event management strategy. Until now, hydro-meteorological forecasts have mainly been used to establish early warnings in France (meteorological and flood vigilance maps) or over the world (flash-flood guidances). These forecasts are typically limited either to the main streams covered by the flood forecasting services or to watersheds with specific assets like check dams, which in most cases are well gauged river sections, thus leaving aside large parts of the territory. This paper presents a distributed hydro-meteorological forecasting approach, which makes use of the high spatial and temporal resolution rainfall estimates that are now available, to provide information at ungauged sites. The proposed system, intended to detect road inundation risks, had initially been developed and tested in areas of limited size. This paper presents the extension of such a system to an entire region (i.e. the Gard region in Southern France), including over 2000 crossing points between rivers and roads, and its validation with respect to a large data set of actual reported road inundations observed during recent flash flood events. These initial validation results appear to be most promising. The eventual proposed tool would provide the necessary information for flood event management services to identify the areas at risk and adopt appropriate safety and rescue measures: i.e. pre-positioning of rescue equipment, interruption of the traffic on the exposed roads and determination of safe access or evacuation routes. Moreover, beyond the specific application to the supervision of a road network, the research undertaken herein also provides results for the performance of hydro-meteorological forecasts on ungauged headwaters.
Development of On-line Wildfire Emissions for the Operational Canadian Air Quality Forecast System
NASA Astrophysics Data System (ADS)
Pavlovic, R.; Menard, S.; Chen, J.; Anselmo, D.; Beaulieu, P.-A.; Gravel, S.; Moran, M. D.; Davignon, D.
2013-12-01
An emissions processing system has been developed to incorporate near-real-time emissions from wildfires and large prescribed burns into Environment Canada's real-time GEM-MACH air quality (AQ) forecast system. Since the GEM-MACH forecast domain covers Canada and most of the USA, including Alaska, fire location information is needed for both of these large countries. Near-real-time satellite data are obtained and processed separately for the two countries for organizational reasons. Fire location and fuel consumption data for Canada are provided by the Canadian Forest Service's Canadian Wild Fire Information System (CWFIS) while fire location and emissions data for the U.S. are provided by the SMARTFIRE (Satellite Mapping Automated Reanalysis Tool for Fire Incident Reconciliation) system via the on-line BlueSky Gateway. During AQ model runs, emissions from individual fire sources are injected into elevated model layers based on plume-rise calculations and then transport and chemistry calculations are performed. This 'on the fly' approach to the insertion of emissions provides greater flexibility since on-line meteorology is used and reduces computational overhead in emission pre-processing. An experimental wildfire version of GEM-MACH was run in real-time mode for the summers of 2012 and 2013. 48-hour forecasts were generated every 12 hours (at 00 and 12 UTC). Noticeable improvements in the AQ forecasts for PM2.5 were seen in numerous regions where fire activity was high. Case studies evaluating model performance for specific regions, computed objective scores, and subjective evaluations by AQ forecasters will be included in this presentation. Using the lessons learned from the last two summers, Environment Canada will continue to work towards the goal of incorporating near-real-time intermittent wildfire emissions within the operational air quality forecast system.
NASA Astrophysics Data System (ADS)
Owens, M. J.; Riley, P.; Horbury, T. S.
2017-05-01
Effective space-weather prediction and mitigation requires accurate forecasting of near-Earth solar-wind conditions. Numerical magnetohydrodynamic models of the solar wind, driven by remote solar observations, are gaining skill at forecasting the large-scale solar-wind features that give rise to near-Earth variations over days and weeks. There remains a need for accurate short-term (hours to days) solar-wind forecasts, however. In this study we investigate the analogue ensemble (AnEn), or "similar day", approach that was developed for atmospheric weather forecasting. The central premise of the AnEn is that past variations that are analogous or similar to current conditions can be used to provide a good estimate of future variations. By considering an ensemble of past analogues, the AnEn forecast is inherently probabilistic and provides a measure of the forecast uncertainty. We show that forecasts of solar-wind speed can be improved by considering both speed and density when determining past analogues, whereas forecasts of the out-of-ecliptic magnetic field [BN] are improved by also considering the in-ecliptic magnetic-field components. In general, the best forecasts are found by considering only the previous 6 - 12 hours of observations. Using these parameters, the AnEn provides a valuable probabilistic forecast for solar-wind speed, density, and in-ecliptic magnetic field over lead times from a few hours to around four days. For BN, which is central to space-weather disturbance, the AnEn only provides a valuable forecast out to around six to seven hours. As the inherent predictability of this parameter is low, this is still likely a marked improvement over other forecast methods. We also investigate the use of the AnEn in forecasting geomagnetic indices Dst and Kp. The AnEn provides a valuable probabilistic forecast of both indices out to around four days. 
We outline a number of future improvements to AnEn forecasts of near-Earth solar-wind and geomagnetic conditions.
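The analogue-ensemble idea described above is simple enough to sketch in a few lines. The sketch below is an illustrative stand-in, not the authors' implementation: the window length, Euclidean distance metric, and ensemble size are all assumptions, and the paper matches on several variables (e.g. speed and density) rather than a single series.

```python
def analogue_ensemble(history, window, lead, k=3):
    """Forecast `lead` steps ahead: find the k past windows most similar
    to the latest `window` observations, and use what followed each
    analogue as an ensemble of plausible outcomes."""
    cur = history[-window:]
    candidates = []
    # slide over the past, leaving room for the lead-time outcome
    for start in range(len(history) - window - lead):
        past = history[start:start + window]
        dist = sum((a - b) ** 2 for a, b in zip(past, cur)) ** 0.5
        outcome = history[start + window + lead - 1]
        candidates.append((dist, outcome))
    candidates.sort(key=lambda t: t[0])
    ensemble = [outcome for _, outcome in candidates[:k]]
    return sum(ensemble) / len(ensemble), ensemble
```

The spread of `ensemble` provides the probabilistic uncertainty estimate the abstract refers to: a tight cluster of analogue outcomes signals a confident forecast, a wide one signals low predictability.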
Ecological forecasting in the presence of abrupt regime shifts
NASA Astrophysics Data System (ADS)
Dippner, Joachim W.; Kröncke, Ingrid
2015-10-01
Regime shifts may cause an intrinsic decrease in the potential predictability of marine ecosystems. In such cases, forecasts of biological variables fail. To improve prediction of long-term variability in environmental variables, we constructed a multivariate climate index and applied it to forecast ecological time series. The concept is demonstrated herein using climate and macrozoobenthos data from the southern North Sea. Special emphasis is given to the influence of the length of the fitting period on forecast skill, especially in the presence of regime shifts. Our results indicate that the performance of multivariate predictors in biological forecasts is much better than that of single large-scale climate indices, especially in the presence of regime shifts. The approach used to develop the index is generally applicable to all geographical regions in the world and to all areas of marine biology, from the species level up to biodiversity. Such forecasts are of vital interest for practical aspects of the sustainable management of marine ecosystems and the conservation of ecosystem goods and services.
Winter precipitation forecast in the European and Mediterranean regions using cluster analysis
NASA Astrophysics Data System (ADS)
Molnos, S.
2017-12-01
The European and Mediterranean climates are sensitive to the large-scale circulation of the atmosphere and ocean, making it difficult to forecast precipitation or temperature on seasonal time-scales. In addition, the Mediterranean region has been identified as a hotspot for climate change, and a drying of the region is already observed today. Thus, it is critically important to predict seasonal droughts as early as possible, so that water managers and stakeholders can mitigate impacts. We developed a novel cluster-based forecast method to empirically predict winter precipitation anomalies in the European and Mediterranean regions using precursors in autumn. This approach utilizes not only the amplitude but also the spatial pattern of the precursors in generating the forecast. Using a toy model, we show that it achieves better forecast skill than more traditional regression models. Furthermore, we compare our algorithm with dynamical forecast models, demonstrating that our prediction method performs better in terms of time and pattern correlation in the Mediterranean and European regions.
Forecasting the Solar Drivers of Severe Space Weather from Active-Region Magnetograms
NASA Technical Reports Server (NTRS)
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
2012-01-01
Large flares and fast CMEs are the drivers of the most severe space weather including Solar Energetic Particle Events (SEP Events). Large flares and their co-produced CMEs are powered by the explosive release of free magnetic energy stored in non-potential magnetic fields of sunspot active regions. The free energy is stored in and released from the low-beta regime of the active region's magnetic field above the photosphere, in the chromosphere and low corona. From our work over the past decade and from similar work of several other groups, it is now well established that (1) a proxy of the free magnetic energy stored above the photosphere can be measured from photospheric magnetograms, and (2) an active region's rate of production of major CME/flare eruptions in the coming day or so is strongly correlated with its present measured value of the free-energy proxy. These results have led us to use the large database of SOHO/MDI full-disk magnetograms spanning Solar Cycle 23 to obtain empirical forecasting curves that from an active region's present measured value of the free-energy proxy give the active region's expected rates of production of major flares, CMEs, fast CMEs, and SEP Events in the coming day or so (Falconer et al. 2011, Space Weather, 9, S04003). We will present these forecasting curves and demonstrate the accuracy of their forecasts. In addition, we will show that the forecasts for major flares and fast CMEs can be made significantly more accurate by taking into account not only the value of the free-energy proxy but also the active region's recent productivity of major flares; specifically, whether the active region has produced a major flare (GOES class M or X) during the past 24 hours before the time of the measured magnetogram.
By empirically determining the conversion of the value of free-energy proxy measured from a GONG or HMI magnetogram to that which would be measured from an MDI magnetogram, we have made GONG and HMI magnetograms useable with our MDI-based forecasting curves to forecast event rates.
Optimization of Coronal Mass Ejection Ensemble Forecasting Using WSA-ENLIL with Coned Model
2013-03-01
…previous versions by a large margin. The mean absolute forecast error of the median ensemble results was improved by over 43% over the original Coned model.
Good Bye Traditional Budgeting, Hello Rolling Forecast: Has the Time Come?
ERIC Educational Resources Information Center
Zeller, Thomas L.; Metzger, Lawrence M.
2013-01-01
This paper argues for a new approach to accounting textbook budgeting material. The business environment is not stable. Change is continuous, for large and small business alike. A business must act and react to generate shareholder value. The rolling forecast provides the necessary navigational insight. The traditional annual static budget does…
Numerical prediction of the Mid-Atlantic states cyclone of 18-19 February 1979
NASA Technical Reports Server (NTRS)
Atlas, R.; Rosenberg, R.
1982-01-01
A series of forecast experiments was conducted to assess the accuracy of the GLAS model, and to determine the importance of large scale dynamical processes and diabatic heating to the cyclogenesis. The GLAS model correctly predicted intense coastal cyclogenesis and heavy precipitation. Repeated without surface heat and moisture fluxes, the model failed to predict any cyclone development. An extended range forecast, a forecast from the NMC analysis interpolated to the GLAS grid, and a forecast from the GLAS analysis with the surface moisture flux excluded predicted weak coastal low development. Diabatic heating resulting from oceanic fluxes significantly contributed to the generation of low level cyclonic vorticity and the intensification and slow rate of movement of an upper level ridge over the western Atlantic. As an upper level short wave trough approached this ridge, diabatic heating associated with the release of latent heat intensified, and the gradient of vorticity, vorticity advection and upper level divergence in advance of the trough were greatly increased, providing strong large scale forcing for the surface cyclogenesis.
Deep Learning Based Solar Flare Forecasting Model. I. Results for Line-of-sight Magnetograms
NASA Astrophysics Data System (ADS)
Huang, Xin; Wang, Huaning; Xu, Long; Liu, Jinfu; Li, Rong; Dai, Xinghua
2018-03-01
Solar flares originate from the release of the energy stored in the magnetic field of solar active regions; the triggering mechanism for these flares, however, remains unknown. For this reason, the conventional solar flare forecast is essentially based on the statistical relationship between solar flares and measures extracted from observational data. In the current work, the deep learning method is applied to set up the solar flare forecasting model, in which forecasting patterns can be learned from line-of-sight magnetograms of solar active regions. In order to obtain a large amount of observational data to train the forecasting model and test its performance, a data set is created from line-of-sight magnetograms of active regions observed by SOHO/MDI and SDO/HMI from 1996 April to 2015 October and corresponding soft X-ray solar flares observed by GOES. The testing results of the forecasting model indicate that (1) the forecasting patterns can be automatically reached with the MDI data and they can also be applied to the HMI data; furthermore, these forecasting patterns are robust to the noise in the observational data; (2) the performance of the deep learning forecasting model is not sensitive to the given forecasting periods (6, 12, 24, or 48 hr); (3) the performance of the proposed forecasting model is comparable to that of the state-of-the-art flare forecasting models, even if the duration of the total magnetograms continuously spans 19.5 years. Case analyses demonstrate that the deep learning based solar flare forecasting model pays attention to areas with the magnetic polarity-inversion line or the strong magnetic field in magnetograms of active regions.
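Dichotomous flare forecasts like this are typically verified with contingency-table skill scores. The abstract does not name its metrics, so the true skill statistic below is offered only as a common example of such a score, not as the paper's evaluation code:

```python
def true_skill_statistic(hits, misses, false_alarms, correct_nulls):
    """TSS = probability of detection minus probability of false
    detection, for dichotomous (flare / no-flare) forecasts."""
    pod = hits / (hits + misses)                      # detection rate
    pofd = false_alarms / (false_alarms + correct_nulls)  # false-alarm rate
    return pod - pofd
```

TSS ranges from -1 to 1, with 0 for no skill; unlike raw accuracy, it is insensitive to the strong class imbalance between flaring and quiet active regions, which is why it is widely used in this literature.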
The use of seasonal forecasts in a crop failure early warning system for West Africa
NASA Astrophysics Data System (ADS)
Nicklin, K. J.; Challinor, A.; Tompkins, A.
2011-12-01
Seasonal rainfall in semi-arid West Africa is highly variable. Farming systems in the region are heavily dependent on the monsoon rains leading to large variability in crop yields and a population that is vulnerable to drought. The existing crop yield forecasting system uses observed weather to calculate a water satisfaction index, which is then related to expected crop yield (Traore et al., 2006). Seasonal climate forecasts may be able to increase the lead-time of yield forecasts and reduce the humanitarian impact of drought. This study assesses the potential for a crop failure early warning system, which uses dynamic seasonal forecasts and a process-based crop model. Two sets of simulations are presented. In the first, the crop model is driven with observed weather as a control run. Observed rainfall is provided by the GPCP 1DD data set, whilst observed temperature and solar radiation data are given by the ERA-Interim reanalysis. The crop model used is the groundnut version of the General Large Area Model for annual crops (GLAM), which has been designed to operate on the grids used by seasonal weather forecasts (Challinor et al., 2004). GLAM is modified for use in West Africa by allowing multiple planting dates each season, replanting failed crops and producing parameter sets for Spanish- and Virginia-type West African groundnut. Crop yields are simulated for three different assumptions concerning the distribution and relative abundance of Spanish- and Virginia-type groundnut. Model performance varies with location, but overall shows positive skill in reproducing observed crop failure. The results for the three assumptions are similar, suggesting that the performance of the system is limited by something other than information on the type of groundnut grown. In the second set of simulations the crop model is driven with observed weather up to the forecast date, followed by ECMWF system 3 seasonal forecasts until harvest.
The variation of skill with forecast date is assessed along with the extent to which forecasts can be improved by bias correction of the rainfall data. Two forms of bias correction are applied: a novel method of spatially bias correcting daily data, and statistical bias correction of the frequency and intensity distribution. Results are presented using both observed yields and the control run as the reference for verification. The potential for current dynamic seasonal forecasts to form part of an operational system giving timely and accurate warnings of crop failure is discussed. Traore S.B. et al., 2006. A Review of Agrometeorological Monitoring Tools and Methods Used in the West African Sahel. In: Motha R.P. et al., Strengthening Operational Agrometeorological Services at the National Level. Technical Bulletin WAOB-2006-1 and AGM-9, WMO/TD No. 1277. Pages 209-220. www.wamis.org/agm/pubs/agm9/WMO-TD1277.pdf Challinor A.J. et al., 2004. Design and optimisation of a large-area process based model for annual crops. Agric. For. Meteorol. 124, 99-120.
Delensing CMB polarization with external datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kendrick M.; Hanson, Duncan; LoVerde, Marilena
2012-06-01
One of the primary scientific targets of current and future CMB polarization experiments is the search for a stochastic background of gravity waves in the early universe. As instrumental sensitivity improves, the limiting factor will eventually be B-mode power generated by gravitational lensing, which can be removed through use of so-called "delensing" algorithms. We forecast prospects for delensing using lensing maps which are obtained externally to CMB polarization: either from large-scale structure observations, or from high-resolution maps of CMB temperature. We conclude that the forecasts in either case are not encouraging, and that significantly delensing large-scale CMB polarization requires high-resolution polarization maps with sufficient sensitivity to measure the lensing B-mode. We also present a simple formalism for including delensing in CMB forecasts which is computationally fast and agrees well with Monte Carlos.
Prediction of ENSO episodes using canonical correlation analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnston, A.G.; Ropelewski, C.F.
Canonical correlation analysis (CCA) is explored as a multivariate linear statistical methodology with which to forecast fluctuations of the El Nino/Southern Oscillation (ENSO) in real time. CCA is capable of identifying critical sequences of predictor patterns that tend to evolve into subsequent patterns that can be used to form a forecast. The CCA model is used to forecast the 3-month mean sea surface temperature (SST) in several regions of the tropical Pacific and Indian oceans for projection times of 0 to 4 seasons beyond the immediately forthcoming season. The predictor variables, representing the climate situation in the four consecutive 3-month periods ending at the time of the forecast, are (1) quasi-global seasonal mean sea level pressure (SLP) and (2) SST in the predicted regions themselves. Forecast skill is estimated using cross-validation, and persistence is used as the primary skill control measure. Results indicate that a large region in the eastern equatorial Pacific (120°-170°W) has the highest overall predictability, with excellent skill realized for winter forecasts made at the end of summer. CCA outperforms persistence in this region under most conditions, and does noticeably better with the SST included as a predictor in addition to the SLP. It is demonstrated that better forecast performance at the longer lead times would be obtained if some significantly earlier (i.e., up to 4 years) predictor data were included, because the ability to predict the lower-frequency ENSO phase changes would increase. The good performance of the current system at shorter lead times appears to be based largely on the ability to predict ENSO evolution for events already in progress. The forecasting of the eastern tropical Pacific SST using CCA is now done routinely on a monthly basis for a 0-, 1-, and 2-season lead at the Climate Analysis Center.
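The core CCA computation can be sketched compactly: whiten each field, then take the SVD of the cross-covariance of the whitened variables, whose singular values are the canonical correlations. This is the generic textbook construction, not the Climate Analysis Center's operational code:

```python
import numpy as np

def cca_modes(X, Y, n_modes=1, eps=1e-10):
    """Leading canonical modes between a predictor field X (t x p) and
    a predictand field Y (t x q)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)

    def whiten(A):
        # orthonormal time series spanning the field, dropping null directions
        U, s, _ = np.linalg.svd(A, full_matrices=False)
        return U[:, s > eps]

    Ux, Uy = whiten(X), whiten(Y)
    U, s, Vt = np.linalg.svd(Ux.T @ Uy)   # singular values = canonical correlations
    a = Ux @ U[:, :n_modes]               # canonical variates of X
    b = Uy @ Vt.T[:, :n_modes]            # canonical variates of Y
    return a, b, s[:n_modes]
```

In a forecasting setting the predictor variates `a` (built from lagged SLP and SST) are regressed onto the future predictand variates `b`, so that a sequence of observed patterns is projected forward in time.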
Climate forecasting services: coming down from the ivory tower
NASA Astrophysics Data System (ADS)
Doblas-Reyes, F. J.; Caron, L. P.; Cortesi, N.; Soret, A.; Torralba, V.; Turco, M.; González Reviriego, N.; Jiménez, I.; Terrado, M.
2016-12-01
Subseasonal-to-seasonal (S2S) climate forecasts are increasingly used across a range of application areas (energy, water management, agriculture, health, insurance) through tailored services using the climate services paradigm. In this contribution we show the value of climate forecasting services through several examples of their application in the energy, reinsurance and agriculture sectors. Climate services aim at making climate information action oriented. In a climate forecasting context the task starts with the identification of climate variables, thresholds and events relevant to the users. These elements are then analysed to determine whether they can be both reliably and skilfully predicted at appropriate time scales. In this contribution we assess climate predictions of precipitation, temperature and wind indices from state-of-the-art operational multi-model forecast systems and whether they respond to the expectations and requests from a range of users. This requires going beyond the more traditional assessment of monthly mean values to include assessments of global forecast quality of the frequency of warm, cold, windy and wet extremes (e.g. [1], [2]), as well as of using tools like the Euro-Atlantic weather regimes [3]. The forecast quality of extremes is generally similar to or slightly lower than that of monthly or seasonal averages, but offers a kind of information closer to what some users require. In addition to considering local climate variables, we also explore the use of large-scale climate indices, such as ENSO and NAO, that are associated with large regional synchronous variations of wind or tropical storm frequency. These indices help illustrate the relative merits of climate forecast information to users and are the cornerstone of climate stories that engage them in the co-production of climate information. [1] Doblas-Reyes et al, WIREs, 2013 [2] Pepler et al, Weather and Climate Extremes, 2015 [3] Pavan and Doblas-Reyes, Clim Dyn, 2013
NASA Astrophysics Data System (ADS)
Li, J.
2017-12-01
Large-watershed flood simulation and forecasting with a distributed hydrological model faces several challenges in application, including the effect of the model's spatial resolution on performance and accuracy. To examine this resolution effect, the distributed hydrological model (the Liuxihe model) was built at resolutions of 1000 m × 1000 m, 600 m × 600 m, 500 m × 500 m, 400 m × 400 m and 200 m × 200 m, with the aim of finding the best resolution for large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. Terrain data (digital elevation model, DEM), soil type and land use type are freely downloaded from the web. The model parameters are optimized by using an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the uncertainty that exists when model parameters are derived physically. Among the tested resolutions (200 m × 200 m to 1000 m × 1000 m), the best for flood simulation and forecasting is 200 m × 200 m, and as the spatial resolution coarsens, model performance and accuracy deteriorate. At 1000 m × 1000 m resolution the simulation and forecasting results are the worst, and the river channel delineated at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum spatial resolution is needed: the suggested threshold resolution for modeling Liujiang River basin floods is a 500 m × 500 m grid cell, but a 200 m × 200 m grid cell is recommended in this study to keep the model at its best performance.
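Particle swarm optimization, which the study uses for parameter calibration, can be illustrated with a minimal generic implementation; the inertia and acceleration coefficients below are conventional textbook values, not those of the paper's improved variant:

```python
import random

def pso(objective, bounds, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimiser: each particle tracks its own
    best position, and the swarm is pulled toward the global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the updated position to the parameter bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In a calibration setting, `objective` would run the hydrological model with a candidate parameter vector and return an error score (e.g. against an observed hydrograph); here it is any cheap function of a bounded parameter list.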
A travel time forecasting model based on change-point detection method
NASA Astrophysics Data System (ADS)
LI, Shupeng; GUANG, Xiaoping; QIAN, Yongsheng; ZENG, Junwei
2017-06-01
Travel time parameters obtained from road traffic sensor data play an important role in traffic management practice. In this paper, a travel time forecasting model is proposed for urban road traffic sensor data based on a change-point detection method. First-order differencing is used to preprocess the actual loop data; a change-point detection algorithm is designed to classify the large volume of travel time data into several patterns; then a travel time forecasting model is established based on the autoregressive integrated moving average (ARIMA) model. By computer simulation, different control parameters are chosen for adaptive change-point search over the travel time series, which is divided into several sections of similar state. A linear weight function is then used to fit the travel time sequence and to forecast travel time. The results show that the model has high accuracy in travel time forecasting.
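A minimal version of the differencing-based change-point step might look as follows; the fixed threshold rule is an illustrative simplification of the paper's adaptive search, not its actual algorithm:

```python
def change_points(series, threshold):
    """Flag indices where the first-order difference of a travel-time
    series exceeds a threshold, splitting it into similar-state sections."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    return [i + 1 for i, d in enumerate(diffs) if abs(d) > threshold]
```

Each returned index marks the start of a new traffic regime; a separate forecasting model (ARIMA in the paper) can then be fitted within each section of similar state.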
Water quality in the Schuylkill River, Pennsylvania: the potential for long-lead forecasts
NASA Astrophysics Data System (ADS)
Block, P. J.; Peralez, J.
2012-12-01
Prior analysis of pathogen levels in the Schuylkill River has led to a categorical daily forecast of water quality (denoted as red, yellow, or green flag days). The forecast, available to the public online through the Philadelphia Water Department, is predominantly based on the local precipitation forecast. In this study, we explore the feasibility of extending the forecast to the seasonal scale by associating large-scale climate drivers with local precipitation and water quality parameter levels. This advance information is relevant for recreational activities, ecosystem health, and water treatment (energy, chemicals), as the Schuylkill provides 40% of Philadelphia's water supply. Preliminary results indicate skillful prediction of average summertime water quality parameters and characteristics, including chloride, coliform, turbidity, alkalinity, and others, using season-ahead oceanic and atmospheric variables, predominantly from the North Atlantic. Water quality parameter trends, including historic land use changes along the river, association with climatic variables, and prediction models will be presented.
Network bandwidth utilization forecast model on high bandwidth networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex
With the increasing number of geographically distributed scientific collaborations and the scale of the data size growth, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with traditional approaches such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage changes. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
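The decompose-then-model pattern (STL for the seasonal cycle, a time-series model for the remainder) can be caricatured in a few lines. The sketch below substitutes a repeating seasonal profile and an AR(1) remainder for the real STL/ARIMA machinery, purely to show the shape of the pipeline:

```python
def seasonal_ar1_forecast(series, period, steps):
    """Rough stand-in for an STL+ARIMA pipeline: remove a repeating
    seasonal profile, fit an AR(1) to the remainder by least squares,
    then forecast and re-add the seasonal term."""
    n = len(series)
    # mean value at each phase of the cycle = crude seasonal component
    seasonal = [sum(series[i] for i in range(p, n, period))
                / len(range(p, n, period)) for p in range(period)]
    resid = [series[i] - seasonal[i % period] for i in range(n)]
    x, y = resid[:-1], resid[1:]
    denom = sum(a * a for a in x)
    phi = sum(a * b for a, b in zip(x, y)) / denom if denom else 0.0
    last, out = resid[-1], []
    for h in range(1, steps + 1):
        last = phi * last
        out.append(seasonal[(n + h - 1) % period] + last)
    return out
```

On SNMP path-utilization data, `period` would be the daily or weekly traffic cycle; the AR(1) term decays forecasts back toward the seasonal profile, which is the behavior a full ARIMA remainder model generalizes.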
Will Arctic sea ice thickness initialization improve seasonal forecast skill?
NASA Astrophysics Data System (ADS)
Day, J. J.; Hawkins, E.; Tietsche, S.
2014-11-01
Arctic sea ice thickness is thought to be an important predictor of Arctic sea ice extent. However, coupled seasonal forecast systems do not generally use sea ice thickness observations in their initialization and are therefore missing a potentially important source of additional skill. To investigate how large this source is, a set of ensemble potential predictability experiments with a global climate model, initialized with and without knowledge of the sea ice thickness initial state, have been run. These experiments show that accurate knowledge of the sea ice thickness field is crucially important for sea ice concentration and extent forecasts up to 8 months ahead, especially in summer. Perturbing sea ice thickness also has a significant impact on the forecast error in Arctic 2 m temperature a few months ahead. These results suggest that advancing capabilities to observe and assimilate sea ice thickness into coupled forecast systems could significantly increase skill.
NASA Astrophysics Data System (ADS)
Seibert, Mathias; Merz, Bruno; Apel, Heiko
2017-03-01
The Limpopo Basin in southern Africa is prone to droughts which affect the livelihood of millions of people in South Africa, Botswana, Zimbabwe and Mozambique. Seasonal drought early warning is thus vital for the whole region. In this study, the predictability of hydrological droughts during the main runoff period from December to May is assessed using statistical approaches. Three methods (multiple linear models, artificial neural networks, random forest regression trees) are compared in terms of their ability to forecast streamflow with up to 12 months of lead time. The following four main findings result from the study. 1. There are stations in the basin at which standardised streamflow is predictable with lead times up to 12 months. The results show high inter-station differences of forecast skill but reach a coefficient of determination as high as 0.73 (cross validated). 2. A large range of potential predictors is considered in this study, comprising well-established climate indices, customised teleconnection indices derived from sea surface temperatures and antecedent streamflow as a proxy of catchment conditions. El Niño and customised indices, representing sea surface temperature in the Atlantic and Indian oceans, prove to be important teleconnection predictors for the region. Antecedent streamflow is a strong predictor in small catchments (with median 42 % explained variance), whereas teleconnections exert a stronger influence in large catchments. 3. Multiple linear models show the best forecast skill in this study and the greatest robustness compared to artificial neural networks and random forest regression trees, despite their capabilities to represent nonlinear relationships. 4. Employed in early warning, the models can be used to forecast a specific drought level. 
Even if the coefficient of determination is low, the forecast models have a skill better than a climatological forecast, which is shown by analysis of receiver operating characteristics (ROCs). Seasonal statistical forecasts in the Limpopo show promising results, and thus it is recommended to employ them as complementary to existing forecasts in order to strengthen preparedness for droughts.
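Cross-validated skill, which the study uses to rank predictor sets, can be computed for a multiple linear model as below. Leave-one-out is shown for brevity; the paper's exact cross-validation scheme is not reproduced here:

```python
import numpy as np

def loo_skill(X, y):
    """Leave-one-out cross-validated coefficient of determination for a
    multiple linear forecast model y ~ X."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        # fit with an intercept column on all years except the held-out one
        A = np.column_stack([X[keep], np.ones(keep.sum())])
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        preds[i] = np.append(X[i], 1.0) @ coef
    ss_res = ((y - preds) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot
```

Here `X` would hold candidate predictors such as climate indices and antecedent streamflow, and `y` the standardised streamflow; a cross-validated R² near the study's reported 0.73 would indicate a usefully predictable station.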
Sufficient Forecasting Using Factor Models
Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei
2017-01-01
We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality was first reduced via a high-dimensional (approximate) factor model implemented by the principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis will be employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
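The first stage of the method, extracting factors by principal components and regressing the target on them (diffusion-index style), can be sketched as follows; the sufficient-dimension-reduction step that distinguishes the paper's estimator from plain factor regression is omitted:

```python
import numpy as np

def factor_forecast(panel, target, n_factors):
    """Extract latent factors from a large predictor panel (t x p) by
    PCA, then regress the one-step-ahead target on the factors and
    forecast from the most recent factor values."""
    Z = panel - panel.mean(0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    F = U[:, :n_factors] * s[:n_factors]          # estimated factors
    # predict target[t+1] from factors at time t (plus an intercept)
    A = np.column_stack([F[:-1], np.ones(len(F) - 1)])
    coef, *_ = np.linalg.lstsq(A, target[1:], rcond=None)
    return np.append(F[-1], 1.0) @ coef
```

This linear factor regression is exactly the baseline the paper's sufficient forecasting improves upon when the true forecasting function is nonlinear in the factors.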
NASA Astrophysics Data System (ADS)
Rheinheimer, David E.; Bales, Roger C.; Oroza, Carlos A.; Lund, Jay R.; Viers, Joshua H.
2016-05-01
We assessed the potential value of hydrologic forecasting improvements for a snow-dominated high-elevation hydropower system in the Sierra Nevada of California, using a hydropower optimization model. To mimic different forecasting skill levels for inflow time series, rest-of-year inflows from regression-based forecasts were blended in different proportions with representative inflows from a spatially distributed hydrologic model. The statistical approach mimics the simpler, historical forecasting approach that is still widely used. Revenue was calculated using historical electricity prices, with perfect price foresight assumed. With current infrastructure and operations, perfect hydrologic forecasts increased annual hydropower revenue by $0.14 million to $1.6 million, with lower values in dry years and higher values in wet years, or about $0.8 million (1.2%) on average, representing overall willingness-to-pay for perfect information. A second sensitivity analysis found a wider range of annual revenue gain or loss using different skill levels in snow measurement in the regression-based forecast, mimicking expected declines in skill as the climate warms and historical snow measurements no longer represent current conditions. The value of perfect forecasts was insensitive to storage capacity for small and large reservoirs, relative to average inflow, and modestly sensitive to storage capacity with medium (current) reservoir storage. The value of forecasts was highly sensitive to powerhouse capacity, particularly for the range of capacities in the northern Sierra Nevada. The approach can be extended to multireservoir, multipurpose systems to help guide investments in forecasting.
Do location specific forecasts pose a new challenge for communicating uncertainty?
NASA Astrophysics Data System (ADS)
Abraham, Shyamali; Bartlett, Rachel; Standage, Matthew; Black, Alison; Charlton-Perez, Andrew; McCloy, Rachel
2015-04-01
In the last decade, the growth of local, site-specific weather forecasts delivered by mobile phone or website represents arguably the fastest change in forecast consumption since the beginning of television weather forecasts 60 years ago. In this study, a street-interception survey of 274 members of the public shows, for the first time, a clear first preference for narrow, site-specific weather forecasts over traditional broad weather forecasts, with a clear bias towards this preference among users under 40. The impact of this change on the understanding of forecast probability and intensity information is explored. While the correct interpretation of the statement 'There is a 30% chance of rain tomorrow' is still low in the cohort, in common with previous studies, a clear impact of age and educational attainment on understanding is shown, with those under 40 and educated to degree level or above more likely to correctly interpret it. The interpretation of rainfall intensity descriptors ('Light', 'Moderate', 'Heavy') by the cohort is shown to be significantly different from official and expert assessments of the same descriptors and to have large variance amongst the cohort. However, despite these key uncertainties, members of the cohort generally seem to make appropriate decisions about rainfall forecasts. There is some evidence that the decisions made differ depending on the communication format used, and the cohort expressed a clear preference for tabular over graphical weather forecast presentation.
NASA Astrophysics Data System (ADS)
Kato, Takeyoshi; Sone, Akihito; Shimakage, Toyonari; Suzuoki, Yasuo
A microgrid (MG) is one measure for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). To construct a MG economically, optimizing the capacity of controllable DGs against the RE-based DGs is essential. Using a numerical simulation model developed from demonstrative studies of a MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as the RE-based DG, this study examines how the forecast accuracy of PVS output influences capacity optimization and the cost of daily operation. The main results are as follows. The required NaS battery capacity must be increased by 10-40% relative to the ideal situation with no forecast error in PVS power output. The influence of forecast error on the electricity received from the grid is not significant on an annual basis, because positive and negative forecast errors vary from day to day. The annual total cost of facilities and operation increases by 2-7% due to the forecast error applied in this study. The impacts of forecast error on facility optimization and on operation optimization are comparable, each at a few per cent, implying that forecast accuracy should be improved in terms of both the frequency of large forecast errors and the average error.
NASA Astrophysics Data System (ADS)
Sone, Akihito; Kato, Takeyoshi; Shimakage, Toyonari; Suzuoki, Yasuo
A microgrid (MG) is one measure for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). If a number of MGs were controlled to maintain a predetermined electricity demand, counting RE-based DGs as negative demand, they would contribute to supply-demand balancing of the whole electric power system. To construct a MG economically, optimizing the capacity of controllable DGs against the RE-based DGs is essential. Using a numerical simulation model developed from a demonstrative study of a MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as the RE-based DG, this study examines how the forecast accuracy of PVS output influences capacity optimization. Three forecast cases of differing accuracy are compared. The main results are as follows. Even with no forecast error over each 30-minute interval (the ideal forecast method), the required NaS battery capacity reaches about 40% of PVS capacity in order to mitigate instantaneous forecast errors within each 30-minute period. The capacity required to compensate for forecast error doubles with the actual forecast method. The influence of forecast error can be reduced by adjusting the scheduled power output of controllable DGs according to the weather forecast. Moreover, the required capacity can be reduced significantly if errors in the MG's balancing control are acceptable for a few per cent of periods, because periods of large forecast error are infrequent.
Advances in the development of remote sensing technology for agricultural applications
NASA Technical Reports Server (NTRS)
Powers, J. E.; Erb, R. B.; Hall, F. G.; Macdonald, R. B.
1979-01-01
The application of remote sensing technology to crop forecasting is discussed. The importance of crop forecasts to the world economy and agricultural management is explained, and the development of aerial and spaceborne remote sensing for global crop forecasting by the United States is outlined. The structure, goals and technical aspects of the Large Area Crop Inventory Experiment (LACIE) are presented, and main findings on the accuracy, efficiency, applicability and areas for further study of the LACIE procedure are reviewed. The current status of NASA crop forecasting activities in the United States and worldwide is discussed, and the objectives and organization of the newly created Agriculture and Resources Inventory Surveys through Aerospace Remote Sensing (AgRISTARS) program are presented.
NASA Technical Reports Server (NTRS)
Li, Zhao; Molod, Andrea; Schubert, Siegfried
2018-01-01
Reliable prediction of precipitation remains one of the most pivotal and complex challenges in seasonal forecasting. Previous studies show that various large-scale climate modes, such as ENSO, PNA and NAO, play a significant role in winter precipitation variability over North America. The influences are most pronounced in years with strong indices of these climate modes. This study evaluates model bias, predictability and forecast skill for monthly winter precipitation in the GEOS5-S2S 2.0 retrospective forecasts from 1981 to 2016, with emphasis on the forecast skill for precipitation over North America during extreme events of ENSO, PNA and NAO, by applying EOF and composite analysis.
Skill of Ensemble Seasonal Probability Forecasts
NASA Astrophysics Data System (ADS)
Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk
2010-05-01
In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and potential applications are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
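The kernel-dressing-and-blending step mentioned above can be illustrated with a minimal sketch. The Gaussian kernel, the fixed bandwidth `sigma`, and the blend weight `alpha` are all illustrative assumptions; the study's actual kernel and blending choices are not specified here:

```python
import numpy as np

def kernel_dressed_density(ensemble, sigma, climatology, alpha, x):
    """Evaluate a blended forecast density at points x:

    p(x) = alpha * mean_i N(x; e_i, sigma^2) + (1 - alpha) * p_clim(x),

    where the first term is the Gaussian-kernel-dressed ensemble and
    p_clim is a kernel density over past (climatological) observations.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    climatology = np.asarray(climatology, dtype=float)
    x = np.atleast_1d(np.asarray(x, dtype=float))

    def kde(samples, pts):
        # Mean of Gaussian kernels centred on each sample
        z = (pts - samples[:, None]) / sigma
        return np.mean(np.exp(-0.5 * z**2), axis=0) / (sigma * np.sqrt(2 * np.pi))

    return alpha * kde(ensemble, x) + (1 - alpha) * kde(climatology, x)
```

In practice the bandwidth and blend weight would be fitted against hindcasts, which is where the "zero-skill models can improve RMS scores" effect noted in the abstract arises.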
Six rules for accurate effective forecasting.
Saffo, Paul
2007-01-01
The primary goal of forecasting is to identify the full range of possibilities facing a company, society, or the world at large. In this article, Saffo demythologizes the forecasting process to help executives become sophisticated and participative consumers of forecasts, rather than passive absorbers. He illustrates how to use forecasts to at once broaden understanding of possibilities and narrow the decision space within which one must exercise intuition. The events of 9/11, for example, were a much bigger surprise than they should have been. After all, airliners flown into monuments were the stuff of Tom Clancy novels in the 1990s, and everyone knew that terrorists had a very personal antipathy toward the World Trade Center. So why was 9/11 such a surprise? What can executives do to avoid being blind-sided by other such wild cards, be they radical shifts in markets or the seemingly sudden emergence of disruptive technologies? In describing what forecasters are trying to achieve, Saffo outlines six simple, commonsense rules that smart managers should observe as they embark on a voyage of discovery with professional forecasters. Map a cone of uncertainty, he advises, look for the S curve, embrace the things that don't fit, hold strong opinions weakly, look back twice as far as you look forward, and know when not to make a forecast.
The Rise of Complexity in Flood Forecasting: Opportunities, Challenges and Tradeoffs
NASA Astrophysics Data System (ADS)
Wood, A. W.; Clark, M. P.; Nijssen, B.
2017-12-01
Operational flood forecasting is currently undergoing a major transformation. Most national flood forecasting services have relied for decades on lumped, highly calibrated conceptual hydrological models running on local office computing resources, providing deterministic streamflow predictions at gauged river locations that are important to stakeholders and emergency managers. A variety of recent technological advances now make it possible to run complex, high-to-hyper-resolution models for operational hydrologic prediction over large domains, and the US National Weather Service is now attempting to use hyper-resolution models to create new forecast services and products. Yet other `increased-complexity' forecasting strategies also exist that pursue different tradeoffs between model complexity (i.e., spatial resolution, physics) and streamflow forecast system objectives. There is currently a pressing need for a greater understanding in the hydrology community of the opportunities, challenges and tradeoffs associated with these different forecasting approaches, and for a greater participation by the hydrology community in evaluating, guiding and implementing these approaches. Intermediate-resolution forecast systems, for instance, use distributed land surface model (LSM) physics but retain the agility to deploy ensemble methods (including hydrologic data assimilation and hindcast-based post-processing). Fully coupled numerical weather prediction (NWP) systems, another example, use still coarser LSMs to produce ensemble streamflow predictions either at the model scale or after sub-grid scale runoff routing. Based on the direct experience of the authors and colleagues in research and operational forecasting, this presentation describes examples of different streamflow forecast paradigms, from the traditional to the recent hyper-resolution, to illustrate the range of choices facing forecast system developers. 
We also discuss the degree to which the strengths and weaknesses of each strategy map onto the requirements for different types of forecasting services (e.g., flash flooding, river flooding, seasonal water supply prediction).
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreido, Vsevolod
2017-04-01
Ensemble hydrological forecasting allows the uncertainty caused by variability of meteorological conditions in the river basin over the forecast lead time to be described. At the same time, in snowmelt-dependent river basins another significant source of uncertainty relates to variability in the initial conditions of the basin (snow water equivalent, soil moisture content, etc.) prior to forecast issue. Accurate long-term hydrological forecasts are most crucial for large water management systems, such as the Cheboksary reservoir (catchment area 374,000 sq. km) located on the Middle Volga river in Russia. Accurate forecasts of water inflow volume, maximum discharge and other flow characteristics are of great value for this basin, especially before the beginning of the spring freshet season, which lasts here from April to June. The semi-distributed hydrological model ECOMAG was used to develop long-term ensemble forecasts of daily water inflow into the Cheboksary reservoir. To describe variability of the meteorological conditions and construct an ensemble of possible weather scenarios for the forecast lead time, two approaches were applied. The first utilizes 50 weather scenarios observed in previous years (similar to the ensemble streamflow prediction (ESP) procedure); the second uses 1000 synthetic scenarios simulated by a stochastic weather generator. We investigated the evolution of forecast uncertainty reduction, expressed as forecast efficiency, over consecutive forecast issue dates and lead times. We analyzed the Nash-Sutcliffe efficiency of inflow hindcasts for the period 1982 to 2016, starting from 1 March at 15-day intervals, for lead times of 1 to 6 months. This resulted in a forecast efficiency matrix of issue dates versus lead time that allows the predictability of the basin to be identified. The matrix was constructed separately for observed and synthetic weather ensembles.
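The Nash-Sutcliffe efficiency used to score the hindcasts above is a standard hydrological skill metric; a minimal implementation (each cell of the issue-date-by-lead-time matrix would hold one such score):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.

    NSE = 1 is a perfect forecast; NSE = 0 means the forecast is no
    better than always predicting the observed mean; NSE < 0 is worse.
    """
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```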
Spatially explicit forecasts of large wildland fire probability and suppression costs for California
Haiganoush Preisler; Anthony L. Westerling; Krista M. Gebert; Francisco Munoz-Arriola; Thomas P. Holmes
2011-01-01
In the last decade, increases in fire activity and suppression expenditures have caused budgetary problems for federal land management agencies. Spatial forecasts of upcoming fire activity and costs have the potential to help reduce expenditures, and increase the efficiency of suppression efforts, by enabling them to focus resources where they have the greatest effect...
NASA Astrophysics Data System (ADS)
Brown, James D.; Wu, Limin; He, Minxue; Regonda, Satish; Lee, Haksu; Seo, Dong-Jun
2014-11-01
Retrospective forecasts of precipitation, temperature, and streamflow were generated with the Hydrologic Ensemble Forecast Service (HEFS) of the U.S. National Weather Service (NWS) for a 20-year period between 1979 and 1999. The hindcasts were produced for two basins in each of four River Forecast Centers (RFCs), namely the Arkansas-Red Basin RFC, the Colorado Basin RFC, the California-Nevada RFC, and the Middle Atlantic RFC. Precipitation and temperature forecasts were produced with the HEFS Meteorological Ensemble Forecast Processor (MEFP). Inputs to the MEFP comprised "raw" precipitation and temperature forecasts from the frozen (circa 1997) version of the NWS Global Forecast System (GFS) and a climatological ensemble, which involved resampling historical observations in a moving window around the forecast valid date ("resampled climatology"). In both cases, the forecast horizon was 1-14 days. This paper outlines the hindcasting and verification strategy, and then focuses on the quality of the temperature and precipitation forecasts from the MEFP. A companion paper focuses on the quality of the streamflow forecasts from the HEFS. In general, the precipitation forecasts are more skillful than resampled climatology during the first week, but show little or no skill during the second week. In contrast, the temperature forecasts improve upon resampled climatology at all forecast lead times. However, there are notable differences among RFCs and for different seasons, aggregation periods and magnitudes of the observed and forecast variables, both for precipitation and temperature. For example, the MEFP-GFS precipitation forecasts show the highest correlations and greatest skill in the California-Nevada RFC, particularly during the wet season (November-April). While generally reliable, the MEFP forecasts typically underestimate the largest observed precipitation amounts (a Type-II conditional bias).
As a statistical technique, the MEFP cannot detect, and thus appropriately correct for, conditions that are undetected by the GFS. The calibration of the MEFP to provide reliable and skillful forecasts of a range of precipitation amounts (not only large amounts) is a secondary factor responsible for these Type-II conditional biases. Interpretation of the verification results leads to guidance on the expected performance and limitations of the MEFP, together with recommendations on future enhancements.
A first large-scale flood inundation forecasting model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schumann, Guy J-P; Neal, Jeffrey C.; Voisin, Nathalie
2013-11-04
At present, continental- to global-scale flood forecasting focusses on predicting discharge at a point, with little attention to the detail and accuracy of local-scale inundation predictions. Yet inundation is actually the variable of interest, and all flood impacts are inherently local in nature. This paper proposes a first large-scale flood inundation ensemble forecasting model that uses the best available data and modeling approaches in data-scarce areas and at continental scales. The model was built for the Lower Zambezi River in southeast Africa to demonstrate current flood inundation forecasting capabilities in large data-scarce regions. The inundation model domain has a surface area of approximately 170k km2. ECMWF meteorological data were used to force the VIC (Variable Infiltration Capacity) macro-scale hydrological model, which simulated and routed daily flows to the input boundary locations of the 2-D hydrodynamic model. Efficient hydrodynamic modeling over large areas still requires model grid resolutions that are typically larger than the width of many river channels that play a key role in flood wave propagation. We therefore employed a novel sub-grid channel scheme to describe the river network in detail whilst at the same time representing the floodplain at an appropriate and efficient scale. The modeling system was first calibrated using water levels on the main channel from the ICESat (Ice, Cloud, and land Elevation Satellite) laser altimeter and then applied to predict the February 2007 Mozambique floods. Model evaluation showed that simulated flood edge cells were within a distance of about 1 km (one model resolution) of the observed flood edge of the event. Our study highlights that physically plausible parameter values and satisfactory performance can be achieved at spatial scales ranging from tens to several hundreds of thousands of km2 and at model grid resolutions up to several km2.
However, initial model test runs in forecast mode revealed that it is crucial to account for basin-wide hydrological response time when assessing lead-time performance, notwithstanding structural limitations in the hydrological model and possibly large inaccuracies in precipitation data.
Benefits of an ultra large and multiresolution ensemble for estimating available wind power
NASA Astrophysics Data System (ADS)
Berndt, Jonas; Hoppe, Charlotte; Elbern, Hendrik
2016-04-01
In this study we investigate the benefits of an ultra-large ensemble with up to 1000 members, including multiple nesting with a target horizontal resolution of 1 km. The ensemble shall be used as a basis to detect events of extreme errors in wind power forecasting. The forecast variable is the wind vector at wind-turbine hub height (~100 m) in the short range (1 to 24 hours). Current wind power forecast systems already rest on NWP ensemble models. However, only calibrated ensembles from meteorological institutions have served as input so far, with limited spatial resolution (~10-80 km) and member number (~50). Perturbations related to the specific merits of wind power production are yet missing. Thus, infrequent single extreme error events go undetected by such ensemble power forecasts. The numerical forecast model used in this study is the Weather Research and Forecasting Model (WRF). Model uncertainties are represented by stochastic parametrization of sub-grid processes via stochastically perturbed parametrization tendencies, in conjunction with the complementary stochastic kinetic-energy backscatter scheme already provided by WRF. We perform continuous ensemble updates by comparing each ensemble member with available observations using a sequential importance resampling filter, to improve model accuracy while maintaining ensemble spread. Additionally, we use different ensemble systems from global models (ECMWF and GFS) as input and boundary conditions to capture different synoptic conditions. Critical weather situations connected to extreme error events are located, and corresponding perturbation techniques are applied. The demanding computational effort is overcome by utilising the supercomputer JUQUEEN at the Forschungszentrum Juelich.
Interevent times in a new alarm-based earthquake forecasting model
NASA Astrophysics Data System (ADS)
Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed
2013-09-01
This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion, or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of the proposed MR model, a composite Japan-wide earthquake catalogue for the years 679 to 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing a forecasting error defined as the sum of the miss and alarm rates. This testing indicates that the MR forecasting technique performs well at long, intermediate and short terms. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method.
At short term, our model succeeded in forecasting the occurrence region of the 2011 Mw 9.0 Tohoku earthquake, whereas the RI method did not. Cases where a period of quiescent seismicity occurred before the target event often lead to low MR scores, meaning that the target event was not predicted and indicating that our model could be further improved by taking into account quiescent periods in the alarm strategy.
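Taking the abstract's definition at face value, the MR index is the inverse of the index of dispersion (variance over mean) of interevent times; a sketch of that statistic (the paper's exact estimator, including the ERS sampling, may differ):

```python
import numpy as np

def moment_ratio(interevent_times):
    """MR alarm index per the abstract's definition: the inverse of the
    index of dispersion (Fano factor) of earthquake interevent times,
    i.e. mean / variance. Low dispersion (quasi-regular events) gives
    high MR; clustered, bursty seismicity gives low MR.
    """
    t = np.asarray(interevent_times, dtype=float)
    return t.mean() / t.var()
```

In the alarm strategy, this statistic would be computed over a sliding spatial cell and compared against a threshold tuned on Molchan diagrams.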
Parsons, Tom
2007-01-01
The power law distribution of earthquake magnitudes and frequencies is a fundamental scaling relationship used for forecasting. However, can its slope (b value) be used on individual faults as a stress indicator? Some have concluded that b values drop just before large shocks. Others suggested that temporally stable low b value zones identify future large-earthquake locations. This study assesses the frequency of b value anomalies portending M ≥ 4.0 shocks versus how often they do not. I investigated M ≥ 4.0 Calaveras fault earthquakes because there have been 25 over the 37-year duration of the instrumental catalog on the most active southern half of the fault. With that relatively large sample, I conducted retrospective time and space earthquake forecasts. I calculated temporal b value changes in 5-km-radius cylindrical volumes of crust that were significant at 90% confidence, but these changes were poor forecasters of M ≥ 4.0 earthquakes. M ≥ 4.0 events were as likely to happen at times of high b values as they were at low ones. However, I could not rule out a hypothesis that spatial b value anomalies portend M ≥ 4.0 events; of 20 M ≥ 4 shocks that could be studied, 6 to 8 (depending on calculation method) occurred where b values were significantly less than the spatial mean, 1 to 2 happened above the mean, and 10 to 13 occurred within 90% confidence intervals of the mean and were thus inconclusive. Thus spatial b value variation might be a useful forecast tool, but resolution is poor, even on seismically active faults.
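The b value studied here is the slope of the Gutenberg-Richter magnitude-frequency relation log10 N = a - bM. A common way to estimate it is Aki's maximum-likelihood formula with Utsu's correction for magnitude binning; the paper's own calculation methods may differ, so this is a generic sketch:

```python
import numpy as np

def b_value(magnitudes, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value estimate above a
    completeness magnitude m_c, with Utsu's correction for magnitude
    bins of width dm: b = log10(e) / (mean(M) - (m_c - dm/2))."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]  # use only events above the completeness threshold
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
```

Significance of spatial or temporal b-value anomalies, as in the study, would then be judged against confidence intervals derived from the sample size in each volume.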
Weather Forecaster Understanding of Climate Models
NASA Astrophysics Data System (ADS)
Bol, A.; Kiehl, J. T.; Abshire, W. E.
2013-12-01
Weather forecasters, particularly those in broadcasting, are the primary conduit to the public for information on climate and climate change. However, many weather forecasters remain skeptical of model-based climate projections. To address this issue, The COMET Program developed an hour-long online lesson on how climate models work, targeting an audience of weather forecasters. The module draws on forecasters' pre-existing knowledge of weather, climate, and numerical weather prediction (NWP) models. In order to measure learning outcomes, quizzes were given before and after the lesson. Preliminary results show large learning gains. For all participants who took both pre- and post-tests (n=238), scores improved from 48% to 80%. Similar pre/post improvement occurred for National Weather Service employees (51% to 87%, n=22) and college faculty (50% to 90%, n=7). We believe these results indicate a fundamental misunderstanding among many weather forecasters of (1) the difference between weather and climate models, (2) how researchers use climate models, and (3) how they interpret model results. The quiz results indicate that efforts to educate the public about climate change need to include weather forecasters, a vital link between the research community and the general public.
Forecasted masses for 7000 Kepler Objects of Interest
NASA Astrophysics Data System (ADS)
Chen, Jingjing; Kipping, David M.
2018-01-01
Recent transit surveys have discovered thousands of planetary candidates with directly measured radii, but only a small fraction have measured masses. Planetary mass is crucial in assessing the feasibility of numerous observational signatures, such as radial velocities (RVs), atmospheres, moons and rings. In the absence of a direct measurement, a data-driven, probabilistic forecast enables observational planning, and so here we compute posterior distributions for the forecasted mass of ∼7000 Kepler Objects of Interest (KOIs). Our forecasts reveal that the predicted RV amplitudes of Neptunian planets are relatively consistent, as a result of transit survey detection bias, hovering around the few m s⁻¹ level. We find that mass forecasts are unlikely to improve through more precise planetary radii, with the error budget presently dominated by the intrinsic model uncertainty. Our forecasts identify a couple of dozen KOIs near the Terran-Neptunian divide with particularly large RV semi-amplitudes, which could be promising targets to follow up, particularly in the near-infrared. With several more transit surveys planned in the near future, the need to quickly forecast observational signatures is likely to grow, and the work here provides a template example of such calculations.
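The RV semi-amplitude implied by a forecasted mass follows the standard Keplerian relation K = (2πG/P)^(1/3) · Mp sin i / ((M* + Mp)^(2/3) √(1 − e²)); a sketch with rounded physical constants (the paper propagates full posteriors rather than point values):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg
DAY = 86400.0          # s

def rv_semi_amplitude(m_planet_earth, period_days, m_star_sun,
                      inclination_deg=90.0, ecc=0.0):
    """Stellar RV semi-amplitude K in m/s induced by an orbiting planet."""
    mp = m_planet_earth * M_EARTH
    ms = m_star_sun * M_SUN
    p = period_days * DAY
    return ((2 * math.pi * G / p) ** (1 / 3)
            * mp * math.sin(math.radians(inclination_deg))
            / ((ms + mp) ** (2 / 3) * math.sqrt(1 - ecc ** 2)))
```

For an Earth-mass planet on a one-year orbit around a Sun-like star this gives roughly 0.09 m/s, which shows why the few m s⁻¹ amplitudes of Neptunian KOIs are far easier follow-up targets.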
Weather-based forecasts of California crop yields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lobell, D B; Cahill, K N; Field, C B
2005-09-26
Crop yield forecasts provide useful information to a range of users. Yields for several crops in California are currently forecast based on field surveys and farmer interviews, while for many crops official forecasts do not exist. As broad-scale crop yields are largely dependent on weather, measurements from existing meteorological stations have the potential to provide a reliable, timely, and cost-effective means to anticipate crop yields. We developed weather-based models of state-wide yields for 12 major California crops (wine grapes, lettuce, almonds, strawberries, table grapes, hay, oranges, cotton, tomatoes, walnuts, avocados, and pistachios), and tested their accuracy using cross-validation over the 1980-2003 period. Many crops were forecast with high accuracy, as judged by the percent of yield variation explained by the forecast, the number of yields with correctly predicted direction of yield change, or the number of yields with correctly predicted extreme yields. The most successfully modeled crop was almonds, with 81% of yield variance captured by the forecast. Predictions for most crops relied on weather measurements well before harvest time, allowing for lead times that were longer than existing procedures in many cases.
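The cross-validated skill measure described above (percent of yield variation explained by out-of-sample forecasts) can be sketched as leave-one-out regression. The single-predictor linear form below is an illustrative assumption; the study's models may use multiple weather predictors:

```python
import numpy as np

def loo_r2(weather, yields):
    """Leave-one-out cross-validated fraction of yield variance
    explained by a linear weather-based forecast: each year's yield is
    predicted from a regression fitted to all the other years."""
    X = np.column_stack([np.ones(len(weather)), weather])
    y = np.asarray(yields, dtype=float)
    preds = np.empty_like(y)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i           # hold out year i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ beta                  # out-of-sample forecast
    return 1.0 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
```

A score near 0.81, as reported for almonds, would mean the held-out forecasts capture about 81% of the interannual yield variance.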
NASA Technical Reports Server (NTRS)
Lambert, Winfred; Wheeler, Mark; Roeder, William
2005-01-01
The 45th Weather Squadron (45 WS) at Cape Canaveral Air Force Station (CCAFS) in Florida issues a probability of lightning occurrence in their daily 24-hour and weekly planning forecasts. This information is used for general planning of operations at CCAFS and Kennedy Space Center (KSC). These facilities are located in east-central Florida at the east end of a corridor known as 'Lightning Alley', an indication that lightning has a large impact on space-lift operations. Much of the current lightning probability forecast is based on a subjective analysis of model and observational data and an objective forecast tool developed over 30 years ago. The 45 WS requested that a new lightning probability forecast tool, based on statistical analysis of more recent historical warm-season (May-September) data, be developed in order to increase the objectivity of the daily thunderstorm probability forecast. The resulting tool is a set of statistical lightning forecast equations, one for each month of the warm season, that provide a lightning occurrence probability for the day by 1100 UTC (0700 EDT) during the warm season.
NASA Astrophysics Data System (ADS)
Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.
2017-12-01
Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
NASA Astrophysics Data System (ADS)
Ayscue, Emily P.
This study profiles the coastal tourism sector, a large and diverse consumer of climate and weather information. It is crucial to provide reliable, accurate and relevant resources for the climate and weather-sensitive portions of this stakeholder group in order to guide them in capitalizing on current climate and weather conditions and to prepare them for potential changes. An online survey of tourism business owners, managers and support specialists was conducted within the eight North Carolina oceanfront counties asking respondents about forecasts they use and for what purposes as well as why certain forecasts are not used. Respondents were also asked about their perceived dependency of their business on climate and weather as well as how valuable different forecasts are to their decision-making. Business types represented include: Agriculture, Outdoor Recreation, Accommodations, Food Services, Parks and Heritage, and Other. Weekly forecasts were the most popular forecasts with Monthly and Seasonal being the least used. MANOVA and ANOVA analyses revealed outdoor-oriented businesses (Agriculture and Outdoor Recreation) as perceiving themselves significantly more dependent on climate and weather than indoor-oriented ones (Food Services and Accommodations). Outdoor businesses also valued short-range forecasts significantly more than indoor businesses. This suggests a positive relationship between perceived climate and weather dependency and forecast value. The low perceived dependency and value of short-range forecasts of indoor businesses presents an opportunity to create climate and weather information resources directed at how they can capitalize on positive climate and weather forecasts and how to counter negative effects with forecasted adverse conditions. The low use of long-range forecasts among all business types can be related to the low value placed on these forecasts. 
However, these forecasts are still important because they are used to make more financially risky decisions, such as investments.
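The MANOVA/ANOVA comparison described above can be sketched with a toy one-way ANOVA: the F statistic tests whether mean perceived climate/weather dependency differs between business types. The Likert-style scores below are invented for illustration and are not the study's data.

```python
def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over a list of groups."""
    all_values = [x for g in groups for x in g]
    n_total = len(all_values)
    k = len(groups)
    grand_mean = sum(all_values) / n_total
    # Between-group sum of squares (each group mean vs. the grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (each value vs. its own group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within

outdoor = [5, 5, 4, 5, 4, 5]   # hypothetical scores: Agriculture, Outdoor Recreation
indoor = [3, 2, 3, 3, 2, 3]    # hypothetical scores: Accommodations, Food Services
f_stat = one_way_anova_f([outdoor, indoor])
print(round(f_stat, 2))
```

A large F relative to the critical value at the chosen significance level would support the difference in perceived dependency reported above.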
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Jeffrey M.; Manobianco, John; Schroeder, John
This Final Report presents a comprehensive description, findings, and conclusions for the Wind Forecast Improvement Project (WFIP) -- Southern Study Area (SSA) work led by AWS Truepower (AWST). This multi-year effort, sponsored by the Department of Energy (DOE) and National Oceanic and Atmospheric Administration (NOAA), focused on improving short-term (15-minute to 6-hour) wind power production forecasts through the deployment of an enhanced observation network of surface and remote sensing instrumentation and the use of a state-of-the-art forecast modeling system. Key findings from the SSA modeling and forecast effort include: 1. The AWST WFIP modeling system produced an overall 10-20% improvement in wind power production forecasts over the existing Baseline system, especially during the first three forecast hours; 2. Improvements in ramp forecast skill, particularly for larger up and down ramps; 3. The AWST WFIP data denial experiments showed mixed results in the forecasts incorporating the experimental network instrumentation; however, ramp forecasts showed significant benefit from the additional observations, indicating that the enhanced observations were key to the model systems' ability to capture phenomena responsible for producing large short-term excursions in power production; 4. The OU CAPS ARPS simulations showed that the additional WFIP instrument data had a small impact on their 3-km forecasts that lasted for the first 5-6 hours, and increasing the vertical model resolution in the boundary layer had a greater impact, also in the first 5 hours; and 5. The TTU simulations were inconclusive as to which assimilation scheme (3DVAR versus EnKF) provided better forecasts, and the additional observations resulted in some improvement to the forecasts in the first 1-3 hours.
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System
Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-01-01
Developed under the Intelligence Advanced Research Project Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years. PMID:25553271
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.
Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-12-01
Developed under the Intelligence Advanced Research Project Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Cathy
2014-04-30
This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events.
The overall bulk error statistics, calculated over the first six hours of the forecasts during the one-year study period, showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour, as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed significant differences in seasonal forecast errors between the various model-based power forecasts. The analysis by model runtime and forecast hour showed that forecast errors were largest during the times of day of increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.
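The bulk error statistics above center on mean absolute error (MAE) and the relative improvement of the research models over the operational baseline. A minimal sketch with invented numbers (the plant outputs and forecasts below are not from the study):

```python
def mae(forecast, observed):
    """Mean absolute error of a forecast series against observations."""
    return sum(abs(f - o) for f, o in zip(forecast, observed)) / len(forecast)

observed = [100, 120, 90, 150, 130]    # MW, hypothetical plant output
baseline = [110, 100, 95, 170, 120]    # operational model forecast
research = [105, 115, 92, 158, 126]    # research model forecast

mae_base = mae(baseline, observed)
mae_res = mae(research, observed)
improvement = 1.0 - mae_res / mae_base   # fractional MAE reduction vs. baseline
print(mae_base, mae_res, round(improvement, 2))
```

The same skill-score form (1 minus the ratio of error metrics) is commonly used to express the kind of percentage improvements quoted in the report.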
Artificial Informational Polymers and Nanomaterials from Ring-Opening Metathesis Polymerization
NASA Astrophysics Data System (ADS)
James, Carrie Rae
Inspired by naturally occurring polymers (DNA, polypeptides, polysaccharides, etc.) that can self-assemble on the nanoscale into complex, information-rich architectures, we have synthesized nucleic acid based polymers using ROMP. These polymers were synthesized using a graft-through strategy, whereby nucleic acids bearing a strained cyclic olefin were directly polymerized. This is the first example of the graft-through polymerization of nucleic acids. Our approach takes advantage of non-charged peptide nucleic acids (PNAs) as elements to incorporate into ROMP polymer backbones. PNA is a synthetic nucleic acid analogue known for its increased affinity and specificity for complementary DNA or RNA. To accomplish the graft-through polymerization of PNA, we conjugated PNA to strained cyclic olefins using solid phase peptide conjugation chemistry. These PNA monomers were then directly polymerized into homo and block copolymers forming brushes, or comb-like arrangements, of information. Block copolymer amphiphiles of these materials, where the PNA brush served as the hydrophilic portion, were capable of self-assembly into spherical nanoparticles (PNA NPs). These PNA NPs were then studied with respect to their ability to hybridize complementary DNA sequences, as well as their ability to undergo cellular internalization. PNA NPs consisting of densely packed brushes of nucleic acids possessed increased thermal stability when mixed with their complementary DNA sequence, indicating a greater DNA binding affinity over their unpolymerized PNA counterparts. In addition, by arranging the PNA into dense brushes at the surface of the nanoparticle, Cy5.5 labeled PNA NPs were able to undergo cellular internalization into HeLa cells without the need for an additional cellular delivery device. Importantly, cellular internalization of PNA has remained a significant challenge in the literature due to the neutrally charged amino-ethyl glycine backbone of PNA. 
Therefore, this represents a novel way of facilitating cellular uptake of PNA. This materials strategy constitutes the first direct polymerization of nucleic acids and presents a new method for arranging biological information on the nanoscale at high density in order to confer new attributes.
Design, Synthesis, and Self-Assembly of Polymers with Tailored Graft Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Alice B.; Lin, Tzu-Pin; Thompson, Niklas B.
Grafting density and graft distribution impact the chain dimensions and physical properties of polymers. However, achieving precise control over these structural parameters presents long-standing synthetic challenges. In this report, we introduce a versatile strategy to synthesize polymers with tailored architectures via grafting-through ring-opening metathesis polymerization (ROMP). One-pot copolymerization of an ω-norbornenyl macromonomer and a discrete norbornenyl co-monomer (diluent) provides opportunities to control the backbone sequence and therefore the side chain distribution. Toward sequence control, the homopolymerization kinetics of 23 diluents were studied, representing diverse variations in the stereochemistry, anchor groups, and substituents. These modifications tuned the homopolymerization rate constants over two orders of magnitude (0.36 M⁻¹ s⁻¹ < k_homo < 82 M⁻¹ s⁻¹). Rate trends were identified and elucidated by complementary mechanistic and density functional theory (DFT) studies. Building on this foundation, complex architectures were achieved through copolymerizations of selected diluents with a poly(D,L-lactide) (PLA), polydimethylsiloxane (PDMS), or polystyrene (PS) macromonomer. The cross-propagation rate constants were obtained by non-linear least-squares fitting of the instantaneous co-monomer concentrations according to the Mayo-Lewis terminal model. In-depth kinetic analyses indicate a wide range of accessible macromonomer/diluent reactivity ratios (0.08 < r1/r2 < 20), corresponding to blocky, gradient, or random backbone sequences. We further demonstrated the versatility of this copolymerization approach by synthesizing AB graft diblock polymers with tapered, uniform, and inverse-tapered molecular “shapes.” Small-angle X-ray scattering analysis of the self-assembled structures illustrates effects of the graft distribution on the domain spacing and backbone conformation.
Collectively, the insights provided herein into the ROMP mechanism, monomer design, and homo- and copolymerization rate trends offer a general strategy for the design and synthesis of graft polymers with arbitrary architectures. Controlled copolymerization therefore expands the parameter space for molecular and materials design.
Gurfield, Nikos; Grewal, Saran; Cua, Lynnie S; Torres, Pedro J; Kelley, Scott T
2017-01-01
The Pacific coast tick, Dermacentor occidentalis Marx, is found throughout California and can harbor agents that cause human diseases such as anaplasmosis, ehrlichiosis, tularemia, Rocky Mountain spotted fever and rickettsiosis 364D. Previous studies have demonstrated that nonpathogenic endosymbiotic bacteria can interfere with Rickettsia co-infections in other tick species. We hypothesized that within D. occidentalis ticks, interference may exist between different nonpathogenic endosymbiotic or nonendosymbiotic bacteria and Spotted Fever group Rickettsia (SFGR). Using PCR amplification and sequencing of the rompA gene and intergenic region, we identified a cohort of SFGR-infected and non-infected D. occidentalis ticks collected from San Diego County. We then amplified a partial segment of the 16S rRNA gene and used next-generation sequencing to elucidate the microbiomes and levels of co-infection in the ticks. The SFGR R. philipii str. 364D and R. rhipicephali were detected in 2.3% and 8.2% of the ticks, respectively, via rompA sequencing. Interestingly, next-generation sequencing revealed an inverse relationship between the number of Francisella-like endosymbiont (FLE) 16S rRNA sequences and Rickettsia 16S rRNA sequences within individual ticks that is consistent with partial interference between FLE and SFGR infecting ticks. After excluding the Rickettsia and FLE endosymbionts from the analysis, there was a small but significant difference in microbial community diversity and a pattern of geographic isolation by distance between collection locales. In addition, male ticks had a greater diversity of bacteria than female ticks, and ticks that were not infected with SFGR had microbiomes similar to canine skin microbiomes. Although experimental studies are required for confirmation, our findings are consistent with the hypothesis that FLEs and, to a lesser extent, other bacteria interfere with the ability of D. occidentalis to be infected with certain SFGR.
The results also raise interesting possibilities about the effects of putative vertebrate hosts on the tick microbiome.
Design, Synthesis, and Self-Assembly of Polymers with Tailored Graft Distributions.
Chang, Alice B; Lin, Tzu-Pin; Thompson, Niklas B; Luo, Shao-Xiong; Liberman-Martin, Allegra L; Chen, Hsiang-Yun; Lee, Byeongdu; Grubbs, Robert H
2017-12-06
Grafting density and graft distribution impact the chain dimensions and physical properties of polymers. However, achieving precise control over these structural parameters presents long-standing synthetic challenges. In this report, we introduce a versatile strategy to synthesize polymers with tailored architectures via grafting-through ring-opening metathesis polymerization (ROMP). One-pot copolymerization of an ω-norbornenyl macromonomer and a discrete norbornenyl comonomer (diluent) provides opportunities to control the backbone sequence and therefore the side chain distribution. Toward sequence control, the homopolymerization kinetics of 23 diluents were studied, representing diverse variations in the stereochemistry, anchor groups, and substituents. These modifications tuned the homopolymerization rate constants over 2 orders of magnitude (0.36 M⁻¹ s⁻¹ < k_homo < 82 M⁻¹ s⁻¹). Rate trends were identified and elucidated by complementary mechanistic and density functional theory (DFT) studies. Building on this foundation, complex architectures were achieved through copolymerizations of selected diluents with a poly(d,l-lactide) (PLA), polydimethylsiloxane (PDMS), or polystyrene (PS) macromonomer. The cross-propagation rate constants were obtained by nonlinear least-squares fitting of the instantaneous comonomer concentrations according to the Mayo-Lewis terminal model. In-depth kinetic analyses indicate a wide range of accessible macromonomer/diluent reactivity ratios (0.08 < r1/r2 < 20), corresponding to blocky, gradient, or random backbone sequences. We further demonstrated the versatility of this copolymerization approach by synthesizing AB graft diblock polymers with tapered, uniform, and inverse-tapered molecular "shapes." Small-angle X-ray scattering analysis of the self-assembled structures illustrates effects of the graft distribution on the domain spacing and backbone conformation.
Collectively, the insights provided herein into the ROMP mechanism, monomer design, and homo- and copolymerization rate trends offer a general strategy for the design and synthesis of graft polymers with arbitrary architectures. Controlled copolymerization therefore expands the parameter space for molecular and materials design.
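The Mayo-Lewis terminal model invoked above relates the reactivity ratios r1 and r2 to the instantaneous copolymer composition. A sketch of one standard form of that relation (the values passed in are illustrative, not fitted constants from the paper):

```python
def mayo_lewis_F1(f1, r1, r2):
    """Instantaneous fraction F1 of monomer 1 in the copolymer, given the
    feed fraction f1 and the terminal-model reactivity ratios r1, r2."""
    f2 = 1.0 - f1
    num = r1 * f1 ** 2 + f1 * f2
    den = r1 * f1 ** 2 + 2 * f1 * f2 + r2 * f2 ** 2
    return num / den

# Equimolar feed with r1 = r2 = 1 (ideal random copolymerization) gives F1 = 0.5
print(mayo_lewis_F1(0.5, 1.0, 1.0))
# Unequal ratios skew incorporation toward the more reactive comonomer
print(round(mayo_lewis_F1(0.7, 2.0, 0.5), 3))
```

Ratios with r1/r2 far from 1, like the 0.08-20 range reported, drive the blocky or gradient backbone sequences described in the abstract.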
A Bayesian Assessment of Seismic Semi-Periodicity Forecasts
NASA Astrophysics Data System (ADS)
Nava, F.; Quinteros, C.; Glowacka, E.; Frez, J.
2016-01-01
Among the schemes for earthquake forecasting, the search for semi-periodicity during large earthquakes in a given seismogenic region plays an important role. When considering earthquake forecasts based on semi-periodic sequence identification, the Bayesian formalism is a useful tool for: (1) assessing how well a given earthquake satisfies a previously made forecast; (2) re-evaluating the semi-periodic sequence probability; and (3) testing other prior estimations of the sequence probability. A comparison of Bayesian estimates with updated estimates of semi-periodic sequences that incorporate new data not used in the original estimates shows extremely good agreement, indicating that: (1) the probability that a semi-periodic sequence is not due to chance is an appropriate estimate for the prior sequence probability estimate; and (2) the Bayesian formalism does a very good job of estimating corrected semi-periodicity probabilities, using slightly less data than that used for updated estimates. The Bayesian approach is exemplified explicitly by its application to the Parkfield semi-periodic forecast, and results are given for its application to other forecasts in Japan and Venezuela.
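The Bayesian re-evaluation step described above can be sketched as a simple two-hypothesis update: after a forecast window closes, the prior probability that a semi-periodic sequence is real (not due to chance) is revised by whether a qualifying earthquake occurred. The likelihood values below are illustrative assumptions, not the paper's estimates.

```python
def update_sequence_probability(p_prior, p_hit_if_real, p_hit_if_chance, hit):
    """Posterior probability that the sequence is real, given the outcome
    of one forecast window (Bayes' rule over two hypotheses)."""
    if hit:
        num = p_hit_if_real * p_prior
        den = num + p_hit_if_chance * (1.0 - p_prior)
    else:
        num = (1.0 - p_hit_if_real) * p_prior
        den = num + (1.0 - p_hit_if_chance) * (1.0 - p_prior)
    return num / den

# Example: prior 0.7 that the sequence is real; a large event then occurs
# inside the forecast window, which is far more likely under the sequence model.
post = update_sequence_probability(0.7, p_hit_if_real=0.8, p_hit_if_chance=0.2, hit=True)
print(round(post, 3))
```

A miss instead of a hit would lower the sequence probability, which is how the formalism both scores a forecast and re-estimates the sequence in one step.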
Impact of hindcast length on estimates of seasonal climate predictability.
Shi, W; Schaller, N; MacLeod, D; Palmer, T N; Weisheimer, A
2015-03-16
It has recently been argued that single-model seasonal forecast ensembles are overdispersive, implying that the real world is more predictable than indicated by estimates of so-called perfect model predictability, particularly over the North Atlantic. However, such estimates are based on relatively short forecast data sets comprising just 20 years of seasonal predictions. Here we study longer 40 year seasonal forecast data sets from multimodel seasonal forecast ensemble projects and show that sampling uncertainty due to the length of the hindcast periods is large. The skill of forecasting the North Atlantic Oscillation during winter varies within the 40 year data sets with high levels of skill found for some subperiods. It is demonstrated that while 20 year estimates of seasonal reliability can show evidence of overdispersive behavior, the 40 year estimates are more stable and show no evidence of overdispersion. Instead, the predominant feature on these longer time scales is underdispersion, particularly in the tropics. Key points: predictions can appear overdispersive due to hindcast-length sampling error; longer hindcasts are more robust and underdispersive, especially in the tropics; and twenty hindcasts are an inadequate sample size to assess seasonal forecast skill.
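The over/underdispersion language above rests on a standard diagnostic: an ensemble is well dispersed when its mean spread matches the RMSE of the ensemble mean. A toy sketch with invented numbers:

```python
import math
import statistics

def dispersion_ratio(ensembles, observations):
    """Mean ensemble spread divided by the RMSE of the ensemble mean:
    ~1 is well dispersed, >1 overdispersive, <1 underdispersive."""
    spreads, sq_errors = [], []
    for members, obs in zip(ensembles, observations):
        mean = statistics.fmean(members)
        spreads.append(statistics.stdev(members))
        sq_errors.append((mean - obs) ** 2)
    rmse = math.sqrt(statistics.fmean(sq_errors))
    return statistics.fmean(spreads) / rmse

# Two forecast cases, three members each; observations are invented.
ratio = dispersion_ratio([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]], [2.5, 2.5])
print(ratio)   # spread exceeds error here, i.e. overdispersive
```

With only 20 hindcast cases the RMSE term is itself noisy, which is the sampling-uncertainty point the abstract makes about short hindcast periods.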
Additional Arctic observations improve weather and sea-ice forecasts for the Northern Sea Route
Inoue, Jun; Yamazaki, Akira; Ono, Jun; Dethloff, Klaus; Maturilli, Marion; Neuber, Roland; Edwards, Patti; Yamaguchi, Hajime
2015-01-01
During ice-free periods, the Northern Sea Route (NSR) could be an attractive shipping route. The decline in Arctic sea-ice extent, however, could be associated with an increase in the frequency of the causes of severe weather phenomena, and high wind-driven waves and the advection of sea ice could make ship navigation along the NSR difficult. Accurate forecasts of weather and sea ice are desirable for safe navigation, but large uncertainties exist in current forecasts, partly owing to the sparse observational network over the Arctic Ocean. Here, we show that the incorporation of additional Arctic observations improves the initial analysis and enhances the skill of weather and sea-ice forecasts, the application of which has socioeconomic benefits. Comparison of 63-member ensemble atmospheric forecasts, using different initial data sets, revealed that additional Arctic radiosonde observations were useful for predicting a persistent strong wind event. The sea-ice forecast, initialised by the wind fields that included the effects of the observations, skilfully predicted rapid wind-driven sea-ice advection along the NSR. PMID:26585690
Short-term forecasts gain in accuracy. [Regression technique using "Box-Jenkins" analysis]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Box-Jenkins time-series models offer accuracy for short-term forecasts that compare with large-scale macroeconomic forecasts. Utilities need to be able to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This new method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of auto-correlations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used according to which provides the most appropriate combination of autocorrelations and related derivatives. Major steps in choosing a model are identifying potential models, estimating the parameters of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited for seasonal patterns, which makes it possible to have as short as hourly forecasts of load demand. With accuracy up to two years, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)
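The autoregressive branch of the Box-Jenkins family can be sketched with the simplest case, an AR(1) model x_t = c + phi*x_{t-1} + e_t, fit here by the lag-1 autocorrelation (the Yule-Walker estimate for AR(1)). The demand series is invented for illustration.

```python
import statistics

def fit_ar1(series):
    """Estimate the AR(1) coefficient phi and intercept c from a series."""
    mean = statistics.fmean(series)
    dev = [x - mean for x in series]
    num = sum(dev[t] * dev[t - 1] for t in range(1, len(series)))
    den = sum(d * d for d in dev)
    phi = num / den           # lag-1 autocorrelation ~ AR(1) coefficient
    c = mean * (1.0 - phi)    # intercept chosen so the series mean is preserved
    return c, phi

def forecast_next(series, c, phi):
    """One-step-ahead forecast from the last observed value."""
    return c + phi * series[-1]

load = [100, 104, 103, 107, 106, 110, 109, 113]   # hypothetical hourly demand
c, phi = fit_ar1(load)
print(round(forecast_next(load, c, phi), 1))
```

A full Box-Jenkins workflow would then run the diagnostic checks the abstract mentions (residual autocorrelation, model-order comparison) before accepting the fit.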
Assessing public forecasts to encourage accountability: The case of MIT's Technology Review.
Funk, Jeffrey
2017-01-01
Although high degrees of reliability have been found for many types of forecasts, purportedly due to the existence of accountability, public forecasts of technology are rarely assessed and continue to have a poor reputation. This paper's analysis of forecasts made by MIT's Technology Review provides a rare assessment and thus a means to encourage accountability. It first shows that few of the predicted "breakthrough technologies" currently have large markets: only four have sales greater than $10 billion, while eight technologies not predicted by Technology Review have sales greater than $10 billion, including three with greater than $100 billion and one other with greater than $50 billion. Second, possible reasons for these poor forecasts are discussed, including an overemphasis on the science-based process of technology change, sometimes called the linear model of innovation. Third, this paper describes a different model of technology change, one that is widely used by private companies and that explains the emergence of those technologies that have greater than $10 billion in sales. Fourth, technology change and forecasts are discussed in terms of cognitive biases and mental models.
The potential of remotely sensed soil moisture for operational flood forecasting
NASA Astrophysics Data System (ADS)
Wanders, N.; Karssenberg, D.; de Roo, A.; de Jong, S.; Bierkens, M. F.
2013-12-01
Nowadays, remotely sensed soil moisture is readily available from multiple spaceborne sensors. The high temporal resolution and global coverage make these products very suitable for large-scale land-surface applications. The potential to use these products in operational flood forecasting has thus far not been extensively studied. In this study, we evaluate the added value of assimilated remotely sensed soil moisture for the European Flood Awareness System (EFAS) and its potential to improve the timing and height of the flood peak and low flows. EFAS is used for operational flood forecasting in Europe and uses a distributed hydrological model for flood predictions for lead times up to 10 days. Satellite-derived soil moisture from ASCAT, AMSR-E and SMOS is assimilated into the EFAS system for the Upper Danube basin and results are compared to assimilation of only discharge observations. Discharge observations are available at the outlet and at six additional locations throughout the catchment. To assimilate soil moisture data into EFAS, an Ensemble Kalman Filter (EnKF) is used. Information on the spatial (cross-) correlation of the errors in the satellite products, derived from a detailed model-satellite soil moisture comparison study, is included to ensure optimal performance of the EnKF. For the validation, additional discharge observations not used in the EnKF are used as an independent validation dataset. Our results show that the accuracy of flood forecasts is increased when more discharge observations are used, in that the Mean Absolute Error (MAE) of the ensemble mean is reduced by 65%. The additional inclusion of satellite data results in a further increase of the performance: forecasts of base flows are better and the uncertainty in the overall discharge is reduced, shown by a 10% reduction in the MAE.
In addition, floods are predicted with a higher accuracy and the Continuous Ranked Probability Score (CRPS) shows a performance increase of 10-15% on average, compared to assimilation of discharge only. The rank histograms show that the forecast is not biased. The timing errors in the flood predictions are decreased when soil moisture data is used and imminent floods can be forecasted with skill one day earlier. In conclusion, our study shows that assimilation of satellite soil moisture increases the performance of flood forecasting systems for large catchments, like the Upper Danube. The additional gain is highest when discharge observations from both upstream and downstream areas are used in combination with the soil moisture data. These results show the potential of future soil moisture missions with a higher spatial resolution like SMAP to improve near-real time flood forecasting in large catchments.
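The EnKF analysis step used above can be sketched in its simplest scalar form: each ensemble member is nudged toward a perturbed observation by a gain that weighs forecast spread against observation error. Real EFAS assimilation is multivariate with spatially correlated errors; the soil-moisture numbers below are invented.

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_error_var, rng):
    """Stochastic (perturbed-observation) EnKF update for a scalar state."""
    var_f = statistics.variance(ensemble)      # forecast (ensemble) error variance
    gain = var_f / (var_f + obs_error_var)     # Kalman gain
    return [x + gain * (obs + rng.gauss(0.0, obs_error_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(42)
prior = [0.20, 0.24, 0.22, 0.26, 0.23]         # volumetric soil moisture members
posterior = enkf_update(prior, obs=0.30, obs_error_var=0.0004, rng=rng)
print(statistics.fmean(posterior) > statistics.fmean(prior))   # pulled toward obs
```

Observation perturbation keeps the posterior ensemble spread statistically consistent, which is why this stochastic variant is common in operational systems.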
NASA Astrophysics Data System (ADS)
Mohite, A. R.; Beria, H.; Behera, A. K.; Chatterjee, C.; Singh, R.
2016-12-01
Flood forecasting using hydrological models is an important and cost-effective non-structural flood management measure. For forecasting at short lead times, empirical models using real-time precipitation estimates have proven to be reliable. However, their skill depreciates with increasing lead time. Coupling a hydrologic model with real-time rainfall forecasts issued from numerical weather prediction (NWP) systems could increase the lead time substantially. In this study, we compared 1-5 day precipitation forecasts from the India Meteorological Department (IMD) Multi-Model Ensemble (MME) with European Centre for Medium-Range Weather Forecasts (ECMWF) NWP forecasts for over 86 major river basins in India. We then evaluated the hydrologic utility of these forecasts over the Basantpur catchment (approx. 59,000 km2) of the Mahanadi River basin. Coupled MIKE 11 RR (NAM) and MIKE 11 hydrodynamic (HD) models were used for the development of the flood forecast system (FFS). The RR model was calibrated using IMD station rainfall data. Cross-sections extracted from SRTM 30 were used as input to the MIKE 11 HD model. IMD started issuing operational MME forecasts in 2008, and hence both the statistical and hydrologic evaluations were carried out from 2008-2014. The performance of the FFS was evaluated using both NWP datasets separately for the year 2011, which was a large flood year in the Mahanadi River basin. We will present figures and metrics for statistical (threshold-based statistics, skill in terms of correlation and bias) and hydrologic (Nash-Sutcliffe efficiency, mean and peak error statistics) evaluation. The statistical evaluation will be at the pan-India scale for all the major river basins and the hydrologic evaluation will be for the Basantpur catchment of the Mahanadi River basin.
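The Nash-Sutcliffe efficiency (NSE) named above is the standard hydrologic skill metric: 1 is a perfect simulation and 0 means the model is no better than the observed mean. A minimal sketch with invented discharge values:

```python
import statistics

def nse(simulated, observed):
    """Nash-Sutcliffe efficiency of a simulated series against observations."""
    mean_obs = statistics.fmean(observed)
    num = sum((s - o) ** 2 for s, o in zip(simulated, observed))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [120.0, 340.0, 560.0, 410.0, 220.0]   # m^3/s, hypothetical flood wave
sim = [130.0, 320.0, 540.0, 430.0, 230.0]
print(round(nse(sim, obs), 3))
```

Negative NSE values flag forecasts worse than climatology, which is how long-lead-time skill depreciation shows up in this metric.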
A hybrid spatiotemporal drought forecasting model for operational use
NASA Astrophysics Data System (ADS)
Vasiliades, L.; Loukas, A.
2010-09-01
Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one of the data mining techniques, forecasting is widely used to predict the unknown future based upon the patterns hidden in the current and past data. This study develops a hybrid spatiotemporal scheme for integrated spatial and temporal forecasting. Temporal forecasting is achieved using feed-forward neural networks and the temporal forecasts are extended to the spatial dimension using a spatial recurrent neural network model. The methodology is demonstrated for an operational meteorological drought index, the Standardized Precipitation Index (SPI), calculated at multiple timescales. 48 precipitation stations and 18 independent precipitation stations, located in the Pinios river basin in the Thessaly region, Greece, were used for the development and spatiotemporal validation of the hybrid spatiotemporal scheme. Several quantitative temporal and spatial statistical indices were considered for the performance evaluation of the models. Furthermore, qualitative statistical criteria based on contingency tables between observed and forecasted drought episodes were calculated. The results show that the lead time of forecasting for operational use depends on the SPI timescale. The hybrid spatiotemporal drought forecasting model could be operationally used for forecasting up to three months ahead for short SPI timescales (e.g. 3-6 months) and up to six months ahead for large SPI timescales (e.g. 24 months). The above findings could be useful in developing a drought preparedness plan in the region.
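The SPI computation referenced above aggregates precipitation over a timescale and standardizes it against the climatology of that aggregate. The operational SPI fits a gamma distribution before transforming to a standard normal; the sketch below uses a plain z-score for brevity, and the monthly totals are invented.

```python
import statistics

def standardized_index(monthly_precip, timescale):
    """Simplified SPI-like index: z-scores of rolling precipitation totals.
    (Operational SPI fits a gamma distribution first; this is a sketch.)"""
    totals = [sum(monthly_precip[i:i + timescale])
              for i in range(len(monthly_precip) - timescale + 1)]
    mean = statistics.fmean(totals)
    std = statistics.stdev(totals)
    return [(t - mean) / std for t in totals]

precip = [30, 55, 80, 20, 10, 5, 0, 15, 60, 90, 70, 40]   # mm/month, invented
spi3 = standardized_index(precip, timescale=3)
print(min(spi3) < -1.0)   # the dry mid-year run shows up as a strongly negative index
```

Values below about -1 commonly mark moderate drought in SPI classifications, which is the episode definition the contingency-table verification above relies on.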
An Oceanographic and Climatological Atlas of Bristol Bay
1987-10-01
[OCR fragments only: table-of-contents entries (Forecasting Method; Superstructure Icing; Wind) interleaved with partial text on the advection of oil by large-scale and specific ice movement, risk analysis for coastal regions, and estimating fetch wind speed and direction from tables or a surface pressure analysis.]
Ocean Data Impacts in Global HYCOM
2014-08-01
The purpose of assimilation is to reduce the model initial condition error. Improved initial conditions should lead to an improved forecast...the determination of locations where forecast errors are sensitive to the initial conditions is essential for improving the data assimilation system...longwave radiation, total (large scale plus convective) precipitation, ground/sea temperature, zonal and meridional wind velocities at 10 m, mean sea
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty was introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty is dependent on the catchment characteristics: 1. Upstream catchment with high influence of weather forecast a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e. over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range on a desired level (here: the 10% and 90% percentile) is extracted and drawn as forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed 'lead forecast' is chosen and shown in addition to the uncertainty bounds.
This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of weather forecast In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally the corresponding inflow hydrograph from all upstream catchments must be used. b) As for an upstream catchment, the uncertainty range is determined by combination of the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology as well as the usefulness or uselessness of the resulting uncertainty ranges will be presented and discussed by typical examples.
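The envelope construction in steps c) to e) can be sketched as follows. This is a minimal illustration with hypothetical discharge members and a hypothetical empirical error sample; the pooled 10th and 90th percentiles form the envelope.

```python
from statistics import quantiles

def forecast_envelope(members, error_sample):
    """Superimpose an empirical 'model error' sample on every ensemble
    member at each timestep, pool the results, and take the 10th and
    90th percentiles of the pooled distribution as the envelope."""
    envelope = []
    for t in range(len(members[0])):
        pooled = [m[t] + e for m in members for e in error_sample]
        deciles = quantiles(pooled, n=10)           # nine cut points
        envelope.append((deciles[0], deciles[-1]))  # 10% and 90% bounds
    return envelope

# three hypothetical discharge forecasts (m3/s) over four timesteps
members = [[100, 120, 150, 130],
           [ 90, 110, 160, 140],
           [110, 130, 170, 150]]
error_sample = [-20, -5, 0, 5, 20]  # hypothetical empirical 'model error' values
env = forecast_envelope(members, error_sample)
```

In an operational setting the error sample would depend on hydrological case and lead time, as the abstract describes.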
Forecasting of monsoon heavy rains: challenges in NWP
NASA Astrophysics Data System (ADS)
Sharma, Kuldeep; Ashrit, Raghavendra; Iyengar, Gopal; Bhatla, R.; Rajagopal, E. N.
2016-05-01
The last decade has seen a tremendous improvement in the forecasting skill of numerical weather prediction (NWP) models. This is attributed to increased sophistication in NWP models, which resolve complex physical processes, advanced data assimilation, increased grid resolution and satellite observations. However, prediction of heavy rains is still a challenge, since the models exhibit large errors in amounts as well as in spatial and temporal distribution. Two state-of-the-art NWP models have been investigated over the Indian monsoon region to assess their ability to predict heavy rainfall events: the Unified Model operational at the National Centre for Medium Range Weather Forecasting (NCUM) and the Unified Model operational at the Australian Bureau of Meteorology (the Australian Community Climate and Earth-System Simulator -- Global, ACCESS-G). The recent (JJAS 2015) Indian monsoon season witnessed 6 depressions and 2 cyclonic storms, which resulted in heavy rains and flooding. The CRA (contiguous rain area) method of verification allows the decomposition of forecast errors into errors in rainfall volume, pattern and location. The case-by-case study using the CRA technique shows that the contributions to rainfall error from pattern and displacement are large, while the contribution from error in predicted rainfall volume is smallest.
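The CRA decomposition can be sketched in one dimension as follows. This is a toy illustration with hypothetical rain bands: a circular shift stands in for the 2-D translation of the contiguous rain area, and the total mean-square error splits into displacement, volume and pattern components.

```python
def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def shifted(seq, s):
    return seq[-s:] + seq[:-s] if s else list(seq)  # circular shift right by s

def cra_decompose(forecast, observed, max_shift=3):
    """Split total MSE into three components:
    displacement = error removed by the best translation,
    volume       = squared bias of the (shift-invariant) means,
    pattern      = what remains after shifting and debiasing."""
    total = mse(forecast, observed)
    best_s = min(range(-max_shift, max_shift + 1),
                 key=lambda s: mse(shifted(forecast, s), observed))
    aligned = mse(shifted(forecast, best_s), observed)
    displacement = total - aligned
    volume = (sum(forecast) / len(forecast) - sum(observed) / len(observed)) ** 2
    pattern = aligned - volume
    return best_s, displacement, volume, pattern

observed = [0, 0, 0, 0, 0, 10, 20, 10, 0, 0]   # observed rain band
forecast = [0, 0, 0, 12, 24, 12, 0, 0, 0, 0]   # slightly too wet, displaced left
s, disp, vol, pat = cra_decompose(forecast, observed)
```

For this example the displacement term dominates, mirroring the abstract's finding that pattern and displacement errors outweigh volume error.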
On the reliable use of satellite-derived surface water products for global flood monitoring
NASA Astrophysics Data System (ADS)
Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.
2015-12-01
Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
NASA Astrophysics Data System (ADS)
Herman, J. D.; Steinschneider, S.; Nayak, M. A.
2017-12-01
Short-term weather forecasts are not codified into the operating policies of federal, multi-purpose reservoirs, despite their potential to improve service provision. This is particularly true for facilities that provide flood protection and water supply, since the potential flood damages are often too severe to accept the risk of inaccurate forecasts. Instead, operators must maintain empty storage capacity to mitigate flood risk, even if the system is currently in drought, as occurred in California from 2012-2016. This study investigates the potential for forecast-informed operating rules to improve water supply efficiency while maintaining flood protection, combining state-of-the-art weather hindcasts with a novel tree-based policy optimization framework. We hypothesize that forecasts need only accurately predict the occurrence of a storm, rather than its intensity, to be effective in regions like California where wintertime, synoptic-scale storms dominate the flood regime. We also investigate the potential for downstream groundwater injection to improve the utility of forecasts. These hypotheses are tested in a case study of Folsom Reservoir on the American River. Because available weather hindcasts are relatively short (10-20 years), we propose a new statistical framework to develop synthetic forecasts to assess the risk associated with inaccurate forecasts. The efficiency of operating policies is tested across a range of scenarios that include varying forecast skill and additional groundwater pumping capacity. Results suggest that the combined use of groundwater storage and short-term weather forecasts can substantially improve the tradeoff between water supply and flood control objectives in large, multi-purpose reservoirs in California.
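A forecast-informed release rule of the kind investigated here can be sketched in a few lines. This is a minimal toy, not the study's tree-based policy: all capacities, demands and thresholds are hypothetical, and the rule simply pre-releases to restore the flood pool whenever a storm is forecast.

```python
def operate(storage, inflow, storm_forecast,
            capacity=100.0, flood_pool=20.0, demand=5.0):
    """One timestep of a toy rule: when a storm is forecast, draw down
    to the flood-control target; otherwise keep water for supply.
    All parameters are hypothetical illustration values."""
    target = capacity - flood_pool if storm_forecast else capacity
    release = max(demand, storage + inflow - target)
    new_storage = min(storage + inflow - release, capacity)
    return new_storage, release

s1, r1 = operate(95.0, 3.0, storm_forecast=True)    # pre-release ahead of storm
s2, r2 = operate(95.0, 3.0, storm_forecast=False)   # hold water for supply
```

The study's hypothesis maps onto the boolean `storm_forecast`: the rule only needs the forecast to predict storm occurrence, not intensity, to decide whether empty flood space is required.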
Improved Weather and Power Forecasts for Energy Operations - the German Research Project EWeLiNE
NASA Astrophysics Data System (ADS)
Lundgren, Kristina; Siefert, Malte; Hagedorn, Renate; Majewski, Detlev
2014-05-01
The German energy system is going through a fundamental change. Based on the energy plans of the German federal government, the share of electrical power production from renewables should increase to 35% by 2020. This means that, in the near future at certain times renewable energies will provide a major part of Germany's power production. Operating a power supply system with a large share of weather-dependent power sources in a secure way requires improved power forecasts. One of the most promising strategies to improve the existing wind power and PV power forecasts is to optimize the underlying weather forecasts and to enhance the collaboration between the meteorology and energy sectors. Deutscher Wetterdienst addresses these challenges in collaboration with Fraunhofer IWES within the research project EWeLiNE. The overarching goal of the project is to improve the wind and PV power forecasts by combining improved power forecast models and optimized weather forecasts. During the project, the numerical weather prediction models COSMO-DE and COSMO-DE-EPS (Ensemble Prediction System) by Deutscher Wetterdienst will be generally optimized towards improved wind power and PV forecasts. For instance, it will be investigated whether the assimilation of new types of data, e.g. power production data, can lead to improved weather forecasts. With regard to the probabilistic forecasts, the focus is on the generation of ensembles and ensemble calibration. One important aspect of the project is to integrate the probabilistic information into decision making processes by developing user-specified products. In this paper we give an overview of the project and present first results.
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles
2017-06-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that basing decisions on ensemble rather than deterministic forecasts leads to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early flood warning systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty. Consequently, there exist many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological model(s).
In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecasts' quality (relative to the corresponding record of observations) and in terms of economic value, using the new proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
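The CARA-based valuation can be sketched as follows. Numbers are hypothetical; the certainty equivalent CE = -(1/a) ln E[exp(-a w)] follows directly from the CARA utility u(w) = -exp(-a w), and the value of a warning is the difference between the certainty equivalents of acting on it and ignoring it.

```python
from math import exp, log

def certainty_equivalent(outcomes, probs, a):
    """Guaranteed wealth that a CARA decision maker with risk aversion
    a > 0 values exactly as much as the risky prospect."""
    eu = sum(p * exp(-a * w) for w, p in zip(outcomes, probs))
    return -log(eu) / a

# hypothetical wealth outcomes: acting on a flood warning (small sure cost)
# vs ignoring it (risk of a large loss), each with 50/50 flood odds
act    = certainty_equivalent([90.0, 100.0], [0.5, 0.5], a=0.05)
ignore = certainty_equivalent([0.0, 100.0], [0.5, 0.5], a=0.05)
value  = act - ignore  # economic value of the warning to this decision maker
```

Because the certainty equivalent penalizes spread, a risk-averse decision maker values the narrow "act" prospect far above the wide "ignore" prospect, which is why forecast reliability in the upper tail matters so much in this framework.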
NASA Astrophysics Data System (ADS)
Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.
2016-12-01
Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction, which is critical for a range of water resources applications, such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and also too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross validation show that, with the analog approach alone, the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology.
Impacts of various factors such as ensemble size, lead time, and choice of climate indices will also be discussed.
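The two-step post-processing can be sketched with scalar values. This is a deliberately reduced illustration: the study works on gridded monthly fields and derives analog weights from climate-index correlations, whereas here both the forecasts and the weights are hypothetical numbers.

```python
def analog_forecast(current, reforecasts, analyses, n_analogs=3, weights=None):
    """Step 1: find past reforecasts closest to the current coarse forecast
    and use their fine-scale analyses as the post-processed ensemble.
    Step 2: optionally weight the selected analogs (these weights stand in
    for the study's Bayesian climate-index weighting)."""
    ranked = sorted(range(len(reforecasts)),
                    key=lambda i: abs(reforecasts[i] - current))
    picks = ranked[:n_analogs]
    if weights is None:
        weights = [1.0] * len(picks)
    return sum(analyses[i] * w for i, w in zip(picks, weights)) / sum(weights)

reforecasts = [50.0, 80.0, 120.0, 200.0, 65.0]  # past coarse forecasts (mm)
analyses    = [55.0, 90.0, 110.0, 180.0, 70.0]  # matching fine-scale analyses
plain    = analog_forecast(75.0, reforecasts, analyses)      # equal weights
weighted = analog_forecast(75.0, reforecasts, analyses,
                           weights=[0.5, 0.3, 0.2])          # index-weighted
```

The ensemble-size sensitivity mentioned above corresponds to the choice of `n_analogs`.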
Exploring What Determines the Use of Forecasts of Varying Time Periods in Guanacaste, Costa Rica
NASA Astrophysics Data System (ADS)
Babcock, M.; Wong-Parodi, G.; Grossmann, I.; Small, M. J.
2016-12-01
Weather and climate forecasts are promoted as ways to improve water management, especially in the face of changing environmental conditions. However, studies indicate many stakeholders who may benefit from such information do not use it. This study sought to better understand which personal factors (e.g., trust in forecast sources, perceptions of accuracy) were important determinants of the use of 4-day, 3-month, and 12-month rainfall forecasts by stakeholders in water management-related sectors in the seasonally dry province of Guanacaste, Costa Rica. From August to October 2015, we surveyed 87 stakeholders from a mix of government agencies, local water committees, large farms, tourist businesses, environmental NGOs, and the public. The result of an exploratory factor analysis suggests that trust in "informal" forecast sources (traditional methods, family advice) and in "formal" sources (government, university and private company science) are independent of each other. The results of logistic regression analyses suggest that 1) greater understanding of forecasts is associated with a greater probability of 4-day and 3-month forecast use, but not 12-month forecast use, 2) a greater probability of 3-month forecast use is associated with a lower level of trust in "informal" sources, and 3) feeling less secure about water resources and regularly using many sources of information (specifically formal meetings and reports) are each associated with a greater probability of using 12-month forecasts. While limited by the sample size, and affected by the factoring method and regression model assumptions, these results suggest that while forecasts of all time scales are used to some extent, local decision makers' decisions to use 4-day and 3-month forecasts appear to be more intrinsically motivated (based on their level of understanding and trust), whereas the use of 12-month forecasts seems to be more motivated by a sense of requirement or mandate.
NASA Astrophysics Data System (ADS)
Seyoum, Mesgana; van Andel, Schalk Jan; Xuan, Yunqing; Amare, Kibreab
Flow forecasting in the poorly gauged, flood-prone Ribb and Gumara sub-catchments of the Blue Nile was studied with the aim of testing the performance of Quantitative Precipitation Forecasts (QPFs). Four types of QPFs were used, namely MM5 forecasts with a spatial resolution of 2 km and the Maximum, Mean and Minimum members (MaxEPS, MeanEPS and MinEPS, where EPS stands for Ensemble Prediction System) of the fixed, low-resolution (2.5 by 2.5 degree) National Oceanic and Atmospheric Administration Global Forecast System (NOAA GFS) ensemble forecasts. Neither the MM5 nor the EPS forecasts were calibrated (no bias correction, no downscaling of the EPS, etc.). In addition, zero forecasts, assuming no rainfall in the coming days, and monthly average forecasts, assuming average monthly rainfall in the coming days, were used. These rainfall forecasts were then used to drive the Hydrologic Engineering Center's Hydrologic Modeling System (HEC-HMS) hydrologic model for flow predictions. The results show that flow predictions using MaxEPS and MM5 precipitation forecasts over-predicted the peak flow for most of the seven events analyzed, whereas zero and monthly average rainfall forecasts under-predicted it. The comparison of observed and predicted flow hydrographs shows that MM5, MaxEPS and MeanEPS precipitation forecasts were able to capture the rainfall signal that caused peak flows. Flow predictions based on MaxEPS and MeanEPS gave results that were quantitatively close to the observed flow for most events, whereas flow predictions based on MM5 resulted in large overestimations for some events. In follow-up research for this particular case study, calibration of the MM5 model will be performed. The overall analysis shows that freely available atmospheric forecasting products can provide additional information on upcoming rainfall and peak flow events in areas where only baseline forecasts such as no-rainfall or climatology are available.
Probabilistic empirical prediction of seasonal climate: evaluation and potential applications
NASA Astrophysics Data System (ADS)
Dieppois, B.; Eden, J.; van Oldenborgh, G. J.
2017-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. 
We also discuss the potential development of stakeholder-driven applications of the K-PREP system, including empirical forecasts for circumboreal fire activity.
Applications of Principled Search Methods in Climate Influences and Mechanisms
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS), and other experts and agencies, have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, the MAPSS forecasts [23, 24] for example, forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April-October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert the raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve the society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
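The core of data assimilation, blending a model estimate with an observation weighted by their respective uncertainties, can be sketched as a scalar analysis step. This is an illustration only: ecological assimilation at FLUXNET scale typically uses ensemble or batch (e.g. Markov chain Monte Carlo) variants over many parameters, and all numbers here are hypothetical.

```python
def analysis_step(x, P, obs, R):
    """Scalar data-assimilation update: combine a model estimate x with
    variance P and an observation with error variance R. The analysis
    has smaller variance than either source alone."""
    K = P / (P + R)                      # gain: how much to trust the observation
    return x + K * (obs - x), (1 - K) * P

# hypothetical: model predicts a carbon flux of 10 (variance 4),
# a flux tower measures 12 (variance 1)
x_a, P_a = analysis_step(10.0, 4.0, 12.0, 1.0)
```

The analysis is pulled most of the way toward the more certain observation, and the reduced posterior variance is exactly the kind of uncertainty quantification the abstract argues ecological forecasting needs.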
NASA Astrophysics Data System (ADS)
BozorgMagham, Amir E.; Ross, Shane D.; Schmale, David G.
2013-09-01
The language of Lagrangian coherent structures (LCSs) provides a new means for studying transport and mixing of passive particles advected by an atmospheric flow field. Recent observations suggest that LCSs govern the large-scale atmospheric motion of airborne microorganisms, paving the way for more efficient models and management strategies for the spread of infectious diseases affecting plants, domestic animals, and humans. In addition, having reliable predictions of the timing of hyperbolic LCSs may contribute to improved aerobiological sampling of microorganisms with unmanned aerial vehicles and LCS-based early warning systems. Chaotic atmospheric dynamics lead to unavoidable forecasting errors in the wind velocity field, which compounds errors in LCS forecasting. In this study, we reveal the cumulative effects of errors of (short-term) wind field forecasts on the finite-time Lyapunov exponent (FTLE) fields and the associated LCSs when realistic forecast plans impose certain limits on the forecasting parameters. Objectives of this paper are to (a) quantify the accuracy of prediction of FTLE-LCS features and (b) determine the sensitivity of such predictions to forecasting parameters. Results indicate that forecasts of attracting LCSs exhibit less divergence from the archive-based LCSs than the repelling features. This result is important since attracting LCSs are the backbone of long-lived features in moving fluids. We also show under what circumstances one can trust the forecast results if one merely wants to know if an LCS passed over a region and does not need to precisely know the passage time.
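The FTLE itself can be sketched in one dimension by tracking the separation of two nearby particles under forward-Euler advection. This is a toy, not the 2-D atmospheric computation: for the stretching field v(x) = x the exact exponent is 1, which the numerical estimate approaches.

```python
from math import log

def ftle_1d(velocity, x0, T, dt=0.01, d0=1e-6):
    """Finite-time Lyapunov exponent: the exponential growth rate of the
    separation between two particles started d0 apart at x0, advected
    for time T through the given velocity field."""
    a, b = x0 - d0 / 2, x0 + d0 / 2
    for _ in range(int(T / dt)):
        a += velocity(a) * dt
        b += velocity(b) * dt
    return log(abs(b - a) / d0) / T

sigma = ftle_1d(lambda x: x, x0=0.0, T=2.0)  # stretching flow, exact FTLE = 1
```

Ridges of large positive FTLE mark repelling LCSs; computing the same quantity backward in time yields the attracting structures that the study found to be more robust to wind-forecast error.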
Test operation of a real-time tsunami inundation forecast system using actual data observed by S-net
NASA Astrophysics Data System (ADS)
Suzuki, W.; Yamamoto, N.; Miyoshi, T.; Aoi, S.
2017-12-01
If tsunami inundation information can be rapidly and stably forecast before a large tsunami strikes, the information would effectively help people realize the impending danger and the necessity of evacuation. Toward that goal, we have developed a prototype system to perform real-time tsunami inundation forecasts for Chiba prefecture, eastern Japan, using off-shore ocean bottom pressure data observed by the seafloor observation network for earthquakes and tsunamis along the Japan Trench (S-net) (Aoi et al., 2015, AGU). Because tsunami inundation simulation requires a large computation cost, we employ a database approach, searching for pre-calculated tsunami scenarios that reasonably explain the observed S-net pressure data based on the multi-index method (Yamamoto et al., 2016, EPS). The scenario search is repeated regularly, not triggered by the occurrence of a tsunami event, and the forecast information is generated from the selected scenarios that meet the criterion. Test operation of the prototype system using actual observation data started in April 2017, and the performance and behavior of the system during non-tsunami event periods have been examined. It is found that the treatment of the noise affecting the observed data is the main issue to be solved toward the improvement of the system. Even if the observed pressure data are filtered to extract the tsunami signals, the noise in ordinary times, or unusually large noise such as high ocean waves due to storms, affects the comparison between the observed and scenario data. Due to this noise, tsunami scenarios are sometimes selected and a tsunami is forecast even though no tsunami event has actually occurred. In most cases, the scenarios selected due to the noise have fault models in the region along the Kurile or Izu-Bonin Trenches, far from the S-net region, or fault models below the land.
Based on the parallel operation of the forecast system with a different scenario search condition and examination of the fault models, we improve the stability and performance of the forecast system. This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
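The database search can be sketched as follows. This is a minimal stand-in for the multi-index method, with made-up scenario waveforms; a single misfit score replaces the several indices of the actual method, and the example also shows how a noisy non-tsunami record can slip past a loose selection criterion.

```python
def misfit(obs, syn):
    """Root-sum-square difference between observed and synthetic records."""
    return sum((o - s) ** 2 for o, s in zip(obs, syn)) ** 0.5

def select_scenarios(observed, bank, threshold):
    """Return scenario names whose waveform misfit to the observed
    pressure data meets the selection criterion, best match first."""
    scored = sorted((misfit(observed, syn), name) for name, syn in bank.items())
    return [name for m, name in scored if m < threshold]

bank = {"M7.5_trench": [0, 2, 5, 3, 1],   # hypothetical pre-computed waveforms
        "M8.0_trench": [0, 4, 9, 6, 2],
        "far_field":   [1, 1, 1, 1, 1]}
obs = [0, 2, 4, 3, 1]                      # hypothetical filtered observation
picks = select_scenarios(obs, bank, threshold=3.0)
```

With a tighter threshold only the genuinely similar scenario is selected; loosening it admits the flat "far_field" record, which is the kind of noise-driven false selection the test operation revealed.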
Dispersion Modeling Using Ensemble Forecasts Compared to ETEX Measurements.
NASA Astrophysics Data System (ADS)
Straume, Anne Grete; N'dri Koffi, Ernest; Nodop, Katrin
1998-11-01
Numerous numerical models are developed to predict long-range transport of hazardous air pollution in connection with accidental releases. When evaluating and improving such a model, it is important to detect uncertainties connected to the meteorological input data. A Lagrangian dispersion model, the Severe Nuclear Accident Program, is used here to investigate the effect of errors in the meteorological input data due to analysis error. An ensemble forecast, produced at the European Centre for Medium-Range Weather Forecasts, is then used as model input. The ensemble forecast members are generated by perturbing the initial meteorological fields of the weather forecast. The perturbations are calculated from singular vectors meant to represent possible forecast developments generated by instabilities in the atmospheric flow during the early part of the forecast. The instabilities are generated by errors in the analyzed fields. Puff predictions from the dispersion model, using ensemble forecast input, are compared, and a large spread in the predicted puff evolutions is found. This shows that the quality of the meteorological input data is important for the success of the dispersion model. In order to evaluate the dispersion model, the calculations are compared with measurements from the European Tracer Experiment. The model manages to predict the measured puff evolution concerning shape and time of arrival to a fairly high extent, up to 60 h after the start of the release. The modeled puff is still too narrow in the advection direction.
Buitrago, Jaime; Asfour, Shihab
2017-01-01
Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.
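The open-loop/closed-loop distinction described above can be sketched in a few lines. This toy version substitutes a linear autoregression fit by least squares for the NARX neural network, and synthetic hourly load for the real load and weather data; all values are illustrative, not the study's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hourly "load": a daily cycle plus noise stands in for real load data.
t = np.arange(500)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

p = 24  # autoregressive order: one day of hourly lags

# Open-loop training: the model is fit on actual past load
# (the paper trains a NARX neural network in the same open-loop fashion).
X = np.column_stack([load[i:i + len(load) - p] for i in range(p)])
y = load[p:]
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Closed-loop forecasting: each prediction is fed back as an input,
# so the model runs 24 h ahead without fresh measurements.
history = list(load[-p:])
forecast = []
for _ in range(24):
    x = np.concatenate([[1.0], history[-p:]])
    yhat = float(x @ coef)
    forecast.append(yhat)
    history.append(yhat)  # feedback loop: the forecast becomes the next input
```

In open-loop mode the model always sees actual measurements; in closed-loop mode its own output becomes the lagged input, which is what allows a trained network to produce the full next-day forecast.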
Quality Assessment of the Cobel-Isba Numerical Forecast System of Fog and Low Clouds
NASA Astrophysics Data System (ADS)
Bergot, Thierry
2007-06-01
Short-term forecasting of fog is a difficult issue which can have a large societal impact. Fog appears in the surface boundary layer and is driven by the interactions between the land surface and the lower layers of the atmosphere. These interactions are still not well parameterized in current operational NWP models, and a new methodology based on local observations, an adaptive assimilation scheme and a local numerical model is tested. The proposed numerical forecast method for foggy conditions has been run for three years at Paris-CdG international airport. This test over a long period allows an in-depth evaluation of forecast quality. This study demonstrates that detailed 1-D models, including detailed physical parameterizations and high vertical resolution, can reasonably represent the major features of the life cycle of fog (onset, development and dissipation) up to +6 h. The error on the forecast onset and burn-off time is typically 1 h. The major weakness of the methodology is related to the evolution of low clouds (stratus lowering). Even when the occurrence of fog is well forecast, the value of the horizontal visibility is only crudely forecast. Improvements in the microphysical parameterization and in the translation algorithm converting NWP prognostic variables into a corresponding horizontal visibility seem necessary to accurately forecast the value of the visibility.
Nishiura, Hiroshi
2011-02-16
Real-time forecasting of epidemics, especially forecasting based on a likelihood approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of the simplistic model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance.
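The branching-process approximation can be illustrated with a minimal simulation. The paper computes uncertainty bounds analytically from chains of conditional offspring distributions; this sketch approximates them by Monte Carlo instead, with an assumed reproduction number and case count (not estimates from the study).

```python
import numpy as np

rng = np.random.default_rng(1)

R = 1.4        # effective reproduction number per reporting interval (assumed)
c_now = 120    # latest observed weekly incidence (assumed)
horizon = 4    # reporting intervals (weeks) ahead
n_chains = 5000

# Branching-process approximation: cases next week ~ Poisson(R * cases this week).
paths = np.empty((n_chains, horizon))
cases = np.full(n_chains, c_now)
for w in range(horizon):
    cases = rng.poisson(R * cases)   # one generation of the offspring process
    paths[:, w] = cases

# Point forecast and uncertainty bounds from the simulated chains.
median = np.median(paths, axis=0)
lo, hi = np.percentile(paths, [2.5, 97.5], axis=0)
```

The widening gap between `lo` and `hi` with lead time mirrors the growing uncertainty bounds that the paper's conditional offspring distributions quantify exactly.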
Potential predictability and forecast skill in ensemble climate forecast: a skill-persistence rule
NASA Astrophysics Data System (ADS)
Jin, Yishuai; Rong, Xinyao; Liu, Zhengyu
2017-12-01
This study investigates the relationship between the forecast skill for the real world (actual skill) and for the perfect model (perfect skill) in ensemble climate model forecasts, using a series of fully coupled general circulation model forecast experiments. It is found that the actual skill for sea surface temperature (SST) in seasonal forecasts is substantially higher than the perfect skill over a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model can overwhelm the model bias to produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further demonstrated using a first-order autoregressive (AR1) model, analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by sampling errors and non-AR1 processes. This study suggests that the so-called "perfect skill" is model dependent and cannot serve as an accurate estimate of the true upper limit of real-world prediction skill, unless the model captures at least the persistence property of the observations.
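The skill-persistence rule can be checked numerically with an AR1 model: if observations have persistence a_obs and a biased model issues the damped-persistence ensemble-mean forecast a_mod^tau * x0, the anomaly correlation against the real world is set by a_obs, not a_mod. A sketch under assumed parameter values (a_obs = 0.9, a_mod = 0.7, lead 3):

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_series(a, n):
    """Stationary AR(1) with unit variance: x[t+1] = a*x[t] + sqrt(1-a^2)*eps."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = a * x[t - 1] + np.sqrt(1 - a**2) * rng.normal()
    return x

a_obs, a_mod, lead, n = 0.9, 0.7, 3, 200_000
obs = ar1_series(a_obs, n)          # "real world" with high persistence

# The biased model's infinite-ensemble mean forecast at this lead is
# a_mod**lead * x0; its correlation with the model's own world (perfect
# skill) would be a_mod**lead.
x0, target = obs[:-lead], obs[lead:]
fcst = a_mod**lead * x0

actual_skill = np.corrcoef(fcst, target)[0, 1]   # ~ a_obs**lead = 0.729
perfect_skill = a_mod**lead                      # = 0.343
# Higher observed persistence overwhelms the model bias: actual > "perfect".
```

Because the forecast is a deterministic multiple of the initial state, its correlation with the observed target equals the lag-tau autocorrelation of the observations, which is exactly the mechanism behind the skill-persistence rule in the infinite-ensemble limit.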
NASA Technical Reports Server (NTRS)
Hoffman, Ross N.
1993-01-01
A preliminary assessment of the impact of the ERS 1 scatterometer wind data on the current European Centre for Medium-Range Weather Forecasts analysis and forecast system has been carried out. Although the scatterometer data results in changes to the analyses and forecasts, there is no consistent improvement or degradation. Our results are based on comparing analyses and forecasts from assimilation cycles. The two sets of analyses are very similar except for the low level wind fields over the ocean. Impacts on the analyzed wind fields are greater over the southern ocean, where other data are scarce. For the most part the mass field increments are too small to balance the wind increments. The effect of the nonlinear normal mode initialization on the analysis differences is quite small, but we observe that the differences tend to wash out in the subsequent 6-hour forecast. In the Northern Hemisphere, analysis differences are very small, except directly at the scatterometer locations. Forecast comparisons reveal large differences in the Southern Hemisphere after 72 hours. Notable differences in the Northern Hemisphere do not appear until late in the forecast. Overall, however, the Southern Hemisphere impacts are neutral. The experiments described are preliminary in several respects. We expect these data to ultimately prove useful for global data assimilation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buitrago, Jaime; Asfour, Shihab
Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.
NASA Astrophysics Data System (ADS)
Pillosu, F. M.; Jurlina, T.; Baugh, C.; Tsonevsky, I.; Hewson, T.; Prates, F.; Pappenberger, F.; Prudhomme, C.
2017-12-01
During hurricane Harvey, the greater east Texas area was affected by extensive flash floods. Their localised nature meant they were too small for conventional large-scale flood forecasting systems to capture. We are testing the use of two real-time forecast products from the European Centre for Medium-Range Weather Forecasts (ECMWF), in combination with local vulnerability information, to provide flash flood forecasting tools at the medium range (up to 7 days ahead). The meteorological forecasts are the total precipitation extreme forecast index (EFI), a measure of how the ensemble forecast probability distribution differs from the model-climate distribution for the chosen location, time of year and forecast lead time; and the shift of tails (SOT), which complements the EFI by quantifying how extreme an event could potentially be. Both products give the likelihood of flash-flood-generating precipitation. For hurricane Harvey, 3-day EFI and SOT products for the period 26th-29th August 2017 were used, generated from the twice-daily, 18 km, 51-member ECMWF Integrated Forecast System ensemble. After regridding to 1 km resolution, the forecasts were combined with vulnerable-area data to produce a flash flood hazard risk area. The vulnerability data were floodplains (EU Joint Research Centre), road networks (Texas Department of Transport) and urban areas (Census Bureau geographic database), together reflecting the landscape's susceptibility to flash floods. The flash flood hazard risk area forecasts were verified using a traditional approach against observed National Weather Service flash flood reports; a total of 153 flash floods were reported in that period. Forecasts performed best for SOT = 5 (hit ratio = 65%, false alarm ratio = 44%) and EFI = 0.7 (hit ratio = 74%, false alarm ratio = 45%) at 72 h lead time.
By including the vulnerable areas data, our verification results improved by 5-15%, demonstrating the value of vulnerability information within natural hazard forecasts. This research shows that flash flooding from hurricane Harvey was predictable up to 4 days ahead and that filtering the forecasts to vulnerable areas provides a more focused guidance to civil protection agencies planning their emergency response.
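The verification scores quoted above come from a standard contingency table. A minimal helper, with event counts assumed so as to be consistent with the 153 reports and the stated EFI = 0.7 ratios (they are not taken from the study):

```python
def verify(hits, misses, false_alarms):
    """Contingency-table scores: hit ratio (fraction of observed events that
    were forecast) and false alarm ratio (fraction of forecasts that failed)."""
    hit_ratio = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return hit_ratio, far

# Assumed counts: 113 of the 153 reported flash floods forecast (40 missed),
# plus 92 forecast areas with no report, roughly matching the EFI = 0.7 case.
hr, far = verify(hits=113, misses=40, false_alarms=92)
# hr ~ 0.74, far ~ 0.45
```

Note the two scores use different denominators: the hit ratio is conditioned on what was observed, the false alarm ratio on what was forecast, so improving one can easily degrade the other.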
Earthquake focal mechanism forecasting in Italy for PSHA purposes
NASA Astrophysics Data System (ADS)
Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola
2018-01-01
In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming to reduce the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast focal mechanism of the next large earthquakes as input data. The data set used for this purpose comprises focal mechanisms taken from the latest stress map release for Italy, containing 392 well-constrained solutions of events from 1908 to 2015 with Mw ≥ 4 and depths from 0 down to 40 km. The data set uses polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015 it takes into account only the Centroid Moment Tensor (CMT)-like earthquake focal solutions, for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept, which weighs information from past focal mechanisms evenly distributed in space according to their distance from the spatial cells and their magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them against an independent data set consisting of some of the strongest earthquakes (Mw ≥ 3.9) that occurred during 2016 in different Italian tectonic provinces.
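A heavily simplified sketch of the distance- and magnitude-weighted idea (not the paper's actual Total Weighted Moment Tensor computation, which sums full moment tensors): hypothetical past events, each weighted by an exponential distance decay times magnitude, "vote" for a faulting style in a grid cell.

```python
import numpy as np

def style_probabilities(cell_lon, cell_lat, events, decay_km=50.0):
    """Toy focal-mechanism forecast for one grid cell: weight each past
    mechanism by exp(-d/decay) * Mw and normalise per faulting style.
    Events are (lon, lat, Mw, style) tuples; the decay scale is assumed."""
    styles = {"normal": 0.0, "reverse": 0.0, "strike-slip": 0.0}
    for lon, lat, mw, style in events:
        d_km = 111.0 * np.hypot(lon - cell_lon, lat - cell_lat)  # rough degrees->km
        styles[style] += np.exp(-d_km / decay_km) * mw
    total = sum(styles.values())
    return {k: v / total for k, v in styles.items()}

# Hypothetical catalogue: two nearby normal-faulting events, one distant
# strike-slip event (values are illustrative, not from the stress map).
events = [(13.4, 42.4, 6.3, "normal"), (13.5, 42.5, 5.9, "normal"),
          (14.9, 41.1, 5.1, "strike-slip")]
probs = style_probabilities(13.4, 42.4, events)
```

Nearby, larger events dominate the cell's forecast, which is the qualitative behaviour the weighting in the paper is designed to produce.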
Application of the Haines Index in the fire warning system
NASA Astrophysics Data System (ADS)
Kalin, Lovro; Marija, Mokoric; Tomislav, Kozaric
2016-04-01
Croatia, like all Mediterranean countries, is strongly affected by large wildfires, particularly in the coastal region. In the last two decades the number and intensity of fires have increased significantly, which is widely associated with climate change, i.e. global warming. More extreme fires are observed, and the fire-fighting season has expanded into June and September. Meteorological support for fire protection and planning is therefore even more important. At the Meteorological and Hydrological Service of Croatia, a comprehensive monitoring and warning system has been established. It includes standard components, such as the short-term forecast of the Fire Weather Index (FWI), as well as long-range forecasts. However, due to more frequent hot and dry seasons, the FWI often provides little additional information about extremely high fire danger, since it regularly takes the highest values for long periods. Additional tools have therefore been investigated. One widely used meteorological product is the Haines index (HI). It provides information on potential fire growth, taking into account only the vertical instability of the atmosphere and not the state of the fuel. Several analyses and studies carried out at the Service confirmed the correlation of high HI values with large and extreme fires. The Haines index forecast has been used at the Service for several years, employing the European Centre for Medium-Range Weather Forecasts (ECMWF) global prediction model as well as the limited-area Aladin model. Verification results show that these forecasts are reliable when compared to radiosonde measurements. These results supported the introduction of additional fire warnings, which are issued by the Service's Forecast Department.
The potential predictability of fire danger provided by ECMWF forecast
NASA Astrophysics Data System (ADS)
Di Giuseppe, Francesca
2017-04-01
The European Forest Fire Information System (EFFIS) is currently being developed in the framework of the Copernicus Emergency Management Services to monitor and forecast fire danger in Europe. The system provides timely information to civil protection authorities in 38 nations across Europe and mostly concentrates on flagging regions which might be at high danger of spontaneous ignition due to persistent drought. The daily predictions of fire danger conditions are based on the US Forest Service National Fire Danger Rating System (NFDRS), the Canadian Forest Service Fire Weather Index Rating System (FWI) and the Australian McArthur (MARK-5) rating system. Weather forcings are provided in real time by the European Centre for Medium-Range Weather Forecasts (ECMWF) forecasting system. The global system's potential predictability is assessed using re-analysis fields as weather forcings. The Global Fire Emissions Database (GFED4) provides 11 years of observed burned areas from satellite measurements and is used as a validation dataset. The fire indices implemented are good predictors for highlighting dangerous conditions: high values are correlated with observed fires, and low values correspond to the absence of observed events. A more quantitative skill evaluation was performed using the Extremal Dependency Index, a skill score specifically designed for rare events. It revealed that the three indices were more skilful than a random forecast at detecting large fires on a global scale. The performance peaks in the boreal forests, the Mediterranean, the Amazon rain forests and southeast Asia. The skill scores were then aggregated at country level to reveal which nations could potentially benefit from the system's information in aid of decision making and fire control support. Overall, we found that fire danger modelling based on weather forecasts can provide reasonable predictability over large parts of the global landmass.
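The Extremal Dependency Index mentioned above has a closed form in terms of the hit rate H and false alarm rate F (Ferro and Stephenson); the input values below are illustrative, not the study's results.

```python
import math

def edi(hit_rate, false_alarm_rate):
    """Extremal Dependency Index, a skill score designed for rare events.
    H = hits/(hits+misses), F = false alarms/(false alarms+correct negatives).
    EDI = (ln F - ln H) / (ln F + ln H); positive means better than random."""
    return ((math.log(false_alarm_rate) - math.log(hit_rate)) /
            (math.log(false_alarm_rate) + math.log(hit_rate)))

# A forecast that detects rare large fires better than chance (H > F)
# scores above zero; a no-skill forecast with H = F scores exactly zero.
score = edi(hit_rate=0.6, false_alarm_rate=0.1)   # ~ 0.64
random = edi(hit_rate=0.1, false_alarm_rate=0.1)  # = 0.0
```

Unlike the hit rate alone, the EDI does not degenerate as the event base rate shrinks, which is why it suits verification of rare events such as large fires.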
Large Scale Traffic Simulations
DOT National Transportation Integrated Search
1997-01-01
Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computation speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between t...
Observation Impact over the Antarctic During the Concordiasi Field Campaign
NASA Technical Reports Server (NTRS)
Boullot, Nathalie; Rabier, Florence; Langland, Rolf; Gelaro, Ron; Cardinali, Carla; Guidard, Vincent; Bauer, Peter; Doerenbecher, Alexis
2014-01-01
The impact of observations on analysis uncertainty and forecast performance was investigated for Austral spring 2010 over the southern polar area for four different systems (NRL, GMAO, ECMWF and Meteo-France), at the time of the Concordiasi field experiment. The largest multi-model variance in 500 hPa height analyses is found in the southern sub-Antarctic oceanic region, where there are strong atmospheric dynamics, rapid forecast error growth, and fewer upper-air wind observations to constrain the analyses. In terms of data impact, the most important observation components are shown to be AMSU, IASI, AIRS, GPS-RO, radiosonde, surface and atmospheric motion vector observations. For sounding data (radiosondes and dropsondes), one can note a large impact of temperature at low levels and of wind at high levels. Observing system experiments using the Concordiasi dropsondes show a large impact of the observations over the Antarctic plateau, extending to lower latitudes with the forecast range, with a large impact around 50 to 70 deg South. These experiments indicate a potential benefit of better using radiance data over land and sea ice, and innovative atmospheric motion vectors obtained from a combination of various satellites, to fill the current data gaps and improve NWP in this region.
Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment
NASA Astrophysics Data System (ADS)
Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection
2011-12-01
Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. 
(c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.
Parsons, Thomas E.; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.
2014-01-01
We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.
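The stress-transfer calculations in methods (1) and (2) rest on the standard Coulomb failure stress change. A minimal sketch with hypothetical resolved stress values; the study's calculations additionally require fault geometry and a slip model to resolve the stress tensor onto each receiver fault.

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault:
    dCFS = d_tau + mu' * d_sigma_n, with shear stress positive in the slip
    direction and normal stress positive when unclamping the fault.
    Positive dCFS moves the fault toward failure. mu' is the effective
    friction coefficient (0.4 is a common assumed value)."""
    return d_shear + mu_eff * d_normal

# Hypothetical stress changes (MPa) resolved on a nearby receiver fault:
dcfs = coulomb_stress_change(d_shear=0.05, d_normal=0.02)  # 0.058 MPa
```

Even small positive values (a few hundredths of a MPa) are conventionally taken as potentially significant for triggering, which is why mapped dCFS above such a threshold flags faults for heightened seismicity.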
NASA Astrophysics Data System (ADS)
Parsons, Tom; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.
2014-12-01
We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.
Current and future data assimilation development in the Copernicus Atmosphere Monitoring Service
NASA Astrophysics Data System (ADS)
Engelen, R. J.; Ades, M.; Agusti-panareda, A.; Flemming, J.; Inness, A.; Kipling, Z.; Parrington, M.; Peuch, V. H.
2017-12-01
The European Copernicus Atmosphere Monitoring Service (CAMS) operationally provides daily forecasts of global atmospheric composition and regional air quality. The global forecasting system uses ECMWF's Integrated Forecasting System (IFS), which is used for numerical weather prediction and has been extended with modules for atmospheric chemistry, aerosols and greenhouse gases. The system assimilates observations from more than 60 satellite sensors to constrain both the meteorology and the atmospheric composition species. While an operational forecasting system needs to be robust and reliable, it also needs to stay state-of-the-art to provide the best possible forecasts. Continuous development is therefore an important component of the CAMS systems. We will present ongoing efforts to improve the 4D-Var data assimilation system, such as using ensemble data assimilation to improve the background error covariances and more accurate use of satellite observations. We will also outline plans for including emissions in the daily CAMS analyses, which is an area where research activities have a large potential to feed into operational applications.
NASA Astrophysics Data System (ADS)
Judt, Falko; Chen, Shuyi S.
2015-07-01
Hurricane surface wind is a key measure of storm intensity. However, a climatology of hurricane winds is lacking to date, largely because hurricanes are relatively rare events and difficult to observe over the open ocean. Here we present a new hurricane wind climatology based on objective surface wind analyses, which are derived from Stepped Frequency Microwave Radiometer measurements acquired by NOAA WP-3D and U.S. Air Force WC-130J hurricane hunter aircraft. The wind data were collected during 72 aircraft reconnaissance missions into 21 western Atlantic hurricanes from 1998 to 2012. This climatology provides an opportunity to validate hurricane intensity forecasts beyond the simplistic maximum wind speed metric and allows evaluating the predictive skill of probabilistic hurricane intensity forecasts using high-resolution model ensembles. An example of application is presented here using a 1.3 km grid spacing Weather Research and Forecasting model ensemble forecast of Hurricane Earl (2010).
Image-based optimization of coronal magnetic field models for improved space weather forecasting
NASA Astrophysics Data System (ADS)
Uritsky, V. M.; Davila, J. M.; Jones, S. I.; MacNeice, P. J.
2017-12-01
The existing space weather forecasting frameworks show a significant dependence on the accuracy of the photospheric magnetograms and the extrapolation models used to reconstruct the magnetic field in the solar corona. Minor uncertainties in the magnetic field magnitude and direction near the Sun, when propagated through the heliosphere, can lead to unacceptable prediction errors at 1 AU. We argue that ground-based and satellite coronagraph images can provide valid geometric constraints that could be used to improve coronal magnetic field extrapolation results, enabling more reliable forecasts of extreme space weather events such as major CMEs. In contrast to the previously developed loop segmentation codes designed for detecting compact closed-field structures above solar active regions, we focus on the large-scale geometry of the open-field coronal regions up to 1-2 solar radii above the photosphere. By applying the developed image processing techniques to high-resolution Mauna Loa Solar Observatory images, we perform an optimized 3D B-line tracing for a full Carrington rotation using the magnetic field extrapolation code developed by S. Jones et al. (ApJ 2016, 2017). Our tracing results are in good qualitative agreement with the large-scale configuration of the optical corona and lead to a more consistent reconstruction of the large-scale coronal magnetic field geometry, and potentially to more accurate global heliospheric simulation results. Several upcoming data products for the space weather forecasting community will also be discussed.
Skill of Global Raw and Postprocessed Ensemble Predictions of Rainfall over Northern Tropical Africa
NASA Astrophysics Data System (ADS)
Vogel, Peter; Knippertz, Peter; Fink, Andreas H.; Schlueter, Andreas; Gneiting, Tilmann
2018-04-01
Accumulated precipitation forecasts are of high socioeconomic importance for agriculturally dominated societies in northern tropical Africa. In this study, we analyze the performance of nine operational global ensemble prediction systems (EPSs) relative to climatology-based forecasts for 1- to 5-day accumulated precipitation, based on the monsoon seasons 2007-2014, for three regions within northern tropical Africa. To assess the full potential of raw ensemble forecasts across spatial scales, we apply state-of-the-art statistical postprocessing methods in the form of Bayesian Model Averaging (BMA) and Ensemble Model Output Statistics (EMOS), and verify against station and spatially aggregated, satellite-based gridded observations. Raw ensemble forecasts are uncalibrated, unreliable, and underperform relative to climatology, independently of region, accumulation time, monsoon season, and ensemble. Differences between raw ensemble and climatological forecasts are large, and partly stem from poor prediction of low precipitation amounts. BMA- and EMOS-postprocessed forecasts are calibrated, reliable, and strongly improve on the raw ensembles, but, somewhat disappointingly, typically do not outperform climatology. Most EPSs exhibit slight improvements over the period 2007-2014, but overall have little added value compared to climatology. We suspect that the parametrization of convection is a potential cause of the sobering lack of ensemble forecast skill in a region dominated by mesoscale convective systems.
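A minimal sketch of the Gaussian variant of EMOS, together with the continuous ranked probability score (CRPS) commonly used to evaluate such forecasts. The coefficient values and the toy ensemble are assumed; in practice a, b, c, d are fit by minimum-CRPS estimation on training data, and precipitation EMOS typically uses a censored or shifted distribution rather than a plain Gaussian.

```python
import math

def emos_predictive(ens, a, b, c, d):
    """Gaussian EMOS: the predictive mean is an affine function of the
    ensemble mean, the predictive variance an affine function of the
    ensemble variance. Returns (mu, sigma)."""
    m = sum(ens) / len(ens)
    v = sum((x - m) ** 2 for x in ens) / len(ens)
    return a + b * m, math.sqrt(c + d * v)

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a Gaussian predictive distribution N(mu, sigma^2)
    at observation y (lower is better)."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# Toy 4-member ensemble with assumed, pre-fit coefficients:
mu, sigma = emos_predictive([2.0, 5.0, 3.0, 6.0], a=0.5, b=0.9, c=1.0, d=0.8)
score = crps_gaussian(mu, sigma, y=4.0)
```

Averaging `crps_gaussian` over many forecast-observation pairs, and comparing against the same average for a climatological distribution, is exactly the kind of skill comparison the study reports.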
Siedlecki, Samantha A.; Kaplan, Isaac C.; Hermann, Albert J.; Nguyen, Thanh Tam; Bond, Nicholas A.; Newton, Jan A.; Williams, Gregory D.; Peterson, William T.; Alin, Simone R.; Feely, Richard A.
2016-01-01
Resource managers at the state, federal, and tribal levels make decisions on a weekly to quarterly basis, and fishers operate on a similar timeframe. To determine the potential of a support tool for these efforts, a seasonal forecast system is experimented with here. JISAO’s Seasonal Coastal Ocean Prediction of the Ecosystem (J-SCOPE) features dynamical downscaling of regional ocean conditions in Washington and Oregon waters using a combination of a high-resolution regional model with biogeochemistry and forecasts from NOAA’s Climate Forecast System (CFS). Model performance and predictability were examined for sea surface temperature (SST), bottom temperature, bottom oxygen, pH, and aragonite saturation state through model hindcasts, reforecast, and forecast comparisons with observations. Results indicate J-SCOPE forecasts have measurable skill on seasonal timescales. Experiments suggest that seasonal forecasting of ocean conditions important for fisheries is possible with the right combination of components. Those components include regional predictability on seasonal timescales of the physical environment from a large-scale model, a high-resolution regional model with biogeochemistry that simulates seasonal conditions in hindcasts, a relationship with local stakeholders, and a real-time observational network. Multiple efforts and approaches in different regions would advance knowledge to provide additional tools to fishers and other stakeholders. PMID:27273473
Single Turnover at Molecular Polymerization Catalysts Reveals Spatiotemporally Resolved Reactions.
Easter, Quinn T; Blum, Suzanne A
2017-10-23
Multiple active individual molecular ruthenium catalysts have been pinpointed within growing polynorbornene, thereby revealing information on the reaction dynamics and location that is unavailable through traditional ensemble experiments. This is the first single-turnover imaging of a molecular catalyst by fluorescence microscopy and allows detection of individual monomer reactions at an industrially important molecular ruthenium ring-opening metathesis polymerization (ROMP) catalyst under synthetically relevant conditions (e.g. unmodified industrial catalyst, ambient pressure, condensed phase, ca. 0.03 m monomer). These results further establish the key fundamentals of this imaging technique for characterizing the reactivity and location of active molecular catalysts even when they are the minor components. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Optical/thermal analysis methodology for a space-qualifiable RTP furnace
NASA Technical Reports Server (NTRS)
Bugby, D.; Dardarian, S.; Cole, E.
1993-01-01
A methodology to predict the coupled optical/thermal performance of a reflective cavity heating system was developed, and a laboratory test to verify the method was carried out. The procedure was used to design a rapid thermal processing (RTP) furnace for the Robot-Operated Material Processing in Space (ROMPS) Program, a planned STS HH-G canister experiment involving robotics and material processing in microgravity. The laboratory test employed a tungsten-halogen reflector/lamp to heat thin, p-type silicon wafers. Measurement instrumentation consisted of 5-mil Pt/Pt-Rh thermocouples and an optical pyrometer. The predicted results, obtained with an optical ray-tracing program and a lumped-capacitance thermal analyzer, showed good agreement with the measured data for temperatures exceeding 1300 C.
Can Regional Climate Models Improve Warm Season Forecasts in the North American Monsoon Region?
NASA Astrophysics Data System (ADS)
Dominguez, F.; Castro, C. L.
2009-12-01
The goal of this work is to improve warm season forecasts in the North American monsoon region. To do this, we are dynamically downscaling warm season CFS (Climate Forecast System) reforecasts from 1982-2005 for the contiguous U.S. using the Weather Research and Forecasting (WRF) regional climate model. CFS is the global coupled ocean-atmosphere model used by the Climate Prediction Center (CPC), a branch of the National Centers for Environmental Prediction (NCEP), to provide official U.S. seasonal climate forecasts. Recently, NCEP produced comprehensive long-term retrospective ensemble CFS reforecasts for the years 1980-2005. These reforecasts show that the CFS model 1) has an ability to forecast tropical Pacific SSTs and large-scale teleconnection patterns, at least as evaluated for the winter season; 2) has greater skill in forecasting winter than summer climate; and 3) demonstrates an increase in skill when a greater number of ensemble members is used. The decrease in CFS skill during the warm season is due to the fact that the physical mechanisms of rainfall at this time are more related to mesoscale processes, such as the diurnal cycle of convection, low-level moisture transport, propagation and organization of convection, and surface moisture recycling. In general, these are poorly represented in global atmospheric models. Preliminary simulations for years with extreme summer climate conditions in the western and central U.S. (specifically 1988 and 1993) show that CFS-WRF simulations can provide a more realistic representation of convective rainfall processes.
Thus an RCM can potentially add significant value to climate forecasting of the warm season, provided the downscaling methodology incorporates the following: 1) spectral nudging to preserve the variability in the large-scale circulation while still permitting the development of smaller-scale variability in the RCM; and 2) use of realistic soil moisture initial conditions, in this case provided by the North American Regional Reanalysis. With these conditions, downscaled CFS-WRF reforecast simulations can produce realistic continental-scale patterns of warm season precipitation. This includes a reasonable representation of the North American monsoon in the southwest U.S. and northwest Mexico, which is notoriously difficult to represent in a global atmospheric model. We anticipate that this research will help lead the way toward substantially improved real-time operational forecasts of North American summer climate with an RCM.
NASA Astrophysics Data System (ADS)
Kourafalou, V.; Kang, H.; Perlin, N.; Le Henaff, M.; Lamkin, J. T.
2016-02-01
Connectivity around the South Florida coastal regions and between South Florida and Cuba is largely influenced by a) local coastal processes and b) circulation in the Florida Straits, which is controlled by the larger scale Florida Current variability. Prediction of this physical connectivity is a necessary component of several activities that require ocean forecasts, such as oil spill response, fisheries research, and search and rescue. This requires a predictive system that can accommodate the intense coastal-to-offshore interactions and the linkages to the complex regional circulation. The Florida Straits, South Florida and Florida Keys Hybrid Coordinate Ocean Model is such a regional ocean predictive system, covering a large area over the Florida Straits and the adjacent land areas and representing both coastal and oceanic processes. The real-time ocean forecast system is high resolution (approximately 900 m) and embedded in larger scale predictive models. It includes detailed coastal bathymetry and high-resolution, high-frequency atmospheric forcing, and it provides 7-day forecasts, updated daily (see: http://coastalmodeling.rsmas.miami.edu/). The unprecedented high resolution and coastal detail of this system add value to global forecasts through downscaling and allow a variety of applications. Examples will be presented, focusing on the period of a 2015 fisheries cruise around the coastal areas of Cuba, where model predictions helped guide measurements of biophysical connectivity under intense variability of the mesoscale eddy field and subsequent Florida Current meandering.
NASA Astrophysics Data System (ADS)
Wood, Andy; Clark, Elizabeth; Mendoza, Pablo; Nijssen, Bart; Newman, Andy; Clark, Martyn; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
Many if not most national operational streamflow prediction systems rely on a forecaster-in-the-loop approach that requires the hands-on effort of an experienced human forecaster. This approach evolved from the need to correct for long-standing deficiencies in the models and datasets used in forecasting, and the practice often leads to skillful flow predictions despite the use of relatively simple, conceptual models. Yet the 'in-the-loop' forecast process is not reproducible, which limits opportunities to assess and incorporate new techniques systematically, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecasts and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, many national operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as such systems are beginning to be deployed operationally in centers such as ECMWF. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike.
To address this need, the US National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis Research and Prediction Applications' (SHARP) to implement, assess and demonstrate real-time over-the-loop ensemble flow forecasts in a range of US watersheds. The system relies on fully ensemble-based techniques, including: a 100-member ensemble of meteorological model forcings and an ensemble particle filter data assimilation scheme for initializing watershed states; analog/regression-based downscaling of ensemble weather forecasts from GEFS; and statistical post-processing of ensemble forecast outputs, all of which run in real time within a workflow managed by ECMWF's ecFlow libraries over large US regional domains. We describe SHARP and present early hindcast and verification results for short- to seasonal-range streamflow forecasts in a number of US case study watersheds.
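The analog downscaling step mentioned in this abstract can be illustrated with a toy version: for a coarse-model forecast value, find the k most similar coarse values in a historical archive and average the station observations that accompanied them. This is a generic sketch with synthetic data; the function name, the linear toy relationship, and the choice k=3 are illustrative assumptions, not SHARP's actual configuration:

```python
def analog_downscale(coarse_forecast, archive, k=3):
    """archive: list of (coarse_value, station_value) historical pairs.
    Returns the mean station value over the k nearest coarse analogs."""
    nearest = sorted(archive, key=lambda pair: abs(pair[0] - coarse_forecast))[:k]
    return sum(station for _, station in nearest) / k

# Synthetic archive in which the station value is exactly twice the coarse value,
# so a good analog estimate for a coarse forecast of 5.05 should land near 10.1.
archive = [(c / 10.0, 2.0 * c / 10.0) for c in range(0, 101)]
estimate = analog_downscale(5.05, archive)
```

Real analog schemes match multivariate patterns (and often regress on the analogs rather than averaging them), but the nearest-neighbor lookup above is the core of the idea.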
Rubin, D.M.
1992-01-01
Forecasting of one-dimensional time series has previously been used to help distinguish periodicity, chaos, and noise. This paper presents two-dimensional generalizations for making such distinctions for spatial patterns. The techniques are evaluated using synthetic spatial patterns and then are applied to a natural example: ripples formed in sand by blowing wind. Tests with the synthetic patterns demonstrate that the forecasting techniques can be applied to two-dimensional spatial patterns, with the same utility and limitations as when applied to one-dimensional time series. One limitation is that some combinations of periodicity and randomness exhibit forecasting signatures that mimic those of chaos. For example, sine waves distorted with correlated phase noise have forecasting errors that increase with forecasting distance, errors that are minimized using nonlinear models at moderate embedding dimensions, and forecasting properties that differ significantly between the original and surrogates. Ripples formed in sand by flowing air or water typically vary in geometry from one to another, even when formed in a flow that is uniform on a large scale; each ripple modifies the local flow or sand-transport field, thereby influencing the geometry of the next ripple downcurrent. Spatial forecasting was used to evaluate the hypothesis that such a deterministic process - rather than randomness or quasiperiodicity - is responsible for the variation between successive ripples. This hypothesis is supported by a forecasting error that increases with forecasting distance, a greater accuracy of nonlinear relative to linear models, and significant differences between forecasts made with the original ripples and those made with surrogate patterns.
Forecasting signatures cannot be used to distinguish ripple geometry from sine waves with correlated phase noise, but this kind of structure can be ruled out by two geometric properties of the ripples: successive ripples are highly correlated in wavelength, and ripple crests display dislocations such as branchings and mergers. © 1992 American Institute of Physics.
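The forecasting signature used in this abstract (prediction error that grows with forecasting distance for deterministic dynamics) is easy to reproduce in the one-dimensional setting with a nearest-neighbor analog forecaster. This is a generic sketch of the technique, not the author's code; the logistic map stands in for a chaotic signal, and all names are illustrative:

```python
def nn_forecast(series, t, horizon, embed=2):
    """Predict series[t + horizon] from the nearest delay-embedded analog
    found strictly in the past of t."""
    target = series[t - embed + 1 : t + 1]
    best, best_d = None, float("inf")
    for s in range(embed - 1, t - horizon):
        d = sum((series[s - embed + 1 + i] - target[i]) ** 2 for i in range(embed))
        if d < best_d:
            best, best_d = s, d
    return series[best + horizon]

# Chaotic test signal: the fully chaotic logistic map x -> 4x(1-x).
x = [0.3141]
for _ in range(1999):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))

def rms_error(h):
    """Root-mean-square forecast error at horizon h over 100 test points."""
    errs = [(nn_forecast(x, t, h) - x[t + h]) ** 2 for t in range(1500, 1600)]
    return (sum(errs) / len(errs)) ** 0.5

# For chaos, error grows with forecasting distance; for white noise it would
# stay flat at the level of the signal's variance.
short, long_ = rms_error(1), rms_error(8)
```

The same analog logic extends to the paper's two-dimensional case by embedding spatial neighborhoods instead of time-delay vectors.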
Sub-seasonal predictability of water scarcity at global and local scale
NASA Astrophysics Data System (ADS)
Wanders, N.; Wada, Y.; Wood, E. F.
2016-12-01
Forecasting the water demand and availability for agriculture and energy production has been neglected in previous research, partly because most large-scale hydrological models lack the skill to forecast human water demands at the sub-seasonal time scale. We study the potential of a sub-seasonal water scarcity forecasting system for improved water management decision making and improved estimates of water demand and availability. We have generated 32 years of global sub-seasonal multi-model water availability, demand and scarcity forecasts. The quality of the forecasts is compared to a reference forecast derived from resampling historic weather observations. The newly developed system has been evaluated both at the global scale and in a real-time local application in the Sacramento Valley for the Trinity, Shasta and Oroville reservoirs, where the water demand for agriculture and hydropower is high. On the global scale we find that the reference forecast shows high initial forecast skill (up to 8 months) for water scarcity in the eastern US, Central Asia and Sub-Saharan Africa. Adding dynamical sub-seasonal forecasts results in a clear improvement for most regions in the world, increasing the forecasts' lead time by 2 or more months on average. The strongest improvements are found in the US, Brazil, Central Asia and Australia. For the Sacramento Valley we can accurately predict anomalies in the reservoir inflow, hydropower potential and the downstream irrigation water demand 6 months in advance. This allows us to forecast potential water scarcity in the Sacramento Valley and adjust reservoir management to prevent deficits in energy or irrigation water availability. The newly developed forecast system shows that it is possible to reduce vulnerability to upcoming water scarcity events and allows optimization of the distribution of the available water between the agricultural and energy sectors half a year in advance.
Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou
2007-01-01
Background: Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings: In this longitudinal retrospective (01/1996-06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%.
Conclusions/Significance: The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322
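The multiplicative Holt-Winters method used in this study has a compact recursive form: a level, an additive trend, and multiplicative seasonal factors, each updated by exponential smoothing. A minimal sketch in plain Python, run on a synthetic quarterly series rather than the Niono data; the smoothing constants and initialization are the textbook choices, assumed for illustration:

```python
def holt_winters_mul(y, m, alpha=0.3, beta=0.1, gamma=0.1):
    """Multiplicative-seasonal Holt-Winters. m is the season length.
    Returns one-step-ahead forecasts aligned with y[m:]."""
    level = sum(y[:m]) / m                               # initial level
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)     # initial trend
    season = [y[i] / level for i in range(m)]            # initial seasonal factors
    forecasts = []
    for t in range(m, len(y)):
        s = t % m
        forecasts.append((level + trend) * season[s])    # forecast before seeing y[t]
        prev_level = level
        level = alpha * y[t] / season[s] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[s] = gamma * y[t] / level + (1 - gamma) * season[s]
    return forecasts

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy measure used in the study."""
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Deterministic trend-times-seasonality series with period 4.
seas = [0.8, 1.1, 0.9, 1.2]
y = [(10.0 + 0.5 * t) * seas[t % 4] for t in range(60)]
fc = holt_winters_mul(y, 4)
err = mape(y[4:], fc)  # small on this clean series; real series carry noise
```

The multiplicative seasonal factors are what let the method track the large inter-annual and seasonal fluctuations the abstract describes without any disease-specific tailoring.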
NASA Astrophysics Data System (ADS)
Chen, L. C.; Mo, K. C.; Zhang, Q.; Huang, J.
2014-12-01
Drought prediction on monthly to seasonal time scales is of critical importance to disaster mitigation, agricultural planning, and multi-purpose reservoir management. Starting in December 2012, the NOAA Climate Prediction Center (CPC) has been providing operational Standardized Precipitation Index (SPI) Outlooks using the North American Multi-Model Ensemble (NMME) forecasts, to support CPC's monthly drought outlooks and briefing activities. The current NMME system consists of six model forecasts from U.S. and Canadian modeling centers, including the CFSv2, CM2.1, GEOS-5, CCSM3.0, CanCM3, and CanCM4 models. In this study, we assess the predictive skill of meteorological drought using real-time NMME forecasts for the period from May 2012 to May 2014. The ensemble SPI forecasts are the equally weighted mean of the six model forecasts. Two performance measures, the anomaly correlation coefficient and the root-mean-square error against observations, are used to evaluate forecast skill. Similar to the assessment based on NMME retrospective forecasts, predictive skill of monthly-mean precipitation (P) forecasts is generally low after the second month, and errors vary among models. Although P forecast skill is not large, SPI predictive skill is high and the differences among models are small. The skill mainly comes from the P observations appended to the model forecasts. This factor also contributes to the similarity of SPI prediction among the six models. Still, NMME SPI ensemble forecasts have higher skill than those based on individual models or persistence, and the 6-month SPI forecasts are skillful out to four months. The three major drought events that occurred during the 2012-2014 period (the 2012 Central Great Plains drought, the 2013 Upper Midwest flash drought, and the 2013-2014 California drought) are used as examples to illustrate the system's strengths and limitations.
For precipitation-driven drought events, such as the 2012 Central Great Plains drought, NMME SPI forecasts perform well in predicting drought severity and spatial patterns. For fast-developing drought events, such as the 2013 Upper Midwest flash drought, the system failed to capture the onset of the drought.
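The SPI central to this system is conventionally computed by fitting a gamma distribution to accumulated precipitation and mapping the fitted CDF onto standard normal quantiles. A distribution-free sketch using Hazen plotting positions instead of a gamma fit conveys the idea with only the standard library; the toy climatology and function name are illustrative, not CPC's implementation:

```python
from statistics import NormalDist

def spi_empirical(sample, value):
    """Map a precipitation accumulation to an SPI-like z-score via the
    empirical CDF of the climatological sample (Hazen plotting position).
    Operational SPI fits a gamma distribution instead; this is a sketch,
    and `value` is assumed to lie within the range of `sample`."""
    n = len(sample)
    rank = sum(1 for s in sample if s <= value)  # 1-based rank of value
    p = (rank - 0.5) / n                         # Hazen plotting position
    return NormalDist().inv_cdf(p)               # standard normal quantile

# Toy climatology of monthly accumulations (mm): the median maps to SPI ~ 0,
# dry extremes to negative SPI, wet extremes to positive SPI.
clim = [12, 30, 45, 55, 60, 71, 88, 110, 150]
spi_median = spi_empirical(clim, 60)
spi_dry = spi_empirical(clim, 12)
spi_wet = spi_empirical(clim, 150)
```

Because SPI is defined on this standardized scale, values near -1 or below flag drought regardless of how wet the local climate is, which is what makes multi-model SPI forecasts comparable across regions.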
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spririg, Christoph; Jordan, Fred; Zappa, Massimiliano
2015-04-01
In recent years, large progress has been achieved in the operational prediction of floods and hydrological droughts with up to ten days of lead time. Both the public and the private sectors currently use probabilistic runoff forecasts to monitor water resources and take action when critical conditions are to be expected. The use of extended-range predictions with lead times exceeding 10 days is not yet established. The hydropower sector in particular might benefit greatly from using hydrometeorological forecasts for the next 15 to 60 days to optimize the operations and the revenues from its watersheds, dams, water intakes, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) target boosting research related to energy issues in Switzerland. The objective of HEPS4POWER is to demonstrate that operational extended-range hydrometeorological forecasts have the potential to become very valuable tools for fine-tuning the production of energy from hydropower systems. The project team covers a specific system-oriented value chain starting from the collection and forecasting of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL) and terminating with experience in data presentation and power production forecasts for end-users (e-dric.ch). The first task of HEPS4POWER will be the downscaling and post-processing of extended-range ensemble meteorological forecasts (EPS). The goal is to provide well-tailored probabilistic forecasts that are statistically reliable and localized at the catchment or even station level. The hydrology-related task will consist of feeding the post-processed meteorological forecasts into a HEPS using a multi-model approach, implementing models of different complexity.
Post-processing techniques also need to be tested for the hydrological ensemble predictions in order to improve the quality of the forecasts against observed discharge. The analysis should be specifically oriented toward the maximisation of hydroelectricity production; thus, verification metrics should include economic measures like cost-loss approaches. The final step will include the transfer of the HEPS system to several hydropower systems, the connection with energy market prices, and the development of probabilistic multi-reservoir production and management optimization guidelines. The baseline model chain yielding three-day forecasts, established for a hydropower system in southern Switzerland, will be presented alongside the work plan to achieve seasonal ensemble predictions.
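The cost-loss verification mentioned in this abstract compares the expected expense of acting on forecasts with the expense of always or never protecting and with that of perfect information. A schematic implementation of the standard relative economic value score; the contingency-table convention and example numbers are illustrative assumptions, not project results:

```python
def relative_value(hits, misses, false_alarms, correct_neg, cost, loss):
    """Relative economic value V = (E_clim - E_fcst) / (E_clim - E_perfect).
    Taking protective action costs `cost`; an unprotected event costs `loss`."""
    n = hits + misses + false_alarms + correct_neg
    s = (hits + misses) / n              # climatological event frequency
    e_clim = min(cost, s * loss)         # cheaper of always/never protecting
    e_perfect = s * cost                 # protect exactly when needed
    e_fcst = ((hits + false_alarms) * cost + misses * loss) / n
    return (e_clim - e_fcst) / (e_clim - e_perfect)

# A perfect forecast recovers V = 1; misses and false alarms pull V below 1
# (V can go negative when using the forecast is worse than climatology).
v_perfect = relative_value(30, 0, 0, 70, cost=1.0, loss=5.0)
v_real = relative_value(24, 6, 10, 60, cost=1.0, loss=5.0)
```

In practice V is plotted as a curve over the cost/loss ratio, since different users of the same forecast face different ratios.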
What might we learn from climate forecasts?
Smith, Leonard A.
2002-01-01
Most climate models are large dynamical systems involving a million (or more) variables on big computers. Given that they are nonlinear and not perfect, what can we expect to learn from them about the earth's climate? How can we determine which aspects of their output might be useful and which are noise? And how should we distribute resources between making them “better,” estimating variables of true social and economic interest, and quantifying how good they are at the moment? Just as “chaos” prevents accurate weather forecasts, so model error precludes accurate forecasts of the distributions that define climate, yielding uncertainty of the second kind. Can we estimate the uncertainty in our uncertainty estimates? These questions are discussed. Ultimately, all uncertainty is quantified within a given modeling paradigm; our forecasts need never reflect the uncertainty in a physical system. PMID:11875200
NSF's Perspective on Space Weather Research for Building Forecasting Capabilities
NASA Astrophysics Data System (ADS)
Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.
2017-12-01
Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. Maturation of the knowledge base is a prerequisite for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advances in understanding in large-scale physics-based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned space weather research programs at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.
NASA Astrophysics Data System (ADS)
Barodka, Siarhei; Kliutko, Yauhenia; Krasouski, Alexander; Papko, Iryna; Svetashev, Alexander; Turishev, Leonid
2013-04-01
Numerical simulation of thundercloud formation processes is currently of great practical interest. Thunderclouds significantly affect airplane flights, and mesoscale weather forecasting has much to contribute to aviation forecast procedures. An accurate forecast can certainly help to avoid weather-related aviation accidents. The present study focuses on modelling convective cloud development and thundercloud detection on the basis of mesoscale atmospheric process simulation, with the aim of significantly improving aeronautical forecasts. In the analysis, primary weather radar information has been used and adapted for mesoscale forecast systems. Two types of domains have been selected for modelling: an internal one (with a radius of 8 km) and an external one (with a radius of 300 km). The internal domain has been applied directly to study local cloud development, and the external domain data have been treated as initial and final conditions for cloud cover formation. The domain height has been chosen according to civil aviation forecast data (i.e., not exceeding 14 km). Simulations of weather conditions and local cloud development have been made within the selected domains with the WRF modelling system. In several cases, thunderclouds are detected within the convective clouds. To identify this category of clouds, we employ a simulation technique for solid-phase formation processes in the atmosphere. Based on the modelling results, we construct vertical profiles indicating the amount of solid phase in the atmosphere. Furthermore, we obtain profiles showing the amounts of ice particles and large particles (hailstones). While simulating solid-phase formation processes, we investigate vertical and horizontal air flows. Consequently, we attempt to separate the total amount of solid phase into categories of small ice particles, large ice particles and hailstones.
Also, we strive to reveal and differentiate the basic atmospheric parameters of sublimation and coagulation processes, aiming to predict ice particle precipitation. To analyze the modelling results we apply the VAPOR three-dimensional visualization package. For the chosen domains, a diurnal synoptic situation has been simulated, including rain, sleet, ice pellets, and hail. As a result, we have obtained a large body of data describing various atmospheric parameters: cloud cover, major wind components, basic levels of isobaric surfaces, and precipitation rate. Based on these data, we show both the distinction in precipitation formation at various heights and its differentiation by ice particle type. The relation between a particle's rise in the atmosphere and its size is analyzed: at 8-10 km altitude, large ice particles resulting from coagulation dominate, while at 6-7 km altitude one finds snow and small ice particles formed by condensational growth. Also, mechanical trajectories of solid precipitation particles for various ice formation processes have been calculated.
Transforming community access to space science models
NASA Astrophysics Data System (ADS)
MacNeice, Peter; Hesse, Michael; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz; Berrios, David; Pulkkinen, Antti
2012-04-01
Researching and forecasting the ever changing space environment (often referred to as space weather) and its influence on humans and their activities are model-intensive disciplines. This is true because the physical processes involved are complex, but, in contrast to terrestrial weather, the supporting observations are typically sparse. Models play a vital role in establishing a physically meaningful context for interpreting limited observations, testing theory, and producing both nowcasts and forecasts. For example, with accurate forecasting of hazardous space weather conditions, spacecraft operators can place sensitive systems in safe modes, and power utilities can protect critical network components from damage caused by large currents induced in transmission lines by geomagnetic storms.
Transforming Community Access to Space Science Models
NASA Technical Reports Server (NTRS)
MacNeice, Peter; Hesse, Michael; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz; Berrios, David; Pulkkinen, Antti
2012-01-01
Researching and forecasting the ever changing space environment (often referred to as space weather) and its influence on humans and their activities are model-intensive disciplines. This is true because the physical processes involved are complex, but, in contrast to terrestrial weather, the supporting observations are typically sparse. Models play a vital role in establishing a physically meaningful context for interpreting limited observations, testing theory, and producing both nowcasts and forecasts. For example, with accurate forecasting of hazardous space weather conditions, spacecraft operators can place sensitive systems in safe modes, and power utilities can protect critical network components from damage caused by large currents induced in transmission lines by geomagnetic storms.
Does the OVX matter for volatility forecasting? Evidence from the crude oil market
NASA Astrophysics Data System (ADS)
Lv, Wendai
2018-02-01
In this paper, I investigate whether the OVX and its truncated parts above a certain threshold can significantly help in forecasting oil futures price volatility, based on the Heterogeneous Autoregressive model of Realized Volatility (HAR-RV). In-sample estimation results show that the OVX has a significantly positive impact on futures volatility. The impact of large OVX values on future volatility is slightly stronger than that of small ones. Moreover, the HARQ-RV model outperforms the HAR-RV model in predicting oil futures volatility. More importantly, the decomposed OVX components are more powerful in forecasting oil futures price volatility than the OVX itself.
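The HAR-RV framework referenced above can be sketched as an OLS regression of next-day realized volatility on daily, weekly (5-day) and monthly (22-day) averages of past realized volatility. The snippet below is an illustrative sketch on synthetic data, not the paper's actual estimation (which adds OVX terms and, in the HARQ-RV variant, a realized-quarticity correction):

```python
import numpy as np

def har_rv_features(rv):
    """Build HAR-RV regressors: lagged daily RV, weekly (5-day mean)
    and monthly (22-day mean) RV, plus an intercept column."""
    rows, target = [], []
    for t in range(21, len(rv) - 1):
        rows.append([1.0,
                     rv[t],                      # daily component
                     rv[t - 4:t + 1].mean(),     # weekly component
                     rv[t - 21:t + 1].mean()])   # monthly component
        target.append(rv[t + 1])
    return np.array(rows), np.array(target)

rng = np.random.default_rng(0)
rv = np.abs(rng.normal(1.0, 0.2, size=500))     # synthetic realized volatility
X, y = har_rv_features(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS fit of the HAR-RV model
forecast = X[-1] @ beta                         # one-step-ahead RV forecast
```

In practice the regressors are built from intraday returns and the fit is evaluated out of sample; this sketch only shows the model's structure.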
Hurricane feedback research may improve intensity forecasts
NASA Astrophysics Data System (ADS)
Schultz, Colin
2012-06-01
Forecasts of a hurricane's intensity are generally much less accurate than forecasts of its most likely path. Large-scale atmospheric patterns dictate where a hurricane will go and how quickly it will get there. The storm's intensity, however, depends on small-scale shifts in atmospheric stratification, upwelling rates, and other transient dynamics that are difficult to predict. Properly understanding the risk posed by an impending storm depends on having a firm grasp of all three properties: translational speed, intensity, and path. Drawing on 40 years of hurricane records representing 3090 different storms, Mei et al. propose that a hurricane's translational speed and intensity may be closely linked.
Olshansky, S J
1988-01-01
Official forecasts of mortality made by the U.S. Office of the Actuary throughout this century have consistently underestimated observed mortality declines. This is due, in part, to their reliance on the static extrapolation of past trends, an atheoretical statistical method that pays scant attention to the behavioral, medical, and social factors contributing to mortality change. A "multiple cause-delay model" more realistically portrays the effects on mortality of the presence of more favorable risk factors at the population level. Such revised assumptions produce large increases in forecasts of the size of the elderly population, and have a dramatic impact on related estimates of population morbidity, disability, and health care costs.
Remote Sensing and River Discharge Forecasting for Major Rivers in South Asia (Invited)
NASA Astrophysics Data System (ADS)
Webster, P. J.; Hopson, T. M.; Hirpa, F. A.; Brakenridge, G. R.; De-Groeve, T.; Shrestha, K.; Gebremichael, M.; Restrepo, P. J.
2013-12-01
South Asia is a flashpoint for natural disasters; in particular, flooding of the Indus, Ganges, and Brahmaputra has profound societal impacts for the region and globally. The 2007 Brahmaputra floods affecting India and Bangladesh, the 2008 avulsion of the Kosi River in India, the 2010 flooding of the Indus River in Pakistan and the 2013 Uttarakhand floods exemplify disasters on scales almost inconceivable elsewhere. The frequent occurrence of floods, combined with large and rapidly growing populations, high levels of poverty and low resilience, exacerbates the impact of these hazards. Mitigation of these devastating hazards is complicated by limited flood forecast capability, a lack of rain/river gauge measuring stations and of forecast use within and outside each country, and limited transboundary data sharing on natural hazards. Here, we demonstrate the utility of remotely derived hydrologic and weather products in producing skillful flood forecasting information without reliance on vulnerable in situ data sources. Over the last decade, a forecast system providing operational probabilistic forecasts of severe flooding of the Brahmaputra and Ganges Rivers in Bangladesh was developed (Hopson and Webster 2010). The system utilizes ECMWF weather forecast uncertainty information and ensemble weather forecasts, rain gauge and satellite-derived precipitation estimates, together with the limited near-real-time river stage observations from Bangladesh. This system has been expanded to Pakistan and successfully forecast the 2010-2012 flooding (Shrestha and Webster 2013). To overcome the in situ hydrological data problem, recent efforts in parallel with the numerical modeling have utilized microwave satellite remote sensing of river widths to generate operational advection-based discharge forecasts for the Ganges and Brahmaputra. More than twenty remotely sensed locations upstream of Bangladesh were used to produce stand-alone river flow nowcasts and forecasts at 1-15 day lead times, showing that satellite-based flow estimates are a useful source of dynamical surface water information in data-scarce regions and that they could be used for model calibration and data assimilation purposes in near-real-time hydrologic forecast applications (Hirpa et al. 2013). More recent efforts during this year's monsoon season are optimally combining these different independent sources of river forecast information, along with archived flood inundation imagery from the Dartmouth Flood Observatory, to improve the visualization and overall skill of the ongoing CFAB ensemble weather-forecast-based flood forecasting system within the unique context of the ongoing flood forecasting efforts for Bangladesh.
Use of High-Resolution WRF Simulations to Forecast Lightning Threat
NASA Technical Reports Server (NTRS)
McCaul, E. W., Jr.; LaCasse, K.; Goodman, S. J.; Cecil, D. J.
2008-01-01
Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors aloft in storms. This relationship is exploited, in conjunction with the capabilities of cloud-resolving forecast models such as WRF, to forecast explicitly the threat of lightning from convective storms using selected output fields from the model forecasts. The simulated vertical flux of graupel at -15C and the shape of the simulated reflectivity profile are tested in this study as proxies for charge separation processes and their associated lightning risk. Our lightning forecast method differs from others in that it is entirely based on high-resolution simulation output, without reliance on any climatological data. Short (6-8 h) simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity fields, and METAR and ACARS data yields satisfactory simulations. Analyses of the lightning threat fields suggest that both the graupel flux and reflectivity profile approaches, when properly calibrated, can yield reasonable lightning threat forecasts, although an ensemble approach is probably desirable in order to reduce the tendency for misplacement of modeled storms to hurt the accuracy of the forecasts. Our lightning threat forecasts are also compared to other more traditional means of forecasting thunderstorms, such as those based on inspection of the convective available potential energy field.
Seasonal forecasting of groundwater levels in natural aquifers in the United Kingdom
NASA Astrophysics Data System (ADS)
Mackay, Jonathan; Jackson, Christopher; Pachocka, Magdalena; Brookshaw, Anca; Scaife, Adam
2014-05-01
Groundwater aquifers comprise the world's largest freshwater resource and provide resilience to climate extremes which could become more frequent under future climate changes. Prolonged dry conditions can induce groundwater drought, often characterised by significantly low groundwater levels which may persist for months to years. In contrast, lasting wet conditions can result in anomalously high groundwater levels which result in flooding, potentially at large economic cost. Using computational models to produce groundwater level forecasts allows appropriate management strategies to be considered in advance of extreme events. The majority of groundwater level forecasting studies to date use data-based models, which exploit the long response time of groundwater levels to meteorological drivers and make forecasts based only on the current state of the system. Instead, seasonal meteorological forecasts can be used to drive hydrological models and simulate groundwater levels months into the future. Such approaches have not been used in the past due to a lack of skill in these long-range forecast products. However systems such as the latest version of the Met Office Global Seasonal Forecast System (GloSea5) are now showing increased skill up to a 3-month lead time. We demonstrate the first groundwater level ensemble forecasting system using a multi-member ensemble of hindcasts from GloSea5 between 1996 and 2009 to force 21 simple lumped conceptual groundwater models covering most of the UK's major aquifers. We present the results from this hindcasting study and demonstrate that the system can be used to forecast groundwater levels with some skill up to three months into the future.
Weather Research and Forecasting Model Wind Sensitivity Study at Edwards Air Force Base, CA
NASA Technical Reports Server (NTRS)
Watson, Leela R.; Bauman, William H., III
2008-01-01
NASA prefers to land the space shuttle at Kennedy Space Center (KSC). When weather conditions violate Flight Rules at KSC, NASA will usually divert the shuttle landing to Edwards Air Force Base (EAFB) in Southern California. But forecasting surface winds at EAFB is a challenge for the Spaceflight Meteorology Group (SMG) forecasters due to the complex terrain that surrounds EAFB. One particular phenomenon identified by SMG that makes it difficult to forecast the EAFB surface winds is called "wind cycling". This occurs when wind speeds and directions oscillate among towers near the EAFB runway, leading to a challenging deorbit burn forecast for shuttle landings. The large-scale numerical weather prediction models cannot properly resolve the wind field due to their coarse horizontal resolutions, so a properly tuned high-resolution mesoscale model is needed. The Weather Research and Forecasting (WRF) model meets this requirement. The AMU assessed the different WRF model options to determine which configuration best predicted surface wind speed and direction at EAFB. To do so, the AMU compared the WRF model performance using two hot-start initializations with the Advanced Research WRF and Non-hydrostatic Mesoscale Model dynamical cores and compared model performance while varying the physics options.
Heterogeneity: The key to failure forecasting
Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.
2015-01-01
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power. PMID:26307196
Heterogeneity: The key to failure forecasting.
Vasseur, Jérémie; Wadsworth, Fabian B; Lavallée, Yan; Bell, Andrew F; Main, Ian G; Dingwell, Donald B
2015-08-26
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.
Use of High-resolution WRF Simulations to Forecast Lightning Threat
NASA Technical Reports Server (NTRS)
McCaul, William E.; LaCasse, K.; Goodman, S. J.
2006-01-01
Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of recent forecast models such as WRF, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Six-h simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data yield the most realistic simulations. An array of subjective and objective statistical metrics are employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.
High-Resolution WRF Forecasts of Lightning Threat
NASA Technical Reports Server (NTRS)
Goodman, S. J.; McCaul, E. W., Jr.; LaCasse, K.
2007-01-01
Tropical Rainfall Measuring Mission (TRMM) lightning and precipitation observations have confirmed the existence of a robust relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of the Weather Research and Forecast (WRF) model, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Initial experiments using 6-h simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. The WRF has been initialized on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data. An array of subjective and objective statistical metrics is employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.
The Next Level in Automated Solar Flare Forecasting: the EU FLARECAST Project
NASA Astrophysics Data System (ADS)
Georgoulis, M. K.; Bloomfield, D.; Piana, M.; Massone, A. M.; Gallagher, P.; Vilmer, N.; Pariat, E.; Buchlin, E.; Baudin, F.; Csillaghy, A.; Soldati, M.; Sathiapal, H.; Jackson, D.; Alingery, P.; Argoudelis, V.; Benvenuto, F.; Campi, C.; Florios, K.; Gontikakis, C.; Guennou, C.; Guerra, J. A.; Kontogiannis, I.; Latorre, V.; Murray, S.; Park, S. H.; Perasso, A.; Sciacchitano, F.; von Stachelski, S.; Torbica, A.; Vischi, D.
2017-12-01
We attempt an informative description of the Flare Likelihood And Region Eruption Forecasting (FLARECAST) project, the European Commission's first large-scale investment to explore the limits of reliability and accuracy achievable in the forecasting of major solar flares. We outline the consortium, top-level objectives and first results of the project, highlighting the diversity and fusion of expertise needed to deliver what was promised. The project's final product, featuring an openly accessible, fully modular and free-to-download flare forecasting facility, will be delivered in early 2018. The project's three objectives, namely science, research-to-operations (R2O) and dissemination/communication, are also discussed: in terms of science, we encapsulate our close-to-final assessment of how close (or far) we are from practically exploitable solar flare forecasting. In terms of R2O, we briefly describe the architecture of the FLARECAST infrastructure, which includes rigorous validation for each forecasting step. Of the three communication levers of the project, we finally focus on lessons learned from the two-way interaction with the community of stakeholders and governmental organizations. The FLARECAST project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 640216.
Heterogeneity: The key to failure forecasting
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.
2015-08-01
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.
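The Failure Forecast Method discussed in these abstracts is commonly applied in its inverse-rate form: the reciprocal of an accelerating precursor rate (e.g. acoustic emission events per unit time) decays roughly linearly in time, and its extrapolated zero-crossing estimates the failure time. A minimal sketch on idealized synthetic data, assuming the power-law exponent alpha = 2 (the case where the linearization is exact):

```python
import numpy as np

# Synthetic accelerating precursor rate approaching failure at t_f = 10.0
# (hyperbolic acceleration: the alpha = 2 case of the FFM power law).
t_f = 10.0
t = np.linspace(0.0, 9.0, 50)
rate = 1.0 / (t_f - t)                 # e.g. events per unit time

# Inverse-rate linearization: 1/rate = t_f - t decays linearly, so a
# straight-line fit extrapolated to zero estimates the failure time.
slope, intercept = np.polyfit(t, 1.0 / rate, 1)
t_failure_est = -intercept / slope
```

Real precursor data are noisy and alpha need not equal 2, which is precisely why the paper's finding that material heterogeneity controls forecast accuracy matters.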
Prestemon, Jeffrey P.; Butry, David T.; Thomas, Douglas S.
2017-01-01
Research shows that some categories of human-ignited wildfires might be forecastable, due to their temporal clustering, with the possibility that resources could be pre-deployed to help reduce the incidence of such wildfires. We estimated several kinds of incendiary and other human-ignited wildfire forecast models at the weekly time step for tribal land units in the United States, evaluating their forecast skill out of sample. Analyses show that an Autoregressive Conditional Poisson (ACP) model of both incendiary and non-incendiary human-ignited wildfires is more accurate out of sample compared to alternatives, and the simplest of the ACP models performed the best. Additionally, an ensemble of these and simpler, less analytically intensive approaches performed even better. Wildfire hotspot forecast models using all model types were evaluated in a simulation mode to assess the net benefits of forecasts in the context of law enforcement resource reallocations. Our analyses show that such hotspot tools could yield large positive net benefits for the tribes in terms of suppression expenditures averted for incendiary wildfires but that the hotspot tools were less likely to be beneficial for addressing outbreaks of non-incendiary human-ignited wildfires. PMID:28769549
Prestemon, Jeffrey P; Butry, David T; Thomas, Douglas S
2016-01-01
Research shows that some categories of human-ignited wildfires might be forecastable, due to their temporal clustering, with the possibility that resources could be pre-deployed to help reduce the incidence of such wildfires. We estimated several kinds of incendiary and other human-ignited wildfire forecast models at the weekly time step for tribal land units in the United States, evaluating their forecast skill out of sample. Analyses show that an Autoregressive Conditional Poisson (ACP) model of both incendiary and non-incendiary human-ignited wildfires is more accurate out of sample compared to alternatives, and the simplest of the ACP models performed the best. Additionally, an ensemble of these and simpler, less analytically intensive approaches performed even better. Wildfire hotspot forecast models using all model types were evaluated in a simulation mode to assess the net benefits of forecasts in the context of law enforcement resource reallocations. Our analyses show that such hotspot tools could yield large positive net benefits for the tribes in terms of suppression expenditures averted for incendiary wildfires but that the hotspot tools were less likely to be beneficial for addressing outbreaks of non-incendiary human-ignited wildfires.
Validation of Volcanic Ash Forecasting Performed by the Washington Volcanic Ash Advisory Center
NASA Astrophysics Data System (ADS)
Salemi, A.; Hanna, J.
2009-12-01
In support of NOAA’s mission to protect life and property, the Satellite Analysis Branch (SAB) uses satellite imagery to monitor volcanic eruptions and track volcanic ash. The Washington Volcanic Ash Advisory Center (VAAC) was established in late 1997 through an agreement with the International Civil Aviation Organization (ICAO). A volcanic ash advisory (VAA) is issued every 6 hours while an eruption is occurring. Information about the current location and height of the volcanic ash, as well as any pertinent meteorological information, is contained within the VAA. In addition, when ash is detected in satellite imagery, 6-, 12- and 18-hour forecasts of ash height and location are provided. This information is garnered from many sources including Meteorological Watch Offices (MWOs), pilot reports (PIREPs), model forecast winds, radiosondes and volcano observatories. The Washington VAAC has performed a validation of its 6-, 12- and 18-hour airborne volcanic ash forecasts issued since October 2007. The volcanic ash forecasts are viewed dichotomously (yes/no), with the frequency of yes and no events placed into a contingency table. A large variety of categorical statistics useful in describing forecast performance are then computed from the resulting contingency table.
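Dichotomous (yes/no) verification of the kind described for the VAA forecasts reduces to a 2x2 contingency table of hits, false alarms, misses and correct negatives. Below is a sketch of a few standard categorical scores; the abstract does not specify which statistics the VAAC computed, so these are common illustrative choices, and the counts are toy values:

```python
def categorical_scores(hits, false_alarms, misses, correct_negatives):
    """Dichotomous verification scores from a 2x2 contingency table."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    n = hits + false_alarms + misses + correct_negatives
    return {"POD": pod, "FAR": far, "CSI": csi,
            "accuracy": (hits + correct_negatives) / n}

# Toy counts, purely illustrative:
scores = categorical_scores(hits=40, false_alarms=10,
                            misses=20, correct_negatives=30)
```

With these counts, POD = 40/60, FAR = 0.2, CSI = 40/70 and accuracy = 0.7.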
Data-driven forecasting algorithms for building energy consumption
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram
2013-04-01
This paper introduces two forecasting methods for building energy consumption data recorded from smart meters at high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods forecast individual loads on the basis of their measurement history and weather data without using complicated models of the building systems. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process has been applied to forecast day-ahead load profiles and their uncertainty bounds. These methods are computationally simple and adaptive, and thus suitable for analyzing large sets of data whose patterns change over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict these energy consumption data with high accuracy.
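The day-ahead Gaussian-process idea can be illustrated with a minimal GP regressor built from scratch. This is a hedged sketch with a squared-exponential kernel and synthetic hourly loads, not the paper's actual model (which also conditions on weather data); note that far from the data a zero-mean GP posterior reverts to the prior mean with growing uncertainty, which is why the data are centred first:

```python
import numpy as np

def rbf(a, b, length=2.0, var=25.0):
    """Squared-exponential covariance between two 1-D time grids."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_forecast(t_train, y_train, t_pred, noise=0.5):
    """Zero-mean GP posterior mean and standard deviation at t_pred."""
    K = rbf(t_train, t_train) + noise ** 2 * np.eye(len(t_train))
    Ks = rbf(t_pred, t_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = rbf(t_pred, t_pred) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

t = np.arange(24.0)                             # past 24 hourly loads (synthetic)
load = 50.0 + 10.0 * np.sin(2 * np.pi * t / 24.0)
mu = load.mean()                                # centre: the GP prior mean is zero
t_next = np.arange(24.0, 48.0)                  # day-ahead horizon
mean, std = gp_forecast(t, load - mu, t_next)
mean += mu                                      # uncertainty (std) grows with lead time
```

The widening `std` band is the "uncertainty bound" the abstract refers to; a production system would tune the kernel hyperparameters rather than fix them.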
Assessing public forecasts to encourage accountability: The case of MIT’s Technology Review
2017-01-01
Although high degrees of reliability have been found for many types of forecasts purportedly due to the existence of accountability, public forecasts of technology are rarely assessed and continue to have a poor reputation. This paper’s analysis of forecasts made by MIT’s Technology Review provides a rare assessment and thus a means to encourage accountability. It first shows that few of the predicted “breakthrough technologies” currently have large markets. Only four have sales greater than $10 billion while eight technologies not predicted by Technology Review have sales greater than $10 billion including three with greater than $100 billion and one other with greater than $50 billion. Second, possible reasons for these poor forecasts are discussed, including an overemphasis on the science-based process of technology change, sometimes called the linear model of innovation. Third, this paper describes a different model of technology change, one that is widely used by private companies and that explains the emergence of those technologies that have greater than $10 billion in sales. Fourth, technology change and forecasts are discussed in terms of cognitive biases and mental models. PMID:28797114
Benefits of Sharing Information: Supermodel Ensemble and Applications in South America
NASA Astrophysics Data System (ADS)
Dias, P. L.
2006-05-01
A model intercomparison program involving a large number of academic and operational institutions has been implemented in South America since 2003, motivated by the SALLJEX Intercomparison Program in 2003 (a research program focused on identifying the role of the Andes low-level jet in moisture transport from the Amazon to the Plata basin) and the WMO/THORPEX (www.wmo.int/thorpex) goals to improve predictability through the proper combination of numerical weather forecasts. This program also explores the potential predictability associated with the combination of a large number of possible scenarios on time scales of a few days up to 15 days. Five academic institutions and five operational forecasting centers in several countries in South America, one academic institution in the USA, and the main global forecasting centers (NCEP, UKMO, ECMWF) agreed to provide numerical products based on operational and experimental models. The metric for model validation concentrates on the fit of the forecasts to surface observations. Meteorological data from airports, synoptic stations operated by national weather services, automatic data platforms maintained by different institutions, the PIRATA buoys, etc., are all collected through LDM/NCAR or direct transmission. Approximately 40 model outputs are available on a daily basis, twice a day. A simple procedure based on data assimilation principles was quite successful in combining the available forecasts to produce temperature, dew point, wind, pressure and precipitation forecasts at station points in South America. The procedure is based on removing each model's bias at the observation point and computing a weighted average based on the mean square error of the forecasts. The base period for estimating the bias and mean square error is of the order of 15 to 30 days.
Products of the intercomparison program and the optimal statistical combination of the available forecasts are public and available in real time (www.master.iag.usp.br/). Monitoring of the use of the products reveals a growing trend in the last year (reaching about 10,000 accesses per day in recent months). The intercomparison program provides a rich data set for educational products (real-time use in Synoptic Meteorology and Numerical Weather Forecasting lectures), operational weather forecasts in national or regional weather centers, and research purposes. During the first phase of the program it was difficult to convince potential participants to share information on the public homepage. However, as the system evolved, more and more institutions became associated with the program. The general opinion of the participants is that the system provides a unified metric for evaluation and a forum for discussing the physical origins of the model forecast differences, and thereby improves the quality of the numerical guidance.
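The combination procedure described here (remove each model's bias at the observation point, then weight by inverse mean square error over a recent window) can be sketched as follows. Variable names and the toy numbers are illustrative, not from the program itself:

```python
import numpy as np

def combine_forecasts(history, obs, latest):
    """history: (n_models, n_days) past forecasts at one station;
    obs: (n_days,) matching observations; latest: (n_models,) new forecasts.
    Removes each model's mean bias over the window, then averages with
    weights inversely proportional to each model's mean square error."""
    history = np.asarray(history, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = (history - obs).mean(axis=1)                    # per-model bias
    mse = ((history - bias[:, None] - obs) ** 2).mean(axis=1)
    w = (1.0 / mse) / (1.0 / mse).sum()                    # inverse-MSE weights
    return float(np.sum(w * (np.asarray(latest) - bias)))

# Two toy models over a 3-day window at one station:
combined = combine_forecasts(history=[[22.0, 23.0, 25.0], [18.0, 20.0, 21.0]],
                             obs=[20.0, 21.0, 22.0],
                             latest=[25.0, 20.0])
```

The abstract's 15-30 day base period corresponds to the window length of `history`; a production version would also guard against a model with near-zero MSE dominating the weights.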
A 24/7 High-Resolution Storm Surge, Inundation and Circulation Forecasting System for the Florida Coast
NASA Astrophysics Data System (ADS)
Paramygin, V.; Davis, J. R.; Sheng, Y.
2012-12-01
A 24/7 forecasting system for Florida is needed because of the high risk of tropical storm surge-induced coastal inundation and damage, and the need to support operational management of water resources, utility infrastructure, and fishery resources. With the anticipated climate change impacts, including sea level rise, coastal areas face the challenges of increasing inundation risk and increasing population. Accurate 24/7 forecasting of water level, inundation, and circulation will significantly enhance the sustainability of coastal communities and environments. Supported by the Southeast Coastal Ocean Observing Regional Association (SECOORA) through NOAA IOOS, a 24/7 high-resolution forecasting system for storm surge, coastal inundation, and baroclinic circulation is being developed for Florida using the CH3D Storm Surge Modeling System (CH3D-SSMS). CH3D-SSMS is based on the CH3D hydrodynamic model coupled to the coastal wave model SWAN and to basin-scale surge and wave models. CH3D-SSMS has been verified with surge, wave, and circulation data from several recent hurricanes in the U.S.: Isabel (2003); Charley, Dennis and Ivan (2004); Katrina and Wilma (2005); Ike and Fay (2008); and Irene (2011), as well as typhoons in the Pacific: Fanapi (2010) and Nanmadol (2011). The effects of tropical cyclones on flow and salinity distribution in estuarine and coastal waters have been simulated for Apalachicola Bay as well as the Guana-Tolomato-Matanzas Estuary using CH3D-SSMS. The system successfully reproduced various physical phenomena, including the large waves during Ivan that damaged the I-10 bridges, large alongshore waves and coastal flooding during Wilma, the salinity drop during Fay, and flooding in Taiwan resulting from combined surge and rain effects during Fanapi.
The system uses four domains that cover the entire Florida coastline: West, which covers the Florida panhandle and Tampa Bay; Southwest, which spans from the Florida Keys to Charlotte Harbor; Southeast, covering Biscayne Bay and Miami; and East, which continues north to the Florida/Georgia border. The system has a data acquisition and processing module that collects data for model runs (e.g. wind, river flow, precipitation). Depending on the domain, forecast runs take approximately 1-18 hours to complete on a single-CPU (8-core) system (1-2 hours for a 2D setup and up to 18 hours for a 3D setup), with four forecasts generated per day. All data are archived and catalogued, and model forecast skill is continuously evaluated. In addition to the baseline forecasts, additional forecasts are being performed using various options for wind forcing (GFS, GFDL, WRF, and parametric hurricane models), model configurations (2D/3D), and open boundary conditions obtained by coupling with large-scale models (ROMS, NCOM, HYCOM), as well as by incorporating real-time and forecast river flow and precipitation data, to better understand how to improve model skill. In addition, new forecast products (e.g. more informative inundation maps) are being developed for targeted stakeholders. To support modern data standards, CH3D-SSMS results are available online via a THREDDS server in CF-compliant NetCDF format as well as in other stakeholder-friendly (e.g. GIS) formats. The SECOORA website provides visualization of the model via the GODIVA-THREDDS interface.
Flexible reserve markets for wind integration
NASA Astrophysics Data System (ADS)
Fernandez, Alisha R.
The increased interconnection of variable generation has motivated the use of improved forecasting to more accurately predict future production, with the purpose of lowering the total system costs of balancing when the expected output exceeds or falls short of the actual output. Forecasts are imperfect, and the forecast errors associated with utility-scale generation from variable generators require new balancing capabilities that cannot be handled by existing ancillary services. Our work focuses on strategies for integrating large amounts of wind generation under a flex reserve market, a market that would be called upon for short-term energy services during an under- or oversupply of wind generation to maintain electric grid reliability. The flex reserve market would be utilized for time intervals that fall in between the current ancillary services markets: longer than the second-to-second energy services used to maintain system frequency, and shorter than the reserve capacity services called upon for several minutes up to an hour during an unexpected contingency on the grid. In our work, the wind operator would access the flex reserve market as an energy service to correct for unanticipated forecast errors, akin to paying the participating generators to increase generation during a shortfall or paying other generators to decrease generation during an excess of wind generation. Such a market does not currently exist in the Mid-Atlantic United States. The Pennsylvania-New Jersey-Maryland Interconnection (PJM) is the Mid-Atlantic electric grid case study used to examine whether a flex reserve market can be utilized to integrate large capacities of wind generation in a low-cost manner for those providing, purchasing and dispatching these short-term balancing services. The following work consists of three studies.
The first examines the ability of a hydroelectric facility to provide short-term forecast error balancing services via a flex reserve market, identifying the operational constraints that inhibit a multi-purpose dam facility to meet the desired flexible energy demand. The second study transitions from the hydroelectric facility as the decision maker providing flex reserve services to the wind plant as the decision maker purchasing these services. In this second study, methods for allocating the costs of flex reserve services under different wind policy scenarios are explored that aggregate farms into different groupings to identify the least-cost strategy for balancing the costs of hourly day-ahead forecast errors. The least-cost strategy may be different for an individual wind plant and for the system operator, noting that the least-cost strategy is highly sensitive to cost allocation and aggregation schemes. The latter may also cause cross-subsidies in the cost for balancing wind forecast errors among the different wind farms. The third study builds from the second, with the objective to quantify the amount of flex reserves needed for balancing future forecast errors using a probabilistic approach (quantile regression) to estimating future forecast errors. The results further examine the usefulness of separate flexible markets PJM could use for balancing oversupply and undersupply events, similar to the regulation up and down markets used in Europe. These three studies provide the following results and insights to large-scale wind integration using actual PJM wind farm data that describe the markets and generators within PJM. 
• Chapter 2 provides an in-depth analysis of the valuable, yet highly constrained, energy services multi-purpose hydroelectric facilities can provide, though the opportunity cost of providing these services can result in large deviations from the reservoir policies with minimal revenue gain in comparison to dedicating the whole of dam capacity to providing day-ahead, baseload generation.
• Chapter 3 quantifies the system-wide efficiency gains and the distributive effects of PJM's decision to act as a single balancing authority, which means that it procures ancillary services across its entire footprint simultaneously. This can be contrasted with the Midwest Independent System Operator (MISO), which has several balancing authorities operating under its footprint.
• Chapter 4 uses probabilistic methods to estimate the uncertainty in the forecast errors and the quantity of energy needed to balance these forecast errors at a certain percentile. Current practice is to use a point forecast that describes the conditional expectation of the dependent variable at each time step. The approach here uses quantile regression to describe the relationship between the independent variable and the conditional quantiles (equivalently the percentiles) of the dependent variable. An estimate of the conditional density is performed, which contains information about the relationship between the sign of the forecast errors (negative for too much wind generation, positive for too little wind generation) and the wind power forecast. This additional knowledge may be used in the decision process to more accurately schedule day-ahead wind generation bids, and provides an example of using separate markets for balancing an oversupply and undersupply of generation. Such methods are currently used for coordinating large footprints of wind generation in Europe.
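The Chapter 4 idea, bounding forecast errors by conditional quantiles rather than a single point forecast, can be sketched in a few lines. For brevity this uses empirical quantiles within forecast-level bins as a simplified stand-in for a fitted quantile-regression model; the function name, bin count, and quantile levels are illustrative assumptions.

```python
import numpy as np

def conditional_error_quantiles(forecast, error, n_bins=4, q=(0.05, 0.95)):
    """Empirical quantiles of forecast error, conditioned on forecast level.

    Returns an (n_bins, len(q)) array: within each forecast bin, the lower
    quantile bounds oversupply events (negative errors) and the upper one
    undersupply events, mimicking separate up/down balancing reserves.
    """
    edges = np.quantile(forecast, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.clip(np.digitize(forecast, edges[1:-1]), 0, n_bins - 1)
    return np.array([[np.quantile(error[bins == b], qq) for qq in q]
                     for b in range(n_bins)])
```

Reading the quantity of flex reserves to procure off these conditional bands, instead of off the unconditional error distribution, is what lets the reserve requirement vary with the forecast itself.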
2011-01-01
Background: Real-time forecasting of epidemics, especially forecasting based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. Methods: A discrete-time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results: The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Conclusions: Real-time forecasting using the discrete-time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance. PMID:21324153
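A toy version of such a branching-process forecast can illustrate how uncertainty bounds arise from chains of offspring distributions. This sketch assumes Poisson offspring with a single reproduction number R and obtains bounds by simulation rather than the paper's analytic computation; the function name and parameter choices are illustrative, not the paper's actual model.

```python
import numpy as np

def forecast_incidence(current_cases, R, n_weeks, n_sims=2000, seed=0):
    """Simulate Poisson offspring chains and return weekly incidence bounds.

    Each case generates Poisson(R) new cases in the next week; the 2.5th,
    50th and 97.5th percentiles across simulated chains give the central
    forecast and its uncertainty bounds.
    """
    rng = np.random.default_rng(seed)
    paths = np.empty((n_sims, n_weeks), dtype=np.int64)
    cases = np.full(n_sims, current_cases, dtype=np.int64)
    for t in range(n_weeks):
        cases = rng.poisson(R * cases)  # offspring of this week's cases
        paths[:, t] = cases
    lo, med, hi = np.percentile(paths, [2.5, 50.0, 97.5], axis=0)
    return lo, med, hi
```

The widening of the bounds with lead time mirrors the compounding of offspring-distribution uncertainty along the chain.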
Scale and modeling issues in water resources planning
Lins, H.F.; Wolock, D.M.; McCabe, G.J.
1997-01-01
Resource planners and managers interested in utilizing climate model output as part of their operational activities immediately confront the dilemma of scale discordance. Their functional responsibilities cover relatively small geographical areas and necessarily require data of relatively high spatial resolution. Climate models cover a large geographical, i.e. global, domain and produce data at comparatively low spatial resolution. Although the scale differences between model output and planning input are large, several techniques have been developed for disaggregating climate model output to a scale appropriate for use in water resource planning and management applications. With techniques in hand to reduce the limitations imposed by scale discordance, water resource professionals must now confront a more fundamental constraint on the use of climate models-the inability to produce accurate representations and forecasts of regional climate. Given the current capabilities of climate models, and the likelihood that the uncertainty associated with long-term climate model forecasts will remain high for some years to come, the water resources planning community may find it impractical to utilize such forecasts operationally.
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
Wind and wave extremes over the world oceans from very large ensembles
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan; Abdalla, Saleh; Bidlot, Jean-Raymond; Janssen, Peter A. E. M.
2014-07-01
Global return values of marine wind speed and significant wave height are estimated from very large aggregates of archived ensemble forecasts at +240 h lead time. Long lead time ensures that the forecasts represent independent draws from the model climate. Compared with ERA-Interim, a reanalysis, the ensemble yields higher return estimates for both wind speed and significant wave height. Confidence intervals are much tighter due to the large size of the data set. The period (9 years) is short enough to be considered stationary even with climate change. Furthermore, the ensemble is large enough for nonparametric 100 year return estimates to be made from order statistics. These direct return estimates compare well with extreme value estimates outside areas with tropical cyclones. Like any method employing modeled fields, it is sensitive to tail biases in the numerical model, but we find that the biases are moderate outside areas with tropical cyclones.
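The nonparametric return estimate described above can be illustrated directly: treat the large ensemble aggregate as independent draws from the model climate and read the T-year return level off the empirical quantile that is exceeded on average once per T years. The function and its arguments are an illustrative sketch under that independence assumption, not the authors' code.

```python
import numpy as np

def return_level(samples, samples_per_year, return_period_years):
    """T-year return level as an upper-tail order statistic of a large sample.

    With independent draws, the T-year level is exceeded with probability
    1 / (T * samples_per_year) per draw, so it is the corresponding
    empirical quantile of the sample.
    """
    p_exceed = 1.0 / (return_period_years * samples_per_year)
    return float(np.quantile(samples, 1.0 - p_exceed))
```

The attraction, as in the abstract, is that with a large enough sample even a 100-year level sits inside the data rather than in an extrapolated parametric tail.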
ERIC Educational Resources Information Center
Noser, Thomas C.; Tanner, John R.; Shah, Situl
2008-01-01
The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, and to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…
Fast Algorithms for Mining Co-evolving Time Series
2011-09-01
Keogh et al., 2001, 2004] and (b) forecasting, like an autoregressive integrated moving average model (ARIMA) and related methods [Box et al., 1994…computing hardware? We develop models to mine time series with missing values, to extract compact representations from time sequences, to segment the sequences, and to do forecasting. For large-scale data, we propose algorithms for learning time series models, in particular including Linear Dynamical
Lejiang Yu; Shiyuan Zhong; Xindi Bian; Warren E. Heilman
2015-01-01
This study examines the spatial and temporal variability of wind speed at 80m above ground (the average hub height of most modern wind turbines) in the contiguous United States using Climate Forecast System Reanalysis (CFSR) data from 1979 to 2011. The mean 80-m wind exhibits strong seasonality and large spatial variability, with higher (lower) wind speeds in the...
Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters
NASA Astrophysics Data System (ADS)
Nomura, S.; Ogata, Y.
2016-12-01
Foreshock discrimination is one of the most effective ways to forecast large main shocks on short time scales. Though many large earthquakes are accompanied by foreshocks, discriminating these from the enormous number of small earthquakes is difficult, and only a probabilistic evaluation based on their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern recognition problems, where estimates of the a-posteriori probability of class membership are required. Statistical learning methods can keep learning discriminating features from the updating catalog and give probabilistic forecasts in real time. We estimated a non-linear function of foreshock proportion using smooth spline bases and evaluated the probability of foreshocks via the logit function. In this study, we classified foreshocks from the earthquake catalog of the Japan Meteorological Agency using single-link clustering and learned the spatial and temporal features of foreshocks by probability density ratio estimation. We use the epicentral locations, time spans and differences in magnitude for learning and forecasting. Magnitudes of main shocks are also predicted by incorporating b-values into our method. We discuss the spatial pattern of foreshocks revealed by the classifier produced by our model. We also implement a back test using this catalog to validate the predictive performance of the model.
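A minimal illustration of the logistic-regression step, fitting P(foreshock | features) by gradient descent on synthetic one-dimensional features. The actual study uses smooth spline bases and catalog-derived features (locations, time spans, magnitude differences), which this sketch does not reproduce; all names and data here are assumptions for illustration.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=3000):
    """Fit logistic regression weights by batch gradient descent."""
    X1 = np.hstack([np.ones((len(X), 1)), X])  # prepend intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))      # predicted probabilities
        w -= lr * X1.T @ (p - y) / len(y)      # gradient of the log-loss
    return w

def predict_proba(w, X):
    """A-posteriori probability of class membership for new clusters."""
    X1 = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))
```

Because the output is a calibrated probability rather than a hard label, it can feed directly into a probabilistic real-time forecast, which is the property the abstract highlights.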
Regional crop yield forecasting: a probabilistic approach
NASA Astrophysics Data System (ADS)
de Wit, A.; van Diepen, K.; Boogaard, H.
2009-04-01
Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
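The ensemble-based approach can be sketched in a few lines: run the crop model under many plausible scenarios and take the spread of simulated yields as the uncertainty bound. The toy quadratic "crop model" and all names below are purely illustrative stand-ins for a process-based mechanistic model such as the one the authors use.

```python
import numpy as np

def ensemble_yield_forecast(weather_ensemble, crop_model, q=(10, 50, 90)):
    """Percentile bounds of regional yield across an ensemble of scenarios."""
    yields = np.array([crop_model(w) for w in weather_ensemble])
    return np.percentile(yields, q)

def toy_crop_model(weather):
    """Illustrative stand-in: yield (t/ha) responds to seasonal rainfall (mm),
    with diminishing returns at high rainfall."""
    rain = weather["rain_mm"]
    return 2.0 + 0.01 * rain - 0.00001 * rain ** 2
```

The dispersion between the returned percentiles is exactly the risk information a deterministic single-run forecast cannot provide.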
Process-conditioned bias correction for seasonal forecasting: a case-study with ENSO in Peru
NASA Astrophysics Data System (ADS)
Manzanas, R.; Gutiérrez, J. M.
2018-05-01
This work assesses the suitability of a first simple attempt for process-conditioned bias correction in the context of seasonal forecasting. To do this, we focus on the northwestern part of Peru and bias correct 1- and 4-month lead seasonal predictions of boreal winter (DJF) precipitation from the ECMWF System4 forecasting system for the period 1981-2010. In order to include information about the underlying large-scale circulation which may help to discriminate between precipitation affected by different processes, we introduce here an empirical quantile-quantile mapping method which runs conditioned on the state of the Southern Oscillation Index (SOI), which is accurately predicted by System4 and is known to affect the local climate. Beyond the reduction of model biases, our results show that the SOI-conditioned method yields better ROC skill scores and reliability than the raw model output over the entire region of study, whereas the standard unconditioned implementation provides no added value for any of these metrics. This suggests that conditioning the bias correction on simple but well-simulated large-scale processes relevant to the local climate may be a suitable approach for seasonal forecasting. Yet, further research on the suitability of the application of similar approaches to the one considered here for other regions, seasons and/or variables is needed.
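The conditioning idea can be sketched as follows: perform empirical quantile-quantile mapping, but calibrate only against the historical seasons in the same large-scale state (here, the sign of the SOI). This is a minimal sketch under illustrative assumptions; the paper's exact implementation, thresholds, and SOI handling may differ.

```python
import numpy as np

def qq_map(model_hist, obs_hist, value):
    """Empirical quantile-quantile mapping: find the model quantile of
    `value`, then return the observed value at the same quantile."""
    q = np.interp(value, np.sort(model_hist),
                  np.linspace(0.0, 1.0, len(model_hist)))
    return float(np.quantile(obs_hist, q))

def soi_conditioned_qq_map(model_hist, obs_hist, soi_hist, value, soi_now,
                           thresh=0.0):
    """Bias-correct using only historical cases in the same SOI state."""
    same_state = (soi_hist > thresh) == (soi_now > thresh)
    return qq_map(model_hist[same_state], obs_hist[same_state], value)
```

When the model's precipitation bias differs between SOI states, splitting the calibration sample this way is what lets the correction discriminate between the two regimes.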
NASA Astrophysics Data System (ADS)
Aberson, Sim David
In 1997, the National Hurricane Center and the Hurricane Research Division began conducting operational synoptic surveillance missions with the Gulfstream IV-SP jet aircraft to improve operational forecast models. During the first two years, twenty-four missions were conducted around tropical cyclones threatening the continental United States, Puerto Rico, and the Virgin Islands. Global Positioning System dropwindsondes were released from the aircraft at 150--200 km intervals along the flight track in the tropical cyclone environment to obtain wind, temperature, and humidity profiles from flight level (around 150 hPa) to the surface. The observations were processed and formatted aboard the aircraft and transmitted to the National Centers for Environmental Prediction (NCEP). There, they were ingested into the Global Data Assimilation System that subsequently provides initial and time-dependent boundary conditions for numerical models that forecast tropical cyclone track and intensity. Three dynamical models were employed in testing the targeting and sampling strategies. With the assimilation into the numerical guidance of all the observations gathered during the surveillance missions, only the 12-h Geophysical Fluid Dynamics Laboratory Hurricane Model forecast showed statistically significant improvement. Neither the forecasts from the Aviation run of the Global Spectral Model nor the shallow-water VICBAR model were improved with the assimilation of the dropwindsonde data. This mediocre result is found to be due mainly to the difficulty in operationally quantifying the storm-motion vector used to create accurate synthetic data to represent the tropical cyclone vortex in the models. A secondary limit on forecast improvements from the surveillance missions is the limited amount of data provided by the one surveillance aircraft in regular missions. 
The inability of some surveillance missions to surround the tropical cyclone with dropwindsonde observations is a possible third limit, though the results are inconclusive. Due to limited aircraft resources, optimal observing strategies for these missions must be developed. Since observations in areas of decaying error modes are unlikely to have a large impact on subsequent forecasts, such strategies should be based on taking observations in the geographic locations corresponding to the most rapidly growing error modes in the numerical models and on known deficiencies in current data assimilation systems. Here, the most rapidly growing modes are represented by areas of large forecast spread in the NCEP bred-mode global ensemble forecasting system. The sampling strategy requires sampling the entire target region at approximately the same resolution as the North American rawinsonde network to limit the possibly spurious spread of information from dropwindsonde observations into data-sparse regions where errors are likely to grow. When only the subset of data in these fully sampled target regions is assimilated into the numerical models, statistically significant reductions of the track forecast errors, of up to 25% within the critical first two days of the forecast, are seen. These model improvements are comparable with the cumulative business-as-usual track forecast model improvements expected over eighteen years.
NASA Astrophysics Data System (ADS)
Penn, C. A.; Clow, D. W.; Sexstone, G. A.
2017-12-01
Water supply forecasts are an important tool for water resource managers in areas where surface water is relied on for irrigating agricultural lands and for municipal water supplies. Forecast errors, which correspond to inaccurate predictions of total surface water volume, can lead to mis-allocated water and productivity loss, thus costing stakeholders millions of dollars. The objective of this investigation is to provide water resource managers with an improved understanding of factors contributing to forecast error, and to help increase the accuracy of future forecasts. In many watersheds of the western United States, snowmelt contributes 50-75% of annual surface water flow and controls both the timing and volume of peak flow. Water supply forecasts from the Natural Resources Conservation Service (NRCS), National Weather Service, and similar cooperators use precipitation and snowpack measurements to provide water resource managers with an estimate of seasonal runoff volume. The accuracy of these forecasts can be limited by available snowpack and meteorological data. In the headwaters of the Rio Grande, NRCS produces January through June monthly Water Supply Outlook Reports. This study evaluates the accuracy of these forecasts since 1990, and examines what factors may contribute to forecast error. The Rio Grande headwaters has experienced recent changes in land cover from bark beetle infestation and a large wildfire, which can affect hydrological processes within the watershed. To investigate trends and possible contributing factors in forecast error, a semi-distributed hydrological model was calibrated and run to simulate daily streamflow for the period 1990-2015. Annual and seasonal watershed and sub-watershed water balance properties were compared with seasonal water supply forecasts. Gridded meteorological datasets were used to assess changes in the timing and volume of spring precipitation events that may contribute to forecast error. 
Additionally, a spatially distributed, physics-based snow model was used to assess possible effects of land cover change on snowpack properties. Trends in forecast error are variable, while baseline model results show consistent under-prediction in the most recent decade, highlighting possible compounding effects of climate and land cover changes.
Assessing skill of a global bimonthly streamflow ensemble prediction system
NASA Astrophysics Data System (ADS)
van Dijk, A. I.; Peña-Arancibia, J.; Sheffield, J.; Wood, E. F.
2011-12-01
Ideally, a seasonal streamflow forecasting system might be conceived of as a system that ingests skillful climate forecasts from general circulation models and propagates these through thoroughly calibrated hydrological models that are initialised using hydrometric observations. In practice, there are practical problems with each of these aspects. Instead, we analysed whether a comparatively simple hydrological model-based Ensemble Prediction System (EPS) can provide global bimonthly streamflow forecasts with some skill and, if so, under what circumstances the greatest skill may be expected. The system tested produces ensemble forecasts for each of six annual bimonthly periods based on the previous 30 years of global daily gridded 1° resolution climate variables and an initialised global hydrological model. To incorporate some of the skill derived from ocean conditions, a post-EPS analog method was used to sample from the ensemble based on El Niño Southern Oscillation (ENSO), Indian Ocean Dipole (IOD), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO) index values observed prior to the forecast. Forecast skill was assessed through a hindcasting experiment for the period 1979-2008. Potential skill was calculated with reference to a model run with the actual forcing for the forecast period (the 'perfect' model) and was compared to the actual forecast skill calculated, for each of the six forecast times, for on average 411 Australian and 51 pan-tropical catchments. Significant potential skill in bimonthly forecasts was largely limited to northern regions during the snowmelt period, seasonally wet tropical regions at the transition from wet to dry season, and the Indonesian region, where rainfall is well correlated with ENSO. The actual skill was approximately 34-50% of the potential skill. We attribute this primarily to limitations in the model structure, parameterisation and global forcing data.
Use of better climate forecasts and remote sensing observations of initial catchment conditions should help to increase actual skill in future. Future work also could address the potential skill gain from using weather and climate forecasts and from a calibrated and/or alternative hydrological model or model ensemble. The approach and data might be useful as a benchmark for joint seasonal forecasting experiments planned under GEWEX.
Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia
NASA Astrophysics Data System (ADS)
Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep
2014-05-01
Extreme Weather Events (EWEs) cause negative impacts socially, economically, and environmentally. Considering these facts, forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show the large-scale meteorological patterns (LSMPs) that occurred during historical EWEs. Some vital information about the EWEs can be acquired from studying such maps, in addition to their providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA EWEs). We study the performance of the composite approach for tropical weather conditions such as Indonesia's. Initially, the composite maps are developed to identify and forecast extreme weather events in Indramayu district, West Java, the main producer of rice in Indonesia, which contributes about 60% of the national total rice production. Studying extreme weather events in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE, more than a thousand houses in Indramayu suffered serious flooding, with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in five regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully.
An approach has been applied to identify the dates using observations from multiple sites (rain gauges). The approach combines POT (Peaks Over Threshold) with 'declustering' of the data to approximate independence based on the autocorrelation structure of each rainfall series. The cross-correlation among sites is also considered in developing the event criteria, yielding a rational choice of the extreme dates given the 'spotty' nature of the intense convection. Based on the identified dates, we are developing a supporting tool for forecasting extreme rainfall based on the corresponding large-scale meteorological patterns (LSMPs). The LSMP methodology focuses on the larger-scale patterns that models are better able to forecast, as those larger-scale patterns create the conditions fostering the local EWE. A bootstrap resampling method is applied to highlight the key features that are statistically significantly associated with the extreme events. Grotjahn, R., and G. Faure, 2008: Composite Predictor Maps of Extraordinary Weather Events in the Sacramento California Region. Weather and Forecasting, 23: 313-335.
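The POT-with-declustering step can be sketched as a runs declustering: exceedances of the threshold separated by fewer than a minimum gap are grouped into one cluster, and only each cluster's maximum is kept as an approximately independent event. The threshold and gap values below are illustrative; the study derives the gap from each rainfall series' autocorrelation structure, which this sketch does not do.

```python
import numpy as np

def pot_decluster(series, threshold, min_gap):
    """Runs declustering of peaks-over-threshold exceedances.

    Exceedances closer than `min_gap` steps belong to one cluster; only the
    index of each cluster's maximum is returned as an independent event.
    """
    exceed = np.flatnonzero(series > threshold)
    events = []
    if exceed.size:
        start = prev = exceed[0]
        for i in exceed[1:]:
            if i - prev > min_gap:        # gap closes the current cluster
                seg = series[start:prev + 1]
                events.append(start + int(np.argmax(seg)))
                start = i
            prev = i
        seg = series[start:prev + 1]      # close the final cluster
        events.append(start + int(np.argmax(seg)))
    return np.array(events, dtype=int)
```

Applying this per gauge and then reconciling event dates across gauges via their cross-correlation is the multi-site extension the abstract describes.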
Forecasting the Solar Drivers of Solar Energetic Particle Events
NASA Technical Reports Server (NTRS)
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
2012-01-01
Large flares and fast CMEs are the drivers of the most severe space weather, including Solar Energetic Particle Events (SEP Events). Large flares and their co-produced CMEs are powered by the explosive release of free magnetic energy stored in non-potential magnetic fields of sunspot active regions. The free energy is stored in and released from the low-beta regime of the active region's magnetic field above the photosphere, in the chromosphere and low corona. From our work over the past decade and from similar work of several other groups, it is now well established that (1) a proxy of the free magnetic energy stored above the photosphere can be measured from photospheric magnetograms, maps of the measured field in the photosphere, and (2) an active region's rate of production of major CME/flare eruptions in the coming day or so is strongly correlated with its present measured value of the free-energy proxy. These results have led us to use the large database of SOHO/MDI full-disk magnetograms spanning Solar Cycle 23 to obtain empirical forecasting curves that, from an active region's present measured value of the free-energy proxy, give the active region's expected rates of production of major flares, CMEs, fast CMEs, and SEP Events in the coming day or so (Falconer et al. 2011, Space Weather, 9, S04003). We will present these forecasting curves and demonstrate the accuracy of their forecasts. In addition, we will show that the forecasts for major flares and fast CMEs can be made significantly more accurate by taking into account not only the value of the free-energy proxy but also the active region's recent productivity of major flares; specifically, whether the active region has produced a major flare (GOES class M or X) during the 24 hours before the time of the measured magnetogram.
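The shape of such an empirical forecasting curve can be sketched as a power law in the proxy with a persistence boost for recent flaring; the constants `a`, `alpha`, and `boost` below are invented placeholders, not the fitted values of Falconer et al.

```python
import math

# Hypothetical empirical forecasting curve: expected number of major
# events per day as a power law in the free-magnetic-energy proxy,
# boosted when the region produced a GOES M/X flare in the prior 24 h.
# a, alpha, and boost are invented placeholders, NOT fitted values.
def expected_event_rate(free_energy_proxy, flared_last_24h,
                        a=1e-6, alpha=1.2, boost=2.0):
    rate = a * free_energy_proxy ** alpha
    return rate * boost if flared_last_24h else rate

# Poisson probability of at least one event in the coming `days` days.
def event_probability(rate, days=1.0):
    return 1.0 - math.exp(-rate * days)
```

Converting the rate to a probability via a Poisson assumption is one common way such curves are turned into day-ahead forecasts.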
Monthly forecasting of agricultural pests in Switzerland
NASA Astrophysics Data System (ADS)
Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.
2012-04-01
Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info) currently in operation in Switzerland relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into hourly weather series as required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and the limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill of occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. 
This is true both in terms of the root mean squared errors and the continuous ranked probability scores of the probabilistic forecasts, compared with the mean absolute errors of the deterministic system. Also, applying the climate conserving recalibration (CCR, Weigel et al. 2009) technique successfully corrects the under-confidence in the forecasted occurrences of codling moth life phases. Reference: Weigel, A. P., M. A. Liniger, and C. Appenzeller, 2009: Seasonal Ensemble Forecasts: Are Recalibrated Single Models Better than Multimodels? Mon. Wea. Rev., 137, 1460-1479.
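Step (ii) of the downscaling above, re-sampling generated series so their weekly means track the MOFC anomalies, can be sketched as a nearest-match search over the weather-generator pool; all names and numbers here are illustrative only.

```python
# Illustrative step (ii): from a weather-generator pool, pick the
# daily series whose weekly means best track the weekly MOFC targets.
def weekly_means(daily):
    return [sum(daily[i:i + 7]) / 7 for i in range(0, len(daily), 7)]

def pick_series(pool, target_weekly):
    def cost(series):
        return sum((a - b) ** 2
                   for a, b in zip(weekly_means(series), target_weekly))
    return min(pool, key=cost)

# Toy pool of two 14-day temperature series and a two-week target.
pool = [[10.0] * 7 + [12.0] * 7,
        [15.0] * 7 + [9.0] * 7]
target = [14.0, 10.0]
best = pick_series(pool, target)  # the second series matches better
```

The selected daily series would then be refined to hourly resolution in step (iii) before being fed to the pest model.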
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade, both by improving its reliability and by reducing the ensemble-mean error. The largest uncertainties in the model arise from the physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with the stochastically perturbed parameterization tendencies (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We focus especially on forecasts of tropical convection and dynamics during the MJO events of October-November 2011. These are well-studied events for MJO dynamics, as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to degrade certain large-scale dynamical fields relative to the SPPT approach used operationally at ECMWF.
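For reference, the SPPT idea in its barest form multiplies the net physics tendency by (1 + r) with a zero-mean random value r. Operational SPPT uses spatially and temporally correlated random patterns with amplitude limits; the single scalar draw below is purely an illustrative sketch.

```python
import random

# Barest form of SPPT: multiply each physics tendency by (1 + r),
# where r is a zero-mean random value, clipped so the multiplicative
# factor stays positive. Operational SPPT uses spatially/temporally
# correlated patterns; a single scalar draw is used here only to
# illustrate the multiplicative structure.
def sppt_perturb(tendencies, sigma=0.3, rng=None):
    rng = rng or random.Random(0)
    r = max(-0.9, min(0.9, rng.gauss(0.0, sigma)))
    return [(1.0 + r) * t for t in tendencies]
```

Because every tendency in a column is scaled by the same factor, the relative partitioning between processes is preserved, which is a deliberate design choice of the scheme.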
NASA Astrophysics Data System (ADS)
Brown, James; Seo, Dong-Jun
2010-05-01
Operational forecasts of hydrometeorological and hydrologic variables often contain large uncertainties, for which ensemble techniques are increasingly used. However, the utility of ensemble forecasts depends on the unbiasedness of the forecast probabilities. We describe a technique for quantifying and removing biases from ensemble forecasts of hydrometeorological and hydrologic variables, intended for use in operational forecasting. The technique makes no a priori assumptions about the distributional form of the variables, which is often unknown or difficult to model parametrically. The aim is to estimate the conditional cumulative distribution function (ccdf) of the observed variable given a (possibly biased) real-time ensemble forecast from one or several forecasting systems (multi-model ensembles). The technique is based on Bayesian optimal linear estimation of indicator variables, and is analogous to indicator cokriging (ICK) in geostatistics. By developing linear estimators for the conditional expectation of the observed variable at many thresholds, ICK provides a discrete approximation of the full ccdf. Since ICK minimizes the conditional error variance of the indicator expectation at each threshold, it effectively minimizes the Continuous Ranked Probability Score (CRPS) when infinitely many thresholds are employed. However, the ensemble members used as predictors in ICK, and other bias-correction techniques, are often highly cross-correlated, both within and between models. Thus, we propose an orthogonal transform of the predictors used in ICK, which is analogous to using their principal components in the linear system of equations. This leads to a well-posed problem in which a minimum number of predictors are used to provide maximum information content in terms of the total variance explained. 
The technique is used to bias-correct precipitation ensemble forecasts from the NCEP Global Ensemble Forecast System (GEFS), for which independent validation results are presented. Extension to multimodel ensembles from the NCEP GFS and Short Range Ensemble Forecast (SREF) systems is also proposed.
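The threshold-indicator idea behind ICK can be illustrated with a one-predictor least-squares stand-in: regress the observed indicator at each threshold on the ensemble exceedance fraction, then assemble a discrete conditional cdf. Real ICK solves a cokriging system over many cross-correlated member indicators (optionally after the orthogonal transform described above); the functions and toy data here are illustrative.

```python
# One-predictor least-squares stand-in for indicator cokriging (ICK):
# at each threshold u, fit E[ 1{obs <= u} | x ] = b0 + b1 * x, where
# x is the fraction of ensemble members at or below u.
def fit_indicator(train_ens, train_obs, thresholds):
    models = []
    for u in thresholds:
        xs = [sum(m <= u for m in ens) / len(ens) for ens in train_ens]
        ys = [1.0 if obs <= u else 0.0 for obs in train_obs]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
              if sxx else 0.0)
        models.append((u, my - b1 * mx, b1))
    return models

# Discrete approximation of the conditional cdf for a new forecast,
# clipped to [0, 1] and made non-decreasing across thresholds.
def predict_cdf(models, ens):
    cdf, prev = [], 0.0
    for u, b0, b1 in models:
        x = sum(m <= u for m in ens) / len(ens)
        p = min(1.0, max(prev, max(0.0, b0 + b1 * x)))
        cdf.append((u, p))
        prev = p
    return cdf
```

Evaluating the cdf at many thresholds approximates the full ccdf, which is the quantity ICK targets and the reason it effectively minimizes the CRPS in the limit of infinitely many thresholds.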
NASA Astrophysics Data System (ADS)
Giebel, Gregor; Cline, Joel; Frank, Helmut; Shaw, Will; Pinson, Pierre; Hodge, Bri-Mathias; Kariniotakis, Georges; Sempreviva, Anna Maria; Draxl, Caroline
2017-04-01
Wind power forecasts have been used operationally for over 20 years. Despite this, there are still several possibilities to improve the forecasts, both on the weather prediction side and in the usage of the forecasts. The new International Energy Agency (IEA) Task on Wind Power Forecasting organises international collaboration among national weather centres with an interest in and/or large projects on wind forecast improvement (NOAA, DWD, UK Met Office, ...) and operational forecasters and forecast users. The Task is divided into three work packages. Firstly, a collaboration on improving the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets for verification. Secondly, we will be aiming at an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts, aimed at industry and forecasters alike; this WP will also organise benchmarks, in cooperation with the IEA Task WakeBench. Thirdly, we will be engaging end users, aiming at dissemination of best practice in the usage of wind power predictions, especially probabilistic ones. The Operating Agent is Gregor Giebel of DTU; the Co-Operating Agent is Joel Cline of the US Department of Energy. Collaboration in the Task is solicited from everyone interested in the forecasting business. We will collaborate with IEA Task 31 WakeBench, which developed the Windbench benchmarking platform that this Task will use for forecasting benchmarks. The Task runs for three years, 2016-2018.
Main deliverables are an up-to-date list of current projects and main project results, including datasets that can be used by researchers around the world to improve their own models; an IEA Recommended Practice on performance evaluation of probabilistic forecasts; a position paper regarding the use of probabilistic forecasts; and one or more benchmark studies implemented on the Windbench platform hosted at CENER. Additionally, spreading relevant information in both the forecaster and user communities is paramount. The poster also shows the work done in the first half of the Task, e.g. the collection of available datasets and the lessons learned from a public workshop on 9 June in Barcelona on Experiences with the Use of Forecasts and Gaps in Research. Participation is open to all interested parties in member states of the IEA Annex on Wind Power; see ieawind.org for the up-to-date list. For collaboration, please contact the author (grgi@dtu.dk).
Coal resources, reserves and peak coal production in the United States
Milici, Robert C.; Flores, Romeo M.; Stricker, Gary D.
2013-01-01
Despite the United States' large endowment of coal resources, recent studies have indicated that its coal production is destined to reach a maximum and begin an irreversible decline sometime during the middle of the current century. However, the coal reserve data essential for making accurate forecasts of United States coal production have not been compiled on a national basis, so there is a great deal of uncertainty in the accuracy of the production forecasts. A very large percentage of the coal mined in the United States comes from a few large-scale mines (mega-mines) in the Powder River Basin of Wyoming and Montana. Reported reserves at these mines do not account for future potential reserves or for future developments in technology that may convert coal currently classified as resources into reserves. In order to maintain United States coal production at or near current levels for an extended period of time, existing mines will eventually have to increase their recoverable reserves and/or new large-scale mines will have to be opened elsewhere. Accordingly, to facilitate energy planning for the United States, this paper suggests that probabilistic assessments of the country's remaining coal reserves would improve long-range forecasts of coal production. As in the United States coal assessment projects currently being conducted, a major priority of such probabilistic assessments would be to identify the number and size of the remaining large blocks of coal capable of supporting large-scale mining operations for extended periods, and to conduct economic evaluations of those resources.
NASA Technical Reports Server (NTRS)
Sippel, Jason A.; Zhang, Fuqing; Weng, Yonghui; Braun, Scott A.; Cecil, Daniel J.
2015-01-01
This study explores the potential of assimilating data from multiple instruments onboard high-altitude, long-endurance unmanned aircraft to improve hurricane analyses and forecasts. A recent study found a significant positive impact on analyses and forecasts of Hurricane Karl when an ensemble Kalman filter was used to assimilate data from the High-altitude Imaging Wind and Rain Airborne Profiler (HIWRAP), a new Doppler radar onboard the NASA Global Hawk (GH) unmanned airborne system. The GH can also carry other useful instruments, including dropsondes and the Hurricane Imaging Radiometer (HIRAD), which is a new radiometer that estimates large swaths of wind speeds and rainfall at the ocean surface. The primary finding is that simultaneously assimilating data from HIWRAP and the other GH-compatible instruments results in further analysis and forecast improvement for Karl. The greatest improvement comes when HIWRAP, HIRAD, and dropsonde data are simultaneously assimilated.
Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario
2018-02-01
Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.
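The inverse power-law forecast logic above can be illustrated for the classical exponent p = 1, where the inverse event rate declines linearly to zero at the failure time; this is the textbook Failure Forecast Method, and the Bayesian Markov chain Monte Carlo gamma point-process treatment in the paper is a far more careful version of the same trend. The values below are synthetic.

```python
# Failure Forecast Method sketch for exponent p = 1: if the event
# rate follows rate(t) = k / (tf - t), then 1/rate is linear in t and
# the least-squares line through (t, 1/rate) crosses zero at tf.
def forecast_failure_time(times, rates):
    ys = [1.0 / r for r in rates]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    intercept = my - slope * mt
    return -intercept / slope  # zero crossing of the fitted line

# Synthetic drumbeat LP rates with true failure time tf = 30 h:
tf, k = 30.0, 60.0
times = [0.0, 5.0, 10.0, 15.0, 20.0, 25.0]
rates = [k / (tf - t) for t in times]
t_hat = forecast_failure_time(times, rates)  # recovers 30.0
```

With noise-free synthetic rates the fit recovers the failure time exactly; real sequences require the uncertainty quantification the paper's Bayesian approach provides.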
Spatial and temporal scales of time-averaged 700 mb height anomalies
NASA Technical Reports Server (NTRS)
Gutzler, D.
1981-01-01
The monthly and seasonal forecasting technique is based to a large extent on the extrapolation of trends in the positions of the centers of time averaged geopotential height anomalies. The complete forecasted height pattern is subsequently drawn around the forecasted anomaly centers. The efficacy of this technique was tested and time series of observed monthly mean and 5 day mean 700 mb geopotential heights were examined. Autocorrelation statistics are generated to document the tendency for persistence of anomalies. These statistics are compared to a red noise hypothesis to check for evidence of possible preferred time scales of persistence. Space-time spectral analyses at middle latitudes are checked for evidence of periodicities which could be associated with predictable month-to-month trends. A local measure of the average spatial scale of anomalies is devised for guidance in the completion of the anomaly pattern around the forecasted centers.
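The persistence check described above amounts to comparing sample autocorrelations against an AR(1) "red noise" benchmark, for which the autocorrelation at a given lag is simply the lag-1 value raised to that lag. A minimal sketch, with hypothetical function names:

```python
# Sample autocorrelation of a series at a given lag, and the AR(1)
# ("red noise") benchmark r(lag) = r1 ** lag it is compared against.
# Observed persistence exceeding the benchmark at longer lags hints
# at preferred time scales beyond simple red noise.
def autocorr(x, lag):
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    cov = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    return cov / var

def red_noise_acf(r1, lag):
    return r1 ** lag
```

Comparing `autocorr(series, lag)` with `red_noise_acf(autocorr(series, 1), lag)` over increasing lags is the essence of the test.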
Tourism demand in the Algarve region: Evolution and forecast using SVARMA models
NASA Astrophysics Data System (ADS)
Lopes, Isabel Cristina; Soares, Filomena; Silva, Eliana Costa e.
2017-06-01
Tourism is one of the Portuguese economy's key sectors, and its relative weight has grown over recent years. The Algarve region is particularly focused on attracting foreign tourists and has built up over the years a large and diversified offer of hotel units. In this paper we present a multivariate time series approach to forecast the number of overnight stays in hotel units (hotels, guesthouses or hostels, and tourist apartments) in the Algarve. We fit a seasonal vector autoregressive moving average (SVARMA) model to monthly data between 2006 and 2016. The forecast values were compared with the actual overnight stays in the Algarve in 2016, yielding a MAPE of 15.1% and an RMSE of 53,847.28; the MAPE for the hotel series was only 4.56%. These forecasts can be used by hotel managers to predict occupancy and to determine the best pricing policy.
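The two scores quoted above are computed as follows; the toy numbers are not the paper's data.

```python
import math

# MAPE (in percent) and RMSE, the two scores used to evaluate the
# SVARMA overnight-stay forecasts.
def mape(actual, forecast):
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    return math.sqrt(sum((a - f) ** 2
                         for a, f in zip(actual, forecast)) / len(actual))

# Toy monthly overnight-stay values (not the paper's data):
actual   = [100.0, 200.0, 400.0]
forecast = [110.0, 180.0, 400.0]
scores = (mape(actual, forecast), rmse(actual, forecast))
```

Note that MAPE is scale-free while RMSE is in the units of the series, which is why the paper reports both.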
Parsons, Thomas E.; Geist, Eric L.
2009-01-01
The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
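The Gutenberg-Richter extrapolation referred to above is simple enough to state in code. The Aki maximum-likelihood b-value estimator is standard; the example coefficients are invented, not values from the paper.

```python
import math

# Aki maximum-likelihood b-value from magnitudes at or above the
# completeness magnitude m_c: b = log10(e) / (mean(M) - m_c).
def gr_b_value(mags, m_c):
    above = [m for m in mags if m >= m_c]
    return math.log10(math.e) / (sum(above) / len(above) - m_c)

# Gutenberg-Richter cumulative rate of events with magnitude >= m:
# N(>=m) = 10 ** (a - b * m). Extrapolating a fit to abundant small
# events yields a large-earthquake rate with quantifiable uncertainty.
def gr_rate(a, b, m):
    return 10.0 ** (a - b * m)

# Invented example: a = 4, b = 1 gives 0.01 M>=6 events per year.
rate_m6 = gr_rate(4.0, 1.0, 6.0)
```

Fitting a and b to an instrumental catalog for a fault zone and reading off N(>=M) is exactly the "simpler model" the paper weighs against characteristic-earthquake assumptions.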
Forecasting intense geomagnetic activity using interplanetary magnetic field data
NASA Astrophysics Data System (ADS)
Saiz, E.; Cid, C.; Cerrato, Y.
2008-12-01
Southward interplanetary magnetic fields are considered traces of geoeffectiveness since they are a main agent of magnetic reconnection between the solar wind and the magnetosphere. The first part of this work reviews the ability to forecast intense geomagnetic activity using the different procedures available in the literature and shows that current methods do not succeed in making confident predictions. This fact led us to develop a new forecasting procedure, which provides trustworthy predictions of large variations of the Dst index over a sample of 10 years of observations and is based on the value of Bz alone. The proposed method is a worthy tool for space weather purposes because it is not affected by the lack of solar wind plasma data that commonly occurs during severe geomagnetic activity. Moreover, the results obtained lead us to a new interpretation, using Faraday's law, of the physical mechanisms involved in the interaction between the solar wind and the magnetosphere.
NASA Astrophysics Data System (ADS)
Stephens, E.; Day, J. J.; Pappenberger, F.; Cloke, H.
2015-12-01
There are a number of factors that lead to nonlinearity between precipitation anomalies and flood hazard; this nonlinearity is a pertinent issue for applications that use a precipitation forecast as a proxy for imminent flood hazard. We assessed the degree of this nonlinearity for the first time using a recently developed global-scale hydrological model driven by the ERA-Interim/Land precipitation reanalysis (1980-2010). We introduced new indices to assess large-scale flood hazard, or floodiness, and quantified the link between monthly precipitation, river discharge, and floodiness anomalies at the global and regional scales. The results show that monthly floodiness is not well correlated with precipitation, therefore demonstrating the value of hydrometeorological systems for providing floodiness forecasts for decision-makers. A method is described for forecasting floodiness using the Global Flood Awareness System, building a climatology of regional floodiness from which to forecast floodiness anomalies out to 2 weeks.
NASA Astrophysics Data System (ADS)
Bewley, Thomas
2015-11-01
Accurate long-term forecasts of the path and intensity of hurricanes are imperative to protect property and save lives. Accurate estimates and forecasts of the spread of large-scale contaminant plumes, such as those from Deepwater Horizon, Fukushima, and recent volcanic eruptions in Iceland, are essential for assessing environmental impact, coordinating remediation efforts, and, in certain cases, moving people out of harm's way. The challenges in estimating and forecasting such systems include: (a) environmental flow modeling, (b) high-performance real-time computing, (c) assimilating measured data into numerical simulations, and (d) acquiring in-situ data, beyond what can be measured from satellites, that is maximally relevant for reducing forecast uncertainty. This talk will focus on new techniques for addressing (c) and (d), namely data assimilation and adaptive observation, in both hurricanes and large-scale environmental plumes. In particular, we will present a new technique for the energy-efficient coordination of swarms of sensor-laden balloons for persistent, in-situ, distributed, real-time measurement of developing hurricanes, leveraging buoyancy control only (coupled with the predictable and strongly stratified flowfield within the hurricane). Animations of these results are available at http://flowcontrol.ucsd.edu/3dhurricane.mp4 and http://flowcontrol.ucsd.edu/katrina.mp4. We will also survey our unique hybridization of the venerable ensemble Kalman and variational approaches to large-scale data assimilation in environmental flow systems, and show how essentially the dual of this hybrid approach may be used to solve the adaptive observation problem in a uniquely effective and rigorous fashion.
Large-Scale Traffic Microsimulation From An MPO Perspective
DOT National Transportation Integrated Search
1997-01-01
One potential advancement of the four-step travel model process is the forecasting and simulation of individual activities and travel. A common concern with such an approach is that the data and computational requirements for a large-scale, regional ...
Large earthquake rates from geologic, geodetic, and seismological perspectives
NASA Astrophysics Data System (ADS)
Jackson, D. D.
2017-12-01
Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include the temporal behavior of seismic and tectonic moment rates; the shape of the earthquake magnitude distribution; the upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; the value of crustal rigidity; and the relation between faults at depth and their surface fault traces, to name just a few. In this report I will estimate the quantitative implications for estimating large earthquake rates.
Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes up to about magnitude 7. Regional forecasts for a few decades, like those in UCERF3, could be improved by calibrating tectonic moment rate to past seismicity rates. Century-long forecasts must be speculative. Estimates of maximum magnitude and of the rate of giant earthquakes over geologic time scales require more than science.
Forecasting distribution of numbers of large fires
Haiganoush K. Preisler; Jeff Eidenshink; Stephen Howard; Robert E. Burgan
2015-01-01
Systems to estimate forest fire potential commonly utilize one or more indexes that relate to expected fire behavior; however they indicate neither the chance that a large fire will occur, nor the expected number of large fires. That is, they do not quantify the probabilistic nature of fire danger. In this work we use large fire occurrence information from the...
Long-term flow forecasts based on climate and hydrologic modeling: Uruguay River basin
NASA Astrophysics Data System (ADS)
Tucci, Carlos Eduardo Morelli; Clarke, Robin Thomas; Collischonn, Walter; da Silva Dias, Pedro Leite; de Oliveira, Gilvan Sampaio
2003-07-01
This paper describes a procedure for predicting seasonal flow in the Rio Uruguay drainage basin (area 75,000 km2, lying in Brazilian territory), using sequences of future daily rainfall given by the global climate model (GCM) of the Brazilian agency for climate prediction (Centro de Previsão de Tempo e Clima, or CPTEC). These rainfall sequences were used as input to a rainfall-runoff model appropriate for large drainage basins. Forecasts of flow in the Rio Uruguay were made for the period 1995-2001 of the full record, which began in 1940. Analysis showed that GCM forecasts underestimated rainfall over almost all the basin, particularly in winter, although interannual variability in regional rainfall was reproduced relatively well. A statistical procedure was used to correct for the underestimation of rainfall. When the corrected rainfall sequences were transformed to flow by the hydrologic model, forecasts of flow in the Rio Uruguay basin were better than forecasts based on historic mean or median flows by 37% for monthly flows and by 54% for 3-monthly flows.
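The rainfall correction step is not specified in the abstract; one common and plausible stand-in is a multiplicative climatological scaling, sketched below with invented numbers.

```python
# One plausible stand-in for the (unspecified) statistical correction:
# scale each forecast month by the ratio of observed to GCM
# climatological rainfall for that calendar month.
def scaling_correction(gcm_monthly, obs_clim, gcm_clim):
    return [g * (obs_clim[i % 12] / gcm_clim[i % 12])
            for i, g in enumerate(gcm_monthly)]

# Toy case: the GCM underestimates rainfall by 20% in every month.
corrected = scaling_correction([40.0, 80.0], [100.0] * 12, [80.0] * 12)
# corrected == [50.0, 100.0]
```

Monthly scaling factors correct the seasonal bias (strongest in winter here) while leaving the GCM's interannual variability, which the paper found realistic, intact.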
Statistical and Machine Learning forecasting methods: Concerns and ways forward
Makridakis, Spyros; Assimakopoulos, Vassilios
2018-01-01
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
Load Forecasting in Electric Utility Integrated Resource Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H
Integrated resource planning (IRP) is a process used by many vertically-integrated U.S. electric utilities to determine least-cost/risk supply and demand-side resources that meet government policy objectives and future obligations to customers and, in many cases, shareholders. Forecasts of energy and peak demand are a critical component of the IRP process. There have been few, if any, quantitative studies of IRP long-run (planning horizons of two decades) load forecast performance and its relationship to resource planning and actual procurement decisions. In this paper, we evaluate load forecasting methods, assumptions, and outcomes for 12 Western U.S. utilities by examining and comparing plans filed in the early 2000s against recent plans, up to year 2014. We find a convergence in the methods and data sources used. We also find that forecasts in more recent IRPs generally took account of new information, but that there continued to be a systematic over-estimation of load growth rates during the period studied. We compare planned and procured resource expansion against customer load and year-to-year load growth rates, but do not find a direct relationship. Load sensitivities performed in resource plans do not appear to be related to later procurement strategies even in the presence of large forecast errors. These findings suggest that resource procurement decisions may be driven by other factors than customer load growth. Our results have important implications for the integrated resource planning process, namely that load forecast accuracy may not be as important for resource procurement as is generally believed, that load forecast sensitivities could be used to improve the procurement process, and that management of load uncertainty should be prioritized over more complex forecasting techniques.
Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power
NASA Astrophysics Data System (ADS)
Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab
2014-01-01
Distribution systems are undergoing significant changes as they evolve toward the grids of the future, known as smart grids (SGs). The vision for SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce system losses, and improve power quality. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reductions and increased efficiency of PV panels and interfacing converters. The ability to forecast power production accurately and reliably is of primary importance for the appropriate management of an SG and for decisions in the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of forecasts of PV power production. Unfortunately, the indices in common use have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that directly account for the economic consequences of forecasting errors. The proposed indices were also compared with the most frequently used indices in order to demonstrate their improved capability. The comparisons were based on results obtained with a forecasting method based on an artificial neural network, chosen because it is deemed one of the most promising methods available for forecasting PV power. Numerical applications considering an actual PV plant are also presented to provide evidence of the forecasting performance of all the indices considered.
Seasonal drought predictability in Portugal using statistical-dynamical techniques
NASA Astrophysics Data System (ADS)
Ribeiro, A. F. S.; Pires, C. A. L.
2016-08-01
Atmospheric forecasting and predictability are important to promote adaptation and mitigation measures in order to minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3-months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead times up to 6 months. ERA-Interim reanalysis data are used to build a set of SPI predictors integrating recent past information prior to the forecast launch. Then, the advantage of combining predictors with both dynamical and statistical backgrounds in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. The second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for all four seasons; winter is found to be the most predictable season, and most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and provide guidance for users (such as farmers) in their decision-making processes.
Probabilistic flood warning using grand ensemble weather forecasts
NASA Astrophysics Data System (ADS)
He, Y.; Wetterhall, F.; Cloke, H.; Pappenberger, F.; Wilson, M.; Freer, J.; McGregor, G.
2009-04-01
As the severity of floods increases, possibly due to climate and land-use change, there is an urgent need for more effective and reliable warning systems. The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. An ensemble of weather forecasts from one Ensemble Prediction System (EPS), when applied to catchment hydrology, can provide improved early flood warning as some of the uncertainties can be quantified. EPS forecasts from a single weather centre only account for part of the uncertainties originating from initial conditions and stochastic physics. Other sources of uncertainty, including numerical implementations and/or data assimilation, can only be assessed if a grand ensemble of EPSs from different weather centres is used. When various models that produce EPS from different weather centres are aggregated, the probabilistic nature of the ensemble precipitation forecasts can be better retained and accounted for. The availability of twelve global EPSs through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the design of an improved probabilistic flood forecasting framework. This work presents a case study using the TIGGE database for flood warning on a meso-scale catchment. The upper reach of the River Severn catchment located in the Midlands Region of England is selected due to its abundant data for investigation and its relatively small size (4062 km2) compared to the resolution of the NWPs.
This choice was deliberate as we hypothesize that the uncertainty in the forcing of smaller catchments cannot be represented by a single EPS with a very limited number of ensemble members, but only through the variance given by a large number of ensembles and ensemble systems. A coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts is set up to study the potential benefits of using the TIGGE database in early flood warning. The physically based and fully distributed LISFLOOD suite of models is selected to simulate discharge and flood inundation consecutively. The results show that the TIGGE database is a promising tool to produce forecasts of discharge and flood inundation comparable with the observed discharge and simulated inundation driven by the observed discharge. The spread of discharge forecasts varies from centre to centre, but it is generally large, implying a significant level of uncertainty. Precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial variability of precipitation on a comparatively small catchment. This perhaps indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. It is not necessarily true that early flood warning becomes more reliable when more ensemble forecasts are employed. It is difficult to identify the best forecast centre(s), but in general the chance of detecting floods is increased by using the TIGGE database. Only one flood event was studied because most of the TIGGE data became available after October 2007. It is necessary to test the TIGGE ensemble forecasts with other flood events in other catchments with different hydrological and climatic regimes before general conclusions can be drawn about its robustness and applicability.
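A minimal sketch of how a grand ensemble supports probabilistic warning: pool members from several EPSs and compute the probability of exceeding a warning threshold. The member counts, discharge values, and the 30% warning trigger below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend three forecast centres contribute EPSs of different sizes;
# each member is a forecast discharge (m^3/s) at one lead time.
ensembles = [rng.normal(loc=mu, scale=15.0, size=n)
             for mu, n in [(95.0, 50), (105.0, 20), (90.0, 30)]]
grand = np.concatenate(ensembles)        # pooled grand ensemble (100 members)

threshold = 110.0                        # assumed flood-warning discharge
p_exceed = float(np.mean(grand > threshold))   # fraction of members above it

warn = p_exceed > 0.3                    # issue a warning above 30% probability
print(f"P(discharge > {threshold}) = {p_exceed:.2f}, warn = {warn}")
```

Pooling retains the probabilistic character of all contributing systems, which is why a grand ensemble can sample uncertainties (model formulation, data assimilation) that no single centre's EPS covers.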
Characterisation of flooding in Alexandria in October 2015 and suggested mitigating measures
NASA Astrophysics Data System (ADS)
Bhattacharya, Biswa; Zevenbergen, Chris; Wahaab, R. A.; Elbarki, W. A. I.; Busker, T.; Salinas Rodriguez, C. N. A.
2017-04-01
In October 2015, Alexandria (Egypt) experienced exceptional flooding. The flooding was caused by heavy rainfall in a short period of time in a city that normally does not receive a large amount of rainfall. The heavy rainfall produced a tremendous volume of runoff, which the city's drainage system was unable to drain off to the Mediterranean Sea. Seven people died in the flood, and there were huge direct and indirect damages. The city does not have a flood forecasting system. An analysis with rainfall forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) showed that the extreme rainfall could have been forecast about a week in advance. If a flood forecasting model had been in place, the flooding could have been predicted well beforehand. Alexandria, like several other Arab cities, is not prepared for natural hazards. Preparedness actions leading to improved adaptation and resilience are not in place. The situation is further exacerbated by rapid urbanisation and climate change. The local authorities estimate that about 30,000 new buildings have been (illegally) constructed during the last five years at a location near the main pumping station (Max Point). This may have a very serious adverse effect on the hydrology and requires further study to estimate the additional runoff from the newly urbanised areas. The World Bank has listed Alexandria as one of five coastal cities that may face very significant risk of coastal flooding due to climate change. Setting up a flood forecasting model, along with evidence-based research on the drainage system's capacity, is seen as an immediate action that can significantly improve the city's preparedness for flooding. Furthermore, the region has a number of large lakes that could potentially be used to store extra water as a flood mitigation measure.
Two water bodies, namely the Maryot Lake and the Airport Lake, are identified from which water can be pumped out in advance to keep storage available in case of flooding. Keywords: Alexandria, flood, Egypt, rainfall, forecasting.
NASA Astrophysics Data System (ADS)
Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain
2015-04-01
We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares, with a projection to coronal mass ejections (CMEs) (occurrence and velocity) and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near-real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, and it utilizes analysis of a large number of AR magnetograms. For the prediction of SEP events, a new reductive statistical method has been implemented based on a newly constructed database of solar flares, CMEs and SEP events that covers the large time span 1984-2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all > C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near-real-time data and is 15-20 minutes. We discuss the modules of the FORSPEF system, their interconnection and the operational set-up. The dual approach in the development of FORSPEF (i.e. forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution.
This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK
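The reductive statistical nowcasting step (flare size, flare longitude, and CME occurrence mapped to an SEP-event probability) can be caricatured as a conditional-probability lookup. All probabilities and the simple west/east split below are hypothetical placeholders, not FORSPEF's actual values.

```python
# P(SEP) conditioned on flare magnitude class, magnetic-connectivity hemisphere,
# and whether a CME was observed. Every number here is an invented placeholder.
SEP_PROB = {
    # (flare_class, western_hemisphere, cme_observed): P(SEP)
    ("M", True, True): 0.35,
    ("M", True, False): 0.10,
    ("M", False, True): 0.15,
    ("M", False, False): 0.03,
    ("X", True, True): 0.70,
    ("X", True, False): 0.30,
    ("X", False, True): 0.40,
    ("X", False, False): 0.10,
}

def sep_probability(flare_class, longitude_deg, cme_observed):
    """Nowcast P(SEP) for an M- or X-class flare; longitude > 0 counts as west,
    where magnetic connectivity to Earth is typically better."""
    western = longitude_deg > 0
    return SEP_PROB[(flare_class, western, cme_observed)]

print(sep_probability("X", 60, True))   # well-connected X-class flare with a CME
```

The real method would be built by reducing the 1984-2013 flare/CME/SEP database into such conditional occurrence rates, refreshed as near-real-time flare and CME alerts arrive.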
NASA Astrophysics Data System (ADS)
Hill, A.; Weiss, C.; Ancell, B. C.
2017-12-01
The basic premise of observation targeting is that additional observations, when gathered and assimilated with a numerical weather prediction (NWP) model, will produce a more accurate forecast related to a specific phenomenon. Ensemble-sensitivity analysis (ESA; Ancell and Hakim 2007; Torn and Hakim 2008) is a tool capable of accurately estimating the proper location of targeted observations in areas that have initial model uncertainty and large error growth, as well as predicting the reduction of forecast variance due to the assimilated observation. ESA relates an ensemble of NWP model forecasts, specifically an ensemble of scalar forecast metrics, linearly to earlier model states. A thorough investigation is presented to determine how different factors of the forecast process affect our ability to successfully target new observations for mesoscale convection forecasts. Our primary goals for this work are to determine: (1) whether targeted observations have more positive impact than non-targeted (i.e. randomly chosen) observations; (2) whether there are lead-time constraints to targeting for convection; (3) how inflation, localization, and the assimilation filter influence impact prediction and realized results; (4) whether there exist differences between targeted observations at the surface versus aloft; and (5) how physics errors and nonlinearity may augment observation impacts. Ten cases of dryline-initiated convection between 2011 and 2013 are simulated within a simplified OSSE framework and presented here. Ensemble simulations are produced from a cycling system that utilizes the Weather Research and Forecasting (WRF) model v3.8.1 within the Data Assimilation Research Testbed (DART). A "truth" (nature) simulation is produced by supplying a 3-km WRF run with GFS analyses and integrating the model forward 90 hours, from the beginning of ensemble initialization through the end of the forecast.
Target locations for surface and radiosonde observations are computed 6, 12, and 18 hours into the forecast based on a chosen scalar forecast response metric (e.g., maximum reflectivity at convection initiation). A variety of experiments are designed to achieve the aforementioned goals and will be presented, along with their results, detailing the feasibility of targeting for mesoscale convection forecasts.
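The linear relation at the heart of ESA can be written as a per-variable regression: the sensitivity of a scalar forecast metric J to an earlier state variable x is estimated across ensemble members as cov(J, x)/var(x), and the same covariance yields a prediction of the variance reduction from assimilating a (here assumed perfect) observation of x. A synthetic-ensemble sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
n_members = 50

# Synthetic ensemble: an initial-state variable (e.g. a temperature, K) and a
# scalar forecast response metric J that depends on it with true slope 5.
x_init = rng.normal(300.0, 2.0, n_members)
j_metric = 5.0 * (x_init - 300.0) + rng.normal(0.0, 1.0, n_members)

# Ensemble sensitivity dJ/dx ~ cov(J, x) / var(x)
cov_jx = np.cov(j_metric, x_init)[0, 1]
sensitivity = cov_jx / np.var(x_init, ddof=1)
print(f"dJ/dx estimate: {sensitivity:.2f}")   # close to the true slope of 5

# Predicted forecast-variance reduction from one perfect observation of x
var_j = np.var(j_metric, ddof=1)
var_reduction = cov_jx ** 2 / np.var(x_init, ddof=1)
print(f"forecast variance: {var_j:.1f}, predicted reduction: {var_reduction:.1f}")
```

In practice the same regression is evaluated at every grid point and level, and the points with the largest predicted variance reduction become the targeting candidates; observation-error variance would also enter the reduction formula, which is omitted here for brevity.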
NASA Astrophysics Data System (ADS)
Huang, Ling; Luo, Yali
2017-08-01
Based on The Observing System Research and Predictability Experiment Interactive Grand Global Ensemble (TIGGE) data set, this study evaluates the ability of global ensemble prediction systems (EPSs) from the European Centre for Medium-Range Weather Forecasts (ECMWF), U.S. National Centers for Environmental Prediction, Japan Meteorological Agency (JMA), Korean Meteorological Administration, and China Meteorological Administration (CMA) to predict presummer rainy season (April-June) precipitation in south China. Evaluation of 5 day forecasts in three seasons (2013-2015) demonstrates the higher skill of probability matching forecasts compared to simple ensemble mean forecasts and shows that the deterministic forecast is a close second. The EPSs overestimate light-to-heavy rainfall (0.1 to 30 mm/12 h) and underestimate heavier rainfall (>30 mm/12 h), with JMA being the worst. By analyzing the synoptic situations predicted by the identified more skillful (ECMWF) and less skillful (JMA and CMA) EPSs and the ensemble sensitivity for four representative cases of torrential rainfall, the transport of warm-moist air into south China by the low-level southwesterly flow, upstream of the torrential rainfall regions, is found to be a key synoptic factor that controls the quantitative precipitation forecast. The results also suggest that prediction of locally produced torrential rainfall is more challenging than prediction of more extensively distributed torrential rainfall. A slight improvement in the performance is obtained by shortening the forecast lead time from 30-36 h to 18-24 h to 6-12 h for the cases with large-scale forcing, but not for the locally produced cases.
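The probability-matching technique credited with the higher skill above can be sketched as follows: keep the spatial pattern of the ensemble mean, but re-map its ranked values onto quantiles of the pooled member distribution, restoring realistic precipitation amplitudes that averaging smooths away. The field size, member count, and gamma rain distribution are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_members, n_points = 10, 100

# Synthetic 12-h rainfall forecasts: members x grid points (flattened field)
members = rng.gamma(shape=0.5, scale=10.0, size=(n_members, n_points))
ens_mean = members.mean(axis=0)            # smooth, low-amplitude field

# Target amplitudes: every n_members-th value of the pooled, sorted members,
# giving n_points values that follow the pooled member distribution.
pooled = np.sort(members.ravel())
targets = pooled[n_members // 2::n_members][:n_points]

# Probability-matched mean: assign the ranked targets to the mean's ranks
pm = np.empty(n_points)
pm[np.argsort(ens_mean)] = np.sort(targets)

print(ens_mean.max(), pm.max())            # PM restores heavier peak amounts
```

The resulting field has the placement skill of the ensemble mean and the amplitude distribution of the members, which is why it tends to verify better against heavy-rain thresholds.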
Deodhar, Suruchi; Bisset, Keith; Chen, Jiangzhuo; Barrett, Chris; Wilson, Mandy; Marathe, Madhav
2016-01-01
Public health decision makers need access to high resolution situation assessment tools for understanding the extent of various epidemics in different regions of the world. In addition, they need insights into the future course of epidemics by way of forecasts. Such forecasts are essential for planning the allocation of limited resources and for implementing several policy-level and behavioral intervention strategies. The need for such forecasting systems became evident in the wake of the recent Ebola outbreak in West Africa. We have developed EpiCaster, an integrated Web application for situation assessment and forecasting of various epidemics, such as Flu and Ebola, that are prevalent in different regions of the world. Using EpiCaster, users can assess the magnitude and severity of different epidemics at highly resolved spatio-temporal levels. EpiCaster provides time-varying heat maps and graphical plots to view trends in the disease dynamics. EpiCaster also allows users to visualize data gathered through surveillance mechanisms, such as Google Flu Trends (GFT) and the World Health Organization (WHO). The forecasts provided by EpiCaster are generated using different epidemiological models, and the users can select the models through the interface to filter the corresponding forecasts. EpiCaster also allows the users to study epidemic propagation in the presence of a number of intervention strategies specific to certain diseases. Here we describe the modeling techniques, methodologies and computational infrastructure that EpiCaster relies on to support large-scale predictive analytics for situation assessment and forecasting of global epidemics. PMID:27796009
NASA Astrophysics Data System (ADS)
Sedlmeier, Katrin; Gubler, Stefanie; Spierig, Christoph; Flubacher, Moritz; Maurer, Felix; Quevedo, Karim; Escajadillo, Yury; Avalos, Griña; Liniger, Mark A.; Schwierz, Cornelia
2017-04-01
Seasonal climate forecast products potentially have a high value for users of different sectors. During the first phase (2012-2015) of the project CLIMANDES (a pilot project of the Global Framework for Climate Services led by WMO [http://www.wmo.int/gfcs/climandes]), a demand study conducted with Peruvian farmers indicated a large interest in seasonal climate information for agriculture. The study further showed that the required information should be precise, timely, and understandable. In addition to the actual forecast, two complex measures are essential to understand seasonal climate predictions and their limitations correctly: forecast uncertainty and forecast skill. The former can be sampled by using an ensemble of climate simulations, the latter derived by comparing forecasts of past time periods to observations. Including uncertainty and skill information in an understandable way for end-users (who are often not technically educated) poses a great challenge. However, neglecting this information would lead to a false sense of determinism which could prove fatal to the credibility of climate information. Within the second phase (2016-2018) of the project CLIMANDES, one goal is to develop a prototype of a user-tailored seasonal forecast for the agricultural sector in Peru. In this local context, the basic education level of the rural farming community presents a major challenge for the communication of seasonal climate predictions. This contribution proposes different graphical presentations of climate forecasts along with possible approaches to visualize and communicate the associated skill and uncertainties, considering end users with varying levels of technical knowledge.
Salinity anomaly as a trigger for ENSO events
Zhu, Jieshun; Huang, Bohua; Zhang, Rong-Hua; Hu, Zeng-Zhen; Kumar, Arun; Balmaseda, Magdalena A.; Marx, Lawrence; Kinter III, James L.
2014-01-01
According to the classical theories of ENSO, subsurface anomalies in ocean thermal structure are precursors for ENSO events and their initial specification is essential for skillful ENSO forecasts. Although ocean salinity in the tropical Pacific (particularly in the western Pacific warm pool) can vary in response to El Niño events, its effect on ENSO evolution and forecasts of ENSO has been less explored. Here we present evidence that, in addition to the passive response, salinity variability may also play an active role in ENSO evolution, and thus be important in forecasting El Niño events. By comparing two forecast experiments in which the interannual variability of salinity in the ocean initial states is either included or excluded, the salinity variability is shown to be essential to correctly forecast the 2007/08 La Niña starting from April 2007. With realistic salinity initial states, the tendency of the subsurface cold condition to decay during the spring and early summer of 2007 was interrupted by positive salinity anomalies in the upper central Pacific, which, working together with the Bjerknes positive feedback, contributed to the development of the La Niña event. Our study suggests that ENSO forecasts will benefit from more accurate salinity observations with large-scale spatial coverage. PMID:25352285
Demand for satellite-provided domestic communications services up to the year 2000
NASA Technical Reports Server (NTRS)
Stevenson, S.; Poley, W.; Lekan, J.; Salzman, J. A.
1984-01-01
Three fixed service telecommunications demand assessment studies were completed for NASA by The Western Union Telegraph Company and the U.S. Telephone and Telegraph Corporation. They provided forecasts of the total U.S. domestic demand, from 1980 to the year 2000, for voice, data, and video services. That portion that is technically and economically suitable for transmission by satellite systems, both large trunking systems and customer premises services (CPS) systems was also estimated. In order to provide a single set of forecasts a NASA synthesis of the above studies was conducted. The services, associated forecast techniques, and data bases employed by both contractors were examined, those elements of each judged to be the most appropriate were selected, and new forecasts were made. The demand for voice, data, and video services was first forecast in fundamental units of call-seconds, bits/year, and channels, respectively. Transmission technology characteristics and capabilities were then forecast, and the fundamental demand converted to an equivalent transmission capacity. The potential demand for satellite-provided services was found to grow by a factor of 6, from 400 to 2400 equivalent 36 MHz satellite transponders over the 20-year period. About 80 percent of this was found to be more appropriate for trunking systems and 20 percent CPS.
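As a quick check of the quoted growth, a factor of 6 (400 to 2400 equivalent 36 MHz transponders) over the 20-year forecast horizon corresponds to a compound annual growth rate of roughly 9.4%:

```python
# Implied compound annual growth rate of the transponder-demand forecast
start, end, years = 400, 2400, 20
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%} per year")
```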
NASA Astrophysics Data System (ADS)
Lellouche, J. M.; Le Galloudec, O.; Greiner, E.; Garric, G.; Regnier, C.; Drillet, Y.
2016-02-01
Mercator Ocean currently delivers real-time daily services (weekly analyses and daily forecasts) with a global 1/12° high-resolution system. The model component is the NEMO platform, driven at the surface by the IFS ECMWF atmospheric analyses and forecasts. Observations are assimilated by means of a reduced-order Kalman filter with a 3D multivariate modal decomposition of the forecast error. It includes an adaptive error estimate and a localization algorithm. Along-track altimeter data, satellite sea surface temperature and in situ temperature and salinity vertical profiles are jointly assimilated to estimate the initial conditions for numerical ocean forecasting. A 3D-Var scheme provides a correction for the slowly evolving large-scale biases in temperature and salinity. In May 2015, Mercator Ocean opened the Copernicus Marine Service (CMS) and is in charge of the global ocean analyses and forecasts at eddy-resolving resolution. In this context, R&D activities have been conducted at Mercator Ocean in recent years to improve the real-time 1/12° global system for the next CMS version in 2016. The ocean/sea-ice model and the assimilation scheme benefit, among others, from the following improvements: large-scale and objective correction of atmospheric quantities with satellite data, a new Mean Dynamic Topography taking into account the latest version of the GOCE geoid, new adaptive tuning of some observational errors, new quality control of the assimilated temperature and salinity vertical profiles based on dynamic height criteria, assimilation of satellite sea-ice concentration, and new freshwater runoff from ice-sheet melting. This presentation does not focus on the impact of each update, but rather on the overall behavior of the system integrating all updates. The assessment reports on the improvements in product quality, highlighting the level of performance and the reliability of the new system.
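The analysis step of a Kalman filter of the kind mentioned above reduces, in one dimension, to a variance-weighted blend of the model forecast and an observation. The numbers below are illustrative only and have no relation to the Mercator system's actual error statistics.

```python
def kalman_update(x_forecast, var_forecast, y_obs, var_obs):
    """One-variable analogue of the Kalman analysis step.

    The gain weights the observation by the relative sizes of the forecast
    and observation error variances; the analysis variance always shrinks.
    """
    gain = var_forecast / (var_forecast + var_obs)
    x_analysis = x_forecast + gain * (y_obs - x_forecast)
    var_analysis = (1.0 - gain) * var_forecast
    return x_analysis, var_analysis

# Illustrative SST example: forecast 18.0 C (error var 0.5) corrected by a
# satellite observation of 18.8 C (error var 0.25)
x_a, var_a = kalman_update(18.0, 0.5, 18.8, 0.25)
print(x_a, var_a)
```

The reduced-order filter used operationally applies the same logic in a low-dimensional subspace spanned by the dominant modes of the 3D multivariate forecast error, which is what makes a global 1/12° system computationally tractable.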
Benefits of seasonal forecasts of crop yields
NASA Astrophysics Data System (ADS)
Sakurai, G.; Okada, M.; Nishimori, M.; Yokozawa, M.
2017-12-01
Major factors behind recent fluctuations in food prices include increased biofuel production and oil price fluctuations. In addition, several extreme climate events that reduced worldwide food production coincided with upward spikes in food prices. The stabilization of crop yields is one of the most important tasks in stabilizing food prices and thereby enhancing food security. Recent development of technologies related to crop modeling and seasonal weather forecasting has made it possible to forecast future crop yields for maize and soybean. However, the effective use of these technologies remains limited. Here we present the potential benefits of seasonal crop-yield forecasts on a global scale for the choice of planting day. For this purpose, we used a model (PRYSBI-2) that replicates past crop yields well for both maize and soybean. This model system uses a Bayesian statistical approach to estimate the parameters of a basic process-based model of crop growth. The spatial variability of model parameters was considered by estimating the posterior distribution of the parameters from historical yield data using the Markov chain Monte Carlo (MCMC) method with a resolution of 1.125° × 1.125°. The posterior distributions of model parameters were estimated for each spatial grid with 30,000 MCMC steps of 10 chains each. By using this model and the estimated parameter distributions, we were able to estimate not only crop yield but also levels of associated uncertainty. We found that the global average crop yield increased about 30% as the result of the optimal selection of planting day and that the seasonal forecast of crop yield had a large benefit in and near the eastern part of Brazil and in India for maize and the northern area of China for soybean. In these countries, the effects of El Niño and the Indian Ocean Dipole are large. The results highlight the importance of developing a system to forecast global crop yields.
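The Bayesian parameter-estimation step can be illustrated with a minimal Metropolis sampler fitting one parameter of a toy yield model. The linear temperature response, the flat prior, and the tuning constants are all assumptions for this sketch and bear no relation to PRYSBI-2's actual structure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy "historical yields": a linear response to growing-season temperature
true_beta = 0.8
temp = rng.uniform(20.0, 30.0, 30)
yields = true_beta * temp + rng.normal(0.0, 1.0, 30)

def log_post(beta):
    """Log posterior under a flat prior and unit-variance Gaussian noise."""
    resid = yields - beta * temp
    return -0.5 * np.sum(resid ** 2)

# Metropolis random walk on the single parameter beta
beta = 0.0
samples = []
for _ in range(5000):
    prop = beta + rng.normal(0.0, 0.05)
    if np.log(rng.random()) < log_post(prop) - log_post(beta):
        beta = prop
    samples.append(beta)

posterior = np.array(samples[1000:])    # discard burn-in
print(posterior.mean())                  # close to the true value of 0.8
```

The paper runs the analogous (much larger) sampling problem independently on every 1.125° grid cell, with 10 chains of 30,000 steps each, which is what yields gridded posterior distributions rather than single best-fit parameters.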
NASA Astrophysics Data System (ADS)
Stelten, S. A.; Gallus, W. A., Jr.
2015-12-01
A large portion of precipitation seen in the Great Plains region of the United States falls from nocturnal convection. Quite often, nocturnally initiated convection may grow upscale into a Mesoscale Convective System (MCS) that in turn may cause high impact weather events such as severe wind, flooding, and even tornadoes. Thus, correctly predicting nocturnal convective initiation is an integral part of forecasting for the Great Plains. Unfortunately, it is also one of the most challenging aspects of forecasting for this region. Many forecasters familiar with the Great Plains region have noted that elevated nocturnal convective initiation seems to favor a few distinct and rather diverse modes, which pose varying degrees of forecasting difficulties. This study investigates four of these modes, including initiation caused by the interaction of the low level jet and a frontal feature, initiation at the nose of the low level jet without the presence of a frontal feature, linear features ahead of and perpendicular to a forward propagating MCS, and initiation occurring with no discernible large scale forcing mechanism. Improving elevated nocturnal convective initiation forecasts was one of the primary goals of the Plains Elevated Convection At Night (PECAN) field campaign that took place from June 1 to July 15, 2015, which collected a wealth of convective initiation data. To coincide with these data sets, nocturnal convective initiation episodes from the 2015 summer season were classified into each of the aforementioned groups. This allowed for a thorough investigation of the frequency of each type of initiation event, as well as identification of typical characteristics of the atmosphere (forcing mechanisms present, available instability, strength/location of low level jet, etc.) during each event type. 
Then, using archived model data and the vast data sets collected during the PECAN field campaign, model performance during PECAN for each convective initiation mode was compared to the high-quality data sets in order to determine why certain convective initiation modes may be more difficult to forecast than others.
NASA Astrophysics Data System (ADS)
Federico, Ivan; Oddo, Paolo; Pinardi, Nadia; Coppini, Giovanni
2014-05-01
The Southern Adriatic Northern Ionian Forecasting System (SANIFS) operational chain is based on a nesting approach. The large-scale model for the entire Mediterranean basin (MFS, Mediterranean Forecasting System, operated by INGV, e.g. Tonani et al. 2008, Oddo et al. 2009) provides lateral open boundary conditions to the regional model for the Adriatic and Ionian seas (AIFS, Adriatic Ionian Forecasting System), which provides the open-sea fields (initial conditions and lateral open boundary conditions) to SANIFS. The latter, presented here, is a coastal ocean model based on the SHYFEM (Shallow HYdrodynamics Finite Element Model) code, an unstructured-grid, finite-element, three-dimensional hydrodynamic model (e.g. Umgiesser et al., 2004, Ferrarin et al., 2013). The SANIFS hydrodynamic model component has been designed to provide accurate information on hydrodynamics and active tracer fields in the coastal waters of southeastern Italy (Apulia, Basilicata and Calabria regions), where the model has a resolution of about 200-500 m. The horizontal resolution remains adequate in open-sea areas, where the element size is approximately 3 km. During the development phase the model was initialized and forced at the lateral open boundaries through a full nesting strategy directly with the MFS fields. The heat fluxes have been computed by bulk formulae using as input data the operational analyses of the European Centre for Medium-Range Weather Forecasts. Short-range pre-operational forecast tests have been performed in different seasons to evaluate the robustness of the implemented model under different oceanographic conditions. Model results are validated by means of comparison with MFS operational results and observations.
The model is able to reproduce the large-scale oceanographic structures of the area (keeping similar structures of MFS in open sea), while in the coastal area significant improvements in terms of reproduced structures and dynamics are evident.
Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S Y; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R
2016-09-01
With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool used a variety of data streams and was updated weekly, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore's dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore. 
Environ Health Perspect 124:1369-1375; http://dx.doi.org/10.1289/ehp.1509981.
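A LASSO-based multi-horizon forecaster of the kind described can be sketched with plain NumPy. The coordinate-descent solver, the lag structure, and the synthetic weekly series below are illustrative stand-ins, not the study's actual model (which also used meteorological, vector surveillance, and population covariates) or data.

```python
import numpy as np

def lasso_cd(X, y, alpha=0.1, n_iter=300):
    """Coordinate-descent LASSO for (1/2n)||y - Xw||^2 + alpha*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ w + X[:, j] * w[j]   # residual with feature j removed
            rho = X[:, j] @ r_j / n
            w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / col_sq[j]
    return w

def lagged_matrix(series, n_lags, horizon):
    """Rows of the n_lags most recent weeks, paired with the value `horizon` weeks ahead."""
    X = np.array([series[t - n_lags:t] for t in range(n_lags, len(series) - horizon)])
    y = np.array([series[t + horizon] for t in range(n_lags, len(series) - horizon)])
    return X, y

def fit_and_forecast(series, n_lags=8, horizons=range(1, 13), alpha=0.1):
    """Fit one LASSO per weekly lead time (about a 3-month horizon) and forecast."""
    series = np.asarray(series, dtype=float)
    x_last = series[-n_lags:]
    out = {}
    for h in horizons:
        X, y = lagged_matrix(series, n_lags, h)
        X_mean, y_mean = X.mean(axis=0), y.mean()   # center in lieu of an intercept
        w = lasso_cd(X - X_mean, y - y_mean, alpha)
        out[h] = float(y_mean + (x_last - X_mean) @ w)
    return out
```

Refitting each horizon weekly as new case counts arrive mirrors the "updated weekly" design described above.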
Extended-range forecasting of Chinese summer surface air temperature and heat waves
NASA Astrophysics Data System (ADS)
Zhu, Zhiwei; Li, Tim
2018-03-01
Because of growing demand from agricultural planning, power management and activity scheduling, extended-range (5-30-day lead) forecasting of summer surface air temperature (SAT) and heat waves over China is carried out in the present study via spatial-temporal projection models (STPMs). Based on the training data during 1960-1999, the predictability sources are found to propagate from Europe, Northeast Asia, and the tropical Pacific to influence the intraseasonal 10-80 day SAT over China. STPMs are therefore constructed using projection domains determined by these precursor predictability sources. For the independent forecast period (2000-2013), the STPMs can reproduce EOF-filtered 30-80 day SAT at all lead times of 5-30 days over most parts of China, and observed 30-80 and 10-80 day SAT at 25-30 days over eastern China. Significant pattern correlation coefficients account for more than 50% of total forecasts at all 5-30-day lead times against EOF-filtered and observed 30-80 day SAT, and at a 20-day lead time against observed 10-80 day SAT. The STPMs perform poorly in reproducing 10-30 day SAT: forecasts of the first two modes show useful skill only within a 15-day lead time, and forecasts of the third mode are not useful beyond a 10-day lead time. Forecasted heat waves over China are determined from the reconstructed SAT, the sum of the forecasted 10-80 day SAT and the lower-frequency (longer than 80-day) climatological SAT. Over a large part of China, the STPMs can forecast more than 30% of heat waves within a 15-day lead time. In general, the STPMs demonstrate promising skill for extended-range forecasting of Chinese summer SAT and heat waves.
Flood Forecast Accuracy and Decision Support System Approach: the Venice Case
NASA Astrophysics Data System (ADS)
Canestrelli, A.; Di Donato, M.
2016-02-01
In recent years, numerical models for weather prediction have seen continuous technological advances. As a result, all the disciplines making use of weather forecasts have made significant steps forward. In the case of the Safeguard of Venice, a large effort has been made to improve the forecast of tidal levels. In this context, the Istituzione Centro Previsioni e Segnalazioni Maree (ICPSM) of the Venice Municipality has developed and tested many different forecast models, both statistical and deterministic, and has been shown to produce very accurate forecasts. For Venice, the maximum admissible forecast error should ideally be of the order of ten centimeters at 24 hours. The magnitude of the forecast error clearly affects the decisional process, which mainly consists of alerting the population, activating the movable barriers installed at the three tidal inlets and contacting the port authority. This process becomes more challenging whenever the weather predictions, and therefore the water level forecasts, suddenly change: the new forecasts have to be quickly transformed into operational tasks. It is therefore of the utmost importance to set up scheduled alerts and emergency plans by means of easy-to-follow procedures. In this direction, Technital has set up a Decision Support System based on expert procedures that minimizes human error and, as a consequence, reduces the risk of flooding of the historical center. Moreover, the Decision Support System can communicate predefined alerts to all the interested parties. The system uses the water level forecasts produced by the ICPSM, taking into account their accuracy at different lead times. The Decision Support System has been successfully tested with 8 years of data, 6 of them in real time. 
The Venice experience shows that the Decision Support System is an essential tool which assesses the risks associated with a particular event, provides clear operational procedures and minimizes the impact of natural floods on human lives, private property and historical monuments.
A Global Aerosol Model Forecast for the ACE-Asia Field Experiment
NASA Technical Reports Server (NTRS)
Chin, Mian; Ginoux, Paul; Lucchesi, Robert; Huebert, Barry; Weber, Rodney; Anderson, Tad; Masonis, Sarah; Blomquist, Byron; Bandy, Alan; Thornton, Donald
2003-01-01
We present the results of aerosol forecasting during the Aerosol Characterization Experiment (ACE-Asia) field experiment in spring 2001, using the Georgia Tech/Goddard Global Ozone Chemistry Aerosol Radiation and Transport (GOCART) model and the meteorological forecast fields from the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The aerosol model forecast provides direct information on aerosol optical thickness and concentrations, enabling effective flight planning, while feedback from measurements constantly evaluates the model, guiding successful model improvements. We verify the model forecast skill by comparing model-predicted total aerosol extinction, dust, sulfate, and SO2 concentrations with those quantities measured by the C-130 aircraft during the ACE-Asia intensive operation period. The GEOS DAS meteorological forecast system shows excellent skill in predicting winds, relative humidity, and temperature for the ACE-Asia experiment area as well as for each individual flight, with skill scores usually above 0.7. The model is also skillful in forecasting pollution aerosols, with most scores above 0.5. The model correctly predicted the dust outbreak events and their trans-Pacific transport, but consistently missed the high dust concentrations observed in the boundary layer. We attribute this missing dust source to the desertification regions of Inner Mongolia Province in China, which have developed in recent years but were not included in the model during forecasting. After incorporating the desertification sources, the model is able to reproduce the observed high dust concentrations at low altitudes over the Yellow Sea. Two key elements for a successful aerosol model forecast are correct source locations, which determine where emissions take place, and realistic forecast winds and convection, which determine where the aerosols are transported. 
We demonstrate that our global model can not only account for the large-scale intercontinental transport, but also produce the small-scale spatial and temporal variations that are adequate for aircraft measurements planning.
Modeling, Forecasting and Mitigating Extreme Earthquakes
NASA Astrophysics Data System (ADS)
Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.
2012-12-01
Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude, rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity; allow the study of extreme events, of the influence of fault network properties on seismic patterns, and of seismic cycles; and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for the mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).
Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, neglecting the uncertainty associated with each forecast, is misleading, creating a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which is presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met. Firstly, operational time constraints preclude varying all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen where only the most relevant sources of uncertainty were dynamically considered, while the others were jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor, with a magnitude changing dynamically with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousand square kilometers, forecast uncertainty in the desired range (usually up to two days) mainly depends on upstream gauge observation quality, routing and unpredictable human impacts such as reservoir operation. 
The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge observations and several years of archived forecasts, overall empirical error distributions, termed 'overall error', were derived for each gauge for a range of relevant forecast lead times. b) The error distributions vary strongly with the hydrometeorological situation, so a subdivision into the hydrological cases 'low flow', 'rising flood', 'flood' and 'flood recession' was introduced. c) For the sake of numerical compression, theoretical distributions were fitted to the empirical distributions using the method of moments; the normal distribution was generally best suited. d) Further data compression was achieved by representing the distribution parameters as a function (second-order polynomial) of lead time. In general, the 'overall error' obtained from the above procedure is most useful in regions where large human impact occurs and where the influence of the meteorological forecast is limited. In upstream regions, however, forecast uncertainty depends strongly on the current predictability of the atmosphere, which is contained in the spread of an ensemble forecast. Including this dynamically in the hydrological forecast uncertainty estimation requires prior elimination of the contribution of the weather forecast to the 'overall error'. This was achieved by calculating long series of hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The resulting error distribution is termed 'model error' and can be applied to hydrological ensemble forecasts, where ensemble rainfall forecasts are used as forcing. The concept is illustrated by examples (good and bad ones) covering a wide range of catchment sizes, hydrometeorological regimes and qualities of hydrological model calibration. The methodology to combine the static and dynamic shares of uncertainty is presented in part II of this study.
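Steps c) and d) of this compression scheme can be sketched in a few lines: fit a normal distribution to each lead time's error sample by the method of moments, then represent the fitted parameters as second-order polynomials of lead time. The lead times and synthetic error samples below are illustrative, not Bavarian archive data.

```python
import numpy as np

def fit_normal_moments(errors):
    """Method-of-moments fit of a normal distribution: sample mean and std."""
    e = np.asarray(errors, dtype=float)
    return float(e.mean()), float(e.std())

def compress_over_lead_time(lead_times, mus, sigmas):
    """Represent both distribution parameters as second-order polynomials of lead time."""
    return np.polyfit(lead_times, mus, 2), np.polyfit(lead_times, sigmas, 2)

def error_distribution_at(lead_time, mu_coef, sigma_coef):
    """Recover (mu, sigma) of the forecast-error distribution at any lead time."""
    return float(np.polyval(mu_coef, lead_time)), float(np.polyval(sigma_coef, lead_time))
```

In practice one such pair of polynomials would be stored per gauge and per hydrological case ('low flow', 'rising flood', 'flood', 'flood recession').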
NASA Astrophysics Data System (ADS)
Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song
2018-01-01
Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. 
Thus, the monitoring should be strengthened.
Comparative verification between GEM model and official aviation terminal forecasts
NASA Technical Reports Server (NTRS)
Miller, Robert G.
1988-01-01
The Generalized Exponential Markov (GEM) model uses the local standard airways observation (SAO) to predict hour-by-hour the following elements: temperature, pressure, dew point depression, first and second cloud-layer height and amount, ceiling, total cloud amount, visibility, wind, and present weather conditions. GEM is superior to persistence at all projections for all elements in a large independent sample. A minute-by-minute GEM forecasting system utilizing the Automated Weather Observation System (AWOS) is under development.
Pollitz, Fred
2012-01-01
Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long-term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.
Visualizing complex hydrodynamic features
NASA Astrophysics Data System (ADS)
Kempf, Jill L.; Marshall, Robert E.; Yen, Chieh-Cheng
1990-08-01
The Lake Erie Forecasting System is a cooperative project by university, private and governmental institutions to provide continuous forecasting of the three-dimensional structure within the lake. The forecasts will include water velocity and temperature distributions throughout the body of water, as well as water level and wind-wave distributions at the lake's surface. Many hydrodynamic features can be extracted from these data, including coastal jets, large-scale thermocline motion and zones of upwelling and downwelling. A visualization system is being developed that will aid in understanding these features and their interactions. Because of the wide variety of features, they cannot all be adequately represented by a single rendering technique: particle tracing, surface rendering, and volumetric techniques are all necessary. This visualization effort is aimed at creating a system that will provide meaningful forecasts for those using the lake for recreational and commercial purposes. For example, the fishing industry needs to know about large-scale thermocline motion in order to find the best fishing areas, and power plants need to know water intake temperatures. The visualization system must convey this information in a manner that is easily understood by these users. Scientists must also be able to use this system to verify their hydrodynamic simulations. The focus of the system, therefore, is to provide the information to serve these diverse interests without overwhelming any single user with unnecessary data.
Nowcasting of rainfall and of combined sewage flow in urban drainage systems.
Achleitner, Stefan; Fach, Stefan; Einfalt, Thomas; Rauch, Wolfgang
2009-01-01
Nowcasting of rainfall may be used in addition to online rain measurements to optimize the operation of urban drainage systems. Uncertainties quoted for rain volume are in the range of 5% to 10% mean square error (MSE), while for rain intensities 45% to 75% MSE is noted. For longer forecast periods of up to 3 hours, the uncertainties increase up to several hundred percent. Combined with the growing number of real-time control concepts in sewer systems, rainfall forecasts are used more and more in urban drainage systems. It is therefore of interest how these uncertainties influence the final evaluation of a defined objective function. Uncertainty levels associated with the forecast itself are not necessarily transferable to resulting uncertainties in the catchment's flow dynamics. The aim of this paper is to analyse forecasts of rainfall and of specific sewer output variables. For this study the combined sewer system of the city of Linz, in the northern part of Austria on the Danube, was selected. The city covers a total area of 96 km2, with 39 municipalities connected. It was found that the available weather radar data lead to large deviations in the precipitation forecast at forecast horizons larger than 90 minutes. The same is true for sewer variables such as CSO overflow for small sub-catchments. Although the results improve at larger spatial scales, acceptable levels are not reached at forecast horizons beyond 90 minutes.
Olshansky, S Jay; Goldman, Dana P; Zheng, Yuhui; Rowe, John W
2009-01-01
Context: The aging of the baby boom generation, the extension of life, and progressive increases in disability-free life expectancy have generated a dramatic demographic transition in the United States. Official government forecasts may, however, have inadvertently underestimated life expectancy, which would have major policy implications, since small differences in forecasts of life expectancy produce very large differences in the number of people surviving to an older age. This article presents a new set of population and life expectancy forecasts for the United States, focusing on transitions that will take place by midcentury. Methods: Forecasts were made with a cohort-components methodology, based on the premise that the risk of death will be influenced in the coming decades by accelerated advances in biomedical technology that either delay the onset and age progression of major fatal diseases or that slow the aging process itself. Findings: Results indicate that the current forecasts of the U.S. Social Security Administration and U.S. Census Bureau may underestimate the rise in life expectancy at birth for men and women combined, by 2050, from 3.1 to 7.9 years. Conclusions: The cumulative outlays for Medicare and Social Security could be higher by $3.2 to $8.3 trillion relative to current government forecasts. This article discusses the implications of these results regarding the benefits and costs of an aging society and the prospect that health disparities could attenuate some of these changes. PMID:20021588
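The cohort-components methodology mentioned above can be sketched as a simple bookkeeping loop: each period, apply survival rates to age the population forward one group and add a new birth cohort. The age groups, survival and fertility rates below are invented for illustration and carry none of the study's assumptions about biomedical advances.

```python
import numpy as np

def project(population, survival, fertility, periods):
    """Minimal cohort-components projection.
    population[a]: persons in age group a; survival[a]: probability of
    surviving from group a to group a+1 per period; fertility[a]: births
    per person in group a per period. The oldest group is closed (its
    members leave the projection) to keep the sketch minimal."""
    pop = np.asarray(population, dtype=float)
    surv = np.asarray(survival, dtype=float)
    fert = np.asarray(fertility, dtype=float)
    for _ in range(periods):
        births = float(pop @ fert)      # new youngest cohort
        aged = pop[:-1] * surv[:-1]     # survivors move up one age group
        pop = np.concatenate([[births], aged])
    return pop
```

Forecast differences like those discussed above arise from the assumed trajectories of the survival rates: small changes in `survival` compound over many periods into large differences in the numbers surviving to old age.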
Decadal-Scale Forecasting of Climate Drivers for Marine Applications.
Salinger, J; Hobday, A J; Matear, R J; O'Kane, T J; Risbey, J S; Dunstan, P; Eveson, J P; Fulton, E A; Feng, M; Plagányi, É E; Poloczanska, E S; Marshall, A G; Thompson, P A
Climate influences marine ecosystems on a range of time scales, from weather-scale (days) through to climate-scale (hundreds of years). Understanding of interannual to decadal climate variability and its impacts on marine industries has received less attention. Predictability up to 10 years ahead may come from large-scale climate modes in the ocean that can persist over these time scales. In Australia the key drivers of climate variability affecting the marine environment are the Southern Annular Mode, the Indian Ocean Dipole, the El Niño/Southern Oscillation, and the Interdecadal Pacific Oscillation; each has phases that are associated with different ocean circulation patterns and regional environmental variables. The roles of these drivers are illustrated with three case studies of extreme events: a marine heatwave in Western Australia, a coral bleaching of the Great Barrier Reef, and flooding in Queensland. Statistical and dynamical approaches are described to generate forecasts of climate drivers that can subsequently be translated into useful information for marine end users making decisions at these time scales. Considerable investment is still needed to support decadal forecasting, including improvement of ocean-atmosphere models, enhancement of observing systems on all scales to support initialization of forecasting models, collection of important biological data, and integration of forecasts into decision support tools. Collaboration between forecast developers and marine resource sectors (fisheries, aquaculture, tourism, biodiversity management, infrastructure) is needed to support forecast-based tactical and strategic decisions that reduce environmental risk over annual to decadal time scales. © 2016 Elsevier Ltd. All rights reserved.
Application of GCM Ensemble Seasonal Climate Forecasts to Crop Yield Prediction in East Africa
NASA Astrophysics Data System (ADS)
Ogutu, G.; Franssen, W.; Supit, I.; Hutjes, R. W. A.
2016-12-01
We evaluated the potential use of ECMWF System-4 seasonal climate forecasts (S4) for impacts analysis over East Africa. Using the 15-member, 7-month ensemble forecasts initiated every month for 1981-2010, we tested precipitation (tp), air temperature (tas) and surface shortwave radiation (rsds) forecast skill against the WATCH Forcing Data ERA-Interim (WFDEI) re-analysis and other data. We used these forecasts as input to the WOFOST crop model to predict maize yields. Forecast skill is assessed using the anomaly correlation coefficient (ACC), the Ranked Probability Skill Score (RPSS) and the Relative Operating Characteristic Skill Score (ROCSS) for the MAM, JJA and OND growing seasons. Predicted maize yields (S4 yields) are verified against historical observed FAO and nationally reported (NAT) yield statistics, and against yields from the same crop model forced by WFDEI (WFDEI yields). Predictability of the climate forecasts varies with season, location and lead time. The OND tp forecasts show skill over a larger area up to three months lead time compared to MAM and JJA. Upper- and lower-tercile tp forecasts are 20-80% better than climatology. Good tas forecast skill is apparent at three months lead time. The rsds forecasts are less skillful than tp and tas in all seasons when verified against WFDEI, but more skillful against other data. S4 forecasts capture ENSO-related anomalous years with region-dependent skill, and the anomalous ENSO influence is also seen in simulated yields. Focusing on the main sowing dates in the northern (July), equatorial (March-April) and southern (December) regions, WFDEI yields are lower than FAO and NAT yields, but the anomalies are comparable. Yield anomalies are predictable 3 months before sowing in most of the regions. Differences in interannual variability in the range of ±40% may be related to the sensitivity of WOFOST to drought stress, while the ACCs are largely positive, ranging from 0.3 to 0.6. Above- and below-normal yields are predictable with 2 months lead time. 
We demonstrated the potential of using seasonal climate forecasts with a crop simulation model to predict anomalous maize yields over East Africa. The findings open a window to better use of climate forecasts in food security early warning systems, and in pre-season policy and farm management decisions.
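The anomaly correlation coefficient used in verification exercises like the one above is, in its simplest form, the correlation of forecast and observed departures from a climatology. The sketch below uses invented numbers, not the study's yields or fields.

```python
import numpy as np

def anomaly_correlation(forecast, observed, climatology):
    """ACC: correlation of forecast and observed anomalies about a climatology."""
    fa = np.asarray(forecast, dtype=float) - climatology
    oa = np.asarray(observed, dtype=float) - climatology
    return float((fa @ oa) / np.sqrt((fa @ fa) * (oa @ oa)))
```

For spatial fields, the same formula is typically applied gridpoint-wise with area weighting; positive ACCs of 0.3-0.6, as reported above, indicate that forecast anomalies tend to have the right sign.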
NASA Astrophysics Data System (ADS)
Barabanova, Olga
2013-04-01
Nowadays the Main Aviation Meteorological Centre in Moscow (MAMC) provides forecasts of icing conditions in Moscow Region airports using information from the surface observation network, weather radars and atmospheric sounding. Unfortunately, satellite information is not used to its full potential in aviation meteorological offices in the Moscow Region: weather forecasters deal with satellite images of cloudiness only. The forecasters of MAMC realise that it is necessary to employ numerical data from different channels of meteorological satellites in aviation forecasting and especially in nowcasting. An algorithm for nowcasting aircraft in-flight icing conditions has been developed using data from the geostationary meteorological satellites Meteosat-7 and Meteosat-9. The algorithm is based on brightness temperature differences, which help discriminate clouds with supercooled large drops, where severe icing conditions are most likely. Due to the lack of visible channel data, the satellite icing detection methods are less accurate at night. The method is also limited by optically thick ice clouds, where it is not possible to determine the extent to which supercooled large drops exist within the underlying clouds. However, we determined that most of the optically thick cases are associated with convection or mid-latitude cyclones, which nearly always have a layer where supercooled large drops exist with an icing threat. This product is created hourly for the Moscow airspace and marks zones with moderate or severe icing hazards. The results were compared with output of the mesoscale numerical atmospheric model COSMO-RU. Verification of the algorithm's results using aircraft pilot reports shows that it is a good instrument for operational practise in aviation meteorological offices in the Moscow Region. The satellite-based algorithms presented here can be used in real time to diagnose areas of icing for pilots to avoid.
Big data driven cycle time parallel prediction for production planning in wafer manufacturing
NASA Astrophysics Data System (ADS)
Wang, Junliang; Yang, Jungang; Zhang, Jie; Wang, Xiaoxi; Zhang, Wenjun Chris
2018-07-01
Cycle time forecasting (CTF) is one of the most crucial issues for production planning to keep high delivery reliability in semiconductor wafer fabrication systems (SWFS). This paper proposes a novel data-intensive cycle time (CT) prediction system with parallel computing to rapidly forecast the CT of wafer lots from large datasets. First, a density-peak-based radial basis function network (DP-RBFN) is designed to forecast the CT from the diverse and agglomerative CT data. Second, a network learning method based on a clustering technique is proposed to determine the density peaks. Third, a parallel computing approach for network training is proposed in order to speed up the training process with large-scale CT data. Finally, an experiment on an SWFS is presented, which demonstrates that the proposed CTF system can not only speed up the training process of the model but also outperform CTF methods based on the radial basis function network, the back-propagation network and multivariate regression in terms of mean absolute deviation and standard deviation.
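A bare-bones RBF regression network conveys the flavor of such a predictor: choose centers, set a kernel width, then solve the linear output layer. In this sketch the centers are a random subset of the training data, a crude stand-in for the paper's density-peak selection, and the parallel training is omitted; all names and data are illustrative.

```python
import numpy as np

class RBFNet:
    """Gaussian RBF network: random-subset centers, least-squares output weights."""
    def __init__(self, n_centers=15, seed=0):
        self.n_centers = n_centers
        self.rng = np.random.default_rng(seed)

    def _phi(self, X):
        # design matrix of Gaussian activations, one column per center
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        idx = self.rng.choice(len(X), self.n_centers, replace=False)
        self.centers = X[idx]
        # width heuristic: mean pairwise distance between centers
        d = np.sqrt(((self.centers[:, None] - self.centers[None]) ** 2).sum(-1))
        self.width = d[d > 0].mean()
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w
```

Because only the output layer is trained, fitting reduces to one least-squares solve per data shard, which is what makes parallel training of such networks attractive for large CT datasets.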
Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes
NASA Astrophysics Data System (ADS)
Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.
2015-12-01
Large earthquakes show semi-periodic behavior as a result of self-organized critical processes of stress accumulation and release in a seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from periodicity. Nava et al. (2013) and Quinteros et al. (2013) recognized that not all earthquakes in a given region need belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on the above-mentioned method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties to use in forecasts; and the use of Bayesian analysis to evaluate forecast performance. The improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
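The core idea of Fourier analysis of a labeled point process can be sketched as a period scan: treat each event time as a (possibly magnitude-weighted) phasor and find the trial period at which the phasors add most coherently. This is a generic illustration under simple assumptions, not the authors' actual procedure, and the synthetic event times below are invented.

```python
import numpy as np

def dominant_period(event_times, magnitudes=None, trial_periods=None):
    """Scan trial periods and return the one maximizing the normalized
    spectral power of the (optionally magnitude-labeled) point process."""
    t = np.asarray(event_times, dtype=float)
    w = np.ones_like(t) if magnitudes is None else np.asarray(magnitudes, dtype=float)
    if trial_periods is None:
        span = t.max() - t.min()
        trial_periods = np.linspace(span / 20.0, span / 2.0, 500)
    # normalized power: 1.0 when all weighted phasors align exactly
    power = [np.abs(np.sum(w * np.exp(2j * np.pi * t / P))) ** 2 / np.sum(w) ** 2
             for P in trial_periods]
    return float(trial_periods[int(np.argmax(power))])
```

Passing magnitudes as weights is one simple way to let earthquake size influence the spectrum, in the spirit of the labeled-point-process treatment described above.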
Multiyear predictability of tropical marine productivity
Séférian, Roland; Bopp, Laurent; Gehlen, Marion; Swingedouw, Didier; Mignot, Juliette; Guilyardi, Eric; Servonnat, Jérôme
2014-01-01
With the emergence of decadal predictability simulations, research toward forecasting variations of the climate system now covers a large range of timescales. However, assessment of the capacity to predict natural variations of relevant biogeochemical variables like carbon fluxes, pH, or marine primary productivity remains unexplored. Among these, the net primary productivity (NPP) is of particular relevance in a forecasting perspective. Indeed, in regions like the tropical Pacific (30°N–30°S), NPP exhibits natural fluctuations at interannual to decadal timescales that have large impacts on marine ecosystems and fisheries. Here, we investigate predictions of NPP variations over the last decades (i.e., from 1997 to 2011) with an Earth system model within the tropical Pacific. Results suggest a predictive skill for NPP of 3 y, which is higher than that of sea surface temperature (1 y). We attribute the higher predictability of NPP to the poleward advection of nutrient anomalies (nitrate and iron), which sustain fluctuations in phytoplankton productivity over several years. These results open previously unidentified perspectives to the development of science-based management approaches to marine resources relying on integrated physical-biogeochemical forecasting systems. PMID:25071174
Hurricane track forecast cones from fluctuations
Meuel, T.; Prado, G.; Seychelles, F.; Bessafi, M.; Kellay, H.
2012-01-01
Trajectories of tropical cyclones may show large deviations from predicted tracks, leading to uncertainty as to, for example, their landfall location. Prediction schemes usually render this uncertainty by showing track forecast cones representing the most probable region for the location of a cyclone during a period of time. Using the statistical properties of these deviations, we propose a simple method to predict possible corridors for the future trajectory of a cyclone. Examples of this scheme are implemented for hurricanes Ike and Jimena. The corridors include the future trajectory up to at least 50 h before landfall. The cones proposed here shed new light on known track forecast cones, as they link them directly to the statistics of these deviations. PMID:22701776
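A corridor built from deviation statistics might look like the sketch below: the half-width at each lead time is a quantile of historical track deviations, so the corridor widens as deviations accumulate. This is an assumed simplification of the paper's scheme; the random-walk deviations and function names are illustrative only.

```python
import numpy as np

def forecast_corridor(track, dev_samples, q=0.9):
    """Corridor bounds per lead time from the q-quantile of historical
    absolute cross-track deviations (dev_samples[lead, case])."""
    half_width = np.quantile(np.abs(dev_samples), q, axis=1)
    return track - half_width, track + half_width

# toy example: deviations grow with lead time (random-walk-like)
rng = np.random.default_rng(0)
dev = rng.normal(0, 0.1, (48, 200)).cumsum(axis=0)  # 48 h, 200 past cases
track = np.zeros(48)                                # straight reference track
lo, hi = forecast_corridor(track, dev, q=0.9)
```

The 90% quantile choice is arbitrary here; any coverage level could be read off the same deviation statistics.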
Hospital output forecasts and the cost of empty hospital beds.
Pauly, M V; Wilson, P
1986-01-01
This article investigates the cost incurred when hospitals have different levels of beds to treat a given number of patients. The cost of hospital care is affected by both the forecasted level of admissions and the actual number of admissions. When the relationship between forecasted and actual admissions is held constant, it is found that an empty hospital bed at a typical hospital in Michigan has a relatively low cost, about 13 percent or less of the cost of an occupied bed. However, empty beds in large hospitals do add significantly to cost. If hospital beds are closed, whether by closing beds at hospitals which remain in business or by closing entire hospitals, cost savings are estimated to be small. PMID:3759473
Gan, Ruijing; Chen, Ni; Huang, Daizheng
2016-01-01
This study compares and evaluates the prediction of hepatitis in Guangxi Province, China using back-propagation neural networks based on a genetic algorithm (BPNN-GA), generalized regression neural networks (GRNN), and wavelet neural networks (WNN). To compare the forecasting results, the data obtained from 2004 to 2013 and from 2014 were used as modeling and forecasting samples, respectively. The results show that when the small hepatitis data set has seasonal fluctuation, the prediction by BPNN-GA is better than that of the other two methods. The WNN method is suitable for predicting a large hepatitis data set with seasonal fluctuation, while the GRNN method is suitable when the data increase steadily.
NASA Astrophysics Data System (ADS)
Guron, Marta
There is a need for new synthetic routes to high boron content materials for applications as polymeric precursors to ceramics, as well as in neutron shielding and potential medical applications. To this end, new ruthenium-catalyzed olefin metathesis routes have been devised to form new complex polyboranes and polymeric species. Metathesis of di-alkenyl substituted o-carboranes allowed the synthesis of ring-closed products fused to the carborane cage, many of which are new compounds and one of which offers a superior synthetic method to one previously published. Acyclic diene metathesis of di-alkenyl substituted m-carboranes resulted in the formation of new main-chain carborane-containing polymers of modest molecular weights. Due to their extremely low char yields, and in order to explore other metathesis routes, ring-opening metathesis polymerization (ROMP) was used to generate the first examples of poly(norbornenyl-o-carboranes). Monomer synthesis was achieved via a two-step process, incorporating Ti-catalyzed hydroboration to make 6-(5-norbornenyl)-decaborane, followed by alkyne insertion in ionic liquid media to achieve 1,2-R2-3-norbornenyl o-carborane species. The monomers were then polymerized using ROMP to afford several examples of poly(norbornenyl-o-carboranes) with relatively high molecular weights. One such polymer, [1-Ph, 3-(=CH2-C5H7-CH2=)-1,2-C2B10H10]n, had a char yield very close to the theoretical char yield of 44%. Upon random copolymerization with poly(6-(5-norbornenyl)decaborane), char yields significantly increased to 80%; this number is well above the theoretical value, indicating the formation of a boron-carbide/carbon ceramic. Finally, applications of polyboranes were explored via polymer blends toward the synthesis of ceramic composites and the use of polymer precursors as reagents for potential ultra-high-temperature ceramic applications.
Upon pyrolysis, polymer blends of poly(6-(5-norbornenyl)-decaborane) and poly(methylcarbosilane) converted into boron-carbide/silicon-carbide ceramics with high char yields. These polymer blends were also shown to be useful as reagents for synthesis of hafnium-boride/hafnium-carbide/silicon carbide and zirconium-boride/zirconium-carbide/silicon carbide composites.
The Contribution of Soil Moisture Information to Forecast Skill: Two Studies
NASA Technical Reports Server (NTRS)
Koster, Randal
2010-01-01
This talk briefly describes two recent studies on the impact of soil moisture information on hydrological and meteorological prediction. While the studies utilize soil moisture derived from the integration of large-scale land surface models with observations-based meteorological data, the results directly illustrate the potential usefulness of satellite-derived soil moisture information (e.g., from SMOS and SMAP) for applications in prediction. The first study, the GEWEX- and CLIVAR-sponsored GLACE-2 project, quantifies the contribution of realistic soil moisture initialization to skill in subseasonal forecasts of precipitation and air temperature (out to two months). The multi-model study shows that soil moisture information does indeed contribute skill to the forecasts, particularly for air temperature, and particularly when the initial local soil moisture anomaly is large. Furthermore, the skill contributions tend to be larger where the soil moisture initialization is more accurate, as measured by the density of the observational network contributing to the initialization. The second study focuses on streamflow prediction. The relative contributions of snow and soil moisture initialization to skill in streamflow prediction at seasonal lead, in the absence of knowledge of meteorological anomalies during the forecast period, were quantified with several land surface models using uniquely designed numerical experiments and naturalized streamflow data covering multiple decades over the western United States. In several basins, accurate soil moisture initialization is found to contribute significant levels of predictive skill. Depending on the date of forecast issue, the contributions can be significant out to leads of six months. Both studies suggest that improvements in soil moisture initialization would lead to increases in predictive skill.
The relevance of SMOS and SMAP satellite-based soil moisture information to prediction is discussed in the context of these studies.
The System of Inventory Forecasting in PT. XYZ by using the Method of Holt Winter Multiplicative
NASA Astrophysics Data System (ADS)
Shaleh, W.; Rasim; Wahyudin
2018-01-01
PT. XYZ currently relies only on manual bookkeeping to predict sales and inventory of goods. If the inventory prediction is too large, the cost of production swells and the invested capital becomes less efficient. Vice versa, if the inventory prediction is too small, it impacts consumers, who are forced to wait for the desired product. In this era of globalization, the development of computer technology has become a very important part of every business plan, and almost all companies, both large and small, use computer technology. By utilizing computer technology, people can save time in solving complex business problems. For companies, computer technology has become indispensable for enhancing the business services they manage, but systems and technologies are not limited to the distribution model and data processing: the existing system must be able to analyze the possibilities of future company capabilities. Therefore, the company must be able to forecast conditions and circumstances, whether for inventory of goods, workforce, or profits to be obtained. To this end, the total sales data from December 2014 to December 2016 are forecast using the Holt-Winters method, a time-series prediction technique (the multiplicative seasonal method) for seasonal data with increases and decreases, which has 4 equations, i.e. level smoothing, trend smoothing, seasonal smoothing and forecasting. From the research conducted, the error value in the form of MAPE is below 1%, so it can be concluded that forecasting with the Holt-Winters multiplicative method is accurate.
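The four Holt-Winters multiplicative equations mentioned above (level, trend and seasonal smoothing, plus the forecast equation) can be written compactly as below. This is a generic textbook implementation, not the paper's code; the smoothing constants and the quarterly toy usage are assumptions.

```python
def holt_winters_multiplicative(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=12):
    """Minimal Holt-Winters multiplicative smoothing with season
    length m, returning an h-step-ahead forecast list."""
    # initialize level, trend and seasonal indices from the first cycles
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] / level for i in range(m)]
    for t in range(m, len(y)):
        last_level = level
        # level: deseasonalized observation vs. level+trend carryover
        level = alpha * y[t] / season[t % m] + (1 - alpha) * (level + trend)
        # trend: smoothed level difference
        trend = beta * (level - last_level) + (1 - beta) * trend
        # seasonal index: smoothed ratio of observation to level
        season[t % m] = gamma * y[t] / level + (1 - gamma) * season[t % m]
    n = len(y)
    # forecast: extrapolated level+trend times the matching seasonal index
    return [(level + (k + 1) * trend) * season[(n + k) % m] for k in range(h)]
```

For the inventory case, `y` would be the monthly sales from December 2014 to December 2016 with `m = 12`; MAPE can then be computed between forecasts and held-out months.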
NASA Astrophysics Data System (ADS)
Martin, A.; Ralph, F. M.; Lavers, D. A.; Kalansky, J.; Kawzenuk, B.
2015-12-01
The past ten years have seen an explosion in research devoted to the Atmospheric River (AR) phenomenon, a feature of the midlatitude circulation responsible for large horizontal water vapor transport. Upon landfall, ARs can be associated with 30-50% of annual precipitation in some regions, while also causing the largest flooding events in places such as coastal California. Little discussed is the role secondary frontal waves play in modulating precipitation during a landfalling AR. Secondary frontal waves develop along an existing cold front in response to baroclinic frontogenesis, often coinciding with a strong upper-tropospheric jet. If the secondary wave develops along a front associated with a landfalling AR, the resulting precipitation may be much greater or much less than originally forecasted - especially in regions where orographic uplift of horizontally transported water vapor is responsible for a large portion of precipitation. In this study, we present several cases of secondary frontal waves that have occurred in conjunction with a landfalling AR on the US West Coast. We put the impact of these cases in historical perspective using quantitative precipitation forecasts, satellite data, reanalyses, and estimates of damage related to flooding. We also discuss the dynamical mechanisms behind secondary frontal wave development and relate these mechanisms to the high spatiotemporal variability in precipitation observed during ARs with secondary frontal waves. Finally, we demonstrate that even at lead times of less than 24 hours, current quantitative precipitation forecasting methods have difficulty accurately predicting the rainfall in the area near the secondary wave landfall, in some cases leading to missed or false alarm flood warnings, and we suggest methods which may improve quantitative precipitation forecasts for this type of system in the future.
NASA Astrophysics Data System (ADS)
Mekanik, F.; Imteaz, M. A.; Gato-Trinidad, S.; Elmahdi, A.
2013-10-01
In this study, the application of Artificial Neural Networks (ANN) and Multiple regression analysis (MR) to forecast long-term seasonal spring rainfall in Victoria, Australia was investigated using lagged El Nino Southern Oscillation (ENSO) and Indian Ocean Dipole (IOD) as potential predictors. The use of dual (combined lagged ENSO-IOD) input sets for calibrating and validating ANN and MR models is proposed to investigate the simultaneous effect of past values of these two major climate modes on long-term spring rainfall prediction. The MR models that did not violate the limits of statistical significance and multicollinearity were selected for future spring rainfall forecasts. The ANN was developed in the form of a multilayer perceptron using the Levenberg-Marquardt algorithm. Both MR and ANN models were assessed statistically using mean square error (MSE), mean absolute error (MAE), Pearson correlation (r) and the Willmott index of agreement (d). The developed MR and ANN models were tested on out-of-sample test sets; the MR models showed very poor generalisation ability for east Victoria, with correlation coefficients of -0.99 to -0.90, compared to ANN with correlation coefficients of 0.42-0.93; ANN models also showed better generalisation ability for central and west Victoria, with correlation coefficients of 0.68-0.85 and 0.58-0.97 respectively. The ability of the multiple regression models to forecast out-of-sample sets is comparable to ANN for Daylesford in central Victoria and Kaniva in west Victoria (r = 0.92 and 0.67 respectively). The errors of the testing sets for ANN models are generally lower than those of the multiple regression models. The statistical analysis suggests the potential of ANN over MR models for rainfall forecasting using large-scale climate modes.
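Two of the verification statistics named above, the Pearson correlation r and the Willmott index of agreement d, are easy to state in code; the definitions below are the standard ones, with variable names of our choosing.

```python
import numpy as np

def willmott_d(obs, sim):
    """Willmott index of agreement: 1 - SSE / potential error,
    where potential error uses departures from the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    o_bar = obs.mean()
    pe = (np.abs(sim - o_bar) + np.abs(obs - o_bar)) ** 2
    return 1.0 - ((sim - obs) ** 2).sum() / pe.sum()

def pearson_r(obs, sim):
    """Pearson correlation between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return ((obs - obs.mean()) * (sim - sim.mean())).mean() / (obs.std() * sim.std())
```

Both range up to 1 for a perfect forecast; d, unlike r, also penalizes bias in the simulated magnitudes.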
NASA Astrophysics Data System (ADS)
Abhilash, S.; Sahai, A. K.; Borah, N.; Chattopadhyay, R.; Joseph, S.; Sharmila, S.; De, S.; Goswami, B. N.; Kumar, Arun
2014-05-01
An ensemble prediction system (EPS) is devised for the extended range prediction (ERP) of monsoon intraseasonal oscillations (MISO) of Indian summer monsoon (ISM) using National Centers for Environmental Prediction Climate Forecast System model version 2 at T126 horizontal resolution. The EPS is formulated by generating 11 member ensembles through the perturbation of atmospheric initial conditions. The hindcast experiments were conducted at every 5-day interval for 45 days lead time starting from 16th May to 28th September during 2001-2012. The general simulation of ISM characteristics and the ERP skill of the proposed EPS at pentad mean scale are evaluated in the present study. Though the EPS underestimates both the mean and variability of ISM rainfall, it simulates the northward propagation of MISO reasonably well. It is found that the signal-to-noise ratio of the forecasted rainfall becomes unity by about 18 days. The potential predictability error of the forecasted rainfall saturates by about 25 days. Though useful deterministic forecasts could be generated up to 2nd pentad lead, significant correlations are found even up to 4th pentad lead. The skill in predicting large-scale MISO, which is assessed by comparing the predicted and observed MISO indices, is found to be ~17 days. It is noted that the prediction skill of actual rainfall is closely related to the prediction of large-scale MISO amplitude as well as the initial conditions related to the different phases of MISO. An analysis of categorical prediction skills reveals that break is more skillfully predicted, followed by active and then normal. The categorical probability skill scores suggest that useful probabilistic forecasts could be generated even up to 4th pentad lead.
NASA Astrophysics Data System (ADS)
Castelletti, A.; Giuliani, M.; Block, P. J.
2017-12-01
Increasingly uncertain hydrologic regimes combined with more frequent and intense extreme events are challenging water systems management worldwide, emphasizing the need for accurate medium- to long-term predictions to promptly trigger anticipatory operations. Although modern forecasts are skillful over short lead times (from hours to days), predictability generally tends to decrease at longer lead times. Global climate teleconnections, such as El Niño Southern Oscillation (ENSO), may contribute to extending forecast lead times. However, the ENSO teleconnection is well defined in some locations, such as the Western USA and Australia, while there is no consensus on how it can be detected and used in other regions, particularly in Europe, Africa, and Asia. In this work, we generalize the Niño Index Phase Analysis (NIPA) framework by contributing the Multi Variate Niño Index Phase Analysis (MV-NIPA), which allows capturing the state of multiple large-scale climate signals (i.e. ENSO, North Atlantic Oscillation, Pacific Decadal Oscillation, Atlantic Multi-decadal Oscillation, Indian Ocean Dipole) to forecast hydroclimatic variables on a seasonal time scale. Specifically, our approach distinguishes the different phases of the considered climate signals and, for each phase, identifies relevant anomalies in Sea Surface Temperature (SST) that influence the local hydrologic conditions. The potential of the MV-NIPA framework is demonstrated through an application to the Lake Como system, a regulated lake in northern Italy which is mainly operated for flood control and irrigation supply. Numerical results show high correlations between seasonal SST values and one-season-ahead precipitation in the Lake Como basin. The skill of the resulting MV-NIPA forecast outperforms that of ECMWF products.
This information represents a valuable contribution to partially anticipate the summer water availability, especially during drought events, ultimately supporting the improvement of the Lake Como operations.
NASA Astrophysics Data System (ADS)
Salvage, R. O.; Neuberg, J. W.
2016-09-01
Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between data and the forecast, as well as the generation of a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for collapses in 1997 and 2003 greatly improved the forecasted timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. 
We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
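The classical FFM underlying this work relates an accelerating precursor rate to failure time by an empirical power law; in its common alpha = 2 form, the inverse rate decays linearly and crosses zero at the failure (eruption or collapse) time, so a straight-line fit yields the forecast. The sketch below shows this standard form on synthetic data; it is not the authors' cross-correlation implementation.

```python
import numpy as np

def ffm_failure_time(t, rate):
    """Inverse-rate form of the Failure Forecast Method (alpha = 2):
    1/rate decays linearly in time and hits zero at failure."""
    inv = 1.0 / np.asarray(rate, float)
    slope, intercept = np.polyfit(t, inv, 1)   # inv ~ slope*t + intercept
    return -intercept / slope                  # time where inv = 0

# synthetic accelerating seismicity: rate = C / (t_f - t)
t_f_true = 100.0
t = np.linspace(0, 90, 60)                     # observation times (days)
rate = 5.0 / (t_f_true - t)                    # event rate accelerates
t_f_est = ffm_failure_time(t, rate)
```

The paper's point is that restricting the event count to a single cross-correlated waveform family at depth makes the rate series cleaner, which tightens exactly this kind of fit.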
Cyber Surveillance for Flood Disasters
Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han
2015-01-01
Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609
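Treating a flood as an "invasion object" in a surveillance frame can be reduced to a simple change-detection sketch: compare the monitored region against a dry reference frame and raise an alert when enough pixels change. The thresholds, names and toy frames below are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def flood_alert(reference, current, region, diff_thresh=30, frac_thresh=0.5):
    """Flag a flood 'intrusion' when enough pixels in a monitored
    region differ strongly from a dry reference frame."""
    ref = reference[region].astype(float)
    cur = current[region].astype(float)
    changed = np.abs(cur - ref) > diff_thresh   # per-pixel change mask
    return changed.mean() > frac_thresh         # fraction of changed pixels

# toy 8-bit frames: a low-lying bank region floods (pixel values shift)
ref = np.full((100, 100), 120, dtype=np.uint8)
cur = ref.copy()
bank = (slice(60, 100), slice(0, 100))          # monitored low-lying area
cur[bank] = 60                                  # water darkens the bank
```

A real deployment would add illumination normalization and object verification, but the advantage noted in the abstract (no water-level ruler, flexible camera placement) already holds for this region-based formulation.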
FUSION++: A New Data Assimilative Model for Electron Density Forecasting
NASA Astrophysics Data System (ADS)
Bust, G. S.; Comberiate, J.; Paxton, L. J.; Kelly, M.; Datta-Barua, S.
2014-12-01
There is a continuing need within the operational space weather community, both civilian and military, for accurate, robust data assimilative specifications and forecasts of the global electron density field, as well as derived RF application product specifications and forecasts obtained from the electron density field. The spatial scales of interest range from a hundred to a few thousand kilometers horizontally (synoptic large-scale structuring) and meters to kilometers (small-scale structuring that causes scintillations). RF space weather applications affected by electron density variability on these scales include navigation, communication and geo-location at RF frequencies ranging from 100's of Hz to GHz. For many of these applications, the necessary forecast time periods range from nowcasts to 1-3 hours. For more "mission planning" applications, necessary forecast times can range from hours to days. In this paper we present a new ionosphere-thermosphere (IT) specification and forecast model being developed at JHU/APL based upon the well-known data assimilation algorithms Ionospheric Data Assimilation Four Dimensional (IDA4D) and Estimating Model Parameters from Ionospheric Reverse Engineering (EMPIRE). This new forecast model, "Forward Update Simple IONosphere model Plus IDA4D Plus EMPIRE" (FUSION++), ingests data from observations related to electron density, winds, electric fields and neutral composition and provides improved specification and forecast of electron density. In addition, the new model provides improved specification of winds, electric fields and composition. We will present a short overview and derivation of the methodology behind FUSION++, some preliminary results using real observational sources, example derived RF application products such as HF bi-static propagation, and initial comparisons with independent data sources for validation.
Huang, Yuanyuan; Jiang, Jiang; Ma, Shuang; ...
2017-08-18
We report that accurate simulation of soil thermal dynamics is essential for realistic prediction of soil biogeochemical responses to climate change. To facilitate ecological forecasting at the Spruce and Peatland Responses Under Climatic and Environmental change site, we incorporated a soil temperature module into a Terrestrial ECOsystem (TECO) model by accounting for surface energy budget, snow dynamics, and heat transfer among soil layers and during freeze-thaw events. We conditioned TECO with detailed soil temperature and snow depth observations through data assimilation before the model was used for forecasting. The constrained model reproduced variations in observed temperature from different soil layers, the magnitude of snow depth, the timing of snowfall and snowmelt, and the range of frozen depth. The conditioned TECO forecasted probabilistic distributions of soil temperature dynamics in six soil layers, snow, and frozen depths under temperature treatments of +0.0, +2.25, +4.5, +6.75, and +9.0°C. Air warming caused stronger elevation in soil temperature during summer than winter due to winter snow and ice. And soil temperature increased more in shallow soil layers in summer in response to air warming. Whole ecosystem warming (peat + air warmings) generally reduced snow and frozen depths. The accuracy of forecasted snow and frozen depths relied on the precision of weather forcing. Uncertainty is smaller for forecasting soil temperature but large for snow and frozen depths. Lastly, timely and effective soil thermal forecast, constrained through data assimilation that combines process-based understanding and detailed observations, provides boundary conditions for better predictions of future biogeochemical cycles.
Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden
NASA Astrophysics Data System (ADS)
Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.
2016-02-01
Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogue years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in SFV by ~4 %. This improvement is limited but potentially significant for e.g. energy trading.
The Super Tuesday Outbreak: Forecast Sensitivities to Single-Moment Microphysics Schemes
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Case, Jonathan L.; Dembek, Scott R.; Jedlovec, Gary J.; Lapenta, William M.
2008-01-01
Forecast precipitation and radar characteristics are used by operational centers to guide the issuance of advisory products. As operational numerical weather prediction is performed at increasingly finer spatial resolution, convective precipitation traditionally represented by sub-grid scale parameterization schemes is now being determined explicitly through single- or multi-moment bulk water microphysics routines. Gains in forecasting skill are expected through improved simulation of clouds and their microphysical processes. High resolution model grids and advanced parameterizations are now available through steady increases in computer resources. As with any parameterization, the reliability of these schemes must be measured through performance metrics, with errors noted and targeted for improvement. Furthermore, the use of these schemes within an operational framework requires an understanding of limitations and an estimate of biases so that forecasters and model development teams can be aware of potential errors. The National Severe Storms Laboratory (NSSL) Spring Experiments have produced daily, high resolution forecasts used to evaluate forecast skill among an ensemble with varied physical parameterizations and data assimilation techniques. In this research, high resolution forecasts of the 5-6 February 2008 Super Tuesday Outbreak are replicated using the NSSL configuration in order to evaluate two aspects of simulated convection on a large domain: the sensitivity of quantitative precipitation forecasts to assumptions within a single-moment bulk water microphysics scheme, and whether these schemes accurately depict the reflectivity characteristics of well-simulated, organized, cold frontal convection. As radar returns are sensitive to the amount of hydrometeor mass and the distribution of mass among variably sized targets, radar comparisons may guide potential improvements to a single-moment scheme.
In addition, object-based verification metrics are evaluated for their utility in gauging model performance and QPF variability.
NASA Astrophysics Data System (ADS)
Singh, Shailesh Kumar
2014-05-01
Streamflow forecasts are essential for making critical decisions on the optimal allocation of water supplies across competing demands, including irrigation for agriculture, habitat for fisheries, hydropower production and flood warning. The major objectives of this study are to explore Ensemble Streamflow Prediction (ESP) based forecasts in New Zealand catchments and to highlight the present seasonal flow forecasting capability of the National Institute of Water and Atmospheric Research (NIWA). A probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather patterns have been experienced historically; hence, past forcing data can be combined with current initial conditions to generate an ensemble of predictions. Small differences in initial conditions can result in large differences in the forecast. The initial state of the catchment is obtained by running the model continuously up to the current time; this state is then used with past forcing data to generate an ensemble of future flows. The approach taken here is to run the TopNet hydrological model with a range of past forcing data (precipitation, temperature, etc.) from current initial conditions. The collection of runs is called the ensemble. ESP gives probabilistic flow forecasts: probability distributions derived from the ensemble members capture part of the intrinsic uncertainty in weather and climate. An ensemble streamflow prediction system providing probabilistic hydrological forecasts with lead times of up to 3 months is presented for the Rangitata, Ahuriri, Hooker and Jollie rivers in the South Island of New Zealand. ESP-based seasonal forecasts have better skill than climatology, and the system can provide better overall information for holistic water resource management.
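The ESP procedure described above can be sketched in a few lines: the same initial state is paired with each historical forcing trace to produce an ensemble of flow simulations, from which quantiles give the probabilistic forecast. The toy linear-reservoir model and all numbers below are hypothetical stand-ins for NIWA's TopNet model and real forcing data.

```python
import numpy as np

def toy_hydro_model(initial_storage, precip, temp, k=0.1):
    """Minimal linear-reservoir stand-in for a hydrological model
    (hypothetical; the study itself uses NIWA's TopNet model)."""
    storage, flows = initial_storage, []
    for p, t in zip(precip, temp):
        melt = max(0.0, t) * 0.05          # crude temperature/melt effect
        storage += p                        # precipitation input
        q = k * storage + melt              # outflow from the reservoir
        storage -= k * storage
        flows.append(q)
    return np.array(flows)

rng = np.random.default_rng(0)
# 30 historical forcing traces (e.g. past seasons of precip and temp)
historical_forcings = [(rng.gamma(2.0, 2.0, 90), rng.normal(10, 5, 90))
                       for _ in range(30)]

current_state = 50.0   # catchment state from running the model to "now"

# ESP: one simulation per historical forcing year, all from the same state
ensemble = np.array([toy_hydro_model(current_state, p, t)
                     for p, t in historical_forcings])

# Probabilistic forecast: quantiles across ensemble members per lead time
q10, q50, q90 = np.percentile(ensemble, [10, 50, 90], axis=0)
print(q50[-1])   # median forecast flow at the 90-day lead time
```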
Performance of time-series methods in forecasting the demand for red blood cell transfusion.
Pereira, Arturo
2004-05-01
Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)12 model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the ±10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods: its predictions lay within the ±10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in planning blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
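The two evaluation metrics used in the study, the fraction of months in which the forecast falls within ±10% of real demand and the coverage rate, can be computed as below. The demand series, forecasts, and the `safety` margin parameter are synthetic illustrations, not the hospital's data or procedure.

```python
import numpy as np

def within_10pct_rate(forecast, actual):
    """Fraction of months where the forecast lies within ±10% of demand."""
    return np.mean(np.abs(forecast - actual) <= 0.10 * actual)

def coverage_rate(forecast, actual, safety=1.0):
    """Fraction of months where forecast-based supply would meet demand;
    'safety' is a hypothetical over-collection factor."""
    supply = safety * forecast
    return np.mean(supply >= actual)

rng = np.random.default_rng(1)
months = np.arange(12)
# Synthetic seasonal demand and forecasts with ~5% multiplicative error
actual = 1000 + 50 * np.sin(months * 2 * np.pi / 12) + rng.normal(0, 30, 12)
forecast = actual * (1 + rng.normal(0, 0.05, 12))

print(round(within_10pct_rate(forecast, actual), 2))
print(round(coverage_rate(forecast, actual, safety=1.1), 2))
```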
Statistical earthquake focal mechanism forecasts
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.; Jackson, David D.
2014-04-01
Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method proposed by Kagan & Jackson in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly in error. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
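The rotation angle between two orientations, the quantity averaged in the forecast above, follows from the trace identity for rotation matrices. Note that a faithful comparison of double-couple focal mechanisms (the Kagan angle) must additionally minimize over the symmetries of the moment tensor; this sketch shows only the underlying rotation-angle formula.

```python
import numpy as np

def rotation_angle(R1, R2):
    """Angle in degrees of the rotation taking orientation R1 to R2,
    via cos(theta) = (trace(R1 R2^T) - 1) / 2."""
    R = R1 @ R2.T
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

def rot_z(deg):
    """Rotation about the z axis by 'deg' degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

print(rotation_angle(rot_z(0.0), rot_z(35.0)))  # → 35.0
```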
Large-eddy simulations of a Salt Lake Valley cold-air pool
NASA Astrophysics Data System (ADS)
Crosman, Erik T.; Horel, John D.
2017-09-01
Persistent cold-air pools are often poorly forecast by mesoscale numerical weather prediction models, in part due to inadequate parameterization of planetary boundary-layer physics in stable atmospheric conditions, and also because of errors in the initialization and treatment of the model surface state. In this study, an improved numerical simulation of the 27-30 January 2011 cold-air pool in Utah's Great Salt Lake Basin is obtained using a large-eddy simulation with more realistic surface state characterization. Compared to a Weather Research and Forecasting model configuration run as a mesoscale model with a planetary boundary-layer scheme where turbulence is highly parameterized, the large-eddy simulation more accurately captured turbulent interactions between the stable boundary-layer and flow aloft. The simulations were also found to be sensitive to variations in the Great Salt Lake temperature and Salt Lake Valley snow cover, illustrating the importance of land surface state in modelling cold-air pools.
How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.
2015-03-01
The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate probabilistic forecasts. In this study, it is shown that the calculated forecast skill can vary depending on the benchmark selected, and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark that has most utility for EFAS, and that best avoids naïve skill across different hydrological situations, is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system, and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination.
They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in the evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large-scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can have trust in their skill evaluation and confidence that their forecasts are indeed better.
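A CRPS-based benchmark comparison of the kind described above can be sketched with the energy form of the CRPS for an ensemble, CRPS = E|X − y| − ½E|X − X′|, together with the standard skill score convention CRPSS = 1 − CRPS_forecast/CRPS_benchmark. The two ensembles below are synthetic, not EFAS output.

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS of one ensemble forecast against a scalar observation,
    using the energy form E|X - y| - 0.5 E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

def skill_score(crps_forecast, crps_benchmark):
    """CRPSS: 1 = perfect, 0 = no better than the benchmark, <0 = worse."""
    return 1.0 - crps_forecast / crps_benchmark

obs = 3.2
# A sharp forecast centred near the observation vs a broad climatology
sharp_ensemble = np.random.default_rng(2).normal(3.0, 0.5, 50)
climatology = np.random.default_rng(3).normal(5.0, 3.0, 50)

crps_fc = crps_ensemble(sharp_ensemble, obs)
crps_bm = crps_ensemble(climatology, obs)
print(skill_score(crps_fc, crps_bm))  # positive: forecast beats the benchmark
```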
NASA Astrophysics Data System (ADS)
Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.
2012-04-01
In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise applied to hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been in use since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The post-processing methods dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (the empirical approach) is based on a statistical model of the empirical error of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (the dynamical approach) is based on streamflow sub-samples stratified by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches provide good post-processing of the hydrological ensemble, allowing a clear improvement in the reliability, skill and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which cannot take into account hydrological dynamics and processes, i.e. sample heterogeneity: the same streamflow range can correspond to different processes, such as rising limbs or recessions, with different uncertainties. The dynamical approach improves the reliability, skill and sharpness of forecasts and globally reduces confidence interval widths. Examined in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively low, and a slight widening of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience that found the empirical approach insufficiently discriminative, improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
NASA Astrophysics Data System (ADS)
Rhee, Jinyoung; Kim, Gayoung; Im, Jungho
2017-04-01
Three regions of Indonesia with different rainfall characteristics were chosen for developing machine-learning-based drought forecast models. The 6-month Standardized Precipitation Index (SPI6) was selected as the target variable. The models' forecast skill was compared to that of long-range climate forecast models in terms of drought detection accuracy and regression mean absolute error (MAE). Indonesian droughts are known to be related to El Niño-Southern Oscillation (ENSO) variability, despite regional differences, as well as to the monsoon, local sea surface temperature (SST), other large-scale atmosphere-ocean interactions such as the Indian Ocean Dipole (IOD) and the South Pacific Convergence Zone (SPCZ), and local factors including topography and elevation. The machine learning models therefore aim to enhance drought forecast skill by combining local and remote SST data and remote sensing information reflecting initial drought conditions with the long-range climate forecast model results. A total of 126 machine learning models were developed, covering the three regions of West Java (JB), West Sumatra (SB) and Gorontalo (GO); six long-range climate forecast models (MSC_CanCM3, MSC_CanCM4, NCEP, NASA, PNU and POAMA) plus one climatology model based on remote sensing precipitation data; and lead times of 1 to 6 months. Comparing the machine learning models with the long-range climate forecast models, West Java and Gorontalo showed similar characteristics in terms of drought accuracy: the long-range climate forecast models were generally more accurate at short lead times, while the opposite held at longer lead times. For West Sumatra, however, the machine learning models and the long-range climate forecast models showed similar drought accuracy. The machine learning models showed smaller regression errors for all three regions, especially at longer lead times.
Among the three regions, the machine learning models developed for Gorontalo showed the highest drought accuracy and the lowest regression error. West Java showed higher drought accuracy than West Sumatra, while West Sumatra showed lower regression error than West Java; the lower error in West Sumatra may reflect the smaller sample size used for training and evaluation in that region. Regional differences in forecast skill are determined by the effect of ENSO and the resulting skill of the long-range climate forecast models. The relative importance of the remote sensing variables was low in most cases, although somewhat higher in West Sumatra. The high importance of the variables based on long-range climate forecast models indicates that the skill of the machine learning models is mostly determined by the skill of the climate models.
Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis
NASA Technical Reports Server (NTRS)
Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher
1996-01-01
We study a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and will be required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and a bias correction of forecast anomalies. In brief, the distortion is determined by minimizing the objective function by varying the displacement and bias correction fields. In the present project we use a global or hemispheric domain, and spherical harmonics to represent these fields. In this project we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically we study the forecast errors of the 500 hPa geopotential height field for forecasts of the short and medium range. The forecasts are those of the Goddard Earth Observing System data assimilation system. Results presented show that the methodology works, that a large part of the total error may be explained by a distortion limited to triangular truncation at wavenumber 10, and that the remaining residual error contains mostly small spatial scales.
Smirnova, Alexandra; deCamp, Linda; Chowell, Gerardo
2017-05-02
Deterministic and stochastic methods relying on early case incidence data for forecasting epidemic outbreaks have received increasing attention during the last few years. In mathematical terms, epidemic forecasting is an ill-posed problem due to the instability of parameter identification and the limited available data. While previous studies have largely estimated the time-dependent transmission rate by assuming specific functional forms (e.g., exponential decay) that depend on a few parameters, here we introduce a novel approach that reconstructs nonparametric time-dependent transmission rates by projecting onto a finite subspace spanned by Legendre polynomials. This approach enables us to forecast future incidence cases effectively, a clear advantage over recovering the transmission rate only at finitely many grid points within the interval where data are currently available. We compare three regularization algorithms: variational (Tikhonov) regularization, truncated singular value decomposition (TSVD), and modified TSVD, in order to determine the stabilizing strategy that is most effective for reliable forecasting from limited data. We illustrate our methodology using simulated data as well as case incidence data for various epidemics, including the 1918 influenza pandemic in San Francisco and the 2014-2015 Ebola epidemic in West Africa.
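The projection of a time-dependent transmission rate onto a small Legendre subspace can be illustrated with NumPy's Legendre utilities; the "true" rate below is a hypothetical smooth curve, not data from the paper, and a degree-5 subspace is an arbitrary choice for the sketch.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Hypothetical "true" time-dependent transmission rate on t in [0, 1]
t = np.linspace(0.0, 1.0, 200)
beta_true = 0.8 * np.exp(-2.0 * t) + 0.2 * np.sin(3.0 * t)

# Least-squares projection onto Legendre polynomials up to degree 5;
# Legendre polynomials live on [-1, 1], so rescale time first
x = 2.0 * t - 1.0
coeffs = L.legfit(x, beta_true, deg=5)
beta_approx = L.legval(x, coeffs)

max_err = np.max(np.abs(beta_true - beta_approx))
print(max_err)  # small: a smooth rate is captured by a few coefficients
```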
A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run.
Armeanu, Daniel; Andrei, Jean Vasile; Lache, Leonard; Panait, Mirela
2017-01-01
The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets.
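The idea of summarizing a large panel with a few common components can be sketched with static principal components, the Stock-Watson-style baseline; the GDFM itself works with dynamic, frequency-domain principal components, which this sketch does not implement. The 120-quarter, 86-variable panel below is simulated, not the Romanian dataset.

```python
import numpy as np

rng = np.random.default_rng(4)
T, N, K = 120, 86, 3                  # quarters, variables, common factors

# Simulated panel: 3 latent factors drive 86 standardized indicators
factors = rng.normal(size=(T, K))
loadings = rng.normal(size=(K, N))
panel = factors @ loadings + 0.3 * rng.normal(size=(T, N))

# Static principal components via the SVD of the centred panel
Xc = panel - panel.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
common = U[:, :K] * s[:K]             # estimated common components

print(round(explained[K - 1], 2))     # variance share of the first 3 PCs
```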
NASA Astrophysics Data System (ADS)
Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea
2015-12-01
In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately on independent fault sections: some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA and northern Baja California, Mexico.
Flood Warning and Forecasting System in Slovakia
NASA Astrophysics Data System (ADS)
Leskova, Danica
2016-04-01
The Flood Warning and Forecasting System project (POVAPSYS), part of the programme of flood protection in Slovakia up to 2010, was completed in 2015. Its aim was to build an integrated, computerized flood forecasting and warning system. POVAPSYS brought the output of meteorological and hydrological services to a qualitatively higher level, both for floods affecting large territorial units and for local flood events, and it placed further demands on the performance and coordination of those services: troubleshooting observations, evaluating data, fast communication, and modelling and forecasting of meteorological and hydrological processes. The integration of all information entering and leaving POVAPSYS is provided by the Hydrological Flood Forecasting System (HYPOS). The system provides information on the current hydrometeorological situation and its evolution, generating alerts and notifications when predefined thresholds are exceeded. HYPOS must remain fully operable in critical situations while minimizing the loss of its key parts. HYPOS is the core of POVAPSYS: a comprehensive, modular software solution providing data and processed information, including alarms, in real time. To achieve full functionality of the system, the design places emphasis on reliability, robustness, availability and security.
Predictability of short-range forecasting: a multimodel approach
NASA Astrophysics Data System (ADS)
García-Moya, Jose-Antonio; Callado, Alfons; Escribà, Pau; Santos, Carlos; Santos-Muñoz, Daniel; Simarro, Juan
2011-05-01
Numerical weather prediction (NWP) models (including mesoscale) have limitations when it comes to dealing with severe weather events because extreme weather is highly unpredictable, even in the short range. A probabilistic forecast based on an ensemble of slightly different model runs may help to address this issue. Among other ensemble techniques, Multimodel ensemble prediction systems (EPSs) are proving to be useful for adding probabilistic value to mesoscale deterministic models. A Multimodel Short Range Ensemble Prediction System (SREPS) focused on forecasting the weather up to 72 h has been developed at the Spanish Meteorological Service (AEMET). The system uses five different limited area models (LAMs), namely HIRLAM (HIRLAM Consortium), HRM (DWD), the UM (UKMO), MM5 (PSU/NCAR) and COSMO (COSMO Consortium). These models run with initial and boundary conditions provided by five different global deterministic models, namely IFS (ECMWF), UM (UKMO), GME (DWD), GFS (NCEP) and CMC (MSC). AEMET-SREPS (AE) validation on the large-scale flow, using ECMWF analysis, shows a consistent and slightly underdispersive system. For surface parameters, the system shows high skill forecasting binary events. 24-h precipitation probabilistic forecasts are verified using an up-scaling grid of observations from European high-resolution precipitation networks, and compared with ECMWF-EPS (EC).
Snowmelt runoff modeling in simulation and forecasting modes with the Martinec-Rango model
NASA Technical Reports Server (NTRS)
Shafer, B.; Jones, E. B.; Frick, D. M. (Principal Investigator)
1982-01-01
The Martinec-Rango snowmelt runoff model was applied to two watersheds in the Rio Grande basin, Colorado: the South Fork Rio Grande, a drainage encompassing 216 sq mi without reservoirs or diversions, and the Rio Grande above Del Norte, a drainage encompassing 1,320 sq mi without major reservoirs. The model was successfully applied to both watersheds when run in a simulation mode for the period 1973-79, which included both high and low runoff seasons. Central to the adaptation of the model to run in a forecast mode was the need to develop a technique to forecast the shape of the snow cover depletion curves between satellite data points. Four separate approaches were investigated: simple linear estimation, multiple regression, parabolic exponential, and type curve. Only the parabolic exponential and type curve methods were run on the South Fork and Rio Grande watersheds for the 1980 runoff season, using satellite snow cover updates when available. Although reasonable forecasts were obtained in certain situations, neither method seemed ready for truly operational forecasts, possibly due to the large amount of estimated climatic data for one or two primary base stations during the 1980 season.
Observational evidence of European summer weather patterns predictable from spring
NASA Astrophysics Data System (ADS)
Ossó, Albert; Sutton, Rowan; Shaffrey, Len; Dong, Buwen
2018-01-01
Forecasts of summer weather patterns months in advance would be of great value for a wide range of applications. However, seasonal dynamical model forecasts for European summers have very little skill, particularly for rainfall. It has not been clear whether this low skill reflects inherent unpredictability of summer weather or, alternatively, is a consequence of weaknesses in current forecast systems. Here we analyze atmosphere and ocean observations and identify evidence that a specific pattern of summertime atmospheric circulation, the summer East Atlantic (SEA) pattern, is predictable from the previous spring. An index of North Atlantic sea-surface temperatures in March-April can predict the SEA pattern in July-August with a cross-validated correlation skill above 0.6. Our analyses show that the sea-surface temperatures influence atmospheric circulation and the position of the jet stream over the North Atlantic. The SEA pattern has a particularly strong influence on rainfall in the British Isles, which we find can also be predicted months ahead with a significant skill of 0.56. Our results have immediate application to empirical forecasts of summer rainfall for the United Kingdom, Ireland, and northern France, and also suggest that current dynamical model forecast systems have large potential for improvement.
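A cross-validated correlation skill of the kind quoted above (0.6 for the SEA pattern, 0.56 for rainfall) can be computed with a leave-one-out regression; the spring SST index and summer pattern below are synthetic series with a built-in correlation, not the observational records used in the study.

```python
import numpy as np

def loo_correlation_skill(x, y):
    """Leave-one-out cross-validated correlation skill: for each year,
    fit a linear regression on all other years, predict the held-out
    year, then correlate predictions with observations."""
    n = len(x)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        preds[i] = slope * x[i] + intercept
    return np.corrcoef(preds, y)[0, 1]

rng = np.random.default_rng(5)
sst_index = rng.normal(size=40)                         # spring SST index
sea_pattern = 0.7 * sst_index + rng.normal(0, 0.7, 40)  # summer pattern

print(round(loo_correlation_skill(sst_index, sea_pattern), 2))
```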
Predictability of windstorm Klaus: sensitivity to PV perturbations
NASA Astrophysics Data System (ADS)
Arbogast, P.; Maynard, K.
2010-09-01
Some short-range weather forecast failures can be attributed to errors in the initial conditions. In some cases it is possible to anticipate the behaviour of the model by comparing observations with model analyses. For extratropical cyclone development, the representation of the upper-level precursors, described in terms of potential vorticity (PV) in the initial conditions, can be assessed by comparison with either satellite ozone or water-vapour imagery. A step forward has been made by developing a tool based on manual modification of the dynamical tropopause (i.e. the height of the 1.5 PV unit surface) followed by PV inversion. After five years of experimentation, forecasters have succeeded in improving the forecast of some strong cyclone developments. However, the present approach is inherently subjective. To measure this subjectivity, a set of 15 experiments was performed by 7 different people (senior forecasters and scientists involved in dynamical meteorology), each attempting to improve an initial state of the global model ARPEGE that led to a poor forecast of the windstorm Klaus (24 January 2009). The experiment reveals that the manually defined corrections share common features but also show a large spread.
NASA Astrophysics Data System (ADS)
Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.
2017-05-01
Artificial intelligence methods are well suited to forecasting weather phenomena because they can process large amounts of diverse data. In this paper, Recirculation Neural Networks are applied to a system for predicting thunderstorm events. Large amounts of experimental data from lightning sensors and electric field mill networks are received and analyzed, and the average recognition accuracy for sensor signals is calculated. Recirculation Neural Networks are shown to be a promising approach to forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals efficiently, and they can compress images and extract their characteristic features for subsequent recognition.
An overview of the 1984 Battelle outside users payload model
NASA Astrophysics Data System (ADS)
Day, J. B.; Conlon, R. J.; Neale, D. B.; Fischer, N. H.
1984-10-01
The methodology and projections of a model of the market for non-NASA, non-DOD, reimbursable payloads from non-Soviet-bloc countries over the 1984-2000 time period are summarized. High and low forecast ranges were developed from demand forecasts by industrial users, NASA estimates, and other publications. Launches were assumed to be allotted to either the Shuttle or the Ariane. The greatest demand for launch services is expected to come from communications and materials processing payloads, the latter either becoming a large user or remaining a research item. The number of Shuttle payload equivalents over the reference time span is projected at 84-194, a large variance that depends on the progress of materials processing operations.
Fine-grained dengue forecasting using telephone triage services
Abdur Rehman, Nabeel; Kalyanaraman, Shankar; Ahmad, Talal; Pervaiz, Fahad; Saif, Umar; Subramanian, Lakshminarayanan
2016-01-01
Thousands of lives are lost every year in developing countries for failing to detect epidemics early because of the lack of real-time disease surveillance data. We present results from a large-scale deployment of a telephone triage service as a basis for dengue forecasting in Pakistan. Our system uses statistical analysis of dengue-related phone calls to accurately forecast suspected dengue cases 2 to 3 weeks ahead of time at a subcity level (correlation of up to 0.93). Our system has been operational at scale in Pakistan for the past 3 years and has received more than 300,000 phone calls. The predictions from our system are widely disseminated to public health officials and form a critical part of active government strategies for dengue containment. Our work is the first to demonstrate, with significant empirical evidence, that an accurate, location-specific disease forecasting system can be built using analysis of call volume data from a public health hotline. PMID:27419226
Kyriazakos, Sofoklis; Valentini, Vincenzo; Cesario, Alfredo; Zachariae, Robert
2018-01-01
Well-being of cancer patients and survivors is a challenge worldwide, considering the often chronic nature of the disease. Today, a large number of initiatives, products and services aim to provide strategies to face the challenge of well-being in cancer patients; nevertheless, the proposed solutions are often unsustainable, costly, unavailable to those in need, and poorly received by patients. These challenges were considered in designing FORECAST, a cloud-based personalized intelligent virtual coaching platform for improving the well-being of cancer patients. Personalized coaching for cancer patients focuses on physical, mental, and emotional concerns, which FORECAST is able to identify. Cancer patients can benefit from coaching that addresses their emotional problems, helps them focus on their goals, and supports them in coping with their disease-related stressors. Personalized coaching in FORECAST offers support, encouragement, motivation, confidence, and hope, and is a valuable tool for the well-being of a patient.
Impact of data assimilation on ocean current forecasts in the Angola Basin
NASA Astrophysics Data System (ADS)
Phillipson, Luke; Toumi, Ralf
2017-06-01
The ocean current predictability in the data-limited Angola Basin was investigated using the Regional Ocean Modelling System (ROMS) with four-dimensional variational data assimilation. Six experiments were undertaken, comprising a baseline case assimilating salinity/temperature profiles and satellite sea surface temperature, with the subsequent addition of altimetry, OSCAR (satellite-derived sea surface currents), drifters, altimetry and drifters combined, and OSCAR and drifters combined. The addition of drifters significantly improves Lagrangian predictability in comparison with the baseline case as well as with the addition of either altimetry or OSCAR. OSCAR assimilation improves Lagrangian predictability only about as much as altimetry assimilation does. On average, the assimilation of either altimetry or OSCAR together with drifter velocities does not significantly improve Lagrangian predictability compared to drifter assimilation alone, and even degrades predictability in some cases. When the forecast current speed is large, the combination is more likely to improve trajectory forecasts; conversely, when the currents are weaker, the combination is more likely to degrade them.
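A common way to quantify the Lagrangian predictability discussed above is the separation distance between forecast and observed drifter trajectories. A minimal sketch (great-circle distance on a 6371 km spherical Earth; an illustrative metric, not necessarily the authors' exact one):

```python
import numpy as np

def mean_separation_km(traj_forecast, traj_observed):
    """Mean great-circle separation (km) between matched forecast and
    observed drifter positions, each an array of [lat, lon] pairs in
    degrees. Smaller separation = better Lagrangian predictability."""
    lat1, lon1 = np.radians(np.asarray(traj_forecast, float)).T
    lat2, lon2 = np.radians(np.asarray(traj_observed, float)).T
    # Haversine formula for the central angle between matched positions
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return float(np.mean(2 * 6371.0 * np.arcsin(np.sqrt(a))))

# Identical trajectories separate by zero; a 1 degree latitude offset
# corresponds to roughly 111 km.
d_zero = mean_separation_km([[0.0, 0.0], [1.0, 1.0]], [[0.0, 0.0], [1.0, 1.0]])
d_one_deg = mean_separation_km([[0.0, 0.0]], [[1.0, 0.0]])
```

In practice the metric would be averaged over many drifters and evaluated as a function of forecast lead time.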
Effects of Changing Climate During the Snow Ablation Season on Seasonal Streamflow Forecasts
NASA Astrophysics Data System (ADS)
Gutzler, D. S.; Chavarria, S. B.
2017-12-01
Seasonal forecasts of total surface runoff (Q) in snowmelt-dominated watersheds derive most of their prediction skill from the historical relationship between late-winter snowpack (SWE) and subsequent snowmelt runoff. Across the western US, however, the relationship between SWE and Q is weakening as temperatures rise. We describe the effects of climate variability and change during the springtime snow ablation season on water supply outlooks (forecasts of Q) for southwestern rivers. As snow melts earlier, the importance of post-snow rainfall increases: interannual variability of spring-season precipitation accounts for an increasing fraction of the variability of Q in recent decades. The results indicate that improvements to the skill of subseasonal-to-seasonal (S2S) forecasts of spring-season temperature and precipitation would contribute very significantly to water supply outlooks that are now based largely on observed SWE. We assess this hypothesis using historical data from several snowpack-dominated basins in the American Southwest (the Rio Grande, Pecos, and Gila Rivers), which are undergoing rapid climate change.
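The growing role of spring precipitation relative to SWE can be illustrated by comparing the explained variance of a SWE-only regression with a SWE-plus-spring-precipitation regression. The data, variable names, and coefficients below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit of y on the columns of X
    (an intercept column is added automatically)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - float(resid.var() / y.var())

# Synthetic water years: runoff driven by both snowpack and spring rain
swe = rng.gamma(4.0, 25.0, size=40)        # April 1 snow water equivalent
spring_p = rng.gamma(2.0, 30.0, size=40)   # spring-season precipitation
q = 0.6 * swe + 0.5 * spring_p + rng.normal(0.0, 10.0, size=40)

r2_swe = r_squared(swe[:, None], q)                       # SWE-only outlook
r2_both = r_squared(np.column_stack([swe, spring_p]), q)  # SWE + spring precip
```

The gap between the two R^2 values is a simple measure of how much skill a perfect spring-precipitation forecast would add to a SWE-based outlook.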
NASA Astrophysics Data System (ADS)
Wanders, Niko; Wada, Yoshihide
2015-12-01
Long-term hydrological forecasts are important to increase our resilience and preparedness to extreme hydrological events. The skill in these forecasts is still limited due to large uncertainties inherent in hydrological models and poor predictability of long-term meteorological conditions. Here we show that strong (lagged) correlations exist between four different major climate oscillation modes and modeled and observed discharge anomalies over a 100 year period. The strongest correlations are found between the El Niño-Southern Oscillation signal and river discharge anomalies all year round, while North Atlantic Oscillation and Antarctic Oscillation time series are strongly correlated with winter discharge anomalies. The correlation signal is significant for periods up to 5 years for some regions, indicating a high added value of this information for long-term hydrological forecasting. The results suggest that long-term hydrological forecasting could be significantly improved by including the climate oscillation signals and thus improve our preparedness for hydrological extremes in the near future.
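The lagged correlations described above can be computed directly. A minimal sketch, using a synthetic series in which the discharge anomaly echoes the climate index three time steps later:

```python
import numpy as np

rng = np.random.default_rng(2)

def lagged_correlations(index, discharge, max_lag):
    """Pearson correlation between a climate-oscillation index and a
    discharge-anomaly series, with the index leading discharge by
    0..max_lag time steps."""
    r = []
    for lag in range(max_lag + 1):
        x = index[:len(index) - lag] if lag > 0 else index
        r.append(float(np.corrcoef(x, discharge[lag:])[0, 1]))
    return r

# Synthetic check: discharge is the index delayed by three steps
index = rng.normal(size=120)
discharge = np.concatenate([rng.normal(size=3), index[:117]])
r = lagged_correlations(index, discharge, max_lag=5)
```

On real data the same loop, applied per region and per oscillation mode (ENSO, NAO, AAO), yields the lag at which each mode carries the most forecast information.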
A genetic-algorithm-based remnant grey prediction model for energy demand forecasting.
Hu, Yi-Chung
2017-01-01
Energy demand is an important economic index, and demand forecasting has played a significant role in drawing up energy development plans for cities or countries. As the use of large datasets and statistical assumptions is often impractical to forecast energy demand, the GM(1,1) model is commonly used because of its simplicity and ability to characterize an unknown system by using a limited number of data points to construct a time series model. This paper proposes a genetic-algorithm-based remnant GM(1,1) (GARGM(1,1)) with sign estimation to further improve the forecasting accuracy of the original GM(1,1) model. The distinctive feature of GARGM(1,1) is that it simultaneously optimizes the parameter specifications of the original and its residual models by using the GA. The results of experiments pertaining to a real case of energy demand in China showed that the proposed GARGM(1,1) outperforms other remnant GM(1,1) variants.
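The base GM(1,1) construction can be sketched in a few lines (the paper's GARGM(1,1) additionally tunes the parameters of the original and residual models with a genetic algorithm, which is omitted here):

```python
import numpy as np

def gm11_forecast(x0, n_ahead=1):
    """Classic GM(1,1) grey model: fit a short series x0 and return the
    fitted values plus n_ahead forecast steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])     # background (mean) sequence
    # Least-squares estimate of the development coefficient a and grey input b
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    # Time-response function, then inverse accumulation to the original scale
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.empty_like(x1_hat)
    x0_hat[0], x0_hat[1:] = x0[0], np.diff(x1_hat)
    return x0_hat

# A 5%-growth series is nearly exponential, so GM(1,1) fits it closely
pred = gm11_forecast([2.0, 2.1, 2.205, 2.31525], n_ahead=1)
```

Only four data points are needed, which is exactly the small-sample setting the abstract describes; the remnant (residual) model of GARGM(1,1) would then be fit to the fitting errors of this base model.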
Prospective Tests of Southern California Earthquake Forecasts
NASA Astrophysics Data System (ADS)
Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.
2004-12-01
We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res., 100, 3943-3959, 1995].
For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we estimate by simulations. In this scheme, each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results will be archived and posted on the RELM web site. Major problems under discussion include how to treat aftershocks, which clearly violate the variable-rate Poissonian hypotheses that we employ, and how to deal with the temporal variations in catalog completeness that follow large earthquakes.
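The alpha/beta estimation described above can be sketched for a pair of gridded Poisson-rate forecasts. The rates below are toy values for illustration; the RELM tests use real forecast grids and earthquake catalogs:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_lik(rates, counts):
    # Poisson log-likelihood of a gridded catalog (count-factorial term
    # dropped; it cancels in the likelihood ratio anyway)
    return float(np.sum(counts * np.log(rates) - rates))

def alpha_beta(rates_a, rates_b, n_sim=4000):
    """Estimate alpha (probability of wrongly rejecting forecast A in
    favour of B when A is true) and beta (the converse) by simulating
    catalogs from each forecast's bin rates."""
    def llr_dist(true_rates):
        sims = rng.poisson(true_rates, size=(n_sim, len(true_rates)))
        return np.array([log_lik(rates_a, c) - log_lik(rates_b, c) for c in sims])
    alpha = float(np.mean(llr_dist(rates_a) < 0))  # A true, yet B scores higher
    beta = float(np.mean(llr_dist(rates_b) > 0))   # B true, yet A scores higher
    return alpha, beta

# Two toy forecasts that disagree strongly about where events occur
rates_a = np.array([0.5, 1.0, 2.0, 4.0])
rates_b = np.array([4.0, 2.0, 1.0, 0.5])
alpha, beta = alpha_beta(rates_a, rates_b)
```

Because neither forecast is privileged as a null hypothesis, alpha and beta are computed symmetrically, mirroring the testing scheme in the abstract.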
Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project
NASA Astrophysics Data System (ADS)
Arritt, R. W.; Mred Team
2010-12-01
The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models each are downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area mean precipitation is accurately simulated the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.
Three models intercomparison for Quantitative Precipitation Forecast over Calabria
NASA Astrophysics Data System (ADS)
Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Lavagnini, A.; Accadia, C.; Mariani, S.; Casaioli, M.
2004-11-01
In the framework of the National Project “Sviluppo di distretti industriali per le Osservazioni della Terra” (Development of Industrial Districts for Earth Observations), funded by MIUR (Ministero dell'Università e della Ricerca Scientifica -- the Italian Ministry of the University and Scientific Research), two operational mesoscale models were set up for Calabria, the southernmost tip of the Italian peninsula. The models are RAMS (Regional Atmospheric Modeling System) and MM5 (Mesoscale Model 5), which are run every day at Crati scrl to produce weather forecasts over Calabria (http://www.crati.it). This paper reports a model intercomparison for Quantitative Precipitation Forecasts (QPFs) evaluated over a 20-month period from 1 October 2000 to 31 May 2002. In addition to the RAMS and MM5 outputs, QBOLAM rainfall fields are available for the selected period and are included in the comparison. This model runs operationally at the “Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici”. Forecasts are verified by comparing model outputs with raingauge data recorded by the regional meteorological network, which has 75 raingauges. The large-scale forcing is the same for all models considered, so differences are due to physical/numerical parameterizations and horizontal resolutions. The QPFs differ between models, with the largest differences found for the BIA (frequency bias) score compared to the other scores considered. Performance decreases with increasing forecast time for RAMS and MM5, whilst QBOLAM scores better for the second-day forecast.
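The BIA score mentioned above, together with the equitable threat score, is computed from a 2x2 contingency table of forecast versus observed rain exceeding a threshold. A minimal sketch (illustrative data; assumes at least one forecast and one observed event so the denominators are nonzero):

```python
import numpy as np

def bia_ets(forecast_mm, observed_mm, threshold_mm):
    """BIA (frequency bias) and ETS (equitable threat score) from the
    2x2 contingency table of rain >= threshold at matched forecast
    points and raingauges."""
    f = np.asarray(forecast_mm) >= threshold_mm
    o = np.asarray(observed_mm) >= threshold_mm
    hits = int(np.sum(f & o))
    false_alarms = int(np.sum(f & ~o))
    misses = int(np.sum(~f & o))
    # BIA > 1: over-forecasting of events; BIA < 1: under-forecasting
    bia = (hits + false_alarms) / (hits + misses)
    # ETS discounts hits expected by random chance
    hits_random = (hits + false_alarms) * (hits + misses) / f.size
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return bia, ets

# A perfect forecast has BIA = 1 and ETS = 1
bia, ets = bia_ets([0.0, 5.0, 12.0, 0.0, 3.0], [0.0, 5.0, 12.0, 0.0, 3.0], 2.0)
```

In a verification study like the one above, the scores are computed per threshold and per forecast day for each model, then compared across models.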
NASA Astrophysics Data System (ADS)
Han, H. J.; Kang, J. H.
2016-12-01
Since Jul. 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi-real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing) is part of the KIAPS data assimilation system and has been performing well in the KIAPS semi-real-time forecast system. In this study, because KPOP is now able to handle scatterometer wind data, we analyze the effect of scatterometer winds (ASCAT-A/B) on the KIAPS semi-real-time forecast system. The O-B global distribution and statistics of the scatterometer winds provide two pieces of information: the differences between the background field and the observations are not too large, and KPOP processes the scatterometer wind data well. The changes in the analysis increment due to the O-B global distribution appear most markedly in the lower atmosphere. The scatterometer wind data also cover wide ocean areas where observations would otherwise be scarce. The performance of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields and through the vertical statistics of O-A. From these results, we conclude that scatterometer wind data have a positive effect on the lower-level performance of the semi-real-time forecast system at KIAPS. The long-term effect of the scatterometer wind data will be analyzed subsequently.
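The O-B (observation minus background) statistics discussed above reduce to the mean and spread of the departures, typically computed per variable and vertical level. A minimal sketch with illustrative numbers:

```python
import numpy as np

def departure_stats(obs, background):
    """Mean (bias) and standard deviation of the O-B departures - the
    basic check that observations and the model background agree and
    that the observation operator is working."""
    d = np.asarray(obs, float) - np.asarray(background, float)
    return float(d.mean()), float(d.std(ddof=1))

# Illustrative wind-speed values (m/s): observations vs. model background
bias, spread = departure_stats([10.2, 9.8, 10.1, 9.9],
                               [10.0, 10.0, 10.0, 10.0])
```

A near-zero bias with a modest spread is the "not too large" departure pattern the abstract reports; the same function applied to O-A departures measures how closely the analysis draws to the observations.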
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Roeder, William P.
2010-01-01
The expected peak wind speed for the day is an important element in the daily morning forecast for ground and space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45th Weather Squadron (45 WS) must issue forecast advisories for KSC/CCAFS when they expect peak gusts for >= 25, >= 35, and >= 50 kt thresholds at any level from the surface to 300 ft. In Phase I of this task, the 45 WS tasked the Applied Meteorology Unit (AMU) to develop a cool-season (October - April) tool to help forecast the non-convective peak wind from the surface to 300 ft at KSC/CCAFS. During the warm season, these wind speeds are rarely exceeded except during convective winds or under the influence of tropical cyclones, for which other techniques are already in use. The tool used single and multiple linear regression equations to predict the peak wind from the morning sounding. The forecaster manually entered several observed sounding parameters into a Microsoft Excel graphical user interface (GUI), and then the tool displayed the forecast peak wind speed, average wind speed at the time of the peak wind, the timing of the peak wind and the probability the peak wind will meet or exceed 35, 50 and 60 kt. The 45 WS customers later dropped the requirement for >= 60 kt wind warnings. During Phase II of this task, the AMU expanded the period of record (POR) by six years to increase the number of observations used to create the forecast equations. A large number of possible predictors were evaluated from archived soundings, including inversion depth and strength, low-level wind shear, mixing height, temperature lapse rate and winds from the surface to 3000 ft. Each day in the POR was stratified in a number of ways, such as by low-level wind direction, synoptic weather pattern, precipitation and Bulk Richardson number. The most accurate Phase II equations were then selected for an independent verification. 
The Phase I and II forecast methods were compared using an independent verification data set. The two methods were compared to climatology, wind warnings and advisories issued by the 45 WS, and North American Mesoscale (NAM) model (MesoNAM) forecast winds. The performance of the Phase I and II methods was similar with respect to mean absolute error. Since the Phase I data were not stratified by precipitation, this method's peak wind forecasts had a large negative bias on days with precipitation and a small positive bias on days with no precipitation. Overall, the climatology methods performed the worst while the MesoNAM performed the best. Since the MesoNAM winds were the most accurate in the comparison, the final version of the tool was based on the MesoNAM winds. The probability the peak wind will meet or exceed the warning thresholds was based on the one-standard-deviation error bars from the linear regression. For example, the linear regression might forecast the most likely peak speed to be 35 kt, and the error bars used to calculate that the probability of >= 25 kt = 76%, the probability of >= 35 kt = 50%, and the probability of >= 50 kt = 19%. The authors have not seen this application of linear regression error bars in any other meteorological application. Although probability forecast tools should usually be developed with logistic regression, this technique can easily be generalized to any linear regression forecast tool to estimate the probability of exceeding any desired threshold. This could be useful for previously developed linear regression forecast tools or new forecast applications where statistical analysis software to perform logistic regression is not available. The tool was delivered in two formats: a Microsoft Excel GUI and a Tool Command Language/Tool Kit (Tcl/Tk) GUI in the Meteorological Interactive Data Display System (MIDDS).
The Microsoft Excel GUI reads a MesoNAM text file containing hourly forecasts from 0 to 84 hours from one model run (00 or 12 UTC). The GUI then displays the peak wind speed, average wind speed, and the probability the peak wind will meet or exceed the 25-, 35- and 50-kt thresholds. The user can display the Day-1 through Day-3 peak wind forecasts, and separate forecasts are made for precipitation and non-precipitation days. The MIDDS GUI uses data from the NAM and Global Forecast System (GFS), instead of the MesoNAM. It can display Day-1 and Day-2 forecasts using NAM data, and Day-1 through Day-5 forecasts using GFS data. The timing of the peak wind is not displayed, since the independent verification showed that none of the forecast methods performed significantly better than climatology. The forecaster should use the climatological timing of the peak wind (2248 UTC) as a first guess and then adjust it based on the movement of weather features.
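The error-bar-to-probability conversion described above amounts to evaluating a Gaussian tail probability around the regression's most likely peak speed. A minimal sketch, assuming normally distributed regression errors; the 12 kt error bar is an invented value for illustration (the operational tool derives sigma from its own regression residuals, so its percentages differ):

```python
import math

def exceedance_probability(mu_kt, sigma_kt, threshold_kt):
    """P(peak wind >= threshold) assuming the regression error is
    Gaussian with mean mu_kt (the most likely peak speed) and standard
    deviation sigma_kt (the one-standard-deviation error bar)."""
    z = (threshold_kt - mu_kt) / sigma_kt
    # Upper-tail probability via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative case: most likely peak 35 kt, assumed 12 kt error bar
p25 = exceedance_probability(35.0, 12.0, 25.0)
p35 = exceedance_probability(35.0, 12.0, 35.0)
p50 = exceedance_probability(35.0, 12.0, 50.0)
```

By construction the probability at the most likely speed itself is exactly 50%, and the probabilities decrease monotonically with increasing threshold, matching the pattern of the example in the abstract.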
NASA Astrophysics Data System (ADS)
Zheng, Minghua
Cool-season extratropical cyclones near the U.S. East Coast often have significant impacts on the safety, health, environment and economy of this most densely populated region. Hence it is of vital importance to forecast these high-impact winter storm events as accurately as possible by numerical weather prediction (NWP), including in the medium-range. Ensemble forecasts are appealing to operational forecasters when forecasting such events because they can provide an envelope of likely solutions to serve user communities. However, it is generally accepted that ensemble outputs are not used efficiently in NWS operations mainly due to the lack of simple and quantitative tools to communicate forecast uncertainties and ensemble verification to assess model errors and biases. Ensemble sensitivity analysis (ESA), which employs a linear correlation and regression between a chosen forecast metric and the forecast state vector, can be used to analyze the forecast uncertainty development for both short- and medium-range forecasts. The application of ESA to a high-impact winter storm in December 2010 demonstrated that the sensitivity signals based on different forecast metrics are robust. In particular, the ESA based on the leading two EOF PCs can separate sensitive regions associated with cyclone amplitude and intensity uncertainties, respectively. The sensitivity signals were verified using the leave-one-out cross validation (LOOCV) method based on a multi-model ensemble from CMC, ECMWF, and NCEP. The climatology of ensemble sensitivities for the leading two EOF PCs based on 3-day and 6-day forecasts of historical cyclone cases was presented. It was found that the EOF1 pattern often represents the intensity variations while the EOF2 pattern represents the track variations along west-southwest and east-northeast direction. 
For PC1, the upper-level trough associated with the East Coast cyclone and its downstream ridge are important to the forecast uncertainty in cyclone strength. The initial differences in forecasting the ridge along the west coast of North America impact the EOF1 pattern most. For PC2, it was shown that the shift of the tri-polar structure is most significantly related to the cyclone track forecasts. The EOF/fuzzy clustering tool was applied to diagnose the scenarios in operational ensemble forecasts of East Coast winter storms. It was shown that the clustering method could efficiently separate the forecast scenarios associated with East Coast storms based on the 90-member multi-model ensemble. A scenario-based ensemble verification method has been proposed and applied to examine the capability of different EPSs in capturing the analysis scenarios for historical East Coast cyclone cases at lead times of 1-9 days. The results suggest that the NCEP model performs better in short-range forecasts in capturing the analysis scenario, although it is under-dispersed. The ECMWF ensemble shows the best performance in the medium range. The CMC model is found to show the smallest percentage of members in the analysis group and a relatively high missing rate, suggesting that it is less reliable at capturing the analysis scenario when compared with the other two EPSs. A combination of the NCEP and CMC models has been found to reduce the missing rate and improve the error-spread skill in medium- to extended-range forecasts. Based on the orthogonal features of the EOF patterns, the model errors for 1-6-day forecasts have been decomposed onto the leading two EOF patterns. The results of the error decomposition show that the NCEP model tends to better represent both the EOF1 and EOF2 patterns, showing smaller intensity and displacement errors during days 1-3. The ECMWF model is found to have the smallest errors in both the EOF1 and EOF2 patterns during days 4-6.
We have also found that East Coast cyclones in the ECMWF forecast tend to lie to the southwest of those in the other two models in representing the EOF2 pattern, which is associated with the southwest-northeast shifting of the cyclone. This result suggests that the ECMWF model may have a tendency toward a closer-to-shore solution in forecasting East Coast winter storms. The downstream impacts of Rossby wave packets (RWPs) on the predictability of winter storms are investigated to explore the source of ensemble uncertainties. The composited RWPA anomalies show that there are enhanced RWPs propagating across the Pacific in both large-error and large-spread cases over the verification regions. There are also indications that the errors might propagate at a speed comparable with the group velocity of RWPs. Based on the composite results as well as our observations of the operational daily RWPA, a conceptual model of error/uncertainty development associated with RWPs has been proposed to serve as a practical tool for understanding the evolution of forecast errors and uncertainties associated with coherent RWPs originating from as far upstream as the western Pacific. (Abstract shortened by ProQuest.)
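The ensemble sensitivity analysis used above is, at its core, a member-wise linear regression of a scalar forecast metric on each element of the forecast state: the sensitivity at each grid point is cov(J, x)/var(x). A minimal sketch with synthetic ensemble data (member count and field size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)

def ensemble_sensitivity(J, X):
    """Ensemble sensitivity of a scalar forecast metric J (n_members,)
    to each state variable in X (n_members, n_points): the regression
    slope cov(J, x_i) / var(x_i), computed across ensemble members."""
    Ja = J - J.mean()
    Xa = X - X.mean(axis=0)
    cov = Ja @ Xa / (len(J) - 1)      # covariance of J with each point
    return cov / Xa.var(axis=0, ddof=1)

# Synthetic 90-member ensemble: the metric depends only on grid point 0,
# so only that point should show a large sensitivity
X = rng.normal(size=(90, 5))
J = 2.0 * X[:, 0] + 0.1 * rng.normal(size=90)
s = ensemble_sensitivity(J, X)
```

With a cyclone-based metric for J (e.g. a leading EOF principal component of sea level pressure) and X an earlier-time analysis field, the resulting sensitivity map highlights the upstream regions where initial-condition differences most affect the storm forecast.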
Local short-duration precipitation extremes in Sweden: observations, forecasts and projections
NASA Astrophysics Data System (ADS)
Olsson, Jonas; Berg, Peter; Simonsson, Lennart
2015-04-01
Local short-duration precipitation extremes (LSPEs) are a key driver of hydrological hazards, notably in steep catchments with thin soils and in urban environments. The floods, landslides, etc., that they trigger have large consequences for society in terms of both economy and health. Accurate estimates of LSPEs on climatological time-scales (past, present, future) and in real time are thus of great importance for improved hydrological predictions as well as for the design of constructions and infrastructure affected by hydrological fluxes. Analysis of LSPEs is, however, associated with various limitations and uncertainties, to a large degree tied to the small-scale nature of the meteorological processes behind LSPEs and the associated requirements on observation sensors and model descriptions. Some examples of the causes of these limitations are given in the following.
- Observations: High-resolution data sets available for LSPE analyses are often limited to either relatively long series from one or a few stations or relatively short series from larger station networks. Radar data have excellent resolution in both time and space, but the estimated local precipitation intensity is still highly uncertain. New and promising techniques (e.g. microwave links) are still in their infancy.
- Weather forecasts (short-range): Although forecasts with the spatial resolution required for the potential generation of LSPEs (around 2-4 km) are becoming operationally available, the actual forecast precision for LSPEs is largely unknown. Forecasted LSPEs may be displaced in time or, more critically, in space, which strongly affects the possibility of assessing hydrological risk.
- Climate projections: The spatial resolution of the current RCM generation (around 25 km) is not sufficient for a proper description of LSPEs. Statistical post-processing (i.e. downscaling) is required, which adds substantial uncertainty to the final result.
Ensemble generation of sufficiently high-resolution RCM projections is not yet computationally feasible. In this presentation, examples of recent research in Sweden related to these aspects will be given with some main findings shown and discussed. Finally, some ongoing and future research directions will be outlined (the former hopefully accompanied by some brand-new results).
NASA Astrophysics Data System (ADS)
Pytlak, E.; McManamon, A.; Hughes, S. P.; Van Der Zweep, R. A.; Butcher, P.; Karafotias, C.; Beckers, J.; Welles, E.
2016-12-01
Numerous studies have documented the impacts that large-scale weather patterns and climate phenomena like the El Niño Southern Oscillation (ENSO), the Pacific-North American (PNA) Pattern, and others can have on seasonal temperature and precipitation in the Columbia River Basin (CRB). While far from perfect in terms of seasonal predictability at specific locations, these intra-annual weather and climate signals do tilt the odds toward different temperature and precipitation outcomes, which in turn can affect seasonal snowpacks, streamflows and water supply in large river basins like the CRB. We hypothesize that intraseasonal climate signals and long-wave jet stream patterns can be objectively incorporated into what is otherwise a climatology-based set of Ensemble Streamflow Forecasts, and can increase the predictive skill and utility of these forecasts for mid-range hydropower planning. The Bonneville Power Administration (BPA) and Deltares have developed a subsampling-resampling method to incorporate climate mode information into the Ensemble Streamflow Prediction (ESP) forecasts (Beckers, et al., 2016). Since 2015, BPA and Deltares USA have experimented with this method in pre-operational use, using five objective multivariate climate indices that appear to have the greatest predictive value for seasonal temperature and precipitation in the CRB. The indices are used to objectively select historical weather from about twenty analog years in the 66-year (1949-2015) historical ESP set. These twenty scenarios then serve as the starting point for generating monthly synthetic weather and streamflow time series, rebuilding a full set of 66 streamflow traces. Our poster will share initial results from the 2015 and 2016 water years, which included large swings in the Quasi-Biennial Oscillation, persistent blocking jet stream patterns, and the development of a strong El Niño event.
While the results are very preliminary and for only two seasons, there may be some value in incorporating objectively-identified climate signals into ESP-based streamflow forecasts. Reference: Beckers, J. V. L., Weerts, A. H., Tijdeman, E., and Welles, E.: ENSO-Conditioned Weather Resampling Method for Seasonal Ensemble Streamflow Prediction, Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-72, in review, 2016.
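The subsampling step of the method described above can be sketched as nearest-neighbour selection of analog years in climate-index space. The index values and years below are invented for illustration (the operational method uses five multivariate indices and about twenty analogs from the 66-year set):

```python
import numpy as np

def select_analog_years(current, historical, years, n_analogs=20):
    """Rank historical years by the Euclidean distance between this
    year's climate-index vector and each historical year's vector, and
    keep the n_analogs closest as the conditioned resampling pool."""
    d = np.linalg.norm(np.asarray(historical, float)
                       - np.asarray(current, float), axis=1)
    order = np.argsort(d)                 # closest analogs first
    return [years[i] for i in order[:n_analogs]]

# Toy two-index vectors (e.g. an ENSO index and a PNA index) per year
hist = [[1.2, 0.3], [-0.5, 0.1], [1.1, 0.4], [0.0, -1.0], [-1.3, 0.8]]
pool = select_analog_years([1.0, 0.35], hist,
                           [1998, 2001, 2015, 2008, 2011], n_analogs=2)
```

The selected pool then seeds the resampling that rebuilds a full set of streamflow traces, tilting the ESP climatology toward years with similar large-scale climate states.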
Operational foreshock forecasting: Fifteen years after
NASA Astrophysics Data System (ADS)
Ogata, Y.
2010-12-01
We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (the mainshock). Specifically, we define foreshocks as preshocks substantially smaller than the mainshock, by a magnitude gap of 0.5 or larger. The probability gain of a foreshock forecast is extremely high compared to long-term forecasts by renewal processes or various alarm-based intermediate-term forecasts, because of a large event's low occurrence rate in a short period and a narrow target region. It is therefore desirable to establish operational foreshock probability forecasting, as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or a mainshock-aftershock sequence. Namely, after real-time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, considering the events' stronger proximity in time and space and their tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting, and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Fifteen years have now passed since the publication of that work, so we are able to present the performance and validation of the forecasts (1994-2009) using the same model. Taking isolated events into consideration, the probability of the first event in a potential cluster being a foreshock varies between 0+% and 10+% depending on its location. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout the Japan region.
Furthermore, when additional events occur in a cluster, the forecast probabilities range more widely, from nearly 0% to about 40%, depending on the discrimination features among the events in the cluster. This conditional forecasting again performs significantly better than the unconditional foreshock probability of 7.3%, the average probability for plural events in earthquake clusters. Indeed, the frequency ratios of the actual foreshocks are consistent with the forecast probabilities. Reference: Ogata, Y., Utsu, T. and Katsura, K. (1996). Statistical discrimination of foreshocks from other earthquake clusters, Geophys. J. Int. 127, 17-30.
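The discrimination scheme described above, in which proximity and magnitude-ordering features are combined into a forecast probability, can be sketched as a logistic model. The feature values, coefficients, and intercept below are purely illustrative and are not the fitted values of Ogata et al. (1996):

```python
import math

def foreshock_probability(features, coef, intercept):
    """Logistic combination of cluster discrimination features
    (space-time proximity, increasing-magnitude tendency)."""
    z = intercept + sum(c * x for c, x in zip(coef, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical feature values for a small cluster: spatial proximity,
# temporal proximity, and an increasing-magnitude indicator.
features = [0.8, 0.6, 1.0]
coef = [1.2, 0.9, 0.7]          # illustrative weights, not the 1996 fit
p = foreshock_probability(features, coef, intercept=-3.0)
```

A cluster with weak proximity features drives the probability toward the small unconditional baseline, while strong features raise it, mirroring the 0+% to 40% range reported above.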
The use of satellite data assimilation methods in regional NWP for solar irradiance forecasting
NASA Astrophysics Data System (ADS)
Kurzrock, Frederik; Cros, Sylvain; Chane-Ming, Fabrice; Potthast, Roland; Linguet, Laurent; Sébastien, Nicolas
2016-04-01
As an intermittent energy source, solar power injected into electricity grids requires irradiance forecasting in order to ensure grid stability. On time scales of more than six hours ahead, numerical weather prediction (NWP) is recognized as the most appropriate solution. However, the current representation of clouds in NWP models is not sufficiently precise for an accurate forecast of solar irradiance at ground level. Dynamical downscaling does not necessarily increase the quality of irradiance forecasts. Furthermore, incorrectly simulated cloud evolution is often the cause of inaccurate atmospheric analyses. In non-interconnected tropical areas, the large amplitude of solar irradiance variability provides abundant solar yield but presents significant problems for grid safety. Irradiance forecasting is particularly important for solar power stakeholders in these regions, where PV electricity penetration is increasing. At the same time, NWP is markedly more challenging in tropical areas than in mid-latitudes due to the special characteristics of homogeneous tropical convective air masses. Numerous data assimilation methods and strategies have evolved and been applied to a large variety of global and regional NWP models in recent decades. Assimilating data from geostationary meteorological satellites is an appropriate approach: models converting radiances measured by satellites into cloud properties already exist, and data are available at high temporal frequencies, which enables pertinent modelling of cloud cover evolution for solar energy forecasts. In this work, we present a survey of different approaches that aim at improving cloud cover forecasts through the assimilation of geostationary meteorological satellite data into regional NWP models. These approaches have been applied to a variety of models and satellites and in different regions of the world.
Current methods focus on the assimilation of cloud-top information derived from infrared channels. For example, such information has been assimilated directly by modifying the water vapour profile in the initial conditions of the WRF model in California using GOES satellite imagery. In Europe, the assimilation of cloud-top height and relative humidity has been performed in an indirect approach using an ensemble Kalman filter; in this case, Meteosat SEVIRI cloud information was assimilated into the COSMO model. Although such methods generally provide improved cloud cover forecasts in mid-latitudes, their major limitation is that only clear-sky or completely cloudy cases can be considered, because fractional clouds produce a measured signal that mixes contributions from cold clouds and the warmer Earth surface. If the model's initial state is directly forced by cloud properties observed by satellite, the changed model fields have to be smoothed in order to avoid numerical instability. Other crucial aspects that influence forecast quality in the case of satellite radiance assimilation are channel selection and the treatment of bias and error. These promising satellite data assimilation methods in regional NWP have not yet been explicitly applied and tested under tropical conditions. Therefore, a deeper understanding of the benefits of such methods is necessary to improve irradiance forecast schemes.
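The indirect ensemble Kalman filter approach mentioned above can be illustrated with a toy scalar analysis step: a hypothetical cloud-top-height ensemble is updated with one satellite-derived observation. All numbers, and the perturbed-observation variant itself, are illustrative rather than the COSMO/SEVIRI implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Background ensemble of a scalar cloud-top height (km) and one
# satellite-derived observation of the same quantity.
n_ens = 50
ensemble = rng.normal(loc=6.0, scale=1.5, size=n_ens)
obs, obs_err = 4.0, 0.5

# Perturbed-observation EnKF analysis step (observation operator H = I).
P = np.var(ensemble, ddof=1)              # background error variance
K = P / (P + obs_err ** 2)                # Kalman gain
obs_perturbed = obs + rng.normal(0.0, obs_err, size=n_ens)
analysis = ensemble + K * (obs_perturbed - ensemble)
```

The analysis ensemble is pulled toward the observation and its spread shrinks, which is the basic mechanism by which satellite cloud information constrains the model state.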
MJO prediction using the sub-seasonal to seasonal forecast model of Beijing Climate Center
NASA Astrophysics Data System (ADS)
Liu, Xiangwen; Wu, Tongwen; Yang, Song; Li, Tim; Jie, Weihua; Zhang, Li; Wang, Zaizhi; Liang, Xiaoyun; Li, Qiaoping; Cheng, Yanjie; Ren, Hongli; Fang, Yongjie; Nie, Suping
2017-05-01
By conducting several sets of hindcast experiments with the Beijing Climate Center Climate System Model, which participates in the Sub-seasonal to Seasonal (S2S) Prediction Project, we systematically evaluate the model's capability in forecasting the MJO and identify its main deficiencies. In the original S2S hindcast set, MJO forecast skill is about 16 days, with significant seasonal-to-interannual variations. The model's MJO forecast skill is more strongly correlated with the Indian Ocean Dipole (IOD) than with the El Niño-Southern Oscillation. The highest skill is achieved in autumn, when the IOD attains its maturity, and extended skill is found when the IOD is in its positive phase. The close association of MJO forecast skill with the IOD is partially due to the quickly strengthening relationship between MJO amplitude and IOD intensity as lead time increases to about 15 days, beyond which the relationship weakens rapidly. This transition may cause the forecast skill to decrease quickly with lead time, and is related to unrealistic amplitude and phase evolutions of the predicted MJO over or near the equatorial Indian Ocean during anomalous IOD phases, suggesting a possible influence of exaggerated IOD variability in the model. The results imply that the upper limit of intraseasonal predictability is modulated by the large-scale external forcing background state in the tropical Indian Ocean. Two additional sets of hindcast experiments with improved atmosphere and ocean initial conditions (referred to as S2S_IEXP1 and S2S_IEXP2, respectively) show that the overall MJO forecast skill increases to 21-22 days. The optimization of the initial sea surface temperature condition largely accounts for this increase, although the improved initial atmosphere conditions also play a role.
For the DYNAMO/CINDY field campaign period, the forecast skill increases to 27 days in S2S_IEXP2. Nevertheless, even with improved initialization, it remains difficult for the model to predict MJO propagation across the western hemisphere-western Indian Ocean area and across the eastern Indian Ocean-Maritime Continent area. In particular, MJO prediction is limited by various interrelated deficiencies (e.g., an overestimated IOD, a shorter-than-observed MJO life cycle, and the Maritime Continent prediction barrier), due possibly to the model bias in the background moisture field over the eastern Indian Ocean and Maritime Continent. More effort is therefore needed to correct the deficiencies in model physics in this region in order to overcome the well-known Maritime Continent predictability barrier.
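MJO forecast skill of the kind quoted above (e.g., 16 or 21-22 days) is commonly measured with the bivariate anomaly correlation of the two RMM indices, with skill defined as the lead time at which the correlation drops below 0.5. A minimal sketch of that metric on synthetic data, assuming this standard definition rather than anything model-specific:

```python
import numpy as np

def bivariate_correlation(f1, f2, o1, o2):
    """Bivariate anomaly correlation between forecast (f1, f2) and
    observed (o1, o2) RMM index time series."""
    num = np.sum(f1 * o1 + f2 * o2)
    den = np.sqrt(np.sum(f1 ** 2 + f2 ** 2)) * np.sqrt(np.sum(o1 ** 2 + o2 ** 2))
    return num / den

# Synthetic RMM indices tracing an idealized MJO cycle.
t = np.linspace(0.0, 2.0 * np.pi, 100)
o1, o2 = np.cos(t), np.sin(t)
cor_perfect = bivariate_correlation(o1, o2, o1, o2)       # perfect forecast
cor_quadrature = bivariate_correlation(-o2, o1, o1, o2)   # 90-degree phase error
```

A perfect forecast yields a correlation of 1, a forecast in phase quadrature yields 0; a model's skill curve falls between these as lead time grows.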
NASA Astrophysics Data System (ADS)
Beckers, J.; Weerts, A.; Tijdeman, E.; Welles, E.; McManamon, A.
2013-12-01
To provide reliable and accurate seasonal streamflow forecasts for water resources management, several operational hydrologic agencies and hydropower companies around the world use the Extended Streamflow Prediction (ESP) procedure. The ESP in its original implementation does not accommodate any additional information the forecaster may have about expected deviations from climatology in the near future. Several attempts have been made to improve the skill of the ESP forecast, especially for areas affected by teleconnections (e.g., ENSO, PDO), via selection (Hamlet and Lettenmaier, 1999) or weighting schemes (Werner et al., 2004; Wood and Lettenmaier, 2006; Najafi et al., 2012). A disadvantage of such schemes is that they lead to a reduction of the signal-to-noise ratio of the probabilistic forecast. To overcome this, we propose a resampling method, conditional on climate indices, to generate the meteorological time series used in the ESP. The method can generate a large number of meteorological ensemble members in order to improve the statistical properties of the ensemble. The effectiveness of the method was demonstrated in a real-time operational hydrologic seasonal forecast system for the Columbia River basin operated by the Bonneville Power Administration. The forecast skill of the k-nn resampler was tested against the original ESP for three basins at the long-range seasonal time scale, using the BSS and CRPSS to compare the results. Positive forecast skill scores were found for the resampler method, conditioned on different indices, for the prediction of spring peak flows in the Dworshak and Hungry Horse basins. For the Libby Dam basin, however, no improvement in skill was found. The proposed resampling method is a promising practical approach that can add skill to ESP forecasts at the seasonal time scale.
Further improvement is possible by fine tuning the method and selecting the most informative climate indices for the region of interest.
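A k-nn resampler of the kind tested here can be sketched as follows: historical years are ranked by the distance of their climate-index value from the current value, and traces are drawn from the k nearest with rank-based weights. The index values, kernel, and parameter choices are illustrative, not those of the operational system:

```python
import numpy as np

rng = np.random.default_rng(1)

def knn_resample(index_now, hist_index, hist_years, k=5, n_members=1000):
    """Resample historical years whose climate-index value is closest to
    the current value, with the common 1/rank k-nn kernel."""
    nearest = np.argsort(np.abs(hist_index - index_now))[:k]
    w = 1.0 / np.arange(1, k + 1)        # weight ~ 1/rank of the neighbor
    w /= w.sum()
    picks = rng.choice(nearest, size=n_members, p=w)
    return hist_years[picks]

hist_years = np.arange(1980, 2010)
hist_index = rng.normal(size=hist_years.size)   # synthetic climate index
members = knn_resample(0.0, hist_index, hist_years, k=5)
```

Because resampling with replacement can produce arbitrarily many members, the scheme conditions on the climate signal without shrinking the ensemble the way hard selection does.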
Pathways to designing and running an operational flood forecasting system: an adventure game!
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Ramos, Maria-Helena; Cloke, Hannah; Crochemore, Louise; Giuliani, Matteo; Aalbers, Emma
2017-04-01
In the design and building of an operational flood forecasting system, a large number of decisions have to be taken. These include technical decisions related to the choice of the meteorological forecasts to be used as input to the hydrological model, the choice of the hydrological model itself (its structure and parameters), the selection of a data assimilation procedure to run in real time, the use (or not) of a post-processor, and the computing environment to run the models and display the outputs. Additionally, a number of trans-disciplinary decisions are involved in the process, such as the way the needs of the users will be considered in the modelling setup and how the forecasts (and their quality) will be communicated efficiently to ensure usefulness and build confidence in the forecasting system. We propose to reflect on the numerous alternative pathways to designing and running an operational flood forecasting system through an adventure game. In this game, the player is the protagonist of an interactive story driven by challenges, exploration and problem-solving. For this presentation, you will have a chance to play this game, acting as the leader of a forecasting team at an operational centre. Your role is to manage the actions of your team and make sequential decisions that impact the design and running of the system in preparation for and during a flood event, and that deal with the consequences of the forecasts issued. Your actions are evaluated by how much they cost you in time, money and credibility. Your aim is to take decisions that ultimately lead to a good balance between time and money spent, while keeping your credibility high over the whole process. This game was designed to highlight the complexities behind decision-making in an operational forecasting and emergency response context, in terms of the variety of pathways that can be selected as well as the timescale, cost and timing of effective actions.
Ensemble Bayesian forecasting system Part I: Theory and algorithms
NASA Astrophysics Data System (ADS)
Herr, Henry D.; Krzysztofowicz, Roman
2015-05-01
The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. 
Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of the predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
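The auxiliary randomization step can be sketched in normal-score space: after each hydrologic-model run, the meta-Gaussian HUP yields a conditional normal from which many predictand members are drawn cheaply. The linear coefficients below are illustrative placeholders, not estimated HUP parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

def hup_randomize(w, n_members, a=0.9, b=0.1, sigma=0.3):
    """Draw predictand members from the conditional normal of a
    meta-Gaussian HUP, with w the normal-score of the model output."""
    return a * w + b + sigma * rng.normal(size=n_members)

# One hydrologic-model run expands into 500 ensemble members.
members = hup_randomize(w=1.2, n_members=500)
```

Since each costly model run fans out into many members, the meteorological input ensemble can stay small while the final Bayesian ensemble reaches the hundreds or thousands required for acceptable sampling error.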
NASA Astrophysics Data System (ADS)
Harty, T. M.; Lorenzo, A.; Holmgren, W.; Morzfeld, M.
2017-12-01
The irradiance incident on a solar panel is the main factor determining the power output of that panel. For this reason, accurate global horizontal irradiance (GHI) estimates and forecasts are critical when determining the optimal location for a solar power plant, forecasting utility-scale solar power production, or forecasting distributed, behind-the-meter rooftop solar power production. Satellite images provide a basis for producing the GHI estimates needed to undertake these objectives. The focus of this work is to combine satellite-derived GHI estimates with ground sensor measurements and an advection model. The idea is to use accurate but sparsely distributed ground sensors to improve satellite-derived GHI estimates, which can cover large areas (the size of a city or a region of the United States). We use a Bayesian framework to perform the data assimilation, which enables us to produce irradiance forecasts and associated uncertainties that incorporate both satellite and ground sensor data. Within this framework, we utilize satellite images taken from the GOES-15 geostationary satellite (available every 15-30 minutes) as well as ground data taken from irradiance sensors and rooftop solar arrays (available every 5 minutes). The advection model, driven by wind forecasts from a numerical weather model, simulates cloud motion between measurements. We use the Local Ensemble Transform Kalman Filter (LETKF) to perform the data assimilation. We present preliminary results towards making such a system useful in an operational context. We explain how localization and inflation in the LETKF, perturbations of wind fields, and random perturbations of the advection model affect the accuracy of our estimates and forecasts. We present experiments showing the accuracy of our forecasted GHI over forecast horizons of 15 minutes to 1 hour. The limitations of our approach and future improvements are also discussed.
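The advection model's role, propagating the cloud/GHI field between satellite images, can be sketched in one dimension with a periodic integer shift. The grid, field values, and wind displacement are illustrative only:

```python
import numpy as np

def advect(field, wind_cells):
    """Advect a field on a periodic 1-D grid by an integer cell shift."""
    return np.roll(field, wind_cells)

# GHI field (W/m^2) with a cloud-induced dip, displaced two cells per step.
ghi = np.array([1000.0, 800.0, 400.0, 300.0, 900.0])
forecast = advect(ghi, wind_cells=2)
```

In the full system this forecast step alternates with LETKF analysis steps that correct the advected field wherever ground sensors or a fresh satellite image are available.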
Assessment of Folsom Lake Watershed response to historical and potential future climate scenarios
Carpenter, Theresa M.; Georgakakos, Konstantine P.
2000-01-01
An integrated forecast-control system was designed to allow the profitable use of ensemble forecasts for the operational management of multi-purpose reservoirs. The system ingests large-scale climate model monthly precipitation through the adjustment of the marginal distribution of reservoir-catchment precipitation to reflect the occurrence of monthly climate precipitation amounts in the extreme terciles of their distribution. Generation of ensemble reservoir inflow forecasts is then accomplished with due account for atmospheric-forcing and hydrologic-model uncertainties. These ensemble forecasts are ingested by the decision component of the integrated system, which generates non-inferior trade-off surfaces and, given management preferences, estimates of reservoir-management benefits over given periods. In collaboration with the Bureau of Reclamation and the California Nevada River Forecast Center, the integrated system is applied to Folsom Lake in California to evaluate the benefits for flood control, hydroelectric energy production, and low-flow augmentation. In addition to retrospective studies involving the historical period 1964-1993, system simulations were performed for the future period 2001-2030 under a control scenario (constant future greenhouse-gas concentrations at present levels) and a greenhouse-gas-increase scenario (1% increase per annum). The present paper presents and validates ensemble 30-day reservoir-inflow forecasts under a variety of situations. Corresponding reservoir management results are presented in Yao and Georgakakos (this issue). Principal conclusions of this paper are that the integrated system provides reliable ensemble inflow volume forecasts at the 5% confidence level for the majority of the deciles of forecast frequency, and that the use of climate model simulations is beneficial mainly during high flow periods.
It is also found that, for future periods with potentially sharp climatic increases in precipitation, operational ensemble inflow forecasting should involve atmospheric forcing from appropriate climatic periods in order to maintain good reliability levels.
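The tercile-based adjustment described above can be sketched as reweighting historical precipitation traces when the climate model favors an extreme tercile. The weights and data below are illustrative and not the adjustment scheme of the paper:

```python
import numpy as np

# Historical monthly catchment precipitation traces (mm, synthetic).
precip_hist = np.array([55., 80., 62., 120., 95., 70., 130., 45., 88., 105.])
lo, hi = np.quantile(precip_hist, [1 / 3, 2 / 3])   # tercile boundaries

# If the climate model signals the wet (upper) tercile, up-weight the
# historical traces falling in that tercile (factor 3 is illustrative).
in_wet = precip_hist > hi
weights = np.where(in_wet, 3.0, 1.0)
weights /= weights.sum()
```

The reweighted marginal distribution then drives the ensemble inflow forecasts, shifting their spread toward the climate-model signal without discarding any historical trace.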
International Aftershock Forecasting: Lessons from the Gorkha Earthquake
NASA Astrophysics Data System (ADS)
Michael, A. J.; Blanpied, M. L.; Brady, S. R.; van der Elst, N.; Hardebeck, J.; Mayberry, G. C.; Page, M. T.; Smoczyk, G. M.; Wein, A. M.
2015-12-01
Following the M7.8 Gorkha, Nepal, earthquake of April 25, 2015, the USGS issued a series of aftershock forecasts. The initial impetus for these forecasts was a request from the USAID Office of US Foreign Disaster Assistance to support their Disaster Assistance Response Team (DART), which coordinated US Government disaster response, including search and rescue, with the Government of Nepal. Because of the possible utility of the forecasts to people in the region and to other response teams, the USGS released them publicly through the USGS Earthquake Program web site. The initial forecast used the Reasenberg and Jones (Science, 1989) model with generic parameters developed for active deep continental regions, based on the Garcia et al. (BSSA, 2012) tectonic regionalization. These were then updated to reflect a lower productivity and higher decay rate based on the observed aftershocks, although reliance on teleseismic observations, with a high magnitude of completeness, limited the amount of data. After the 12 May M7.3 aftershock, the forecasts used an Epidemic Type Aftershock Sequence model to better characterize the multiple sources of earthquake clustering; this model provided better estimates of aftershock uncertainty. The forecast messages were crafted based on lessons learned from the Christchurch earthquake, along with input from the U.S. Embassy staff in Kathmandu. Challenges included how to balance simple messaging with forecasts over a variety of time periods (week, month, and year); whether to characterize probabilities with words such as those suggested by the IPCC (2010); how to word the messages so that they would translate accurately into Nepali and not alarm the public; and how to present the probabilities of unlikely but possible large and potentially damaging aftershocks, such as the M7.3 event, which had an estimated probability of only 1-in-200 for the week in which it occurred.
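A Reasenberg-Jones style forecast of the kind issued initially computes the expected number N of aftershocks above a magnitude threshold from a modified-Omori rate, then reports P(at least one) = 1 - exp(-N). The parameter values below are illustrative generic values, not the Garcia et al. (2012) regionalized parameters or the updated Gorkha-specific ones:

```python
import math

def aftershock_rate(t_days, mag, mainshock_mag,
                    a=-1.67, b=0.91, c=0.05, p=1.08):
    """Modified-Omori rate of aftershocks with magnitude >= mag at time
    t_days after a mainshock (Reasenberg-Jones form; values illustrative)."""
    return 10.0 ** (a + b * (mainshock_mag - mag)) / (t_days + c) ** p

def prob_at_least_one(t1, t2, mag, mainshock_mag, steps=10000):
    """P(>= 1 aftershock with M >= mag in [t1, t2]) = 1 - exp(-N),
    integrating the rate numerically with the midpoint rule."""
    dt = (t2 - t1) / steps
    n = sum(aftershock_rate(t1 + (i + 0.5) * dt, mag, mainshock_mag) * dt
            for i in range(steps))
    return 1.0 - math.exp(-n)

# Weekly probability of an aftershock within 0.5 units of the mainshock.
p_week = prob_at_least_one(0.01, 7.0, mag=7.3, mainshock_mag=7.8)
```

Note how the probability falls for higher magnitude thresholds and shorter windows; communicating these small but non-negligible probabilities was exactly the messaging challenge described above.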